The question states - For an array, find the maximum value of (j-i)*min(a[j], a[i]), where i and j are distinct indices of the array in the range {0,1, .., n-1} where n is the array length.
I went with an ill-thought-out solution: keep two pointers at the ends of the array and, at each step, discard the end with the smaller value. But it still works. How?
i, j = 0, n - 1
ansSoFar = 0
while i < j:  # i and j must be distinct; the i == j case contributes 0 anyway
    ansSoFar = max(ansSoFar, (j - i) * min(A[i], A[j]))
    if A[i] > A[j]:
        j -= 1
    else:
        i += 1  # was "i = 1", which loops forever once i reaches 1
return ansSoFar
CodePudding user response:
Suppose A[0] is the smaller out of A[0] and A[n-1].
In this case, (0, n-1) is the best pairing we could possibly get that uses A[0]: moving the other endpoint inward strictly reduces (j-i), and the min of the pair can never rise above A[0], since A[0] stays in the pair.
So we know that the best answer is either the current value (including A[0] and A[n-1]), or the best we can get out of elements 1..n-1.
Similarly, if A[n-1] is the smaller, the best answer is either the current value or the best we can get out of elements 0..n-2.
The given code is therefore correct: at each step it evaluates the pair formed by the two extremes, then shrinks the problem by discarding the head or the tail as appropriate.
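One way to convince yourself of this argument empirically is to check the two-pointer scan against a brute-force search over all pairs. The sketch below assumes nonnegative values (as in the usual container-with-most-water setting, and matching the `ansSoFar = 0` initialisation); the function names are mine, not from the question.

```python
import random

def max_pair_value(A):
    """Two-pointer scan from the question (with the i += 1 fix)."""
    i, j = 0, len(A) - 1
    best = 0
    while i < j:
        best = max(best, (j - i) * min(A[i], A[j]))
        if A[i] > A[j]:
            j -= 1  # A[j] is the smaller end: its best pairing was just tried
        else:
            i += 1  # A[i] is the smaller (or equal) end: likewise
    return best

def brute_force(A):
    """O(n^2) reference: try every distinct pair i < j."""
    n = len(A)
    return max((j - i) * min(A[i], A[j])
               for i in range(n) for j in range(i + 1, n))

random.seed(0)
for _ in range(200):
    A = [random.randint(0, 50) for _ in range(random.randint(2, 30))]
    assert max_pair_value(A) == brute_force(A)
print("all random tests passed")
```

Note that the O(n) scan considers only n-1 of the roughly n^2/2 pairs; the inductive argument above is exactly what guarantees the optimum is never among the skipped ones.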