Let's say I have an array of 1's and 0's like below:
[1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
and let's say I have a subarray of it like below:
[1, 0, 0, 1]
How do I find the total number of 1's in any given subarray (like the one above) in O(1) time?
The preprocessing that makes this possible should take O(n) time.
I thought about constructing a prefix sum of the input array (which should take O(n) time).
So the prefix sum array for the above example is:
[1, 1, 1, 2, 3, 3, 4, 4, 4, 5, 6, 6, 7]
Then, when given a subarray, I take its start and end indexes, look up the corresponding values in the prefix sum array, and subtract: value at end index - value at start index.
So the example subarray above has end index 3 and start index 0. Looking at indexes 0 and 3 in the prefix sum array gives the values 1 and 2 respectively. Following the formula, 2 - 1 = 1, which would indicate that the example subarray contains one 1, but it actually contains two.
So this doesn't seem to work for subarrays that contain the very first or the last element of the input array. Help would be much appreciated.
CodePudding user response:
The "prefix sum" array is good and should work. Note that the prefix sum array should have one more element than the original array.
If the original array a is indexed from 0 to n-1, then the prefix-sum array s should be indexed from 0 to n, with s[k] equal to the sum of the first k elements: s[0] = 0 and s[k] = a[0] + ... + a[k-1]. Then the sum of the subarray [start, end] is s[end+1] - s[start].
With your example:
a = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
s = [0, 1, 1, 1, 2, 3, 3, 4, 4, 4, 5, 6, 6, 7]
sum([1, 0, 0, 1]) = sum(start=0, end=3) = s[3+1] - s[0] = 2 - 0 = 2.
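Here is a minimal Python sketch of that idea (the function names build_prefix and range_ones are just illustrative, not from the original post):

```python
def build_prefix(a):
    # O(n) preprocessing: s[k] holds the sum of the first k elements,
    # so s has one more entry than a and s[0] == 0.
    s = [0] * (len(a) + 1)
    for i, x in enumerate(a):
        s[i + 1] = s[i] + x
    return s

def range_ones(s, start, end):
    # O(1) query: number of 1s in a[start..end] (inclusive indices).
    return s[end + 1] - s[start]

a = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
s = build_prefix(a)
print(range_ones(s, 0, 3))   # 2 -> ones in [1, 0, 0, 1]
print(range_ones(s, 9, 12))  # 3 -> ones in [1, 1, 0, 1]
```

Because s[start] already excludes element start itself, subarrays beginning at index 0 work out naturally (s[0] = 0), which fixes the off-by-one issue in the question.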