Given a finite random sequence of bits, how can one determine the minimum number of bit toggles needed to produce a sorted sequence (i.e. all 0's come before all 1's)?
Note: homogeneous sequences (e.g. 0000000 and 1111111) are considered sorted by this definition.
Also, this is not technically "sorting" the sequence, because elements are toggled in place rather than swapped with other elements. Is there a better word than "sorting" to describe this activity?
CodePudding user response:
Let Z(n) be the cost of setting the first n bits all 0.
Let X(n) be the minimum cost of "sorting" the first n bits.
We have:
Z(0) = 0, X(0) = 0
if the ith bit is 0: Z(i) = Z(i-1), X(i) = min( Z(i-1), X(i-1) + 1 )
if the ith bit is 1: Z(i) = Z(i-1) + 1, X(i) = X(i-1)
The answer is X(n).
It's even easier in code:
def min_toggles(sequence):
    z = 0  # Z: cost of zeroing all bits seen so far
    x = 0  # X: minimum cost of "sorting" the bits seen so far
    for bit in sequence:
        if bit == 0:
            x = min(z, x + 1)
        else:
            z = z + 1
    return x
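As a quick sanity check (the function name `min_toggles` and the brute-force helper below are illustrative additions, not part of the answer), the recurrence can be compared against an exhaustive search over every sorted target of the form 0^k 1^(n-k):

```python
def min_toggles(sequence):
    # z: cost of zeroing every bit seen so far; x: min cost of "sorting" them
    z = x = 0
    for bit in sequence:
        if bit == 0:
            x = min(z, x + 1)  # bit joins the 0-prefix, or is toggled into the 1-suffix
        else:
            z += 1             # zeroing this 1 costs one toggle
    return x

def brute_force(sequence):
    # Minimum Hamming distance to any target 0^k 1^(n-k), 0 <= k <= n.
    n = len(sequence)
    return min(
        sum(b != (0 if i < k else 1) for i, b in enumerate(sequence))
        for k in range(n + 1)
    )

for seq in ([1, 0, 1, 1, 0], [0, 0, 1, 1], [1, 1, 0, 0]):
    assert min_toggles(seq) == brute_force(seq)
```

For example, `[1, 0, 1, 1, 0]` costs 2: toggle the bit at index 1 and the bit at index 4 to reach `11111` (toggling indices 0 and 2 to reach `00110`-style targets is no cheaper).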
CodePudding user response:
One canonical dynamic program would evaluate, in O(1) time and space per bit, 4 states: the cost of keeping the bit the same or toggling it, while assigning it as the rightmost or leftmost element of the relevant section (also incurring the cost of the toggles implied by that assignment).
Sorry, the above applies to the general problem - where "sorted" could be in either direction.
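For the either-direction variant, one simple alternative to the 4-state DP sketched above (this reduction is my own illustration, not the answer's method) is to observe that a descending-sorted result (all 1's before all 0's) is just an ascending-sorted result of the reversed sequence, so the same prefix DP can be run twice:

```python
def min_toggles_ascending(sequence):
    # z: cost of zeroing every bit seen so far; x: min cost of 0s-then-1s
    z = x = 0
    for bit in sequence:
        if bit == 0:
            x = min(z, x + 1)
        else:
            z += 1
    return x

def min_toggles_either_direction(sequence):
    # 1s-then-0s on the original is 0s-then-1s on the reversal,
    # so take the cheaper of the two orientations.
    return min(min_toggles_ascending(sequence),
               min_toggles_ascending(sequence[::-1]))
```

For example, `[1, 0, 0]` already descends, so the either-direction cost is 0 even though the ascending cost is 1.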