I keep getting this question wrong. Counting Bits using javascript

Time:06-24

This is the question. Given an integer n, return an array ans of length n + 1 such that for each i (0 <= i <= n), ans[i] is the number of 1's in the binary representation of i.

https://leetcode.com/problems/counting-bits/

And this is my solution below. If the input is 2, expected output should be [0,1,1] but I keep getting [0,2,2]. Why is that???

var countBits = function(n) {
    //n=3. [0,1,2,3]
    var arr = [0];
    for (var i = 1; i <= n; i++) {
        var sum = 0;
        var value = i;
        while (value != 0) {
            sum += value % 2;
            value /= 2;
        }
        arr.push(sum);
    }
    return arr;
};

console.log(countBits(3));

CodePudding user response:

You're doing way too much work.

Suppose b is the largest power of 2 not exceeding i, i.e. the value of the highest set bit of i. Evidently, i has exactly one more 1 in its binary representation than i - b does. But since you're generating the counts in order, you've already worked out how many 1s there are in i - b.

The only trick is how to figure out what b is. And to do that, we use another iterative technique: as you list numbers, b changes exactly at the moment that i becomes twice the previous value of b:

const countBits = function(n) {
    let arr = [0], bit = 1;
    for (let i = 1; i <= n; i++) {
        if (i == bit + bit) bit += bit;
        arr.push(arr[i - bit] + 1);
    }
    return arr;
};

console.log(countBits(20));

This technique is usually called "dynamic programming". In essence, it takes a recursive definition and computes it bottom-up: instead of starting at the desired argument and recursing down to the base case, it starts at the base and then computes each intermediate value which will be needed until it reaches the target. In this case, all intermediate values are needed, saving us from having to think about how to compute only the minimum number of intermediate values necessary.
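For comparison, the recursive definition that the loop above evaluates bottom-up could be sketched like this (the name `countBitsRecursive` is just for illustration):

```javascript
// Top-down form of the same recurrence:
// count(0) = 0; count(i) = count(i - b) + 1,
// where b is the largest power of 2 not exceeding i.
function countBitsRecursive(i) {
    if (i === 0) return 0;
    let b = 1;
    while (b * 2 <= i) b *= 2; // find the highest set bit of i
    return countBitsRecursive(i - b) + 1;
}

console.log(countBitsRecursive(20)); // 20 = 10100 in binary -> 2
```

The bottom-up loop avoids the repeated descent to the base case by storing every intermediate count as it goes.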

CodePudding user response:

use floor():

sum += Math.floor(value % 2);
value = Math.floor(value / 2);

I guess your algorithm works in a typed language where integer division results in an integer; in JavaScript, / always produces a floating-point result, so value never hits 0 cleanly and the fractional remainders inflate your sum.
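Putting that fix into the original function, a corrected sketch:

```javascript
var countBits = function(n) {
    var arr = [0];
    for (var i = 1; i <= n; i++) {
        var sum = 0;
        var value = i;
        while (value != 0) {
            sum += value % 2;               // low bit of value
            value = Math.floor(value / 2);  // integer division: drop the fraction
        }
        arr.push(sum);
    }
    return arr;
};

console.log(countBits(2)); // [0, 1, 1]
```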

CodePudding user response:

Think of it this way: if you know how many ones there are in a number X, then you immediately know how many there are in X*2 (the same) and in X*2 + 1 (one more). Since you're processing numbers in order, you can just push both derived counts to the result and move on to the next number:

let b = [0, 1]

for (let i = 1; i <= N / 2; i++) {
    b.push(b[i])
    b.push(b[i] + 1)
}

Since we push two numbers at a time, the result will be one element too long for even N; you have to pop the last number afterwards.
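Wrapped into a complete function, including that pop, the sketch becomes:

```javascript
const countBits = function(N) {
    const b = [0, 1];
    for (let i = 1; i <= N / 2; i++) {
        b.push(b[i]);     // count for 2*i is the same as for i
        b.push(b[i] + 1); // count for 2*i + 1 is one more
    }
    if (N % 2 === 0) b.pop(); // even N: we pushed one element too many
    return b;
};

console.log(countBits(5)); // [0, 1, 1, 2, 1, 2]
```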
