why this code isn't converting decimals to binary from decimals 0 to 31

Time: 08-16

So I was trying to convert a decimal number to binary using C. Everything in this code seems to be working: it does work for decimals of 32 and above (32, 33, and so on). But it doesn't work for decimals from 0 to 31. What's the bug here?

#include <stdio.h>
#include <math.h>

int main(void)
{

    int decimal;

    printf("Enter the decimal value: ");
    scanf("%i", &decimal);
    int n, remainder;
    int i, j, k;

    for (int i = 0; i < decimal; i++)
    {
        if (pow(2, i) <= decimal)
        {
            n = i;
        }
    }

    char index[n];
    int quotient[n];

    quotient[0] = decimal;

    for (i = 0; i <= n; i++)
    {
        quotient[i + 1] = quotient[i] / 2;
        remainder = quotient[i] % 2;

        if (remainder == 1)
        {
            index[i] = '1';
        }
        else
        {
            index[i] = '0';
        }
    }

    for (int k = n; k >= 0; k--)
    {
        printf("%c", index[k]);
    }

    return 0;
}

CodePudding user response:

The number that scanf() stores is already a bit-pattern (binary) representation of the base-10 value you enter.

You've got the right idea (trying to 'pick apart' those bits), but the method used is dubious and confusing.

Below is a loop of some familiar values being converted to strings of 1's and 0's.

Consider what it is doing at each step...

int main() {
    for( int i = 253; i <= 258; i++ ) {
        printf( "Decimal %d: ", i );

        unsigned int bitmask = 0;
        bitmask = ~bitmask;
        bitmask &= ~(bitmask >> 1); // High bitmask ready

        // skip over leading 0's (optional)
        while( bitmask && (bitmask & i) == 0 ) bitmask >>= 1;

        // loop using bitmask to output 1/0, then shift mask
        do {
            putchar( (bitmask & i) ? '1' : '0' );
        } while( (bitmask >>= 1) != 0 );

        putchar( '\n' );
    }
    return 0;
}

Output:

Decimal 253: 11111101
Decimal 254: 11111110
Decimal 255: 11111111
Decimal 256: 100000000
Decimal 257: 100000001
Decimal 258: 100000010

CodePudding user response:

I don't like this, but you seem to want to use pow() to find the highest set bit in the incoming integer. Perhaps this will lead you to the solution you are looking for.

int main() {

    int n = 0;

    // generate a series of numbers as input
    for( int decimal = 1; decimal < 1000*1000*1000; decimal = decimal * 2 + 1 ) {

        // Limited to 32-bit integers INCLUDING the sign bit
        for (int i = 0; i < 31; i++) {

            int guess = (int)pow( 2, i ); // make a guess with this value
            printf( "Guess %d\n", guess );

            if( guess > decimal ) { // guess now encompasses set bits
                n = i; // Found what is needed.
                break;
            }
        }
        printf( "decimal input = %d: examining %d bits\n", decimal, n );
        getchar();
    }


    return 0;
}

Here's one sample of the output of the above.

Guess 1
Guess 2
Guess 4
Guess 8
Guess 16
Guess 32
Guess 64
decimal input = 63: examining 6 bits

Note: this sort of thing will only work for positive values. It'll probably blow up if you want the bit pattern of a negative integer.

And, because you want to store an array of quotients, you need to dimension it with one extra element (n + 1) to avoid stepping out of bounds as you peel off each bit through division...
