I'm sure the answer is simple, but I can't get my head around it, so I hope you can help me :)
I'm trying to print the bits of an unsigned char, here with the value 2. The issue is that I get the wrong answer when casting from argv[1][0] and the correct one when passing the number 2 directly.
#include <unistd.h>

void print_bits(unsigned char octet)
{
    int i;
    unsigned char b;

    i = 8;
    while (i--)
    {
        b = (octet >> i & 1) + '0'; /* convert the bit to the character '0' or '1' */
        write(1, &b, 1);
    }
}
int main(int argc, char **argv)
{
    print_bits((unsigned char)argv[1][0]);
    write(1, "\n", 1);
    print_bits(2);
    write(1, "\n", 1);
    return (0);
}
gives me:
00110010
00000010
I noticed while debugging that, if I print the octet variable at the start of print_bits, I get the ASCII value of '2' when casting from argv, and the value "\002" when passing 2 directly.
It feels like a casting issue?
Thank you!
CodePudding user response:
chars are just integers. When you read '2' from argv, you get the character '2' (which is equal to 50 in ASCII). In your second case, you pass the int 2, hence the two different outputs. Your function is correct, and so is the output: 00110010 is 50 in binary.
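If what you actually want is the numeric value of the digit character from argv, you can subtract '0' from it, since the digit characters '0' through '9' are contiguous in ASCII. A minimal sketch:

    #include <assert.h>

    int main(void)
    {
        char c = '2';               /* the character '2', value 50 in ASCII */
        unsigned char n = c - '0';  /* numeric value: 50 - 48 = 2 */

        assert(c == 50);
        assert(n == 2);
        return (0);
    }

So calling print_bits((unsigned char)(argv[1][0] - '0')) would print 00000010 for an input of "2". For multi-digit inputs you would need atoi or strtol instead.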