Representing a hexadecimal value by converting it to char


I am outputting the value 0x11a1 by converting it to char. Then I multiply 0x11a1 by itself and output it again, but I do not get what I expect: by assigning the result with int hgvvv = chch0; and writing it to the console, I can see that the computer thinks 0x11a1 * 0x11a1 equals 51009, but it actually equals 20367169.

As a result I do not get what I want.

Could you please explain why?

char chch0 = (char)0x11a1;
Console.WriteLine(chch0);          // prints the character U+11A1
chch0 = (char)(chch0 * chch0);     // square it, then cast back to char
Console.WriteLine(chch0);
int hgvvv = chch0;
Console.WriteLine(hgvvv);          // prints 51009, not the expected 20367169

CodePudding user response:

We know that 1 byte is 8 bits, and that a char in C# is 2 bytes, which is 16 bits.
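You can verify both sizes directly (a minimal sketch):

Console.WriteLine(sizeof(char));        // 2 (bytes)
Console.WriteLine((int)char.MaxValue);  // 65535, i.e. 0xFFFF, the largest value a char can hold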

If we multiply 0x11a1 by 0x11a1 we get 0x136c741.

0x136c741 in binary is 0001 0011 0110 1100 0111 0100 0001.

Since we only have 16 bits, we only see the last 16 bits, which are: 1100 0111 0100 0001.

1100 0111 0100 0001 in hex is 0xc741.

That is the 51009 you are seeing.
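You can watch the truncation happen by keeping the product in an int first (a minimal sketch; the variable names are just for illustration):

int full = 0x11a1 * 0x11a1;        // done in 32-bit int arithmetic, so the product fits
Console.WriteLine(full);           // 20367169 (0x136c741)
char truncated = (char)full;       // the cast back to char keeps only the low 16 bits
Console.WriteLine((int)truncated); // 51009 (0xc741)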

You are being limited by the size of the char type in C#. Hope this answer clears things up!

CodePudding user response:

By enabling overflow checking for the whole project (the CheckForOverflowUnderflow build property) or by adding a checked block in your code:

checked {
    char chch0 = (char)0x11a1;
    Console.WriteLine(chch0);
    chch0 = (char)(chch0 * chch0); // OverflowException
    Console.WriteLine(chch0);
    int hgvvv = chch0;
    Console.WriteLine(hgvvv);
}

You will see that you get an OverflowException, because the char type (2 bytes in size) is only able to store values up to Char.MaxValue = 0xFFFF.
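If you would rather handle the overflow than let it crash, the checked block can be wrapped in a try/catch (a minimal sketch; note the operands must be variables, since casting an out-of-range constant would already be a compile-time error):

char a = (char)0x11a1;
try
{
    checked
    {
        char product = (char)(a * a); // the checked cast back to char throws here
    }
}
catch (OverflowException ex)
{
    Console.WriteLine(ex.Message);    // e.g. "Arithmetic operation resulted in an overflow."
}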

The value you expect (20367169) is larger than 0xFFFF, so you only get the two least significant bytes the type was able to store. Which is:

Console.WriteLine(20367169 & 0xFFFF);
// prints: 51009
Tags: c#