I am ANDing two characters and assigning the result to the first character: 'a' and '\u0889'.
'a' in binary: 00000000 01100001
'\u0889' in binary: 00001000 10001001
So the result of & should be 00000000 00000001.
Why does it output the ☺ emoji?
char comp58 = 'a';
char comp96 ='\u0889';
Console.WriteLine(comp96&=comp58);
CodePudding user response:
That is because when you AND two chars with the compound assignment operator (&=), the result is converted back to a char, so Console.WriteLine treats it as a character. The result is U+0001 (Start of Heading), a control character that the Windows console traditionally renders as a smiley face ☺.
So what you are doing is:
char comp58 = 'a';
char comp96 ='\u0889';
comp96 &= comp58;
Console.WriteLine(comp96); // comp96 is still a char, so it is printed as a character; code point 1 renders as ☺ on many consoles.
If you want the output as an integer, cast the result to int:
Console.WriteLine((int)(comp96&=comp58));
or
char comp58 = 'a';
char comp96 ='\u0889';
int andResult = comp96 & comp58;
Console.WriteLine(andResult);
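Putting both answers together, a minimal complete sketch (variable names taken from the question) that shows the char result next to the integer result:

```csharp
using System;

class Program
{
    static void Main()
    {
        char comp58 = 'a';        // U+0061, binary 00000000 01100001
        char comp96 = '\u0889';   // binary 00001000 10001001

        comp96 &= comp58;         // compound assignment converts the int result back to char
        Console.WriteLine(comp96);        // prints the control character U+0001 (often rendered as ☺)
        Console.WriteLine((int)comp96);   // prints 1

        int andResult = 'a' & '\u0889';   // chars promote to int, so this result is already an int
        Console.WriteLine(andResult);     // prints 1
    }
}
```

Note that `comp96 &= comp58` compiles even though `char & char` yields an int, because C# compound assignment implicitly narrows the result back to the left-hand operand's type.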
CodePudding user response:
Your code produces exactly the result you should expect. Internally, C# characters are UTF-16, so:
'a' (written as Unicode '\u0061') is 00000000 01100001 in binary (97 in decimal).
'\u0889' is 00001000 10001001 in binary (2185 in decimal).
A bitwise AND therefore yields 00000000 00000001, or 1 in decimal.
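The bit arithmetic above can be checked directly with C# binary literals (the literal values mirror the binary representations from the answer):

```csharp
using System;

class AndDemo
{
    static void Main()
    {
        int a    = 0b0000_0000_0110_0001; // 'a'      = 97
        int ch   = 0b0000_1000_1000_1001; // '\u0889' = 2185
        int both = a & ch;                // only the lowest bit is set in both operands

        Console.WriteLine(both);                                        // 1
        Console.WriteLine(Convert.ToString(both, 2).PadLeft(16, '0'));  // 0000000000000001
    }
}
```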
Then you write this Unicode character to a non-Unicode console. On my system, for example, the console uses CP850 (see https://en.wikipedia.org/wiki/Code_page_850). That code page shares its first half with CP437, which you can look up here: https://en.wikipedia.org/wiki/Code_page_437#Characters, and the character with code 1 is the smiley face.
You could use Console.OutputEncoding = System.Text.Encoding.UTF8;
to change your console encoding to UTF-8, or switch to any other encoding you want to use.
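For completeness, a small sketch of switching the console to UTF-8 before writing; note that U+0001 is a control character, so even under UTF-8 it may not render visibly; the ☺ rendering is specific to legacy code pages like CP437:

```csharp
using System;
using System.Text;

class EncodingDemo
{
    static void Main()
    {
        // Switch the console output encoding so Unicode characters are emitted as UTF-8
        Console.OutputEncoding = Encoding.UTF8;
        Console.WriteLine('\u0889'); // encoded as UTF-8 instead of the legacy code page
    }
}
```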