char is a type that has one byte in C, and it can be used as signed or unsigned, which changes the range of values it can hold. I'm new to using the debugger in Visual Studio and to reading about memory. I'm using the following code:
int main() {
signed char a = 170;
signed char* b = &a;
}
The range of the a variable should be -128 to 127, so the value is converted and -86 is stored. But when I inspect the value of the b variable, which is the address of a, to see what is there, I get:
0x0019F99B aa cc cc cc cc 6d f3 ea 54 c4 f9 19 00 13 1f a0 00 01 00 00 00 60 78 76 00 d0 b5 76 00 01 00 00 00 60 78 76 00 d0 b5 76 00 20 fa 19 00 67 1d a0 00 e9 f0 ea 54 23 10 a0 00 23 10 a0 00 00 60 ªÌÌÌÌmóêTÄù.... .....`xv.еv.....`xv.еv. ú..g. .éðêT#. .#. ..
But aa in hexadecimal evaluates to 170. What happened?
CodePudding user response:
The 170 literal is an int, represented by 0x000000AA. When you convert that into a single-byte signed char, it simply truncates to the low byte, so you wind up with 0xAA, which happens to be -86 in two's-complement notation.
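
For illustration, here is a minimal sketch (assuming a typical two's-complement target like the one Visual Studio builds for; the out-of-range conversion is formally implementation-defined) that prints both views of the same byte:

#include <stdio.h>

int main(void) {
    signed char a = 170;      /* 170 does not fit in a signed char; on common
                                 two's-complement targets it is stored as -86 */
    signed char *b = &a;

    printf("a as signed char: %d\n", (int)a);                 /* typically -86 */
    printf("raw byte at *b  : 0x%02X\n", (unsigned char)*b);  /* 0xAA, i.e. 170 */
    printf("170 - 256       : %d\n", 170 - 256);              /* -86: same bit pattern */
    return 0;
}

The memory window shows the raw byte (aa), while a watch on a interprets that byte through the variable's signed type, which is why the same bit pattern appears as both 170 and -86.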