If I want to store an integer into a char type variable, which byte of the integer will be stored?


int a = 0x11223344;

char b = (char)a;

I am new to programming and learning C. Why do I get the value of b here as "D"?

CodePudding user response:

If I want to store an integer into a char type variable, which byte of the integer will be stored?

This is not fully defined by the C standard.

In the particular situation where you tried it, what likely happened is that the low eight bits of 0x11223344 were stored in b, producing 0x44 (68 decimal) in b, and printing that prints "D" because your system uses ASCII character codes, and 68 is the ASCII code for "D".
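Here is a minimal complete program (assuming an eight-bit char and an ASCII system, as above) that reproduces what you saw:

#include <stdio.h>

int main(void)
{
    int a = 0x11223344;
    char b = (char)a;

    /* On a typical implementation, b holds the low byte 0x44 (68),
       which %c prints as the ASCII character 'D'. */
    printf("b as character: %c\n", b);
    printf("b as integer:   %d\n", (int)b);
    return 0;
}

Printing b with %d alongside %c makes it easy to see that the "D" is just the numeric value 68 interpreted as a character.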

However, you should be wary of relying on code like this, because the result is contingent on several things, and variations are possible.

First, the C standard allows char to be signed or unsigned. It also allows char to be any width that is eight bits or greater. In most C implementations today, it is eight bits.
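You can query both properties for your own implementation through <limits.h>; a small sketch:

#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* CHAR_BIT is the width of char in bits; CHAR_MIN is negative
       exactly when plain char is signed. */
    printf("char is %d bits wide\n", CHAR_BIT);
    printf("char is %s\n", CHAR_MIN < 0 ? "signed" : "unsigned");
    return 0;
}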

Second, the conversion from int to char depends on whether char is signed or unsigned and may not be fully defined by the C standard.

If char is unsigned, then the conversion is defined to wrap modulo M + 1, where M is the largest value representable in char. Effectively, this is the same as taking the low byte of the value. If char is unsigned and has eight bits, its M is 255, so M + 1 is 256.
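For example, with an eight-bit unsigned char (the common case), the wrap works like this:

#include <stdio.h>

int main(void)
{
    /* 0x11223344 mod 256 = 0x44 = 68: only the low byte survives. */
    unsigned char u = (unsigned char)0x11223344;
    printf("%u\n", (unsigned)u);  /* prints 68 */

    /* 300 mod 256 = 44, so 300 wraps to 44. */
    unsigned char v = (unsigned char)300;
    printf("%u\n", (unsigned)v);  /* prints 44 */
    return 0;
}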

If char is signed and the value is out of range of the char type, the conversion is implementation-defined: It may either trap or produce an implementation-defined value. Your C implementation may wrap conversions to signed integer types similarly to how it wraps conversions to unsigned types, but another reasonable behavior is to “clamp” out-of-range values to the limits of the type, CHAR_MIN and CHAR_MAX. For example, converting −8000 to char could yield the minimum, −128, while converting 0x11223344 to char could yield the maximum, 127.
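As a sketch of what this can look like in practice (assuming a common two's-complement implementation that wraps rather than clamps; the standard guarantees neither):

#include <stdio.h>

int main(void)
{
    /* Implementation-defined: many compilers keep the low byte,
       so this commonly prints 68, but a clamping implementation
       could print 127 instead. */
    signed char s = (signed char)0x11223344;
    printf("%d\n", (int)s);

    /* -8000 has low byte 0xC0, which reads as -64 in two's
       complement; a clamping implementation could yield -128. */
    signed char t = (signed char)-8000;
    printf("%d\n", (int)t);
    return 0;
}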

Third, the C standard does not require implementations to use ASCII. It is very common to use ASCII. (Usually, the character encoding is not just ASCII, because ASCII covers only values from 0 to 127. C implementations often use some extension beyond ASCII for values from 128 to 255.)
