#include <stdio.h>
int main()
{
int i = 577;
printf("%c",i);
return 0;
}
After compiling, it's giving the output "A". Can anyone explain how I'm getting this?
CodePudding user response:
`%c` will only accept values up to 255 inclusive; beyond that it starts from 0 again!
577 % 256 = 65; // (char code for 'A')
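A minimal sketch of that wraparound (assuming the common 8-bit `unsigned char`, i.e. `UCHAR_MAX == 255`):

#include <stdio.h>

int main(void)
{
    int i = 577;
    printf("%c\n", i);        // prints 'A'
    printf("%c\n", i % 256);  // 577 % 256 == 65, also prints 'A'
    printf("%d\n", i % 256);  // prints 65
    return 0;
}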
CodePudding user response:
This has to do with how the value is converted.
The `%c` format specifier expects an `int` argument and then converts it to type `unsigned char`. The character for the resulting `unsigned char` is then written.
Section 7.21.6.1p8 of the C standard, regarding format specifiers for `printf`, states the following regarding `c`:
If no `l` length modifier is present, the `int` argument is converted to an `unsigned char`, and the resulting character is written.
When converting a value to a smaller unsigned type, what effectively happens is that the higher-order bytes are truncated and the lower-order bytes hold the resulting value.
Section 6.3.1.3p2 regarding integer conversions states:
Otherwise, if the new type is unsigned, the value is converted by repeatedly adding or subtracting one more than the maximum value that can be represented in the new type until the value is in the range of the new type.
When two's complement representation is used, this is the same as truncating the high-order bytes.
For the `int` value 577, whose value in hexadecimal is 0x241, the low-order byte is 0x41, or decimal 65. In ASCII this code is the character `A`, which is what is printed.
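The same conversion can be made explicit with a cast (a small demonstration, assuming an 8-bit `unsigned char`):

#include <stdio.h>

int main(void)
{
    int i = 577;                         // 0x241
    unsigned char c = (unsigned char)i;  // high-order bits discarded: 0x41 == 65
    printf("%d '%c'\n", c, c);           // prints: 65 'A'
    return 0;
}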
CodePudding user response:
Just output the value of the variable `i` in hexadecimal representation:
#include <stdio.h>
int main( void )
{
int i = 577;
printf( "i = %#x\n", i );
}
The program output will be
i = 0x241
So the least significant byte contains the hexadecimal value 0x41, which represents the ASCII code of the letter `'A'`.
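Masking off everything but that byte shows it directly (a quick check, assuming 8-bit bytes):

#include <stdio.h>

int main( void )
{
    int i = 577;
    printf( "low byte = %#x, as a character: %c\n", i & 0xFF, i & 0xFF );
}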
CodePudding user response:
577 in hex is 0x241. The ASCII representation of `'A'` is 0x41. You're passing an `int` to `printf` but then telling `printf` to treat it as a `char` (because of `%c`). A `char` is one byte wide, so `printf` looks at the first argument you gave it and reads the least significant byte, which is 0x41.
To print an integer, you need to use `%d` or `%i`.
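Side by side (a trivial check of the two specifiers):

#include <stdio.h>

int main(void)
{
    int i = 577;
    printf("%c\n", i);  // low byte treated as a character: A
    printf("%d\n", i);  // the whole integer: 577
    return 0;
}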
CodePudding user response:
How does printing 577 with %c output "A"?
With `printf()`, `"%c"` matches an `int` argument.*1 The `int` value is converted to an `unsigned char` value of 65, and the corresponding character*2, `'A'`, is then printed.
It makes no difference whether a `char` is signed or unsigned, or whether it is encoded with two's complement or not. There is no undefined behavior (UB). It makes no difference how the argument is passed: on the stack, in a register, or .... The endianness of `int` is irrelevant. The argument value is converted to an `unsigned char` and the corresponding character is printed.
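For example, a negative argument is reduced by the same well-defined rule (a sketch, assuming `UCHAR_MAX == 255`):

#include <stdio.h>

int main(void)
{
    // -191 + 256 == 65, so this also prints 'A' (no UB involved)
    printf("%c\n", -191);
    return 0;
}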
*1 All `int` values are allowed: `[INT_MIN...INT_MAX]`.
When a `char` value is passed as a `...` (variadic) argument, it is first converted to an `int` and then passed.
char ch = 'A';
printf("%c", ch); // ch is converted to an `int` and passed to printf().
*2 65 is the ASCII code for `A`; ASCII is the ubiquitous encoding of characters. Rarely, other encodings are used.
CodePudding user response:
It doesn't have to output `A`. It happens to do so in your case since `%c` makes it take one byte (from the stack, most likely), and it found the part of an `int` which happened to represent an `A`.