How could an array defined as int store string values? Just look at the code: arri[] is defined as an int array but is storing string values. And the array defined as char is storing integer values. How is this possible?
#include <stdio.h>

int main(void) {
    int arri[] = {'1', '2', 'a'};
    int *ptri = arri;
    char arrc[] = {11, 21, 31};
    char *ptrc = arrc;
    printf("%d", *arri);
    printf("%d", *ptri);
    printf("%d", *arrc);
    printf("%d", *ptrc);
    return 0;
}
CodePudding user response:
How could the array defined as int store the string values?
There are no strings in the code snippets you provided.
In this declaration
int arri[] = {'1' , '2', 'a'};
the initializers are character constants, which in C have the type int. They are used to initialize the elements of the array and are stored internally as their character codes. For example, in the ASCII character table, '1', '2', and 'a' have the values 49, 50, and 97 respectively.
Here is a demonstrative program
#include <stdio.h>

int main(void)
{
    int arri[] = {'1', '2', 'a'};
    const size_t N = sizeof(arri) / sizeof(*arri);

    for (size_t i = 0; i < N; ++i)
    {
        printf("'%c' = %d ", arri[i], arri[i]);
    }
    putchar('\n');

    return 0;
}
The program output is
'1' = 49 '2' = 50 'a' = 97
When the conversion specifier %c is used, printf outputs the values as (graphical) symbols.

Pay attention to the fact that when the conversion specifier %d is used to output an object of type char, the integer promotions are performed first, converting the object of type char to an expression of type int.
In this declaration
char arrc[] = {11, 21 , 31 };
the integer constants have values that fit into the range of values that can be stored in an object of type char.

In both cases there is no truncation or overflow.
CodePudding user response:
The first thing to make clear is that you don't actually store a character like 'a' anywhere inside the computer. You actually store a number; for 'a' that number is decimal 97. The computer itself has no knowledge of this being an 'a'. The computer only sees it as a number. It's only when you send that number to a device expecting characters (e.g. a terminal, a printer, etc.) that some device driver turns the number into a display of the character 'a'.

See https://en.wikipedia.org/wiki/ASCII for a description of the mapping between characters and numbers.
The C standard allows you to use characters just as if they were numbers. The compiler automatically converts the character to the corresponding number. Therefore
int x = 'a';
is exactly the same as
int x = 97;
and your line
int arri[] = {'1' , '2', 'a'};
is the same as
int arri[] = {49 , 50, 97};
As already mentioned, the type char just stores numbers, exactly like the type int. The difference is only the range of numbers that can be stored. Typically a char is 1 byte of memory and an int is 4 bytes (but this is system dependent).
So this code
char arrc[] = {11, 21 , 31 };
simply stores those 3 decimal numbers. Typically using 1 byte for each number.
The interesting part is this line:
printf("%d" , *arrc );
Here *arrc is the number 11 stored in 1 byte (typically). So how can it be printed using %d, which expects an int?
The answer is "default argument promotions". For variadic functions (like printf) this means that integer types "smaller" than int are converted to int before the function call. Note that char is considered an integer type, so this also applies to char.
So in your case the number 11 stored in a char (1 byte) will automatically be converted to the number 11 stored in an int (typically 4 bytes). Consequently, the printf function will receive an int and will be able to print it as such.
CodePudding user response:
In the statement
int foo = 3.14159;
the double value is automatically converted to int (to 3) [implicit conversion]. There is nothing that prohibits the conversion, so the assignment ("of a double to an int") is ok.
Same thing with your example
char foo = 65;              // the int value is implicitly converted to type char
char bar[] = { 66, 67, 0 }; // 3 conversions, ok
char baz = 20210914;        // questionable conversion from int to char
                            // the compiler will probably warn you:
                            // 20210914 is beyond the range of char, so the
                            // result is implementation-defined
Note that in

int a = 'b';

the 'b' above is already a value of int type, so there really is no conversion.
char b = 'c'; // implicit conversion from int to char ok
int c = b; // implicit conversion from char to int ok