After a long time, I was doing some experiments on arrays with this program, printing the output in decimal using %lu.
The confusing part I observed is that when I cast to unsigned long, the array address '&thing + 1' increments by just 1:
140733866717248 140733866717248
140733866717249 140733866717249
When I don't cast, the array addresses are:
140720750924480 140720750924480
140720750924481 140720750924488
Why does the address in the first part increment by just 1 when 1 is added to '&thing', the address of an int array?
How does the cast affect the values here?
Example program:
#include <stdio.h>

int main(void)
#if 0 /* (unsigned long)&thing     --> 140733866717248
         (unsigned long)&thing + 1 --> 140733866717249 */
{
    int thing[8];
    printf("%lu %lu\n", (unsigned long)thing, (unsigned long)&thing);
    printf("%lu %lu\n", (unsigned long)thing + 1, (unsigned long)&thing + 1);
    return 0;
}
#endif
#if 1 /* &thing     --> 140720750924480
         &thing + 1 --> 140720750924488 */
{
    int thing[8];
    printf("%lu %lu\n", thing, &thing);
    printf("%lu %lu\n", thing + 1, &thing + 1);
    return 0;
}
#endif
CodePudding user response:
In the first example, you are adding 1 to unsigned long values, so that just adds 1.
In the second example, you are adding 1 to a pointer, which increases the pointer value by the size of the pointed-to type. With thing + 1, thing is an int *, so the value increases by sizeof(int) (4). With &thing + 1, &thing is an int (*)[8], so the value increases by the size of the whole array (32).
Result from running the code you posted:
140733007047872 140733007047872
140733007047876 140733007047904