What is the difference between sizeof(int)*10 and sizeof(40)?


A

int *numptr = malloc(sizeof(int)*10);

B

int *numptr = malloc(sizeof(40));

This is on a 32-bit system.

I can't understand what the difference is; there is no information about it in the book I have.

Are A and B 100% the same thing?

CodePudding user response:

You're allocating a different amount of space in each case.

For case A, you first have sizeof(int). Presumably an int is 4 bytes on your system, so this expression evaluates to 4. So malloc(sizeof(int)*10) allocates space for 4 * 10 = 40 bytes.

For case B, you have sizeof(40). This is giving you the size of the constant 40 whose type is int, so sizeof(40) is 4. This then means that malloc(sizeof(40)) is allocating space for 4 bytes.
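
Here's a minimal sketch you can compile to see both values on your own machine (the printed numbers depend on your platform; sizeof(int) is commonly 4):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int *a = malloc(sizeof(int) * 10);  /* case A: room for 10 ints, e.g. 40 bytes */
    int *b = malloc(sizeof(40));        /* case B: size of one int constant, e.g. 4 bytes */

    printf("sizeof(int)      = %zu\n", sizeof(int));
    printf("sizeof(int) * 10 = %zu\n", sizeof(int) * 10);
    printf("sizeof(40)       = %zu\n", sizeof(40));

    free(a);
    free(b);
    return 0;
}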

CodePudding user response:

An int isn’t guaranteed to be 4 bytes wide. It’s only guaranteed to represent values in the range [-32767..32767], so it’s only guaranteed to be at least 16 bits (2 bytes) wide.

Yes, it’s 4 bytes on most modern desktop platforms, but it doesn’t have to be.

Besides, 10 * sizeof (int) more clearly conveys that you’re allocating space for 10 int objects.
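
For example (the sizeof *numptr form isn't mentioned above; it's just a widely used idiom that keeps the element size tied to the pointer's type):

#include <stdlib.h>

int main(void)
{
    /* Count first, element size second: clearly "10 ints". */
    int *numptr = malloc(10 * sizeof(int));

    /* Alternative idiom: sizeof *other automatically follows the
       pointed-to type if the declaration of other ever changes. */
    int *other = malloc(10 * sizeof *other);

    free(numptr);
    free(other);
    return 0;
}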

CodePudding user response:

40 is an integer, so sizeof(40) should return the same thing as sizeof(int). Thus, sizeof(int) * 10 is the size of 10 integers, but sizeof(40) is the size of a single integer.
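
If you want to convince yourself, here is a compile-time check (this sketch assumes a C11 compiler for _Static_assert):

#include <stdlib.h>

int main(void)
{
    /* An unsuffixed decimal constant such as 40 has type int,
       so the two sizeof expressions are equal by definition. */
    _Static_assert(sizeof(40) == sizeof(int), "40 has type int");

    int *ten_ints = malloc(sizeof(int) * 10); /* 10 * sizeof(int) bytes */
    int *one_int  = malloc(sizeof(40));       /* only sizeof(int) bytes */

    free(ten_ints);
    free(one_int);
    return 0;
}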
