Why casting short* to int* shows incorrect value


To better learn how malloc and pointers work internally, I created an array of short. On my system, int is twice the size of short, so I created another pointer q of type int* and set it to the cast value of p:

#include <stdio.h>
#include <stdlib.h>
#include <assert.h>

int main() {
    short* p = (short*) malloc(2 * sizeof(short));
    int* q = (int*) p;
    
    assert(sizeof *q == 2 * sizeof *p);
    
    p[0] = 0;
    p[1] = 1;
    
    printf("%u\n", *q);
}

When I print *q it shows 65536 instead of 1, and I can't figure out why. If I understand correctly, the memory at p should be laid out as follows (assuming short is 2 bytes and int is 4 bytes):

        p[0]                  p[1]
0000 0000 0000 0000 | 0000 0000 0000 0001

So *q should read all 4 bytes, hence the value 1. Instead it shows 65536, whose representation is:

0000 0000 0000 0001 0000 0000 0000 0000

CodePudding user response:

Most systems these days use little-endian byte ordering, which means that the least significant byte comes first in memory.

So the two bytes starting at p[1] contain 0x01 0x00, not 0x00 0x01. This also means the four bytes starting at p[0] are 0x00 0x00 0x01 0x00. If these bytes are interpreted as a 4-byte little-endian int, the value is 0x00010000, i.e. 65536 in decimal.
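
You can verify this layout by examining the buffer through an unsigned char pointer, which the aliasing rules always allow. A minimal sketch, reusing the setup from the question:

#include <stdio.h>
#include <stdlib.h>

int main(void) {
    short* p = (short*) malloc(2 * sizeof(short));
    p[0] = 0;
    p[1] = 1;

    /* char pointers may alias any object, so this is well defined */
    unsigned char* bytes = (unsigned char*) p;
    for (size_t i = 0; i < 2 * sizeof(short); i++)
        printf("%02x ", (unsigned) bytes[i]);
    printf("\n");   /* prints "00 00 01 00" on a little-endian machine */

    free(p);
}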

Also, reinterpreting bytes in this fashion (taking a pointer to one type, casting it to a pointer of another type, and dereferencing it) is a strict-aliasing violation and undefined behavior, so there is no guarantee it will always behave this way.
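
If you do want to reinterpret the two shorts as one int, the well-defined way in C is to copy the bytes with memcpy instead of dereferencing a cast pointer. A sketch, assuming (as the question's assert checks) that int is exactly twice the size of short:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void) {
    short* p = (short*) malloc(2 * sizeof(short));
    p[0] = 0;
    p[1] = 1;

    int value;
    /* memcpy avoids the aliasing violation; the byte order is unchanged */
    memcpy(&value, p, sizeof value);
    printf("%d\n", value);   /* still prints 65536 on little-endian, but with no UB */

    free(p);
}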

CodePudding user response:

This is due to endianness (https://en.wikipedia.org/wiki/Endianness).

Endianness determines the order in which the bytes of a value are stored in memory. If you reverse the byte order in your diagram, you get exactly the representation you showed for 65536.

You seem to be on a little-endian machine.
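
You can confirm the byte order of your own machine by inspecting the first byte of a known value; a small sketch:

#include <stdio.h>

int main(void) {
    int one = 1;
    /* on a little-endian machine the least significant byte is stored first */
    unsigned char first = *(unsigned char*) &one;
    printf("%s-endian\n", first == 1 ? "little" : "big");
}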
