What is the endian order of the host machine/device?


Actually, the initial question was: "What exactly is kCGBitmapByteOrderDefault?" Sometimes CGImageGetByteOrderInfo(CGImageRef image) returns kCGBitmapByteOrderDefault as a result, but that only tells us it is equal to the endian order of the host machine/device.

Then the question automatically changes to: "What is the endian order of the host machine/device?"

The Apple docs list several constants, one of which should be the exact answer:

kCGImageByteOrder16Big
kCGImageByteOrder16Little
kCGImageByteOrder32Big
kCGImageByteOrder32Little

So I need a solution that returns one of the values from the list above.

CodePudding user response:

You said:

Sometimes CGImageGetByteOrderInfo(CGImageRef image) returns kCGBitmapByteOrderDefault as a result, but that only tells us it is equal to the endian order of the host machine/device.

It is not the endian order of the machine. It is the default endianness of CoreGraphics. See the Apple Forums thread "What does kCGBitmapByteOrderDefault actually mean?". They point out that, while it is not documented, kCGBitmapByteOrderDefault appears to use big-endian order.

One can confirm this by creating one context with kCGBitmapByteOrderDefault and another with kCGBitmapByteOrder32Big, and comparing how they render a particular color. On my little-endian machine, the results with kCGBitmapByteOrderDefault were the same as those with kCGBitmapByteOrder32Big, not kCGBitmapByteOrder32Little. This is consistent with the claims made in that forum discussion.

FWIW, PNG and JPG use “network byte order” (i.e., big-endian). Also, there is an intuitive appeal to have, for example, the bytes for an RGBA image appear in the order of red, green, blue, and then alpha, rather than the other way around. If CoreGraphics wanted to pick a standard used regardless of hardware, big-endian is not an unreasonable choice. I just wish they documented it.


That having been said, when I need to access or manipulate a pixel buffer for an image, I always render the image to a buffer with an explicit endianness, as outlined in Apple's Technical Q&A QA1509. Personally, though, I let CoreGraphics manage the data provider for me, rather than using the manual malloc/free suggested by that old Technical Q&A doc. See this permutation of QA1509's code.
