So I'm currently looking into endianness, and I noticed that we read and write binary as well as hex from right to left. Why is that? It goes against every fiber in my body, since I'm so used to writing numbers left to right in the normal decimal system.
For instance, `01 5E` is 350 (0x015E = 1×256 + 5×16 + 14 = 350). Is that 350 big endian or little endian?
This `Console.WriteLine(BitConverter.IsLittleEndian);` prints `True`, meaning that my current CPU architecture stores numbers in little-endian byte order.
Doing this

```csharp
int x = 350;
Console.WriteLine(x.ToString("X"));
```

prints out `15E`,
and this is where I get confused: over at https://www.scadacore.com/tools/programming-calculators/online-hex-converter/, after inputting `015E`, the bottom-left corner says it's 350 as big endian. So why is my application printing out `15E` if my CPU uses little endian?
CodePudding user response:
Endianness is not usually directly observable; it is an implementation detail right up until you do things like blitting between primitive types and raw memory. Your use of `ToString("X")` does not do this: it follows integer rules, and integer rules hide endianness details and use the human-friendly convention, i.e. big-endian. So basically: this is a non-issue in your case. If you were using `unsafe` code, `BitConverter`, or similar, then it might matter.
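
To make that contrast concrete, here is a minimal sketch (assuming a little-endian machine, as your `IsLittleEndian` check reported) showing the same value viewed both ways:

```csharp
using System;

class EndianDemo
{
    static void Main()
    {
        int x = 350;

        // Formatting the value as hex follows integer rules: the
        // human-friendly, big-endian-looking order, on any CPU.
        Console.WriteLine(x.ToString("X"));               // 15E

        // BitConverter exposes the raw in-memory bytes, so here the
        // CPU's byte order finally becomes observable.
        byte[] bytes = BitConverter.GetBytes(x);
        Console.WriteLine(BitConverter.ToString(bytes));  // 5E-01-00-00 on little-endian
    }
}
```

On a big-endian machine the second line would print `00-00-01-5E` instead, while the first line would be unchanged.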
CodePudding user response:
Internally, on your little-endian CPU, 350 is stored as 5E 01 00 00, but you're printing the int value expressed in hex, which is a human-readable operation, and that prints 15E. If you converted the int to a byte array, or saved the int to disk and dumped the bytes as hex, you'd see 5E 01 00 00. Endianness is really an intimate, internal CPU-architecture detail.
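
If you do need a fixed byte order when writing bytes out (say, for a file format or network protocol), one sketch is to choose the endianness explicitly rather than relying on the CPU. This assumes modern .NET: `System.Buffers.Binary.BinaryPrimitives` needs .NET Core 2.1+, and `Convert.ToHexString` needs .NET 5+.

```csharp
using System;
using System.Buffers.Binary;

class FixedOrderDemo
{
    static void Main()
    {
        Span<byte> buffer = stackalloc byte[4];

        // Write 350 in an explicitly chosen byte order, so the serialized
        // form no longer depends on the machine running the code.
        BinaryPrimitives.WriteInt32BigEndian(buffer, 350);
        Console.WriteLine(Convert.ToHexString(buffer));    // 0000015E

        BinaryPrimitives.WriteInt32LittleEndian(buffer, 350);
        Console.WriteLine(Convert.ToHexString(buffer));    // 5E010000
    }
}
```

Either way, the output is deterministic across architectures, which is exactly why explicit-endianness APIs like this exist.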