
I've read descriptions of big and little endian online. However, they all seem to read basically the same way, and I'm still confused about the actual implementation with regard to "most" and "least" significant bytes. I understand that little endian stores the "least significant" bytes first, and big endian stores the "most significant" bytes first. However, I'm unclear on what "most" and "least" significant actually mean. I think it would help me to understand if I work through an actual example, which I will put forth here:

I have an integer value: 12345

If I convert it to hex using the Windows calculator, I get the value 3039 (basically a two-byte value). Does the value 3039 show the bytes of the integer 12345 stored in little or big endian order, and how can I determine this from the value?

David Segonds
GregH

1 Answer


Endianness refers to how numbers are stored in memory. It has nothing to do with the order in which bytes are evaluated. If memory addresses increase left to right across this page, then on a big-endian machine your number would be stored as

30 39

and on a little-endian machine

39 30

Your calculator is always going to display numbers as we read them, which is the big-endian way, even though numbers are stored in little-endian fashion on the Intel hardware you're probably using.

Kyle Jones