Endianness, “Most Significant”, and “Least Significant”

Submitted by 北城余情 on 2019-12-10 13:14:55

Question


I've read descriptions online describing big and little endian. However, they all seem to basically read the same way, and I am still confused about the actual implementation regarding "most" and "least" significant bytes. I understand that little endian values store the "least significant" bytes first and that under big endian the "most significant" bytes come first. However, I'm unclear as to the meaning of "most" and "least" significant. I think it would help me to understand if I use an actual example, which I will put forth here:

I have an integer value: 12345

If I convert it to a hex value using the Windows calculator, I get a value of: 3039 (basically a two byte value). Is the value 3039 showing the bytes representing the integer value 12345 stored as a little or big endian value and how do I determine this based on the value?


Answer 1:


Endianness refers to how numbers are stored in memory. It has nothing to do with the order in which bytes are evaluated. If memory addresses increase left to right across this page, then on a big-endian machine your number would be stored

30 39

and on a little-endian machine

39 30

Your calculator is always going to display numbers as we read them, which is the big-endian way, even though numbers are stored in little-endian fashion on the Intel hardware you're probably using.
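The storage difference described above can be sketched with Python's standard-library `struct` module (this example is not part of the original answer; the format codes `>H` and `<H` select big- and little-endian packing of an unsigned 16-bit value):

```python
import struct

# 12345 decimal is 0x3039: most significant byte 0x30, least significant 0x39.
n = 12345

# '>' = big-endian, '<' = little-endian; 'H' = unsigned 16-bit integer.
big_endian = struct.pack('>H', n)     # byte layout on a big-endian machine
little_endian = struct.pack('<H', n)  # byte layout on a little-endian machine

print(big_endian.hex())     # 3039 -- most significant byte at the lowest address
print(little_endian.hex())  # 3930 -- least significant byte at the lowest address
```

Either way, the value is still 12345; only the order of the bytes in memory differs.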



来源:https://stackoverflow.com/questions/8830917/endianness-most-significant-and-least-significant
