Quote:
Originally Posted by Chris_M
Is anyone able to explain why I am getting different results?
Code:
$ hexdump /bin/bash | head -n1
0000000 457f 464c 0101 0001 0000 0000 0000 0000
$ hexdump -d /bin/bash|head -n1
0000000 17791 17996 00257 00001 00000 00000 00000 00000
$ hexdump -C /bin/bash | head
00000000 7f 45 4c 46 01 01 01 00 00 00 00 00 00 00 00 00 |.ELF............|
Without options, hexdump groups the input into two-byte (16-bit) words and prints each one in the host's native byte order. So the first value is the number 0x457f, which hexdump -d prints as its decimal equivalent, 17791.
In the file itself, however, byte 7f is at offset 0 and byte 45 is at offset 1. The two appear swapped in the output because Intel CPUs (and the ARM CPU in my tiny server, as I just confirmed) are little-endian: multi-byte numbers are stored least-significant byte first.
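You can reproduce the swap without an ELF file at all. This sketch (the /tmp/le-demo name is just a scratch file of my choosing) writes the same two bytes that start the header, 7f then 45, and dumps them; on a big-endian machine the first dump would read 7f45 instead:

```shell
# Write byte 0x7f at offset 0 and byte 0x45 at offset 1,
# exactly as they sit in /bin/bash.
printf '\x7f\x45' > /tmp/le-demo

# Default output: one 16-bit word, assembled low-byte-first
# on a little-endian machine, so it reads 457f.
hexdump /tmp/le-demo

# Same word in decimal: 17791.
hexdump -d /tmp/le-demo

rm /tmp/le-demo
```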
hexdump -C shows each byte in file order: first 7f, then 45, and so on. That is the view you want for strings and, in general, for one-byte values.
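Running the same two bytes through -C makes the difference obvious: no words are assembled, so the output is byte-for-byte identical on any CPU, and the ASCII column on the right shows 0x45 as the letter E:

```shell
# -C prints each byte at its actual offset, plus an ASCII column.
# 0x7f is non-printable (shown as '.'), 0x45 is 'E'.
printf '\x7f\x45' | hexdump -C
```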
Quote:
Coming from big-endian, I find the x86 little endian format arcane beyond words.
Finally somebody who shares my pain. It was much more intuitive when UNIX ran on big-endian 68000 processors.
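Incidentally, the ELF header itself records which byte order the file uses: the byte at offset 5 (the EI_DATA field) is 01 for little-endian and 02 for big-endian, and you can see the 01 in all three dumps above. A quick way to pull out just that byte, assuming your hexdump supports the -s (skip) and -n (length) options as the util-linux and BSD versions do:

```shell
# Skip 5 bytes, read 1 byte: the EI_DATA field of the ELF header.
# 01 = little-endian file, 02 = big-endian.
hexdump -s 5 -n 1 -C /bin/bash
```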