To address your first question, what does the following code do?
Code:
int i = 1;
char *p = (char *) &i;
The first line declares an integer and assigns the value 1 to it.
The second line says: give me a pointer to a character, and point it at the address of the integer declared in line one. It can be broken down as follows:
char *p is the definition of the character pointer
&i is the address of the integer i
(char *) is a type cast from integer pointer to character pointer
Your second question identifies why this may be necessary:
An integer consists of at least two bytes (16 bits). Different processor architectures store those bytes in different orders, known as big endian and little endian; this indicates the order in which the bytes are stored internally. As a programmer you don't usually need to worry about it, because the compiler always reads and writes an integer consistently on any one machine. Big endian stores the most significant byte in the left-most (lowest-addressed) position, whilst little endian stores the least significant byte in the left-most position.
So where does the character pointer come in? It provides a useful overlay on the storage of an integer:
Code:
Big endian      00000000 00000001   most significant byte stored on the left
Little endian   00000001 00000000   least significant byte stored on the left
                |______| |______|
char pointer      [0]      [1]
By treating the character pointer as an array of characters, the first element looks at the lowest-addressed (left-most in the diagram) byte of the integer, and it is then possible to determine whether the integer is stored internally as a big endian or a little endian number.
So after all that, why might we want to know how a number is stored? I said above that as a programmer you don't need to worry about it; well, I lied a little.
It is important when we connect computers on a network. Because they may be running on different processors, the way they store numbers internally (such as an IP address or a port number) may differ, so it is important to ensure that on the network they all 'speak' the same language. This agreed format is referred to as network byte order, and it is big endian.
As an example, take a big endian machine (such as a SPARC or older PowerPC box) and a little endian machine (such as an x86 PC, whether it runs Windows or Linux). If the two-byte number 1 is sent from one machine to the other without any regard for byte order, the receiving machine will not see the number 1; because the byte order is switched from what it expects, it will see the number 256.
Sorry for being so long, but I hope that helps.
graeme