How does a computer know it's a char and not a number?
The computer is a pretty dumb machine. Its only virtue is that it can do lots (and when we say "lots" we really mean it) of very simple operations with numbers in very short amounts of time.
All the chips know about are numbers (well, not even that, but anyway...). It's up to the applications, programming languages, interpreters and, ultimately, ourselves, to give one or another meaning to the results that the machine shows in one or another form.
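To see that in practice, here's a minimal C sketch (plain standard C, nothing system-specific assumed). The very same byte is printed as a character, a decimal number and a hex number; the only thing that changes the "meaning" is the format we ask printf for.

Code:
#include <stdio.h>

int main(void)
{
    char c = 'A';   /* the byte stored in memory is just the number 65 */

    /* Same bits every time - the machine doesn't care, only the
       conversion we request changes what we see on the screen. */
    printf("as a character:      %c\n", c);            /* A    */
    printf("as a decimal number: %d\n", c);            /* 65   */
    printf("as hexadecimal:      0x%x\n", (unsigned)c); /* 0x41 */

    return 0;
}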
A character is represented by an octet (8 bytes)
Hmmm... not as I remember it. That should be 8 bits. In ASCII code, the letter "A" is 01000001 (one byte) in binary or 101 in octal (I've still got an old DEC PDP-11 programming card lying about somewhere). But then, your computer has to know you're using ASCII rather than something else.
Other ASCII codes are here
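If you want to check those values yourself, a quick C sketch like this prints 'A' in the usual bases (the binary loop is hand-rolled, since printf has no binary conversion):

Code:
#include <stdio.h>

int main(void)
{
    unsigned char a = 'A';

    printf("'A' is %d decimal, %o octal, %X hex\n",
           a, (unsigned)a, (unsigned)a);   /* 65, 101, 41 */

    /* print the bits of the byte, most significant first */
    printf("'A' in binary: ");
    for (int bit = 7; bit >= 0; bit--)
        putchar((a >> bit) & 1 ? '1' : '0');
    putchar('\n');                          /* 01000001 */

    return 0;
}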
At the low level, it's 8 bits = 1 byte in standard ASCII on modern systems, aka an octet.
The reason for 'octet' is that other systems don't always use 8 bits.
Note that the word size is the basic default size of the sequence of bits a computer works with at once. It is usually a power of 2.
Over time we've had 4, 8, 16, 32 and now 64 bit words.
Note that old mainframe systems sometimes used other sizes, e.g. 36 (note that's a multiple of 2, although not an integer power).
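If you're curious what your own machine uses, a small standard-C program like this (just <stdio.h> and <limits.h>, nothing exotic assumed) reports the byte width and the sizes of the common integer types:

Code:
#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* CHAR_BIT is the number of bits per byte on this machine
       (8 on anything you are likely to run Linux on). */
    printf("bits per byte: %d\n", CHAR_BIT);

    /* sizeof reports sizes in bytes, which reflect the platform's
       native integer and pointer/word sizes */
    printf("sizeof(char):  %zu\n", sizeof(char));
    printf("sizeof(short): %zu\n", sizeof(short));
    printf("sizeof(int):   %zu\n", sizeof(int));
    printf("sizeof(long):  %zu\n", sizeof(long));
    printf("sizeof(void*): %zu\n", sizeof(void *));

    return 0;
}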
Incidentally, IBM mainframes use a different character encoding from ASCII, known as EBCDIC.
Word sizes vary a lot - historically there have been 1 bit computers (ALUs actually, used in old calculators), 12 bit (PDP-8 and PDP-12), 18 bit (PDP-15, and PDP-10 sort of, later 36 bit DecSystem 10). One odd use of the DecSystem 10 was support for variable byte sizes (1 - 36 bits). This was used occasionally for error recovery reading raw data from 9 track tapes (8 data bits + 1 parity). Errors could then be analyzed by software to determine how to correct data above and beyond what the hardware could do (I was doing this to recover data from broken backup tapes).
One of the more extreme general purpose computers was the IBM Stretch (http://en.wikipedia.org/wiki/IBM_7030_Stretch), which supported 1 to 8 bit bytes, 7 bit registers, up to 64 bit floating point, and even a 128 bit accumulator.