Programming This forum is for all programming questions.
The question does not have to be directly related to Linux and any language is fair game.

Old 07-06-2010, 03:27 PM   #31
ForzaItalia2006
Member
 
Registered: Dec 2009
Location: Walldorf, Germany
Distribution: (X)Ubuntu, Arch, Gentoo
Posts: 205

Rep: Reputation: 67

Quote:
Originally Posted by bluegospel View Post
Forza, please bear with me. I have not ignored any of the posts here. I need time to sort through them all, and to study their content. Most of them are over my head, as C++ is very new to me, and my programming experience in general is very limited (which should be obvious). Please, continue to be patient with me.
Sure, we'll stay patient :-) But it would be great to make some progress on this issue, because it is a very simple problem. Let's start again ...

The distinction between an integer and a string (char *) should be clear. A string (char *) is a contiguous block of memory containing character encodings, e.g. one ASCII character per byte, terminated by a NUL ('\0') byte.

Environment variables are generally intended to modify the behavior of your program without changing the source code or re-compiling it. For example, the environment variable LD_LIBRARY_PATH is used to instruct the dynamic loader where to search for dynamic libraries. Linux and other operating systems store environment variables as strings (char *). So, if you set the environment variable CONTENT_LENGTH to, say, 1024, you will NOT get the value 1024 but the string "1024" consisting of the ASCII characters '1', '0', '2', '4'. So if you do a getenv("CONTENT_LENGTH"), you get a pointer to that character sequence. For practical purposes, let's assume that getenv(3) returns the address 0xc0000000 (decimal: 3221225472).
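
For example (just a sketch, with the exported value made up for illustration): if you do "export CONTENT_LENGTH=1024" in the shell and then run this small program, you can see that getenv(3) hands back characters, not a number:

Code:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* getenv(3) returns a pointer to the string value, or NULL if the
       variable is not set at all */
    const char *val = getenv("CONTENT_LENGTH");
    if (val == NULL) {
        fprintf(stderr, "CONTENT_LENGTH is not set\n");
        return 1;
    }
    printf("string value: \"%s\"\n", val);    /* prints: "1024" */
    printf("atoi() value: %d\n", atoi(val));  /* prints: 1024   */
    return 0;
}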

So far, so good. No matter where this environment variable comes from, but what you want to do is to convert this string "1024" ('1','0','2','4') into the value 1024. I hope that this distinction is clear either, because otherwise we would need to start at the very beginning of computer science, and that definitely isn't the purpose of this forum.

So, based on this theory, let's get back to your code.

Code:
char* billschedContentLength = getenv("CONTENT_LENGTH");
The character pointer now points to the value of the environment variable "CONTENT_LENGTH", i.e. to the address 0xc0000000 from above. If you dereference this pointer, you get the value 49 (the ASCII code of '1'), assuming that the environment variable has the value "1024".
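
A short sketch of that (still assuming CONTENT_LENGTH is set to "1024"):

Code:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char *billschedContentLength = getenv("CONTENT_LENGTH");
    if (billschedContentLength == NULL)
        return 1;
    /* the first character of "1024" is '1', whose ASCII code is 49 */
    printf("*billschedContentLength = '%c' = %d\n",
           *billschedContentLength, *billschedContentLength);
    return 0;
}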

Code:
int nContentLength = atoi(billschedContentLength);
Here, you convert the string "1024" into the value 1024, which is good!!! :-)

Code:
billschedBuffer = malloc(billschedContentLength+1);
As you said, malloc(3) is a function to allocate memory from the heap. To allocate this memory, you need to tell malloc(3) the amount of memory to reserve; it expects this value to be of type 'size_t', which is 'unsigned long' on most platforms. Now, what you actually do here is pass the address (0xc0000000) to malloc, not the value 1024. Just to be clear, neither C nor C++ implicitly converts a string (char *) to a number. So in your case you would try to allocate a _very_ large amount of memory, although you only want to allocate 1024 bytes. So, finally, you need to change this last line of code to

Code:
billschedBuffer = malloc(nContentLength+1);
Clear?
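
Putting it all together, a minimal sketch of the corrected sequence could look like this (written as plain C and kept deliberately simple; in C++ the malloc() return value would additionally need a cast to char *):

Code:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char *billschedContentLength = getenv("CONTENT_LENGTH");
    if (billschedContentLength == NULL)
        return 1;                      /* variable not set */

    int nContentLength = atoi(billschedContentLength);  /* "1024" -> 1024 */

    char *billschedBuffer = malloc(nContentLength + 1); /* +1 for the '\0' */
    if (billschedBuffer == NULL)
        return 1;                      /* out of memory */

    /* CGI programs receive the POST body on stdin */
    size_t n = fread(billschedBuffer, 1, nContentLength, stdin);
    billschedBuffer[n] = '\0';         /* terminate so it can be used as a string */

    printf("read %zu bytes: %s\n", n, billschedBuffer);
    free(billschedBuffer);
    return 0;
}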

But, just a different question: you said that you got this template from somewhere? Could you give us the link? Because this template seems to be completely incorrect ...

Andi
 
Old 07-06-2010, 04:06 PM   #32
bluegospel
Member
 
Registered: Jan 2010
Distribution: centOS
Posts: 404

Original Poster
Rep: Reputation: 53
Thank you. Having studied some more tutorials coupled with the information provided in this thread, I'm getting a better understanding of variables, strings, arrays and pointers in C++. I've just never dealt with a language this low-level, directly working with memory and such. Thanks for your time & input.

Here's that link (the excerpt is halfway down): http://www.geekdaily.net/2007/08/06/...-cgi-tutorial/
 
Old 07-06-2010, 04:19 PM   #33
ForzaItalia2006
Member
 
Registered: Dec 2009
Location: Walldorf, Germany
Distribution: (X)Ubuntu, Arch, Gentoo
Posts: 205

Rep: Reputation: 67
Quote:
Originally Posted by bluegospel View Post
Thank you. Having studied some more tutorials coupled with the information provided in this thread, I'm getting a better understanding of variables, strings, arrays and pointers in C++. I've just never dealt with a language this low-level, directly working with memory and such. Thanks for your time & input.

Here's that link (the excerpt is halfway down): http://www.geekdaily.net/2007/08/06/...-cgi-tutorial/
Okay, thanks for providing this link. That explains where the env vars come from, and it also explains where your error comes from. BTW, you should have looked at the code attached to that tutorial, http://library.thinkquest.org/16728/data/cgi.cpp, where the coding is correct ...

Andi
 
Old 07-06-2010, 04:23 PM   #34
Sergei Steshenko
Senior Member
 
Registered: May 2005
Posts: 4,481

Rep: Reputation: 454
Quote:
Originally Posted by bluegospel View Post
Thank you. Having studied some more tutorials coupled with the information provided in this thread, I'm getting a better understanding of variables, strings, arrays and pointers in C++. I've just never dealt with a language this low-level, directly working with memory and such. Thanks for your time & input.

Here's that link (the excerpt is halfway down): http://www.geekdaily.net/2007/08/06/...-cgi-tutorial/
Don't trust anybody who teaches you - including us.

Verify everything in the official documentation (manpages, language standard).

Probably start from "C" - I think I've already suggested this.
 
Old 07-06-2010, 04:50 PM   #35
bluegospel
Member
 
Registered: Jan 2010
Distribution: centOS
Posts: 404

Original Poster
Rep: Reputation: 53
Code:
billschedBuffer = malloc(billschedContentLength+1);
I do have one question. Why do we add 1 to the length we obtain from the environment variable?
 
Old 07-06-2010, 05:12 PM   #36
MTK358
LQ 5k Club
 
Registered: Sep 2009
Posts: 6,443
Blog Entries: 3

Rep: Reputation: 723
Because strings in C must end with a 0 byte, which takes up one extra byte of memory.
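
A quick sketch of what that means in practice: the four characters of "1024" need five bytes once the terminating 0 is counted.

Code:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    const char *src = "1024";
    size_t len = strlen(src);        /* 4 -- the '\0' is not counted  */

    char *copy = malloc(len + 1);    /* +1 byte for the terminating 0 */
    if (copy == NULL)
        return 1;
    memcpy(copy, src, len + 1);      /* copies the '\0' as well       */

    printf("strlen = %zu, bytes needed = %zu\n", len, len + 1);
    free(copy);
    return 0;
}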
 
Old 07-07-2010, 07:33 AM   #37
bluegospel
Member
 
Registered: Jan 2010
Distribution: centOS
Posts: 404

Original Poster
Rep: Reputation: 53
I've been hung up on one point: I was going to ask, "how can an array of characters include an integer at its end?" But, having researched this, I'll instead ask whether the following was in error:

Quote:
A string is actually a pointer to the first character of an array of chars. The last character of a string is always 0 (not ASCII '0', but the integer 0. It's often specified as "'\0'" instead of "0" in this case.).
Should this read "The last character of a string is always 0 (not the integer 0, but ASCII '\0'?]"

Otherwise, I'm confused, since an array is supposed to be of one data type.

Last edited by bluegospel; 07-07-2010 at 07:37 AM.
 
Old 07-07-2010, 07:38 AM   #38
MTK358
LQ 5k Club
 
Registered: Sep 2009
Posts: 6,443
Blog Entries: 3

Rep: Reputation: 723
Quote:
Originally Posted by bluegospel View Post
Should this read "The last character of a string is always 0 (not the integer 0, but ASCII '0'?]"
WRONG WRONG WRONG

Quote:
Originally Posted by bluegospel View Post
Otherwise, I'm confused, since an array is supposed to be of one data type.
Characters and integers are the SAME DATA TYPE! char is actually just a very small integer type (because there aren't many ASCII characters, using int would be overkill and a serious waste of RAM), and it can be used to store small numbers!

http://en.wikipedia.org/wiki/Ascii

ASCII is just a specification of what integer corresponds to what picture printed on the screen. That's all. The CPU has no concept of ASCII, characters, or letters.

In fact, when the C compiler comes across a character in single quotes, it just looks up an ASCII table and pastes in the appropriate integer (this is a bit oversimplified, but you get the idea).

For example, the ASCII character '0' is actually just the integer value 48, and 'A' is 65.
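
A tiny sketch you can compile to see this for yourself:

Code:
#include <stdio.h>

int main(void)
{
    char c = '0';                    /* the character zero   */
    printf("'%c' = %d\n", c, c);     /* prints: '0' = 48     */
    printf("'%c' = %d\n", 'A', 'A'); /* prints: 'A' = 65     */

    /* a classic trick: turn a digit character into its numeric value */
    printf("'7' - '0' = %d\n", '7' - '0');  /* prints: '7' - '0' = 7  */
    return 0;
}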

Last edited by MTK358; 07-07-2010 at 07:42 AM.
 
Old 07-07-2010, 07:49 AM   #39
Sergei Steshenko
Senior Member
 
Registered: May 2005
Posts: 4,481

Rep: Reputation: 454
Quote:
Originally Posted by bluegospel View Post
I've been hung up at one point: I was going to ask, "how can an array of characters include an integer at its end?" But, having researched this, I'll rather ask if the following was in error:



Should this read "The last character of a string is always 0 (not the integer 0, but ASCII '\0'?]"

Otherwise, I'm confused, since an array is supposed to be of one data type.
If you have a null-terminated string, its last character is ASCII '\0'. Which for the majority of CPUs means a single byte with 0 value.
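
A small sketch to illustrate: the terminator really is stored in the array, and its numeric value is 0.

Code:
#include <stdio.h>
#include <string.h>

int main(void)
{
    char s[] = "abc";             /* occupies 4 bytes: 'a','b','c','\0' */
    printf("strlen    = %zu\n", strlen(s));  /* 3 */
    printf("sizeof    = %zu\n", sizeof(s));  /* 4 */
    printf("last byte = %d\n", s[3]);        /* 0, i.e. the same as '\0' */
    return 0;
}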

Last edited by Sergei Steshenko; 07-07-2010 at 08:00 AM.
 
1 member found this post helpful.
Old 07-07-2010, 07:50 AM   #40
bluegospel
Member
 
Registered: Jan 2010
Distribution: centOS
Posts: 404

Original Poster
Rep: Reputation: 53
Quote:
ASCII is just a specification of what integer corresponds to what picture printed on the screen. That's all. The CPU has no concept of ASCII, characters, or letters.
Okay, so 00000011 is fundamentally a 3 no matter what data type is specified. But the 3 is interpreted differently depending how it's used. It's not the 00000011 being interpreted by use, but the number it represents. So 00000011 does not mean one thing here, and another there. The 3 means one thing here, and another there, because 00000011 is fundamentally a 3.

Last edited by bluegospel; 07-07-2010 at 07:54 AM.
 
1 member found this post helpful.
Old 07-07-2010, 07:52 AM   #41
Sergei Steshenko
Senior Member
 
Registered: May 2005
Posts: 4,481

Rep: Reputation: 454
Quote:
Originally Posted by bluegospel View Post
Okay, so 00000011 is fundamentally a 3 no matter what data type is specified. But the 3 is interpreted differently depending how it's used. It's not the 00000011 being interpreted by use, but the number it represents. So 00000011 does not mean one thing here, and another there. The 3 means one thing here, and another there, because 00000011 is fundamentally a 3.
Well, but if you consider 10011, it may be 19 or, for example, 13 - the latter being its interpretation as BCD (Binary Coded Decimal).
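
A quick sketch of that, assuming one packed-BCD byte holding two decimal digits:

Code:
#include <stdio.h>

int main(void)
{
    unsigned char b = 0x13;   /* bit pattern 0001 0011 */

    /* read as plain binary */
    printf("plain binary: %u\n", (unsigned)b);                            /* 19 */

    /* read as packed BCD: high nibble = tens digit, low nibble = ones digit */
    printf("as BCD      : %u\n", (unsigned)((b >> 4) * 10 + (b & 0x0F))); /* 13 */
    return 0;
}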
 
Old 07-07-2010, 07:59 AM   #42
bluegospel
Member
 
Registered: Jan 2010
Distribution: centOS
Posts: 404

Original Poster
Rep: Reputation: 53
Quote:
If you have a null-terminated string, its last character is ASCII '\0'. Which for the majority of CPUs means a single byte with 0 value.
Thanks, Sergei. That makes it clearer.
 
Old 07-07-2010, 08:05 AM   #43
bluegospel
Member
 
Registered: Jan 2010
Distribution: centOS
Posts: 404

Original Poster
Rep: Reputation: 53
Quote:
Well, but if you consider 10011, it may be 19 or, for example, 13 - the latter is for BCD (Binary Coded Decimals).
Okay, thanks. What I'm really after at this point is the basic substance of an 8-bit byte: is it fundamentally a 19, which can be interpreted variously, or is it 10011, which can be interpreted variously?
 
Old 07-07-2010, 08:19 AM   #44
Sergei Steshenko
Senior Member
 
Registered: May 2005
Posts: 4,481

Rep: Reputation: 454
Quote:
Originally Posted by bluegospel View Post
Okay, thanks. What I'm really after at this point is the question of the basic substance of an 8-bit byte: is it fundamentally a 19, which could be interpreted variously, or is is it 10011, which can be interpreted variously?
Somebody somewhere decides how to represent numbers. Plain binary and BCD are just different decisions/representations. One rarely encounters BCD nowadays. IBM used to like BCD a lot.
 
Old 07-07-2010, 10:59 AM   #45
Wim Sturkenboom
Senior Member
 
Registered: Jan 2005
Location: Roodepoort, South Africa
Distribution: Ubuntu 12.04, Antix19.3
Posts: 3,794

Rep: Reputation: 282
The substance of an 8-bit byte is 8 bits. What its contents mean depends on the code that you write.
Examples:
If it contains the bit pattern 01101000, the code can use it as a number (104) or as a character ('h').
If it contains 11111111, the code can use it as a signed number (-1) or an unsigned number (255).
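
A small sketch of both examples:

Code:
#include <stdio.h>

int main(void)
{
    unsigned char byte = 0x68;              /* bit pattern 01101000 */
    printf("as a number   : %d\n", byte);   /* 104 */
    printf("as a character: %c\n", byte);   /* h   */

    char all_ones = (char)0xFF;             /* bit pattern 11111111 */
    printf("as signed     : %d\n", (signed char)all_ones);   /* -1 on typical two's-complement machines */
    printf("as unsigned   : %d\n", (unsigned char)all_ones); /* 255 */
    return 0;
}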
 
  

