Programming: This forum is for all programming questions.
The question does not have to be directly related to Linux and any language is fair game.
<antiroach> i wrote a program that recurses x number of times. when x is 25000 it core dumps. i ran a debugger and it said a stack overflow happened. is there any way i can increase stack space? or how else can i avoid running out of stack space?
<Sparr> antiroach: stop using the stack
<StoneCyph> antiroach: yes, you can increase stack space. how to prevent that problem? don't recurse to a depth of 5 digits.
<antiroach> how can i increase the stack space
<Kniht> antiroach: use an iterative solution instead
<antiroach> it has to be recursive
<antiroach> this algorithm finds a best move given a position.
<Kniht> no, it doesn't have to be
<antiroach> its like a game thing. yes it does. its part of the specs.
<Kniht> any recursive algorithm can be made iterative
<StoneCyph> antiroach: you're not listening. this is like saying "i built a car that goes fast. when i drive it into the wall at 500 miles per hour, it crumples. can i make it stronger?" "yes, but in general, stop driving into walls."
<[eloi]> StoneCyph: no, more like, "can i move the wall back a bit?"
Pretty similar to what you are asking... All in all, you can't solve that problem directly; you need to work around it, unless you intend to expand your computer's memory...
Actually, I think this is a valid question; maybe he does have a program that will use that amount of data. I myself have been playing with terraforming routines and physics engines for these worlds. The worlds themselves are mostly contained on the hard drive and swapped in as needed, but the "current state" data is huge too, and if I were ever to continue my work, it's not unreasonable to expect these calculations to use 1 GB of memory. Recursive calls kill the stack, yes they do, and yes, that's a wall you should avoid. But if you have 2 GB in your machine, or 512 MB and 4 GB of swap... why shouldn't you be able to do this: struct stateData currentState.. ?
Sorry, it is my fault. I should have told you I have 2 gigabytes of memory installed in my PC. And if I reduce the size of that array to 800 MB, I can run the program with 2 threads at the same time, which costs 1.5 GB of memory.
And I can also change the stack size to unlimited, but it still refuses to work.
Originally posted by Stack: Pretty similar to what you are asking... All in all, you can't solve that problem directly; you need to work around it, unless you intend to expand your computer's memory...
PS: some FORTRAN compilers used to have support for virtual arrays, i.e., arrays kept mostly on disk and paged into/out of memory as needed. I don't know if your FORTRAN compiler does that, but it's worth investigating.
Originally posted by nowonmai the above C example is C++
Whoops, sorry. I think it's pretty obvious what my language of choice is now...
I took the liberty of testing this a little. This code snippet (which is C++) shows what happens on my machine:

try {
    char *i = new char[1024 * 1024 * 1024]; // 1 GB
} catch (std::bad_alloc &) {
    std::cout << "Caught a bad alloc exception" << std::endl;
}

Without the try block, the program exits with a core dump (an uncaught std::bad_alloc terminates the program). Otherwise the exception line is printed.
One interesting thing to note is that I can easily allocate 300 MB three times, like this:

char *i = new char[300 * 1024 * 1024]; // 300 MB
char *j = new char[300 * 1024 * 1024];
char *k = new char[300 * 1024 * 1024];
The memory is never really allocated as physical memory until I start using it. Each request is checked to see whether it can be satisfied, but the allocator doesn't account for earlier allocations whose pages haven't been touched yet, so the total can exceed physical RAM.