C++ Segfault with Large Arrays
I am having a problem with a program I have written.

I previously had Gentoo installed and the program worked fine. Then my hard drive failed; I replaced it and reinstalled Gentoo. I had backed up my work, so the program is identical to what it was before. However, it now exits immediately with a segmentation fault.

The program is written in C++ and compiled with g++ using -Wall and -pedantic, and it still compiles cleanly. I have declared several large arrays of constant size (set by a #define), and the crash happens as soon as I declare my class in the main program. If I reduce the size of the arrays it works fine (but then without enough accuracy for what I need); as soon as they exceed a certain size I get the same segfault. I have not pinned down the exact threshold, but it is somewhere in the region of the largest array being double[1000][1000]. I have plenty of memory for this (1 GB RAM + 1 GB swap), and as I say, it worked before I reinstalled the operating system.

I would be *extremely* grateful if anyone can point me in the right direction on this one.

Guy
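[For reference, judging from the description, the failing pattern is presumably something like the sketch below. All names are hypothetical reconstructions, and the second array stands in for the "several large arrays" mentioned; a double[1000][1000] is 1000*1000*8 bytes, about 7.6 MB.]

#define GRID_SIZE 1000              /* hypothetical name for the #define'd constant */

class Solver {
    double grid[GRID_SIZE][GRID_SIZE];   /* ~7.6 MB */
    double prev[GRID_SIZE][GRID_SIZE];   /* ~7.6 MB more */
};

int main()
{
    Solver s;   /* ~15 MB of automatic storage: well past a typical 8 MB
                   stack limit, so the program dies right here */
    return 0;
}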
Variables with automatic storage live on the stack, and on most systems there is a limit on how large the stack can grow. So allocate your arrays dynamically if you want big arrays. This question has been asked a thousand times before; try Google before asking.
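A minimal sketch of the dynamic approach, assuming the array really is double[1000][1000] as described (GRID_SIZE is a hypothetical name): std::vector keeps its elements on the heap, so the stack limit no longer applies.

#include <vector>

#define GRID_SIZE 1000

int main()
{
    /* Element storage is allocated on the heap, so the ~8 MB of data
       is limited only by available memory, not by the stack size. */
    std::vector<std::vector<double> > grid(GRID_SIZE,
                                           std::vector<double>(GRID_SIZE, 0.0));
    grid[500][500] = 1.0;   /* indexed just like a built-in 2-D array */
    return 0;
}

Alternatively, new double[GRID_SIZE][GRID_SIZE] (remember to delete[] it) or making the object static/global also moves the data off the stack. On Linux you can inspect the current stack limit with "ulimit -s" (typically 8192 KB), and a different default limit is probably what changed between your two installs.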
OK, thanks, I'll look into that.

I did try googling for answers, but found nothing useful under "large array segmentation fault", just lots of posts about not indexing invalid array elements, which I already knew about.
I don't know whether you have already done so, but searching the comp.lang.c++ newsgroup is often very helpful when you have C++ problems.