Hi guys,
I have a custom scientific Fortran 77 program that requires a significant amount of memory. I am trying to enlarge one of its arrays so it can handle more data than before.
If I increase the array (matrix) size to the desired amount, the process is immediately killed by Linux. This happens even when run as root (sudo). If I decrease the array size to the point where the program will just run, ps aux shows that at run time the program has allocated 2.5 GB of memory. My system has 4 GB RAM + 8 GB swap space. I'm on a Core 2 Duo P8700 (2.53 GHz), but running 32-bit openSUSE 11.1 because 64-bit Linux still seems to have issues.
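To give a sense of the kind of declaration I'm changing (the names and dimensions below are made up; only the shape matches my code):

C     Illustrative sketch only -- real names and sizes differ.
C     An F77 array like this is allocated statically, so the
C     whole thing must fit in the process's address space the
C     moment the program starts.
      PROGRAM BIG
      INTEGER N
      PARAMETER (N = 18000)
C     18000 x 18000 double precision values is roughly 2.5 GB
      DOUBLE PRECISION A(N,N)
      A(1,1) = 1.0D0
      WRITE (*,*) 'started OK, A(1,1) =', A(1,1)
      END

Enlarging N past that point is what triggers the kill.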
"swapon -s" reveals that none of my swap space is being used.
I have even tried setting overcommit to "always", i.e. "sudo sysctl vm.overcommit_memory=1", and the behaviour is the same.
Why is my program getting killed when I increase the size of the array? Is there some intrinsic limit on the size of a Fortran 77 program? I have more than enough swap space for the program (several times over!). Why won't Linux swap something out and run my program?
Thanks