For a few days I've been trying to compile a program (segatex) that has five insane 25,000-line C++ files (which also use Qt) and can't seem to behave during compilation. I've already contacted the maintainer asking him/her to break up the source files. My question immediately pertains to this problem, but it also extends to "inconsiderate" projects in general.
I've tried to control resources with ulimit: I set nearly everything to a generous 256MB, but gcc exits with "out of virtual memory." Run without the limitations, it peaks at about 1.5GB of memory for a single 25k-line file, around 800MB of which is resident. Needless to say, I can barely run ls in another login at the same time. I've also tried a nice of 19 and compiling without -pipe, but to no avail.
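For reference, this is roughly what I've been doing (the 256MB value is the one I mentioned; the compile command itself is just illustrative, since the real build goes through the project's Makefile):

```shell
# Cap virtual memory at 256 MB for the compiler only, by setting the
# limit in a subshell so it doesn't stick to my login shell.
# Note: ulimit -v takes its argument in kilobytes.
(
  ulimit -v $((256 * 1024))
  ulimit -v    # confirm the cap: prints 262144
  # nice -n 19 g++ -c somefile.cpp   # illustrative compile under the cap
)
```

With the cap in place, gcc dies with the "out of virtual memory" error instead of slowing down and working within the limit, which is the behavior I was hoping for.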
More than actually getting this program to compile, I'd like to know whether one can force gcc to accept the limitations it's given and find a way to work within them. Thanks.
ta0kira