Program gets random crashes on Linux, big memory issue? Linux bug?
For example, I have some software that opens a file with a 32-bit file offset to write data to scratch files as it runs, and later opens it again to read the data back. The problem shows up when the program writes "big" data, meaning a single file larger than 2 GB: a signed 32-bit offset tops out at 2^31 - 1 bytes, which is just under 2 GiB. So don't assume that setting the stack size to unlimited is enough to fix this kind of crash; you need to scrutinize your program, understand what it's actually doing, and find where some numeric limit is being exceeded. A sketch of the large-file fix follows below.
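As a rough illustration (not the original program, just a minimal sketch assuming glibc on Linux, with a made-up scratch file name), the usual fix is to build with -D_FILE_OFFSET_BITS=64 so the standard fopen/fseeko/ftello calls use a 64-bit off_t and can go past 2 GB:

/* Minimal sketch of large-file-safe I/O on Linux (glibc).
 * Build with:  gcc -D_FILE_OFFSET_BITS=64 bigfile.c -o bigfile
 * With that define, off_t is 64 bits, so the offset does not overflow
 * at the 2^31 - 1 byte mark. File name and sizes are illustrative only.
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/types.h>

int main(void)
{
    FILE *fp = fopen("scratch.dat", "w+b");   /* hypothetical scratch file */
    if (fp == NULL) {
        perror("fopen");
        return EXIT_FAILURE;
    }

    char buf[1 << 20];                        /* 1 MiB chunk */
    memset(buf, 0, sizeof buf);

    /* Write 3 GiB in 1 MiB chunks, i.e. well past the old 2 GiB limit. */
    for (int i = 0; i < 3 * 1024; i++) {
        if (fwrite(buf, 1, sizeof buf, fp) != sizeof buf) {
            perror("fwrite");
            fclose(fp);
            return EXIT_FAILURE;
        }
    }

    /* ftello returns a 64-bit off_t here, so the position does not wrap. */
    off_t pos = ftello(fp);
    printf("current offset: %lld bytes\n", (long long)pos);

    fclose(fp);
    return EXIT_SUCCESS;
}

Without the define (or an explicit fopen64), the same loop would fail or misbehave once the offset crossed 2 GiB.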
Regarding the 32-bit file offset in my code: it should have used fopen64() (or been compiled with -D_FILE_OFFSET_BITS=64) so the offset is a 64-bit off_t rather than a signed 4-byte integer, which is what imposed my 2 GB file size limit. I found that cause of the crash and figured, OK, I'll just make sure I never write a single scratch file larger than 2 GB and split things up into separate scratch files. Then it still crashed, with no difference in the error messages, so I couldn't tell whether the 32-bit offset was the cause or not... until I realized I was now exceeding the "nofile" limit, which defaults to a fairly low value (1024 on many Linux systems) if it isn't raised in limits.conf. I was opening a couple hundred file pointers at once and hit the nofile (number of open files) limit on the system. So really consider all the items in limits.conf, and also understand the limits of the variable types you are using: a signed 16-bit short integer has a maximum of 32767 and a minimum of -32768, things like that. There's a sketch of checking the nofile limit after this paragraph.
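For the nofile part, here is a minimal sketch (my own illustration, with a made-up target of 512 descriptors) of how a program can inspect and, within the hard limit, raise the same per-process open-file limit that limits.conf and `ulimit -n` control:

/* Minimal sketch: check and raise RLIMIT_NOFILE from inside the program.
 * Going above the hard limit still requires /etc/security/limits.conf
 * (or root), so this only helps up to whatever the admin allowed.
 */
#include <stdio.h>
#include <errno.h>
#include <string.h>
#include <sys/resource.h>

int main(void)
{
    struct rlimit rl;

    if (getrlimit(RLIMIT_NOFILE, &rl) != 0) {
        perror("getrlimit");
        return 1;
    }
    printf("nofile soft limit: %llu, hard limit: %llu\n",
           (unsigned long long)rl.rlim_cur, (unsigned long long)rl.rlim_max);

    /* Ask for 512 descriptors (hypothetical number), capped at the hard limit. */
    rlim_t wanted = 512;
    if (wanted > rl.rlim_max)
        wanted = rl.rlim_max;
    rl.rlim_cur = wanted;

    if (setrlimit(RLIMIT_NOFILE, &rl) != 0) {
        fprintf(stderr, "setrlimit: %s\n", strerror(errno));
        return 1;
    }
    printf("nofile soft limit raised to %llu\n", (unsigned long long)rl.rlim_cur);

    /* When a later fopen()/open() fails with errno == EMFILE, the process
     * has hit this limit, which is the symptom you get with hundreds of
     * scratch files open at once. */
    return 0;
}

The giveaway in my case was that the crash happened only after many scratch files were open at the same time; checking for EMFILE from fopen() would have pointed straight at the nofile limit instead of looking like a random crash.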