Linux - Software
This forum is for Software issues. Having a problem installing a new program? Want to know which application is best for the job? Post your question in this forum.
Welcome to LinuxQuestions.org, a friendly and active Linux Community.
I am currently having difficulties with vixie-cron (vixie-cron-4.1-44.EL4) running on CentOS 4.3.
Here is the situation: I set up a bunch of cron jobs (100 or so in a normal user's crontab) that execute Perl scripts. About 20 jobs run every minute (roughly 100 over a 5-minute period).
Now here is the problem: I keep seeing a 'System error' message in the cron log files, and the Perl script fails to execute.
Code:
Aug 1 10:21:01 livestats crond[3585]: System error
If I run these scripts manually, I have no problems, and the load on the server is fine. There is no pattern, that I can see, to when I get these errors.
I was able to upgrade the kernel and vixie-cron to the latest versions available from the distro, but I still get these errors.
If anyone has a suggestion to give, I am all ears.
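One way to narrow this down (a sketch only; the script and log paths below are hypothetical, not taken from the original post) is to redirect each job's output to its own log file, so any error text survives even when cron itself only logs 'System error':

```shell
# Hypothetical crontab entries: capture each job's stdout/stderr in a per-job log
# (paths are assumptions; substitute the real script and log locations)
* * * * * /home/user/scripts/stats.pl >> /home/user/logs/stats.log 2>&1
*/5 * * * * /home/user/scripts/report.pl >> /home/user/logs/report.log 2>&1
```

If the 'System error' comes from something the scripts do, the stderr captured this way should show it; if the logs stay empty at failure time, that points back at cron itself.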
Have you edited the crontab file manually, or are you using the 'crontab -e' command? The crontab command checks syntax; perhaps there's an error in the file?
The 'System error' message in the cron log files appears (from what I can tell) at random times. It is never at the same time with the same scripts; it changes... different times, different scripts.
Perhaps the error is being reported by one of the scripts (or something they in turn run), and not by cron itself. Are there any root emails from cron? If so, that's the likely cause.
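Checking for those emails can be scripted; here is a minimal sketch, assuming the usual mbox-style spool (on CentOS 4 that is typically /var/spool/mail/root; the helper name is made up for illustration):

```shell
# Sketch: print cron-related lines from a mail spool file
# (spool path is an assumption; pass the real one, e.g. /var/spool/mail/root)
cron_mail_lines() {
    # cron's mailed output carries "Cron <user@host> command" in the Subject line
    grep -i 'cron' "$1"
}
# Usage (as root): cron_mail_lines /var/spool/mail/root
```

If those subjects name particular scripts, the body of each mail holds whatever the script printed to stderr before dying.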
I did think about that possibility. My scripts interface with MySQL and other daemons using Perl modules. If that is the case, then the daemons are throwing exceptions that cause the whole script to die.
I did look at the various configuration files and made some changes which seem to have helped (I think), but I still get some 'System Error' messages.
However, I now believe that these 'System Errors' may be caused by a lack of system resources (i.e. CPU availability).
Is it possible that cron (or another Linux process) comes along and decides to kill my script if there is not enough CPU available?
If you have resource limits set (e.g. ulimit), then that can certainly stop your program. If the system runs out of memory, the OOM (Out Of Memory killer in the kernel) could kill the process to save the system. If that happened, there will be a message in the system log (/var/log/messages).
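Both checks above can be done from the shell; this is a sketch, assuming the standard syslog location (/var/log/messages) and that cron jobs inherit the limits shown by the login shell:

```shell
# Sketch: inspect resource limits and scan a syslog file for OOM-killer activity
ulimit -a                       # show the limits this shell (and jobs it starts) inherit

oom_hits() {
    # count lines that look like OOM-killer messages in the given log file
    grep -ci 'out of memory' "$1"
}
# Usage (as root): oom_hits /var/log/messages
```

A nonzero count from the real log would confirm the OOM killer was involved; zero, together with a clean console, makes that explanation unlikely.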
I don't think memory is the issue. I see no errors in /var/log/messages, and when the OOM killer does kill a process, it reports which one it killed both on the console and in /var/log/messages. The 'System error' message only appears in /var/log/cron.
At the Linux (kernel) level, CPU time and memory size are set to unlimited. I was thinking more at the cron level: are there any limitations in vixie-cron itself that would cause cron to kill a process (i.e. a cron job)?
Not that I'm aware of. There are other system limits that can be exceeded besides CPU and memory. Most distributions set a default maximum on user processes (to contain a process that loops spawning children) and on open files (a loop opening temp files, for example). The result will be the same: the process will fail.
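With ~20 overlapping jobs per minute, those per-user limits are worth checking; here is a minimal sketch (the helper name is made up, and it assumes a procps-style ps):

```shell
# Sketch: compare a user's live process count against the per-user limits,
# since overlapping cron jobs can pile up if any of them hang
user_proc_count() {
    ps -u "$1" --no-headers | wc -l
}
ulimit -u    # maximum user processes for this shell/user
ulimit -n    # maximum open files per process
# Usage: user_proc_count someuser
```

If the process count creeps toward the `ulimit -u` value while the jobs are running, fork() in cron's child will start failing, which would look exactly like a sporadic 'System error'.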