LinuxQuestions.org (/questions/)
-   Linux - Server (https://www.linuxquestions.org/questions/linux-server-73/)
-   -   Server abruptly stops running high CPU usage Perl script after about 5 hours (https://www.linuxquestions.org/questions/linux-server-73/server-aburptly-stops-running-high-cpu-usage-perl-script-after-about-5-hours-554356/)

bpmee 05-16-2007 09:46 AM

Server abruptly stops running high CPU usage Perl script after about 5 hours
 
Hi All,

I am using Fedora Core 5, the Apache web server, and ISPConfig, with multi-threaded Perl.

I have a script that uses a small number (10) of Perl threads to go through each of my websites and edit various .php pages; the pages are selected with grep and "foreach"-ed into an array for processing.
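Roughly, the structure is something like this (the path and sub names are illustrative, not the actual script):

Code:

#!/usr/bin/perl
use strict;
use warnings;
use threads;

# Gather the .php pages to edit, much like the grep/foreach step described above.
my @pages = glob('/var/www/*/web/*.php');

# Deal the pages out across 10 threads, then wait for them all to finish.
my $nthreads = 10;
my @chunks;
push @{ $chunks[ $_ % $nthreads ] }, $pages[$_] for 0 .. $#pages;

my @workers = map { threads->create(\&edit_pages, @{ $chunks[$_] || [] }) }
              0 .. $nthreads - 1;
$_->join for @workers;

sub edit_pages {
    for my $file (@_) {
        # edit $file in place here
    }
}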

When I run the script from the shell, I see that it takes up about 105% CPU, but the server generally continues to function and the script keeps running.

Until... about 5 hours later it abruptly stops and disappears from my "top" processes.

I don't see any Perl error displayed on my shell screen...

Also, I know that it hasn't finished going through each website, because I built in a log function that writes progress to a .txt file. Further, I told Perl to write "SCRIPT COMPLETE" to the text file when finished, but this never happens!
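The logging is roughly along these lines (the file name and format are illustrative, not the actual code); each message is appended and the handle closed straight away, so every line should reach the file as it happens:

Code:

# Illustrative only -- not the actual logging code.
sub log_progress {
    my ($msg) = @_;
    open my $log, '>>', '/root/script_progress.txt' or die "log: $!";
    print {$log} scalar(localtime) . "  $msg\n";
    close $log;    # closing flushes the line to disk immediately
}

log_progress('finished site example.com');
log_progress('SCRIPT COMPLETE');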

Is it because my server's CPU is under a 105%+ load (on top of serving web requests) for 5 hours that the script stops?

How can I make this "unlimited" temporarily so that the script can run and finish the tasks I need?

Here is my ulimit -a output - everything appears to be max:

Code:

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
max nice                        (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 100000
max locked memory       (kbytes, -l) 32
max memory size         (kbytes, -m) unlimited
open files                      (-n) 2000
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
max rt priority                 (-r) 0
stack size              (kbytes, -s) 10240
cpu time               (seconds, -t) unlimited
max user processes              (-u) 100000
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited

Thanks for any help!

MensaWater 05-17-2007 07:01 PM

You might insert sleep statements between loop iterations. Since you didn't post the script it's not clear, but I've seen many a script eat a processor because it has no breathing room built in. What happens is that essentially the same resources are used by each run of a loop, and often the prior run hasn't finished with those resources before the next one starts (and the next one, and the next one, etc.).

So, say you had a simple shell script that did something like:

Code:

while true
do
    find /home/myname -print
done

Doing the find of /home/myname once might not hurt much, but you'd see the loop above fire off one find after another with no pause, and over time that would eat up your CPU.

If however you had something like:

Code:

while true
do
    find /home/myname -print
    sleep 30
done

It is very unlikely you'd see much impact on your CPU at all (assuming your home directory didn't have 10,000,000 files in it).

Doing the same loop with find on / instead of /home/myname would require a longer sleep, because each find run would take much longer to complete.

Similarly, if you're processing many files as you say, you're likely doing it via a loop, and you probably need to put in some sleep statements to allow each run of the loop to complete before the next one kicks off.
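In Perl, that boils down to something like this (the path is just an example):

Code:

my @files_to_edit = glob('/var/www/*/web/*.php');   # illustrative file list

for my $file (@files_to_edit) {
    # ... per-file editing work goes here ...
    sleep 1;    # breathing room for the webserver and other jobs
}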

bpmee 05-18-2007 12:24 AM

Thanks, and excellent reply!
 
Hi,

Thanks for your reply! That is very helpful.

Actually, I don't have anything in the script that uses "sleep" or otherwise gives the processor time to breathe.

I'm going to look at the script again and make sure all the Perl threads have time to complete before they run over each other and/or other parts of the script.
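Something along these lines, for example: start a batch of 10 threads, join them all so the batch is completely finished, then pause before the next batch (the path and the sub body are placeholders):

Code:

use strict;
use warnings;
use threads;

my @files_to_edit = glob('/var/www/*/web/*.php');   # illustrative file list

# Work through the list in batches of 10 threads, waiting for each
# batch to finish before starting the next one.
while (my @batch = splice(@files_to_edit, 0, 10)) {
    my @workers = map { threads->create(\&edit_page, $_) } @batch;
    $_->join for @workers;
    sleep 5;    # let the CPU settle before the next batch
}

sub edit_page {
    my ($file) = @_;
    # per-file editing work goes here
}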

Great advice!

MensaWater 05-18-2007 11:24 AM

It's not so much that the "system requires sleeps" but rather that infinite (or at least very long) loops essentially use pretty much the same resources on each run. Without a governor like sleep you end up with multiple runs trying to process concurrently. The system will tolerate a certain amount of this, but there is a point of diminishing returns. The sleep statement just tells it not to do that, so the resources are freed before the next run of the loop uses them.

Restated: the system is designed for multiple users and multiple processes, and it handles that well because typically the users and/or processes aren't all using exactly the same resources at the same moment. In a loop, however, the runs typically are trying to use the same resources at nearly the same moment.
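One way to keep a threaded Perl script from having every worker hammer the same resources at the same moment is a counting semaphore. This is only a sketch, using the standard Thread::Semaphore module (the path and the work inside the worker are placeholders):

Code:

use strict;
use warnings;
use threads;
use Thread::Semaphore;

my @files_to_edit = glob('/var/www/*/web/*.php');   # illustrative file list

# Allow at most 3 threads into the expensive section at any one time.
my $slots = Thread::Semaphore->new(3);

sub worker {
    my ($file) = @_;
    $slots->down;    # wait for a free slot
    # ... the expensive per-file work goes here ...
    $slots->up;      # release the slot for the next thread
}

my @workers = map { threads->create(\&worker, $_) } @files_to_edit;
$_->join for @workers;

For brevity this starts one thread per file; a fixed pool of workers would be lighter, but the idea is the same: cap how many threads are doing the heavy work at any one moment.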

