Old 03-17-2005, 08:46 AM   #1
cosmicperl
LQ Newbie
 
Registered: Aug 2003
Posts: 15

Rep: Reputation: 0
Too many processes, Pls Help!


Hi All,
I got woken up today by my phone ringing with clients going mad about my server being down. When I finally managed to get in via SSH I found there weren't any runaway processes as I suspected, but over 300 processes putting the load average up to over 120. CPU usage was less than 10% and there was plenty of memory free.


I need to limit the server to about 100 processes as it doesn't cope with many more. I did a few searches and came across info on limits.conf so I added the lines:-


* soft nproc 20
* hard nproc 30
root hard nproc 50
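A rough way to check whether those entries are actually in force (just a sketch; exact output varies by distro and shell) is to open a brand-new SSH session as an ordinary non-root user and ask the shell for its process limits:

ulimit -u     # soft "max user processes" -- should now report 20
ulimit -Hu    # hard limit -- should now report 30

If a fresh session still shows the old values, the limits.conf entries aren't being picked up at login.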

When I did a test to see if this had worked by bombarding my server with requests (an easier method of testing would be much appreciated), it seemed to make no difference at all.
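On the easier-testing front, if ApacheBench (the ab tool that ships with Apache) is installed, something like this will throw concurrent requests at the box without needing to hammer it by hand; the URL and numbers below are only placeholders:

ab -n 2000 -c 100 http://localhost/cgi-bin/somescript.cgi

where -n is the total number of requests and -c is how many are sent concurrently.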

Do I have to restart something for the changes to come into effect? I've restarted Apache and xinetd.
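From what I can tell, limits.conf is read by the pam_limits module when a PAM session is opened (i.e. at login), not by daemons you restart, so bouncing Apache or xinetd won't pick the new values up; processes started at boot never go through PAM at all. For the limits to apply to SSH logins there normally has to be a line like this in the relevant PAM file (whether that's /etc/pam.d/sshd or /etc/pam.d/login depends on the distro):

session    required    pam_limits.so

and even then only sessions started after the change get the new limits.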

Is there a way I can set a total process limit for the system regardless of user? If so, how do I do it, and how do I make the changes take effect after I've made them?
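For a genuinely system-wide ceiling (as opposed to the per-user nproc limits above), the kernel exposes a total task limit through /proc; roughly along these lines, though the right value is a judgement call and setting it too low can lock everyone out, so treat this as a sketch:

cat /proc/sys/kernel/threads-max         # current system-wide limit on tasks
echo 300 > /proc/sys/kernel/threads-max  # change it on the running kernel (as root)

To make it survive a reboot, put kernel.threads-max = 300 in /etc/sysctl.conf and load it with sysctl -p.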


Thanks very much in advance.


Just a few search strings I tried, to help people find this thread in the future:-

loading limits.conf changes
limiting the number of processes
combating dos attacks
 
Old 03-17-2005, 09:20 AM   #2
cosmicperl
LQ Newbie
 
Registered: Aug 2003
Posts: 15

Original Poster
Rep: Reputation: 0
Update:-

I've now updated my Apache httpd.conf to read:-
StartServers 8
MinSpareServers 5
MaxSpareServers 8
MaxClients 40
MaxRequestsPerChild 1000

Instead of:-
StartServers 8
MinSpareServers 5
MaxSpareServers 30
MaxClients 150
MaxRequestsPerChild 1000

That should help. I run a lot of scripts, so half the time an HTTP process also results in a Perl process.
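A quick way to confirm the new cap is actually biting is to count the Apache (and Perl) processes while the server is busy; the process name may be httpd or apache depending on how it was built, so adjust to taste:

ps -C httpd -o pid= | wc -l   # number of Apache processes running
ps -C perl -o pid= | wc -l    # number of Perl/CGI processes (if they show up under the perl name)

With MaxClients 40 the first number shouldn't climb much past 40.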


Any help on how I can reduce server load would be much appreciated. I'm puzzled as to why the CPU and memory usage isn't that high, but the load is huge and the server has slowed to a virtual halt. Still need help on my original questions.
 
Old 03-17-2005, 11:43 AM   #3
sigsegv
Senior Member
 
Registered: Nov 2004
Location: Third rock from the Sun
Distribution: NetBSD-2, FreeBSD-5.4, OpenBSD-3.[67], RHEL[34], OSX 10.4.1
Posts: 1,197

Rep: Reputation: 47
Quote:
Originally posted by cosmicperl
Any help on how I can reduce server load would be much appreciated. I'm puzzled as to why the CPU and memory usage isn't that high, but the load is huge and the server has slowed to a virtual halt. Still need help on my original questions.
You now know what escapes *many* *NIX users (including most of the ones that claim to be experts) -- load average has little to nothing to do with the percent of the CPU/memory being consumed ...

The load averages are a look at how badly the system is backed up, not necessarily how much CPU is being burnt. The Linux kernel looks at the process table every 5 seconds and counts how many tasks are a) running on or waiting for the CPU (TASK_RUNNING) and b) stuck in uninterruptible sleep, usually waiting on disk I/O (TASK_UNINTERRUPTIBLE). These are added together, and at the 1 minute mark the total is divided by the number of checks (12) to give you your 1 minute load average. A load average of 120 is *VERY* high: about 100 times what it should be on a single-CPU machine.
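If you want to see the raw numbers behind that, /proc/loadavg holds the three averages plus a running/total task count, and ps can tell you how many processes are currently runnable (state R) or stuck in uninterruptible sleep (state D, usually disk or NFS wait); a quick sketch:

cat /proc/loadavg               # e.g. 120.30 118.72 90.15 4/312 8755
ps -eo stat= | grep -c '^[RD]'  # tasks currently in state R or D

A pile of D-state processes combined with low CPU usage is the classic sign of everything waiting on I/O rather than burning CPU.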

What's funny about situations like this is that it's usually one of the clients ringing you that is causing the problem. All it takes to produce a load like yours is a single CGI script stuck in a loop that won't get off the processor, or something spawning threads like mad (though the threads will usually eat RAM like there's no tomorrow).

My suggestion is to find who's breaking the system and break their hands.
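Less violently, something like the following usually points at the culprit pretty quickly (column names can differ slightly between procps versions):

ps -eo user= | sort | uniq -c | sort -rn | head            # which user owns the most processes
ps -eo pid,user,stat,pcpu,etime,args --sort=-pcpu | head   # the biggest CPU consumers first

If one user or one CGI script owns 200 of your 300 processes, you've found the hands to break.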

As for limits.conf -- I'm fairly sure that the entries in there are a "per user" kind of thing. In other words -- each user can have 30 processes.

HTH

NOTE: The math in the above explanation isn't entirely accurate, but serves conversation much better than the reality ... See sched.h for more information
 
  

