Linux - Server
This forum is for the discussion of Linux software used in a server-related context.
When I run the command `ps aux | grep -i http | wc -l` under normal conditions, I get around 58 connections. When I run it while my server's CPU load average shoots to over 1000, I get 1003 connections. Yet when I ask my DC what's happening, they say my traffic is normal and I'm not under attack. I have CSF on one server and APF on the other.

When it happens, I watch the server load climb past 100 and my website gets slow; once the load reaches about 2000 the site crashes. It seems to die down when I restart my servers, and it happens twice per day. The box is a dual quad-core with 17 GB of RAM running 64-bit CentOS. Can you guys explain what's happening, or suggest a server management company to help?
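As an aside, the `grep` in a pipeline like that usually counts itself, inflating the number by one. A common way to avoid that (a sketch, assuming Apache's processes are named `httpd` as on CentOS) is the bracket trick:

```shell
# Count Apache worker processes. The [h] bracket trick prevents the grep
# process itself from being counted: grep's own command line contains the
# literal string "[h]ttpd", which the regex [h]ttpd (i.e. "httpd") does
# not match.
ps aux | grep -c '[h]ttpd'
```

The same idea works for any process name; on Debian-family systems the binary is typically `apache2` instead of `httpd`.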
It sounds like your Apache MaxClients setting is far too large for your server. If you can drop it to a reasonable value, that may alleviate your problems.
How much RAM do you have in your server? Also, it seems the server crumbles under 2000 children but is still manageable around 1000, so empirically you could set MaxClients to 1024 if you want the site to stay up. If you need to increase performance beyond that, there are several places to look: PHP (APC and memcached), MySQL (lots of tuning options there), and a caching load balancer (Varnish is an excellent option). If all of that isn't enough, or makes things worse, you'll need to look into load balancing over multiple servers as your next scalability option.
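A rough way to pick a MaxClients ceiling is to divide the RAM you can spare for Apache by the average resident size of one child. Both numbers below are assumptions for illustration; measure your own per-child RSS (e.g. with `ps -ylC httpd`):

```shell
# Back-of-the-envelope MaxClients sizing. Both values are assumed:
APACHE_RAM_MB=12000   # portion of the 17 GB you can dedicate to Apache
AVG_CHILD_MB=25       # assumed average RSS of one Apache child, in MB

# Integer division gives the maximum number of children that fit in RAM.
echo $(( APACHE_RAM_MB / AVG_CHILD_MB ))   # prints 480
```

If the result is far below your current MaxClients, the server will swap under load long before Apache stops accepting connections, which matches the death spiral described above.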
It shouldn't be too low, but you'll want to monitor your server over the next few days to be sure you're not hitting MaxClients on a regular basis. If you are, you may want to raise it and implement other optimizations where possible.
ServerLimit is the maximum number of children and is a hard limit. MaxClients is the maximum number of clients that can be serviced simultaneously (which, for the prefork MPM, corresponds to the number of child processes). MaxRequestsPerChild should remain large, as that is the number of requests a process can handle before it is killed and replaced by a new one.
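Putting those directives together, a hypothetical prefork configuration along the lines discussed above (the specific numbers are assumptions to tune against measured memory use, not recommendations) might look like:

```apache
# Hypothetical prefork MPM settings for a box that stays usable around
# 1000 children but falls over near 2000. ServerLimit must be >= MaxClients.
<IfModule prefork.c>
    StartServers          8
    MinSpareServers       5
    MaxSpareServers      20
    ServerLimit        1024
    MaxClients         1024
    MaxRequestsPerChild 4000
</IfModule>
```

After changing these, restart Apache gracefully and watch the process count and free memory during the next traffic spike before adjusting further.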