Linux - Server
This forum is for the discussion of Linux software used in a server-related context.
I want to put some checks in place on my user directories. Last night someone was downloading AVIs from a user account, and just about every server process and thread was devoted to it, blocking access to every other page on the server. An alternative might be to limit the number of threads or the amount of bandwidth that can go to a single IP address.
My main config for processes and threads is like this:
My userdir config just has CGI enabled and everything else at defaults. I think I want to disable PHP in the user directories as well (I should be able to google that one).
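For what it's worth, with mod_php this can usually be done from the main config. A sketch, assuming user sites live under /home/*/public_html (adjust the path to match your UserDir setting):

```apache
# Turn the PHP engine off for all user directories.
# php_admin_flag cannot be overridden from a user's .htaccess.
<Directory /home/*/public_html>
    php_admin_flag engine off
</Directory>
```

php_admin_flag is a mod_php directive; if PHP runs as CGI instead, you'd remove the handler for .php files in that Directory block rather than use this flag.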
I'm looking at the Apache bandwidth module (mod_bw) for the moment, which is a good start. If anyone knows how to limit process usage per IP or per user directory, let me know. Thanks.
OK, mod_bw does not let you restrict per user or per arbitrary client IP. It lets you specify a particular IP or subnet (or ALL) and limit that, but there is no generic per-user limitation. I'm surprised this isn't built into Apache.
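For reference, this is roughly what the mod_bw syntax looks like; the numbers here are made up, and the directive names are from memory of mod_bw 0.x, so check them against the version you installed:

```apache
<VirtualHost *>
    BandwidthModule On
    ForceBandWidthModule On

    # Cap everyone to ~100 KB/s total, and a known-abusive subnet harder.
    Bandwidth all 102400
    Bandwidth 10.0.0.0/8 20480

    # Cap the number of simultaneous connections mod_bw will handle.
    MaxConnection all 20
</VirtualHost>
```

As noted, 'all' and explicit IPs/subnets are the only selectors, so there's no way to say "at most N connections per client IP" with this module alone.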
Apache doesn't have any concept of users; it's just a web-page-serving engine. If your webapp has a login, then you could keep track of them, e.g. use a DB backend and limit to x sessions per user.
Yeah, I see what you mean. I was hoping at least a mod could help, but I guess there are other ways to deal with it. For now I have just limited bandwidth on very large video file types, which should help. I'm still worried about people taking up all the active servers/threads, though.
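Limiting by file type in mod_bw is done with LargeFileLimit, which (if I remember the syntax right) takes an extension, a minimum size in KB, and a rate in bytes/s. A sketch with invented values:

```apache
# Throttle .avi and .mpg files larger than 1024 KB to ~50 KB/s each.
LargeFileLimit .avi 1024 51200
LargeFileLimit .mpg 1024 51200
```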
I guess I could maybe catch the problem outside the web server process (maybe at the level of the TCP/IP connection to port 80).
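That can be done with netfilter's connlimit match, which rejects new connections once a single source IP already has too many open. A sketch, assuming iptables with the connlimit module available (the limit of 10 is arbitrary):

```
# Reject new HTTP connections from any single IP that already has
# 10 or more connections open to port 80. Requires root.
iptables -A INPUT -p tcp --syn --dport 80 \
    -m connlimit --connlimit-above 10 -j REJECT --reject-with tcp-reset
```

This caps a download accelerator at 10 Apache workers per client, regardless of what the web server itself allows.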
A single person can't possibly be using up all your bandwidth. You must have been DDoSed or something.
Yes, one person could do that.
If the bandwidth is not high, it's very easily done.
I had the "Univ. of Chicago" (the student servers) download and RE-DOWNLOAD my whole site over and over and over:
images and PHP and CSS files, EVERYTHING, to the tune of about 4 GB every 6 hours.
I had to ban the university.
So one IP address can do that.
Apache can be set to limit how many threads can be used;
it is somewhere near the top of httpd.conf.
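The directives being referred to are the MPM settings; for the prefork MPM (which mod_php typically requires) they look like this. The values here are stock defaults for illustration, not recommendations:

```apache
<IfModule mpm_prefork_module>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    MaxClients          150
    MaxRequestsPerChild   0
</IfModule>
```

MaxClients caps the total number of simultaneous worker processes, but it is a global cap, not a per-client one.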
I don't think it was a DDoS. Looking at the log and the files involved, it looks like it was one user who put 1 GB of .AVIs in his user directory, showing his little boy walking and such. Then he went to China to visit family and tried to download them all at once using an accelerator app. He did this from a different IP each night for 3 or 4 nights (probably because he was visiting a few different family households).
He may have been maxing the bandwidth, but more likely he was overloading the server by having it spawn too many threads/processes. When I ran 'ps aux | grep www-data' it showed 50+ apache2 processes (threads?).
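To count them rather than eyeball the grep output, something like this works; the bracket trick keeps grep from counting its own command line:

```shell
# Count running apache2 processes. The pattern '[a]pache2' matches the
# string "apache2", but the grep process's own ps line shows the literal
# "[a]pache2", which the pattern does not match.
count=$(ps aux | grep '[a]pache2' | wc -l)
echo "apache2 processes: $count"
```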
Here are my process/thread settings. I didn't think there was any way I could tweak these to help, since there are no per-user settings. I could raise the max limits, but I figure his download accelerator would just have opened more connections.
The reason one person can't max out your bandwidth is that as soon as someone else tries to access your sites, the bandwidth gets divided between both users. So no one person can take all the bandwidth. It's like how you can be downloading something on your home computer and then start surfing the web: web surfing will be a little slow at first, but then your OS will compensate by slowing down the file download, and your web surfing will speed up.
Where exactly are you hosting this site? Which hosting provider? If it's a commercial provider, I would contact them about this. There is no way this should be happening on a 100 Mbps or even 10 Mbps uplink.