Old 11-02-2011, 11:20 AM   #1
davidstvz
Member
 
Registered: Jun 2008
Posts: 405

Rep: Reputation: 31
managing apache userdir


I want to put some checks in place on my user directories. Last night someone was downloading AVIs from a user account, and just about every server process and thread was devoted to it, limiting access to every other page on the server. I guess an alternative would be to limit the number of threads or the bandwidth that can go to a single IP address.

My main config for processes and threads is like this:

Code:
<IfModule mpm_prefork_module>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    MaxClients          150
    MaxRequestsPerChild   0
</IfModule>

<IfModule mpm_worker_module>
    StartServers          2
    MaxClients          150
    MinSpareThreads      25
    MaxSpareThreads      75
    ThreadsPerChild      25
    MaxRequestsPerChild   0
</IfModule>

My userdir config just has CGI enabled and everything else at defaults. I think I want to disable PHP in the user directories as well (I should be able to Google that one).
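For reference, that usually comes down to a Directory block in the main config. This is only a minimal sketch, assuming mod_php and the default /home/*/public_html UserDir layout (adjust the path to whatever UserDir is set to); the FilesMatch part is there because with the engine off alone, .php files would just be served as plain text:

Code:
# Sketch for httpd.conf, Apache 2.2-era syntax -- assumes mod_php and UserDir public_html
<Directory /home/*/public_html>
    # turn the PHP engine off inside user directories
    php_admin_flag engine off
    # and refuse to serve the .php sources outright
    <FilesMatch "\.php$">
        Order allow,deny
        Deny from all
    </FilesMatch>
</Directory>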
 
Old 11-02-2011, 12:01 PM   #2
davidstvz
Member
 
Registered: Jun 2008
Posts: 405

Original Poster
Rep: Reputation: 31
I'm looking at the Apache bandwidth module (mod_bw) for the moment, which is a good start. If anyone knows how to limit process usage per IP or per user directory, let me know. Thanks.
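The kind of rule I'm experimenting with looks roughly like this. It's only a sketch from memory of the mod_bw README (the directive names, and especially the LargeFileLimit argument format, should be double-checked against the version you install), and it throttles big video files rather than individual users:

Code:
# mod_bw sketch -- verify directive syntax against the mod_bw README for your version
<IfModule mod_bw.c>
    BandWidthModule On
    ForceBandWidthModule On
    # throttle .avi files bigger than ~500 KB to roughly 50 KB/s per connection
    LargeFileLimit .avi 500 51200
</IfModule>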

Last edited by davidstvz; 11-02-2011 at 12:04 PM.
 
Old 11-02-2011, 03:04 PM   #3
davidstvz
Member
 
Registered: Jun 2008
Posts: 405

Original Poster
Rep: Reputation: 31
OK, mod_bw does not let you restrict per user or per IP. It lets you specify an IP or subnet (or ALL) and limit that, but it does not let you specify a generic per-user limitation. I'm surprised this isn't built into Apache.
 
Old 11-02-2011, 07:23 PM   #4
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Rocky 9.2
Posts: 18,360

Rep: Reputation: 2751
Apache doesn't have any concept of users; it's just a webpage-serving engine. If your webapp has a login, then you could keep track of them, e.g. use a DB backend and limit to x sessions per user.
 
Old 11-03-2011, 09:32 AM   #5
davidstvz
Member
 
Registered: Jun 2008
Posts: 405

Original Poster
Rep: Reputation: 31
Yeah, I see what you mean. I was hoping at least a module could help, but I guess there are other ways to deal with it. For now I have just limited bandwidth on very large video file types, which should help. I'm still worried about people taking up all active servers/threads, though.

I guess I could maybe catch the problem outside the web server process (maybe at the level of the TCP/IP connection to port 80).
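If I do go that route, what I have in mind is roughly an iptables connlimit rule capping simultaneous connections per source IP on port 80. Just a sketch, and the threshold of 20 is arbitrary:

Code:
# drop new connections from any single IP that already has more than 20 open to port 80
iptables -I INPUT -p tcp --syn --dport 80 -m connlimit --connlimit-above 20 -j DROP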
 
Old 11-03-2011, 07:19 PM   #6
d3vrandom
Member
 
Registered: Jun 2006
Location: Karachi, Pakistan
Distribution: OpenSUSE, CentOS, Debian
Posts: 59

Rep: Reputation: 9
A single person can't possibly be using up all your bandwidth. You must have been DDoSed or something.
 
Old 11-04-2011, 12:17 AM   #7
John VV
LQ Muse
 
Registered: Aug 2005
Location: A2 area Mi.
Posts: 17,627

Rep: Reputation: 2651
Quote:
A single person can't possibly be using up all your bandwidth. You must have been DDoSed or something.
Yes, one person could do that.

If the bandwidth is not high, it is very easily done.

I had the "Univ. of Chicago" (the student servers) download and RE-DOWNLOAD my whole site over and over and over: downloads and images and PHP and CSS files, EVERYTHING, to the tune of about 4 GB every 6 hours.
I had to ban the university.
So one IP address can do that.

Apache can be set to limit how many threads can be used; it is somewhere near the top of httpd.conf.
 
Old 11-04-2011, 09:51 AM   #8
davidstvz
Member
 
Registered: Jun 2008
Posts: 405

Original Poster
Rep: Reputation: 31
I don't think it was a DDoS. Looking at the log and the files involved, it looks like it was one user who put 1 GB of AVIs in his user directory showing his little boy walking and stuff. Then he went to China to visit family and tried to download them all at once using an accelerator app. He did this from a different IP each night for 3 or 4 nights (probably because he was visiting a few different family households).

He may have been maxing out the bandwidth, but more likely he was overloading the server by having it spawn too many threads/processes. When I ran 'ps aux | grep www-data', it showed 50+ apache2 processes (threads?).

Here are my process/thread settings. I didn't think there was any way I could tweak these to help, since there are no per-user settings. I could raise the max limits, but I figure his download accelerator would have just opened more connections.

Code:
Timeout 300

KeepAlive On
MaxKeepAliveRequests 100
KeepAliveTimeout 5


<IfModule mpm_prefork_module>
    StartServers          5
    MinSpareServers       5
    MaxSpareServers      10
    MaxClients          150
    MaxRequestsPerChild   0
</IfModule>


<IfModule mpm_worker_module>
    StartServers          2
    MaxClients          150
    MinSpareThreads      25
    MaxSpareThreads      75
    ThreadsPerChild      25
    MaxRequestsPerChild   0
</IfModule>
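One thing I may look into instead of raising the limits is the third-party mod_limitipconn module, which caps simultaneous connections per client IP, so a download accelerator can't grab dozens of slots at once. A rough sketch, assuming the module is installed and loaded (the limit of 5 is arbitrary, and the module's README should be checked for extra requirements such as mod_status with ExtendedStatus On):

Code:
# Sketch using the third-party mod_limitipconn module (assumes it is installed and loaded)
<IfModule mod_limitipconn.c>
    # apply only to user directory URLs (/~username/...)
    <LocationMatch "^/~">
        MaxConnPerIP 5
    </LocationMatch>
</IfModule>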
 
Old 11-04-2011, 11:29 PM   #9
d3vrandom
Member
 
Registered: Jun 2006
Location: Karachi, Pakistan
Distribution: OpenSUSE, CentOS, Debian
Posts: 59

Rep: Reputation: 9
The reason one person can't max out your bandwidth is that as soon as someone else tries to access your sites, the bandwidth gets divided between both users. So no one person can take all the bandwidth. It's like how you can be downloading something on your home computer and then start surfing the web: web surfing will be a little slow at the start, but then your OS will compensate by slowing down the file download, and your web surfing will speed up.

Where exactly are you hosting this site? Which hosting provider? If it's a commercial provider, I would contact them about this. There is no way this should be happening on a 100 Mbps or even 10 Mbps uplink.
 
Old 11-05-2011, 06:47 PM   #10
davidstvz
Member
 
Registered: Jun 2008
Posts: 405

Original Poster
Rep: Reputation: 31
Well, it's my own server, run through a university firewall.

Now, I'm not the most savvy administrator, so I could be doing something wrong here.

Like I said, I don't think it was bandwidth, but couldn't it have hogged all the processes? I was able to connect, but it was very slow.
 
  

