LinuxQuestions.org
Linux - Server This forum is for the discussion of Linux Software used in a server related context.

Old 03-18-2010, 06:55 AM   #1
yoachan
Member
 
Registered: Nov 2009
Posts: 109

Rep: Reputation: 16
Web Server and File (Media) Server


Dear all,
I have a box using CentOS 4.x, and current I'm using the box for hosting approx 5 domains with total (approx) 15 domains. Some of them very dynamic (a lot of user interactions), some of them static (only content update), and one of them static but with heavy database processes.

I chose Apache 2 as my web server, with PHP for server-side scripting.

Lately we've added another domain for file storage, where we keep hundreds of files for users to download. It eats a lot of our bandwidth and, worse, our CPU too, because each Apache instance keeps running for a long time.

My question is: can I assign another (web or download) server service to handle these downloads so they won't burden Apache? And how can I limit users' download sessions? More than once I've had a user running a download accelerator that opens many connections for a long time.

Any ideas appreciated.

Regards,

YoChan
 
Old 03-18-2010, 07:29 AM   #2
salasi
Senior Member
 
Registered: Jul 2007
Location: Directly above centre of the earth, UK
Distribution: SuSE, plus some hopping
Posts: 4,070

Rep: Reputation: 897
These are ideas - not tested - so I don't know how much impact they would have in your context. Mind you, I'm not sure the information you've given is clear enough to know exactly what to suggest.
  • Use squid in reverse HTTP (accelerator) mode: in some contexts you can cache the outgoing traffic, and depending on how variable that traffic is and how the URLs are set up, this can be helpful.
  • Apache isn't exactly the lightest-weight web server, and changing to something lighter could help.
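For the first idea, a minimal sketch of squid in reverse-proxy (accelerator) mode could look like the following. This assumes squid 2.6+ configuration syntax, a hypothetical site name (www.example.com), and Apache moved to port 8080 on the same box - adjust all of those to your setup:

```
# /etc/squid/squid.conf -- reverse-proxy (accelerator) sketch
# squid listens on port 80 and fronts Apache on 127.0.0.1:8080
http_port 80 accel defaultsite=www.example.com
cache_peer 127.0.0.1 parent 8080 0 no-query originserver name=apache

# only accelerate requests for our own site
acl our_sites dstdomain www.example.com
http_access allow our_sites
cache_peer_access apache allow our_sites
```

Note that only responses squid is permitted to cache (sensible Cache-Control/Expires headers, repeated identical URLs) will actually be served from the cache; highly dynamic pages will still hit Apache every time.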

Quote:
Lately we've added another domain for file storage
...well, you could do that on separate hardware, but that may be what you are trying to avoid...

And anything involving a database is likely to chew up resources; is it really necessary to use a database for all the things that you are trying to do? And writing databases so that resources are used efficiently is a significant problem...so mostly people ignore the problem. This may, or may not, be setting you back.
 
Old 03-18-2010, 08:47 AM   #3
yoachan
Member
 
Registered: Nov 2009
Posts: 109

Original Poster
Rep: Reputation: 16
@Salasi:
Thank you for your reply
  • So we can use squid to cache pages requested by users, eh? I'll take a look at that.
  • Yes, Apache isn't the lightest... I've been considering Nginx or Lighttpd... Any suggestions?
    I've been reading http://www.webdevelopmentbits.com/li...ives-to-apache too.
  • Separate hardware would be wonderful, but as you said, we're trying to avoid it. Cost issue...
  • The domain with the heavy database processing is a unique one.
    It stores gigs of data and does search and data mining... Perhaps we should move it somewhere else too...
  • About limiting clients' download sessions (and perhaps bandwidth), do you have any suggestions?
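If you do try nginx for the download domain, it can handle both concerns at once: serving static files cheaply and throttling per-client downloads. A hedged sketch follows - the directive names are from the nginx versions current at the time (`limit_zone` was later renamed `limit_conn_zone`), and the server name, paths, and limits are all hypothetical placeholders:

```nginx
# nginx.conf sketch for a dedicated download domain
http {
    # track clients by IP; 5 MB of shared state for the counters
    limit_zone dlconns $binary_remote_addr 5m;

    server {
        listen      80;
        server_name downloads.example.com;   # hypothetical
        root        /var/www/downloads;      # hypothetical

        location / {
            # at most 2 simultaneous connections per client IP,
            # which blunts download accelerators
            limit_conn dlconns 2;

            # let the first 1 MB go at full speed, then
            # throttle each connection to 100 KB/s
            limit_rate_after 1m;
            limit_rate 100k;
        }
    }
}
```

Because nginx serves static files from a small pool of worker processes rather than one heavyweight process per connection, long-running downloads cost far less CPU and memory than they do under Apache's prefork model.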


Warm regards

YoChan
 
Old 03-18-2010, 12:10 PM   #4
arty
Member
 
Registered: Nov 2008
Posts: 66

Rep: Reputation: 17
Are the users downloading from or uploading to the web server?
There is a module for Apache called mod_limitipconn (http://dominia.org/djao/limitipconn2.html), which is used to limit the number of connections allowed per IP.
I tested it some time ago, for educational purposes only, so I don't know how it will perform under massive downloads.
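A sketch of what the mod_limitipconn configuration might look like, assuming the module is already built and installed and using a hypothetical `/downloads` path and an arbitrary limit of 3 (the module requires `ExtendedStatus On` to track connections):

```apache
# httpd.conf sketch -- per-IP connection cap with mod_limitipconn
ExtendedStatus On
LoadModule limitipconn_module modules/mod_limitipconn.so

<IfModule mod_limitipconn.c>
    <Location /downloads>
        # at most 3 simultaneous connections per client IP
        MaxConnPerIP 3
        # don't count small inline assets against the limit
        NoIPLimit image/*
    </Location>
</IfModule>
```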

Another idea is to use FTP.
 
Old 03-19-2010, 04:38 AM   #5
salasi
Senior Member
 
Registered: Jul 2007
Location: Directly above centre of the earth, UK
Distribution: SuSE, plus some hopping
Posts: 4,070

Rep: Reputation: 897
Quote:
Originally Posted by yoachan View Post
So we can use squid to cache pages requested by users, eh? I'll take a look at that.
Yes, this is quite a conventional thing to do, but whether it helps, or not, depends entirely on your access pattern.

If people request the same page over and over again, then it can be very effective. OTOH, if all the requests are for unique pages, or the page URLs are all unique, it will be less helpful, maybe even counterproductive.

Quote:
Yes, Apache isn't the lightest... I've been considering Nginx or Lighttpd...
I'm in exactly the same position, so can't offer any advice.

Quote:
Separate hardware will be wonderful, but as you said we are trying to avoid it. Cost issue...
...although it needn't be that much of a cost, so if you can't consider this from a cost point of view, maybe you are doing something wrong, I can't say...

Quote:
The domain with the heavy database processing is a unique one.
It stores gigs of data and does search and data mining... Perhaps we should move it somewhere else too...
Whether it's on a separate domain or not doesn't really seem relevant. Whether it's on the same disk, or on the same CPU, seems more germane.

Quote:
About limiting client's download session (and perhaps bandwidth), do you have any suggestion?
I should have, but I've forgotten the bandwidth-limiting ones. You could probably limit the number of connections from a particular IP with a bit of iptables trickery, but I've no idea whether that really helps in any way; I still haven't understood the problem statement well enough.
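The "iptables trickery" could look something like this, assuming the connlimit match is available in your kernel/iptables build; the threshold of 4 connections and port 80 are illustrative choices, not recommendations:

```shell
# Reject new HTTP connections from any source IP that already has
# 4 or more established connections to port 80.
iptables -A INPUT -p tcp --syn --dport 80 \
         -m connlimit --connlimit-above 4 \
         -j REJECT --reject-with tcp-reset
```

This caps concurrency only; it does nothing about per-connection bandwidth, so a user within the connection limit can still saturate the link.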
 
Old 03-20-2010, 04:22 AM   #6
yoachan
Member
 
Registered: Nov 2009
Posts: 109

Original Poster
Rep: Reputation: 16
@Arty: I'll take a look at mod_limitipconn, and open my mind for other Apache module(s) too.

@salasi:
Quote:
...although it needn't be that much of a cost, so if you can't consider this from a cost point of view, maybe you are doing something wrong, I can't say...
Quote:
Whether it's on a separate domain or not doesn't really seem relevant. Whether it's on the same disk, or on the same CPU, seems more germane.
That's the thing: we'd have to think about investing in new machine(s) and then keep them running (rent a spot) in a data center.
That's the cost issue I'm talking about. It's not my decision to make, but I'll discuss it with my team.

Thank you to both of you.
 
  

