Web server load balancing during high demand - linking to outside server?
Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
I am wondering if it is possible, either through network mounts or even vhosts, to set up the server so that when demand is really high, it sends all requests for /www/site/pics and /www/site/video to an off-site server/host where those files are also located. In other words, once the server load reaches a certain point, calls for the local /pics and /video files would be redirected to an external host, and that host would serve the files to the client.
I'm wondering if this can be done at the server level, or if it has to be done in the website code (I'm not sure how it would be done in code, but I would think it's possible..?).
What you are asking for is not simple, and the way you're asking it makes me believe you are not experienced in server administration. If you want to learn more, the two best approaches, in my opinion, are nginx and HAProxy.
Of course it would be possible to roll your own custom solution, but that is rarely a good idea. Existing solutions might look complicated, but they are robust and well documented.
Most web servers can serve different URL paths from different places on the filesystem.
To fetch certain paths from another web server requires reverse-proxy functionality. nginx has this built in; Apache has a loadable module (mod_proxy) to do it.
Or you can use a dedicated reverse proxy to front your website and direct requests to other servers according to the path or other information in the request. There are plenty of good ones out there. This is a common setup for high-performance web sites.
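As a rough sketch of the path-based setup described above, here is what it might look like in nginx (the hostnames and the `media.example.com` back end are placeholders, not anything from this thread):

```nginx
# Sketch: serve the site locally, but reverse-proxy the /pics and
# /video paths to an off-site host that holds copies of those files.
server {
    listen      80;
    server_name www.example.com;
    root        /www/site;

    location /pics/ {
        proxy_pass       http://media.example.com;
        proxy_set_header Host media.example.com;
    }

    location /video/ {
        proxy_pass       http://media.example.com;
        proxy_set_header Host media.example.com;
    }
}
```

Note this proxies those paths unconditionally; switching it on only "when load is high" would need something extra in front (HAProxy can do weighted/conditional routing), which is part of why the earlier reply called this non-trivial.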
Make sure that "static-file content" is being served directly by Apache, not sent through your application.
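A minimal illustration of that point for Apache (paths are from the question; the directive values are just an example, not a tuned config):

```apache
# Sketch: map the static directories straight onto the filesystem so
# Apache serves the files itself and the application never sees them.
Alias /pics/  /www/site/pics/
Alias /video/ /www/site/video/

<Directory /www/site/pics>
    Require all granted
    # No SetHandler / proxy here, so these requests bypass the app
</Directory>
<Directory /www/site/video>
    Require all granted
</Directory>
```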
Determine how many requests your server can handle at once, and set web-server limits so that no more than this number of workers are ever launched. (The other requests must wait their turn.)
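With Apache's event MPM, that cap looks roughly like this (the numbers are placeholders to show the relationship between the directives, not recommendations):

```apache
# Sketch: hard-limit concurrency so the box is never overcommitted.
<IfModule mpm_event_module>
    ThreadsPerChild     25
    ServerLimit          6
    MaxRequestWorkers  150   # 6 processes x 25 threads = 150 workers
    ListenBacklog      511   # excess requests wait here for a worker
</IfModule>
```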
Some cloud hosting services provide "near-instantaneous scale-up capability." (You pay more, of course, for the extra capacity, which is magically furnished on-demand.) These use "Linux container" and maybe "application server" strategies.
Consider using efficient technologies like FastCGI, which can distribute the work of generating web-page content to a phalanx of back-end servers, while keeping the actual server worker-bees "small, light and fast" as they should be.
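In nginx, handing page generation to a pool of FastCGI back ends might look like this (the 10.0.0.x addresses are placeholders for your back-end machines):

```nginx
# Sketch: nginx stays small and fast; PHP work is farmed out to a
# pool of FastCGI back-end servers.
upstream php_pool {
    server 10.0.0.11:9000;
    server 10.0.0.12:9000;
}

server {
    listen 80;
    root   /www/site;

    location ~ \.php$ {
        include       fastcgi_params;
        fastcgi_pass  php_pool;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}
```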
Don't design web apps that attempt to do "expensive things," like generating reports, directly in the page. Build or steal a "work-queue manager" which lets you queue up the request for processing by a back-end daemon, with the means for the requesting user to periodically check its progress and then to retrieve results.
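The work-queue idea can be sketched in a few lines of Python. This is a toy in-process version (real deployments use something like a job queue backed by Redis or a database, and the `sum()` call stands in for the expensive report):

```python
import queue
import threading
import uuid

# Minimal work-queue sketch: the request handler enqueues a job and
# returns a ticket; a back-end daemon does the expensive work; the
# client polls check_progress() until the result is ready.

jobs = {}                  # job_id -> {"status": ..., "result": ...}
work_queue = queue.Queue()

def submit_job(payload):
    """Called from the web request: enqueue and return a ticket."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = {"status": "queued", "result": None}
    work_queue.put((job_id, payload))
    return job_id

def check_progress(job_id):
    """Called when the user polls for status/results."""
    return jobs[job_id]

def worker():
    """Back-end daemon: pull jobs off the queue and run them."""
    while True:
        job_id, payload = work_queue.get()
        jobs[job_id]["status"] = "running"
        result = sum(payload)          # stand-in for an expensive report
        jobs[job_id].update(status="done", result=result)
        work_queue.task_done()

threading.Thread(target=worker, daemon=True).start()

ticket = submit_job([1, 2, 3])
work_queue.join()          # in real use, the client polls instead of joining
print(check_progress(ticket))
```

The key point is the decoupling: the page returns immediately with a ticket, and the heavy lifting happens outside the request/response cycle.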
Last edited by sundialsvcs; 01-26-2018 at 08:14 AM.
You would be better off going with cloud hosting for your web server in the first place. That way your website is hosted on multiple servers, and at peak times the load can be distributed among them, resulting in excellent uptime for your website.
Considering your thread, @cilbuper, I assume you are not that technical on the server side. So I suggest you go with a Google Cloud web server managed by Cloudways, as it takes away the pain of managing the server by giving you a simple, easy platform to work with.