Linux - Newbie: This Linux forum is for members that are new to Linux. Just starting out and have a question? If it is not in the man pages or the how-tos, this is the place!
I have an Ubuntu server running a website. Once or twice a week Apache spawns so many instances that the server dies. Looking through the access log, all the traffic is from bots: Google, Baidu, Yandex, MSN, etc. I need these bots to crawl my site, so I can't just deny them. Furthermore, the bots are GETing URLs that no longer exist, i.e. they are trying URLs from before I rolled out the new version of the site. Does anyone have any suggestions to mitigate this situation?
I was able to get my server to handle high loads better by setting MaxClients in Apache. I was also able to block bots from requesting the dead URLs with Disallow rules in robots.txt, and I set Crawl-delay to 20 seconds there as well. Rough sketches of both are below.
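For anyone who lands here later, here is roughly what those two changes look like. First, a minimal sketch of the Apache side, assuming the prefork MPM that Ubuntu's apache2 package uses by default; the numbers are purely illustrative and should be sized to your server's RAM (on Apache 2.4 the directive was renamed MaxRequestWorkers, with MaxClients kept as a legacy alias):

    # in /etc/apache2/apache2.conf (2.2) or mods-available/mpm_prefork.conf (2.4)
    <IfModule mpm_prefork_module>
        StartServers          5
        MinSpareServers       5
        MaxSpareServers      10
        MaxClients           50      # cap concurrent children so a bot swarm can't exhaust RAM
        MaxRequestsPerChild 1000     # recycle children periodically to limit memory creep
    </IfModule>

And the robots.txt, assuming the dead URLs live under a path like /old-site/ (that path is hypothetical; use whatever your access log actually shows). Note that Crawl-delay is honored by Yandex and MSN/Bing but ignored by Google, whose crawl rate is set in Webmaster Tools instead:

    # robots.txt at the site root
    User-agent: *
    Crawl-delay: 20        # seconds between requests, for crawlers that honor it
    Disallow: /old-site/   # hypothetical prefix covering the pre-redesign URLs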
Mitigate what exactly? Your "need" to have bots crawl your site?
"google, baidu, yandex, msn" your site and see what the results are and compare to referer reference in the logs.