I have a couple of dedicated servers and the load is well over 200 at times, which makes my websites non-functional.
The problem is too many spiders visiting the sites.
I only have about 133 sites on one server and 233 on the other. Is there an easy way to take care of this?
We blocked Google temporarily, but the sites are still getting hammered. What can I do to block all of the spiders completely, or maybe allow only Yahoo and Google? Or just shut all of them out until I get this resolved?
I'd suggest just using robots.txt to make the spiders back off (and using mod_rewrite or similar to block the spiders that don't respect robots.txt; all the "big ones" should respect it).
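For example, a robots.txt in each site's document root could look roughly like this (the Crawl-delay value is only a starting point; not every crawler honors that directive, Googlebot for one ignores it, but you can turn Googlebot's crawl rate down in Google's webmaster tools):

Code:
# robots.txt - one per site, in the document root
User-agent: *
Crawl-delay: 10

And for the crawlers that ignore robots.txt, a rough .htaccess sketch along these lines denies them by User-Agent (the bot names here are only placeholders, use whatever actually shows up in your logs):

Code:
# .htaccess sketch - return 403 to example bad bots (names are placeholders)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilCrawler|SomeScraper) [NC]
RewriteRule .* - [F,L]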
How many spiders are visiting at a time? If your server hardware can't handle several concurrent visits, maybe it's time to think about upgrading?
Googlebot is killing the server right now. I didn't have any robots.txt. I guess the trick is finding a balance between letting spiders index the sites and not crashing the server.
I am pinging about 50 different servers with WordPress, and there are only about 133 websites on the dedicated server. I never had this problem before.
Google's bots are designed not to create huge amounts of load on servers (they insert delays between requests). What is your load when Googlebot isn't hitting you?
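If you want hard numbers on who is actually doing the hammering, counting requests per User-Agent in the access log is a quick check; something like this (the log path is just an assumption, adjust it for your setup):

Code:
# top 20 user agents by request count (Apache combined log format assumed)
awk -F'"' '{print $6}' /var/log/apache2/access.log | sort | uniq -c | sort -rn | head -20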