Web Server Being Killed by Crawler Bots
I have an Ubuntu server running a website. Once or twice a week Apache spawns so many instances that the server dies. Looking through the access log, all of the traffic is from bots: Google, Baidu, Yandex, MSN, etc. I need these bots to crawl my site, so I can't just deny them. Furthermore, the bots are GETing URLs that don't exist anymore, i.e. they are trying URLs from before I rolled out the new version of the site. Does anyone have any suggestions to mitigate this situation?
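For reference, the two mitigations I'm aware of are asking the polite bots to slow down via `robots.txt`, and capping how many workers Apache is allowed to spawn so the box at least stays responsive. A sketch of both is below (values are guesses, not tuned for my hardware; note that `Crawl-delay` is honored by Yandex and Bing but ignored by Googlebot, whose rate is set in Search Console):

```
# robots.txt — ask compliant crawlers to wait between requests
User-agent: *
Crawl-delay: 10
```

```
# /etc/apache2/mods-available/mpm_prefork.conf — example values only,
# tune MaxRequestWorkers to what your RAM can actually hold
<IfModule mpm_prefork_module>
    StartServers            5
    MinSpareServers         5
    MaxSpareServers        10
    MaxRequestWorkers     150
    MaxConnectionsPerChild 1000
</IfModule>
```

For the stale URLs, I assume the clean fix is a permanent redirect per moved page (hypothetical paths shown), e.g. `Redirect 301 /old-page /new-page` in the vhost, so the bots update their indexes instead of hammering 404s — but I'd welcome better approaches.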