Apache overloaded by robots
Hi,
I'm running a VPS with Debian Lenny, serving a Django application through Apache2 and mod_wsgi, with memcached and PostgreSQL behind it.
Some of the pages served by Django take a long time to generate (approx. 15 s), but memcached compensates for this once a page has been cached.
My problem is that when a robot visits the site, it starts crawling every page, including the not-yet-cached ones, which slows the server down to the point where it stops responding.
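As a stopgap, a Crawl-delay in robots.txt might slow down the polite crawlers, but as far as I know it's a non-standard directive and Googlebot ignores it, so it won't solve this on its own:

    User-agent: *
    Crawl-delay: 10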
What I'm looking for is a way to identify that a request comes from a robot (by user-agent, IP, etc.) and to limit the resources it gets, so that, for example, only one thread serves robots (see the sketch below).
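What I have in mind is something like this rough sketch, assuming mod_wsgi's daemon mode can pick the process group from an environment variable via %{ENV:...} substitution (I believe recent versions support this); the group names, the user-agent pattern and the paths here are made up:

    # Hypothetical setup: normal visitors get a full daemon group,
    # anything that looks like a crawler gets one process with one thread.
    WSGIDaemonProcess django   processes=4 threads=10
    WSGIDaemonProcess crawlers processes=1 threads=1

    # Default every request to the normal group, then override it for
    # user-agents matching a crude (illustrative, not vetted) bot pattern.
    SetEnvIf Request_URI . WSGI_GROUP=django
    SetEnvIfNoCase User-Agent (bot|crawl|spider|slurp) WSGI_GROUP=crawlers

    WSGIScriptAlias / /srv/mysite/django.wsgi
    WSGIProcessGroup %{ENV:WSGI_GROUP}

That way a crawler could still index everything, just serialized behind a single thread instead of tying up all the workers.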
Is this possible? Has anyone come across a similar problem? Is there another solution?