Restrict access to x pages by IP address to prevent abuse and harvesting
We have a Linux (GNU/Linux 3.6.6) web server hosting hundreds of sites, and some of these sites have hundreds of millions of HTML pages.
As you can imagine, we have occasional abuse and harvesting problems, so we are looking for a way to restrict access to some of these domains (web sites) by limiting each IP address to about 500 pages per 24-hour period. This would ensure that people consulting the sites are doing so for valid reasons and are not harvesting the data. Can someone please direct me to a document or page that explains how to restrict access to a web site or domain name on a Linux server in the manner described above. Thank you
Quote: unSpawn...
With all due respect, we have made the decision to implement such a restriction based on our clientèle. Visitors to the sites in question do not need to access more than a few hundred pages in a single 24-hour period to get what they need. This has been established, and we are now seeking a solution to our problem. We are not a hosting company; we host only our own sites, several of which are huge (over a hundred million static HTML pages) and are regularly targeted by hackers and harvesters. As a result, we have decided to implement some sort of IP access restriction unless we can find a better solution. That said, could you kindly recommend something that would help solve our harvesting issue? Thanks
See the iptables limit, recent, hashlimit and connlimit extensions. Note that implementing rate limiting alone may hold back "your clientèle", but it won't help with the "occasional abuse problems and harvesting problems" or with being "targeted by hackers": each of those requires a different approach, as I already hinted in my earlier reply.
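For what it's worth, a minimal hashlimit sketch along the lines unSpawn suggests might look like the following. The match name, burst, and expire values are assumptions on my part, not a tested policy, and note that this approximates new connections per day rather than pages served:

```shell
#!/bin/sh
# Sketch only; run as root. hashlimit counts TCP connections, not
# pages: with HTTP keep-alive a single connection can fetch many
# pages, so an exact "500 pages per 24 hours" policy really needs
# counting in the web server or from its access logs.

# Let already-established connections continue untouched.
iptables -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT

# Allow each source IP up to ~500 new HTTP connections per day,
# with a burst of 50; per-IP buckets expire after 24 h (value is
# in milliseconds). Names and numbers here are assumptions.
iptables -A INPUT -p tcp --dport 80 --syn \
  -m hashlimit \
  --hashlimit-name http_daily \
  --hashlimit-mode srcip \
  --hashlimit-upto 500/day \
  --hashlimit-burst 50 \
  --hashlimit-htable-expire 86400000 \
  -j ACCEPT

# New HTTP connections beyond the limit are dropped.
iptables -A INPUT -p tcp --dport 80 --syn -j DROP
```

An alternative that matches actual page counts more closely would be a nightly job that tallies requests per IP from the web server's access logs and feeds offenders into an iptables (or ipset) blacklist; that also gives you a record of who was harvesting.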