Hello,
I've set up a free HD picture gallery on a small web server: weak CPU, little RAM, and a slow connection (10 Mbit/s).
There aren't that many pics, but because of their size there's almost 500 MB of images in total.
The problem is that some people, rather than picking the images they actually want and downloading them manually, are using mass-download ("leech") software to grab EVERYTHING AT ONCE.
Can you imagine? 500 MB of data, requested relentlessly over many parallel threads, from a server that can painfully push out about one megabyte per second. This is hardware abuse.
It means there are times when the site almost stops responding, because these leechers hog all the bandwidth with numerous simultaneous connections.
An example of such programs is Bulk Image Downloader, by Antibody Software.
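One partial measure, since these tools identify themselves in the request, is to deny them by User-Agent. A minimal sketch for Apache 2.2 on Debian Squeeze (e.g. in the gallery's .htaccess); the User-Agent substrings below are assumptions to check against your own access logs, and note this is easily defeated by a spoofed User-Agent:

```apache
# Mark requests from known bulk-download tools.
# These substrings are guesses -- verify against your access logs.
SetEnvIfNoCase User-Agent "Bulk Image Downloader" leech
SetEnvIfNoCase User-Agent "HTTrack" leech
SetEnvIfNoCase User-Agent "wget" leech

# Refuse to serve images to anything flagged as a leecher.
<FilesMatch "\.(jpe?g|png|gif)$">
    Order Allow,Deny
    Allow from all
    Deny from env=leech
</FilesMatch>
```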
At this point, I'm tempted to just conclude "screw it all", and keep my images for myself.
Still...
Would you know if there's a way to
- prevent massive bandwidth abuse
- while allowing normal bandwidth usage?
I don't know, something like allowing only 2 simultaneous connections per visitor, or throttling the bandwidth one person can receive when the server doesn't have enough free bandwidth at the moment...
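That kind of per-client limit is roughly what mod_bw (the libapache2-mod-bw package on Debian) provides; mod_cband and mod_limitipconn are alternatives. A hedged sketch, assuming mod_bw is installed and enabled, to be placed inside the gallery's virtual host or directory block; the exact numbers are examples to tune for a 10 Mbit/s link:

```apache
# Enable mod_bw for this scope.
BandwidthModule On
ForceBandWidthModule On

# At most 2 simultaneous connections per client.
MaxConnection all 2

# Cap each client at ~100 KB/s (values are in bytes/s),
# so one leecher cannot saturate the whole link...
Bandwidth all 102400

# ...while guaranteeing every client at least ~20 KB/s.
MinBandwidth all 20480
```

(mod_ratelimit would be another option, but it only appeared in Apache 2.4, so it is not available on Squeeze's Apache 2.2.)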
The server specs:
- dedicated server
- Debian Squeeze
- Apache
- Webmin
If that's too complicated or impossible, it won't be a big loss, but I didn't want to give up before asking here. Who knows!
Thank you if you can help.