I won't be too specific, because specifics would show exactly how to defeat the procedures. Essentially, though, it is a combination of honeypots and pattern matching. My site has locations that a human will never, ever find but a site scraper will. I link to these locations with a one-pixel GIF that is physically hidden behind another GIF on the page. A human will never reach the link, but a scraper will find it, follow it, and get immediately blacklisted. I use .htaccess entries to keep the search engines I do want visiting the site from falling into the honeypots.
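To give a generic sketch of the idea (none of these paths, filenames, or bot names are from my real site - they're purely illustrative), the hidden link might look something like this:

    <!-- honeypot: a 1x1 GIF link stacked behind a normal image,
         so no human ever sees or clicks it -->
    <div style="position:relative">
      <img src="/images/banner.gif" style="position:absolute; z-index:1" alt="">
      <a href="/trap/index.html"><img src="/images/pixel.gif" width="1" height="1" alt=""></a>
    </div>

and the .htaccess exemption something like this (assuming Apache with mod_rewrite enabled):

    # Serve known-good crawlers a 404 for the trap path so they never
    # reach whatever does the blacklisting. User agents are examples only.
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (Googlebot|bingbot|DuckDuckBot) [NC]
    RewriteRule ^trap/ - [R=404,L]

The trap URL itself would point at whatever does the actual blacklisting - a CGI script that appends the client IP to a deny list, a log entry watched by fail2ban, whatever fits your setup.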
Pattern matching looks for things scrapers commonly do: too many pages in too short a time; downloading pages without downloading images; downloading images without downloading pages - things like that.
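Here's a crude sketch of the pattern-matching side in Python, assuming Apache's combined log format (the log path, extensions, and heuristics are all illustrative, not tuned values from my setup):

    #!/usr/bin/env python3
    # Flag clients that fetch pages without images, or images without pages,
    # from an Apache "combined" access log. Purely illustrative heuristics.
    import re
    from collections import defaultdict

    LOG = "/var/log/apache2/access.log"  # assumed path

    # combined format starts: IP ident user [timestamp] "METHOD /path HTTP/x"
    line_re = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST) (\S+)')

    pages = defaultdict(int)   # HTML-ish requests per client IP
    images = defaultdict(int)  # image requests per client IP

    with open(LOG) as f:
        for line in f:
            m = line_re.match(line)
            if not m:
                continue
            ip, path = m.groups()
            if re.search(r'\.(gif|jpe?g|png)$', path, re.I):
                images[ip] += 1
            elif path.endswith(('/', '.html', '.htm', '.php')):
                pages[ip] += 1

    for ip in set(pages) | set(images):
        # a real browser fetches both; a scraper usually fetches only one kind
        if pages[ip] and not images[ip]:
            print(f"{ip}: {pages[ip]} pages, no images - possible scraper")
        elif images[ip] and not pages[ip]:
            print(f"{ip}: {images[ip]} images, no pages - possible image leech")

The rate check (too many pages in too short a time) would parse the timestamp field and count requests per IP per minute; I've left it out to keep the sketch short.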