LinuxQuestions.org
Welcome to the most active Linux Forum on the web.
Go Back   LinuxQuestions.org > Forums > Linux Forums > Linux - Security
Linux - Security This forum is for all security-related questions.
Questions, tips, system compromises, firewalls, etc. are all included here.

Old 08-26-2007, 05:58 AM   #1
lucastic
Member
 
Registered: Aug 2003
Location: Oz
Distribution: Gentoo - Debian
Posts: 202

Rep: Reputation: 30
How to Stop Website theft with Apache2


Hi all -

I run a website that I have recently noticed someone has effectively stolen. Say my website's URL is www.realsite.com: all my content comes up under their URL (without correct formatting), i.e. www.badsite.com.

I have looked over a heap of pages about preventing image hot-linking, both with PHP checks and with .htaccess, but cannot find anything about preventing hot-linking of PHP scripts.

Can anybody point me in the right direction?

Thanks for reading.
 
Old 08-26-2007, 11:06 AM   #2
raskin
Senior Member
 
Registered: Sep 2005
Location: France
Distribution: approximately NixOS (http://nixos.org)
Posts: 1,900

Rep: Reputation: 69
Are you sure it is really hotlinking? For example, if you request pages from their domain several times, do you see accesses to your site from the IP you are browsing from? To keep their domain in the browser's address bar, they have to serve the HTML page to the user through their own server.
If they pull the content from your server on every request (which would be clumsy), find their IP and block access by IP.
But it may be that they simply downloaded every page they wanted: all it takes is browsing your whole site and saving each page, and that can be automated (such tools are meant for making offline copies, but they are sometimes abused). Short of being physically isolated from all computers, nothing can stop someone from manually saving the text of your pages. Repeated mails to their provider/hoster can get the site taken offline, though results vary. There are other ways to give their site some downtime, but they are either criminal (and can harm third parties) or can accidentally give them a bump in search rankings.
If they automate the downloading, you can give them some problems. You can try to fool their crawler into loading a very large file; this works best if you also host decoy content on a free host with a liberal space limit and pull it in via iframes, which defeats simple domain filtering. There are JavaScript tricks too: generate the links with JS, and have one of them flood memory instead of loading anything useful (hide it behind an image with the caption 'this link is for killing too-smart download spiders; don't click it!'). There is also a fun referrer trick: redirect every address on the site to main.php (again via frames, so the address bar stays current) and serve content based on the referrer, which some downloaders will fail to circumvent. And if your site already requires cookies for something useful, that can also help: you serve the real content only to clients with capabilities too rich for a spider.
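If it does turn out to be live hotlinking, both the per-IP block and the referrer check can be done in Apache itself, before PHP ever runs. A minimal, untested .htaccess sketch (Apache 2.2-era syntax; realsite.com and 1.2.3.4 are placeholders, and mod_rewrite must be enabled and allowed by AllowOverride):

```apache
# Block the offending server outright (replace 1.2.3.4 with their IP).
Order Allow,Deny
Allow from all
Deny from 1.2.3.4

# Refuse requests for .php pages whose Referer is set but is not
# your own domain (empty referers pass, so bookmarks still work).
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?realsite\.com/ [NC]
RewriteRule \.php$ - [F]
```

Remember the Referer header is sent by the client, so this stops casual hotlinking but not a determined copier.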
 
Old 08-27-2007, 10:47 AM   #3
cconstantine
Member
 
Registered: Dec 2005
Distribution: RedHat, Ubuntu
Posts: 101

Rep: Reputation: 15
If someone is poaching your PHP-parsed pages, wedge a test against $_SERVER['HTTP_REFERER'] (note the exact key name) at the top of your pages... require() a single file from all your pages, then experiment in that file to tune what you need.
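A sketch of what that shared include could look like, assuming it is require()'d at the top of every page (the ALLOWED_HOST value and the function name are hypothetical; substitute your own):

```php
<?php
// check_referer.php -- hypothetical shared gatekeeper include.
// Replace ALLOWED_HOST with your real domain.
define('ALLOWED_HOST', 'www.realsite.com');

function referer_allowed($referer)
{
    // Allow empty referers (bookmarks, typed-in URLs); tighten if needed.
    if ($referer === null || $referer === '') {
        return true;
    }
    // Compare only the host part of the referring URL.
    return parse_url($referer, PHP_URL_HOST) === ALLOWED_HOST;
}

$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
if (!referer_allowed($referer)) {
    die('Forbidden');
}
?>
```

Bear in mind the Referer header is client-supplied and trivial to forge or suppress, so treat this as a nuisance barrier rather than real security.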
 
Old 08-27-2007, 03:12 PM   #4
lucastic
Member
 
Registered: Aug 2003
Location: Oz
Distribution: Gentoo - Debian
Posts: 202

Original Poster
Rep: Reputation: 30
Thanks for the ideas, will give it a go...
 
Old 08-27-2007, 07:08 PM   #5
nan0meter
Member
 
Registered: Aug 2007
Location: The Netherlands
Distribution: Fedora 7 x86_64
Posts: 119

Rep: Reputation: 15
Well, check for a secret token (e.g. an md5 hash) in every file, like
Code:
<?php

// 'yourmd5codehere' is a placeholder for your secret token.
$secret = 'yourmd5codehere';

if (!isset($_GET['secret']) || $_GET['secret'] !== $secret) {
    die('Not allowed to view this page');
}

// your page here

?>
so when a visitor wants to see a page, he clicks a link which will look like http://www.mydomain.com/test.php?secret=yourmd5codehere

Or when people are trying to catch your page by directly accessing it, you can check for it in your PHP page by putting this on top of every file.
Code:
<?php

// The Referer header is client-supplied and easy to forge,
// but this stops naive direct access and hotlinking.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

if (parse_url($referer, PHP_URL_HOST) !== $_SERVER['HTTP_HOST']) {
    die('You cannot access this page directly');
}

// ...

?>

Last edited by nan0meter; 08-27-2007 at 07:12 PM.
 
  

