Linux - Security: This forum is for all security related questions. Questions, tips, system compromises, firewalls, etc. are all included here.
08-26-2007, 05:58 AM | #1
Member | Registered: Aug 2003 | Location: Oz | Distribution: Gentoo - Debian | Posts: 202
How to Stop Website theft with Apache2
Hi all -
I run a website and have recently noticed that someone has effectively stolen it. Say my website's URL is www.realsite.com; all of my content comes up under their URL, www.badsite.com (without the correct formatting).
I have looked over a heap of pages about preventing image hot-linking, both with PHP checks and .htaccess, but I cannot find anything about preventing hot-linking of PHP scripts.
Can anybody point me in the right direction?
Thanks for reading.
08-26-2007, 11:06 AM | #2
Senior Member | Registered: Sep 2005 | Location: France | Distribution: approximately NixOS (http://nixos.org) | Posts: 1,900
Are you sure it is actually hot-linking? For example, if you request pages from their domain several times, do you see matching hits in your own access log from the IP address you are browsing from? To keep their domain in the browser's address bar, they have to serve the HTML to the visitor through their own server.
If they pull the content from your server on every request (which would be a clumsy way to do it), find their server's IP address and block access for that IP.
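A minimal sketch of such a per-IP block in PHP (the address shown is only a placeholder for whatever IP you find in your logs):
Code:
<?php
// Deny requests coming from a known scraping server.
// 203.0.113.10 is a placeholder; substitute the offending address.
$blocked_ips = array('203.0.113.10');

if (in_array($_SERVER['REMOTE_ADDR'], $blocked_ips)) {
    header('HTTP/1.0 403 Forbidden');
    die('Access denied');
}
?>
You could also do the same thing with Deny directives in the Apache configuration, which keeps PHP from running at all for those requests.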
But it may simply be that they downloaded every page they wanted: all that takes is browsing your whole site and saving each page, and it can be automated (the tools are meant for making offline copies, but they get abused). Nothing short of being physically isolated from every other computer can stop someone from saving the text of your pages. Repeated mails to their provider or hoster can get the site taken offline, though that depends on the provider. There are other ways to give their site some downtime, but they are either criminal and can harm third parties, or can accidentally give them a bump in search rankings.
If they automate the downloading, you can give them some problems. You can try to fool their tool into fetching a very large file; this works best if you also host some decoy content on a free host with a generous space limit and pull it in with iframes, so good luck to them with domain filtering. There are also JavaScript tricks: links are generated by JS, and one of them, instead of loading anything useful, floods memory; naturally it is hidden behind an image captioned 'this link is for killing too-smart download spiders; don't click it!'. Or there is a fun referrer trick: every address on the site redirects to main.php (perhaps again via frames, so the address bar stays current), which serves content based on the referrer, and some downloaders will fail to get around that. And if your site already requires cookies for something useful, you can use that to serve the real content only to clients with capabilities too rich for a spider.
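For instance, a rough sketch of the cookie idea in PHP: content pages hand out a marker cookie and bounce cookie-less clients back to the front page; a normal browser carries the cookie on its next click, while a spider that ignores cookies never gets past the front page. The cookie name here is made up:
Code:
<?php
// Put this at the top of each content page (not on the front page itself).
// 'lq_visitor' is just an illustrative cookie name.
if (!isset($_COOKIE['lq_visitor'])) {
    // Hand the client a cookie and send it to the front page;
    // clients that never send the cookie back stay stuck there.
    setcookie('lq_visitor', '1', time() + 3600);
    header('Location: /index.php');
    exit;
}
// ...real page content below...
?>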
08-27-2007, 10:47 AM | #3
Member | Registered: Dec 2005 | Distribution: RedHat, Ubuntu | Posts: 101
If someone is poaching your PHP-parsed pages, wedge a test against $_SERVER['HTTP_REFERER'] at the top of your pages: require() a single file from all of them, then experiment in that file to tune exactly what you need.
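Something along these lines as a starting point (the file name is made up, www.realsite.com is the placeholder domain from the first post, and remember the Referer header is optional and easy to forge, so this only deters casual hot-linking):
Code:
<?php
// check_referer.php -- require() this at the top of every page.
$referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';

// Let direct visits (no referer) through, but refuse requests that
// arrived from a link on some other site.
if ($referer !== '' && strpos($referer, 'www.realsite.com') === false) {
    header('HTTP/1.0 403 Forbidden');
    exit('Hot-linking is not permitted.');
}
?>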
08-27-2007, 03:12 PM | #4
Member | Registered: Aug 2003 | Location: Oz | Distribution: Gentoo - Debian | Posts: 202 | Original Poster
Thanks for the ideas, will give it a go...
08-27-2007, 07:08 PM | #5
Member | Registered: Aug 2007 | Location: The Netherlands | Distribution: Fedora 7 x86_64 | Posts: 119
Well, you could check for a secret token (an MD5 hash, say) in every file, like this:
Code:
<?php
if ($_GET["secret"] != 'yourmd5codehere') {
    die("Not allowed to view this page");
}
// your page here
?>
so when a visitor wants to see a page, he clicks a link that will look like http://www.mydomain.com/test.php?secret=yourmd5codehere
Or, when people try to grab your pages by requesting them directly, you can check the referer by putting this at the top of every file:
Code:
<?php
// $_SERVER keys are upper-case: HTTP_REFERER, SERVER_NAME.
// The referer can be empty or forged, so treat this as a deterrent only.
if (empty($_SERVER['HTTP_REFERER']) ||
    strpos($_SERVER['HTTP_REFERER'], $_SERVER['SERVER_NAME']) === false) {
    die("You cannot access this page directly");
}
// ...
?>
Last edited by nan0meter; 08-27-2007 at 07:12 PM.