Quote:
Originally Posted by stormpunk
Use squid and the url_rewrite_program option in it. Squid passes in a url, your program should check if it's the url you want to intercept. Set up a squid ACL to only bother with requests for the particular machine(s) that you want to intercept.
Hey hey, I've got some squid success! There are a couple of things I'm still getting stuck on, though:
I can't figure out how to redirect the URL to a local file. I can replace 'http://example.com/whatever' with 'http://mydomain.com/whatever' -- which is fine as long as there's a web server somewhere that I control, but it would be a lot tidier if I could point those requests straight at /home/fang2415/myfile.txt.
Also, squid works out of the box on my Debian system, but seems to refuse all connections when installed on my Ubuntu (10.10) box, even though the squid.conf files are the same. I haven't done a lot of searching for solutions to this yet, but any hints would be much appreciated -- the squid configuration learning curve is a little steep where I'm standing. (EDIT: Just reinstalled squid and redid everything and it's working on the Ubuntu box now. Might've had a typo in there or something.)
Quote:
Originally Posted by corp769
One thing I just thought of, not sure if it was mentioned or not - How about changing your hosts file and point example.com to a different address? That's what I do when I want to mess with people
I thought something like this might be easier than tackling a squid configuration, but will it work for particular files, or only for whole domains? That is, I would guess that you could use this method to point example.com at another machine's address, but not to map example.com/robots.txt to mydomain.com/robots_replacement.txt? 'man hosts' says that host names can contain only alphanumerics, '-', and '.', so it doesn't seem like it could handle a '/' (see the sketch just below). Or is there a better way of doing this that I'm overlooking? It would be great if that worked, since then I could avoid squid altogether.
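For reference, the most a hosts entry could apparently do is send the whole domain to some other address; the IP below is just a placeholder:
Code:
# /etc/hosts maps hostnames to addresses only; a path like /robots.txt can't appear here
203.0.113.7    example.com
That would redirect every request for example.com, not one particular file, and it still needs a web server at the other end to serve the replacement content.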
In case other dummies like me are interested, here's how I got the squid method working to spoof example.com/robots.txt:
Code:
sudo apt-get install squid
Wrote the following (which I found here) to /home/fang2415/bin/redirect.pl:
Code:
#! /usr/bin/perl
# enable buffer autoflush
$|=1;
# read rewrite requests from squid, one per line, on standard input
while (<STDIN>) {
# perform string substitution:
# replace example.com/robots.txt with mydomain.com/robots_replacement.txt
s@http://example.com/robots.txt@http://mydomain.com/robots_replacement.txt@;
# echo the (possibly rewritten) line back to squid on standard output
print STDOUT;
}
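(One detail that's easy to miss: squid has to be able to execute the script, so it presumably needs the executable bit set, something like:)
Code:
chmod +x /home/fang2415/bin/redirect.pl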
Added the following lines to /etc/squid/squid.conf:
Code:
acl redirect dstdomain .example.com
url_rewrite_access allow redirect
url_rewrite_program /home/fang2415/bin/redirect.pl
Then ran
Code:
sudo squid -k reconfigure
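(If the reconfigure doesn't seem to take, or connections get refused like they did on my Ubuntu box at first, squid can sanity-check things itself:)
Code:
sudo squid -k parse   # report any squid.conf syntax errors
sudo squid -k check   # verify the config and that a squid process is running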
Then added the following to ~/.wgetrc:
Code:
http_proxy = localhost:3128/
(Or to test it, set Firefox's manual proxy to 127.0.0.1:3128).
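(A quick end-to-end check is to fetch the file through the proxy and see whether the replacement content comes back:)
Code:
# fetch via the proxy; the output should be the contents of robots_replacement.txt
http_proxy=http://localhost:3128/ wget -O - http://example.com/robots.txt
# and/or watch the request show up in squid's log (Debian/Ubuntu default path)
tail -f /var/log/squid/access.log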
Whew. I'm glad to have a workaround, and many thanks to stormpunk for the suggestion! But it does seem like a slightly heavyweight solution -- if anybody has ideas on how to simplify it using just a hosts file or anything else, I'd appreciate it!