Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
01-11-2008, 04:30 PM | #1
LQ Newbie
Registered: Jan 2008
Posts: 19
Problem: Squid: The requested URL could not be retrieved
Hi, I'm setting up a Squid proxy on a Debian Etch machine. I compiled it with:
Code:
# ./configure --prefix=/usr/local/squid --enable-delay-pools --enable-cache-digests --enable-poll --disable-ident-lookups --enable-truncate --enable-removal-policies
# make all
# make install
I'm redirecting WWW requests from the LAN to the Squid server (Squid sits on the LAN gateway). The redirection works, but when any browser on the LAN tries to open ANY page (for example http://www.google.com), it gets the following:
Code:
ERROR
The requested URL could not be retrieved
____________________________________________________
While trying to retrieve the URL: /
Some aspect of the URL is incorrect. Possible problems: - Missing or incorrect access protocol (should be `http://" or similar)
- Missing hostname
- Illegal double-escape in the URL-Path
- Illegal character in hostname; underscores are not allowed
Your cache administrator is webmaster.
_________________________________________________
Generated Fri, 11 Jan 2008 18:30:27 GMT by debian.gateway.2wire.net (squid/3.0.STABLE1)
This shows up with any page. DNS is working; I checked with Wireshark: the requests reach the proxy, but they don't go out. Even a local server on the same network as the proxy won't show up.
I think I'm doing something wrong, because I've tried Squid on openSUSE 10.3 with exactly the same result.
Does the order of http_access deny|allow lines matter? Do they have a priority?
Here are my iptables shell script and my squid.conf; any ideas would be welcome.
Code:
#!/bin/bash
# File: firewall.sh
IPLAN="172.16.1.1"
IPEXT="192.168.1.82"
IPSRV="172.16.1.2"
IFLAN="eth1"
IFEXT="eth0"
sudo ifconfig $IFLAN $IPLAN
sudo ifconfig $IFEXT $IPEXT
# Enable IP forwarding ("sudo echo 1 > file" fails: the redirect runs unprivileged)
echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward > /dev/null
sudo iptables -F
sudo iptables -X
sudo iptables -t nat -F
sudo iptables -t nat -X
# Default policies: drop everything.
sudo iptables -P INPUT DROP
sudo iptables -P FORWARD DROP
sudo iptables -P OUTPUT DROP
# Anti-DoS: allow at most 3 requests in 5 seconds to the HTTP server
sudo iptables -A FORWARD -p tcp -i $IFEXT -o $IFLAN --dport 80 --syn -m recent --name antidos --rcheck --seconds 5 --hitcount 4 -j DROP
# Rules for external access to the HTTP server
sudo iptables -t nat -A PREROUTING -p tcp -i $IFEXT --dport 80 --syn -m recent --name antidos --set -j DNAT --to-destination $IPSRV
sudo iptables -A FORWARD -p tcp -i $IFEXT -o $IFLAN --dport 80 -j ACCEPT
sudo iptables -A FORWARD -p tcp -i $IFLAN -o $IFEXT --sport 80 -j ACCEPT
sudo iptables -t nat -A POSTROUTING -p tcp -o $IFLAN --dport 80 -j SNAT --to-source $IPLAN
# Rules for DNS from the LAN
#sudo iptables -t nat -A PREROUTING -p udp -i $IFEXT --sport 53 -j DNAT --to-destination $IPSRV
sudo iptables -t nat -A POSTROUTING -p udp -o $IFEXT --dport 53 -j SNAT --to-source $IPEXT
sudo iptables -A FORWARD -s $IPSRV -p udp -i $IFLAN -o $IFEXT --dport 53 -j ACCEPT
sudo iptables -A FORWARD -p udp -i $IFEXT -o $IFLAN --sport 53 -j ACCEPT
# Rules for Squid
sudo iptables -A INPUT -p udp -i $IFEXT --sport 53 -j ACCEPT
sudo iptables -A OUTPUT -p udp -o $IFEXT --dport 53 -j ACCEPT
sudo iptables -A INPUT -p tcp -i $IFEXT --sport 80 -j ACCEPT
sudo iptables -A OUTPUT -p tcp -o $IFEXT --dport 80 -j ACCEPT
sudo iptables -t nat -A PREROUTING -p tcp -i $IFLAN --dport 80 -j REDIRECT --to-ports 6666
sudo iptables -A INPUT -p tcp -i $IFLAN --dport 6666 -j ACCEPT
sudo iptables -A OUTPUT -p tcp -o $IFLAN --sport 6666 -j ACCEPT
Code:
#File: squid.conf
acl prot proto HTTP FTP
acl metodos method GET POST
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8
acl localnet src 172.16.1.0/24 # RFC1918 possible internal network
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access deny all
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow manager localhost
http_access allow prot
http_access allow metodos
http_access allow localnet
http_access allow localhost
http_reply_access allow all
icp_access allow localnet
icp_access deny all
htcp_access allow localnet
htcp_access deny all
http_port 6666
cache_mem 16 MB
maximum_object_size_in_memory 16 KB
cache_dir ufs /usr/local/squid/var/cache 100 16 256
access_log /usr/local/squid/var/logs/access.log squid
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
cache_effective_user nobody
cache_effective_group nogroup
icp_port 3130
acl magic_words1 url_regex -i 172.16
acl magic_words2 url_regex -i .ftp .exe .mp3 .vqf .tar.gz .gz .rpm .zip .rar .avi .mpeg .mpe .mpg .qt .ram .rm .iso .raw .wav .mov
acl day time 09:00-23:59
delay_pools 2
delay_class 1 2
delay_parameters 1 -1/-1 -1/-1
delay_access 1 allow magic_words1
delay_class 2 2
delay_parameters 2 5000/150000 5000/120000
delay_access 2 deny !day
delay_access 2 allow day
delay_access 2 allow magic_words2
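[Editor's note for later readers] On the ordering question: Squid checks http_access lines top-down and stops at the first line that matches; if nothing matches, it does the opposite of the last line. So a `http_access deny all` placed first, as above, would deny every request. A conventional ordering (a sketch based on the defaults shipped with Squid 3.0, using the ACL names from the config above) puts the special-case denies first and the catch-all deny last:

```
# Conventional http_access ordering (sketch, Squid 3.0 defaults):
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localnet
http_access allow localhost
http_access deny all          # catch-all goes LAST
```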
01-11-2008, 05:02 PM | #2
Moderator
Registered: Jun 2001
Location: UK
Distribution: Gentoo, RHEL, Fedora, Centos
Posts: 43,417
I've not seen that Squid behaviour directly, and I'm sure someone else could instantly tell you, but within Wireshark, what are the full HTTP headers being sent? Maybe there's no Host header field in them...
01-12-2008, 10:55 PM | #3
Member
Registered: Dec 2005
Location: St Petersburg, FL, USA
Posts: 220
I've set up Squid on Debian Etch and never had any problem. Why did you recompile it? apt-get install squid works fine... try that and see if the problem goes away.
01-13-2008, 01:03 PM | #4
LQ Newbie
Registered: Jan 2008
Posts: 19
Original Poster
Quote:
what are the full http headers that are being sent?
Here's what I get from Wireshark:
REQUEST IP and TCP headers:
Code:
Internet Protocol, Src: 172.16.1.2 (172.16.1.2), Dst: 72.14.207.99 (72.14.207.99)
Transmission Control Protocol, Src Port: 39127 (39127), Dst Port: www (80), Seq: 1, Ack: 1, Len: 491
Code:
GET / HTTP/1.1\r\n
Host: google.com\r\n
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.8.1.6) Gecko/20070730 SUSE/2.0.0.6-25 Firefox/2.0.0.6\r\n
Accept: text/xml,application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5\r\n
Accept-Language: en-us,en;q=0.5\r\n
Accept-Encoding: gzip,deflate\r\n
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7\r\n
Keep-Alive: 300\r\n
Connection: keep-alive\r\n
Cookie: PREF=ID=1b958449047fc239:TM=1199289509:LM=1199289509:S=mfiXpZpnuaMMQUi1\r\n
\r\n
REPLY from Squid, IP and TCP headers:
Code:
Internet Protocol, Src: 72.14.207.99 (72.14.207.99), Dst: 172.16.1.2 (172.16.1.2)
Transmission Control Protocol, Src Port: www (80), Dst Port: 39127 (39127), Seq: 1449, Ack: 492, Len: 415
HTTP:
Code:
HTTP/1.0 400 Bad Request\r\n
Server: squid/3.0.STABLE1\r\n
Mime-Version: 1.0\r\n
Date: Sun, 13 Jan 2008 15:29:51 GMT\r\n
Content-Type: text/html\r\n
Content-Length: 1449\r\n
Expires: Sun, 13 Jan 2008 15:29:51 GMT\r\n
X-Squid-Error: ERR_INVALID_URL 0\r\n
X-Cache: MISS from debian.gateway.2wire.net\r\n
X-Cache-Lookup: NONE from debian.gateway.2wire.net:6666\r\n
Via: 1.0 debian.gateway.2wire.net (squid/3.0.STABLE1)\r\n
Proxy-Connection: close\r\n
\r\n
and then comes the HTML page with the error I've already posted... there's no forwarding to the other interface. But I've allowed outgoing HTTP and DNS requests from the Squid server with iptables.
Quote:
apt-get install squid works fine... try that one and see if the problem goes away.
OK, I uninstalled (make uninstall) and did apt-get install squid, with no problems at all. I restarted Squid, but the problem is still the same. Maybe something is wrong with my squid.conf?
I ran telnet google.com 80 from the command line on a SUSE client on the LAN, then:
Code:
GET / HTTP/1.1
Host: google.com
// pressed an additional <ENTER> here.
But it's still the same; Squid sends the same page. If no http_access allow rule matches the request, what does Squid return? Is it the page I'm getting?
Thanks a lot for the help, guys.
01-13-2008, 02:07 PM | #5
Moderator
Registered: Jun 2001
Location: UK
Distribution: Gentoo, RHEL, Fedora, Centos
Posts: 43,417
ah, ok i think it's down to the lack of a transparent definition on the listening port. the right config will derive the required data from the generic http headers rather than the standard proxy-style headers that the client would send if it knew it was talking to a proxy.
http://www.cyberciti.biz/tips/linux-...uid-howto.html
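[Editor's note] To illustrate the point for later readers, here's a hypothetical Python sketch (the function and names are mine, not Squid's): a proxy-aware client puts the absolute URL in the request line, while a transparently intercepted browser sends only the path, so the proxy has to rebuild the URL from the Host header. Without a transparent setting on the listening port, Squid expects the first form and rejects the second with ERR_INVALID_URL.

```python
# Client configured to use a proxy sends the absolute URL in the request line:
forward_request = (
    "GET http://www.google.com/ HTTP/1.1\r\n"
    "Host: www.google.com\r\n\r\n"
)

# A transparently intercepted client sends only the path; the proxy must
# rebuild the absolute URL from the Host header:
intercepted_request = (
    "GET / HTTP/1.1\r\n"
    "Host: www.google.com\r\n\r\n"
)

def rebuild_url(raw: str) -> str:
    """Reconstruct the absolute URL the way an intercepting proxy must."""
    lines = raw.split("\r\n")
    method, target, version = lines[0].split()
    if target.startswith("http://"):  # absolute form: forward-proxy request
        return target
    # origin form: glue scheme + Host header + path back together
    headers = dict(line.split(": ", 1) for line in lines[1:] if ": " in line)
    return "http://" + headers["Host"] + target

print(rebuild_url(forward_request))      # http://www.google.com/
print(rebuild_url(intercepted_request))  # http://www.google.com/
```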
01-16-2008, 10:12 AM | #6
LQ Newbie
Registered: Jan 2008
Posts: 19
Original Poster
Done
Quote:
Originally Posted by acid_kewpie
ah, ok i think it's down to the lack of a transparent definition on the listening port. the right config will derive the required data from the generic http headers rather than the standard proxy-style headers that the client would send if it knew it was talking to a proxy.
http://www.cyberciti.biz/tips/linux-...uid-howto.html
Thanks a lot, that helped, but that tutorial seems to target an older version of Squid; I had to use:
Code:
http_port 6666 vhost
That solved the problem. Thanks a lot, I've got my transparent proxy up.
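[Editor's note, my reading of the Squid documentation rather than something tested against this exact setup] Squid 2.6 and 3.0 also accept a `transparent` option on http_port for interception setups (renamed `intercept` in 3.1+); `vhost` is primarily meant for accelerator/reverse-proxy mode, but it likewise makes Squid reconstruct the URL from the Host header, which is why it worked here. A sketch of the more conventional line:

```
# squid.conf - interception port, Squid 3.0 syntax (sketch)
http_port 6666 transparent
```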