Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
I installed a Squid proxy on an RHEL server and it is mostly working fine, but some sites will not load through it.
When I try to open ppcreport.php in the browser it shows "the connection was reset", and some other sites show error 324 (empty response).
http://www.safra.sg does not work through the proxy at all; it shows error 104 (read error, connection reset by peer).
Without the proxy all of these sites work fine.
Yes, I followed the tutorial, and my Squid service is running.
Squid log entry for the failing request:
27/Jun/2012:09:39:26 +0800 9103 127.0.0.1 172.17.2.50 TCP_MISS/502 1449 GET http://www.safra.sg/ - DIRECT/203.127.218.160 text/html
But the same site works over SSL (https).
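For reference, one way to watch that log live while reproducing the error (assuming the default RHEL log location for Squid; adjust the path if yours differs) is:
Code:
# Watch Squid's access log for the failing site while reloading it in the browser
tail -f /var/log/squid/access.log | grep safra.sg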
Certainly for safra.sg, I think the problem's at their end:
Code:
[joshua:~]$ wget www.safra.sg
--2012-06-27 16:46:53-- http://www.safra.sg/
Resolving www.safra.sg... 203.127.218.160
Connecting to www.safra.sg|203.127.218.160|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 112525 (110K) [text/html]
Saving to: `index.html.1'
100%[=======================================================================================================>] 112,525 56.4K/s in 1.9s
2012-06-27 16:46:57 (56.4 KB/s) - `index.html.1' saved [112525/112525]
[joshua:~]$ wget 203.127.218.160
--2012-06-27 16:43:13-- http://203.127.218.160/
Connecting to 203.127.218.160:80... connected.
HTTP request sent, awaiting response... 400 Bad Request
2012-06-27 16:43:14 ERROR 400: Bad Request.
[joshua:~]$
I don't quite understand why this is happening (something to do with their server setup), but it seems Squid is doing the same thing as me: trying to connect via IP, and getting a Bad Request response. I'll do some digging, but maybe someone more conversant in these matters will be able to help you in the meantime.
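One hedged way to poke at the "connecting by IP" theory, if curl is available, is to repeat the request with and without a real hostname in the Host header; the expected status codes below are just what the wget output above suggests:
Code:
# Request the bare IP: curl sends "Host: 203.127.218.160", expect 400 Bad Request
curl -s -o /dev/null -w '%{http_code}\n' http://203.127.218.160/
# Same request with the real hostname supplied as the Host header, expect 200
curl -s -o /dev/null -w '%{http_code}\n' -H 'Host: www.safra.sg' http://203.127.218.160/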
browser page shows:
The connection was reset
The connection to the server was reset while the page was loading
The site could be temporarily unavailable or too busy. Try again in a few moments.
If you are unable to load any pages, check your computer's network connection.
If your computer or network is protected by a firewall or proxy, make sure that Firefox is permitted to access the Web.
You're using Firefox? Install Firebug (https://getfirebug.com/) and use its 'Net' panel to look at the headers you're sending. I've worked out why the site rejects the connection:
Code:
[joshua:~]$ telnet www.safra.sg 80 (28-06 09:31)
Trying 203.127.218.160...
Connected to www.safra.sg.
Escape character is '^]'.
GET / HTTP/1.1
HTTP/1.1 400 Bad Request
Content-Type: text/html
Date: Thu, 28 Jun 2012 08:29:37 GMT
Connection: close
Content-Length: 39
<h1>Bad Request (Invalid Hostname)</h1>Connection closed by foreign host.
[joshua:~]$ telnet www.safra.sg 80 (28-06 09:32)
Trying 203.127.218.160...
Connected to www.safra.sg.
Escape character is '^]'.
GET / HTTP/1.1
Host: www.safra.sg
HTTP/1.1 200 OK
Date: Thu, 28 Jun 2012 08:29:54 GMT
Server: Microsoft-IIS/6.0
X-Powered-By: ASP.NET
X-AspNet-Version: 2.0.50727
Cache-Control: private
Content-Type: text/html; charset=utf-8
Content-Length: 112285
So my guess is that either your browser isn't sending the 'Host' header, or your Squid server is removing it.
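If you want to confirm the second possibility, a rough sketch is to capture what Squid actually sends to the origin server while you reload the page through the proxy (the interface name eth0 is a guess based on the rules quoted later in this thread; adjust to your setup):
Code:
# On the Squid box: show the outgoing request to safra.sg's IP and look for a
# 'Host: www.safra.sg' line in the GET request; if it's missing, Squid strips it
tcpdump -nA -i eth0 'dst host 203.127.218.160 and tcp port 80'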
I think my Squid server is removing it.
I installed Squid + ClamAV + DansGuardian.
In my Firefox browser I set the proxy to the DansGuardian port, 6081; Squid listens on port 8080.
Everything else is working fine, but some sites are not allowed through the proxy.
Please look at my iptables rules:
-A RH-Firewall-1-INPUT -p tcp -m state --state NEW -m tcp --dport 6081 -m limit --limit 500/sec --limit-burst 800 -j ACCEPT
-A RH-Firewall-1-INPUT -p tcp -m state --state NEW -m tcp --dport 6081 -j ACCEPT
-A RH-Firewall-1-INPUT -p tcp -m state --state NEW -m tcp --dport 80 -j ACCEPT
-A RH-Firewall-1-INPUT -p tcp -m state --state NEW -m tcp --dport 6081 -j ACCEPT
-A RH-Firewall-1-INPUT -j REJECT --reject-with icmp-host-prohibited
COMMIT
*nat
:OUTPUT ACCEPT [0:0]
:POSTROUTING ACCEPT [0:0]
:PREROUTING ACCEPT [0:0]
-A PREROUTING -i eth0 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 6081
COMMIT
*raw
:OUTPUT ACCEPT [0:0]
:PREROUTING ACCEPT [0:0]
COMMIT
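Not specific to your ruleset, but a couple of generic checks can confirm that the port-80 REDIRECT rule is actually matching and that both daemons are listening:
Code:
# Packet counters for the nat PREROUTING chain show whether the port-80 REDIRECT fires
iptables -t nat -L PREROUTING -v -n
# Confirm Squid (8080) and DansGuardian (6081) are listening
netstat -tlnp | grep -E ':(8080|6081)'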
Have you checked that Firefox is sending the Host header? It should (unless it's a very old version), but it's worth checking before we move on to debugging other things.
Also, what are your proxy settings in FF? Are you running it through DansGuardian or Squid when you get the error? Or does one redirect through the other?
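One quick way to answer that is to request the failing site through each layer separately and compare; <proxy-host> below is a placeholder for the machine running Squid and DansGuardian:
Code:
# Through DansGuardian (the port Firefox is pointed at)
curl -v -x http://<proxy-host>:6081 http://www.safra.sg/ -o /dev/null
# Directly through Squid, bypassing DansGuardian
curl -v -x http://<proxy-host>:8080 http://www.safra.sg/ -o /dev/null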
The browser still shows the same "The connection was reset" error page as before. Here is the relevant part of my squid.conf:
# TAG: icp_access
# Allowing or Denying access to the ICP port based on defined
# access lists
#
# icp_access allow|deny [!]aclname ...
#
# See http_access for details
#
#Default:
icp_access allow localnet
icp_access deny all
#
#Allow ICP queries from everyone
icp_access allow all
follow_x_forwarded_for deny all
acl_uses_indirect_client on
delay_pool_uses_indirect_client on
log_uses_indirect_client on
http_port 8080
hierarchy_stoplist cgi-bin ?
cache_mem 1024 MB
maximum_object_size_in_memory 8 KB
cache_dir ufs /var/spool/squid 5000 16 256
maximum_object_size 16384 KB
debug_options ALL,1
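Since the excerpt above doesn't show any header-related directives, it may be worth grepping the whole file for ones that can strip or rewrite request headers (directive names differ between Squid 2.x and 3.x; the path below is the usual RHEL default):
Code:
# Look for directives that can remove or alter request headers such as Host
grep -nE 'header_access|request_header_access|via|forwarded_for' /etc/squid/squid.conf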
Hi,
I added "safra.sg" to exceptionsitelist, but I still get the same error 104 (read error).
Please look below.
#Sites in exception list
#Don't bother with the www. or
#the http://
#
#These are specifically domains and are not URLs.
#For example 'foo.bar/porn/' is no good, you need
#to just have 'foo.bar'.
#
#You can also match IPs here too.
#
#As of DansGuardian 2.7.3 you can now include
#.tld so for example you can match .gov for example
# Time limiting syntax:
# #time: <start hour> <start minute> <end hour> <end minute> <days>
# Example:
##time: 9 0 17 0 01234
# Remove the first # from the line above to enable this list only from
# 9am to 5pm, Monday to Friday.
# Blanket exception. To allow all sites except those in the
# exceptionsitelist and greysitelist files, remove
# the # from the next line to leave only a '**':
#**
# Blanket SSL/CONNECT exception. To allow all SSL
# and CONNECT tunnels except to addresses in the
# exceptionsitelist and greysitelist files, remove
# the # from the next line to leave only a '**s':
#**s
# Blanket IP exception. To allow all sites specified only as an IP,
# remove the # from the next line to leave only a '*ip':
#*ip
# Blanket SSL/CONNECT IP exception. To allow all SSL and CONNECT
# tunnels to sites specified only as an IP,
# remove the # from the next line to leave only a '*ips':
#*ips
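For reference, an entry in that list should be a bare domain, as the comments above say; a hypothetical example entry (the file is commonly /etc/dansguardian/lists/exceptionsitelist, but the path varies by install):
Code:
# Bare domain only -- no http:// and no www. prefix
safra.sg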
Why don't you comment out all the sites you entered, and uncomment the line which says:
Code:
# Blanket exception. To allow all sites except those in the
# exceptionsitelist and greysitelist files, remove
# the # from the next line to leave only a '**':
#**
That way you've effectively disabled DansGuardian's filtering, so you'll know whether the problem is actually in Squid.
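After changing the list, DansGuardian normally has to be restarted (or told to reload its lists) before the change takes effect; on a RHEL-era init system that would typically be something like:
Code:
# Restart DansGuardian so it rereads its lists (service name may differ on your install)
service dansguardian restart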