Forums > Linux Forums > Linux - Newbie
Linux - Newbie This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-to's this is the place!


Old 07-24-2010, 06:39 AM   #1
LQ Newbie
Registered: Jan 2010
Posts: 15

Rep: Reputation: 0
Could not retrieve local web page, Squid

Hello everyone. I'm using Fedora 12 and have a configuration problem with Squid (version 3.1.4).

The configuration file is posted below.

I have LAMP installed on my Linux machine. Squid doesn't load the PHP page and gives me this error:
------------------------------ Error -----------------------------
The following error was encountered while trying to retrieve the URL: http://localhost/photo/index.php

Connection to ::1 failed.

The system returned: (111) Connection refused

The remote host or network may be down. Please try the request again.

Your cache administrator is root.


--> I can't figure out how I ran into this problem. All the services are running: Squid is up, there are no iptables rules, and LAMP is running and listening. Please help!
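The error message says the connection to ::1 (the IPv6 loopback) was refused, which hints at the usual IPv4/IPv6 mismatch: Squid's resolver returns ::1 first for "localhost", but Apache may only be bound to 127.0.0.1. This is an assumption about the cause, not a confirmed diagnosis; a minimal sketch to see what "localhost" resolves to, in order:

```python
import socket

# List every address "localhost" maps to, in resolution order.
# If ::1 comes first and Apache only listens on 127.0.0.1,
# Squid's connect attempt to ::1 is refused.
addrs = socket.getaddrinfo("localhost", 80, proto=socket.IPPROTO_TCP)
for family, _, _, _, sockaddr in addrs:
    label = "IPv6" if family == socket.AF_INET6 else "IPv4"
    print(label, sockaddr[0])
```

If ::1 shows up first, either bind Apache to IPv6 as well (`Listen [::]:80`) or use 127.0.0.1 instead of localhost in the tested URL.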

+ Also, I would like to test Squid's performance to see if it works as I expected, but it doesn't seem to: I still experience slow page loading. I'm on ADSL2, and the Linux box has only Squid and LAMP (Linux, Apache, MySQL, PHP) installed.
- I checked Squid's cache.log and access.log to see if it is caching anything, and it is.
- On the same machine I tried to access the same page again, but it doesn't seem to be served from the cache. I noticed this with the iftop utility: the box still sends DNS requests to resolve the hostname and so on, which means there is no caching at all. I can't figure out why this happens; I must have configured something wrong.

configuration file:
# Recommended minimum configuration:
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl localhost src ::1/128
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32
acl to_localhost dst ::1/128
# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines

acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
# Recommended minimum Access Permission configuration:
# Only allow cachemgr access from localhost
http_access allow manager localhost
http_access allow localhost to_localhost
http_access deny manager

# Deny requests to certain unsafe ports
http_access deny !Safe_ports

# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports

# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost

# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
http_access allow localnet
http_access allow localhost
# And finally deny all other access to this proxy
http_access deny all

#--------------------------RESTRICTED WEB SITE----------------------------
acl blockfiles url_regex "/etc/squid/bad_web_sites.squid"
acl blockpath urlpath_regex "/etc/squid/bad_web_sites.squid"
deny_info ERR_BLOCKED_FILES blockfiles
deny_info ERR_BLOCKED_FILES blockpath
http_access deny blockfiles
http_access deny blockpath
#--------------------------END OF RESTRICTED WEB SITES----------------------------
# Squid normally listens to port 3128
http_port 3128 transparent
always_direct allow all
# We recommend you to use at least the following line.
hierarchy_stoplist cgi-bin ?
# Uncomment and adjust the following to add a disk cache directory.
cache_dir ufs /var/spool/squid 10000 16 256
# Cache access log enters here
cache_access_log /var/log/squid/access.log
log_mime_hdrs off
#--------------------------MIME TABLE---------------------------

mime_table /etc/squid/mime.conf

#------------------------ CACHE TUNING -------------------------
cache_mem 256 MB
reply_body_max_size 3 MB

#------------------------- TIME OUT ---------------------------
connect_timeout 90 second

#---------------VISIBLE HOSTNAME----------------------
visible_hostname linux

# Leave coredumps in the first cache dir
coredump_dir /var/spool/squid

# Add any of your own refresh_pattern entries above these.
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320


Any help would be greatly appreciated
Thanks a lot
Attached Files
File Type: txt squidConf.txt (3.5 KB, 12 views)

Last edited by vincent.dang; 07-24-2010 at 06:42 AM. Reason: the attachment is annoying; I don't know why it turned out like this. It's better to post it in the thread.
Old 07-25-2010, 11:03 AM   #2
Registered: Nov 2009
Location: Kolkata, India
Distribution: Fedora 11
Posts: 136

Rep: Reputation: 22
Use "http_access allow all" at the end, after "http_access deny all", and see if you get any reply.
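For what it's worth, Squid evaluates http_access rules top to bottom and stops at the first match, so an allow all placed after deny all is never reached. For a temporary open-access test, the rule would have to come before the deny:

```
# TEMPORARY, for debugging only: opens the proxy to everyone.
# Squid stops at the first matching http_access rule, so this
# must come before "http_access deny all" to have any effect.
http_access allow all
http_access deny all
```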
Old 07-27-2010, 07:01 PM   #3
LQ Newbie
Registered: Jan 2010
Posts: 15

Original Poster
Rep: Reputation: 0
http_access allow all doesn't do the trick; I still get the same "connection refused" error. Every problem has a solution, but this solution has a problem.



