
nandagopalrvarma 02-09-2009 02:54 AM

Squid.conf configuration for http_access not working properly
Hi,

I am trying to configure Squid on RHEL Server 5 to replace an Apache web proxy. What I want is this: everybody authenticates through ncsa_auth, but I want different website-blocking levels for different usernames. That is, managers and guests should have unrestricted access, while the rest of the company's employees (whose usernames are in ncsa_auth) should have restricted access, with dstdomain, url_regex and urlpath_regex ACLs configured to block certain domains, websites, words and filenames.

My problem: I created two ACLs, badwords and badsites. The badsites ACL points to a file listing the domains I need to block, and the badwords ACL should block websites based on words they contain.

I applied the ACLs in the following manner:

acl ncsa_users proxy_auth REQUIRED
#acl goodsites dstdomain "/usr/local/etc/allowed-sites.conf"
acl badsites dstdomain "/usr/local/etc/restricted-sites.conf"
acl badwords url_regex -i "/usr/local/etc/badwordfile.conf"

and in the http_access section:

# And finally deny all other access to this proxy
http_access allow localhost
#http_access allow ncsa_users !badwords
http_access allow ncsa_users !badsites
http_access deny all

The line http_access allow ncsa_users !badwords does not work; only http_access allow ncsa_users !badsites works. If I uncomment the badwords line, either everything is allowed or everything is denied.

Is something like this OK?
http_access allow ncsa_users !badsites !badwords

I don't know how to configure different whitelists for different groups of people within the same ncsa_auth setup. Or is it possible to use ncsa_auth along with Windows NT authentication at the same time?

I would really appreciate your help; I am really struggling with this...

nandagopalrvarma 02-11-2009 04:04 AM

Any ideas, guys?

chitambira 02-11-2009 01:03 PM

First of all, remove http_access allow localhost.
Note that the config file is read top-down, and as soon as a match is found, processing stops.
So in this case, as long as you have http_access allow ncsa_users !badsites, all authenticated users browsing allowed sites are let through.
Now, when you uncomment http_access allow ncsa_users !badwords, the badsites rule below it will never be reached, because a match will already have been found on that line.

So everything will be allowed if your badwords file is OK, or everything denied if it is somehow not OK. Make sure the squid user can read that file, and consider naming it badwords.txt instead.
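A quick way to act on the readability point above is a small check script; the path is the one used in this thread, and whether your cache really runs as user "squid" depends on your cache_effective_user setting:

```shell
# Check that a blocklist file exists and is readable. Run it as the
# account squid runs under, e.g.: sudo -u squid sh check.sh
check_readable() {
    if [ -r "$1" ]; then
        echo "readable: $1"
    else
        echo "NOT readable: $1"
    fi
}

check_readable /usr/local/etc/badwordfile.conf
```

If it reports NOT readable, `chmod 644` on the file (and execute permission on the directories leading to it) is the usual fix.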

Doing http_access allow ncsa_users !badsites !badwords is allowed.
The best approach with Squid is to use deny rules first as much as possible, to avoid confusion and to keep the bad guys out on the first match, e.g.:

http_access deny !ncsa_users
http_access allow ncsa_users !badsites !badwords
http_access deny all
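To get the different levels asked about above (managers unrestricted, everyone else filtered), proxy_auth ACLs can also name specific users. A minimal sketch following the same deny-first layout; the usernames here are hypothetical and the file paths are the ones from this thread:

```
# Named users pass unfiltered; everyone else must still authenticate
# and is then subject to the blocklists. First match wins.
acl managers proxy_auth jsmith mthomas        # hypothetical usernames
acl ncsa_users proxy_auth REQUIRED
acl badsites dstdomain "/usr/local/etc/restricted-sites.conf"
acl badwords url_regex -i "/usr/local/etc/badwordfile.conf"

http_access deny !ncsa_users
http_access allow managers
http_access allow ncsa_users !badsites !badwords
http_access deny all
```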


nandagopalrvarma 02-24-2009 03:57 AM

Thanks Chitambira,

It's almost completely working for me now; thanks for the info. I still don't know how to use the wildcard option in the url_regex and urlpath_regex ACLs.
My squid.conf is given below. I am trying to use the ACL features properly, specifically to block iGoogle without blocking Google's other services. I did that by including the word 'ig' in a url_regex query, but now all URLs with 'ig' in them are being blocked (of course they are, I know).

My question now is: is it possible to give the entire website in a file and use a dstdomain-type ACL, i.e. 'acl igoogle dstdomain_regex '? Will this block iGoogle alone?

My squid.conf:
[root@mxchn-out squid]# grep -v "^#" /etc/squid/squid.conf | sed -e '/^$/d'
maximum_icp_query_timeout 2000
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
cache_dir ufs /var/spool/squid 100 16 256
logformat squid %ts.%03tu %6tr %>a %Ss/%03Hs %<st %rm %ru %un %Sh/%<A %mt
access_log /var/log/squid/access.log squid
cache_log /var/log/squid/cache.log
cache_store_log /var/log/squid/store.log
log_ip_on_direct on
pid_filename /var/run/
log_fqdn on
ftp_passive on
allow_underscore on
dns_timeout 2 minutes
hosts_file /etc/hosts
auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/squid_passwd
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
acl all src
acl manager proto cache_object
acl localhost src
acl to_localhost dst
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl ncsa_users proxy_auth REQUIRED
acl badsites dstdomain "/usr/local/etc/restricted-sites.conf"
acl baddomwords dstdom_regex -i "/usr/local/etc/bd"
acl badurl urlpath_regex -i .avi .mp3 .wmv .mpeg .mp4 .mpg mail video
acl gblock url_regex -i ig translate_t picasa sms. Servicelogin naukri
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow ncsa_users !baddomwords !badsites !badurl !gblock
http_access deny all
http_reply_access allow all
icp_access allow all
log_access allow ncsa_users
visible_hostname mxchn-out
coredump_dir /var/spool/squid

nandagopalrvarma 02-24-2009 03:59 AM

Also, what do notations like /, ^ and $ mean when used in an ACL, and how do I use them correctly? Are they wildcards?

chitambira 02-24-2009 05:17 AM


The regular expression ^http:// matches any URL that begins with http://.

The regular expression \.jpg$ matches any URL that ends in .jpg. The backslash is added because "." is itself a wildcard that matches any single character.

Squid regex matching is case sensitive by default. To make it case insensitive, use the -i option, e.g.:

acl net url_regex -i ^http://www
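You can try these anchors out directly, since grep -E understands the same class of POSIX extended regexes Squid's url_regex uses (the sample URLs below are made up for illustration):

```shell
# Three sample URLs, one per line
urls='http://www.example.com/page
https://secure.example.com/photo.jpg
http://cdn.example.com/photo.jpglarge'

# ^http://  matches only lines that BEGIN with http:// (not https://)
echo "$urls" | grep -Ec '^http://'        # prints 2

# \.jpg$  matches only lines that END in ".jpg"; the backslash makes
# the dot literal, since an unescaped "." matches any single character
echo "$urls" | grep -Ec '\.jpg$'          # prints 1

# -i in grep, like "url_regex -i" in squid, ignores case
echo 'HTTP://WWW.EXAMPLE.COM' | grep -Eic '^http://'   # prints 1
```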

nandagopalrvarma 03-02-2009 12:06 AM

Thanks Chitambira.

I have a query. iGoogle's URL is

Will acl badurl url_regex -i ^ do the job, or can the same be done with dstdomain?

linuxlover.chaitanya 03-02-2009 12:59 AM

dstdomain should also work. You use the "^" sign when you need to match a string starting with some expression. So if you know which domain you need to ban, you can use the dstdomain option as well, though that means an extra ACL if you do not already have one.

nandagopalrvarma 03-03-2009 12:23 AM

Well, I had created a file for a dstdomain ACL check, but it could only hold plain domains; when I try to block an entire URL, the portion that comes after the .com is not taken into consideration.

The problem with using a url_regex filter, in my experience, is that blocking a URL containing 'ig' blocks everything with 'ig' in it.

I've used ISA 2006, where I just give the complete URL I want to block and it blocks it; it also accepts wildcards like mail.*.*, video.*.*, gm*.*.* etc., and it will accept any complete URL with or without wildcards. Is Squid capable of the same things?

I think Squid has that functionality too, but I tried it using dstdomain as well as url_regex and urlpath_regex, with only limited success.

thanks for your support.

linuxlover.chaitanya 03-03-2009 12:43 AM

Squid can do everything you are saying ISA can do; you just need to go through the documentation and search around for what you need.
As far as wildcards are concerned, you do not always need to specify them: a dstdomain entry with a leading dot (e.g. .example.com) matches that domain and all its subdomains, so you do not need *.example.* or the like.
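A sketch of the two approaches being discussed, with hypothetical domains. dstdomain matches only the host part of a URL, so it cannot distinguish one path on a site from another; blocking a single path like /ig needs url_regex anchored on both host and path:

```
# restricted-sites.conf (dstdomain): one entry per line; a leading dot
# also covers subdomains. This blocks whole hosts, never single paths:
#   .example-blocked.com
#   .another-blocked.net

# To block only http://www.google.com/ig without touching other Google
# services, anchor a url_regex on host and path together:
acl igoogle url_regex -i ^http://www\.google\.com/ig
http_access deny igoogle
```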

All times are GMT -5.