Old 02-09-2009, 01:54 AM   #1
nandagopalrvarma
Squid.conf configuration for http_access not working properly


Hi,

I am trying to configure Squid on RHEL Server 5 to replace an Apache web proxy. What I want to do is this: everybody authenticates through ncsa_auth, but I want different website-blocking levels for different usernames. That is, managers and guests should have unrestricted access, while the rest of the company employees (whose usernames are also in ncsa_auth) should have restricted access, with url_regex, dstdomain and urlpath_regex ACLs configured to block certain domains, websites, words and filenames.

My problem is that I created two ACLs, badwords and badsites. The badsites ACL points to a file containing the domains I need to block, and the badwords ACL should block websites based on certain words.

I applied the ACLs in the following manner:

acl ncsa_users proxy_auth REQUIRED
#acl goodsites dstdomain "/usr/local/etc/allowed-sites.conf"
acl badsites dstdomain "/usr/local/etc/restricted-sites.conf"
acl badwords url_regex -i "/usr/local/etc/badwordfile.conf"


and in the http_access section

# And finally deny all other access to this proxy
http_access allow localhost
#http_access allow ncsa_users !badwords
http_access allow ncsa_users !badsites
http_access deny all


The http_access allow ncsa_users !badwords line does not work; only http_access allow ncsa_users !badsites works. If I uncomment the badwords line, either everything is allowed or everything is denied.

Is something like this OK?
http_access allow ncsa_users !badsites !badwords

I don't know how to configure different whitelists for different groups of people within the same ncsa_auth setup. Or is it possible to use ncsa_auth along with Windows NT authentication at the same time?

I would really appreciate your help; I'm really struggling with this...
 
Old 02-11-2009, 03:04 AM   #2
nandagopalrvarma

Any ideas, guys?
 
Old 02-11-2009, 12:03 PM   #3
chitambira
First of all, remove http_access allow localhost.
Note that the config file is read top-down, and once a match is found, processing does NOT continue.
So in this case, as long as you have http_access allow ncsa_users !badsites, all users who are browsing good sites are allowed.
Now when you add (uncomment) http_access allow ncsa_users !badwords, the badsites rule below it will not be read, because a match will already have been found here.

Now everything will be allowed if your badwords file is OK, or everything denied if it is somehow NOT OK. Make sure the squid user can read that file, and consider making it a badwords.txt file instead.

Doing http_access allow ncsa_users !badsites !badwords is allowed.
The best approach with Squid is to USE DENY FIRST as much as possible, to avoid confusion and to keep your bad guys away at the first match, e.g.:

http_access deny !ncsa_users
http_access allow ncsa_users !badsites !badwords
http_access deny all
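
As for running different whitelists for different groups under the same ncsa_auth: one possible approach (an untested sketch; the managers.txt path and its contents are just an example) is an extra proxy_auth ACL listing the privileged usernames, allowed before the filtered rule:

# hypothetical file with one username per line
acl managers proxy_auth "/etc/squid/managers.txt"
# any successfully authenticated user
acl ncsa_users proxy_auth REQUIRED

http_access deny !ncsa_users
# managers (and guests, if listed in the same file) browse unrestricted
http_access allow managers
# everyone else gets the filters
http_access allow ncsa_users !badsites !badwords
http_access deny all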

Cheers

 
Old 02-24-2009, 02:57 AM   #4
nandagopalrvarma
Thanks Chitambira,

It's almost completely working for me now. Thanks for the info... I still don't know how to use wildcards in the url_regex and urlpath_regex ACLs.
My squid.conf is given below. I am trying to use the ACL features properly, and to block the iGoogle service without blocking Google's other services. I did that by including the word 'ig' in a url_regex ACL, but now all URLs with 'ig' in them are being blocked; of course they are, I know.

My question now is: is it possible to give the entire URL in a file and use a dstdomain-type ACL, i.e. 'acl igoogle dstdomain_regex http://www.google.co.in/ig?hl=en&source=iglk'? Will this block iGoogle alone?

My squid.conf:
[root@mxchn-out squid]# grep -v "^#" /etc/squid/squid.conf | sed -e '/^$/d'
http_port 10.4.0.14:3128
maximum_icp_query_timeout 2000
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
cache deny QUERY
acl apache rep_header Server ^Apache
broken_vary_encoding allow apache
cache_dir ufs /var/spool/squid 100 16 256
logformat squid %ts.%03tu %6tr %>a %Ss/%03Hs %<st %rm %ru %un %Sh/%<A %mt
access_log /var/log/squid/access.log squid
cache_log /var/log/squid/cache.log
cache_store_log /var/log/squid/store.log
log_ip_on_direct on
pid_filename /var/run/squid.pid
log_fqdn on
client_netmask 255.255.255.255
ftp_passive on
allow_underscore on
dns_timeout 2 minutes
dns_nameservers 10.4.0.18 10.4.0.20 203.200.157.73
hosts_file /etc/hosts
auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/squid_passwd
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
acl all src 0.0.0.0/0.0.0.0
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
acl ncsa_users proxy_auth REQUIRED
acl badsites dstdomain "/usr/local/etc/restricted-sites.conf"
acl baddomwords dstdom_regex -i "/usr/local/etc/bd"
acl badurl urlpath_regex -i .avi .mp3 .wmv .mpeg .mp4 .mpg mail video
acl gblock url_regex -i ig translate_t picasa sms. Servicelogin naukri monster
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow ncsa_users !baddomwords !badsites !badurl !gblock
http_access deny all
http_reply_access allow all
icp_access allow all
log_access allow ncsa_users
visible_hostname mxchn-out
coredump_dir /var/spool/squid
 
Old 02-24-2009, 02:59 AM   #5
nandagopalrvarma
Also, what do notations like /, ^ and $ mean when used in an ACL, and how do I use them correctly? Are they wildcards?
 
Old 02-24-2009, 04:17 AM   #6
chitambira
Quote:
^http://
The regular expression matches any URL that begins with http://.


Quote:
\.jpg$
The regular expression matches any URL that ends in .jpg. The \ is added because "." is itself a regex wildcard (it matches any single character).

Squid regex matching is case-sensitive by default. To make it case-insensitive, use the -i option, e.g.:

Quote:
acl net url_regex -i ^http://www
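
For example, to block media files by extension (a small sketch along the same lines; the extension list is only illustrative), escape the dots and anchor the end of the URL path with $:

# matches URLs whose path ends in .mp3, .avi or .wmv, case-insensitively
acl mediafiles urlpath_regex -i \.mp3$ \.avi$ \.wmv$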
 
Old 03-01-2009, 11:06 PM   #7
nandagopalrvarma
Thanks, Chitambira.

I have a query. iGoogle's URL is http://www.google.co.in/ig?hl=en&source=iglk

Will acl badurl url_regex -i ^http://www.google.co.in/ig?hl=en&source=iglk do the job, or can the same be done with dstdomain?
 
Old 03-01-2009, 11:59 PM   #8
linuxlover.chaitanya
dstdomain should also work. Use the "^" sign when you need to match a string that starts with a particular expression. So if you know which domain you need to ban, you can use the dstdomain option as well, though that would mean an extra ACL if you do not already have one.
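
For instance (an untested sketch, with the dots escaped as discussed above), a url_regex ACL aimed specifically at the iGoogle path could look like this:

# the (\?|$) keeps it from also matching other paths that merely start with /ig
acl igoogle url_regex -i ^http://www\.google\.co\.in/ig(\?|$)

Note that dstdomain matches only the destination hostname, not the path, so singling out /ig still needs a regex-type ACL.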
 
Old 03-02-2009, 11:23 PM   #9
nandagopalrvarma
Well, I had created a file used by a dstdomain ACL check, and in it I could only add a URL like http://www.gmail.com; but if I try to block an entire URL, say http://www.google.com/ig or something, the portion that comes after .com is not taken into consideration.

The problem with using a url_regex filter, in my experience, is that blocking a URL with 'ig' in it blocks everything containing 'ig'.

I've used ISA 2006, and there I just have to give the complete URL I want to block and it blocks it; it also accepts wildcards like mail.*.*, video.*.*, gm*.*.* etc., and it will accept any full-length URL with or without wildcards. Is Squid capable of the same things?

I think Squid also has that functionality, but I have tried it using dstdomain as well as url_regex and urlpath_regex with only limited success.

Thanks for your support.
 
Old 03-02-2009, 11:43 PM   #10
linuxlover.chaitanya
Squid can do everything that you are saying ISA can do. You just need to go through the documentation once and search around to get what you need.
As far as wildcards are concerned, you do not always need to specify them.
With dstdomain, a value like .example.com matches that domain and all of its subdomains, so you do not need *.example.*-style wildcards; and if you want to match a word appearing anywhere in the destination hostname, dstdom_regex (which you already use for baddomwords) does that.
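
For example (a quick illustrative sketch with made-up names):

# blocks example.com and every subdomain of it
acl blockeddomains dstdomain .example.com
# blocks any destination host whose name contains the word "video"
acl videohosts dstdom_regex -i video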
 
  

