Squid.conf configuration for http_access not working properly
Hi,
I am trying to configure Squid on RHEL Server 5 to replace our Apache web proxy. What I want to do is this: everybody authenticates through ncsa_auth, but I want different website-blocking levels for different usernames. That is, managers and guests should have unrestricted access, while the rest of the company employees (whose usernames are also in ncsa_auth) should have restricted access, with url_regex, dstdomain and urlpath_regex configured to block certain domains, websites, words and filenames.
My problem is that I created two ACLs, badwords and badsites. The acl badsites points to a file listing domains I need to block, and the acl badwords should block websites based on words.
# And finally deny all other access to this proxy
http_access allow localhost
#http_access allow ncsa_users !badwords
http_access allow ncsa_users !badsites
http_access deny all
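The ACL definitions behind those rules look roughly like this (the ncsa_auth path and the two file paths are placeholders for my actual setup):

```conf
# NCSA basic authentication; path to ncsa_auth and the passwd file are placeholders
auth_param basic program /usr/lib/squid/ncsa_auth /etc/squid/passwd
acl ncsa_users proxy_auth REQUIRED

# One domain per line in the file, e.g. .badsite.example
acl badsites dstdomain "/etc/squid/badsites.txt"
# One regular expression per line, matched against the whole URL
acl badwords url_regex -i "/etc/squid/badwords.txt"
```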
The line http_access allow ncsa_users !badwords does not work; only http_access allow ncsa_users !badsites works. If I uncomment the badwords line, either everything is allowed or everything is denied.
Is something like this OK? http_access allow ncsa_users !badsites !badwords
I also don't know how to configure different whitelists for different groups of people within the same ncsa_auth. Or is it possible to use ncsa_auth along with Windows NT authentication at the same time?
Would really appreciate your help; I'm really struggling with this...
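What I am after is something like this per-group setup, if Squid supports it (the usernames here are placeholders; all of them would also exist in the ncsa_auth passwd file):

```conf
# Placeholder usernames for the privileged groups
acl managers proxy_auth jsmith mkhan
acl guests   proxy_auth guest1 guest2
# Any other successfully authenticated user
acl staff    proxy_auth REQUIRED

# Managers and guests get unrestricted access
http_access allow managers
http_access allow guests
# Everyone else who authenticates is filtered
http_access allow staff !badsites !badwords
http_access deny all
```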
First of all, remove http_access allow localhost.
Note that the config file is read top-down, and once a match is found, processing does NOT continue.
So in this case, as long as you have http_access allow ncsa_users !badsites,
all users who are browsing good sites are allowed.
Now when you uncomment http_access allow ncsa_users !badwords, the badsites rule below it will not be reached, because a match will already have been found here.
So everything will be allowed if your badwords file is OK, or everything denied if it is somehow NOT OK. Make sure the squid user can read that file, and name it badwords.txt instead.
Doing http_access allow ncsa_users !badsites !badwords is allowed.
The best approach with Squid is to put your DENY rules first as much as possible, to avoid confusion
and to keep out your bad guys on the first match, e.g.:
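A deny-first version of the rules above (same ACL names) might look like this:

```conf
# Deny the bad stuff first, so a bad request is stopped on the first match
http_access deny badsites
http_access deny badwords
# Then allow authenticated users, and deny everyone else
http_access allow ncsa_users
http_access deny all
```

You can check the file for syntax errors with squid -k parse before reloading.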
It's almost completely working for me now. Thanks for the info... I still don't know how to use the wildcard option in the url_regex and urlpath_regex ACLs.
My squid.conf is given below. I am trying to use the ACL features properly. I am trying to block iGoogle services without blocking other Google services. I did that by including the word 'ig' in a url_regex query, but now all URLs with 'ig' in them are being blocked (of course they are, I know).
My question now is: is it possible to give the entire URL in a file and use a dstdomain-type ACL,
i.e. 'acl igoogle dstdomain_regex http://www.google.co.in/ig?hl=en&source=iglk'? Will this block iGoogle alone?
dstdomain should also work. You should use the "^" anchor when you need to match a string starting with some expression. So if you know which domain you need to ban, you can use the dstdomain option as well, but that means an extra ACL if you do not already have one.
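For instance, to block only the iGoogle path mentioned above and not the rest of Google, an anchored url_regex along these lines should work (the pattern is a sketch, not tested against every Google URL):

```conf
# ^ anchors the match at the start of the URL; \. escapes the literal dots,
# so this only matches URLs beginning with the iGoogle path
acl igoogle url_regex ^http://www\.google\.co\.in/ig
http_access deny igoogle
```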
Well, I had created a file for a dstdomain ACL check, and in it I could only add a host like www.gmail.com. If I try to block a full URL such as http://www.google.com/ig, the portion after .com is not taken into consideration.
The problem with using a url_regex filter, in my experience, is that blocking a URL with 'ig' in it blocks everything containing 'ig'.
I've used ISA 2006, and in that I just give the complete URL I want to block and it blocks it. It also accepts wildcards like mail.*.*, video.*.*, gm*.*.*, and it will accept any complete URL with or without wildcards. Is Squid capable of the same things?
I think Squid also has that functionality, but I tried it using dstdomain as well as url_regex and urlpath_regex with only limited success.
It can do everything you are saying ISA can do. You just need to go through the documentation once and search around for what you need.
And as far as wildcards are concerned, you do not always need to specify them.
If you specify .example.com in dstdomain (with the leading dot), it matches the domain and all of its subdomains, so you do not need *.example.com. To match a word like video anywhere in the hostname, the way ISA's video.*.* does, use dstdom_regex instead.
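To illustrate the difference (example.com is a placeholder domain):

```conf
# dstdomain with a leading dot: matches video.example.com
# and any subdomain of it, no wildcards needed
acl videodom dstdomain .video.example.com

# dstdom_regex: matches 'video' anywhere in the hostname,
# similar to ISA's video.*.* wildcard
acl videore dstdom_regex -i video

http_access deny videodom
http_access deny videore
```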