
2buck56 01-23-2007 03:46 PM

Squid and https sites
 
I have an FC4 box that has squid and iptables running to prevent users from accessing any sites other than the ones I want them to access. I drop all requests by default. I then have this rule in place:

acl allowed_sites url_regex "/etc/squid/Allowed_Sites.txt"
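That acl is paired with http_access rules along these lines (quoting from memory, so the exact order may be slightly off):

# only whitelisted sites are allowed, everything else is denied
http_access allow allowed_sites
http_access deny all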

In the Allowed_Sites.txt file I have the sites they are allowed to go to. This has been in place and working fine for a couple of years. The company discovered a couple of weeks ago that some of the users have figured out how to beat the restrictions.

As an example, they try to go to http://www.tricityplating.com. Squid pops up the message telling them that the site is restricted. They then change the url to https://www.tricityplating.com and are able to go to the site. This does not work on all sites. However, it works on quite a few sites. Users being as they are, the word has started to leak out and we are seeing more sites being accessed that are not in the Allowed_Sites.txt file.

I need to block port 443 access except for the sites I want them to visit. It would be nice if I could do it with a text file like I do for the http sites. That allows the manager to edit the text file and add https sites as needed.

I tried redirecting port 443 in iptables to port 3128 like I do with port 80. That stopped all https access, including sites that were in the Allowed_Sites.txt file. I still need to give them https access to authorized sites.
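For what it's worth, the redirect rules look something like this (from memory; eth1 here stands in for the LAN interface):

# existing rule: web traffic from the LAN goes through squid on 3128
iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 -j REDIRECT --to-ports 3128

# what I tried for https - this broke ALL https, even the whitelisted sites
iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 443 -j REDIRECT --to-ports 3128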

I have started blocking sites with DROP rules in iptables as the company discovers them. However, this means they have to catch the users going to the sites with a monitoring program (which they have). I would prefer to do it through Squid if possible, so that only allowed sites are accessible, just like I am doing with the http sites.
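The stopgap rules are just per-site drops, roughly like this (using tricityplating.com as the example; iptables resolves the hostname when the rule is added):

# drop https traffic to a site as it is discovered
iptables -A FORWARD -d www.tricityplating.com -p tcp --dport 443 -j DROP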

Any help would be appreciated.

acid_kewpie 01-25-2007 02:11 AM

unless there's a compelling reason to do so i'd suggest you stop using transparent proxying, which i assume is what you're doing. the redirect generally sounds fine, but you've not really stated that you've done any troubleshooting from within squid in that direction. what do the entries in access.log say about the http connections? is there a rule higher up in the acl list that blocks 443 outright?

Saravana Pandi 01-25-2007 02:30 AM

acl allowed_sites url_regex "/etc/squid/Allowed_Sites.txt".

===> Use acl allowed_sites src "/etc/squid/Allowed_Sites.txt"
In your Allowed_Sites.txt, put entries like this:

.yahoo.com
.rediff.com

That src tag, in conjunction with the above file, works for domain-level blocking, so any URL under that domain will be blocked (e.g. https://www.yahoo.com).

2buck56 01-26-2007 08:50 AM

acid_kewpie: I have used transparent proxying for over 2 years and it has worked fine until the https issue came up. Is there a compelling reason not to use it? When they try to access the tricityplating.com site, this is what the access.log shows:

1169478990.703 3 192.168.1.20 TCP_DENIED/403 1375 GET http://tricityplating.com:443/ - NONE/- text/html

However, when they change the access to https://tricityplating.com, the access log doesn't show anything about the connection. And they go directly to the site at that point.



Saravana Pandi: If I understand what you are saying, I can do the following:

acl blocked_sites src "/etc/squid/Blocked_Sites.txt"
acl http_access deny blocked_sites

I want to leave the allowed_sites rule in place as there are millions of sites I want to block and only a few sites I want to allow access to. I would actually like to have an allowed_sites rule for https if possible. There are only a few sites that they need https access to. I wonder if I did this:

acl https_sites "/etc/squid/Allowed_https.txt"
acl https_access allow https_sites

would it work? Within the Allowed_https.txt file I would put .wachovia.com as an example.

I guess I can test it and see if it works.

acid_kewpie 01-26-2007 09:00 AM

your acls are wrong. or rather, they aren't meant to be acls...

acl blocked_sites src "/etc/squid/Blocked_Sites.txt"
acl http_access deny blocked_sites

would be

acl blocked_sites src "/etc/squid/Blocked_Sites.txt"
http_access deny blocked_sites

that would work apart from the obvious fact that a blocked website is not a source ip address, so clearly an src acl is the wrong choice. try dstdomain instead.

and the same for the second part, noting that it's still http_access, not https_access. i really would suggest actually reading up on squid though. your acls have so many very basic mistakes in those 4 lines that it really illustrates that you don't understand any part of them, so go read!
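something like this is what i mean - untested, and the file names are just lifted from your posts:

# dstdomain file with one domain per line, e.g. .wachovia.com
acl https_sites dstdomain "/etc/squid/Allowed_https.txt"

# SSL_ports and CONNECT are normally already defined in the stock squid.conf:
#   acl SSL_ports port 443
#   acl CONNECT method CONNECT

# https (CONNECT) only to the whitelisted domains
http_access allow CONNECT https_sites SSL_ports
http_access deny CONNECT

# plain http as before
acl allowed_sites url_regex "/etc/squid/Allowed_Sites.txt"
http_access allow allowed_sites
http_access deny all

bear in mind that only helps if the browsers actually send their https requests to the proxy - an https connection that goes straight out on 443 never touches squid at all, which is the other half of what you're seeing.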

venki 06-13-2007 11:24 PM

Hi all,
Please help me. I have been trying to block https for two days with no luck.
I tried the example above as well, still no use.
What do I need to put in squid.conf? http is blocking correctly. I am using IPCOP.
Please help me.
Thanks and regards

acid_kewpie 06-14-2007 03:06 AM

please don't drag up old threads. let them die in peace...

