Old 01-23-2007, 03:46 PM   #1
2buck56
Member
 
Squid and https sites


I have an FC4 box that has squid and iptables running to prevent users from accessing any sites other than the ones I want them to access. I drop all requests by default. I then have this rule in place:

acl allowed_sites url_regex "/etc/squid/Allowed_Sites.txt"

In the Allowed_Sites.txt file I have the sites they are allowed to go to. This has been in place and working fine for a couple of years. The company discovered a couple of weeks ago that some of the users have figured out how to beat the restrictions.
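(For reference, a default-deny whitelist of this shape in squid.conf would look something like the sketch below; only the acl line was quoted above, so the http_access lines are an assumption about the rest of the setup.)

# whitelist file referenced above, one URL pattern per line
acl allowed_sites url_regex "/etc/squid/Allowed_Sites.txt"
# allow the whitelisted sites and deny everything else by default
http_access allow allowed_sites
http_access deny all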

As an example, they try to go to http://www.tricityplating.com. Squid pops up the message telling them that the site is restricted. They then change the url to https://www.tricityplating.com and are able to go to the site. This does not work on all sites. However, it works on quite a few sites. Users being as they are, the word has started to leak out and we are seeing more sites being accessed that are not in the Allowed_Sites.txt file.

I need to block port 443 access except for the sites I want them to visit. It would be nice if I could do it with a text file like I do for the http sites. That allows the manager to edit the text file and add https sites as needed.

I tried redirecting port 443 in iptables to port 3128 like I do with port 80. That stopped all https access, including sites that were in the Allowed_Sites.txt file. I still need to give them https access to authorized sites.
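(The redirect in question would be roughly the pair of rules below; eth0 as the LAN-facing interface is an assumption. The reason the 443 redirect kills all HTTPS is that Squid then receives raw TLS on its plain-HTTP port and cannot read a proxy request out of it.)

# works: intercept plain HTTP and hand it to Squid on 3128
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 3128
# breaks all HTTPS: Squid gets raw TLS it cannot parse as an HTTP request
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 443 -j REDIRECT --to-port 3128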

I have started blocking the sites by putting DROP statements in iptables to block the sites as the company discovers them. However, this means they have to catch them going to the sites with a monitoring program (which they have). I would prefer to do it through squid if possible so that only allowed sites are accessible just like I am doing with http sites.
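(Those stopgap rules would look something like the one below; the address is a hypothetical placeholder, and since every newly discovered site needs its own rule, this clearly doesn't scale.)

# drop HTTPS to one discovered site; 203.0.113.10 is a placeholder address
iptables -A FORWARD -p tcp -d 203.0.113.10 --dport 443 -j DROP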

Any help would be appreciated.
 
Old 01-25-2007, 02:11 AM   #2
acid_kewpie
Moderator
 
Unless there's a compelling reason to do so, I'd suggest you stop using transparent proxying, which I assume is what you're doing. The redirect generally sounds fine, but you haven't really said whether you've done any troubleshooting from within Squid in that direction. What do the entries in access.log say about the https connections? Is there a rule higher up in the ACL list that blocks 443 outright?
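(For context: with an explicit, non-transparent proxy, browsers send HTTPS through Squid as CONNECT requests, which show up in access.log and can be filtered like any other request. The stock squid.conf rules along those lines look like this:)

acl SSL_ports port 443
acl CONNECT method CONNECT
# refuse CONNECT tunnels to anything but the standard SSL port
http_access deny CONNECT !SSL_ports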
 
Old 01-25-2007, 02:30 AM   #3
Saravana Pandi
LQ Newbie
 
acl allowed_sites url_regex "/etc/squid/Allowed_Sites.txt"

===> Use: acl allowed_sites src "/etc/squid/Allowed_Sites.txt"
In your Allowed_Sites.txt, put entries like this:

.yahoo.com
.rediff.com

That src tag, in conjunction with the above file, works for domain-level blocking, so any URL under the domain will be blocked (e.g. https://www.yahoo.com).
 
Old 01-26-2007, 08:50 AM   #4
2buck56
Member
 
Original Poster
acid_kewpie: I have used transparent proxying for over 2 years and it has worked fine until the https issue came up. Is there a compelling reason not to use it? When they try to access the tricityplating.com site, this is what the access.log shows:

1169478990.703 3 192.168.1.20 TCP_DENIED/403 1375 GET http://tricityplating.com:443/ - NONE/- text/html

However, when they change the URL to https://tricityplating.com, the access log doesn't show anything about the connection, and they go straight to the site at that point.



Saravana Pandi: If I understand what you are saying, I can do the following:

acl blocked_sites src "/etc/squid/Blocked_Sites.txt"
acl http_access deny blocked_sites

I want to leave the allowed_sites rule in place as there are millions of sites I want to block and only a few sites I want to allow access to. I would actually like to have an allowed_sites rule for https if possible. There are only a few sites that they need https access to. I wonder if I did this:

acl https_sites "/etc/squid/Allowed_https.txt"
acl https_access allow https_sites

would it work? Within the Allowed_https.txt file I would put .wachovia.com as an example.

I guess I can test it and see if it works.
 
Old 01-26-2007, 09:00 AM   #5
acid_kewpie
Moderator
 
Your ACLs are wrong. Or rather, they aren't meant to be ACLs...

acl blocked_sites src "/etc/squid/Blocked_Sites.txt"
acl http_access deny blocked_sites

would be

acl blocked_sites src "/etc/squid/Blocked_Sites.txt"
http_access deny blocked_sites

That would work apart from the obvious fact that a blocked website is not a source IP address, so clearly a src ACL is the wrong choice. Try dstdomain instead.

And the same goes for the second part, noting that it's still http_access, not https_access. I really would suggest reading up on Squid, though. Your ACLs have so many basic mistakes in those four lines that it's clear you don't yet understand them, so go read!
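(Putting those corrections together, a sketch of a working HTTPS whitelist might be the following; it assumes an explicit proxy so that HTTPS reaches Squid as CONNECT requests, and reuses the Allowed_https.txt filename from the previous post.)

acl allowed_https dstdomain "/etc/squid/Allowed_https.txt"
acl CONNECT method CONNECT
# allow CONNECT tunnels only to whitelisted domains, deny all others
http_access allow CONNECT allowed_https
http_access deny CONNECT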

Last edited by acid_kewpie; 01-26-2007 at 09:02 AM.
 
Old 06-13-2007, 11:24 PM   #6
venki
Member
 
Hi all,
Please help me. I have been trying to block https for two days with no success.
I tried the example above as well, still no luck.
What should I put in squid.conf? http blocking works correctly. I am using IPCop.
Please help.
Thanks and regards
 
Old 06-14-2007, 03:06 AM   #7
acid_kewpie
Moderator
 
Please don't drag up old threads. Let them die in peace...
 
  

