LinuxQuestions.org
Linux - Security This forum is for all security related questions.
Questions, tips, system compromises, firewalls, etc. are all included here.

Old 11-19-2009, 11:03 PM   #1
Winanjaya
Member
 
Registered: Sep 2003
Posts: 209

Rep: Reputation: 32
How to block HTTPS sites for some users


In Squid, how do I block certain HTTPS sites for some users?

i.e. I want to deny https://www.google.com and https://www.xyz.com
for 192.168.1.6-16.

Please help.

Thanks & regards
 
Old 11-20-2009, 12:51 AM   #2
GlennsPref
Senior Member
 
Registered: Apr 2004
Location: Brisbane, Australia
Distribution: Mageia Studio-13.37 Kubuntu.
Posts: 3,325
Blog Entries: 33

Rep: Reputation: 199
Hi, I also use Squid.

ref. Restricting Access to specific Web sites

Quote:
Squid is also capable of reading files containing lists of web sites and/or domains for use in ACLs. In this example we create two lists in files named /usr/local/etc/allowed-sites.squid and /usr/local/etc/restricted-sites.squid.

# Add this to the bottom of the ACL section of squid.conf
acl home_network src 192.168.1.0/24
acl business_hours time M T W H F 9:00-17:00
acl GoodSites dstdomain "/usr/local/etc/allowed-sites.squid"
acl BadSites dstdomain "/usr/local/etc/restricted-sites.squid"

#
# Add this at the top of the http_access section of squid.conf
#
http_access deny BadSites
http_access allow home_network business_hours GoodSites
The paths to these files need to be sensible, and the files should be kept somewhere outside user-writable space.

If you don't want to restrict the times, leave out that line,
e.g. "acl business_hours time M T W H F 9:00-17:00"

These files may contain something like this (another example from linuxhomenetworking):
Quote:
# File: /usr/local/etc/allowed-sites.squid
www.openfree.org
linuxhomenetworking.com

# File: /usr/local/etc/restricted-sites.squid
www.porn.com
illegal.com
Also see www.linuxhomenetworking.com/wiki and
http://www.visolve.com/squid/squid30/contents.php
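From a shell, those two list files can be set up along these lines (the /tmp/squid-demo path is purely for illustration; the quoted example keeps them under /usr/local/etc):

```shell
# Create the two domain-list files referenced by the dstdomain ACLs.
# /tmp/squid-demo is an illustration path only; in practice use a
# root-owned directory such as /usr/local/etc.
mkdir -p /tmp/squid-demo

cat > /tmp/squid-demo/allowed-sites.squid <<'EOF'
www.openfree.org
linuxhomenetworking.com
EOF

cat > /tmp/squid-demo/restricted-sites.squid <<'EOF'
www.porn.com
illegal.com
EOF

# After pointing squid.conf at the files, check the config and reload:
#   squid -k parse && squid -k reconfigure
```

Note that Squid only re-reads these files on a reconfigure or restart, not when they change on disk.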

For a range of addresses, use this syntax instead:
Quote:
addr1-addr2/netmask

Like this, change...
Code:
acl home_network src 192.168.1.0/24
to...
Code:
acl home_network src 192.168.1.6-192.168.1.16/24
ref. http://www.visolve.com/squid/squid30...ntrols.php#acl

Read these sites for more info.

Hope this helps you, cheers Glenn

Last edited by GlennsPref; 11-20-2009 at 01:07 AM. Reason: spelling and comprehension
 
Old 11-20-2009, 03:27 AM   #3
win32sux
Guru
 
Registered: Jul 2003
Location: Los Angeles
Distribution: Ubuntu
Posts: 9,870

Rep: Reputation: 371
Quote:
Originally Posted by Winanjaya View Post
in squid, how to block some https sites for some users?

ie. I want to deny https://www.google.com and https://www.xyz.com
for 192.168.1.6-16

please help

thanks & regards
Have you searched LQ for previous threads about this issue? I know for a fact that there are several, because I have participated in some. In any case, what you want could be done like this:
Code:
acl clients src 192.168.1.6-192.168.1.16
acl https_sites dstdomain .www.google.com
acl https_sites dstdomain .www.xyz.com
acl CONNECT method CONNECT

http_access deny clients https_sites CONNECT
http_access allow clients
http_access deny all
Keep in mind that you're matching by subdomain. If you want to block HTTPS for the entire domains do:
Code:
acl clients src 192.168.1.6-192.168.1.16
acl https_sites dstdomain .google.com
acl https_sites dstdomain .xyz.com
acl CONNECT method CONNECT

http_access deny clients https_sites CONNECT
http_access allow clients
http_access deny all
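The subdomain point is easy to trip over. A rough Python model of how dstdomain entries match, going by the documented behavior (an illustration only, not Squid's actual code):

```python
def dstdomain_match(acl_entry: str, host: str) -> bool:
    """Rough model of Squid's dstdomain matching (illustration only).

    A leading dot matches the domain itself and any subdomain;
    without the dot, only an exact hostname match counts.
    """
    if acl_entry.startswith("."):
        bare = acl_entry[1:]
        return host == bare or host.endswith(acl_entry)
    return host == acl_entry

# ".www.google.com" covers only that subdomain (and anything under it):
assert dstdomain_match(".www.google.com", "www.google.com")
assert not dstdomain_match(".www.google.com", "mail.google.com")

# ".google.com" covers the whole domain:
assert dstdomain_match(".google.com", "google.com")
assert dstdomain_match(".google.com", "mail.google.com")
```

So the first config blocks only the www hosts, while the second blocks anything under google.com or xyz.com.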

Last edited by win32sux; 11-20-2009 at 03:28 AM.
 
Old 11-20-2009, 04:19 AM   #4
Winanjaya
Member
 
Registered: Sep 2003
Posts: 209

Original Poster
Rep: Reputation: 32
But how do I put the HTTPS site list into a file?
 
Old 11-20-2009, 06:08 AM   #5
win32sux
Guru
 
Registered: Jul 2003
Location: Los Angeles
Distribution: Ubuntu
Posts: 9,870

Rep: Reputation: 371
Quote:
Originally Posted by Winanjaya View Post
but how to put the https site list into file?
Just stick them in a text file and specify it like:
Code:
acl clients src 192.168.1.6-192.168.1.16
acl https_sites dstdomain "/etc/squid/example.txt"
acl CONNECT method CONNECT

http_access deny clients https_sites CONNECT
http_access allow clients
http_access deny all
 
Old 11-30-2009, 10:52 PM   #6
Winanjaya
Member
 
Registered: Sep 2003
Posts: 209

Original Poster
Rep: Reputation: 32
I tried the config below, but sometimes it does not work for some sites (such as https://facebook.com, etc.).

I have /etc/squid/src/freehttps

allowedhttps.domain1.com
allowedhttps.domain2.com

..

in squid.conf

acl clients src 192.168.1.0/24
acl freehttps url_regex -i "/etc/squid/src/freehttps"
acl CONNECT method CONNECT

http_access allow CONNECT freehttps
htto_access allow clients
http_access deny all


I am still able to visit https://www.facebook.com.

What did I miss?
 
Old 11-30-2009, 11:17 PM   #7
win32sux
Guru
 
Registered: Jul 2003
Location: Los Angeles
Distribution: Ubuntu
Posts: 9,870

Rep: Reputation: 371
Quote:
Originally Posted by Winanjaya View Post
I tried the config below, but sometimes it does not work for some sites (such as https://facebook.com, etc.).

I have /etc/squid/src/freehttps

allowedhttps.domain1.com
allowedhttps.domain2.com

..

in squid.conf

acl clients src 192.168.1.0/24
acl freehttps url_regex -i "/etc/squid/src/freehttps"
acl CONNECT method CONNECT

http_access allow CONNECT freehttps
htto_access allow clients
http_access deny all


I am still able to visit https://www.facebook.com.

What did I miss?
Why are you using URL regular expressions? That wasn't suggested anywhere on this thread. Also, the second http_access (which is presumably spelled properly in your actual squid.conf) would give total access to all clients in 192.168.1.0/24, which makes the first http_access pointless.

Last edited by win32sux; 11-30-2009 at 11:21 PM.
 
1 member found this post helpful.
Old 11-30-2009, 11:24 PM   #8
Winanjaya
Member
 
Registered: Sep 2003
Posts: 209

Original Poster
Rep: Reputation: 32
Even after I changed it to

acl clients src 192.168.1.0/24
acl freehttps dstdomain "/etc/squid/src/freehttps"
acl CONNECT method CONNECT

http_access allow CONNECT freehttps
htto_access allow clients
http_access deny all

I am still able to visit https://www.facebook.com.


The file "/etc/squid/src/freehttps" contains:

domain1.com
domain2.com
 
Old 11-30-2009, 11:25 PM   #9
Winanjaya
Member
 
Registered: Sep 2003
Posts: 209

Original Poster
Rep: Reputation: 32
And by the way, I am running a transparent proxy. Any comments?
 
Old 11-30-2009, 11:35 PM   #10
Winanjaya
Member
 
Registered: Sep 2003
Posts: 209

Original Poster
Rep: Reputation: 32
And in /var/log/squid/access.. I found the following:

1954 POST http://204.11.16.115:80/toolbar/activate.php - NONE/- text/html


I don't know how to block such an address. How can I do that?
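(For blocking a destination like that: since it is a plain HTTP POST to a bare IP address, a dstdomain list will never match it. Squid's dst ACL type matches destination IPs instead. A minimal sketch, with a made-up ACL name, and the deny placed before any allow rule that would match the client:)

```
acl toolbar_host dst 204.11.16.115
http_access deny toolbar_host
```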
 
Old 11-30-2009, 11:36 PM   #11
win32sux
Guru
 
Registered: Jul 2003
Location: Los Angeles
Distribution: Ubuntu
Posts: 9,870

Rep: Reputation: 371
Quote:
Originally Posted by Winanjaya View Post
Even after I changed it to

acl clients src 192.168.1.0/24
acl freehttps dstdomain "/etc/squid/src/freehttps"
acl CONNECT method CONNECT

http_access allow CONNECT freehttps
htto_access allow clients
http_access deny all

I am still able to visit https://www.facebook.com.
If I'm understanding correctly what you want, it should look like this instead:
Code:
acl clients src 192.168.1.0/24
acl freehttps dstdomain "/etc/squid/src/freehttps"
acl CONNECT method CONNECT

http_access allow CONNECT freehttps clients
http_access deny CONNECT clients
http_access allow clients
http_access deny all
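The order matters because Squid stops at the first http_access line whose ACLs all match. A toy Python model of that first-match evaluation (ACL names mirror the config above; this is an illustration, not Squid's code):

```python
# Hypothetical rule list mirroring the corrected squid.conf order.
RULES = [
    ("allow", {"CONNECT", "freehttps", "clients"}),
    ("deny",  {"CONNECT", "clients"}),
    ("allow", {"clients"}),
    ("deny",  {"all"}),
]

def http_access(request_acls):
    """First rule whose ACLs are all matched decides; Squid stops there."""
    for action, acls in RULES:
        if acls <= request_acls | {"all"}:
            return action
    # Real Squid defaults to the opposite of the last rule's action
    # when nothing matches; "deny" stands in for that here.
    return "deny"

# HTTPS CONNECT to a whitelisted domain from a LAN client: allowed.
assert http_access({"CONNECT", "freehttps", "clients"}) == "allow"
# HTTPS CONNECT to any other domain: caught by the deny rule.
assert http_access({"CONNECT", "clients"}) == "deny"
# Plain HTTP from a LAN client: allowed.
assert http_access({"clients"}) == "allow"
```

Swapping the first two rules would deny every CONNECT before the whitelist is ever consulted, which is why the allow line must come first.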
Quote:
in "/etc/squid/src/freehttps" contains:

domain1.com
domain2.com
You need a dot before each of those domains, like:
Code:
.domain1.com
.domain2.com

Quote:
Originally Posted by Winanjaya View Post
And by the way, I am running a transparent proxy. Any comments?
It doesn't really matter in this case, as AFAICT your problem is bad ACLs, not interception.

Last edited by win32sux; 11-30-2009 at 11:38 PM.
 
1 member found this post helpful.
  

