LinuxQuestions.org
Old 09-10-2009, 08:24 PM   #1
hkothari
Member
 
Registered: Jul 2009
Location: Lowell, Massachusetts
Distribution: Slackware
Posts: 70

Rep: Reputation: 17
Squid Blocking Sites


So, I'm running Linux on my laptop, and I decided I could be a little more productive by blocking some of my biggest time-wasting sites. After looking around, Squid seemed like the best option. I added the domains of the sites I want to block to /etc/squid/squid-block.acl and added the following to my configuration file, but no matter what, nothing happens.
Code:
#  
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS

acl bad dstdomain "/etc/squid/squid-block.acl"
http_access deny bad
I'm just trying to block stuff locally. Does anyone know how to get this working?

P.S. I also tried url_regex instead of dstdomain.
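(For reference, here's roughly how those lines need to sit relative to the stock rules in squid.conf. Squid evaluates http_access lines top to bottom and the first match wins, so a deny has to come before any allow that matches the same client. The default rules sketched below vary by Squid version:)

```
# squid.conf (sketch; exact default rules differ between versions)
acl bad dstdomain "/etc/squid/squid-block.acl"

# Order matters: the first matching http_access line wins.
# If "http_access allow localhost" came first, the deny below
# would never be evaluated for local requests.
http_access deny bad
http_access allow localhost
http_access deny all
```

After editing, `squid -k parse` checks the file for syntax errors and `squid -k reconfigure` makes the running Squid re-read it without a full restart.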
 
Old 09-10-2009, 09:51 PM   #2
win32sux
LQ Guru
 
Registered: Jul 2003
Location: Los Angeles
Distribution: Ubuntu
Posts: 9,870

Rep: Reputation: 377
Are you sure there isn't an http_access rule granting access above that one? The one for 127.0.0.1, perhaps? If you could post your entire ACL section (along with a snippet from your squid-block.acl), it might help us spot the problem. I'm assuming you did restart Squid after making the changes.

NOTE: I'm moving this to Server, since it's more of a server software configuration issue than a security one. BTW, I would think that if you had the willpower to refrain from unblocking the productivity-reducing sites, you'd also have the willpower to refrain from visiting them in the first place.
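(One quick way to confirm whether Squid is actually denying a request is to watch access.log for TCP_DENIED entries. A sketch, with the log path assumed to be the usual /var/log/squid/access.log; the sample line below just mimics the native log format for illustration:)

```shell
# A denied request shows up in access.log as TCP_DENIED/403.
# Sample line in Squid's native log format (fields abbreviated):
line='1252630000.123    0 127.0.0.1 TCP_DENIED/403 1400 GET http://facebook.com/ - NONE/- text/html'

# Filter for denials; on a live box you'd run:
#   tail -f /var/log/squid/access.log | grep TCP_DENIED
echo "$line" | grep -o 'TCP_DENIED/403'
```

If requests to the blocked domains never even appear in the log, the browser probably isn't going through the proxy at all.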

Last edited by win32sux; 09-10-2009 at 11:11 PM.
 
Old 09-11-2009, 05:22 AM   #3
hkothari
Member
 
Registered: Jul 2009
Location: Lowell, Massachusetts
Distribution: Slackware
Posts: 70

Original Poster
Rep: Reputation: 17
That is true, but I think even a small measure like this is more of a reminder to focus. When I'm at school, I've noticed I'm quite productive when I need to be, and I realized it's because I'm more isolated and away from time-wasting sites. Even though it wouldn't be too hard to get around the filter, it's still a helpful way to keep me focused, I guess.

Anyways, here's my acl section:
Code:
#Recommended minimum configuration:
acl manager proto cache_object
acl localhost src 127.0.0.1/32
acl to_localhost dst 127.0.0.0/8
#
# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
acl localnet src 10.0.0.0/8     # RFC1918 possible internal network
acl localnet src 172.16.0.0/12  # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
#
acl SSL_ports port 443
acl Safe_ports port 80          # http
acl Safe_ports port 21          # ftp
acl Safe_ports port 443         # https
acl Safe_ports port 70          # gopher
acl Safe_ports port 210         # wais
acl Safe_ports port 1025-65535  # unregistered ports
acl Safe_ports port 280         # http-mgmt
acl Safe_ports port 488         # gss-http
acl Safe_ports port 591         # filemaker
acl Safe_ports port 777         # multiling http
acl CONNECT method CONNECT
And squid-block.acl contains sites in this format:
Code:
facebook.com
youtube.com
Just like that.
 
Old 09-11-2009, 06:37 AM   #4
win32sux
LQ Guru
 
Registered: Jul 2003
Location: Los Angeles
Distribution: Ubuntu
Posts: 9,870

Rep: Reputation: 377
Well, without being able to look at the http_access lines, there's no telling what's happening. Also, I don't see your actual site-blocking ACL in there. In any case, you should keep in mind that if you don't put a dot before each of those domains, Squid will only block them when no subdomain is specified. With what you've posted (assuming there isn't a problem with the order of your http_access lines), http://facebook.com would be blocked, while http://www.facebook.com would be allowed.
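(To illustrate the matching rule, here's a rough sketch in Python of how a single dstdomain entry behaves. This is an approximation for illustration, not Squid's actual code: an entry without a leading dot matches only that exact hostname, while a leading dot matches the domain itself and every subdomain of it.)

```python
def dstdomain_matches(entry: str, host: str) -> bool:
    """Approximate the matching behavior of one dstdomain ACL entry."""
    if entry.startswith("."):
        # ".facebook.com" matches facebook.com and any subdomain of it
        return host == entry[1:] or host.endswith(entry)
    # "facebook.com" matches only the exact hostname
    return host == entry

# Without the leading dot, www.facebook.com slips through:
print(dstdomain_matches("facebook.com", "facebook.com"))       # True
print(dstdomain_matches("facebook.com", "www.facebook.com"))   # False
print(dstdomain_matches(".facebook.com", "www.facebook.com"))  # True
```

So the fix is to list the entries in squid-block.acl as .facebook.com and .youtube.com.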
 