LinuxQuestions.org
Linux - Server This forum is for the discussion of Linux Software used in a server related context.

Old 12-28-2006, 02:54 PM   #1
sholah
Member
 
Registered: Dec 2006
Posts: 34

Rep: Reputation: 15
Preventing a browser from connecting to a site


How do I use Squid to prevent computers on my network from browsing or connecting to certain websites?

Anyone, please!

Thanks
 
Old 12-28-2006, 03:01 PM   #2
timdsmith
Member
 
Registered: Nov 2003
Location: The Colony, TX
Distribution: Slackware, Debian Etch, FreeBSD, MicroSh*t free.
Posts: 209

Rep: Reputation: 30
Create a squid-block.acl file and save it in the same directory as squid.conf.
Then add these two lines to squid.conf:
Code:
acl badURL url_regex -i "/path/to/squid-block.acl"
http_access deny badURL
For each website you want to block, add a line like this to squid-block.acl:

.whatever.com
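Putting the two pieces together, a minimal sketch might look like this (the path and domains below are examples, not from a real install). Note that the deny rule must appear before any http_access allow line in squid.conf:

```
# --- squid.conf: load the blacklist and deny matching URLs ---
# -i makes the regex match case-insensitive
acl badURL url_regex -i "/etc/squid/squid-block.acl"
http_access deny badURL

# --- /etc/squid/squid-block.acl: one pattern per line ---
# A leading dot matches the domain and all of its subdomains
.whatever.com
.example.net
```

After editing, reload Squid with `squid -k reconfigure` so the new ACL takes effect.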

Last edited by timdsmith; 12-28-2006 at 03:05 PM.
 
Old 01-09-2007, 12:52 PM   #3
sholah
Member
 
Registered: Dec 2006
Posts: 34

Original Poster
Rep: Reputation: 15
blocking websites/pages with squid

The method doesn't seem to work. I intend to block this site (http://bigbooster.com/other/extractor.html) with Squid.

Anyone?

Thanks
 
Old 01-09-2007, 02:21 PM   #4
willia01
LQ Newbie
 
Registered: Jan 2007
Posts: 10

Rep: Reputation: 0
Create a text file; denied_sites is as good a name as any.
The file should contain the sites to be blocked without the leading www, e.g. google.com.
Then add a line similar to the following, with the ACL pointing at the file:
acl closedsites url_regex "fullpath/denied_sites"

then add
http_access deny our_networks closedsites

BEFORE the default allow rule, as Squid reads the http_access rules in order.

our_networks would also need to be defined as an ACL, similar to:
acl our_networks src 100.11.9.0/16

Then do a squid -k reconfigure.
It should then work.
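Assembled in order, the steps above might look like this in squid.conf (the network range and file path here are examples, not from the post):

```
# Define who is on the local network (example range)
acl our_networks src 192.168.1.0/24

# Load the blacklist file, one domain per line (e.g. google.com)
acl closedsites url_regex "/etc/squid/denied_sites"

# Deny blacklisted sites for local clients...
http_access deny our_networks closedsites
# ...BEFORE the rule that allows general access
http_access allow our_networks
http_access deny all
```

Running `squid -k reconfigure` then tells the running Squid to reload its configuration without a full restart.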

You can also use a redirector, which would be a lot faster if you are going to be blocking a lot of sites, since Squid loads all these regexes into memory, which slows down startup.
 
Old 02-02-2007, 02:50 PM   #5
sholah
Member
 
Registered: Dec 2006
Posts: 34

Original Poster
Rep: Reputation: 15
Hi willia01,

Can you tell me more about the redirector and how it works?

Thanks.
 
Old 02-03-2007, 03:08 PM   #6
willia01
LQ Newbie
 
Registered: Jan 2007
Posts: 10

Rep: Reputation: 0
Squid was designed with the capability to call external programs for authentication; it also allows you to call a redirector (most are based on squidGuard, some on DansGuardian).
If a redirector is specified, Squid applies all its ACLs, and if the request passes, it is then sent to the redirector.
Redirectors are mostly used to apply an additional set of ACLs, normally to block access to sites that are undesirable from a corporate or educational viewpoint (porn, warez, phishing sites, dating, racism, violence, etc.).
They are normally used to check each request against a set of blacklists; if the request matches a deny rule, the browser is redirected to a page explaining the denial (in our case, a page giving the username, the source hostname, and the reason the site was blocked, as well as the company internet usage policy).
A CGI script normally does this quite well.

If the request matches an allow rule, it is passed through.
At the end there is a default allow or deny rule, which applies if no other rule matches.
It works like Squid itself, i.e. the first matching policy applies.
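The flow described above can be sketched as a small script, assuming the classic Squid redirector protocol: Squid writes one request per line ("URL client_ip/fqdn user method") to the helper's stdin, and a blank reply passes the request through while a URL reply redirects the browser there. The BLOCKED set and BLOCK_PAGE address below are hypothetical examples, not values from the thread.

```python
#!/usr/bin/env python3
"""Minimal Squid redirector sketch (classic redirect_program protocol)."""
import sys
from urllib.parse import urlsplit

BLOCKED = {"bigbooster.com", "whatever.com"}          # example blacklist
BLOCK_PAGE = "http://intranet.example/blocked.html"   # hypothetical deny page

def redirect_line(request_line: str) -> str:
    """Return the redirector's reply for one request line from Squid."""
    url = request_line.split()[0]
    host = urlsplit(url).hostname or ""
    # Match the domain and any subdomain, like a ".whatever.com" ACL entry.
    if any(host == d or host.endswith("." + d) for d in BLOCKED):
        return BLOCK_PAGE
    return ""  # blank reply: let the request through unchanged

def main() -> None:
    # Squid feeds one request per line on stdin and reads replies on stdout;
    # flush so replies are not stuck in the pipe buffer.
    for line in sys.stdin:
        print(redirect_line(line), flush=True)
```

A real deployment would point squid.conf at the script (via the redirector/rewrite program directive for your Squid version) and run several helper instances; this sketch only shows the per-request decision.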

There are free blacklists that you can download (notably squidGuard has one of the better ones), but they are not always updated in a timely fashion. Given the dynamic nature of the internet, they tend to go out of date rather quickly, which can lead to unnecessary virus exposure on your network, sites being allowed erroneously, or harmless sites being blocked.

As this was not acceptable in our situation, we decided to use the ufdbGuard redirector (the software is free, and extremely fast) and buy the blacklist subscription from them. I must say they are extremely good: the lists are updated daily, and 99.9% of the sites are correctly categorized.

However, the free lists are definitely better than no lists, so if your budget does not allow the paid option, the squidGuard list is very good, and you can still use the faster ufdbGuard redirector engine, as it is free.

If you are working for an educational institution, then DansGuardian is worth a look: it incorporates some very nice additional functionality and is free for libraries, schools, and personal use, though not for corporate use (the licence fees are quite reasonable).
 