I want to block a URL with Squid
I'm running Red Hat Enterprise Linux (EL4), and I want to block a URL for my clients. Kindly help me set up an access control list.
Thanks |
You can do this with regular expressions or destination domain lists (and probably other ways as well). A short example would be:
Code:
acl Cooking1 url_regex cooking

See also:
http://wiki.squid-cache.org/SquidFaq
http://www.linuxhomenetworking.com/linux-adv/squid.htm |
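On its own, an acl line only defines a pattern; it blocks nothing until an http_access rule uses it. A minimal sketch pairing the ACL above with a deny rule (the ACL name Cooking1 is reused from the example; localnet is assumed to be defined elsewhere in squid.conf, as in the default configuration):

```
# Define the pattern: any URL containing the text "cooking" (-i = case-insensitive)
acl Cooking1 url_regex -i cooking

# Deny requests matching it; deny rules must come before the general allows
http_access deny Cooking1
http_access allow localnet
http_access deny all
```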
I'd like to block different sites for different users.
For example, I have 3 user lists and 3 files with blocked site names:

acl group1 src 192.168.0.2 192.168.0.3 192.168.0.4/24
acl group2 src 192.168.0.5 192.168.0.6 192.168.0.7/24
acl group3 src 192.168.0.8 192.168.0.9 192.168.0.10/24

and

acl blocksites1 url_regex -i "/etc/squid/block1.txt"
acl blocksites2 url_regex -i "/etc/squid/block2.txt"
acl blocksites3 url_regex -i "/etc/squid/block3.txt"

I'd like to block blocksites1 for group1, blocksites2 for group2, and blocksites3 for group3. Help me please |
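A sketch of the http_access rules that would pair each group with its own blocklist, assuming the six ACLs are defined as posted above. (Note also that a trailing /24 on a single src address matches the entire 192.168.0.0/24 subnet, not just that host; single hosts are normally listed without a mask or with /32.)

```
# Deny each group its own blocklist first, then let the groups through
http_access deny group1 blocksites1
http_access deny group2 blocksites2
http_access deny group3 blocksites3

http_access allow group1
http_access allow group2
http_access allow group3
http_access deny all
```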
Quote:
I've got the office, the plant, our security shack, and the union office:

acl office src 192.168.50.0/24
acl plant src 192.168.99.0/24
acl union src 192.168.128.70/32
acl security src 192.168.128.183/32

I can define the sites I want either as a list in the .conf file:

acl union-sites dstdomain www.ohiobwc.com .gov .edu .state.oh.us
acl ok-plant-sites dstdomain .microsoft.com .windowsupdate.com ...

or as a simple text file:

acl porn-sites-1 dstdomain "/etc/squid/porn-sites-1"
acl porn-sites-2 url_regex "/etc/squid/porn-sites-2"

(these are plain text files that look like this:
.allnicegirls.com
.elephantlist.com
.4pigs.com
.consumptionjunction.com
etc.)

Then, I just tell squid what to do:

http_access deny office porn-sites-1
http_access deny plant porn-sites-1
http_access deny office porn-sites-2
http_access deny plant porn-sites-2
# Put denys first...
http_access deny union !union-sites
http_access deny security !ok-plant-sites
http_access allow office
http_access allow plant |
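The external blocklist files referenced above are plain text, one domain per line. A small shell sketch of building such a file and reloading Squid (the path and domains are illustrative; the reload command is shown commented out since it requires a running Squid):

```shell
# Build a dstdomain blocklist file, one entry per line.
# A leading dot makes the entry match all subdomains as well.
BLOCKLIST=./porn-sites-1          # in production: /etc/squid/porn-sites-1

printf '%s\n' .allnicegirls.com .elephantlist.com > "$BLOCKLIST"

cat "$BLOCKLIST"

# After editing, make the running Squid re-read its config and ACL files:
# squid -k reconfigure
```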