Originally Posted by geox
If you want/need access restriction it is better to set up a firewall that limits outbound traffic to certain hosts. Using iptables to do that is also much, much more efficient (read: less CPU) than using Squid.
If you really want/have to use Squid, then to give you a real answer: set exceptions for sites that do not work well through Squid. I know I spent most of my Squid administration time on adding exceptions.
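For anyone reading this later: what an "exception" looks like depends on the setup. It can mean a browser-side no-proxy entry, or, inside squid.conf, a dstdomain ACL that lets certain sites through untouched. A rough sketch of the latter; the ACL name and domains here are made up, use whatever actually breaks for you:

    # squid.conf -- let known-problem sites through untouched
    acl broken_sites dstdomain .brokenvideo.example .fussybank.example
    cache deny broken_sites          # never cache these sites
    always_direct allow broken_sites # fetch them directly, bypassing any parent proxy

The always_direct line only matters if you have cache_peer parents configured; otherwise cache deny alone does the job.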
I've also spent many years maintaining Squid servers on both Windows and *nix. I would recommend Squid; it's a solid, well-tested product.
Trying to use a firewall as a web filter is easier than using a proxy? Some sites have many IP addresses; how much time will it take to constantly add IP addresses to your firewall? I know we have at least 20 for our site alone. Think of all the IPs Amazon has: www.amazon.com (1 IP address), fls-na.amazon.com (+1 IP address), and all the images are served from ecx-images.amazon.com (+8 IP addresses). And that's just the beginning of the page.
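To put that in concrete terms, the iptables approach looks something like the following. This is a sketch, not a working ruleset (you would at least also need to allow DNS and port 443), and keep in mind that iptables resolves hostnames once, when the rule is added, so a name with 8 A records turns into 8 frozen rules that go stale whenever Amazon changes its DNS:

    # Default-deny outbound web traffic, then punch holes per host.
    iptables -P OUTPUT DROP
    iptables -A OUTPUT -p tcp --dport 80 -d www.amazon.com -j ACCEPT
    iptables -A OUTPUT -p tcp --dport 80 -d fls-na.amazon.com -j ACCEPT
    iptables -A OUTPUT -p tcp --dport 80 -d ecx-images.amazon.com -j ACCEPT
    # ...and so on for every host the page pulls from, forever.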
So using a firewall to restrict access would leave the OP with the same problems. The sites in question call out to other sites as they build the web page. The OP will need to get the names/IP addresses of the 3rd-party sites and add them to Squid. This can be done by watching the log files (use tail -f logfilename on Linux).
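Something like this, assuming a stock Squid install on Linux with the default log location (adjust the path if yours differs):

    # Watch the access log live while loading the page in a browser;
    # every 3rd-party host the page pulls in shows up as a TCP_DENIED line.
    tail -f /var/log/squid/access.log | grep DENIED

    # Add each missing domain to the allow list in squid.conf, e.g.:
    #   acl allowed_sites dstdomain .amazon.com
    #   http_access allow allowed_sites
    # then tell Squid to reload its config:
    squid -k reconfigure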
StevenMorrison, you might also try squidGuard, one of many products built on a Squid back end and designed as a web filter. A Google search for "Squid web filter" will turn up others.
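The squidGuard wiring is only a couple of lines if you decide to try it. A rough sketch; the paths and the block-page URL are guesses that vary by distro:

    # squid.conf -- hand every request to squidGuard for filtering
    url_rewrite_program /usr/bin/squidGuard -c /etc/squidguard/squidGuard.conf

    # squidGuard.conf -- whitelist-only filtering
    dbhome /var/lib/squidguard/db
    logdir /var/log/squidguard

    dest allowed {
        domainlist allowed/domains    # plain text file, one domain per line
    }

    acl {
        default {
            pass allowed none                       # allow the list, block the rest
            redirect http://localhost/blocked.html  # where blocked users land
        }
    }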