
StevenMorrison 09-03-2013 07:20 AM

Squid Proxy
 
Hi Linux Administrators

I am quite new to the Linux environment.
I am struggling to get two users to access specific websites through the proxy; both of them are restricted via allowhosts.txt.
The main problem I am having is that, for whatever reason, the websites I add to allowurls.txt work, but not correctly.

For example, dropbox.com and news24.com do not display correctly in IE 8/9 or Firefox when accessed through the restricted proxy, but with full access they display fine.

I have tried everything from my side, and not even Google points me in a direction to try anymore.

Would really appreciate some help here.

cdhjrt 09-03-2013 11:47 AM

This is probably due to the sites calling out to other sites outside of their own domain. You will need to watch the Squid logs to determine which additional sites need to be opened. This happens on lots of websites, so you will probably need to do this every time you add a new site. Most of the time it's just advertisements, but not always.

For example: dropbox.com may be the front end, but after you log in you may be sent to securesite.com, or some of the icons may come from secdropbox.com. In either case the other domain will need to be added.
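Something like this is handy for spotting the blocked third-party domains (a sketch; the log path varies by distro, e.g. some systems use /var/log/squid3/access.log):
Code:

# Watch live requests and show only the ones Squid refused
tail -f /var/log/squid/access.log | grep TCP_DENIED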

Hope this helps.

geox 09-04-2013 01:59 AM

I used Squid for a number of years and, to be honest, it did not bring me much.
I decided to drop Squid years ago and am glad I did. Most browsers have plenty of cache, so if you are using Squid to save bandwidth it will not really save all that much.

If you want/need access restriction, it is better to set up a firewall that limits outbound traffic to certain hosts. Using iptables for that is also much, much more efficient (read: less CPU) than using Squid.
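A minimal sketch of what I mean, assuming you only want to allow web traffic to a couple of named hosts (note that a hostname given to -d is resolved to its IPs once, when the rule is loaded):
Code:

# www.example.com is a placeholder host
iptables -A OUTPUT -p tcp -d www.example.com --dport 80 -j ACCEPT
iptables -A OUTPUT -p tcp -d www.example.com --dport 443 -j ACCEPT
# Drop all other outbound web traffic
iptables -A OUTPUT -p tcp -m multiport --dports 80,443 -j DROP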

If you really want/have to use Squid, and want a real answer instead :), you should set exceptions for sites that do not work well through Squid. I know I spent most of my Squid administration time adding exceptions :)

SAbhi 09-04-2013 03:59 AM

Well, it is always good to share what you have tried so far, or what is in your configuration file...

That way we can better figure out what is causing your problem.

What are these files allowhosts.txt and allowurls.txt, and where did you use them in your squid config?
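For reference, files like that are usually hooked into squid.conf along these lines (just a sketch; the paths and ACL names here are guesses, not your actual config):
Code:

# Guessed wiring, not the OP's actual config
acl restricted_users src "/etc/squid/allowhosts.txt"
acl allowed_sites dstdomain "/etc/squid/allowurls.txt"
http_access allow restricted_users allowed_sites
http_access deny restricted_users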

geox 09-04-2013 04:44 AM

I used to use a wpad.dat file on my local webserver and dug it out of my archive :)
You can use this to make exceptions for certain sites.
If you set your browsers to "detect proxy settings automatically", they will pick up this file automatically.
Note that you need this file in the root of a webserver on your local network, and it has to be available under two different filenames: proxy.pac and wpad.dat.
A symlink is the easiest way to get that done.
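For example (the document root path here is just a guess, adjust for your webserver):
Code:

cd /var/www/html            # your webserver's document root
ln -s wpad.dat proxy.pac    # one file, reachable under both names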

Cleaned up wpad.dat/proxy.pac:
Code:

function FindProxyForURL(url, host)
{
    // For servers in the local domain, go direct. Add more exceptions as needed
    if ( isPlainHostName(host)
      || dnsDomainIs(host, ".xxx.xx")
      || dnsDomainIs(host, "xxxxxxx.net")
      )
        return "DIRECT";

    // If it's not local, use the cache server.
    return "PROXY myserver.lan:8080";
}
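One more thing worth checking if browsers ignore the file: some of them only accept it when the webserver serves it with the application/x-ns-proxy-autoconfig MIME type.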


cdhjrt 09-04-2013 08:44 AM

Quote:

Originally Posted by geox (Post 5021372)

If you want/need access restriction, it is better to set up a firewall that limits outbound traffic to certain hosts. Using iptables for that is also much, much more efficient (read: less CPU) than using Squid.

If you really want/have to use Squid, and want a real answer instead :), you should set exceptions for sites that do not work well through Squid. I know I spent most of my Squid administration time adding exceptions :)

I've also spent many years maintaining Squid servers, on both Windows and *nix. I would recommend Squid, as it's a solid and tested product.

Is using a firewall as a web filter really easier than using a proxy? Some sites have many IP addresses; how much time will it take to constantly add IP addresses to your firewall? I know we have at least 20 for our site alone. Think of all the IPs Amazon has: www.amazon.com (1 IP address), fls-na.amazon.com (+1 IP address), all the images are called from ecx-images.amazon.com (+8 IP addresses), and that's just the beginning of the page.

So using a firewall to restrict access would leave the OP with the same problems. The sites in question call out to different sites when they build the web page. The OP will need to get the names/IP addresses of the third-party sites and add them to Squid. This can be done by watching the log files (use tail -f logfilename on Linux).
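Once you spot a blocked domain, add it to the allow list and tell Squid to reload its config, roughly like this (the file path is a guess based on the OP's description):
Code:

# ecx-images.amazon.com is the example domain from above
echo "ecx-images.amazon.com" >> /etc/squid/allowurls.txt
squid -k reconfigure    # reload the config without restarting Squid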

StevenMorrison, you might also try SquidGuard, one of many products with a Squid back end designed as a web filter. A Google search for "Squid web filter" will find others.

geox 09-04-2013 08:58 AM

I agree that Squid is a solid and well-tested product. That is why I used it for five years! I just found I did not really need it anymore.

I had not considered the multiple-IPs-per-hostname/domain case; I agree that would be problematic in a production environment.

