Linux - Server
This forum is for the discussion of Linux software used in a server-related context.
I have configured Squid and squidGuard by following this guide: http://www.squidguard.org/Doc/
squidGuard is blocking blacklisted domains just fine, but when it comes to the whitelist I'm not able to open any websites. I'm getting the error below. My proxy server is behind a firewall and all the necessary ports have been opened.
The requested URL could not be retrieved
(110) Connection timed out
The remote host or network may be down. Please try the request again.
...and since you don't tell us what version/distro of Linux or what versions of Squid and squidGuard you're running, show us the *RELEVANT* part of your squid config, or even tell us what website, what do you think we'll be able to tell you?
Chances are, there are some rules out of order in your configuration, but that's just a guess based on the behavior.
Below are the Squid server details. The Squid server is not directly connected to the Internet; it accesses the Internet through another proxy. I want client machines' traffic to flow through my proxy server.
OS version: Oracle Linux 6.5
Squid version: squid-3.1.10-19.el6_4.x86_64
squidGuard version: squidGuard 1.4, Sleepycat Software: Berkeley DB 4.3.29 (September 6, 2005)
Quote:
Below are the Squid server details. The Squid server is not directly connected to the Internet; it accesses the Internet through another proxy. I want client machines' traffic to flow through my proxy server.
OS version: Oracle Linux 6.5
Squid version: squid-3.1.10-19.el6_4.x86_64
squidGuard version: squidGuard 1.4, Sleepycat Software: Berkeley DB 4.3.29 (September 6, 2005)
Since you're using Oracle Linux, are you PAYING FOR IT?? And use CODE tags around config files, please. And as asked, only post the *RELEVANT* pieces of your configs, not the entire files.
Quote:
/etc/squid/squid.conf
Code:
#
# Recommended minimum configuration:
#
acl manager proto cache_object
acl localhost src 127.0.0.1/32 ::1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1
# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
url_rewrite_program /usr/local/bin/squidGuard -c /usr/local/squidGuard/squidGuard.conf
#
# Recommended minimum Access Permission configuration:
#
# Only allow cachemgr access from localhost
http_access allow manager localhost
http_access deny manager
# Deny requests to certain unsafe ports
http_access deny !Safe_ports
# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports
# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
http_access deny to_localhost
#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#
# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
http_access allow localnet
http_access allow localhost
# And finally deny all other access to this proxy
http_access deny all
# Squid normally listens to port 3128
http_port xx.xx.xx.xx:8080
# We recommend you to use at least the following line.
hierarchy_stoplist cgi-bin ?
# Uncomment and adjust the following to add a disk cache directory.
cache_dir ufs /var/spool/squid 20000 16 256
# Leave coredumps in the first cache dir
coredump_dir /var/spool/squid
# Add any of your own refresh_pattern entries above these.
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
squidGuard.conf
Code:
dbhome /usr/local/squidGuard/db/
logdir /usr/local/squidGuard/log
dest white {
domainlist white/domains
urllist white/urls
}
dest black {
domainlist black/domains
urllist black/urls
}
acl {
default {
pass white !black all
redirect http://localhost
}
}
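As a side note, the `pass white !black all` line is evaluated left to right: a whitelist hit passes immediately, a blacklist hit stops the request, and everything else falls through to `all`. Below is a rough Python sketch of that evaluation order; the domain sets are hypothetical stand-ins for the white/domains and black/domains list files, not squidGuard's actual implementation.

```python
# Rough sketch of how squidGuard's "pass white !black all" rule
# evaluates a requested domain, left to right. The white/black sets
# here are hypothetical stand-ins for the list files.

def evaluate(domain, white, black):
    """Return True (pass) or False (blocked/redirected) for a domain."""
    for rule in ("white", "!black", "all"):
        if rule == "white" and domain in white:
            return True   # explicitly whitelisted: pass immediately
        if rule == "!black" and domain in black:
            return False  # blacklisted: stop here, redirect applies
        if rule == "all":
            return True   # everything else falls through and passes
    return False

white = {"example.org"}
black = {"badsite.test"}

print(evaluate("example.org", white, black))   # True: whitelisted
print(evaluate("badsite.test", white, black))  # False: blacklisted
print(evaluate("neutral.test", white, black))  # True: falls through to all
```

With this rule order a whitelisted domain should pass squidGuard before the blacklist is even consulted, so a timeout on whitelisted sites usually points at the upstream connection rather than at the filter itself.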
Your squidGuard config appears to be pulled directly from their website. The obvious question is: did you create the black/white directories and files?
Yes, I created the white/black directories under the db folder. Also, my Squid server does not get Internet access directly from the ISP; it gets it from another proxy server.
Code:
[root@xxxxxx db]# pwd
/usr/local/squidGuard/db
[root@xxxx db]# ll
total 8
drwxrwxrwx 2 squid squid 4096 Jan 28 00:47 black
drwxrwxrwx 2 squid squid 4096 Jan 28 00:46 white
[root@xxxxxx db]#
total 24
-rwxrwxrwx 1 squid squid 34 Jan 27 00:01 domains
-rwxrwxrwx 1 squid squid 8192 Jan 28 00:48 domains.db
-rwxrwxrwx 1 squid squid 77 Jan 28 00:47 urls
-rwxrwxrwx 1 squid squid 8192 Jan 28 00:48 urls.db
Quote:
Hi,
Yes, I created the white/black directories under the db folder.
Code:
[root@xxxxxx db]# pwd
/usr/local/squidGuard/db
[root@xxxx db]# ll
total 8
drwxrwxrwx 2 squid squid 4096 Jan 28 00:47 black
drwxrwxrwx 2 squid squid 4096 Jan 28 00:46 white
[root@xxxxxx db]#
total 24
-rwxrwxrwx 1 squid squid 34 Jan 27 00:01 domains
-rwxrwxrwx 1 squid squid 8192 Jan 28 00:48 domains.db
-rwxrwxrwx 1 squid squid 77 Jan 28 00:47 urls
-rwxrwxrwx 1 squid squid 8192 Jan 28 00:48 urls.db
...and you specified white/domains and white/urls, along with black/domains and black/urls... yet you only have one of each file. Did you create the OTHER one? You did an ll here, which shows ONE domains file and ONE urls file covering BOTH the black and white directories. Therefore... you're missing one.
Quote:
Also, my Squid server does not get Internet access directly from the ISP; it gets it from another proxy server.
And what is the point of that? Have you tried bypassing that other proxy server? And why not put all of this on one server?
Below are the files and folders created under the db directory. I have also attached the error screenshot.
Code:
[root@xxxxx white]# pwd
/usr/local/squidGuard/db/white
[root@xxxxx white]# ll
total 24
-rwxrwxrwx 1 squid squid 69 Jan 27 04:54 domains
-rwxrwxrwx 1 squid squid 8192 Jan 28 00:48 domains.db
-rwxrwxrwx 1 squid squid 63 Jan 28 00:46 urls
-rwxrwxrwx 1 squid squid 8192 Jan 28 00:48 urls.db
[root@xxxxxx white]# cd ../black/
[root@xxxxx black]# pwd
/usr/local/squidGuard/db/black
[root@xxxxx black]# ll
total 24
-rwxrwxrwx 1 squid squid 34 Jan 27 00:01 domains
-rwxrwxrwx 1 squid squid 8192 Jan 28 00:48 domains.db
-rwxrwxrwx 1 squid squid 77 Jan 28 00:47 urls
-rwxrwxrwx 1 squid squid 8192 Jan 28 00:48 urls.db
[root@xxxx black]#
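For reference, the domains and urls files are plain text, one entry per line. The entries below are hypothetical examples, not taken from the poster's setup:

```
# /usr/local/squidGuard/db/white/domains
example.org
wikipedia.org

# /usr/local/squidGuard/db/white/urls
example.net/some/allowed/path
```

After editing the lists, the .db files are typically rebuilt with `squidGuard -C all` (run as the user Squid runs as) and Squid is reloaded, e.g. with `squid -k reconfigure`, so the rewrite helpers pick up the new databases.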
Let me know whether the scenario below will work.
Squid & squidGuard are configured on Linux machine A. Machine A has one NIC, eth0, connected to the private network (10.34.81.5); it is not directly connected to the public network or the ISP.
To get Internet access on machine A, I will set the proxy in the /etc/profile file, e.g. 10.24.89.1:8080.
I have one more Linux machine, B (10.34.81.6), with one NIC, eth0, connected to the same private network. Machine B should get Internet access from the Squid server (machine A) by setting 10.34.81.5 as the proxy IP in /etc/profile. Machine B's traffic should go via machine A. Is it possible to share Internet access from machine A to B?
Quote:
Below are the files and folders created under the db directory. I have also attached the error screenshot.
Code:
[root@xxxxx white]# pwd
/usr/local/squidGuard/db/white
[root@xxxxx white]# ll
total 24
-rwxrwxrwx 1 squid squid 69 Jan 27 04:54 domains
-rwxrwxrwx 1 squid squid 8192 Jan 28 00:48 domains.db
-rwxrwxrwx 1 squid squid 63 Jan 28 00:46 urls
-rwxrwxrwx 1 squid squid 8192 Jan 28 00:48 urls.db
[root@xxxxxx white]# cd ../black/
[root@xxxxx black]# pwd
/usr/local/squidGuard/db/black
[root@xxxxx black]# ll
total 24
-rwxrwxrwx 1 squid squid 34 Jan 27 00:01 domains
-rwxrwxrwx 1 squid squid 8192 Jan 28 00:48 domains.db
-rwxrwxrwx 1 squid squid 77 Jan 28 00:47 urls
-rwxrwxrwx 1 squid squid 8192 Jan 28 00:48 urls.db
[root@xxxx black]#
...which is different from what you posted before, but since the files exist, I'd check the permissions now. 777 seems like it may cause problems.
Quote:
Let me know whether the scenario below will work.
Squid & squidGuard are configured on Linux machine A. Machine A has one NIC, eth0, connected to the private network (10.34.81.5); it is not directly connected to the public network or the ISP. To get Internet access on machine A, I will set the proxy in the /etc/profile file, e.g. 10.24.89.1:8080.
I have one more Linux machine, B (10.34.81.6), with one NIC, eth0, connected to the same private network. Machine B should get Internet access from the Squid server (machine A) by setting 10.34.81.5 as the proxy IP in /etc/profile. Machine B's traffic should go via machine A. Is it possible to share Internet access from machine A to B?
Again, it *MAY* work, but WHY are you doing this? Again, have you tried plugging the Squid server directly into the Internet, and why are you using only one NIC for a proxy server?
This doesn't seem like a very good setup... what are you trying to accomplish?
I didn't try connecting the Squid server directly to the Internet, and it is not possible: we already have a server acting as a proxy server that is connected to the Internet.
That proxy, 10.24.89.1:8080, is used by several different networks. We need to restrict Internet access specifically for the 10.34.81.0/24 network, so I created an internal Squid server.
My Squid server is running on top of ESX, so adding an additional NIC wouldn't be a problem.
I added one more NIC and configured NAT and port forwarding, but still no luck.
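One thing worth checking in this setup: Squid does not read the proxy environment variables set in /etc/profile; those only affect shell programs. If machine A's Squid has to reach the Internet through the upstream proxy at 10.24.89.1:8080, the usual way is to declare that proxy as a parent peer in squid.conf. A sketch, with the addresses taken from this thread and options that may need tuning for Squid 3.1:

```
# squid.conf on machine A: send requests to the upstream proxy
# instead of trying to go direct.
cache_peer 10.24.89.1 parent 8080 0 no-query default
never_direct allow all
```

With `never_direct allow all`, an unreachable parent tends to surface as exactly the kind of "(110) Connection timed out" error seen here, so verifying that machine A can actually reach 10.24.89.1:8080 would be a reasonable first step.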