LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Linux - Networking (https://www.linuxquestions.org/questions/linux-networking-3/)
-   -   Why can't I use /etc/hosts.deny to block a website? (https://www.linuxquestions.org/questions/linux-networking-3/why-cant-i-use-etc-hosts-deny-to-block-a-website-661873/)

CJS 08-10-2008 08:42 PM

Why can't I use /etc/hosts.deny to block a website?
 
I would like to block all programs/protocols within Ubuntu from accessing certain websites. Is this possible with the /etc/hosts.deny file?

As a test I put the following in my /etc/hosts.deny file:
Code:

ALL: .xkcd.com
ALL: www.xkcd.com, .xkcd.com
ALL: 208.122.19.56

I know the above is redundant, but it doesn't seem to work for me anyway: I saved the changes and could still access both www.xkcd.com and 208.122.19.56 (the same website) with Firefox. I hit Ctrl-Reload in Firefox to make sure it wasn't just pulling up a cached copy of the page. Also, if I "ping www.xkcd.com" or "ping 208.122.19.56", the site responds fine, so it isn't a Firefox caching problem. Just to make sure, I rebooted, and I could still access www.xkcd.com and 208.122.19.56.

So is my syntax incorrect, or what am I missing? Thanks for any help.

billymayday 08-10-2008 09:03 PM

hosts.deny blocks remote hosts from accessing services on your computer, but that's not what you are trying to do here: you are the one accessing their server. I suspect you need iptables or similar to achieve your goal, i.e. drop the packet when the destination is 208...

There are probably better ways to do this though. Squid maybe?

salasi 08-11-2008 01:25 AM

Quote:

I know the above is redundant, but it doesn't seem to work for me anyway: I saved the changes, and I could still access both www.xkcd.com and 208.122.19.56 (same website)...
So, what are you going to do when the association between www.xkcd.com and an IP address changes?

Quote:

Originally Posted by billymayday (Post 3243061)
There are probably better ways to do this though. Squid maybe?

Squid could be a good way to do this if all of the traffic goes through Squid; from the original question, it is possible that this is not the case.

billymayday 08-11-2008 01:41 AM

No, but he could set it up pretty easily.

immortaltechnique 08-11-2008 01:51 AM

I would go with iptables, where you explicitly block the unwanted destination addresses. Still, I think Squid could also come in handy, though I don't know if that's redundant. Anyone?

jschiwal 08-11-2008 01:58 AM

If there are only a couple websites to block, the easiest method is to add these sites to /etc/hosts with an IP address of 127.0.0.1.
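For example, the /etc/hosts approach can look like this (entries adapted from the addresses in this thread). Note that it only affects name lookups made through the system resolver, so connecting straight to the IP address still works:

```
# Hypothetical /etc/hosts entries: both names now resolve to the
# loopback address, so the real site is never contacted by name.
127.0.0.1   xkcd.com
127.0.0.1   www.xkcd.com
```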

You can also add an iptables rule to block packets destined for such a site. Bear in mind that major web sites will often have 3 or 4 IP addresses served up in round-robin fashion; this is an easy form of load balancing.

junpa 08-11-2008 02:36 AM

CJS,

Quote:

Originally Posted by CJS (Post 3243050)
I would like to block all programs/protocols within Ubuntu from accessing certain websites. Is this possible with the /etc/hosts.deny file?

No, you cannot; hosts.deny controls incoming connections to services on your own machine, not outgoing ones.

Quote:

So is my syntax incorrect, or what am I missing? Thanks for any help.
You are not missing anything; your syntax is correct.


The most common way to do what you want is to use the hosts file (/etc/hosts).
MVPS is a great resource explaining how to do it and the caveats. It's geared towards Windows users, but the content is still applicable.

Alternatively, as billymayday, immortaltechnique, and jschiwal have stated, you can use the iptables OUTPUT chain to block access to undesirable websites.

ex.

Code:

sudo /sbin/iptables -A OUTPUT -d www.xkcd.com -j REJECT
sudo /sbin/iptables -A OUTPUT -d 208.122.19.56 -j REJECT

you can use dig to figure out if a domain has multiple A records.

Code:

junpa@quazi:(~)$ dig +noall +ans www.xkcd.com
If for some reason you do not have dig installed, you can use the online version:

http://networking.ringofsaturn.com/Tools/dig.php

CJS 08-12-2008 07:59 AM

Thanks for the responses, everyone. I see now that /etc/hosts.deny is for incoming connection requests, not outgoing ones such as when I access a website through Firefox. And thanks for the detailed examples of how to use iptables to block an IP address, junpa.
Code:

sudo /sbin/iptables -A OUTPUT -d www.xkcd.com -j REJECT
That got my hopes up, junpa, that maybe I could block a website by name and not just by IP address with iptables; but as I'm sure you know, I found out that iptables does a once-only DNS lookup of www.xkcd.com when the rule is added, and then applies the rule to whichever IP addresses that lookup returned. So in the end it is the same as manually writing an iptables rule for each IP address associated with www.xkcd.com at that moment.
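The once-only lookup can be worked around by resolving the host's current A records yourself and emitting one rule per address. A minimal sketch, assuming dig is installed (the helper name is made up, and it only prints the commands so you can review them before piping the output to `sudo sh`):

```shell
#!/bin/sh
# block_ips: print one iptables REJECT rule per address given.
# In practice the addresses would come from dig, e.g.
#   block_ips $(dig +short A www.xkcd.com)
block_ips() {
    for ip in "$@"; do
        echo "/sbin/iptables -A OUTPUT -d $ip -j REJECT"
    done
}

block_ips 208.122.19.56   # prints: /sbin/iptables -A OUTPUT -d 208.122.19.56 -j REJECT
```

Of course, this still only covers the addresses returned at the moment you run it; if the DNS round-robin changes later, the rules have to be regenerated.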

I think in my case I'll just go with sticking the websites in /etc/hosts for now. Thanks again for the help everyone. :)

S-i-A 08-29-2008 11:39 AM

That worked fine, but after restarting the PC the block was lost; iptables rules added from the command line do not persist across reboots unless you save and restore them.
Squid
squid-cache.org
is the best way, or a free DNS filtering service.


All times are GMT -5. The time now is 03:41 PM.