Why can't I use /etc/hosts.deny to block a website?
I know the above is redundant, but it doesn't seem to work for me anyway: I saved the changes, yet I could still access both www.xkcd.com and 208.122.19.56 (the same website) in Firefox. I hit Ctrl-Reload in Firefox to make sure it wasn't just showing a cached copy of the page. Also, if I "ping www.xkcd.com" or "ping 208.122.19.56", the site responds fine, so it isn't a Firefox caching problem. Just to be sure, I rebooted, and I could still reach www.xkcd.com and 208.122.19.56.
So is my syntax incorrect, or what am I missing? Thanks for any help.
hosts.deny blocks those IPs from accessing services on your computer, but that's not what you are trying to do here: you are accessing their server. I suspect you need iptables or similar to achieve your goal, i.e. drop the packet when the destination is 208...
There are probably better ways to do this though. Squid maybe?
Quote:
I know the above is redundant, but it doesn't seem to work for me anyway: I saved the changes, and I could still access both www.xkcd.com and 208.122.19.56 (same website)...
So, what are you going to do when the association between www.xkcd.com and an IP address changes?
Quote:
Originally Posted by billymayday
There are probably better ways to do this though. Squid maybe?
Squid could be a good way to do this if all of the traffic goes through Squid; from the original question it is possible that this is not the case.
I would go with iptables, where you can explicitly block the unwanted destination addresses. Still, I think Squid could also come in handy; I don't know if that's redundant. Anyone?
If there are only a couple of websites to block, the easiest method is to add them to /etc/hosts with an IP address of 127.0.0.1.
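For example, the /etc/hosts entries might look like the following (the second line is an assumption, in case the site also answers on the bare domain). Each blocked hostname then resolves to the loopback address, so connections never leave your machine:
Code:

```
127.0.0.1    www.xkcd.com
127.0.0.1    xkcd.com
```

Keep a backup of /etc/hosts before editing, and note that this only covers hostnames: typing the raw IP into the browser still works.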
You can also add an iptables rule to block packets destined for such a site. Bear in mind that major websites often serve 3 or 4 IP addresses in round-robin fashion; this is a simple form of load balancing.
I would like to block all programs/protocols within Ubuntu from accessing certain websites. Is this possible with the /etc/hosts.deny file?
No, you cannot.
Quote:
So is my syntax incorrect, or what am I missing? Thanks for any help.
You are not missing anything; your syntax is correct.
The most common way to do what you want is to use the hosts file (/etc/hosts). MVPS is a great resource explaining how to do it and the caveats. It's geared towards Windows users, but the content is still applicable.
Alternatively, as billymayday, immortaltechnique, and jschiwal have stated, you can use the iptables OUTPUT chain to block access to undesirable websites.
ex.
Code:
sudo /sbin/iptables -A OUTPUT -d www.xkcd.com -j REJECT
sudo /sbin/iptables -A OUTPUT -d 208.122.19.56 -j REJECT
you can use dig to figure out if a domain has multiple A records.
Code:
junpa@quazi:(~)$ dig +noall +ans www.xkcd.com
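Since a domain can have several A records, one way to handle them all is to feed addresses (as `dig +short a www.xkcd.com` would print them) into a loop that emits one REJECT rule per address. This is only a sketch under assumptions: the 192.0.2.x addresses below are placeholders from the documentation range, and the script just prints the iptables commands rather than running them.
Code:

```shell
#!/bin/sh
# Read one IP address per line and print an iptables REJECT rule
# for each. Addresses come from stdin, e.g. the output of
# `dig +short a www.xkcd.com`.
emit_rules() {
    while read -r ip; do
        echo "iptables -A OUTPUT -d $ip -j REJECT"
    done
}

# Placeholder addresses (documentation range), standing in for
# real dig output:
printf '%s\n' 192.0.2.10 192.0.2.11 | emit_rules
```

To actually apply the rules you would pipe the output to a root shell, e.g. `dig +short a www.xkcd.com | emit_rules | sudo sh`. Remember the rules go stale if the site's DNS records change.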
If for some reason you do not have dig installed, you can use the online version.
Thanks for the responses, everyone. I see now that /etc/hosts.deny is for incoming connection requests, not outgoing ones such as when I access a website through Firefox. And thanks for the detailed examples of using iptables to block an IP address, Junpa.
Code:
sudo /sbin/iptables -A OUTPUT -d www.xkcd.com -j REJECT
That got my hopes up, Junpa, that maybe I could actually block a website by name and not just by IP address with iptables. But as I'm sure you know, I found out that iptables does a once-only DNS lookup of www.xkcd.com when the rule is added, and then applies the rule to each IP address that lookup returned. So in the end it is the same as manually writing an iptables rule for every IP address associated with www.xkcd.com.
I think in my case I'll just go with sticking the websites in /etc/hosts for now. Thanks again for the help everyone.