LinuxQuestions.org
Old 08-10-2008, 08:42 PM   #1
CJS
Member
 
Registered: May 2008
Location: California, USA
Distribution: Ubuntu 8.10
Posts: 247

Rep: Reputation: 49
Why can't I use /etc/hosts.deny to block a website?


I would like to block all programs/protocols within Ubuntu from accessing certain websites. Is this possible with the /etc/hosts.deny file?

As a test I put the following in my /etc/hosts.deny file:
Code:
ALL: .xkcd.com
ALL: www.xkcd.com, .xkcd.com
ALL: 208.122.19.56
I know the above is redundant, but it doesn't seem to work for me anyway: I saved the changes and could still access both www.xkcd.com and 208.122.19.56 (same website) with Firefox. I hit Ctrl-Reload in Firefox to make sure it wasn't just pulling up a cached version of the page. Also, "ping www.xkcd.com" and "ping 208.122.19.56" both get responses, so it's not just a cached page in Firefox. Just to make sure, I rebooted, and I could still access www.xkcd.com and 208.122.19.56.

So is my syntax incorrect, or what am I missing? Thanks for any help.
 
Old 08-10-2008, 09:03 PM   #2
billymayday
LQ Guru
 
Registered: Mar 2006
Location: Sydney, Australia
Distribution: Fedora, CentOS, OpenSuse, Slack, Gentoo, Debian, Arch, PCBSD
Posts: 6,678

Rep: Reputation: 122
hosts.deny blocks those IPs from accessing services on your computer, but that's not what you are trying to block here; you are accessing their server. I suspect you need to use iptables or similar to achieve your goal, i.e. drop the packet when the destination is 208...
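(For reference, a hosts.deny entry is meant for the inbound direction, e.g. keeping a remote machine out of a TCP-wrapped service such as sshd; the address below is just a placeholder.)
Code:
# /etc/hosts.deny -- restricts INCOMING connections to local TCP-wrapped services
sshd: 203.0.113.5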

There are probably better ways to do this though. Squid maybe?
 
Old 08-11-2008, 01:25 AM   #3
salasi
Senior Member
 
Registered: Jul 2007
Location: Directly above centre of the earth, UK
Distribution: SuSE, plus some hopping
Posts: 4,070

Rep: Reputation: 897
Quote:
I know the above is redundant, but it doesn't seem to work for me anyway: I saved the changes, and I could still access both www.xkcd.com and 208.122.19.56 (same website)...
So, what are you going to do when the association between www.xkcd.com and an IP address changes?

Quote:
Originally Posted by billymayday View Post
There are probably better ways to do this though. Squid maybe?
Squid could be a good way to do this if all of the services go through squid; from the original question it is possible that this is not the case.
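If everything did go through Squid, though, a minimal ACL along these lines (assuming an otherwise stock squid.conf) would block the domain:
Code:
# squid.conf -- deny any request whose destination domain matches .xkcd.com
# (place the deny before the existing http_access allow rules)
acl blocked_sites dstdomain .xkcd.com
http_access deny blocked_sites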
 
Old 08-11-2008, 01:41 AM   #4
billymayday
LQ Guru
 
Registered: Mar 2006
Location: Sydney, Australia
Distribution: Fedora, CentOS, OpenSuse, Slack, Gentoo, Debian, Arch, PCBSD
Posts: 6,678

Rep: Reputation: 122
No, but he could set it up pretty easily
 
Old 08-11-2008, 01:51 AM   #5
immortaltechnique
Member
 
Registered: Oct 2006
Location: Kenya
Distribution: Ubuntu, RHEL, OpenBSD
Posts: 287

Rep: Reputation: 32
I would go with iptables, where you can explicitly block the unwanted addresses. Still, I think Squid could also come in handy; not sure whether that's redundant. Anyone?
 
Old 08-11-2008, 01:58 AM   #6
jschiwal
LQ Guru
 
Registered: Aug 2001
Location: Fargo, ND
Distribution: SuSE AMD64
Posts: 15,733

Rep: Reputation: 682
If there are only a couple websites to block, the easiest method is to add these sites to /etc/hosts with an IP address of 127.0.0.1.
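A sketch of what those entries might look like (note that /etc/hosts matches exact hostnames, so the www and bare forms each need a line):
Code:
# /etc/hosts -- send the unwanted hostnames to the loopback address
127.0.0.1   www.xkcd.com
127.0.0.1   xkcd.com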

You can also add an iptables rule to block packets destined for such a site. Bear in mind that major websites will have 3 or 4 IP addresses served up in round-robin fashion; it is an easy form of load balancing.
 
Old 08-11-2008, 02:36 AM   #7
junpa
Member
 
Registered: Aug 2008
Location: Northern Hemisphere
Distribution: Slackware, OpenVMS, fbsd
Posts: 51

Rep: Reputation: 16
CJS,

Quote:
Originally Posted by CJS View Post
I would like to block all programs/protocols within Ubuntu from accessing certain websites. Is this possible with the /etc/hosts.deny file?
No, you cannot.

Quote:
So is my syntax incorrect, or what am I missing? Thanks for any help.
You are not missing anything; your syntax is correct.


The most common way to do what you want is to use the hosts file (/etc/hosts).
MVPS is a great resource explaining how to do it and the caveats. It's geared towards Windows users, but the content is still applicable.

Alternatively, as billymayday, immortaltechnique, and jschiwal have stated, you can use the iptables OUTPUT chain to block access to undesirable websites.

ex.

Code:
sudo /sbin/iptables -A OUTPUT -d www.xkcd.com -j REJECT
sudo /sbin/iptables -A OUTPUT -d 208.122.19.56 -j REJECT
You can use dig to figure out whether a domain has multiple A records.

Code:
junpa@quazi:(~)$ dig +noall +ans www.xkcd.com
If for some reason you do not have dig installed, you can use the online version:

http://networking.ringofsaturn.com/Tools/dig.php
 
Old 08-12-2008, 07:59 AM   #8
CJS
Member
 
Registered: May 2008
Location: California, USA
Distribution: Ubuntu 8.10
Posts: 247

Original Poster
Rep: Reputation: 49
Thanks for the responses, everyone; I see now that /etc/hosts.deny is for incoming connection requests, not outgoing ones such as when I access a website through Firefox. And thanks for the detailed examples of how to use iptables to block an IP address, Junpa.
Code:
sudo /sbin/iptables -A OUTPUT -d www.xkcd.com -j REJECT
That got my hopes up, Junpa, that maybe I could actually block a website by name and not just by IP address with iptables; but as I'm sure you know, I found out that iptables does a once-only DNS lookup of www.xkcd.com when the rule is added and then applies the rule to all of the returned IP addresses. So in the end it is the same as manually writing an iptables rule for each IP address associated with www.xkcd.com.
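One partial workaround, just as a sketch (it only catches the addresses dig returns at that moment, so it still suffers from the DNS-change problem salasi mentioned), is to loop over the resolved addresses:
Code:
# resolve the current A records and add an OUTPUT rule for each one
# (the grep keeps only IPv4 addresses in case the answer includes a CNAME)
for ip in $(dig +short www.xkcd.com | grep '^[0-9]'); do
    sudo /sbin/iptables -A OUTPUT -d "$ip" -j REJECT
done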

I think in my case I'll just go with sticking the websites in /etc/hosts for now. Thanks again for the help everyone.
 
Old 08-29-2008, 11:39 AM   #9
S-i-A
LQ Newbie
 
Registered: Aug 2008
Posts: 1

Rep: Reputation: 0
That worked fine, but after restarting the PC the block was lost.
Squid (squid-cache.org) is also a good way to do this, or you could use a free DNS service.
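If losing the iptables rules at reboot is the issue, one common approach (file paths and boot hooks vary by distro, so treat this as a sketch) is to save and restore them:
Code:
# save the current rules once...
sudo /sbin/iptables-save > /etc/iptables.rules
# ...then restore them at boot, e.g. from /etc/rc.local:
/sbin/iptables-restore < /etc/iptables.rules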
 
  

