LinuxQuestions.org
Old 01-21-2007, 04:52 PM   #1
General
Member
 
Registered: Aug 2005
Distribution: Debian 7
Posts: 526

Rep: Reputation: 31
/etc/hosts doesn't always block sites


I tested /etc/hosts by adding the lines:

127.0.0.1 slashdot.org
127.0.0.1 youtube.com

It successfully blocked the first address, but not the second.

I tried other variations, using the IP address instead and adding www. to the front, but it's not working. I tried many other addresses and had the same problem. How do I get this to work?
 
Old 01-21-2007, 05:14 PM   #2
MensaWater
LQ Guru
 
Registered: May 2005
Location: Atlanta Georgia USA
Distribution: Redhat (RHEL), CentOS, Fedora, CoreOS, Debian, FreeBSD, HP-UX, Solaris, SCO
Posts: 7,831
Blog Entries: 15

Rep: Reputation: 1669
/etc/hosts is NOT used to BLOCK hosts!

/etc/hosts is used to equate the NAME of a host with its IP address.

Adding these names to 127.0.0.1 would have some effect at preventing access, not because it blocks the hosts but because it makes it seem like the host has an IP it doesn't really have. Any attempt to go to the host (www.google.com for example) would actually go to localhost. If you had a web server running on localhost, it would open that local page rather than the remote host's.

Also, you can't have multiple lines for a given IP address. The hosts file is read from top to bottom and only the first hit is used.

To assign multiple names you'd have to do something like:
127.0.0.1 localhost slashdot.org youtube.com

As you might imagine, there is a limit to how many host names you could put on one line like that. You really want the "localhost" entry there as well, since that is what the address really is.

However, as mentioned above, this is not the purpose of /etc/hosts. You should look for other ways to block hosts you don't want accessed; iptables comes to mind. With iptables, rather than blocking specific hosts, you can just allow the hosts you do want, which may be a smaller list.
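For example, a rough whitelist sketch might look like this (untested, and 192.0.2.10 is just a placeholder for a host you actually want to reach):
Code:
# Allow outbound web traffic only to approved destinations, drop the rest.
iptables -A OUTPUT -p tcp --dport 80 -d 192.0.2.10 -j ACCEPT
iptables -A OUTPUT -p tcp --dport 80 -j DROP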
 
Old 01-21-2007, 06:54 PM   #3
Micro420
Senior Member
 
Registered: Aug 2003
Location: Berkeley, CA
Distribution: Mac OS X Leopard 10.6.2, Windows 2003 Server/Vista/7/XP/2000/NT/98, Ubuntux64, CentOS4.8/5.4
Posts: 2,986

Rep: Reputation: 45
Heh, I suppose you could use iptables rules to block everything outbound and only specify the ones you want to let out.

iptables -A OUTPUT -p tcp -j DROP
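Just keep in mind the ACCEPT rules have to sit above that blanket DROP, since iptables stops at the first matching rule - something like this (untested, placeholder address):
Code:
# -I inserts at the top of the chain, so this ACCEPT is checked before the DROP above
iptables -I OUTPUT -p tcp -d 192.0.2.20 --dport 80 -j ACCEPT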

Or just use a proxy server for the websites you want to block.

Last edited by Micro420; 01-21-2007 at 06:55 PM.
 
Old 01-21-2007, 06:56 PM   #4
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
In addition to what Jlightner wrote: if whitelisting ain't gonna work and it's just WWW traffic, you could use Privoxy or another proxy to do the blocking. Adding this to the user.action file:
Code:
{+block}
.doubleclick.net
would block all doubleclick.net HTTP hosts. You have to make sure all web traffic is routed through the proxy first and that no other exits exist, like tunnelling web traffic somewhere else.

If you still want to block at the DNS level, then in pdnsd (a caching nameserver) you can block complete domains like this:
Code:
neg {
        name=doubleclick.net;
        types=domain;
}
In this case you have to make sure people can't use *proxies* to route traffic through.


Quote:
Also, you can't have multiple lines for a given IP address.
Apparently by setting RESOLV_MULTI or doing "echo multi on >> /etc/host.conf" you can.
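A rough sketch of that layout (untested, one blocked name per line):
Code:
# /etc/host.conf - "multi on" makes the resolver return all addresses
# listed for a host in /etc/hosts instead of only the first match
multi on

# /etc/hosts - point each unwanted name at the loopback address
127.0.0.1   localhost
127.0.0.1   slashdot.org
127.0.0.1   youtube.com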
 
Old 01-21-2007, 07:04 PM   #5
MensaWater
LQ Guru
 
Registered: May 2005
Location: Atlanta Georgia USA
Distribution: Redhat (RHEL), CentOS, Fedora, CoreOS, Debian, FreeBSD, HP-UX, Solaris, SCO
Posts: 7,831
Blog Entries: 15

Rep: Reputation: 1669
Quote:
Apparently by setting RESOLV_MULTI or doing "echo multi on >> /etc/host.conf" you can.
That's a new one on me. Thanks for the info. Of course I still wouldn't use /etc/hosts for blocking but nice to know you could have multiple lines like that. I wonder what kind of lag having dozens of lines for a single IP might cause.
 
Old 01-21-2007, 07:13 PM   #6
jschiwal
LQ Guru
 
Registered: Aug 2001
Location: Fargo, ND
Distribution: SuSE AMD64
Posts: 15,733

Rep: Reputation: 682
Quote:
Originally Posted by jlightner
/etc/hosts is NOT used to BLOCK hosts!

/etc/hosts is used to equate the NAME of a host with its IP address.

Adding these names to 127.0.0.1 would have some effect at preventing access, not because it blocks the hosts but because it makes it seem like the host has an IP it doesn't really have. Any attempt to go to the host (www.google.com for example) would actually go to localhost. If you had a web server running on localhost, it would open that local page rather than the remote host's.
Using /etc/hosts this way will in effect prevent resolving the name. It is an easy, lightweight way to prevent access, and is commonly used. I don't think the original poster deserves a scolding on such a technicality.

Quote:
To assign multiple names you'd have to do something like:
127.0.0.1 localhost slashdot.org youtube.com
I tested it out using multiple lines and it worked.

You can also use addresses like
127.0.0.10 www.youtube.com youtube.com
127.0.0.11 www.slashdot.org slashdot.org

Try that, and you will find that you can enter "ping www.youtube.com" and ping yourself.

Last edited by jschiwal; 01-21-2007 at 07:18 PM.
 
Old 01-21-2007, 07:19 PM   #7
ralvez
Member
 
Registered: Oct 2003
Location: Canada
Distribution: ArchLinux && Slackware 10.1
Posts: 298

Rep: Reputation: 30
To disallow hosts you need to use /etc/hosts.deny
 
Old 01-21-2007, 08:20 PM   #8
jschiwal
LQ Guru
 
Registered: Aug 2001
Location: Fargo, ND
Distribution: SuSE AMD64
Posts: 15,733

Rep: Reputation: 682
hosts.deny is for preventing a different host from connecting to your computer; it won't stop browsing to a site. If the OP wants a kid-proof solution, then only allowing access through a proxy would be effective enough. Someone could otherwise edit /etc/hosts or boot up with a live distro to defeat restrictions on the same host. With a proxy in place, the only traffic possible is through the Squid/DansGuardian proxy or maybe an appliance such as iBoss. ISPs often provide a filtering service for around $1/month; however, whether it is user-configurable or just blocks porn sites depends on the ISP.

Last edited by jschiwal; 01-21-2007 at 08:31 PM.
 
Old 01-21-2007, 10:06 PM   #9
MensaWater
LQ Guru
 
Registered: May 2005
Location: Atlanta Georgia USA
Distribution: Redhat (RHEL), CentOS, Fedora, CoreOS, Debian, FreeBSD, HP-UX, Solaris, SCO
Posts: 7,831
Blog Entries: 15

Rep: Reputation: 1669
Quote:
Originally Posted by jschiwal
Using /etc/hosts this way will in effect prevent resolving the name. It is an easy, lightweight way to prevent access, and is commonly used. I don't think the original poster deserves a scolding on such a technicality.


I tested it out using multiple lines and it worked.

You can also use addresses like
127.0.0.10 www.youtube.com youtube.com
127.0.0.11 www.slashdot.org slashdot.org

Try that, and you will find that you can enter "ping www.youtube.com" and ping yourself.

I indicated it could be done and even told him how to do it.

I also later indicated that I was unaware of the setting that allows for multiple lines. Your post therefore added nothing to that.

It's not the way I learned it in UNIX and I'm not sure it would work in any of the UNIX variants I use. My intent wasn't to "scold" but to highlight the fact that it doesn't really BLOCK anything - it redirects to localhost. That might have unusual effects if one had port 80 or 8080 in use for some web service on the local host.

As to its commonality for such blocking use - I've been doing UNIX/Linux full time as an admin since 1991 and this is the first time I've seen it. Perhaps it's a difference between the way professionals do it and the way home users do it.
 
Old 01-22-2007, 05:21 AM   #10
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
Quote:
I wonder what kind of lag having dozens of lines for a single IP might cause.
You have the power of GNU/Linux, no need to wonder, just test it I'd say...

Anyway. Since we apparently landed in detail country: one other thing I forgot to add is that a consequence of using /etc/hosts for resolving purposes this way is that you may have to check the lookup order in /etc/host.conf and /etc/nsswitch.conf, since the entries only take effect if the "files" database is queried before DNS. (Also see: "null routing".) Maybe the idea for (ab)using /etc/hosts stems from the other O.S., where online docs suggest it and anti-malware applications do use the equivalent file, since that system does not provide any other easy and generic option.
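For reference, the relevant line in /etc/nsswitch.conf would look something like this (exact defaults vary per distribution):
Code:
# "files" before "dns" means /etc/hosts is consulted before any DNS query
hosts:      files dns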
 
Old 01-22-2007, 09:44 AM   #11
MensaWater
LQ Guru
 
Registered: May 2005
Location: Atlanta Georgia USA
Distribution: Redhat (RHEL), CentOS, Fedora, CoreOS, Debian, FreeBSD, HP-UX, Solaris, SCO
Posts: 7,831
Blog Entries: 15

Rep: Reputation: 1669
Well, that's only if I wanted to test it - I was just musing.

Another thing I just realized was missed in the above discussion: this would redirect names typed into the browser, but if the user knew the correct IP they could bypass the redirect by simply typing the IP into the browser.

Since nslookup/dig/host don't actually interrogate /etc/hosts in Linux, the user could use any of those commands to determine the real IP.
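For example, any of these query the configured DNS server directly and report the real address, regardless of what /etc/hosts says (exact behaviour varies by version):
Code:
dig +short youtube.com
host youtube.com
nslookup youtube.com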
 
Old 01-22-2007, 10:19 AM   #12
anomie
Senior Member
 
Registered: Nov 2004
Location: Texas
Distribution: RHEL, Scientific Linux, Debian, Fedora
Posts: 3,935
Blog Entries: 5

Rep: Reputation: Disabled
This discussion pops up occasionally on various forums; here's my 5 cents:

Using /etc/hosts to block a couple hosts (in the manner described here) is probably fine for a personal desktop. This will quickly get unwieldy as the list of hosts grows, though.

As mentioned, iptables can be used to drop outbound connections to those hosts (with the same caveat as the above).

IMO if you're getting into the "regulating outbound HTTP traffic" business, you're going to need to start using squid.
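A minimal squid.conf sketch of that blacklist approach might look like this (the domain list is just an example, and the rest of your http_access policy is assumed to already be in place):
Code:
# Deny a short list of destination domains; the remaining http_access rules apply as usual.
acl blocked_sites dstdomain .slashdot.org .youtube.com
http_access deny blocked_sites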
 
Old 01-22-2007, 10:43 AM   #13
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
Quote:
Since nslookup/dig/host don't actually interrogate /etc/hosts in Linux, the user could use any of those commands to determine the real IP.
"strace -eopen" shows dig, host and nslookup do honour /etc/nsswitch.conf and /etc/resolv.conf. I already pointed to "hosts" lookup order change in nsswitch.conf.


Quote:
This would redirect names typed into the browser, but if the user knew the correct IP they could bypass the redirect by simply typing the IP into the browser.
You're right. It shows why any form of whitelisting is the "easiest way out". Not that it's any consolation, but some hosts do not react well to HTTP access by IP where an FQDN is expected...
 
Old 01-22-2007, 12:07 PM   #14
MensaWater
LQ Guru
 
Registered: May 2005
Location: Atlanta Georgia USA
Distribution: Redhat (RHEL), CentOS, Fedora, CoreOS, Debian, FreeBSD, HP-UX, Solaris, SCO
Posts: 7,831
Blog Entries: 15

Rep: Reputation: 1669
I thought, as you did, that these commands would look at /etc/hosts, mainly because nslookup does on HP-UX. However, after having others point it out to me, I've proven to myself that it doesn't on Linux, Solaris or SCO; posters have indicated ONLY HP-UX does this. You can do as I did and run nslookup and host against something you know is only in your /etc/hosts file and see what I mean.

It may be you're thinking of the underlying C routines gethostbyaddr, gethostbyname, etc., which DO interrogate /etc/hosts. To me this is a flaw (that is to say, IMO HP-UX does it right), but most of the posters I've traded comments with don't like the fact that HP-UX does it the way it does. They point out that the "ns" in "nslookup" means name server rather than file. Of course, I point out that nslookup is deprecated in favor of host, and host has no "ns" in its name.
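A quick, rough way to compare the two lookup paths on Linux: getent goes through the normal NSS routines (so it honours /etc/hosts), while host queries the DNS server directly.
Code:
getent hosts youtube.com   # uses the gethostby*/NSS path, so /etc/hosts entries show up here
host youtube.com           # queries the DNS server from /etc/resolv.conf directly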
 
Old 01-22-2007, 05:37 PM   #15
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
OK, I did the test, and I see what you mean. Wget does a read on /etc/hosts; dig doesn't, but it does read /etc/resolv.conf (and in my case the DNS server reads /etc/hosts).
 
  

