Linux - Networking: This forum is for any issue related to networks or networking. Routing, network cards, OSI, etc. Anything is fair game.
I'm trying to block specific websites, preferably by matching a portion of the URL. For example, if blocking acmegrocery.com, www.acmegrocery.com would also be blocked. This implies that simply blocking a single IP using iptables would not work well.
The names to block will be somewhat dynamic, so a list specification would be ideal.
In my case, the LAN has a mix of Linux and Windows machines, and they all use a Linux gateway currently running dnsmasq.
Doing some reading, I've found suggestions ranging from editing /etc/hosts on the gateway machine (might work with YP, but not ideal) through to running squid on the gateway machine.
I have to believe several people have worked on solutions to this problem, and might be able to suggest best practices. Any takers?
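Since the gateway already runs dnsmasq, one option worth trying is dnsmasq's own `address` directive, which matches a domain and every subdomain under it, so one line covers acmegrocery.com and www.acmegrocery.com. A sketch, assuming your dnsmasq reads extra config from /etc/dnsmasq.d/ (the path is an assumption, adjust to your setup):

```shell
# /etc/dnsmasq.d/blocklist.conf
# (requires a conf-dir=/etc/dnsmasq.d line in /etc/dnsmasq.conf)
#
# address=/domain/IP answers queries for the domain AND all of its
# subdomains (www.acmegrocery.com, shop.acmegrocery.com, ...) with IP.
address=/acmegrocery.com/0.0.0.0

# Regenerate this file from your dynamic list as it changes. Note that
# dnsmasq does NOT re-read its config files on SIGHUP, so restart it:
#   /etc/init.d/dnsmasq restart
```

Because the list lives in its own file, a script can rewrite it whenever your set of blocked names changes, which fits the "somewhat dynamic" requirement.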
The problem is that the hosts file can only deal with names. Unless you are running YP/NIS, the other machines, which _could_ use a different nameserver, can still route there. It can also be readily circumvented by putting an IP address in the URL.
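To make that limitation concrete: a hosts file has no wildcard support, so every name must be listed explicitly, and it only affects machines whose resolver consults that particular file. A sketch (the IP in the last comment is just a documentation example):

```shell
# /etc/hosts - no wildcards: each hostname needs its own line, and only
# hosts that use THIS file are affected.
0.0.0.0   acmegrocery.com
0.0.0.0   www.acmegrocery.com    # must be repeated for every subdomain
# ...and a user typing http://203.0.113.10/ into the browser bypasses
# name resolution entirely.
```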
Actually, Untangle is largely free software (some modules you must pay for) intended to run on a dedicated system or as a VM. You don't have to buy anything if you don't want to. The only major change is that it replaces your gateway, or you could set it up as a bridged device so that it's transparent to the network.
I've used Untangle for quite some time and I really like it (just make sure you have at least 1GB RAM installed... preferably 2GB). It has a filter built into it that should do exactly what you want: http://www3.untangle.com/web-filter
They also have a fantastic forum where support staff answers any questions that users cannot.
If you have a spare system or parts laying around it's a great product. I am not affiliated with them in any way other than as a satisfied user.
If you have a LAN with Linux as the gateway, you probably use DHCP, so you can tell clients to use your DNS server, which can be just a small DNS proxy with a list of negatively answered queries. Of course, you also need to block direct DNS queries to outside servers.
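The "disable direct DNS" step can be done on the gateway itself by redirecting all outbound DNS from the LAN into the local resolver, so even a client with a hard-coded nameserver still gets the filtered answers. A sketch; `eth1` as the LAN-side interface is an assumption, substitute your own:

```shell
# Redirect all DNS traffic arriving from the LAN (eth1 is assumed here)
# to the dnsmasq/DNS proxy listening on port 53 of the gateway itself.
iptables -t nat -A PREROUTING -i eth1 -p udp --dport 53 -j REDIRECT --to-ports 53
iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 53 -j REDIRECT --to-ports 53
```

This still doesn't stop a user typing a raw IP into the browser; for that you need a filtering proxy like squid, as suggested below in the thread.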
I would be inclined to use squid and dansguardian.
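Squid on its own can already do list-based domain blocking with a `dstdomain` ACL read from a file; a leading dot makes an entry match the domain and all of its subdomains. A sketch, with an assumed file path:

```shell
# /etc/squid/squid.conf
# dstdomain ACLs can be loaded from a file (path is an assumption):
acl blocked_sites dstdomain "/etc/squid/blocked_domains.txt"
http_access deny blocked_sites

# /etc/squid/blocked_domains.txt - one entry per line; the leading dot
# matches acmegrocery.com and every subdomain of it:
#   .acmegrocery.com
#
# After editing the list, reload without a restart:
#   squid -k reconfigure
```

DansGuardian then adds content filtering and nicer list management on top of this.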
This is the exact solution I used at a company a while back. All free.
I built a machine from parts lying around our lab, you really don't need much horsepower or juice to accomplish this.
Install squid, install dansguardian and voila! Another good idea is to set up the web interface: you create logins and passwords for the users who should be able to see what other users are doing, and they can log into the web interface, scroll through a list of names, and see all the sites each person visited.
Dansguardian also comes with (when I last used it at least) a precompiled list of bad domains and ip addresses.
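Your own additions go in DansGuardian's plain-text list files alongside that precompiled set. A sketch, assuming Debian-style paths (adjust for your distro):

```shell
# /etc/dansguardian/lists/bannedsitelist  (path is an assumption)
# One domain per line; an entry matches the domain and its subdomains:
acmegrocery.com

# Blocking by a portion of the URL rather than the whole site goes in
# bannedurllist instead, e.g.:
#   acmegrocery.com/specials
```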
The only downfall - depending, actually, upon your perspective - is that squid requires an extra credential login prompt if you want per-user logging.
All of this should be able to be accomplished without any cost.