Linux - Newbie
This Linux forum is for members that are new to Linux.
Just starting out and have a question?
If it is not in the man pages or the how-tos, this is the place!
Welcome to LinuxQuestions.org, a friendly and active Linux Community.
I'm new to Linux and I'm running Mandrake version 9.0. I've got squid and squidguard installed. Here is what I want to be able to do, but I'm not exactly sure how to do it...
I have many computers which need to download daily data updates. The updates can be large, and over a slow connection I'd rather not have 10 separate downloads. I've started to set up squid as a cache.
Now, here's the logic I'm hoping to be able to implement.
One machine would access an external site daily and download the database update file. From that point on, all other machines (which can be in a list of specific IPs) would get the data ONLY from the cache.
For example, mach (IP 192.168.0.200) would need access to the outside. All other machines (IPs 192.168.0.201-211) would only ever hit the cache. I'd like to be able to clear the cache every day, and until the .200 machine downloads the data, the other machines should get errors reported back.
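In case it helps anyone sketch an answer, here is roughly what I imagine that logic looking like in squid.conf (I'm assuming the Squid 2.x that ships with Mandrake 9.0; the ACL names are placeholders I made up, and the IPs mirror the example above):

```
# Hypothetical squid.conf fragment -- ACL names are placeholders.
acl updater src 192.168.0.200
acl clients src 192.168.0.201-192.168.0.211
acl all src 0.0.0.0/0.0.0.0

# Both groups may talk to the proxy at all.
http_access allow updater
http_access allow clients
http_access deny all

# Only the updater may cause a cache miss (i.e. a fetch from the
# origin server). Everyone else is served from cache, and a request
# for an object that is not already cached gets an error back.
miss_access allow updater
miss_access deny all
```

As I understand it, the clients would only get a cache hit if they request exactly the same URL the updater fetched, and only while Squid still considers the object fresh, so refresh_pattern tuning might also be needed.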
Is this possible to do with squid/squidguard? It seems like both tools can do parts of the problem; I'm just unsure exactly how to set it up.
Or, if there's a better way, I'd love to hear it.
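For the daily cache clearing part, I'm guessing a cron entry along these lines would do it (the paths, init-script name, and schedule are assumptions; the cache directory should match whatever cache_dir is set to in squid.conf):

```
# Hypothetical /etc/crontab entry -- wipe the cache at 03:00 each night.
# Squid has to be stopped before its cache_dir is removed, and
# `squid -z` recreates the swap directories before the restart.
0 3 * * * root /etc/init.d/squid stop && rm -rf /var/spool/squid/* && /usr/sbin/squid -z && /etc/init.d/squid start
```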
That might work, I'll have to check on that. At the moment I'm not (knowingly) running Apache or any other server on the Linux box.
The systems requesting the data are embedded systems we are running to control scheduling for transportation system operators. I'm not sure if I can have the computer mimic one of these systems; it seems very possible, so let me check on that. The real key is that I don't have control over the software on the boxes requesting the data, just the network. The boxes support a proxy, so squid/squidguard seemed like a very obvious solution *but I could very easily be wrong!*