LinuxQuestions.org
Linux - Newbie This Linux forum is for members that are new to Linux.
Old 01-13-2003, 12:59 AM   #1
Jeff D
LQ Newbie (Registered: Jan 2003, Posts: 2)
Can squid/squidguard be used to do this?


I'm new to Linux and I'm running Mandrake version 9.0. I've got squid and squidguard installed. Here is what I want to be able to do, but I'm not exactly sure how to do it...

I have many computers which need to download daily data updates. The updates can be large and over a slow connection I'd rather not have 10 downloads. I've started to set up squid as a cache.

Now, here's the logic I'm hoping to implement.

Have one machine be able to access an external site daily and download the database update file. From that point on all other machines (which can be in a list of special IPs) can get the data ONLY from the cache.

For example, mach (IP 192.168.0.200) would need access to the outside. All other machines (IPs 192.168.0.201-211) would only be able to fetch from the cache. I'd like to be able to clear the cache every day, and until the .200 machine downloads the data, the other machines should get errors reported back.

Is this possible to do with squid/squidGuard? It seems like each tool can handle part of the problem; I'm just unsure how exactly to set it up.
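[Editor's note] Squid on its own can express this kind of split: `http_access` controls who may use the proxy at all, while `miss_access` controls who may trigger a cache MISS (i.e., an outbound fetch from the origin). Clients denied by `miss_access` get an error unless the object is already cached, which matches the behavior described above. A minimal squid.conf sketch; the ACL names are made up for illustration, and this is untested:

```
# squid.conf fragment (sketch, untested)
acl downloader src 192.168.0.200                 # the one machine allowed outside
acl cacheonly  src 192.168.0.201-192.168.0.211   # machines restricted to cached copies

http_access allow downloader
http_access allow cacheonly
http_access deny all

# Only the downloader may cause a cache MISS (fetch from the origin);
# everyone else is served from cache or receives an error.
miss_access allow downloader
miss_access deny cacheonly
```

Clearing the cache daily is a separate step: there is no single directive for it, so one rough approach is a cron job that stops squid, empties the cache_dir, re-initializes it with `squid -z`, and restarts squid.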

Or, if there's a better way, I'd love to hear it.

Thanks!
 
Old 01-13-2003, 01:37 AM   #2
DavidPhillips
LQ Guru (Registered: Jun 2001, Location: South Alabama, Distribution: Fedora / RedHat / SuSE, Posts: 7,163)
I would do it with the server.

Download the file with wget to a folder on your web server or FTP server.

You can use one cron job to download it when you want it there, and another cron job to delete it.
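[Editor's note] In practice the suggestion above could look something like this crontab fragment; the URL, paths, and times are placeholders for illustration, not values from the thread:

```
# crontab fragment (sketch; URL and paths are hypothetical)
# 01:55 - delete yesterday's copy just before the new fetch
55 1 * * * rm -f /var/www/html/updates/daily.db
# 02:00 - fetch the daily update into the web server's document root
0 2 * * * wget -q -O /var/www/html/updates/daily.db http://example.com/daily.db
```

With this layout the other machines would request the file from the local web server; until the 02:00 fetch completes, they would get a 404, which roughly matches the "errors until the download happens" behavior the original poster wanted.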

Last edited by DavidPhillips; 01-13-2003 at 01:48 AM.
 
Old 01-13-2003, 04:45 AM   #3
Jeff D
LQ Newbie (Registered: Jan 2003, Posts: 2) (Original Poster)
That might work, I'll have to check on that. At the moment I'm not (knowingly) running Apache or any other server on the Linux box.

The system requesting the data is an embedded system we run to schedule transportation-system operators. I'm not sure if I can have the Linux box mimic one of these systems; it seems very possible, so let me check on that. The real key is that I don't have control over the software on the boxes requesting the data, just the network. The boxes support a proxy, so squid/squidGuard seemed like a very obvious solution *but I could very easily be wrong!*

Wget seems like a very useful tool; I'll look into it.

Thanks.
 
  

