DaHammer
03-22-2005 01:45 AM
You can do it with wget. Here's an example:
Code:
wget -m -np -nH --cut-dirs=4 "ftp://slackware.mirrors.tds.net/pub/slackware/slackware-10.1/isolinux/"
This will download everything in the "isolinux" directory, as well as everything in any directory below it, into the directory you are in when you execute it. Here's how it breaks down:
-m - Turns on mirroring: recursion with infinite depth, plus timestamping so only new or changed files are fetched
-np - Prevents wget from ascending into the directories above the one you want
-nH - Cuts off the "slackware.mirrors.tds.net" hostname directory
--cut-dirs=4 - Cuts off the "pub/slackware/slackware-10.1/isolinux" directories
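Just to illustrate what those last 2 options are trimming, here's a quick sketch of the path math ("initrd.img" is only a hypothetical filename from that directory):

```shell
# Rough illustration of what -nH and --cut-dirs=4 trim from the saved path.
# (initrd.img is just an example filename, not necessarily present there.)
url_path="slackware.mirrors.tds.net/pub/slackware/slackware-10.1/isolinux/initrd.img"

# -nH drops the hostname component:
no_host="${url_path#*/}"      # pub/slackware/slackware-10.1/isolinux/initrd.img

# --cut-dirs=4 drops the next 4 directory components:
saved="$(printf '%s\n' "$no_host" | cut -d/ -f5-)"   # initrd.img

echo "wget saves the file as: ./$saved"
```

So with both options, the file lands directly in the directory you run wget from instead of 5 levels deep.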
Without the last 2, your files would be stored at "slackware.mirrors.tds.net/pub/slackware/slackware-10.1/isolinux" under the directory you ran it from. See "man wget", as there are loads of options. If you want to run this daily automatically, then use cron, along with the -P option to wget. The -P option tells wget where to store the files; the default is "." or the current directory. To do that, run "crontab -e" from a command line and add something like this:
Code:
10 6 * * * wget -m -np -nH --cut-dirs=4 -P "/home/jorasmi/files" "ftp://slackware.mirrors.tds.net/pub/slackware/slackware-10.1/isolinux/"
That would run the command every day at 6:10 AM and email you the output. wget writes its progress to stderr, so add " > /dev/null 2>&1" without quotes onto the end if you don't want to be emailed the output. Or you can use the -o option to wget to write the output to a log file instead and only get mail when something goes wrong. As you can see, 100s of ways to skin this cat. :) At any rate, the above will only download files that have changed, and it does so automatically so you don't have to touch it. If you want to use the HTTP protocol instead, that's possible as well, but it introduces other issues, like the index.html files and robots.txt if they apply. Anyway, play with it a bit to get it like you want it before you automate it.
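For the log file approach, a crontab entry along these lines should work (the log path is just an example; the "||" part prints a message, which cron then mails to you, only if wget exits with an error):

```shell
# Hypothetical variant: wget logs to a file via -o, so cron stays quiet
# unless wget fails, in which case the echo output gets mailed to you.
10 6 * * * wget -m -np -nH --cut-dirs=4 -o /home/jorasmi/wget.log -P "/home/jorasmi/files" "ftp://slackware.mirrors.tds.net/pub/slackware/slackware-10.1/isolinux/" || echo "mirror failed, see /home/jorasmi/wget.log"
```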