Slackware: This Forum is for the discussion of Slackware Linux.
Greets. I've been messing around with this for quite some time. I've read the man page and the online manual, and just plain googled it, but I can't get wget to do what I need it to do...

Needing to stick with wget is the thing. I know some ftp client would be way better, but I'm trying to get wget to do it. Perhaps someone who knows wget better than I do can throw me a bone.

All I want is the last directory listed below. Instead, wget recursively works backwards, up through the parent directories, which I find really odd. That leads you to mirroring the entire slackware.cs.utah.edu site, when all I want is the slackware64 directory.

Any clues? Thanks. I've tried a whole host of options and haven't found the right combination yet...
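For later readers: the "works backwards" behavior is wget following the parent-directory links in the listings, and the usual fix is --no-parent (-np). A minimal sketch, with the exact tree path being an assumption; substitute whatever subtree you actually want:

```shell
# --no-parent keeps recursion from climbing above the starting directory,
# so only the slackware64 tree is mirrored, not the whole site.
# The slackware/slackware64-15.0 path is a hypothetical example.
wget --mirror --no-parent --no-host-directories --cut-dirs=1 \
     ftp://slackware.cs.utah.edu/slackware/slackware64-15.0/
```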
Believe me, rsync would be my first choice, but this is for a script and I need to keep the deps down to basic Linux commands, one of which I consider wget to be. That, and I already use wget in the script for a few other things.
This command can be run to automatically download updates to a local mirror.
Note- Any outdated packages are still kept.
wget --no-host-directories --cut-dirs=2 --mirror \
     --directory-prefix=<your local mirror directory> \
     --output-file=<path and name of output log file> \
     ftp://<url of your mirror>/<path to slackware>/
That doesn't recursively work backwards but it also doesn't grab any of the packages. It just makes the directory tree of all the package sets and then stops....
So frustrating....
Hi, jong357,

You're almost there, I guess. Just remove the "-l1" option (or use "-l3" or so...).
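The reason that helps: --mirror is shorthand for -r -N -l inf --no-remove-listing, so an explicit -l1 overrides the infinite recursion depth and stops wget after the first level of listings, which matches the "makes the tree and then stops" symptom. Roughly (mirror URL and path are placeholders, as above):

```shell
# -l1 caps recursion at one level: wget creates the package-set
# directories but never descends into them to fetch the packages.
# Dropping the -l flag lets --mirror's implied "-l inf" take effect.
wget --mirror --no-parent ftp://<your mirror>/<path to slackware64>/
```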
Sure wget is a basic command. Any system that doesn't have it installed isn't worth chicken scratch.
I however do not keep lftp or rsync on any of my systems. Anyhoo....
Removing the depth switch (-l) and, especially, adding a trailing slash at the end seems to have done it. Thank you both for your input.

Actually, I won't know for sure for another 30 minutes or so, but the behavior is different already. It went straight to slackware64, is staying there, and is diligently downloading everything in that directory.

If I don't post back with reports of it downloading other crap it's not supposed to, then it worked. Thanks guys!
Pretty sure I had to use "-e robots=off" because their robots.txt file was denying me access. The --wait is out of courtesy just for bypassing their robots file....
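Putting the thread together, the final invocation presumably looked something like the sketch below. The URL and wait interval are assumptions; note that robots.txt applies to HTTP rather than FTP, so -e robots=off matters when mirroring over http://:

```shell
# -e robots=off executes the .wgetrc command "robots=off", telling wget
# to ignore the site's robots.txt; --wait=1 pauses a second between
# retrievals as a courtesy to the server. The trailing slash matters:
# it tells wget the URL is a directory to recurse into.
wget --mirror --no-parent -e robots=off --wait=1 \
     --no-host-directories --cut-dirs=1 \
     http://slackware.cs.utah.edu/slackware/slackware64-15.0/
```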