How can I download a bunch of files from an http site?
Quote:
Originally Posted by abefroman
How can I download a bunch of files from an http site?
If this is a site that simply lists the files and little else, open Nautilus, paste the address into the location bar, and replace http:// with ftp://. This works on a lot of sites, such as mirrors.kernel.org, provided the site also has FTP set up.
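If you'd rather stay in a terminal, wget can grab a whole directory of a listing-style site recursively. A minimal sketch; the URL, path depth, and file pattern here are just placeholders to adjust for your site:

Code:
# -r            recurse into links on the page
# -np           never ascend to the parent directory
# -nH           don't create a top-level hostname directory
# --cut-dirs=2  drop the leading path components (match the URL's depth)
# -A '*.iso'    only keep files matching this pattern (optional)
wget -r -np -nH --cut-dirs=2 -A '*.iso' http://mirrors.kernel.org/some/dir/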
I use the FlashGot extension for Firefox. It works great and ties into your favorite downloader automatically, e.g. wget, curl, KGet, etc. Just highlight the whole page of links, then right-click --> FlashGot.
I've never tried 'recursing' with it -- if you mean downloading whole folders -- but it may do that. Let us know if it does, if you try it.
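If the files follow a numbered pattern, curl's URL globbing can fetch a batch without any recursion at all. A quick sketch; the host and filenames are made up for illustration:

Code:
# curl expands [01-10] into ten separate requests; with a globbed URL,
# a single -O saves each file under its remote name.
curl -O "http://example.com/files/photo[01-10].jpg"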