
abefroman 12-10-2009 04:55 PM

How can I download a bunch of files from an http site?

Is there an easier way than clicking on each one?

And if it could get them all recursively, that would be nice.

TIA

pljvaldez 12-10-2009 05:03 PM

Try using wget.
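Something like this should pull down everything under a directory listing (the URL is just a placeholder; adjust -R and --cut-dirs for the site's layout):

Code:

wget -r -np -nH --cut-dirs=1 -R "index.html*" http://example.com/files/

-r recurses through the links, -np keeps it from climbing above /files/, -nH and --cut-dirs=1 keep the hostname and leading directory out of the saved paths, and -R skips the auto-generated index pages.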

kellemes 12-11-2009 02:30 AM

Or through one of the many Firefox extensions, such as DownThemAll.

bendib 12-16-2009 08:58 PM

Quote:

Originally Posted by abefroman (Post 3787111)
How can I download a bunch of files from an http site?

If this is a site that simply lists the files and little else, open Nautilus, copy and paste the address into the location bar, and replace http:// with ftp://. This works on a lot of sites, like mirrors.kernel.org; of course, the site has to have FTP set up.
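For example, an address like http://mirrors.kernel.org/ubuntu/ (the path is only an example) would become:

Code:

ftp://mirrors.kernel.org/ubuntu/

and Nautilus then shows the directory as an ordinary folder you can copy files out of.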

GrapefruiTgirl 12-16-2009 09:00 PM

I use the FlashGot extension for Firefox. It works great and ties into your favorite downloader automatically, e.g. Wget, Curl, KGet, etc. Just highlight the whole page of links and right-click --> FlashGot.
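Under the hood it just hands the selected URLs to whichever downloader you picked, so the end result is roughly the same as saving the links to a text file and running (the filename is only an example):

Code:

wget -i links.txt

with one URL per line in links.txt.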

I've never tried 'recursing' with it -- if you mean downloading whole folders -- but it may do that. Let us know if it does, if you try it.

