LinuxQuestions.org
Linux - Desktop This forum is for the discussion of all Linux Software used in a desktop context.

Old 12-10-2009, 04:55 PM   #1
abefroman
Senior Member
 
Registered: Feb 2004
Location: lost+found
Distribution: CentOS
Posts: 1,430

Rep: Reputation: 55
How can I download a bunch of files from an http site?



Is there an easier way than clicking on each one?

And if it could get them all recursively that would be nice.

TIA
 
Old 12-10-2009, 05:03 PM   #2
pljvaldez
LQ Guru
 
Registered: Dec 2005
Location: Somewhere on the String
Distribution: Debian Wheezy (x86)
Posts: 6,094

Rep: Reputation: 281
Try using wget.
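A minimal sketch of what that wget call could look like, assuming the files all sit under one directory of the site; the URL and the --accept patterns here are placeholders, and the command is echoed rather than actually run:

```shell
# Hypothetical sketch: fetch every matching file linked under one directory,
# without climbing to the parent directory or wandering off to other hosts.
# The URL and the accept patterns are placeholders -- substitute the real site.
url="http://example.com/files/"
cmd="wget --recursive --level=2 --no-parent --accept '*.iso,*.tar.gz' $url"
echo "$cmd"   # printed instead of executed; run the wget line directly for real
```

--no-parent keeps wget below the starting directory and --level caps the recursion depth; without --accept it takes everything it finds.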
 
Old 12-11-2009, 02:30 AM   #3
kellemes
LQ Newbie
 
Registered: Oct 2005
Location: Amsterdam
Distribution: Arch
Posts: 26

Rep: Reputation: 18
Or through one of many Firefox extensions, e.g. DownThemAll.
 
Old 12-16-2009, 08:58 PM   #4
bendib
Member
 
Registered: Feb 2009
Location: I'm the rat in your couch.
Distribution: Fedora on servers, Debian on PPC Mac, custom source-built for desktops
Posts: 174

Rep: Reputation: 40
Quote:
Originally Posted by abefroman View Post
How can I download a bunch of files from an http site?
If this is a site that simply lists the files and little else, open Nautilus, copy and paste the address into the Nautilus location bar, and replace http:// with ftp://. This works on a lot of sites, like mirrors.kernel.org; of course, they have to have FTP set up.
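The scheme swap described above can be sketched in the shell; the mirror path below is only an example of a site that serves the same tree over both protocols:

```shell
# Sketch of the address rewrite described above: take the HTTP address and
# swap the scheme to ftp:// before pasting it into the Nautilus location bar.
# The mirror path is an example; the site must actually run an FTP server.
http_url="http://mirrors.kernel.org/centos/"
ftp_url="ftp://${http_url#http://}"   # strip "http://", prepend "ftp://"
echo "$ftp_url"                       # -> ftp://mirrors.kernel.org/centos/
```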
 
Old 12-16-2009, 09:00 PM   #5
GrapefruiTgirl
LQ Guru
 
Registered: Dec 2006
Location: underground
Distribution: Slackware64
Posts: 7,594

Rep: Reputation: 555
I use the FlashGot extension for Firefox. It works great, and it ties into your favorite downloader automatically (e.g. wget, curl, KGet). Just highlight the whole page of links, then right-click --> FlashGot.

I've never tried 'recursing' with it -- if you mean downloading whole folders -- but it may do that. Let us know if it does, if you try it.
 
  


