Old 05-25-2009, 03:56 AM   #1
Steve W
Member
 
Registered: Mar 2007
Distribution: Linux Mint 18.1
Posts: 511

Rep: Reputation: 44
wget: download multiple files


To my delight, I found a website containing loads of scans of old computer magazines. Fancying a wallow in some nostalgia, but without wanting to download each jpeg scan individually, I did a bit of Googling and found the Linux 'wget' command.

However, when I use it in this syntax:

Code:
wget ftp://ftp.worldofspectrum.org/pub/si...ue8112/Pages/*

... it seems to "log on and log off" each time a file is downloaded (so about once every couple of seconds!). My shell screen is filled with messages like the one below for every file downloaded:

Code:
--2009-05-25 09:46:24-- ftp://ftp.worldofspectrum.org/pub/si...r811200100.jpg
           => `YourComputer811200100.jpg'
==> CWD not required.
==> PASV ... done.  ==> RETR YourComputer811200100.jpg ... done.
Length: 236257 (231K)

100%[======================================>] 236,257     375K/s   in 0.6s

2009-05-25 09:46:25 (375 KB/s) - `YourComputer811200100.jpg' saved [236257]

I was just wondering, before I download any more using wget, whether there is an option to download the files with minimal feedback, or whether I am in fact using wget correctly in this case.

I did have a scan through a wget manual (this does *not* appear to be a simple command!), and the nearest I found was a way to write output to a text file rather than the shell screen.

Am I using wget optimally for downloading numerous small files from one FTP address in this way?

Steve

P.S. Am quite impressed that Linux has a built-in command for doing this kind of thing. In the old days when I used Windows, I needed to install separate software to bulk-download multiple files like this. Go-Zilla, I think it was called. With Linux, it's just all there!
 
Old 05-25-2009, 04:12 AM   #2
AwesomeMachine
LQ Guru
 
Registered: Jan 2005
Location: USA and Italy
Distribution: Debian testing/sid; OpenSuSE; Fedora; Mint
Posts: 5,521

Rep: Reputation: 1015
wget is written for bulk downloads, either from a single site or from a file listing the link targets. I never had a problem with the screen output, but I'm sure there is a way to silence it. It doesn't hurt anything to have wget print to the screen, and wget does reconnect for every file. Sometimes it will reuse the same connection. That's how it's supposed to work.
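For example, a list file can be fed to wget with -i (a minimal sketch; the host and file names are stand-ins, not the real site):

Code:
# urls.txt holds one link target per line
cat > urls.txt <<'EOF'
ftp://ftp.example.com/pub/pages/page001.jpg
ftp://ftp.example.com/pub/pages/page002.jpg
EOF
# -i reads targets from the list; -nc skips files that already exist locally
wget -nc -i urls.txt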

Curl is a similar command, but it can use lists and ranges, even nested.
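curl's range and list syntax (URL "globbing") would look something like this; the host and paths here are made up for illustration:

Code:
# [001-100] expands to a zero-padded numeric range, {8111,8112} to a list,
# and the two can be nested in one URL. -O saves each file under its remote name.
curl -O "ftp://ftp.example.com/pub/issue{8111,8112}/Pages/page[001-100].jpg"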
 
Old 05-25-2009, 04:23 AM   #3
ivanatora
Member
 
Registered: Sep 2003
Location: Bulgaria
Distribution: Ubuntu 9.10, FreeBSD 7.2
Posts: 459

Rep: Reputation: 31
man wget:
Code:
       -q
       --quiet
           Turn off Wget's output.
 
Old 05-25-2009, 04:25 AM   #4
i92guboj
Gentoo support team
 
Registered: May 2008
Location: Lucena, Córdoba (Spain)
Distribution: Gentoo
Posts: 4,083

Rep: Reputation: 405
Quote:
Originally Posted by Steve W
Am I using wget optimally for downloading small and numerous files from one ftp address in this way?
There's no problem with that. It's just fine. But if you want quieter output you can use either --no-verbose for minimal output, or --quiet for a completely quiet wget.
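For instance (using a stand-in host, since the original URL is truncated):

Code:
# --no-verbose (-nv): one summary line per file, no progress bars
wget -nv "ftp://ftp.example.com/pub/pages/*"

# --quiet (-q): no output at all; check the exit status instead
wget -q "ftp://ftp.example.com/pub/pages/*" && echo "all files saved"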

Quote:
P.S. Am quite impressed that Linux has a built-in command for doing this kind of thing. In the old days when I used Windows, I needed to install separate software to bulk-download multiple files like this. Go-Zilla, I think it was called. With Linux, it's just all there!
To be fair, Linux is just a kernel. The userland tools that are available depend on the creators of your distribution. wget just happens to be common enough that most distros ship it, except maybe the smaller ones that can fit on a floppy.
 
Old 05-25-2009, 05:56 AM   #5
H_TeXMeX_H
LQ Guru
 
Registered: Oct 2005
Location: $RANDOM
Distribution: slackware64
Posts: 12,928
Blog Entries: 2

Rep: Reputation: 1301
For FTP, just use something like lftp. wget is more general and usually more useful, but for FTP I would use lftp. Here's an example of downloading all the tagfiles from the Slackware server:

Code:
lftp -c 'open -e "mget -d */tagfile" ftp://ftp.slackware.com/pub/slackware/slackware-12.2/slackware/'
Note that you may want to omit the '-d' option if you don't want to preserve the directory structure.
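For the magazine scans above, the same approach would look something like this (the host and path are placeholders, since the original URL is truncated):

Code:
# Without -d, all matching files land flat in the current directory
lftp -c 'open ftp://ftp.example.com/pub/pages/; mget *.jpg'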

Last edited by H_TeXMeX_H; 05-25-2009 at 05:58 AM.
 
Old 05-25-2009, 10:28 AM   #6
Steve W
Member
 
Registered: Mar 2007
Distribution: Linux Mint 18.1
Posts: 511

Original Poster
Rep: Reputation: 44
Okay, thanks for the information. And yes, 'i92', I realise that, technically, Linux is just the kernel of the operating system I'm using (Ubuntu); but I tend to be one of those lazy people who use the word 'Linux' as a generic term for the whole open source operating system, even though that is not exactly correct.

I'm aware of the arguments that rage in some quarters between the pro-Torvalds and pro-Stallman camps; but I don't use the expression 'GNU/Linux' either.

I saw an article a while back complaining that Ubuntu was now so popular, the term was becoming synonymous in Google results with the word 'Linux', as people seemed to use them interchangeably. Although I guess it would be more correct to say I was using "Ubuntu" (when referring to the entire OS) than I was using "Linux"...

Steve
 
Old 05-25-2009, 11:29 AM   #7
i92guboj
Gentoo support team
 
Registered: May 2008
Location: Lucena, Córdoba (Spain)
Distribution: Gentoo
Posts: 4,083

Rep: Reputation: 405
I am not as picky about the terminology as it might seem. I really don't care much about that.

My main concern there was the comparison with Windows. Not that I enjoy defending Windows, but wget is no more a part of Linux than eMule or whatever is a part of Windows. Both are third-party applications that run on a given OS.

On the contrary, any of the BSDs, as far as I know, would meet that requirement: they are complete OSes that ship a basic set of tools, a proper toolchain and a compiler in the same package, without the need for third-party applications. It's true, however, that if we speak about "Ubuntu", then wget will always be part of the basic installation, and that will be true for most distros as well.
 
  

