LinuxQuestions.org
Linux - Newbie This Linux forum is for members that are new to Linux.
Just starting out and have a question? If it is not in the man pages or the how-to's this is the place!

Old 07-18-2011, 06:22 AM   #1
haojam
LQ Newbie
 
Registered: Jul 2011
Posts: 1

[Mocha] - I would like to retrieve multiple files at one go.


Dear Sir,
I would like to retrieve a list of multiple files in one go. Each file is large, so downloading all 14 files with wget is too slow. Could you please suggest another option for doing this?

ftp://ftp-trace.ncbi.nlm.nih.gov/sra...X017/SRX017837

Regards,
Mocha
Seoul National University College of Medicine
 
Old 07-18-2011, 06:40 AM   #2
b0uncer
LQ Guru
 
Registered: Aug 2003
Distribution: CentOS, OS X
Posts: 5,131

Quote:
Originally Posted by haojam View Post
Each file is big size so wget command is too slow to download all 14 files.
Could you be a little more specific about this? I don't think wget "is slow": it downloads as fast as the connection allows (theoretically up to the point where disk I/O and such things start to slow it down, but that will probably not be the problem here).

If you mean that your network connection is fast but each single connection to the server is slow, i.e. the bottleneck is a per-connection speed limit on the server side, then you could try downloading several files in parallel. This speeds things up only if each connection can run at the same speed as a single one, i.e. the limit really is per connection rather than over all connections to the server combined. However, unless it's absolutely necessary I recommend not doing this: there is probably a reason for the speed limit, and one should always respect the limits set on remote servers.

If you need an example of how you can use wget (other programs would be fine too) to do parallel downloads, see for example this. Basically it fetches a list of what to download, then pipes that through (e)grep and xargs, which pick out the files and build argument lists for multiple wget commands that then start the downloads simultaneously.
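As a rough sketch of that idea (not the exact script linked above), assuming the 14 URLs have been collected one per line into a hypothetical file urls.txt (the example.org URLs below are stand-ins, not real download links):

```shell
#!/bin/sh
# Sketch: fetch several files in parallel with xargs + wget.
# urls.txt is a hypothetical file listing one URL per line.
printf '%s\n' \
  'ftp://example.org/file1' \
  'ftp://example.org/file2' \
  'ftp://example.org/file3' > urls.txt   # stand-in list for this sketch

# -n 1: hand one URL to each wget invocation; -P 4: run up to 4 at once.
# Commented out here so the sketch does not touch the network:
# xargs -n 1 -P 4 wget -q < urls.txt

# The same fan-out, demonstrated with echo in place of wget:
xargs -n 1 -P 4 echo would-fetch < urls.txt
```

wget itself has no built-in parallel mode, so xargs -P (or simply starting several wget processes in separate shells) is the usual workaround; keep -P small so you don't hammer the server.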

If I misunderstood your question, please ignore my answer (unless it's helpful in some way) and perhaps re-phrase your post to get better/correct answers.
 
  

