Old 05-11-2006, 11:19 AM   #1
LQ Newbie
Registered: May 2006
Location: Little Rock, AR
Posts: 12

Rep: Reputation: 0
Using wget to copy large amounts of data via FTP

I have an older production system that I need to get 3GB of data from. The old system is running ftp and I thought about using wget to retrieve the data.

Here is the command I planned on using:

wget --mirror -np -nH --ftp-user=myuser --ftp-password=mypass --cut-dirs=2 ftp://x.x.x.x//path/to/dir

1) The above command copies the entire contents of /path/to/dir to my local working directory. I tested this command and it seems to work as expected. Is wget a good tool for this type of thing? Are there any other options I should consider adding to the command?

2) Should I be concerned about thrashing the hard drives on the old system? The old system uses U320 SCSI RAID and is in working order AFAIK. It has about 1GB of RAM and will not be in use while I copy the data. Any thoughts on this? I am thinking about copying the data over in small chunks to avoid this as a potential problem.
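For reference, here is the variant I am considering to go easier on the old disks, with throttling and resume options added (the 500k rate limit is just a guess on my part):

```shell
# Same mirror command as above, plus throttling and resume options.
# --limit-rate caps how fast the old box has to read from disk,
# --wait pauses a second between files, and -c resumes a file
# if the transfer gets interrupted partway through.
wget --mirror -np -nH -c --limit-rate=500k --wait=1 \
     --cut-dirs=2 \
     --ftp-user=myuser --ftp-password=mypass \
     ftp://x.x.x.x//path/to/dir
```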

Old 05-11-2006, 11:55 AM   #2
Senior Member
Registered: Oct 2003
Location: /earth/usa/nj (UTC-5)
Distribution: RHL9;F1-10; CentOS4-5; DebianSarge-Squeeze
Posts: 1,151

Rep: Reputation: 46
If you have root privileges on both systems, then you should take a look at rsync.

With root privileges on both systems, you can do a fairly simple copy over ssh without needing to set up an rsync server.

If you are doing a WAN transfer, there are some nice encryption and compression options available in rsync.

I usually reserve wget for downloading CD-sized stuff.

Oops, almost forgot the reference:

Last edited by WhatsHisName; 05-11-2006 at 12:00 PM.
