LinuxQuestions.org (/questions/)
-   Linux - General (https://www.linuxquestions.org/questions/linux-general-1/)
-   -   Using wget to copy large amounts of data via ftp. (https://www.linuxquestions.org/questions/linux-general-1/using-wget-to-copy-large-amounts-of-data-via-ftp-443829/)

AndrewCAtWayofthebit 05-11-2006 11:19 AM

Using wget to copy large amounts of data via ftp.
 
I have an older production system that I need to get 3GB of data from. The old system is running an FTP server, and I thought about using wget to retrieve the data.

Here is the command I planned on using:

wget --mirror -np -nH --ftp-user=myuser --ftp-password=mypass --cut-dirs=2 ftp://x.x.x.x//path/to/dir


1) The above command copies the entire contents of /path/to/dir to my local working directory. I tested this command and it seems to work as expected. Is wget a good tool for this type of thing? Are there any other options I should consider adding to the command?

2) Should I be concerned about thrashing the hard drives on the old system? The old system uses U320 SCSI RAID and is in working order AFAIK. It has about 1GB of RAM and will not be in use while I copy the data. Any thoughts on this? I am thinking about copying the data over in small chunks to avoid this as a potential problem.
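A throttled variant of the command might look something like this (the --limit-rate and --wait values are only illustrative, so adjust them to taste):

wget --mirror -np -nH --ftp-user=myuser --ftp-password=mypass --cut-dirs=2 --limit-rate=2m --wait=1 -c ftp://x.x.x.x//path/to/dir

--limit-rate caps the download speed, --wait pauses between files so the old box isn't hammered continuously, and -c lets the transfer resume if it gets interrupted.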

Thanks,

WhatsHisName 05-11-2006 11:55 AM

If you have root privileges on both systems, then you should take a look at rsync.

With root privileges on both systems, you can do a fairly simple copy over ssh without needing to set up an rsync server.

If you are doing a WAN transfer, there are some nice encryption and compression options available in rsync.
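Something along these lines would pull the directory over ssh (the host, the paths, and the --bwlimit value are just placeholders):

rsync -avz --progress --bwlimit=5000 -e ssh root@x.x.x.x:/path/to/dir/ /local/working/dir/

-a preserves permissions, ownership and timestamps, -z compresses the data over the wire, and --bwlimit throttles the transfer if you are worried about loading the old box.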

I usually reserve wget for downloading CD-sized stuff.


Oops, almost forgot the reference: http://rsync.samba.org/documentation.html

