I have an older production system from which I need to retrieve 3 GB of data. The old system runs an FTP server, and I thought about using wget to pull the data.
Here is the command I planned on using:
wget --mirror -np -nH --ftp-user=myuser --ftp-password=mypass --cut-dirs=2 ftp://x.x.x.x//path/to/dir
1) The above command copies the entire contents of /path/to/dir to my local working directory. I tested this command and it seems to work as expected. Is wget a good tool for this type of thing? Are there any other options I should consider adding to the command?
2) Should I be concerned about thrashing the hard drives on the old system? The old system uses U320 SCSI RAID and is in working order AFAIK. It has about 1 GB of RAM and will not be in use while I copy the data. Any thoughts on this? I am considering copying the data over in small chunks to avoid this as a potential problem.
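In case it helps frame the question: instead of copying in manual chunks, I could throttle wget itself. A sketch of what I have in mind, using wget's --limit-rate, --wait, and --continue options (the rate value is just a placeholder I would tune to what the old array tolerates):

```shell
# Same mirror command, throttled: --limit-rate caps bandwidth (2m here is
# an example value, not a recommendation), -w pauses 1 second between
# files, and -c resumes partially transferred files if interrupted.
wget --mirror -np -nH --cut-dirs=2 \
     --ftp-user=myuser --ftp-password=mypass \
     --limit-rate=2m -w 1 -c \
     ftp://x.x.x.x//path/to/dir
```

Would throttling like this actually reduce wear on the old drives, or is it unnecessary for a one-time 3 GB copy?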