LinuxQuestions.org

-   -   Transfer 1000000 files (130GB) server to server via shell (https://www.linuxquestions.org/questions/linux-software-2/transfer-1000000-files-130gb-server-to-server-via-shell-543322/)

masali 04-04-2007 11:57 AM

Transfer 1000000 files (130GB) server to server via shell
 
Hi guys!

I am having a problem. I have had some problems with one of my two servers (located in the USA) and need to transfer all files to the other server (located in Germany). I've tried using rsync, but it would take days to transfer this data (I only get approx. 500 KB/sec). I have tried running several rsync sessions on different folders to speed things up with parallel transfers, and it helps a lot. The problem is that the folders vary so much in size that it is not very practical to start a new rsync session for each folder.

To the question: Is there any shell program I can use to transfer all these files in parallel (maybe 10-15 at a time) to speed up this process?

Thanks in advance
Masali
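
The several-rsync-sessions approach masali describes can be automated with xargs -P, which caps the number of simultaneous jobs. A rough sketch, assuming the data sits under /var/www/data (path, user and host below are all placeholders):

```shell
# Run up to 10 rsync sessions at once, one per top-level directory.
# -a preserves attributes, --partial keeps partial files on interruption.
cd /var/www/data
ls -d */ | xargs -n1 -P10 -I{} \
    rsync -a --partial "{}" "user@germany-host:/var/www/data/{}"
```

Since each rsync session gets one top-level directory, a very unbalanced tree will still leave one session running long after the others finish; splitting one level deeper helps in that case.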

treed 04-04-2007 12:09 PM

Copies
 
Maybe you can use wget to fetch the files if you have an FTP server set up.

http://www.cyberciti.biz/tips/linux-...ownloader.html


You can also try uucp. This is a Unix-to-Unix copy program. You can do a man on uucp for all the options.

Finally, another option is to try scp:

http://www.cyberciti.biz/faq/linux-o...work-computer/
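
For completeness, the scp and wget variants might look something like this (host names and paths are made up):

```shell
# Recursive copy over SSH; -C compresses on the wire, which can help
# on a slow link (less so for already-compressed files).
scp -r -C /var/www/data user@germany-host:/var/www/

# Or, if an FTP server is running on the source machine,
# mirror the tree from the destination with wget:
# wget -m ftp://user:password@usa-host/data/
```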

almatic 04-04-2007 01:08 PM

I'd pack the files first; that way you can also easily verify that everything transferred correctly (md5 checksum).
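
A minimal sketch of that idea, assuming the tree lives under /var/www/data (hypothetical path):

```shell
# Pack the whole tree once and record a checksum of the archive.
tar czf data.tar.gz /var/www/data
md5sum data.tar.gz > data.tar.gz.md5

# After copying data.tar.gz and data.tar.gz.md5 to the other server,
# verify the transfer there:
md5sum -c data.tar.gz.md5
```

One big archive also avoids the per-file overhead that makes a million small files so slow to transfer.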

