Syncing very large number of files to another server
I've got a folder with a very large number of files and subdirectories (it's full of Maildir folders for my company, actually); the file count runs into the millions. I need to replicate this to another server as a backup.
The number of files isn't the only consideration: the total data volume is also significant, even over gigabit Ethernet.
I've tried using rsync to avoid re-copying a large volume of data by transferring only the differences to the backup server, but the extremely large number of files hurts rsync badly: it takes forever just building up the file lists.
Does anybody have a better idea, or another tool, for replicating a directory structure that has both a large volume of data and a large number of files?