Forums > Linux Forums > Linux - General
Old 07-18-2008, 06:45 AM   #1
Registered: Jun 2005
Posts: 374

Rep: Reputation: 30
Syncing very large number of files to another server

I've got a folder with a very large number of files and subdirectories (it's full of maildir folders for my company, actually); the files number in the millions. I need to replicate this to another server as a backup.

The number of files is not the only consideration; the total volume of data is also significant, even over gigabit Ethernet.

I've tried using rsync to avoid re-copying a large volume of data by only transferring the differences to the backup server, but the extremely large number of files hurts rsync: it takes forever building its file lists.

Does anybody have a better idea, or another tool, for replicating a directory structure with both a large volume of data and a large number of files?
Old 07-18-2008, 08:58 AM   #2
Senior Member
Registered: Dec 2005
Location: Massachusetts, USA
Distribution: Ubuntu 10.04 and CentOS 5.5
Posts: 3,873

Rep: Reputation: 335
I think that the quickest solution would be to make a tar backup of the directory onto a fast medium such as a SATA or PATA hard drive. That would free up the active directory as quickly as possible. Then copy this tar file to a removable medium such as a USB external drive and carry it to the backup server. Lastly, of course, you would restore the tar archive onto the backup server.
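
A minimal sketch of that tar-and-carry sequence (all paths here are hypothetical, and the flags assume GNU tar):

```shell
#!/bin/sh
# /var/mail/company is the maildir tree, /mnt/scratch a fast local
# SATA/PATA drive, /mnt/usb the removable USB drive.  All hypothetical.

# 1. Archive onto the fast local drive, preserving permissions (-p).
tar -cpf /mnt/scratch/mail-backup.tar -C /var/mail company

# 2. Copy the archive to the removable medium.
cp /mnt/scratch/mail-backup.tar /mnt/usb/

# 3. Carry the drive to the backup server, then restore there:
tar -xpf /mnt/usb/mail-backup.tar -C /var/mail
```

Using -C keeps the archive paths relative (company/...), so it restores cleanly under any target directory.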

The advantages of this approach: the network is not flooded with this maintenance operation, so users keep full-speed access to the active server; it is the quickest method, so the active server is bogged down with the maintenance operation for the least amount of time; and you end up with a tar archive of the mail directory which you can store.

I recommend using the nice utility to start operations such as copying the archive file to the removable medium. Otherwise the server will be too busy performing this operation to provide good service to the network users.

Solution two is to add a second NIC to the server and put it on a physically separate LAN connected to a second NIC on the backup server. This could be a point-to-point network, and it could use gigabit network components. This would be fast, resulting in minimal time spent performing the backup or copy from the active server to the backup server. I worked at one place that did this for their servers, and it worked great! There the backup network wasn't point-to-point (it was a normal LAN with a router), but it was fast and it didn't interfere with the building LAN.

