duplicating and backing up a server
What is the most efficient way to organize duplicating a server alongside backing up that same server?
What I mean is: the data on a server has to be backed up, and the same data has to be copied to another server in case the first server goes down. That looks like doing the same thing twice, yet each copy can only serve one purpose, not both. Or am I missing something? In my case a backup is a moving target, because the data changes all the time. The data is collected through forms, so one of my questions is: is it more efficient to send the contents of a form to one server and duplicate that server with "rsync", or can the contents of the form be sent to two (or more) servers directly? Is there a conventional way to deal with this sort of problem (one or more copies of the server, plus one or more backups)? Your comments and advice are most welcome. Thank you for your help.
You need to be clear about what you are trying to duplicate. If you are trying to sync data in a database, you can use master-slave replication and simply fail over to the slave if the master goes down. If you are trying to duplicate files, you can write a script or service that copies files wherever they need to go, in whatever programming language you prefer. IIRC, there is a Linux kernel feature called inotify that lets programs find out when a change occurs in the filesystem; it is used by desktop search tools, and I once saw a Perl script that used it.
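To illustrate the file-copying approach: inotify itself needs a binding such as inotify-tools or a Perl/Python module, so as a portable sketch here is a simple polling mirror in Python that copies any file that is new or newer than its copy on the destination path. The function name and directory layout are my own for illustration; a real setup would more likely run rsync or an inotify watcher.

```python
import os
import shutil

def mirror_changed(src: str, dst: str) -> list:
    """Copy every file under src that is missing from dst or newer than
    the copy in dst. Returns the relative paths that were copied.
    (A polling stand-in for an inotify-driven or rsync-based sync.)"""
    copied = []
    for root, _dirs, files in os.walk(src):
        for name in files:
            s = os.path.join(root, name)
            rel = os.path.relpath(s, src)
            d = os.path.join(dst, rel)
            # copy2 preserves mtimes, so an unchanged file is skipped next run
            if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d):
                os.makedirs(os.path.dirname(d), exist_ok=True)
                shutil.copy2(s, d)
                copied.append(rel)
    return copied
```

Run from cron every few minutes this behaves much like a repeated `rsync -a src/ dst/`; inotify would just replace the polling with event-driven wakeups.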
Thank you for your answer.
It is the duplication of data kept in flat files, not in a conventional database like MySQL. I understand from your answer that this scenario would require a script to copy (or, more likely, sync) the files. If that is correct, what would be the right procedure for deciding whether new data goes to the master or to the slave for storage when the master is down? Is pinging the destination server an efficient and fast test? At first glance, rsync seems to be the way to set up the master/slave pair, and it removes the need for something like inotify. In such a setup, can a slave, or a second slave, be considered a backup?
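On the "is pinging enough?" question: ping only proves the host's network stack is up, not that the receiving service is. A minimal failover sketch is to attempt a TCP connection to the service port on each server in order of preference and use the first that answers. The function name, port, and timeout below are illustrative assumptions, not part of any standard tool.

```python
import socket

def pick_server(servers, port=22, timeout=2.0):
    """Return the first host whose service port accepts a TCP connection.
    Unlike ping, this checks that the service itself is listening."""
    for host in servers:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return host
        except OSError:
            continue  # host down or port closed; try the next candidate
    raise RuntimeError("no server reachable")
```

The form handler would call something like `pick_server(["master.example", "slave.example"])` before each upload, so new data automatically lands on the slave while the master is down.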
FWIW, I suspect you need to think in terms of how to handle the files rather than the servers. Again, once the files have been received by your upload server, you have full control. If you want to keep a permanent copy of each file, then perhaps do something like this: have one "incoming" directory and seven "daily" directories. Use two scripts or programs: one that runs periodically to move new files from the "incoming" directory into the "daily" directory for the current day, and a second that runs once a day to write yesterday's daily directory to tape or CD. Hope that helps.
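The first of those two jobs (incoming → daily) can be sketched as below; the `base/incoming` and `base/daily/<weekday>` layout and the function name are my own assumptions for illustration. The nightly archive job would then just point tar or your CD-burning tool at yesterday's directory.

```python
import datetime
import os
import shutil

def rotate_incoming(base: str) -> str:
    """Move everything in base/incoming into base/daily/<weekday>
    (e.g. base/daily/mon), creating the directory as needed.
    Returns the daily directory used, for logging."""
    day = datetime.date.today().strftime("%a").lower()  # "mon" .. "sun"
    incoming = os.path.join(base, "incoming")
    daily = os.path.join(base, "daily", day)
    os.makedirs(daily, exist_ok=True)
    for name in os.listdir(incoming):
        shutil.move(os.path.join(incoming, name), os.path.join(daily, name))
    return daily
```

Using seven weekday-named directories means each one is naturally overwritten a week later, giving a rolling seven-day window without any extra cleanup logic.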
Thank you very much, your explanation is very clear.