LinuxQuestions.org
Old 09-29-2007, 12:59 AM   #1
rblampain
Senior Member
 
Registered: Aug 2004
Location: Western Australia
Distribution: Debian 11
Posts: 1,288

Rep: Reputation: 52
duplicating and backing up a server


What is the most efficient way to organize the duplication of a server with the backup of the same server?

What I mean is this: the data on a server has to be backed up, and the same data has to be copied to another server in case the first server goes down. That is doing the same thing twice, yet each copy can only be used in one circumstance, not both.

Or am I missing something?

In my case, making a backup is a moving target because the data changes all the time. The data is collected through forms, so one of my questions is:
is it more efficient to send the contents of a form to one server and duplicate that server with "rsync", or can the contents of the form be sent to 2 servers (or more)?

Is there a conventional way to deal with this sort of problem (having one or more copies of the server and one or more backups)?

Your comments and advice are most welcome.

Thank you for your help.

Last edited by rblampain; 09-29-2007 at 01:01 AM.
 
Old 09-29-2007, 04:09 AM   #2
hob
Senior Member
 
Registered: Mar 2004
Location: Wales, UK
Distribution: Debian, Ubuntu
Posts: 1,075

Rep: Reputation: 45
You need to be clear about what you are trying to duplicate. If you are trying to sync data in a database, you can use master-slave replication and simply use the slave if the master goes down. If you are trying to duplicate files, you can write a script or service that copies files wherever they need to go, using whatever programming language you prefer. IIRC, there is a Linux feature called inotify which lets programs find out when a change occurs in the filesystem; it is used by desktop search, and I once saw a Perl script that used it.
 
Old 09-29-2007, 07:28 AM   #3
rblampain
Senior Member
 
Registered: Aug 2004
Location: Western Australia
Distribution: Debian 11
Posts: 1,288

Original Poster
Rep: Reputation: 52
Thank you for your answer.

It is the duplication of data kept in flat files, not in a conventional database like MySQL. I understand from your answer that this scenario would require a script to copy the files (or, more likely, sync them).

If this is correct, what would be the correct procedure to determine whether new data should go to the master or to the slave for storage when the master is down? Is pinging the destination server an efficient and fast check?

At first glance, using rsync seems to be the way to achieve the master/slave setup, and it removes the need for something like "inotify".

In such a situation, can a slave, or a second slave, be considered a backup?
 
Old 09-29-2007, 11:59 AM   #4
hob
Senior Member
 
Registered: Mar 2004
Location: Wales, UK
Distribution: Debian, Ubuntu
Posts: 1,075

Rep: Reputation: 45
Quote:
Originally Posted by rblampain View Post
Thank you for your answer.

It is the duplication of data kept in flat files, not in a conventional database like MySQL. I understand from your answer that this scenario would require a script to copy the files (or, more likely, sync them).
At the simplest level, a one-line shell script that runs rsync lets you make as many extra copies of a given directory as you like, and you can schedule it with cron. rsync integrates with SSH, so it makes no difference whether the clones are local or remote. That's the baseline; you can add complexity from there until you reach a solution that meets your requirements.

Quote:
If this is correct, what would be the correct procedure to determine whether new data should go to the master or to the slave for storage when the master is down? Is pinging the destination server an efficient and fast check?
You probably need to draw your architecture (the back of an envelope will do) to work this problem through. Each upload connection goes from one client to one server. Once you have received the upload you have full control of that data, and can copy/sync/move it. If you have multiple servers you can get different clients to connect to different servers, but each connection is still one-to-one.
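On the failover question, a hedged sketch of how the sending side might pick its one-to-one target (host names are hypothetical; note that ping only proves the host answers ICMP, not that the receiving service is actually up, so checking the service port is more reliable):

```shell
#!/bin/sh
# Try the master first; fall back to the slave if it is unreachable.
# Host names and paths are examples only.
MASTER=master.example.com
SLAVE=slave.example.com

if ping -c 1 -W 2 "$MASTER" >/dev/null 2>&1; then
    TARGET=$MASTER
else
    TARGET=$SLAVE
fi

rsync -a /srv/outgoing/ "$TARGET:/srv/incoming/"
```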

Quote:
At first glance, using rsync seems to be the way to achieve the master/slave setup, and it removes the need for something like "inotify".
inotify lets a process know when the contents of a directory change, rather than the process having to poll the directory periodically. It may or may not be useful for your particular problem, so I thought I should mention it.
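As an illustration, the inotifywait tool (from the inotify-tools package) exposes inotify to shell scripts; the directory and host names here are hypothetical:

```shell
#!/bin/sh
# React to new uploads as they arrive instead of polling with cron.
# Requires the inotify-tools package; paths and hosts are examples.
inotifywait -m -e close_write --format '%w%f' /srv/incoming |
while read -r file; do
    # Copy each file to the slave as soon as it is fully written and closed
    rsync -a "$file" backuphost:/srv/incoming/
done
```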

Quote:
In such a situation, can a slave, or a second slave, be considered a backup?
An active slave is never a backup. It is only a second copy of something, taken at a particular point in time. It is useful for load-balancing, or for rapidly restoring availability if your system can quickly detect a failure, but for genuine backup in the disaster-recovery sense you need a means of retrieving "frozen" copies of the dataset from several previous points in time. If a live master copy corrupts, the error can be replicated to any live slaves, which are then equally useless (been there, seen that).

FWIW, I suspect that you need to think in terms of how to handle the files, rather than the servers. Again, once the files have been received by your upload server you have full control.

If you want to keep a permanent copy of each file then perhaps do something like this: have one "incoming" directory and seven "daily" directories. Use two scripts or programs: one that runs periodically to move new files from the "incoming" directory to the "daily" directory for that day, and a second that runs each day to write yesterday's daily directory to tape or CD.
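A rough sketch of those two jobs (the directory names and the tar destination are invented for illustration; "date -d yesterday" is GNU date, which Debian has):

```shell
#!/bin/sh
# Job 1 (run every few minutes from cron): sweep new uploads from
# "incoming" into today's daily directory.
INCOMING=/srv/incoming
DAILY=/srv/daily/$(date +%u)      # 1..7, one directory per weekday
mkdir -p "$DAILY"
for f in "$INCOMING"/*; do
    [ -f "$f" ] && mv "$f" "$DAILY"/
done

# Job 2 (run once a day): write yesterday's directory to archive
# media; sketched here as a compressed tar file instead of tape/CD.
YESTERDAY=/srv/daily/$(date -d yesterday +%u)
tar -czf "/srv/archive/$(date -d yesterday +%Y-%m-%d).tar.gz" -C "$YESTERDAY" .
```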

Hope that helps
 
Old 09-30-2007, 12:43 AM   #5
rblampain
Senior Member
 
Registered: Aug 2004
Location: Western Australia
Distribution: Debian 11
Posts: 1,288

Original Poster
Rep: Reputation: 52
Thank you very much, your explanation is very clear.
 
  

