Forums > Linux Forums > Linux - Networking
Old 11-16-2005, 06:57 AM   #1
Registered: Sep 2005
Location: Vienna, Austria
Distribution: Mint 13
Posts: 524

Rep: Reputation: 31
Backup strategy


I decided a few days ago to make regular backups of my server. I wrote a small script that uses rsync, and via crontab it makes an incremental backup every 10 minutes.

My script looks like this:

# no --delete-after
# (the destination host below is illustrative; the redirect captures
#  rsync's verbose output in a log file)

rsync -e ssh -alpogtvz /home/christophe/Documents \
    user@backupserver:/backups/ \
    > /home/christophe/Administration/Backups/protokoll-laptop-root
Everything works just fine. But I realized that if my computer catches a virus that destroys or changes some files, then my backup will save the corrupted files!

What is a good backup strategy? The server holds 180 GB, and I want to make regular backups and also keep a second server as a mirror. But how can I make regular backups and at the same time avoid having my backups corrupted by corrupted files?

Can any administrator tell me what his/her backup strategy is?


Old 11-16-2005, 11:59 AM   #2
Registered: Apr 2003
Distribution: RH 8
Posts: 246

Rep: Reputation: 30

I would suggest appending something, like the date, to each backup's name and cycling through them every month or so.

for example,

# the `date` belongs in the backup's name, not in the source path;
# the destination host is illustrative
rsync -e ssh -alpogtvz /home/christophe/Documents/ \
    user@backupserver:/backups/Documents-`date +%F`/ \
    > /home/christophe/Administration/Backups/protokoll-laptop-root
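The cycling idea above can be sketched like this (temp dirs stand in for the real source and backup paths): putting the day of the month in the backup's name means each slot is reused, and therefore overwritten, about a month later.

```shell
#!/bin/sh
# Monthly-cycling backup sketch. The day-of-month suffix repeats every
# month, so each named slot is overwritten roughly 30 days later.
SRC=$(mktemp -d)           # stand-in for /home/christophe/Documents
DEST_BASE=$(mktemp -d)     # stand-in for the backup area
echo "hello" > "$SRC/file.txt"

DAY=$(date +%d)                          # e.g. "16"
DEST="$DEST_BASE/Documents-day$DAY"
mkdir -p "$DEST"
rsync -a "$SRC/" "$DEST/" > "$DEST_BASE/protokoll-day$DAY.log"
ls "$DEST"                               # -> file.txt
```

This gives up to ~31 historical copies at the cost of ~31x the space; the archive approach in the next reply trades space differently.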
Old 11-16-2005, 12:19 PM   #3
Registered: Jun 2003
Location: SoCal
Distribution: CentOS
Posts: 465

Rep: Reputation: 30
Quote:
i would suggest appending something to each backup and cycle through every month or something.

There's nothing wrong with syncing a server with a mirror. It's just a good idea to also keep an archive backup, in case a problem on the server gets replicated to your mirror before it's caught.

So in your script that runs rsync, include something like this:

setDate=$(date +"%m%d%Y-%H%M%S")   # %H avoids the leading space %k can produce
echo "Creating archive..."
tar -cpvf /home/christophe/archives/Documents-$setDate.tar /home/christophe/Documents/*
echo "Compressing archive..."
gzip -v /home/christophe/archives/*.tar
The code above will make a compressed archive of your Documents folder and append the current date and time to the end of the file name. Now if you screw up a file, and it gets replicated to your mirror (thus overwriting your backup) before you catch the problem, you'll still have a good copy of the file intact in one of your compressed archives.
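Those dated archives will accumulate forever, so a cron-able pruning step is a natural companion. A sketch using GNU find's `-mtime`/`-delete` (the directory and the 30-day retention are assumptions; the temp dir and fake timestamps are just for demonstration):

```shell
#!/bin/sh
# Prune compressed archives older than 30 days (retention is an
# assumption). A temp dir stands in for /home/christophe/archives.
ARCHIVE_DIR=$(mktemp -d)
touch "$ARCHIVE_DIR/Documents-new.tar.gz"
touch -d "40 days ago" "$ARCHIVE_DIR/Documents-old.tar.gz"   # GNU touch

# -mtime +30 matches files modified more than 30 days ago
find "$ARCHIVE_DIR" -name '*.tar.gz' -mtime +30 -delete
ls "$ARCHIVE_DIR"          # -> Documents-new.tar.gz
```

Run it right after the tar/gzip step above and the archive directory stays bounded at roughly a month of history.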


