Old 09-26-2012, 11:51 AM   #1
Rupadhya
Member
 
Registered: Sep 2012
Location: Hoffman Estates, IL
Distribution: Fedora 20
Posts: 167

Rep: Reputation: Disabled
Backup solution using cron and tar


Hello all,

I thought I would share my approach to backups. Initially I was using the backup GUI that ships with Fedora, but I decided I wanted something automated that cron would run every day.

I have created a couple of scripts to do the backup.

My general thoughts are:
  • Use tools that are available on a standard install of my operating system.
  • Do a full backup once a week.
  • Do incremental backups on the other days of the week.
  • Do an offsite backup once a week.

The first thing I do is mount a share on another machine. I do this through a cron job that runs every minute: it mounts a CIFS/Samba share on a Windows XP machine with a big hard drive.
Code:
#!/bin/bash
## Mount the backup share on the other machine if it is not already mounted
mountpoint="//192.168.1.1/C"
volume="/home/raj/c"

if ! mount | grep -q "on ${volume} type"
then
    mount -w -t cifs "$mountpoint" "$volume" -o user=raj%{username}%{password},dir_mode=0777,file_mode=0766
fi
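For reference, the crontab entries that drive this look roughly like the following. The script names and the backup time are placeholders, not my exact crontab; the mount check does run every minute and the backup once a day.
Code:
# keep the CIFS share mounted (runs every minute)
* * * * * /home/raj/source/scripts/mountc.sh
# nightly backup (full on Sundays); the time here is just an example
30 1 * * * /home/raj/source/scripts/back.sh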
The backup script (a fairly long one, but it works) automatically creates a new directory for today's backup and removes the directories from 8, 9, and 10 days ago. That way I keep a rotating week of directories, and if the machine has been off for a couple of days the stale backups still get cleaned up.

The script then runs the tar operations concurrently and waits until all of them have finished before terminating.
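My script below waits by polling ps for running tar processes; the shell's built-in wait would do the same job more simply. A minimal sketch, with placeholder paths:
Code:
#!/bin/bash
# Sketch only: start the tar jobs in the background, then block until they all exit.
tar -czf /tmp/backupDocuments.tar.gz /home/raj/Documents &
tar -czf /tmp/backupsource.tar.gz /home/raj/source &
wait   # returns once every background job has finished
echo "all tar jobs finished"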

I welcome any suggestions for improvements. Let me know if this helps jump-start your own backup strategy.

I had to use this once, last week, after I corrupted my filesystem while extending my swap partition. It made the restore relatively easy.
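For anyone restoring from these archives: GNU tar wants -g (--listed-incremental) pointed at /dev/null on extraction, and the incrementals are applied on top of the Sunday full backup in date order. Roughly (the dates here are just examples):
Code:
# restore the full backup first, then each day's incremental in order
tar -xzf /home/raj/c/backup/raj/2012_09_23/backupDocuments.tar.gz -g /dev/null -C /
tar -xzf /home/raj/c/backup/raj/2012_09_24/backupDocuments.tar.gz -g /dev/null -C /
tar -xzf /home/raj/c/backup/raj/2012_09_25/backupDocuments.tar.gz -g /dev/null -C /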

- Raj
Code:
#!/bin/bash
# back.sh - Backup Script for daily backup of Raj's and Virginia's files.
#DESTINATION VARIABLE
RajHome=/home/raj
VirginiaHome=/home/virginia
#
sendmail=/bin/msmtp
# Going to Cell Phone for text message
emailRecipient="{Cell Phone Number}@email.uscc.net"
#
#GET DATE VARIABLES
# Example: date --date='2 days ago' +'%Y_%m_%d'
#
Today=`date +"%Y_%m_%d"`
OneDayAgo=`(date +"%Y_%m_%d" --date="1 days ago")`
TwoDaysAgo=`(date +"%Y_%m_%d" --date="2 days ago")`
ThreeDaysAgo=`(date +"%Y_%m_%d" --date="3 days ago")`
FourDaysAgo=`(date +"%Y_%m_%d" --date="4 days ago")`
FiveDaysAgo=`(date +"%Y_%m_%d" --date="5 days ago")`
SixDaysAgo=`(date +"%Y_%m_%d" --date="6 days ago")`
SevenDaysAgo=`(date +"%Y_%m_%d" --date="7 days ago")`
EightDaysAgo=`(date +"%Y_%m_%d" --date="8 days ago")`
NineDaysAgo=`(date +"%Y_%m_%d" --date="9 days ago")`
TenDaysAgo=`(date +"%Y_%m_%d" --date="10 days ago")`
#
#SETUP PATHS TO OLD BACKUPS
TenDaysAgoTarRaj=$RajHome/c/backup/raj/$TenDaysAgo
NineDaysAgoTarRaj=$RajHome/c/backup/raj/$NineDaysAgo
EightDaysAgoTarRaj=$RajHome/c/backup/raj/$EightDaysAgo
SevenDaysAgoTarRaj=$RajHome/c/backup/raj/$SevenDaysAgo
SixDaysAgoTarRaj=$RajHome/c/backup/raj/$SixDaysAgo
FiveDaysAgoTarRaj=$RajHome/c/backup/raj/$FiveDaysAgo
FourDaysAgoTarRaj=$RajHome/c/backup/raj/$FourDaysAgo
ThreeDaysAgoTarRaj=$RajHome/c/backup/raj/$ThreeDaysAgo
TwoDaysAgoTarRaj=$RajHome/c/backup/raj/$TwoDaysAgo
OneDayAgoTarRaj=$RajHome/c/backup/raj/$OneDayAgo
TodayTarRaj=$RajHome/c/backup/raj/$Today
#
TenDaysAgoTarVirginia=$RajHome/c/backup/virginia/$TenDaysAgo
NineDaysAgoTarVirginia=$RajHome/c/backup/virginia/$NineDaysAgo
EightDaysAgoTarVirginia=$RajHome/c/backup/virginia/$EightDaysAgo
SevenDaysAgoTarVirginia=$RajHome/c/backup/virginia/$SevenDaysAgo
SixDaysAgoTarVirginia=$RajHome/c/backup/virginia/$SixDaysAgo
FiveDaysAgoTarVirginia=$RajHome/c/backup/virginia/$FiveDaysAgo
FourDaysAgoTarVirginia=$RajHome/c/backup/virginia/$FourDaysAgo
ThreeDaysAgoTarVirginia=$RajHome/c/backup/virginia/$ThreeDaysAgo
TwoDaysAgoTarVirginia=$RajHome/c/backup/virginia/$TwoDaysAgo
OneDayAgoTarVirginia=$RajHome/c/backup/virginia/$OneDayAgo
TodayTarVirginia=$RajHome/c/backup/virginia/$Today
#
##
## Main program
##
echo Backup Started `date` >> $RajHome/backup.log
#
# shutdown evolution, so we get a good backup
/usr/bin/gconftool-2 --shutdown
/usr/bin/evolution --force-shutdown
#
#
echo "Subject:  Backup Started `date`" |$sendmail $emailRecipient
#
# DELETE ANY OLD BACKUPS IF THEY EXIST RAJ - KEEP A WEEK BACK FOR THE FULL BACKUP.
rm -rf $TenDaysAgoTarRaj
rm -rf $NineDaysAgoTarRaj
rm -rf $EightDaysAgoTarRaj
# DELETE ANY OLD BACKUPS IF THEY EXIST VIRGINIA - KEEP A WEEK BACK FOR THE FULL BACKUP.
rm -rf $TenDaysAgoTarVirginia
rm -rf $NineDaysAgoTarVirginia
rm -rf $EightDaysAgoTarVirginia
#
mkdir $TodayTarRaj 2>> $RajHome/backup.log
mkdir $TodayTarVirginia 2>> $RajHome/backup.log

currentDay=`date "+%a"`

if [ "$currentDay" = "Sun" ];
if [ "$currentDay" = "Sun" ];
then
    fullbackupday=1
    echo Must be Sunday `date +"%a"`
else
    fullbackupday=0
    cp $OneDayAgoTarRaj/*.snar $TodayTarRaj/
    cp $OneDayAgoTarVirginia/*.snar $TodayTarVirginia/
    echo Must be another Day `date +"%a"`
    echo ---$currentDay---
fi
#
tar -czf $TodayTarRaj/backupsource.tar.gz -g $TodayTarRaj/inventorysource.snar $RajHome/source &
tar -czf $TodayTarRaj/backupDocuments.tar.gz -g $TodayTarRaj/inventoryDocuments.snar $RajHome/Documents &
# Backup most of the important files in the home directory.
tar -czf $TodayTarRaj/backupMisc.tar.gz -X $RajHome/exclude.txt -g $TodayTarRaj/inventoryMisc.snar $RajHome &
# Backup some system icons and files
tar -czf $TodayTarRaj/backupLoginIcons.tar.gz -g $TodayTarRaj/inventoryLoginIcons.snar /var/lib/AccountsService/icons &
tar -czf $TodayTarRaj/backupLoginUsers.tar.gz -g $TodayTarRaj/inventoryLoginUsers.snar /var/lib/AccountsService/users &
# Backup most of the important files in the home directory for Virginia.
tar -czf $TodayTarVirginia/backupMisc.tar.gz -X $RajHome/exclude.txt -g $TodayTarVirginia/inventoryMisc.snar $VirginiaHome &
tar -czf $TodayTarVirginia/backupconfigLibreoffice.tar.gz -g $TodayTarVirginia/inventoryconfigLibreOffice.snar $VirginiaHome/.config/libreoffice &
# Backup Virginia's documents
tar -czf $TodayTarVirginia/backupDocuments.tar.gz -g $TodayTarVirginia/inventoryDocuments.snar $VirginiaHome/Documents &
# Backup all the Evolution emails. - Don't do an Incremental backup.
/home/raj/source/scripts/backEvolution.sh
# Poll until no "tar -c" processes remain, then mark the backup complete,
# log it, and send the notification email.
loop() {
for (( ; ; ))
do
  if (ps -ef | grep -q "[t]ar -c")
  then
        echo "found tar" >/dev/null
  else
        if [ "$fullbackupday" -eq "1" ];
        then
           /usr/bin/touch $TodayTarRaj/completed-start-offsite-backup.txt
           /usr/bin/touch $TodayTarRaj/fullBackup.txt
           /usr/bin/touch $TodayTarVirginia/fullBackup.txt
        else
           /usr/bin/touch $TodayTarRaj/completed.txt
        fi
        echo Backup Completed `date` >> $RajHome/backup.log
        echo "Subject:  Backup Completed `date`" |$sendmail $emailRecipient
        break
        #Abandon the loop.
  fi
done
}
loop
 
Old 09-26-2012, 01:56 PM   #2
unSpawn
Moderator
 
Registered: May 2001
Posts: 29,415
Blog Entries: 55

Rep: Reputation: 3600
That's way too much. You should loop over the account names to process, then over the source directories to back up, and then over the dates ("for ((i=0;i<10;i++)); do doSomething; done"). No need to use this many variables. BTW, incrementals use "--newer" AFAIK (and why a 10-day cycle?); you could keep a copy of each week's tarballs as a set of 4 and erase the oldest when you do monthly backups. Also have a look at rsync if you've got a spare server or external hard disk around.
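Something along these lines is what I mean (a rough, untested sketch; the user names and source directories are just examples):
Code:
#!/bin/bash
backuproot=/home/raj/c/backup
today=$(date +%Y_%m_%d)

for user in raj virginia; do
    # prune backups older than a week
    for ((i=8; i<=10; i++)); do
        rm -rf "$backuproot/$user/$(date +%Y_%m_%d --date="$i days ago")"
    done
    mkdir -p "$backuproot/$user/$today"
    # archive each source directory for this user
    for dir in Documents source; do
        tar -czf "$backuproot/$user/$today/backup$dir.tar.gz" \
            -g "$backuproot/$user/$today/inventory$dir.snar" "/home/$user/$dir" &
    done
done
wait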

Last edited by unSpawn; 09-26-2012 at 01:58 PM. Reason: //Less *is* more
 
Old 09-26-2012, 02:56 PM   #3
ruario
Senior Member
 
Registered: Jan 2011
Location: Oslo, Norway
Distribution: Slackware
Posts: 2,557

Rep: Reputation: 1761
If you want to use an archive rather than rsync, to ensure that all Linux filesystem properties are retained when stored on Windows, then you might want to reconsider gzip-compressed tars, because a single corrupt bit near the beginning of the archive means the rest of the file is a write-off. This is less of an issue when using an actual disk for backup, as opposed to media like DVDs, Blu-ray, etc., but it is still something to consider. Personally I would either skip compression or use xar, dar or afio instead, all of which can compress files individually as they are added (afio gives you the most compression options, since you can specify any compressor you like). This is safer, as any corruption only means losing some of your files. Alternatively (or better yet, in addition) look at making parity archive volume sets; check out the par2cmdline utils, an implementation of the PAR v2.0 specification.
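For example (a rough sketch; pick whatever redundancy level suits you):
Code:
# create recovery volumes with ~10% redundancy alongside the archive
par2 create -r10 backupDocuments.tar.gz.par2 backupDocuments.tar.gz
# later: check the archive, and repair it from the .par2 volumes if it is damaged
par2 verify backupDocuments.tar.gz.par2
par2 repair backupDocuments.tar.gz.par2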
 
Old 09-26-2012, 04:05 PM   #4
Rupadhya
Member
 
Registered: Sep 2012
Location: Hoffman Estates, IL
Distribution: Fedora 20
Posts: 167

Original Poster
Rep: Reputation: Disabled
Initially I chose tar with compression because I could open the files on Windows XP with WinZip to inspect the backups.

I will look into rsync and another compression option to see what makes sense.

Thanks for your feedback.

unSpawn, I was planning on doing some kind of looping to handle the backups and users, but I haven't incorporated it yet. Thanks for the ideas.

- Raj
 
Old 09-27-2012, 12:52 AM   #5
ruario
Senior Member
 
Registered: Jan 2011
Location: Oslo, Norway
Distribution: Slackware
Posts: 2,557

Rep: Reputation: 1761
If you want something you can easily inspect on Windows, I would go for afio. By default it makes POSIX-compliant cpio archives and only uses its own extensions to the format for files that push the limitations of that format. For example, if you were to add a multi-GiB file, it would likely store that entry with an afio-specific "large ASCII" header. This means that most (perhaps all) of the files within your backup will be accessible with any utility that can read POSIX cpio archives, of which there are many on Windows (I'd use 7-Zip). Even the fact that internal compression is used is not a problem: just extract the file first and then decompress it.

If you need strong compression due to space limitations, I would go for XZ compression internally. This will likely give you the smallest archive of any of the popular compression tools, with excellent decompression performance. Alternatively you might want to consider bzip2. The archive won't be as small, but it is supported by a wider range of Windows tools, and on Linux it is one of the few common compressors that provides recovery tools. I'd skip gzip altogether, since its major feature these days is speed; and if that were my number one criterion I'd use lzop, which has comparable compression but is much faster, albeit with the downside that very few other tools support it.

Finally, do consider skipping compression altogether if you have the space. It is the lowest risk, and it makes for fast backups that are very easy to open.
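As a rough example of what that looks like (the paths are placeholders, and check the afio man page for the exact option to swap in a different compressor such as xz):
Code:
# archive a home directory, compressing each file individually as it is added
find /home/raj -print | afio -o -v -Z /home/raj/c/backup/raj/backup.afio
# extract it again, decompressing each file as it is read back
afio -i -v -Z /home/raj/c/backup/raj/backup.afio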

Last edited by ruario; 09-27-2012 at 12:23 PM. Reason: said decompression when I meant compression
 
  

