Old 09-13-2007, 09:51 AM   #1
LQ Newbie
Registered: Sep 2007
Posts: 1

Rep: Reputation: 0
Looking for a differential backup script


I'm looking for a differential backup script that I could run Monday through Friday and that would copy the files that have changed since the last backup.

I'd also like it to use a reference file to determine which files need to be copied.

Old 09-13-2007, 10:16 AM   #2
Registered: Jun 2006
Location: Mariposa
Distribution: Slackware 9.1
Posts: 938

Rep: Reputation: 30
You can write your own script, in one of several languages. Do this at the command line:
man find
That's what you need to find the files modified since a certain point in time.

Hope this helps.
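For example, a minimal sketch of the reference-file approach (all paths under /tmp are hypothetical): find's -newer test lists files modified more recently than a reference file, and touching the reference after each run marks the time of the last backup.

```shell
# Demo tree and a reference file whose mtime marks the last backup run
mkdir -p /tmp/demo
touch -d '2 days ago' /tmp/demo/reference   # pretend the last backup ran 2 days ago
echo "new data" > /tmp/demo/changed.txt     # a file modified since then

# List regular files changed since the reference file's timestamp
find /tmp/demo -type f -newer /tmp/demo/reference -print
# prints /tmp/demo/changed.txt

# A real backup script would archive that list, then refresh the mark:
#   touch /tmp/demo/reference
```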
Old 09-13-2007, 10:34 AM   #3
Registered: Dec 2005
Location: /USA/MI/Detroit/home
Distribution: MEPIS, antiX, RHEL
Posts: 105

Rep: Reputation: 15

If you're looking for a business/enterprise solution, I recommend rsnapshot.

If you're looking for something that's a bit more user-friendly/gui-driven, I recommend SBackup.

I use rsnapshot at work, and SBackup at home. They will both do incremental backups and then a full backup after a specified number of incrementals, and both are great in their own right depending on what you're looking for.

Old 09-13-2007, 07:31 PM   #4
Senior Member
Registered: Aug 2007
Location: Massachusetts, USA
Distribution: Solaris 9 & 10, Mac OS X, Ubuntu Server
Posts: 1,191

Rep: Reputation: 105
Can you give the scope of what you are doing? One machine? Several? Home or Work? Lots of data? Copy to external drive? To Tape? To another computer? Answers to those questions make a big difference in what solutions make sense for you.

If you are backing up one Linux machine at home, with just your basic personal stuff and an extra drive large enough to hold all your backups, then gnutar all by itself could do the job. You could run a few backups directly from the command line to see how it works, and then write a simple cron job to automate it for you.

Note that the definition of differential is simply an incremental with reference to the last full backup. So, while multiple dump levels can give you incrementals with respect to a previous incremental, using a dump level of 1 all week long will make them all with respect to the last full (i.e. differential).
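With GNU tar specifically, that "every daily backup relative to the last full" behavior can be sketched with --listed-incremental by copying the full backup's snapshot file before each daily run (all paths here are hypothetical):

```shell
mkdir -p /tmp/data /tmp/bk
echo one > /tmp/data/a.txt

# Sunday: level-0 full backup; tar records file state in full.snar
tar --listed-incremental=/tmp/bk/full.snar -czf /tmp/bk/full.tar.gz -C /tmp data

# Mon-Fri: work from a COPY of the full snapshot, so each daily archive
# is taken relative to the full backup, not the previous day (differential)
echo two > /tmp/data/b.txt
cp /tmp/bk/full.snar /tmp/bk/today.snar
tar --listed-incremental=/tmp/bk/today.snar -czf /tmp/bk/diff.tar.gz -C /tmp data

tar -tzf /tmp/bk/diff.tar.gz   # lists data/ plus only the changed data/b.txt
```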

If your needs are more sophisticated or more extensive, then there are a whole host of possible solutions. If you were doing a network backup of a number of servers, then I would recommend Amanda.
Old 09-14-2007, 01:45 AM   #5
LQ Guru
Registered: Aug 2004
Location: Sydney
Distribution: Centos 6.8, Centos 5.10
Posts: 17,295

Rep: Reputation: 2358
If it's a fairly simple setup and you just want to cut down on network activity/time on subsequent backups (after the initial one), rsync's default mode transfers only differences, and it can compress on the fly to reduce network bandwidth.
It can also use the ssh protocol if required.
Old 09-16-2007, 11:16 PM   #6
Registered: Aug 2007
Location: Florida
Posts: 33

Rep: Reputation: 19
For my home and work stand-alone systems I use the following script, which I tweaked quite a bit, to perform daily/weekly/monthly file system backups to a NAS drive smbfs mount point in /etc/fstab. I use G4L aperiodically to perform bare-metal disk image backups.

Place the following text in /etc/cron.daily/{filename without extension}, make it owned by root with RWX permissions, and modify as necessary. Make sure to add the "lfs" option to your mount point, as this compresses after backing up. Not the most efficient, but then I don't have a lot to back up.

#!/bin/sh
# full and incremental backup script

# Change the 5 variables below to fit your computer/backup

COMPUTER={replace with host prefix name} # name of this computer
DIRECTORIES="/" # directories to back up
BACKUPDIR=/home/{replace with path}/nas-drive/backups # where to store the backups
TIMEDIR=/home/{replace with path}/nas-drive/backups # where to store time of full backup
TAR=/bin/tar # name and location of tar
EXCLUDE="--exclude=/home/{replace with path}/nas-drive/backups/* --exclude=/proc/kcore"

# You should not have to change anything below here

ERRORS=/tmp/backup-errors.log # log file for tar output
DOW=`date +%a`  # day of the week, e.g. Mon
DOM=`date +%d`  # day of the month, e.g. 27
DM=`date +%d%b` # day and month, e.g. 27Sep

# On the 1st of the month a permanent full backup is made.
# Every Sunday a full backup is made, overwriting last Sunday's backup.
# The rest of the time an incremental backup is made; each incremental
# backup overwrites last week's incremental backup of the same name.
# If NEWER is "", tar backs up all files in the directories; otherwise
# it backs up only files newer than the NEWER date, which it reads from
# the file written on each full backup.

if [ "$DOM" = "01" ]; then
    # Monthly full backup
    NEWER=""
    echo 'doing monthly full' > $ERRORS
    $TAR -v --ignore-failed-read $EXCLUDE $NEWER -czf $BACKUPDIR/$COMPUTER-$DM.tar.gz $DIRECTORIES >> $ERRORS 2>&1

elif [ "$DOW" = "Sun" ]; then
    # Weekly full backup
    NEWER=""
    NOW=`date +%d-%b`
    echo $NOW > $TIMEDIR/$COMPUTER-full-date # update full backup date
    echo 'doing weekly full' > $ERRORS
    $TAR -v --ignore-failed-read $EXCLUDE $NEWER -czf $BACKUPDIR/$COMPUTER-$DOW.tar.gz $DIRECTORIES >> $ERRORS 2>&1

elif [ -f $TIMEDIR/$COMPUTER-full-date ]; then
    # Daily incremental backup - overwrites last week's backup of the same name
    NEWER="--newer `cat $TIMEDIR/$COMPUTER-full-date`" # date of last full backup
    echo "doing daily $NEWER" > $ERRORS
    $TAR -v --ignore-failed-read $EXCLUDE $NEWER -czf $BACKUPDIR/$COMPUTER-$DOW.tar.gz $DIRECTORIES >> $ERRORS 2>&1

else
    # No full backup on record yet - make one now
    NEWER=""
    NOW=`date +%d-%b`
    echo $NOW > $TIMEDIR/$COMPUTER-full-date # update full backup date
    echo 'doing daily full' > $ERRORS
    $TAR -v --ignore-failed-read $EXCLUDE $NEWER -czf $BACKUPDIR/$COMPUTER-$DOW.tar.gz $DIRECTORIES >> $ERRORS 2>&1
fi

Last edited by ElvisImprsntr; 09-28-2007 at 02:54 AM.
Old 09-17-2007, 12:25 AM   #7
LQ Guru
Registered: Aug 2001
Location: Fargo, ND
Distribution: SuSE AMD64
Posts: 15,733

Rep: Reputation: 671
Originally Posted by chrism01 View Post
If it's a fairly simple setup and you just want to cut down on network activity/time on subsequent backups (after the initial one), rsync's default mode transfers only differences, and it can compress on the fly to reduce network bandwidth.
It can also use the ssh protocol if required.
You want to use an incremental backup in that case. A differential backup backs up all changes since the initial full backup; an incremental backup backs up files changed since the previous full or incremental backup. A differential backup will therefore use more network bandwidth.

Any backup program like kdar will be able to perform incremental backups, or you can simply use tar with the "--listed-incremental=<snapshot file>" option.
See Section 5.2 of the info tar manual for an example of an incremental dump.
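A small sketch of that tar option (hypothetical /tmp paths): reusing the same snapshot file on every run makes each archive relative to the previous run, i.e. a true incremental chain.

```shell
mkdir -p /tmp/inc /tmp/inc-bk
echo a > /tmp/inc/a.txt

# Level 0: full backup; snap.snar records what was archived and when
tar --listed-incremental=/tmp/inc-bk/snap.snar -czf /tmp/inc-bk/level0.tar.gz -C /tmp inc

# A later run reusing the SAME snapshot file picks up only changes since level 0
echo b > /tmp/inc/b.txt
tar --listed-incremental=/tmp/inc-bk/snap.snar -czf /tmp/inc-bk/level1.tar.gz -C /tmp inc

tar -tzf /tmp/inc-bk/level1.tar.gz   # lists inc/ and inc/b.txt only
```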
Old 09-17-2007, 06:13 AM   #8
Registered: Aug 2007
Location: Florida
Posts: 33

Rep: Reputation: 19
I couldn't get Partimage to restore successfully on some Dell 9150s with pre-configured hard drives containing hidden partitions last year. Not to mention I have a large number of systems to back up, and the procedure I was given for Partimage was to back up the MBR, partition table, and each partition separately, which took over 6 hours of elapsed time per system over a LAN. Also, the security requirements (DoD NISPOM) imposed on me do not allow me to run the system unattended in an unprotected/insecure mode.

I found it easier and faster to use G4L to back up 4 systems simultaneously to a NAS in less than 2 hours, and it will restore in less than 1 hour. That minimizes recurring support and downtime, and eliminates chances for human (me) mistakes. My self-imposed bare-metal backup requirement (in addition to the daily/weekly/monthly file system backups) is every 6 months or whenever there is a significant change in the configuration or status of the system, whichever comes first. Also, I need to create a bare-metal backup for NISPOM forensic evidence in the event the system is compromised.

To each his own.

Last edited by ElvisImprsntr; 09-17-2007 at 08:47 PM.



