LinuxQuestions.org (/questions/)
-   Linux - General (https://www.linuxquestions.org/questions/linux-general-1/)
-   -   backup and update the backup file (https://www.linuxquestions.org/questions/linux-general-1/backup-and-update-the-backup-file-28650/)

doris 08-23-2002 08:23 PM

backup and update the backup file
 
What is the best way to back up a directory? tar?

Is "tar" capable of updating an old archive with newer files without creating a new archive?

Doris

mlp68 08-23-2002 09:10 PM

Hi Doris,

tar can replace files in an archive in principle, but since you almost certainly want a compressed archive, it's not practical. What you should look at instead are incremental backups: do a full backup once a month or so, then daily backups of just the files that changed. tar supports that nicely; I have been running my backups this way for years. If you like, I can give you my script that does that.
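
To make the distinction concrete, here is a minimal sketch (the archive names and the doris/ directory are just examples): -u can append newer files to a plain, uncompressed archive, while --listed-incremental keeps a snapshot file so each later run only saves what changed.

Quote:

# -u appends files newer than the archived copies; only works on an uncompressed .tar
tar -uvf home.tar doris/

# incremental backups: the snapshot file records what has already been saved
tar -czf full.tar.gz --listed-incremental=snapshot doris/     # first run = full backup
tar -czf incr1.tar.gz --listed-incremental=snapshot doris/    # later runs = changed files only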

mlp

doris 08-23-2002 09:55 PM

Yes, that is exactly what I want.

Please post it or email to me.

Doris

doris@zikai.org

MasterC 08-23-2002 11:53 PM

I vote for a post! That would be cool info to have and use.

Cool

mlp68 08-24-2002 07:26 PM

OK, here it comes. There is a preamble-style define section at the top for all the customizations; you can easily tweak it for your purposes.

Some comments:
The script goes through all directories underneath TOPDIR (/home). BACKUPDIR (/backup/incremental) is on another disk, and that's where the backups go. For each run of the script (at this point, I run it every Saturday night from cron), it creates a new directory in BACKUPDIR named after the date (such as "2002-08-25" tonight). Because of the way the date is used, you cannot run this more than once a day (it generates just one directory for a given day).
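
For illustration, after one such run the backup disk ends up looking roughly like this (the user names are made up):

Quote:

/backup/incremental/2002-08-25/
    doris_2002-08-25.tar.gz.aa
    martin_2002-08-25.tar.gz.aa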

The find command in the for loop finds all top-level user directories (the -name cuts out the "." and ".."), minus the "lost+found".

Since we need to allow for tar files larger than 2GB, I tar the contents to stdout, pipe into gzip and then split into 2,000,000,000-byte chunks (just under the 2GB limit). Note the trailing "." in the filename -- split (which reads from stdin) then makes user_2002-08-25.tar.gz.aa, .ab, .ac and so on. Usually, for small enough directories, you just get the .aa file. If you happen to have more and need to restore, you just
cat user_2002-08-25.tar.gz.* | tar xv ...
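
For a complete restore you replay the archives oldest first, the full backup and then each incremental in date order; a rough sketch (the dates and the user name are just examples, and the /dev/null snapshot tells GNU tar to handle the incremental metadata without recording anything):

Quote:

cd /home
# the full backup first
cat /backup/incremental/2002-08-25/doris_2002-08-25.tar.gz.* | tar -xzv -f - --listed-incremental=/dev/null
# then each incremental, in date order
cat /backup/incremental/2002-09-01/doris_2002-09-01.tar.gz.* | tar -xzv -f - --listed-incremental=/dev/null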

If you want a full backup (I usually go 1 full, 7 incrementals), just blow away the contents of the "INCREMENTALSINFO" directory - no info about incrementals equals a full backup.
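
If it helps, here is a rough sketch of how the scheduling might look from cron (the script path and times are placeholders, not part of the script below):

Quote:

# crontab sketch: run the backup every Saturday night at 23:30
30 23 * * 6  /usr/local/sbin/home-backup.sh
# to force the next run to be a full backup, clear the incrementals info first, e.g.
#   rm -f /etc/tar-backups/*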

Hope it's useful,

Martin

Quote:

#! /bin/sh

# Martin L Purschke 8/11/1998
#This script goes through all directories underneath
#a top directory (such as /home to backup all user accounts).
#It makes, in a "backup directory" (which is supposed
#to be another disk :-) a new directory for the date, and
#puts the files there. It keeps tar's incremental info in
#an "incrementalsinfo" directory.
#Because the compressed tar output might exceed 2GB, we pipe it
#into split to break the output up into < 2GB chunks.
#
#In case you want a full backup, just rm -rf the "INCREMENTALSINFO"
#directory.

# TOPDIR -- the directory under which you want all others backed up
# BACKUPDIR -- where the backups go
# INCREMENTALSINFO -- where the info about incrementals is kept


#what directories do we want?

TOPDIR=/home
BACKUPDIR=/backup/incremental
INCREMENTALSINFO=/etc/tar-backups

#--no customizations below here... --------------


DATE=`date "+%Y-%m-%d"`

[ -d $INCREMENTALSINFO ] || mkdir -p $INCREMENTALSINFO


cd $TOPDIR

for u in `find . -maxdepth 1 -type d -name "*[a-zA-Z]*" -not -name "*lost+found*"`; do

echo "$u"

dir="${BACKUPDIR}/${DATE}"
[ -d $dir ] || mkdir $dir

bfile="${BACKUPDIR}/${DATE}/${u}_${DATE}.tar.gz."

echo $bfile

echo -n "backing up $u..."
# tar to stdout, compress, then split into chunks just under 2GB
tar -c -f - --one-file-system --listed-incremental=${INCREMENTALSINFO}/${u} "${u}" | gzip -c3 | split -b 2000000000 - "${bfile}"
echo "..done"

done

