Old 08-23-2002, 08:23 PM   #1
Registered: May 2001
Location: New York, USA
Distribution: RH 7.3, 8.0
Posts: 64

Rep: Reputation: 15
backup and update the backup file

What is the best way to back up a directory? tar?

Is "tar" capable of updating an old archive with newer files, without creating a new archive?

Old 08-23-2002, 09:10 PM   #2
Registered: Jun 2002
Location: NY
Distribution: Gentoo,RH
Posts: 333

Rep: Reputation: 40
Hi Doris,

tar can replace files in an archive in principle, but since you almost certainly want a compressed archive, that's not practical. What you should look at instead are incremental backups: do a full backup once a month or so, then daily backups of just the files that changed. tar supports that nicely; I have been running my backups this way for years. If you like, I can give you my script that does that.
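To make the idea concrete, here is a minimal sketch of how GNU tar's snapshot-based incremental mode works (all paths and filenames below are made up for the demo):

```shell
#!/bin/sh
# Sketch of GNU tar incremental backups; all paths are examples.
set -e
SRC=/tmp/incr-demo/src
SNAP=/tmp/incr-demo/snapshot
mkdir -p "$SRC"
echo one > "$SRC/a.txt"

# Full backup: no snapshot file exists yet, so everything is archived.
rm -f "$SNAP"
tar czf /tmp/incr-demo/full.tar.gz --listed-incremental="$SNAP" -C "$SRC" .

# Later run: only files new or changed since the snapshot are archived.
echo two > "$SRC/b.txt"
tar czf /tmp/incr-demo/incr.tar.gz --listed-incremental="$SNAP" -C "$SRC" .
```

The snapshot file is what remembers which files were already backed up; deleting it turns the next run back into a full backup.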

Old 08-23-2002, 09:55 PM   #3
Registered: May 2001
Location: New York, USA
Distribution: RH 7.3, 8.0
Posts: 64

Original Poster
Rep: Reputation: 15
Yes, that is exactly what I want.

Please post it or email to me.

Old 08-23-2002, 11:53 PM   #4
LQ Guru
Registered: Mar 2002
Location: Salt Lake City, UT - USA
Distribution: Gentoo ; LFS ; Kubuntu ; CentOS ; Raspbian
Posts: 12,613

Rep: Reputation: 69
I vote for a post! That would be cool info to have and use.

Old 08-24-2002, 07:26 PM   #5
Registered: Jun 2002
Location: NY
Distribution: Gentoo,RH
Posts: 333

Rep: Reputation: 40
OK, here it comes. There is a preamble-style define section at the top for all the customizations; you can easily tweak it for your purposes.

Some comments:
The script goes through all directories underneath TOPDIR (/home). BACKUPDIR (/backup/incremental) is on another disk, and that's where the backups go. For each run of the script (at this point, I run it every Saturday night from cron), it creates a new directory in BACKUPDIR named after the date (such as "2002-08-25" tonight). Because of the way the date is used, you cannot run this more than once a day -- it generates just one directory for a given day.
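For reference, the weekly Saturday-night run could be scheduled with a crontab entry like this (the script path is just an example; dow 6 means Saturday):

```
# min hour day-of-month month day-of-week  command
30   23   *            *     6            /usr/local/sbin/incremental-backup.sh
```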

The find command in the for loop finds all top-level user directories (the -name cuts out the "." and ".."), minus the "lost+found".

Since we need to allow for tar files larger than 2GB, I tar the contents to stdout, pipe into gzip, and then split into 2,000,000,000-byte chunks (just under 2GB each). Note the trailing "." in the filename -- split (which reads from stdin) then makes user_2002-08-25.tar.gz.aa, .ab, .ac and so on. Usually, for small enough directories, you just get the .aa file. If you happen to have more and need to restore, you just
cat user_2002-08-25.tar.gz.* | tar xzv ...
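Here is a self-contained sketch of that split-and-reassemble round trip (all paths are made up for the demo; note the z flag so tar decompresses the gzip stream):

```shell
#!/bin/sh
# Round trip: tar | gzip | split on the way out, cat | tar xz on the way back.
# All paths are examples for the demo.
set -e
mkdir -p /tmp/split-demo/src /tmp/split-demo/out
echo hello > /tmp/split-demo/src/file.txt

# Backup side: compressed tar stream split into fixed-size chunks.
# The trailing "." makes split produce ...tar.gz.aa, ...tar.gz.ab, and so on.
tar czf - -C /tmp/split-demo/src . | split -b 512 - /tmp/split-demo/user_2002-08-25.tar.gz.

# Restore side: the shell glob expands the chunks in .aa, .ab, ... order,
# cat joins them back into one stream, and tar extracts it.
cat /tmp/split-demo/user_2002-08-25.tar.gz.* | tar xz -C /tmp/split-demo/out
```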

If you want a full backup (I usually go 1 full, 7 incrementals), just blow away the contents of the "INCREMENTALSINFO" directory - no info about incrementals equals a full backup.

Hope it's useful,


#! /bin/sh

# Martin L Purschke 8/11/1998
# This script goes through all directories underneath
# a top directory (such as /home to back up all user accounts).
# It makes, in a "backup directory" (which is supposed
# to be another disk :-) a new directory for the date, and
# puts the files there. It keeps tar's incremental info in
# an "incrementalsinfo" directory.
# Because the compressed tar output might exceed 2GB, we pipe it
# into split to break the output up into < 2GB chunks.
# In case you want a full backup, just rm -rf the "INCREMENTALSINFO"
# directory.

# TOPDIR -- the directory under which you want all others backed up
TOPDIR=/home

# BACKUPDIR -- where the backups go
BACKUPDIR=/backup/incremental

# INCREMENTALSINFO -- where the info about incrementals is kept
INCREMENTALSINFO=/backup/incrementalsinfo

#--no customizations below here... --------------

DATE=`date "+%Y-%m-%d"`

# one backup directory per day; rerunning on the same day reuses it
dir=${BACKUPDIR}/${DATE}

[ -d ${INCREMENTALSINFO} ] || mkdir -p ${INCREMENTALSINFO}

# what directories do we want? all top-level dirs under TOPDIR
cd ${TOPDIR} || exit 1

for u in `find . -type d -maxdepth 1 -name "*[a-zA-Z]*" -not -name "*lost+found*"`; do

echo "$u"

[ -d $dir ] || mkdir $dir

# trailing "." -- split appends .aa, .ab, ... to this name
bfile=${dir}/${u}_${DATE}.tar.gz.

echo $bfile

echo -n "backing up $u..."
tar c --one-file-system --listed-incremental=${INCREMENTALSINFO}/${u} "${u}" | gzip -c3 | split -b 2000000000 - ${bfile}
echo "..done"

done


