LinuxQuestions.org
Linux - General This Linux forum is for general Linux questions and discussion.
If it is Linux Related and doesn't seem to fit in any other forum then this is the place.

Old 11-09-2004, 01:43 AM   #1
ninmonkeys
LQ Newbie
 
Registered: Nov 2004
Distribution: Slackware 10
Posts: 28

Rep: Reputation: 15
How do you backup?


1) How do you do your backups? I have a script that runs this command, except it uses the current date as the tarball name.

2) Can I make bzip2 compress the file even more? I don't care how long compression takes. Do I have to split it into two commands, or pipe the output to do this?

3) What directory do you save your backups to?

Here's the way I came up with by reading man pages. I'm pretty sure it excludes the *directories* listed in the exclude file (correct me if I'm wrong).
Code:
$ tar -cjv -X backup_exclude_paths -f jake.tar.bz2 /home/jake/
Code:
[jake@localhost: ~/tmp]$ cat backup_exclude_paths                              
/home/jake/download
/home/jake/Desktop
/home/jake/desktops
/home/jake/.thumbnails
/home/jake/.BitchX
/home/jake/.gconf
/home/jake/.gconfd
/home/jake/.gnome
/home/jake/.gnome2
/home/jake/.gnome2_private
/home/jake/.metacity
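As a quick sanity check that -X really skips those directories, here is a throwaway sketch under /tmp (all paths are examples only, not my real setup):

```shell
# Sketch: verify that tar -X skips the paths listed in an exclude file.
mkdir -p /tmp/xdemo/home/jake/download /tmp/xdemo/home/jake/docs
echo "keep me" > /tmp/xdemo/home/jake/docs/a.txt
echo "skip me" > /tmp/xdemo/home/jake/download/b.txt

# exclude file holds absolute paths, just like the command line does
printf '%s\n' /tmp/xdemo/home/jake/download > /tmp/xdemo/excludes
tar -cjf /tmp/xdemo/jake.tar.bz2 -X /tmp/xdemo/excludes /tmp/xdemo/home/jake 2>/dev/null

# extract somewhere else and check: docs/ is there, download/ is not
mkdir -p /tmp/xdemo/out
tar -xjf /tmp/xdemo/jake.tar.bz2 -C /tmp/xdemo/out
```

(tar strips the leading / when storing, so extracted paths appear under tmp/xdemo/... inside the output directory.)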
thanks,
--monkey

Last edited by ninmonkeys; 11-09-2004 at 02:25 AM.
 
Old 11-09-2004, 02:21 AM   #2
otoomet
Member
 
Registered: Oct 2004
Location: Tartu, Århus, Nürnberg, Europe
Distribution: Debian, Ubuntu, Puppy
Posts: 588

Rep: Reputation: 45
Hi,

I back up my home (almost) exactly the same way. I found it useful to cd $HOME and tar from there, because then you get relative paths in the archive. This is useful when extracting to another place. Note that the excluded directories must then be specified as relative paths as well.

I wrote it as a script; in that case it would be trivial to run bzip2 -9 after tarring. Or what exactly do you mean by "compressing files even more"?

best,

Ott

---- my archiving script

#!/bin/bash
ARHIIV=/mnt/otoomet/$HOSTNAME.tar.gz
if test "$1" = "-v" ; then
    V="-v"
    echo "Archiving $HOME -> $ARHIIV"
else
    V=""
fi
cd $HOME
tar $V -czf $ARHIIV \
    --exclude .Trash --exclude a --exclude 80274378.key \
    .
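A small sketch of the relative-path idea, using a throwaway directory under /tmp (the paths are examples, not anyone's real home):

```shell
# Sketch: cd into the directory first so the archive stores relative paths,
# and give the exclude pattern relative to that directory as well.
mkdir -p /tmp/reldemo/home/jake/keep /tmp/reldemo/home/jake/Desktop
echo "hi" > /tmp/reldemo/home/jake/keep/f.txt

cd /tmp/reldemo/home/jake
tar -czf /tmp/reldemo/jake.tar.gz --exclude=./Desktop .

# extracting somewhere else recreates keep/ but not Desktop/
mkdir -p /tmp/reldemo/restore
tar -xzf /tmp/reldemo/jake.tar.gz -C /tmp/reldemo/restore
```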
 
Old 11-09-2004, 02:30 AM   #3
jschiwal
Guru
 
Registered: Aug 2001
Location: Fargo, ND
Distribution: SuSE AMD64
Posts: 15,733

Rep: Reputation: 654Reputation: 654Reputation: 654Reputation: 654Reputation: 654Reputation: 654
I'm not certain, but I think you will need to use a pipe to be able to use the -9 (max compression) option.
You will still be able to decompress using the j flag in tar, however.

This will probably take experimentation to find the right balance between the compression level and the amount of time it takes to perform a backup.

One other thing is whether you want to exclude all of the .kde or .gnome directories. You may find that you want your preferences backed up, but exclude any cache type subdirectories such as icons and internet caches.

If you have very large backups, you might want to forgo compression altogether, especially if the majority of the files are in formats that are already compressed, such as jpegs, mpegs, or downloads.
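A minimal sketch of the pipe approach, using a throwaway directory under /tmp. One note: bzip2's default block size is already 900k, which is what -9 selects, so the pipe mostly matters if you want a *different* level or want to pass other bzip2 options:

```shell
# Sketch only: pipe tar through bzip2 so the compression level can be
# set explicitly. /tmp/pipedemo is an example directory.
mkdir -p /tmp/pipedemo/src /tmp/pipedemo/restore
echo "hello" > /tmp/pipedemo/src/file.txt

# create the archive with an explicit compression level
tar -c -C /tmp/pipedemo src | bzip2 -9 > /tmp/pipedemo/backup.tar.bz2

# tar's -j flag can still decompress it on the way back
tar -xjf /tmp/pipedemo/backup.tar.bz2 -C /tmp/pipedemo/restore
```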
 
Old 11-09-2004, 03:16 AM   #4
ninmonkeys
LQ Newbie
 
Registered: Nov 2004
Distribution: Slackware 10
Posts: 28

Original Poster
Rep: Reputation: 15
Quote:
Originally posted by otoomet: what do you exactly mean as "compressing files even more"
I didn't think the default was set to max compression. Here's my script; it uses multiple directories and exclusion files:
Code:
#!/bin/bash

backup_dest_dir="/root/backups/slack10"
#date format: year-month-day
backup_date="$(date +%F)"
backup_dest_file="$backup_dest_dir/$backup_date.tar"

function bzip_it
{
  echo "Compressing: $backup_dest_file"
  #bzip2 compresses in place: it replaces the .tar with a .bz2
  bzip2 -9 "$backup_dest_file"
  echo "Created: $backup_dest_file.bz2"
}

function back_up
{
  #create the destination directory if it doesn't exist
  if [ ! -e "$backup_dest_dir" ]; then
    echo "creating: $backup_dest_dir"
    mkdir -p "$backup_dest_dir"
  fi
  #later I'll make it so if there's just one argument, it doesn't use an exclusion file
  if [ -z "$2" ]; then
    echo "usage: rootDir file_exclusion"
    return 1
  fi
  echo "tar -u -X $2 -f $backup_dest_file $1"
  tar -u -X "$2" -f "$backup_dest_file" "$1"
}

echo "Backup: starting..."
back_up /root /root/backup_exclude_paths
back_up /home/jake /home/jake/backup_exclude_paths
bzip_it
echo "Backup: complete."
 
Old 11-09-2004, 03:30 PM   #5
J.W.
LQ Veteran
 
Registered: Mar 2003
Location: Milwaukee, WI
Distribution: Mint
Posts: 6,642

Rep: Reputation: 69
If you've got a second hard drive, I find it useful to periodically copy my entire /home directory over to that second drive. If I delete a file I later need from my "real" /home, it's nice to have a local copy. Obviously this isn't really a robust backup system, but then again I don't have any highly critical data -- J.W.
 
Old 11-09-2004, 05:19 PM   #6
bigrigdriver
LQ Addict
 
Registered: Jul 2002
Location: East Centra Illinois, USA
Distribution: Debian Squeeze
Posts: 5,739

Rep: Reputation: 298Reputation: 298Reputation: 298
Smile

The problem I have with tar backups is this: tar writes the entire backup as one compressed file, so if any part of it is corrupted, you have essentially lost the entire backup. I searched for the last 2 years for backup options; tried many; finally settled on dar. I get compressed backups (gzip or bzip2) with each file compressed individually, so if one file is degraded or corrupt I still have the rest. Plus, dar can skip ahead when restoring a corrupt file, so that only a small part of that file is lost.
You can find dar here
 
Old 11-09-2004, 10:42 PM   #7
shazam75
Member
 
Registered: Oct 2004
Location: Australia, Brisbane
Distribution: Gentoo
Posts: 296

Rep: Reputation: 30
Why don't you guys try rsync to do your backups:

http://applications.linux.com/print..../09/15/1931240

Regards
Shelton.
 
Old 11-09-2004, 11:02 PM   #8
jonr
Senior Member
 
Registered: Jan 2003
Location: Kansas City, Missouri, USA
Distribution: Ubuntu
Posts: 1,040

Rep: Reputation: 47
Quote:
Originally posted by J.W.
If you've got a second hard drive, I find it useful to periodically copy my entire /home directory over to that second drive. If I delete a file I later need from my "real" /home, it's nice to have a local copy. Obviously this isn't really a robust backup system, but then again I don't have any highly critical data -- J.W.
I'm in the same situation, and do the same thing--and it has saved me much grief time after time. Highly recommended to do this /home backup to a second drive even if you use some other method of backing up other files.

Here is the command I use to do it, along with comments I put in to remind myself what some of it means. I found the neat trick about interpreting dots correctly here on LQ some time ago!

Code:
cd /home/jon
cp -rupv ./* ./.[^\.]* /prime_backup/home.jon.backup/
##Note: the "[^\.]" prevents cp from interpreting the "hidden file attribute" dot as
#either the "current directory" dot or one of the "parent directory" dots by
#specifying that the second character is not to be a dot.  For this reason no
#file should begin with two dots, normally.
I was even automating this via a cron job for a while, but that can be a deceptive convenience, as the backup may take place just after you've made some regrettable changes, or while certain files are being accessed with the result that they don't get backed up correctly if at all. So it's better for me to choose for myself when to do this kind of backup.

As for big backups, system-wide in nature, it's worth looking at rdiff-backup, available at

http://www.nongnu.org/rdiff-backup/

This program creates "differential" backups (everybody's definition of "differential" is apt to be different, but the general result is that there's a series of "snapshots" available enabling restoring files or directories to a state they were in a given length of time ago, if necessary or desired).

It's really a pretty subtle but easy to use program (otherwise I wouldn't be able to use it).

If disk space is limited it may not be so desirable. However, you can trim the number of versions it keeps on hand at will.

Last edited by jonr; 11-09-2004 at 11:09 PM.
 
  

