LinuxQuestions.org
Old 12-06-2010, 09:34 PM   #1
buddyalexander
LQ Newbie
 
Registered: Dec 2010
Location: Upstate NY
Distribution: Ubuntu
Posts: 4

Rep: Reputation: 0
Backup solution? Tar extraction is taking forever...


I am trying to settle on a backup solution, and I would rather stick with the command line. I like the idea of tar because it has been around forever, it's simple, and it's a standard. I have Acronis and Ghost, but I want to get away from them and stay with the command line (for scripting) and free (in both senses), of course. I am testing tar by extracting a single file from a 340 GB tarred and gzipped archive, and it's taking forever. I guess that's to be expected. I know Acronis and Ghost let you browse archives and pull out files fairly quickly; I would like that same ability, along with being able to back up while the system is up and running. I know about Clonezilla and many of the live CD options, but I want the system to stay up. I don't want anything too complicated either; I only need it for a simple desktop backup. Imaging would be a nice plus, but I am not pushing for it. Sorry this is so long. Any ideas?
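The test in question was along these lines (archive and file names below are made up):

Code:
# Create the compressed archive (tar strips the leading "/"):
tar -czf backup.tar.gz /home

# Pull one file back out; with gzip, tar has to decompress and scan the
# whole stream from the beginning to find it, so a 340 GB archive takes a while:
tar -xzf backup.tar.gz home/buddy/Documents/notes.txt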
 
Old 12-06-2010, 10:04 PM   #2
barriehie
Member
 
Registered: Nov 2010
Distribution: Debian Lenny
Posts: 136
Blog Entries: 1

Rep: Reputation: 23
I've got a bash script that runs every day while I'm asleep and backs up everything except the virtual directories. It uses rsync, and while my data is nowhere near 340 GB, it all goes to a separate drive, so there's no waiting to get at a file. I keep three generations each of $HOME, everything else, and a list of the installed packages. The backup directories are suffixed with the day of the year, and the script does the math to work out which ones to remove. I had issues with find, but this works.

/usr/local/bin/eList
Code:
tmp/
proc/
lost+found/
mnt/
sys/
media/
home/barrie/
dev/
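One note on the exclude list: patterns without a leading slash match at any depth, so "tmp/" would also skip a directory named tmp inside a home directory. If that matters, the entries can be anchored to the top of the transfer, as in this variant (not the file as posted):

Code:
# Anchored variant of /usr/local/bin/eList; rsync ignores lines starting with '#'.
/tmp/
/proc/
/lost+found/
/mnt/
/sys/
/media/
/home/barrie/
/dev/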
/usr/local/bin/osbackup
Code:
#!/bin/bash
#
# Backup routine to keep 3 sets
# of backup files.
#
# File Naming:
#   os.[day of the year]
#   barrie.[day of the year]
#   packages.[day of the year]
#

#
# Variables
#
MYEMAIL=barrie@localhost
doy=$(date +%j)
# %j is zero-padded (e.g. 008), which the shell would treat as a bad octal
# number in arithmetic; force base 10. This also keeps the os.$doy and
# os.$nukem names consistent so the cleanup below actually matches.
doy=$((10#$doy))
declare -i nukem
declare -i middle
if [ $doy -le 3 ]; then
    middle=$((3 - doy))
    nukem=$((366 - middle))
else
    nukem=$((doy - 3))
fi

rsync -av --exclude-from=/usr/local/bin/eList  / /media/backup/os.$doy

#
#  Remove Thunar sessions
#
rm /home/barrie/.cache/sessions/Thunar*

#
# Empty the trash
#
rm -rf /home/barrie/.local/share/Trash/files
rm -rf /home/barrie/.local/share/Trash/info
mkdir /home/barrie/.local/share/Trash/files
mkdir /home/barrie/.local/share/Trash/info
chown -R barrie:barrie /home/barrie/.local/share/Trash 

#
#  Nuke the Nautilus sessions
#  Leave 1 session file
#
# Specify the target directory and file names to operate on.
target_files=/home/barrie/.nautilus/saved-session-*

# Calculate the total number of files matching the target criteria. 
total_files=$(ls -t1 $target_files | wc --lines)

# Specify the number of files to retain.
retained_files=1

# If there are surplus files, delete them.
if [ $total_files -gt $retained_files ]
  then
  rm $(ls -t1 $target_files | tail --lines=$((total_files-retained_files)))
fi

#
#  Nuke the Metacity sessions
#  Leave 1 session
#
# Specify the target directory and file names to operate on.
target_files=/home/barrie/.metacity/sessions/*

# Calculate the total number of files matching the target criteria. 
total_files=$(ls -t1 $target_files | wc --lines)

# Specify the number of files to retain.
retained_files=1

# If there are surplus files, delete them.
if [ $total_files -gt $retained_files ]
  then
  rm $(ls -t1 $target_files | tail --lines=$((total_files-retained_files)))
fi

#
# Remove all the 'normal' thumbnails
#
rm /home/barrie/.thumbnails/normal/*

#
#  Do the Backup
#
# Get the installed packages.
dpkg --get-selections | grep -v deinstall > /media/backup/packages.$doy

# Backup /home/barrie
rsync -av --exclude=/home/barrie/Music /home/barrie/ /media/backup/barrie.$doy


#
# Keep only 3 of each file set.
#

if [ -d /media/backup/os.$nukem ]
then
    rm -rf /media/backup/os.$nukem
fi

if [ -d /media/backup/barrie.$nukem ]
then
    rm -rf /media/backup/barrie.$nukem
fi

if [ -e /media/backup/packages.$nukem ]
then
    rm /media/backup/packages.$nukem
fi


#
# Send me an email
#
#echo ""|\mutt -s "Backup Completed $doy" $MYEMAIL

exit 0
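The script runs unattended; a root crontab entry along these lines would kick it off every night (the time and log path are just examples):

Code:
# m  h  dom mon dow  command
30   2  *   *   *    /usr/local/bin/osbackup >> /var/log/osbackup.log 2>&1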

Last edited by barriehie; 12-07-2010 at 12:08 AM. Reason: Oops, forgot the script.
 
Old 12-07-2010, 03:07 AM   #3
Latios
Member
 
Registered: Dec 2010
Distribution: Arch
Posts: 115

Rep: Reputation: 21
There are also bz2 and xz; have you tried them?
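For reference, tar selects these with different flags (a rough guide; actual results depend on the data):

Code:
tar -czf backup.tar.gz  /home   # gzip: fastest, largest output
tar -cjf backup.tar.bz2 /home   # bzip2: smaller, slower
tar -cJf backup.tar.xz  /home   # xz: usually smallest, slowest to create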
 
Old 12-07-2010, 05:56 AM   #4
choogendyk
Senior Member
 
Registered: Aug 2007
Location: Massachusetts, USA
Distribution: Solaris 9 & 10, Mac OS X, Ubuntu Server
Posts: 1,197

Rep: Reputation: 105
I'm not surprised that a 340 GB tar takes a while. If you are just backing up the one computer and want to keep it simple, command-line, and free, then rsync might be an easy solution. There are quite a few backup programs and scripts with a few more bells and whistles that are built on rsync.

As far as tar goes, Amanda is often configured to use GNU tar; but typically you would break up what you are backing up into smaller pieces. That makes each piece easier to recover and also makes the backups more efficient. With Amanda's scheduler, you would typically end up running many of the individual backups in parallel, but that's also usually spread across multiple drive spindles and machines, whereas you are only doing one. Still, breaking it up might make it easier.
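Not Amanda itself, but as a plain-shell illustration of breaking the backup into smaller pieces, something along these lines produces one archive per directory (the directory list and destination are made up):

Code:
#!/bin/bash
# One archive per area instead of a single huge tarball.
dest=/media/backup
for dir in etc home/buddy/Documents home/buddy/Pictures var/www; do
    name=$(echo "$dir" | tr '/' '_')        # e.g. home/buddy/Documents -> home_buddy_Documents
    tar -czf "$dest/$name.tar.gz" -C / "$dir"
done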
 
Old 12-07-2010, 07:39 AM   #5
John VV
LQ Muse
 
Registered: Aug 2005
Location: A2 area Mi.
Posts: 17,627

Rep: Reputation: 2651
It is not exactly what you want, but have you looked at Clonezilla? Two weeks ago there was a story on it over on linux.com:
http://www.linux.com/learn/tutorials...aster-recovery

But rsync is also a good option.
 
Old 12-07-2010, 05:42 PM   #6
buddyalexander
LQ Newbie
 
Registered: Dec 2010
Location: Upstate NY
Distribution: Ubuntu
Posts: 4

Original Poster
Rep: Reputation: 0
Going with Tar

Thanks for all of the responses. I took out all of my video files, eliminated gzip, and tried to extract a file without the leading "/", and it was fast. I'm not sure which change sped it up. Would the slash cause that? I know tar mentions removing the leading slash when creating the archive. Or was it the 152 GB of MP4 files? Or maybe the compression? The MP4 files are rips of DVDs I own anyway; I really don't need to back them up since I have the hard copies, plus I have them on another external HD that I plug into the Xbox/PS3 to watch movies. Anyway, thanks for the interesting responses. I do use Clonezilla on occasion for bare-metal images, and I will look into the script above as well. I will also be using rsync for backing up to the other external HD I just mentioned. Oh, and I had never heard of xz. bz2 seemed to take too long to back up. Plus, I am not hurting for space, so I have no reason to use compression at all; I would rather have back the time it takes.
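To illustrate the leading-slash point (names below are made up), GNU tar warns "Removing leading `/' from member names" when creating from absolute paths, so the members are stored with relative names:

Code:
tar -cf backup.tar /home/buddy/notes.txt   # warns: Removing leading `/' from member names
tar -tf backup.tar                         # lists: home/buddy/notes.txt
tar -xf backup.tar home/buddy/notes.txt    # extract using the stored relative name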
 
Old 12-08-2010, 01:27 AM   #7
Latios
Member
 
Registered: Dec 2010
Distribution: Arch
Posts: 115

Rep: Reputation: 21
I would guess the videos are already compressed enough without the additional gzip. When you compress a bunch of them together, gzip probably keeps searching for something left to optimize, which is a lot of work considering the combined size of the files, but with little success.

Back up the videos uncompressed instead and see whether they take noticeably more space.
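A quick way to check is to compare a file's size with what gzip makes of it (the file name is just an example); for MP4 the difference is usually negligible:

Code:
du -h movie.mp4                # original size
gzip -9 -c movie.mp4 | wc -c   # size after gzip, in bytes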
 
  


Tags
hotplug, imaging, tar


