Old 04-17-2012, 10:45 AM   #1
Senior Member
Registered: Jan 2003
Location: Aachen
Distribution: Opensuse 11.2 (nice and steady)
Posts: 2,203

Rep: Reputation: 45
How good is tar -zcvf for big files?

Dear all,
I am using a simple crontab entry for handling my backups

tar -zcvf /media/a9f299d7-9775-404d-a073-fcbc28b3f3c0/user-caracus`date '+%d-%B-%Y'`.tar.gz /etc /root /home 2>> /root/backup/backuperrors.txt

(please assume that I have unlimited hard disk space where I can store that).
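One cron-specific caveat with an entry like that: cron treats an unescaped `%` as the end of the command (everything after it becomes stdin), so each `%` in the `date` format has to be escaped with a backslash. A sketch of the fixed crontab line (the 03:00 schedule is just a placeholder):

```shell
# crontab entry: every % in the date format is escaped as \%,
# because cron turns an unescaped % into a newline
0 3 * * * tar -zcf /media/a9f299d7-9775-404d-a073-fcbc28b3f3c0/user-caracus`date '+\%d-\%B-\%Y'`.tar.gz /etc /root /home 2>> /root/backup/backuperrors.txt
```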

I wonder, if my PC crashes and I want to see what was inside my backup:

1) Could I decide to extract only a few files/folders from that huge file? Would that be possible?

2) Restore my files: how much hard disk space will I need for that? How can I calculate it?

3) Is it mentioned anywhere that tar has certain limitations? Is there any chance of ending up with unrecoverable files from tars that are too large?

4) How can I make that process run with low priority through crontab?

P.S. (edited to add):

5) How can I run
tar -zcvf /media/a9f299d7-9775-404d-a073-fcbc28b3f3c0/user-caracus`date '+%d-%B-%Y'`.tar.gz /etc /root /home 2>> /root/backup/backuperrors.txt
with low priority?

Last edited by alaios; 04-17-2012 at 11:44 AM. Reason: added 5)
Old 04-17-2012, 01:10 PM   #2
Registered: Jun 2009
Location: México
Distribution: Suse, Debian based, CentOs
Posts: 48

Rep: Reputation: 10
First of all, and this applies to everything I'm going to write: never take anything for granted when it comes to Linux commands. tar, like many other commands, differs from distro to distro and even between versions of the same distro, so always read the man page of a command before using it.

Now let's go over the questions.

1) Yes, it is possible to extract some files from your .tar.gz, even using wildcards; again, check the man page on your distro. You can probably extract a single file using
tar -xzf /some/tar/file.tar.gz file
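As a sketch of the wildcard case (using GNU tar's `--wildcards` flag; the archive and member names below are made-up placeholders):

```shell
# Preview which archive members match the pattern before touching the disk
tar -ztf backup.tar.gz --wildcards 'home/*/.bashrc'

# Extract only the matching members into the current directory
tar -zxf backup.tar.gz --wildcards 'home/*/.bashrc'
```

Listing first is a cheap way to confirm the pattern matches what you expect, since a typo in the pattern makes extraction silently do nothing useful.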
2) Hmm, you may want to Google compression ratios a little; a quick search suggests that gzip (used by tar czf) achieves a compression ratio of around 40.6% on typical data.

3) I used tar to back up databases and servers at a past job and never had issues with size; the largest backup I ever made was about 60 GB. You may want to estimate the size of yours.

4) To run a command with a defined priority, use the nice command (again, read the man page):
nice [OPTION] [COMMAND [ARG]...]
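Applied to the original crontab command, that might look like the sketch below. Niceness 19 is the lowest CPU priority; `ionice -c3` is an additional, Linux-only knob for idle-class disk I/O, included here on the assumption your system ships it (it is part of util-linux):

```shell
# Lowest CPU priority (19) so compression doesn't starve other processes;
# ionice class 3 ("idle") keeps the disk I/O out of the way too
nice -n 19 ionice -c3 tar -zcf /media/a9f299d7-9775-404d-a073-fcbc28b3f3c0/user-caracus`date '+%d-%B-%Y'`.tar.gz \
    /etc /root /home 2>> /root/backup/backuperrors.txt
```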
Note: everything I write is based on my experience (and some Googling); it is not the absolute truth, and someone out there knows much more than I do.


Last edited by Spatior; 04-17-2012 at 01:13 PM. Reason: Clarification
Old 04-17-2012, 04:48 PM   #3
LQ Newbie
Registered: Dec 2008
Posts: 23

Rep: Reputation: 1
Spatior got a lot of good things right; here's what a few more years of tar experience have yielded:

First of all: make sure you have GNU tar (the one that ships with Linux).
For example, HP-UX tar cannot handle files over 2 GB and does not support in-line compression.
GNU tar, however, is available for HP-UX and can do just about anything you can think of (yes, even multi-stage incremental backups).

1) Yes: use "tar -ztf tarfile" to see which files are in the archive, and "tar -zxf tarfile file1 file2" to restore individual files.
I have made backups that fill up 400 GB hardware-compressed tapes. Even then, you can instruct tar to ask for the next tape and continue the backup (it's awesome like that).

You can also use the -T option to give it a file with a list of files to restore.
tar -ztf my_backup.tar.gz > contents.lst
vi contents.lst
tar -T contents.lst -zxf my_backup.tar.gz

2) When creating the backup, you can pipe to dd and write to /dev/null to check the resulting file size.
example: "tar -zcf - /huge_filesystem | dd bs=1M of=/dev/null"
dd will report the backup size on stderr (as a count of 1 MiB blocks, plus the byte total).

A similar trick can be used to check the size of a restore:
"tar -zxOf /mnt/usbdisk/my_backup.tar my/restored/directory | dd bs=1M of=/dev/null"
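To answer the OP's question 2 more directly: you can also estimate the uncompressed restore size without extracting anything, by summing the size column of the verbose listing (column 3 in GNU tar's `-tvf` output; the archive name below is a placeholder):

```shell
# Sum the member sizes from the verbose listing; prints the total in MiB
tar -ztvf backup.tar.gz | awk '{sum += $3} END {printf "%.1f MiB\n", sum/1048576}'
```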

Compression ratios depend heavily on the compression type and on the data being compressed: database tables compress far better than JPEG images do. Likewise, "compress" doesn't compress very well, "gzip" is OK, and "bzip2" is better, but higher compression also means higher CPU usage and slower backups.
compress, gzip, and bzip2 are options -Z, -z, and -j, respectively, in GNU tar.

3) According to the documentation, modern GNU tar can handle unlimited file sizes.

4) Indeed, nice works rather well (especially since compression loves to hog the CPU).

5) Same answer: just prepend "nice" =)
To make the backup complete, I would check the exit status ($?) and report it, along with the backup contents, via mail.

Here's what I use at home:

PID=$$                                           # unique suffix for the temp files

if [ "${SKIP}" -eq "1" ]; then
    echo "Backup SKIPPED because of config variable" | mail -s "Backup SKIPPED" root
    exit 1
fi

touch /var/tmp/${PID} /var/tmp/${PID}.err
# the actual backup script (path truncated in the original post):
/root/ >/var/tmp/${PID} 2>/var/tmp/${PID}.err
EXITSTATUS=$?                                    # capture its exit status

if [ "${EXITSTATUS}" -eq "0" ]; then
    cat /var/tmp/${PID} | mail -s "Backup SUCCESS" root
    rm /var/tmp/${PID} /var/tmp/${PID}.err
else
    cat /var/tmp/${PID} /var/tmp/${PID}.err | mail -s "Backup ERRORS" root
fi

1 member found this post helpful.
Old 04-18-2012, 02:07 AM   #4
LQ Guru
Registered: Aug 2004
Location: Sydney
Distribution: Centos 6.10, Centos 7.5
Posts: 17,699

Rep: Reputation: 2494
Actually, both gzip & bzip2 have options to specify how hard to compress; more compression means more work and takes longer...
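For example, both compressors take a level from -1 (fastest) to -9 (smallest), and one way to choose the level for a tar backup is to compress through a pipe (paths and archive names below are placeholders):

```shell
# Fast but larger archive:
tar -cf - /etc | gzip -1 > backup-fast.tar.gz

# Slow but smaller archive:
tar -cf - /etc | gzip -9 > backup-small.tar.gz
```

The resulting files are ordinary .tar.gz archives either way, so tar -z extracts both identically; only the create-time CPU/size trade-off changes.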

