Old 04-04-2007, 03:28 PM   #1
szim90
Member
 
Registered: Nov 2006
Distribution: Debian Testing
Posts: 39

Rep: Reputation: 15
listing the contents of large tar files


Hello.

I used tar to create a multi-part archive containing a group of large files (mostly uncompressed image files). Many of the files are larger than 100 MB, and the entire archive was over 6 GB (although I split it with tar -M so it would fit on two DVDs). Although I can create and restore from the archive, I noticed that simply listing the archive's contents with tar -t takes a very long time. Is there some way to make tar faster when it is only listing the archive's contents?
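In case it helps, this is roughly what I ran (file names changed; the -L tape length is just an example value of about one single-layer DVD):

tar -cv -M -L 4300000 -f backup.tar images/
tar -tv -M -f backup.tar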

Thank you for any help,
szim90
 
Old 04-05-2007, 08:35 AM   #2
ramram29
Member
 
Registered: Jul 2003
Location: Miami, Florida, USA
Distribution: Debian
Posts: 848
Blog Entries: 1

Rep: Reputation: 47
What I like to do is run find for the files I want to back up and save the list into a text file; then I use tar -T file.txt to archive the files named in that list. This makes it very easy to search for a file later with the grep command.
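A minimal sketch of the approach (the paths are just examples):

find /data/images -type f > filelist.txt
tar -cvf backup.tar -T filelist.txt
grep photo001 filelist.txt

Searching the text file with grep is fast because you never have to read the archive itself.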
 
Old 04-05-2007, 08:58 AM   #3
szim90
Member
 
Registered: Nov 2006
Distribution: Debian Testing
Posts: 39

Original Poster
Rep: Reputation: 15
Thank you. Okay, just to make sure: if I wanted to back up directory A and all its subdirectories, I would do `find -d A -print > file.txt` and then `tar -cv -T file.txt -f backup.tar`? Also, will this work with the -M option? I apologize if these are basic questions, but I am new to this and I want to make sure I create these backups correctly.

Also, when extracting, if I specify a particular file to extract, will tar read through everything before that entry in the archive, or will it jump straight to the file it needs?
 
Old 04-05-2007, 02:41 PM   #4
ramram29
Member
 
Registered: Jul 2003
Location: Miami, Florida, USA
Distribution: Debian
Posts: 848
Blog Entries: 1

Rep: Reputation: 47
This is how I would do it:

find /dir ! -type d > /tmp/backup-$(date +%F).txt

less /tmp/backup-$(date +%F).txt

tar -zcvf /tmp/backup-$(date +%F).tgz -T /tmp/backup-$(date +%F).txt

This finds all the files that need to be backed up. List only files, not directories; if a directory appears in the list, tar will archive its contents recursively and you'll end up storing those files twice. At the end you'll have two files in /tmp: one is a compressed tarball, and the other is a list of the files that were backed up. Both names include the date, so you can back up daily and still know when each backup was made. If you need to view the files in a backup, run:

less /tmp/backup*.txt

To split the archive, replace the last tar command with:

tar -zcv -f - -T /tmp/backup-$(date +%F).txt | split -b 700M - /tmp/backup-$(date +%F).tgz

Then the archive will be split across several files of 700 MB each, for easy backup to CD.

The files will be called:

backup-2007-04-05.tgzaa
backup-2007-04-05.tgzab
backup-2007-04-05.tgzac

You will have to concatenate them before extracting, with:

cat backup-2007*.tgz?? > backup-restore.tgz

Make sure you have enough hard disk space.
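If disk space is tight, you can also skip the intermediate file and extract straight from the pieces:

cat backup-2007*.tgz?? | tar -zxv -f -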
 
Old 04-05-2007, 10:18 PM   #5
szim90
Member
 
Registered: Nov 2006
Distribution: Debian Testing
Posts: 39

Original Poster
Rep: Reputation: 15
Okay. Thank you for all the help. I will definitely try this next time I need to create backups.

I wanted to ask one more question. For the archives I have already created that list slowly, or for new archives with large files (including archives created using the method you described above), would increasing the blocking factor (-b) make tar run faster?

Thank you for everything,
szim90
 
Old 04-06-2007, 08:55 AM   #6
ramram29
Member
 
Registered: Jul 2003
Location: Miami, Florida, USA
Distribution: Debian
Posts: 848
Blog Entries: 1

Rep: Reputation: 47
I wouldn't mess with it. If you tar to the same hard disk that you are reading from, it may take longer than usual; writing over the network or to an external USB drive is usually faster. You have a pretty big archive, so it will take a while either way. A faster processor and faster disks also help a lot.
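For the record, if you do want to experiment, GNU tar's -b is the blocking factor in 512-byte records (the default is 20, i.e. 10 KiB blocks), so for example:

tar -cv -b 256 -f backup.tar /dir

would write in 128 KiB blocks. It mostly matters for tape devices, not for archives on disk.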
 
Old 04-06-2007, 09:23 AM   #7
szim90
Member
 
Registered: Nov 2006
Distribution: Debian Testing
Posts: 39

Original Poster
Rep: Reputation: 15
I tried backing up to my external hard drive and I get about 10 MB/s when creating a tar archive, which seems better than what I was getting earlier when writing to the same disk I was reading from. Thank you for all of your help. I am backing up mostly uncompressed images, as well as some iMovie files, so these archives grow to 6-7 GB.

Thanks again,
szim90
 
  

