I used tar to create a multi-volume archive containing a group of large files (mostly uncompressed image files). Many of the files are larger than 100 MB, and the entire archive was over 6 GB (although I split it using tar -M so it would fit on two DVDs). Although I can create and restore from the archive, I noticed that simply listing its contents with tar -t takes a very long time. Is there some way to make tar faster when it is only listing the archive's contents?
What I like to do is run find on the files I want to back up and put the results into a text file; then I use tar -T file.txt to archive the files listed in that file. This also makes it easy to search for a particular file later with grep.
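The workflow above can be sketched as follows. The directory and file names here are made-up examples, not anything from the original post:

```shell
# Example data to back up (hypothetical names, for illustration only).
mkdir -p photos/raw
echo "sample" > photos/raw/img001.txt

# 1. Collect regular files into a list. Matching only files (-type f)
#    avoids archiving a directory's contents twice.
find photos -type f -print > filelist.txt

# 2. Archive exactly the files named in the list.
tar -cvf backup.tar -T filelist.txt

# 3. Later, grep the list to locate a file without touching the archive.
grep 'img001' filelist.txt
```

Keeping the list file next to the archive means you can answer "is file X in this backup?" with a fast grep instead of a slow tar -t over the whole archive.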
Thank you. Okay, just to make sure: if I wanted to back up directory A and all subdirectories, I would do `find -d A -print > file.txt` and then `tar -cv -T file.txt -f backup.tar`? Also, will this work with the -M option? I apologize if these are basic questions, but I am new to this and want to make sure I create these backups correctly.
Also, when extracting, if I specify a single file to extract, will tar read through everything before that entry in the archive, or will it jump straight to the file it needs?
tar -zcvf /tmp/backup-$(date +%F).tgz -T /tmp/backup-$(date +%F).txt
The find command collects all the files that need to be backed up. Match only files, not directories, or you'll be backing up the same data twice. At the end you'll have two files in /tmp: one is the compressed tarball and the other is the list of files that were backed up. Both filenames include the date, so you can back up daily and always know when each backup was made. If you need to view the files in the backup, run:
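The listing command itself isn't shown above; a minimal sketch, using the same date-stamped filename pattern as the backup command (the demo.txt file here is only created so the example is self-contained):

```shell
# Create a tiny example archive so the listing command has something to read.
echo demo > demo.txt
tar -zcf /tmp/backup-$(date +%F).tgz demo.txt

# List the contents without extracting: -t lists, -z handles gzip,
# -v also shows sizes, owners, and dates for each entry.
tar -ztvf /tmp/backup-$(date +%F).tgz
```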
To split the file, replace the last command with:
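The splitting command isn't shown above. One common approach (an assumption on my part, not necessarily what the poster used) is to split the finished tarball with the split utility, since GNU tar's multi-volume mode (-M) cannot be combined with gzip compression:

```shell
# Make a small example tarball to split (a stand-in for the real backup).
echo demo > demo.txt
tar -zcf backup.tgz demo.txt

# Split into DVD-sized pieces (4 GB shown; adjust -b as needed).
split -b 4G backup.tgz backup.tgz.part-

# To restore, concatenate the pieces back together first, then extract.
cat backup.tgz.part-* > restored.tgz
tar -ztf restored.tgz
```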
Okay. Thank you for all the help. I will definitely try this next time I need to create backups.
I wanted to ask one more question. For the archives I have already created that list slowly, or for new archives with large files (including archives created using the method you described above), would increasing the block size (-b) make tar run faster?
I wouldn't mess with it. If you tar to the same hard disk you are reading from, it will take longer than usual; writing over the network or to an external USB drive is faster. You have a pretty big archive, so it will take a while either way. A faster processor and faster disks also help a lot.
I tried backing up to my external hard drive and I get about 10 MB/s when creating a tar archive, which seems better than what I was getting when writing to the same disk I was reading from. Thank you for all of your help. I am backing up mostly uncompressed images as well as some iMovie files, so these archives grow to 6-7 GB.