Linux - Software: This forum is for Software issues.
I have been using the same bash script for archiving and backing up data for several years. I am getting concerned about possible data corruption in my .tar.bz2 backups because they have grown immensely in size.
My bash script does the following:
1. makes a .tar.bz2 of each major data directory we have in the office.
2. then takes each .tar.bz2 and places it into an .iso file that can be mounted or burned to a DVD.
Each major directory holds a little over 5 GB of data.
When each is compressed into a .tar.bz2 file, they come out at about 1.2 to 1.4 GB in size.
I've done some googling on the subject and I'm not finding much empirical data about the limits of bzip2 files and potential data corruption or loss.
Does anyone have experiences with this issue?
And can you provide some good links to worthwhile RTFM material? :-)
The only problem I could potentially see is if bzip2 used 32-bit file sizes internally, but I find this highly unlikely. I've seen plenty of large files compressed with bz2 and all seemed to work fine. You could always give it a test: dump some files together into one big file, bzip2 it, and compare md5sums.