05-24-2005, 05:23 PM | #1
johnwyles | LQ Newbie | Registered: May 2005 | Location: Austin, TX | Distribution: Gentoo | Posts: 7
Recovering files from corrupt TAR archive...
I am a broke college student and cannot afford to purchase the Advanced TAR Repair tool (which, ironically, only runs on Windows machines) to repair a 4GB uncompressed tarball that has become corrupted. The --recover option in old versions of tar does not work either; the tar program quits without recovering a SINGLE file. Here is some output from my attempts to recover data:
Code:
$ tar --version
tar (GNU tar) 1.14
$ tar xvf new-try/new-try.tar
tar: This does not look like a tar archive
tar: Skipping to next header
tar: Archive contains obsolescent base-64 headers
tar: Error exit delayed from previous errors
So then I tried an older version of tar:
Code:
$ ./tar-1.13.25/src/tar --version
tar (GNU tar) 1.13.25
# ./tar-1.13.25/src/tar --recover -xvf new-try.tar
./tar-1.13.25/src/tar: Archive contains obsolescent base-64 headers
./tar-1.13.25/src/tar: Archive contains `\021n5I\266\203\vQ' where numeric major_t value expected
./tar-1.13.25/src/tar: Archive contains `\202P\315R\277\244\030{' where numeric minor_t value expected
./tar-1.13.25/src/tar: Archive contains `u\214\345\372\363\373\0341\313\370\301\314' where numeric time_t value expected
./tar-1.13.25/src/tar: Archive contains `5)@O\n).J\337\272\301' where numeric off_t value expected
./tar-1.13.25/src/tar: Archive contains `\362\372\tӕha\321' where numeric gid_t value expected
./tar-1.13.25/src/tar: Archive contains `KƄ\374\214\251\277\024' where numeric uid_t value expected
./tar-1.13.25/src/tar: Archive contains `5)@O\n).J\337\272\301' where numeric off_t value expected
./tar-1.13.25/src/tar: Archive contains `\2202\310R\366\251\350\017' where numeric major_t value expected
...
No results. However, when I run the demo version of this Advanced TAR Repair tool on the file (accessing it from a Windows machine over Samba), it says it is able to recover every file. This tool is obviously doing something that the normal tar executable is not doing in order to read the file names and such. Does anyone have any suggestions on what I might do? I really cannot afford the $50 for the software to recover this data!
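In case it helps with diagnosis, here is how I have been inspecting the raw header blocks (the skip offset here is just an example; a valid tar header is 512 bytes, with the member file name at the start and the "ustar" magic at byte 257, i.e. octal offset 0401 in od output):
Code:
$ dd if=new-try/new-try.tar bs=512 skip=0 count=1 2>/dev/null | od -c | head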
05-24-2005, 05:59 PM | #2
LQ Guru | Registered: Aug 2001 | Location: Fargo, ND | Distribution: SuSE AMD64 | Posts: 15,733
Can you view the archive in "mc"? Perhaps not, given its size, but it is worth a try. It may be that the archive was produced by an incompatible tar program rather than actually being corrupted.
Exactly how and where was this tar file produced?
What does the "file" command say about this tar file?
I wonder if the source for the tar program used to produce the file is available.
Also, I wonder if the tar file is actually uncompressed. The message about the base-64 headers might be what you see if you forgot to use the -j option.
There is a program called "star" that you may have better luck with; it seems to have more options.
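If you try star, the extraction would look something like this (a rough sketch, untested on a damaged archive; star takes key=value options, and the file name is just your archive):
Code:
$ star -x -v f=new-try.tar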
05-24-2005, 06:48 PM | #3
johnwyles | LQ Newbie | Original Poster | Registered: May 2005 | Location: Austin, TX | Distribution: Gentoo | Posts: 7
Quote:
Can you view the archive in "mc"?
mc gives a pop-up message:
Code:
tar: This does not look like a tar archive
tar: Skipping to next header
tar: Archive contains obsolescent base-64 headers
tar: Error exit delayed from previous errors
Quote:
Perhaps not, given its size, but it is worth a try. It may be that the archive was produced by an incompatible tar program rather than actually being corrupted.
Exactly how and where was this tar file produced?
The tar file was produced by running
Code:
$ split -b 500M huge-tar-ball.tar my-backup-files
on an uncompressed tar file, which generated 500MB chunks of huge-tar-ball.tar named my-backup-filesaa, my-backup-filesab, my-backup-filesac, my-backup-filesad, etc. I later merged all of these files back together using cat my-backup-files* > huge-tar-ball-recovered.tar. However, while the chunks were in storage, one of them went partially bad. I was not too worried at the time, because it appeared that I would lose some files but tar would keep going and try to get every other file out. Instead, tar errors out on this 50GB file with the above messages around 45GB in, which is exactly where in the chunks the corrupt data is found.
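In hindsight, I should have recorded checksums for the chunks when I split the archive, so a bad chunk could be caught before merging. Something like this (a simple sketch using my file names):
Code:
$ md5sum my-backup-files* > my-backup-files.md5   # record checksums right after splitting
$ md5sum -c my-backup-files.md5                   # verify before cat'ing the chunks back together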
Some answers to your other questions:
Quote:
What does the "file" command say about this tar file?
Code:
$ file new-try/new-try.tar
new-try/new-try.tar: data
Quote:
I wonder if the source for the tar program used to produce the file is available.
Also, I wonder if the tar file is actually uncompressed. The message about the base-64 headers might be what you see if you forgot to use the -j option.
I agree with everything there, except that I did not compress the file (so no need for -j) and I used GNU tar 1.14 to generate the file.
Quote:
There is a program called "star" that you may have better luck with; it seems to have more options.
I will certainly give it a try as it may have some more advanced recovery options for tar files.
05-24-2005, 09:48 PM | #4
johnwyles | LQ Newbie | Original Poster | Registered: May 2005 | Location: Austin, TX | Distribution: Gentoo | Posts: 7
HOORAY! I found out, thanks to Aaron M. Renn (creator of the gzip Recovery Toolkit), that the useful utility cpio can do the trick quite easily! Here is what I did:
Code:
$ cpio -ivd -H tar < new-try/new-try.tar
cpio: invalid header: checksum error
cpio: warning: skipped 23065728 bytes of junk
... EXTRACTED FILES DISPLAYED ...
Needless to say, I got back 99.9% of my corrupted tarball and I am VERY happy (and very LUCKY)! I must have wasted two hours scouring about a thousand combinations of keywords on Google and came up with nothing. I hope this helps someone who is in as tight a bind as I was!
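The trick seems to be that cpio's -H tar mode reads the tar format but resynchronizes by scanning forward for the next valid-looking header instead of giving up, which is what the "skipped ... bytes of junk" warning is about. If your broken archive is gzipped rather than a plain tar, I believe you would first need gzrecover from the same toolkit to salvage the compressed stream before feeding it to cpio; a rough sketch (untested, with the output file name following the toolkit's convention as I understand it):
Code:
$ gzrecover broken.tar.gz                 # writes broken.tar.recovered, skipping damaged gzip blocks
$ cpio -ivd -H tar < broken.tar.recovered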
08-04-2005, 04:51 AM | #5
bmelim | LQ Newbie | Registered: Sep 2003 | Distribution: RedHat 8 | Posts: 1
Quote:
Originally posted by johnwyles
HOORAY! I found out, thanks to Aaron M. Renn (creator of the gzip Recovery Toolkit), that the useful utility cpio can do the trick quite easily! Here is what I did:
Code:
$ cpio -ivd -H tar < new-try/new-try.tar
cpio: invalid header: checksum error
cpio: warning: skipped 23065728 bytes of junk
... EXTRACTED FILES DISPLAYED ...
Needless to say, I got back 99.9% of my corrupted tarball and I am VERY happy (and very LUCKY)! I hope this helps someone who is in as tight a bind as I was!
Hi everyone, I seem to have the same problem with 2 TAR files generated by Webmin.
Before I rebuilt my server, I backed up all my websites through Webmin's Filesystem Backup, which uses tar. Now that the server is rebuilt, I tried to restore the 2.5GB and 3.5GB TAR archives using Webmin.
I seem to be getting the same error as John above. I've tried the Advanced Recovery software and that got me nowhere, and I've also tried John's solution above; that also got me nowhere.
Does anyone have any ideas? Please? I'm desperate and can't afford to lose 15+ websites. I've just realised that all the backups I've been running using Webmin's backup solution are corrupt.
Many thanks,
Regards,
Bruno
Last edited by bmelim; 08-04-2005 at 05:10 AM.
04-16-2006, 02:12 PM | #6
LQ Newbie | Registered: Apr 2006 | Posts: 12
I have the same problem, and I've tried Advanced TAR Repair and the solution above; none of them work.
I have 2 precious rare collections I want to use, but so far this broken tar is .... killing me.
Code:
Test - Recover/gzrt-0.4$ cpio -ivd -H tar < Jennifer.tar.gz
cpio: invalid header: checksum error
Nothing happens...
I just wanna fix this error and NEVER use tar for a backup
04-16-2006, 04:04 PM | #7
Member | Registered: Jul 2005 | Distribution: Fedora fc4, fc7, Mandrake 10.1, Mandriva 06, SUSE 9.1, Slackware 10.2, 11.0, 12.0, 12.1, 12.2 (current) | Posts: 732
Hey dude!
I had the same problem with 2GB of my webspace. Try contacting the company hosting those sites to see if they have any backup, A.S.A.P. It seems that only their backups work, while the cPanel backups are corrupted!
04-16-2006, 08:33 PM | #8
syg00 | LQ Veteran | Registered: Aug 2003 | Location: Australia | Distribution: Lots ... | Posts: 21,314
Quote:
I just wanna fix this error and NEVER use tar for a backup
Can't help with the error, but I wouldn't be blaming tar. Backups is good, *good* backups is better.
You can verify a tar when you take it (with some limitations), but that doesn't protect against media failure later on.
Nothing except multiple copies does.
I recommend dar - amongst many nice features it has CRC; as per the doco
Quote:
thanks to CRC (cyclic redundancy checks), dar is able to detect data corruption in the archive. Only the file where data corruption occurred will not be possible to restore, but dar will restore the others even when compression or encryption (or both) is used.
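As a rough sketch of what that looks like in practice (paths and slice size here are just examples, not from the doco):
Code:
$ dar -c /backups/home_full -R /home -z -s 500M   # gzip-compressed archive of /home in 500MB slices
$ dar -t /backups/home_full                       # test the archive against its embedded CRCs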
04-17-2006, 05:44 AM | #9
LQ Newbie | Registered: Apr 2006 | Posts: 12
Quote:
Originally Posted by syg00
Can't help with the error, but I wouldn't be blaming tar. Backups is good, *good* backups is better.
You can verify a tar when you take it (with some limitations), but that doesn't protect against media failure later on.
Nothing except multiple copies does.
I recommend dar - amongst many nice features it has CRC; as per the doco
Thanks, but for now I'm just trying to recover those 2 rare collections.
I gave up yesterday after 4 hours; I've never had such issues with rar or any other compression format, for that matter.
I will check out dar.
Thanks again