LinuxQuestions.org > Forums > Linux Forums > Linux - Software
Old 05-24-2005, 04:23 PM   #1
johnwyles
LQ Newbie
 
Registered: May 2005
Location: Austin, TX
Distribution: Gentoo
Posts: 7

Recovering files from corrupt TAR archive...


I am a broke college student and cannot afford to purchase the Advanced TAR Repair tool (which, ironically, only runs on Windows) to repair a 4GB uncompressed tarball that has become corrupted. The --recover option in old versions of tar does not work: the tar program quits without recovering a SINGLE file. Here is some output from my attempts to recover the data:

Code:
$ tar --version
tar (GNU tar) 1.14

$ tar xvf new-try/new-try.tar
tar: This does not look like a tar archive
tar: Skipping to next header
tar: Archive contains obsolescent base-64 headers
tar: Error exit delayed from previous errors
So then I tried with the older version of tar:

Code:
$ ./tar-1.13.25/src/tar --version
tar (GNU tar) 1.13.25

# ./tar-1.13.25/src/tar --recover -xvf new-try.tar
./tar-1.13.25/src/tar: Archive contains obsolescent base-64 headers
./tar-1.13.25/src/tar: Archive contains `\021n5I\266\203\vQ' where numeric major_t value expected
./tar-1.13.25/src/tar: Archive contains `\202P\315R\277\244\030{' where numeric minor_t value expected
./tar-1.13.25/src/tar: Archive contains `u\214\345\372\363\373\0341\313\370\301\314' where numeric time_t value expected
./tar-1.13.25/src/tar: Archive contains `5)@O\n).J\337\272\301' where numeric off_t value expected
./tar-1.13.25/src/tar: Archive contains `\362\372\tӕha\321' where numeric gid_t value expected
./tar-1.13.25/src/tar: Archive contains `KƄ\374\214\251\277\024' where numeric uid_t value expected
./tar-1.13.25/src/tar: Archive contains `5)@O\n).J\337\272\301' where numeric off_t value expected
./tar-1.13.25/src/tar: Archive contains `\2202\310R\366\251\350\017' where numeric major_t value expected
...
No results. However, when I run the demo version of this Advanced TAR Repair tool on the file (accessing it over Samba from a Windows machine), it says it is able to recover every file. That tool is obviously doing something the normal tar executable is not in order to read the file names and such. Does anyone have any suggestions on what I might try? I really cannot afford the $50 for the software to recover this data!
 
Old 05-24-2005, 04:59 PM   #2
jschiwal
Guru
 
Registered: Aug 2001
Location: Fargo, ND
Distribution: SuSE AMD64
Posts: 15,733

Can you view the archive in "mc"? Perhaps not given its size, but it is worth a try. It may be that the wrong tar program was used to produce it rather than being corrupted.

Exactly how and where was this tar file produced?
What does the "file" command say about this tar file?

I wonder if the source for the tar program used to produce the file is available.
Also, I wonder if the tar file is actually uncompressed. The message about the base-64 headers would be seen if you forgot to use the -j option.

There is a program called "star" that you may have better luck with. It seems to have more options.
 
Old 05-24-2005, 05:48 PM   #3
johnwyles
LQ Newbie
 
Registered: May 2005
Location: Austin, TX
Distribution: Gentoo
Posts: 7

Original Poster
Quote:
Can you view the archive in "mc"?
mc gives pop-up message:
Code:
tar: This does not look like a tar archive
tar: Skipping to next header
tar: Archive contains obsolescent base-64 headers
tar: Error exit delayed from previous errors
Quote:
Perhaps not given its size, but it is worth a try. It may be that the wrong tar program was used to produce it rather than being corrupted.
Exactly how and where was this tar file produced?
The tar file was produced by using
Code:
$ split -b 500M huge-tar-ball.tar my-backup-files
on an uncompressed tar file, which generated 500MB chunks of huge-tar-ball.tar named my-backup-filesaa, my-backup-filesab, my-backup-filesac, my-backup-filesad, etc. I later merged all of these files back together using cat my-backup-files* > huge-tar-ball-recovered.tar. However, while the chunks were in storage, one of them went partially bad. I was not too worried at the time, because it seemed I would lose some files but tar would keep going and try to get every other file out. Instead, tar errors out on this 50GB file with the above messages around 45GB in, which is exactly where the corrupt chunk sits.
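For anyone repeating this split/cat workflow: recording a checksum per chunk would have flagged the bad chunk before reassembly. A minimal self-contained sketch (the file and chunk names here are invented for the demo; on a real backup you would keep chunks.md5 alongside the chunks):

```shell
# build a small demo tarball so the sketch is self-contained
mkdir -p demo && echo "hello" > demo/a.txt && echo "world" > demo/b.txt
tar cf huge.tar demo

# split into chunks and record a checksum for each chunk
split -b 1K huge.tar my-backup-files
md5sum my-backup-files* > chunks.md5

# later, BEFORE reassembling: verify every chunk survived storage intact
md5sum -c chunks.md5

# only then merge the chunks back together
cat my-backup-files* > huge-recovered.tar
```

If `md5sum -c` reports a failure, you know exactly which 500MB chunk to restore from another copy, instead of discovering the damage 45GB into an extraction.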

Some answers to your other questions:

Quote:
What does the "file" command say about this tar file?
Code:
$ file new-try/new-try.tar
new-try/new-try.tar: data

Quote:
I wonder if the source for the tar program used to produce the file is available.
Also, I wonder if the tar file is actually uncompressed. The message about the base-64 headers would be seen if you forgot to use the -j option.
I agree with everything there, except that I did not compress the file (so no need for -j), and I used GNU tar 1.14 to generate it.

Quote:
There is a program called "star" that you may have better luck with. It seems to have more options.
I will certainly give it a try as it may have some more advanced recovery options for tar files.
 
Old 05-24-2005, 08:48 PM   #4
johnwyles
LQ Newbie
 
Registered: May 2005
Location: Austin, TX
Distribution: Gentoo
Posts: 7

Original Poster
HOORAY! I found out, thanks to Aaron M. Renn (creator of the gzip Recovery Toolkit), that a useful utility, cpio, can do the trick quite easily! Here is what I did:

Code:
$ cpio -ivd -H tar < new-try/new-try.tar
cpio: invalid header: checksum error
cpio: warning: skipped 23065728 bytes of junk
... EXTRACTED FILES DISPLAYED ...
Needless to say, I got back 99.9% of my corrupted tarball, and I am VERY happy (and very LUCKY)! I must have wasted two hours scouring about a thousand combinations of keywords on Google and come up with nothing. I hope this helps someone who is in as tight a bind as I was!
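A related trick, in case cpio's tar reader ever chokes as well: tar members sit on 512-byte block boundaries, so you can skip past a damaged region with dd and let tar pick up at the next valid header. A self-contained sketch (it deliberately corrupts its own demo archive; on a real archive you would choose the skip count from cpio's "skipped N bytes of junk" message rather than knowing it in advance):

```shell
# build a demo archive of two tiny files, then clobber the first header
mkdir -p d && printf 'one\n' > d/a.txt && printf 'two\n' > d/b.txt
tar cf broken.tar d/a.txt d/b.txt
dd if=/dev/urandom of=broken.tar bs=512 count=1 conv=notrunc 2>/dev/null

# the first member is 1 header block + 1 data block = 2 blocks; skip
# them and hand the rest of the stream to tar, which resumes at d/b.txt
mkdir -p out
dd if=broken.tar bs=512 skip=2 2>/dev/null | tar xvf - -C out
```

Everything before the skip point is lost, but the remainder of the archive extracts normally.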
 
Old 08-04-2005, 03:51 AM   #5
bmelim
LQ Newbie
 
Registered: Sep 2003
Distribution: RedHat 8
Posts: 1

Quote:
Originally posted by johnwyles
HOORAY! I found out, thanks to Aaron M. Renn (creator of the gzip Recovery Toolkit), that a useful utility, cpio, can do the trick quite easily! Here is what I did:

Code:
$ cpio -ivd -H tar < new-try/new-try.tar
cpio: invalid header: checksum error
cpio: warning: skipped 23065728 bytes of junk
... EXTRACTED FILES DISPLAYED ...
Needless to say, I got back 99.9% of my corrupted tarball, and I am VERY happy (and very LUCKY)! I must have wasted two hours scouring about a thousand combinations of keywords on Google and come up with nothing. I hope this helps someone who is in as tight a bind as I was!
Hi everyone, I seem to have the same problem with 2 tar files generated by Webmin.
Before I rebuilt my server, I backed up all my websites through Webmin's Filesystem Backup, which uses tar. Now that the server is rebuilt, I have tried to restore the 2.5GB and 3.5GB tar archives through Webmin.
I am getting the same error as John above. I've tried the Advanced Recovery software, and I've also tried John's solution; both got me nowhere.

Does anyone have any ideas? Please? I'm desperate and can't afford to lose 15+ websites. I've just realised that all the backups I've been taking through Webmin's Backup feature are corrupt.

Many thanks

Regards

Bruno

Last edited by bmelim; 08-04-2005 at 04:10 AM.
 
Old 04-16-2006, 01:12 PM   #6
Majin_Buu
LQ Newbie
 
Registered: Apr 2006
Posts: 9


I have the same problem. I've tried Advanced TAR Repair and the solution above, and none of them work.

I have 2 precious rare collections I want to use, but so far this broken tar is .... killing me

Code:
Test - Recover/gzrt-0.4$ cpio -ivd -H tar < Jennifer.tar.gz
cpio: invalid header: checksum error
Nothing happens...

I just want to fix this error and NEVER use tar for a backup again
 
Old 04-16-2006, 03:04 PM   #7
itz2000
Member
 
Registered: Jul 2005
Distribution: Fedora fc4, fc7, Mandrake 10.1, mandriva06, suse 9.1, Slackware 10.2, 11.0, 12.0,1,2 (Current)]
Posts: 732

Quote:
bmelim
Hey dude!
I had the same problem with 2GB of my webspace. Try asking the company hosting those sites whether they have any backup, A.S.A.P. It seems that only their backup works, while the cPanel backups are corrupted!
 
Old 04-16-2006, 07:33 PM   #8
syg00
LQ Veteran
 
Registered: Aug 2003
Location: Australia
Distribution: Lots ...
Posts: 11,799

Quote:
I just wanna fix this error and NEVER use tar for a backup
Can't help with the error, but I wouldn't be blaming tar. Backups is good, *good* backups is better.

You can verify a tar when you take it (some limitations), but that doesn't protect against media failure later on.
Nothing except multiple copies does.
I recommend dar - amongst many nice features it has CRC; as per the doco
Quote:
thanks to CRC (cyclic redundancy checks), dar is able to detect data corruption in the archive. Only the file where data corruption occurred will not be possible to restore, but dar will restore the others even when compression or encryption (or both) is used.
 
Old 04-17-2006, 04:44 AM   #9
Majin_Buu
LQ Newbie
 
Registered: Apr 2006
Posts: 9

Quote:
Originally Posted by syg00
Can't help with the error, but I wouldn't be blaming tar. Backups is good, *good* backups is better.

You can verify a tar when you take it (some limitations), but that doesn't protect against media failure later on.
Nothing except multiple copies does.
I recommend dar - amongst many nice features it has CRC; as per the doco
Thanks, but for now I'm just trying to recover those 2 rare things.

I gave up yesterday after 4 hours; I've never had such issues with rar or any other compression format.

I will check out dar.

Thanks again
 
  

