Old 10-12-2005, 07:04 AM   #1
JohnKFT
Member
 
Registered: Aug 2003
Location: NW Scotland
Distribution: Slackware 10
Posts: 169

Rep: Reputation: 30
Problem decompressing large archive


I stored a load of jpeg photos in a tgz file of about 1200MB. When I decompress it I get a folder of about 500MB before it stops with this complaint:

Code:
gzip: stdin: invalid compressed data--format violated
tar: Unexpected EOF in archive
tar: Unexpected EOF in archive
tar: Error is not recoverable: exiting now

The 500MB folder I did get out is not corrupt. I used tar -xzvf. Is there any way I can try to recover the remaining files, assuming they are not corrupted?

PS No, I do not have another up-to-date backup!! Please don't say anything about this, my wife has already done so!
 
Old 10-12-2005, 07:42 AM   #2
freakyg
Member
 
Registered: Apr 2005
Distribution: LFS 5.0 and 6.1
Posts: 705

Rep: Reputation: 30
1200MB is kinda big; you should have split it into 400MB chunks. That way they could be burned to CD-R for a proper backup.
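For example, something like this would have done it (a minimal sketch; photos/ and the output names are just placeholders):

Code:
# Stream the tarball straight into split, in 400MB pieces for CD-R
tar czf - photos/ | split -b 400m - photos.tgz.part-

# To restore: concatenate the pieces back together and extract
cat photos.tgz.part-* | tar xzf -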
 
Old 10-12-2005, 08:07 AM   #3
piete
Member
 
Registered: Apr 2005
Location: Havant, Hampshire, UK
Distribution: Slamd64, Slackware, PS2Linux
Posts: 465

Rep: Reputation: 44
On the assumption that (for whatever reason) tar doesn't like untarring 1200MB, why not try extracting smaller numbers of files at a time? You should be able to find all the info you're looking for in `man tar` (which is what I'm working from, so I don't know if this'll work!). But basically, you should be able to run:

Code:
tar xzf myarchive.tgz a_file_in_myarchive
or

Code:
tar xzf myarchive.tgz a_directory_in_myarchive
And it'll pull out either a single file or a single directory.

Alternatively, you can queue them up:

Code:
tar xzf myarchive.tgz fileone filetwo filethree fileseven directoryone directorythree directoryeleven
Some gratuitous scripting with judicious amounts of awk will let you extract 10, 20, 400 files at a time. It'll probably take longer than doing it all at once, but it should work.

Of course, extracting all at once will cause it to fail on finding a corrupt file and then exit, leaving you with 700mb of stuff still in the archive. If you extract one file at a time, then you may find that one file is corrupt, but the rest after it are not: by doing it one at a time, only that iteration will fail, the rest will continue as normal.
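For instance, here's a minimal sketch of that one-at-a-time loop (plain shell rather than awk; myarchive.tgz is a placeholder, and note the listing itself will also stop at the corrupt point):

Code:
# List every member of the archive, then extract them one at a time,
# logging failures instead of aborting the whole job
tar tzf myarchive.tgz > members.txt
while read -r member; do
    tar xzf myarchive.tgz "$member" || echo "failed: $member" >> failed.txt
done < members.txt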

Let me know if you need a hand with that script, or if you find a better solution!

- Piete.
 
Old 10-12-2005, 08:43 AM   #4
dennisk
Member
 
Registered: May 2004
Location: Southwestern USA
Distribution: CentOS
Posts: 279

Rep: Reputation: 30
I wonder, could this be an issue of running out of space on that partition or where tar creates tmp files? In your case tar would need about 2400MB free (or a bit more) to uncompress everything.
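If that's the suspicion, a quick check before extracting (a sketch; the path is a placeholder for wherever you're unpacking):

Code:
# Show free space on the filesystem you're extracting into
df -h /path/to/extraction/dir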

Dennisk
 
Old 10-13-2005, 04:15 PM   #5
JohnKFT
Member
 
Registered: Aug 2003
Location: NW Scotland
Distribution: Slackware 10
Posts: 169

Original Poster
Rep: Reputation: 30
Thanks folks.

Yes, I should have made smaller ones and put them straight onto CD. I will know better and do so next time!

Plenty of space on the partition for temporary files, but it is a factor I had not considered - will note for the future.

I found the tar man page somewhat confusing, and it certainly gave me no indication that I could extract only selected files. So I enthusiastically tried that, but could not get past that blockage when searching for the remaining files and directories that I know should be in the archive - same error every time. Is there anything else I can try, or am I to chalk it up to experience? Could all the remaining files be corrupted? The error about EOF seems to hint at a blockage beyond which all might be well.

Be most grateful for any ideas, however wild or unlikely - nothing to lose.

John

PS For anyone else trying this: it took me ages of experimenting before I realised that the full path to each file/directory is needed. I.e. to extract the file rubbish.txt from drivel.tar.gz you must do:

Code:
tar -xzvf drivel.tar.gz drivel/rubbish.txt
and put the f option last.
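A related trick if you don't know a member's exact path: list the archive's contents first (a sketch using the same example names; the listing will also stop at any corrupt point):

Code:
# List the archive's contents to find a member's full path
tar -tzf drivel.tar.gz | grep rubbish.txt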
 
Old 10-13-2005, 07:23 PM   #6
uopjohnson
Member
 
Registered: Jun 2004
Location: San Francisco
Distribution: Slackware, Ubuntu, RHEL, OS X
Posts: 159

Rep: Reputation: 30
Have you attempted to open the archive in the GUI? I know, I know, don't yell at me! CLI is god... I worship the CLI...
Maybe then you can click and extract large numbers of files at once and get around the block.
 
Old 10-13-2005, 08:18 PM   #7
dunric
Member
 
Registered: Jul 2004
Distribution: Void Linux, former Slackware
Posts: 498

Rep: Reputation: 100
A tgz should be a gzipped tar archive, so the first basic check of whether it is corrupt is to run gzip -t <archive_name>.tgz to test its integrity.
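For example (a minimal sketch; photos.tgz stands in for your archive - gzip -t prints nothing at all when the stream is intact):

Code:
# Test the gzip stream; a non-zero exit status means corruption
gzip -t photos.tgz && echo "stream OK" || echo "stream corrupt"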
 
Old 10-14-2005, 05:44 PM   #8
JohnKFT
Member
 
Registered: Aug 2003
Location: NW Scotland
Distribution: Slackware 10
Posts: 169

Original Poster
Rep: Reputation: 30
Thanks dunric. Gave it a go this morning and it whirred away for a while then produced the same error message - not even a list of the files I did get out.

Anything else I can try?
 
Old 10-14-2005, 07:06 PM   #9
KnightHawk
Member
 
Registered: Sep 2005
Posts: 128

Rep: Reputation: 15
For the record, tar has zero problems with large files. I've tarred up some mighty ungodly sized files before and had no problems.


If gzip -t is turning up errors, then that pretty much tells you the archive is corrupted... is it possible you didn't actually use gzip?
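One quick sanity check (a sketch; photos.tgz again stands in for your archive):

Code:
# file(1) reports the actual format - gzip, bzip2, plain tar, etc.
file photos.tgz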
 
  

