LinuxQuestions.org > Linux - Server (https://www.linuxquestions.org/questions/linux-server-73/)

Cannot extract a tarfile that I created, error states stdin: not in gzip format
(https://www.linuxquestions.org/questions/linux-server-73/cannot-extract-a-tarfile-that-i-created-error-states-stdin-not-in-gzip-format-754034/)

Mo-regard 09-09-2009 09:33 PM

Cannot extract a tarfile that I created, error states stdin: not in gzip format
 
I had a number of .iso images that I mounted via loopback devices under a single directory. I then took that directory (and all its subdirectories) and ran the tar command (see below) to compress them and transfer the archive to a NAS over the network. When I copied the file (it is 38 GB) back to the server and attempted to extract it with tar, I received:

gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error exit delayed from previous errors

I then ran the file command (see below) and it reported the type as "data".

How can I retrieve the data? Below is a listing of all the commands I ran to mount the images, then compress and copy them to the NAS.


Background.
1. We copied the DVD using the dd command to an image file
nohup dd if=/dev/scdx of=/mnt/yyyymmdd.iso

2. Then mounted a number of the images above to a directory
mount -t iso9660 -o loop=/dev/loopX /mnt/yyyymmdd.iso /mnt/yyyymmdd (where X is the loop number, e.g. /dev/loop2)

3. Once the images were mounted we were able to see the file structure. We mounted multiple disks into a single directory structure, then used the tar command to compress and copy the directory over to the NAS via the network
nohup tar cvzf - * | ssh root@nas 'cd /Data; cat > filename.gz'
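The step-3 pipeline can be sketched locally and made safer by checksumming the stream as it leaves the sender, so a corrupted copy on the NAS is caught before the source data is discarded. Below is a minimal simulation, not the original setup: `cat` stands in for the ssh hop, and `/tmp/tardemo` and the file names are scratch placeholders.

```shell
# Simulate the tar-over-the-network pipeline locally; 'cat' (via tee)
# stands in for 'ssh root@nas ...'. All paths are scratch placeholders.
set -e
d=/tmp/tardemo; rm -rf "$d"; mkdir -p "$d/src"; cd "$d"
echo "sample" > src/file.txt

# Compress and stream, checksumming the bytes as they leave the sender
tar czf - src | tee archive.tar.gz | md5sum > sender.md5

# On a real NAS you would re-checksum the stored copy on that side,
# e.g. ssh root@nas 'md5sum /Data/filename.gz', and compare
md5sum < archive.tar.gz > receiver.md5

# gzip -t verifies the compressed stream without extracting it
gzip -t archive.tar.gz && echo "archive OK"
```

If the two checksums differ, the transfer (not tar) corrupted the file, which matches the symptom described in this thread.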


When I copied the file back to the server to uncompress it, I received the following errors:



root@secant:/Mo3# tar -xzf 9-2.gz

gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error exit delayed from previous errors


Then I tried to rename it.

root@secant:/Mo3# mv 9-2.gz 9-2.tar.gz
root@secant:/Mo3# tar -xzf 9-2.tar.gz
gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error exit delayed from previous errors
root@secant:/Mo3# tar xzf 9-2.tar.gz

When I ran the file command it reported the type as "data":


root@secant:/Mo3# file 9-2.tar.tgz
9-2.tar.tgz: data

Have been hunting around for a solution, but have not been successful. Any thoughts?
I don't think it is an image file; I believe it was compressed, but I can't get it extracted.
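When file(1) says just "data", it matched no known signature. A quick way to double-check is to look directly at the bytes where the magic numbers live. The demo below builds a known-good pair of files in a scratch directory (`/tmp/magicdemo` is a placeholder, not from the post) so the two signatures can be compared against the broken archive:

```shell
set -e
d=/tmp/magicdemo; rm -rf "$d"; mkdir -p "$d"; cd "$d"
echo hi > member.txt
tar cf plain.tar member.txt            # uncompressed tar
gzip -c plain.tar > compressed.tar.gz  # gzipped tar

# A gzip stream always begins with the magic bytes 1f 8b
head -c 2 compressed.tar.gz | od -An -tx1   # prints: 1f 8b

# An uncompressed tar has the string "ustar" at byte offset 257
dd if=plain.tar bs=1 skip=257 count=5 2>/dev/null; echo   # prints: ustar
```

Running the same two checks against 9-2.gz would show whether it is gzip, plain tar, or neither.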

knudfl 09-10-2009 03:29 AM

Unpacking

1) file.tar : 'tar xvf file.tar'
2) file.tar.gz : 'tar xvf file.tar.gz' ( older systems : tar zxvf )
3) file.tgz : 'tar xvf file.tgz' ( older systems : tar zxvf )
4) file.tar.bz2 : 'tar xvf file.tar.bz2' ( older systems : tar jxvf )

5) file.gz : 'gunzip file.gz'

Please read 'man tar' and 'man gunzip'
.....
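To illustrate the list above: modern GNU tar auto-detects the compression format on extraction, so the z/j flags matter mainly when creating archives or on older systems. A quick check, with throwaway paths:

```shell
set -e
d=/tmp/untardemo; rm -rf "$d"; mkdir -p "$d"; cd "$d"
echo hello > a.txt
tar czf a.tar.gz a.txt   # creating *does* need the z flag
rm a.txt
tar xvf a.tar.gz         # extracting does not; tar sniffs the gzip header
cat a.txt                # prints: hello
```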

AlucardZero 09-10-2009 09:07 AM

Sounds like it's not gzipped. Just untar it: tar xvf 9-2.gz

lbukys 09-10-2009 09:37 AM

things to try
 
Perhaps it was mangled on the way into the NAS via SSH, or mangled when copied from the NAS. Here are a couple of things to try:
  • Run this:
    od -c 9-2.gz | head
    and tell us what it says. Maybe it will reveal that it wasn't compressed. (Forgot the tar 'z' option when created?)
  • Copy the histogramming (character frequency counting) perl script below into "hist.pl" and run:
    perl hist.pl 9-2.gz
    and tell us what it says. For a gzip file of any substantial size, you should see some occurrences of every character value from 0 to 255. If your copy into or out of the NAS messes with line terminators, you may see things like all CR deleted, or all CR mapped to LF. (If this is the problem, it is possible to recover -- I have done it before -- but it requires extreme effort, only worthwhile if it's truly critical data. There's a discussion of how at http://bukys.com/services/recovery/examples/.)
  • How are you copying your files from the NAS? NFS mount? Something else?
    Does this command:
    ssh root@NAS "cat /Data/filename.gz"
    produce the same file?
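The line-terminator mangling described above is easy to reproduce: deleting every CR byte (0x0d) from a gzip stream, as a text-mode transfer might, yields a file that gzip rejects on its integrity check. A sketch with scratch paths (all names here are placeholders):

```shell
set -e
d=/tmp/crdemo; rm -rf "$d"; mkdir -p "$d"; cd "$d"
head -c 100000 /dev/urandom > payload
gzip -c payload > good.gz
tr -d '\r' < good.gz > mangled.gz   # strip every 0x0d byte, as a
                                    # text-mode transfer might

gzip -t good.gz && echo "good.gz: OK"
gzip -t mangled.gz 2>/dev/null || echo "mangled.gz: rejected"
```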

Here's my hist.pl perl script:

Code:

#!/usr/bin/perl -w
use strict;

die "filename arguments expected\n" if ($#ARGV < 0);

foreach my $filename (@ARGV) {
    if (!open IN, "<$filename") {
        warn "can't open '$filename'\n";
        next;
    }
    print "$filename\n";
    binmode(IN);
    my @hist = ();
    my $total = 0;
    while (read IN, my $buf, 1024) {
        foreach my $octet (unpack "C*", $buf) {
            $hist[$octet]++;
            $total++;
        }
    }
    close(IN);

    for (my $i = 0;  $i < 256;  $i++) {
        my $count = $hist[$i] || 0;
        my $p = sprintf("%.5f", $count/$total);
        print "[$i] $count $p\n";
    }
    print "total $total\n\n";
}
exit 0;


Mo-regard 09-22-2009 11:51 AM

Resolved, the files were corrupt
 
Thanks for your help everyone, but the files on the NAS ended up being corrupt.

