Cannot extract a tarfile that I created, error states stdin: not in gzip format
Linux - Server: This forum is for the discussion of Linux software used in a server-related context.
Welcome to LinuxQuestions.org, a friendly and active Linux Community.
I had a number of .iso images that I mounted via loopback devices into a single directory tree. I then took that directory (and all subdirectories) and ran the tar command (see below) to compress them and transfer them to a NAS over the network. When I copied the file (it is 38 GB) back to the server and attempted to extract it with tar, I received:
gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error exit delayed from previous errors
I then ran the file command (see below) and it reported the type as "data".
How can I retrieve the data? Below is a listing of all the commands I ran to mount the images, then compress and copy them to the NAS.
Background.
1. We copied each DVD to an image file using the dd command:
nohup dd if=/dev/scdx of=/mnt/yyyymmdd.iso
2. Then we mounted a number of the images to directories:
mount -t iso9660 -o loop=/dev/loopX /mnt/yyyymmdd.iso /mnt/yyyymmdd (where X = loop number, e.g. /dev/loop2)
3. Once the images were mounted we could see the file structure. We mounted multiple disks into one directory tree, then used the tar command to compress and copy the directory over to the NAS via the network:
nohup tar cvzf - * | ssh root@nas 'cd /Data; cat > filename.gz'
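When an archive is streamed over the network like this, it's worth capturing a checksum so a later copy can be verified. A minimal local sketch of the idea (all directory and file names here are made up; in practice the second hash would come from the copy fetched back from the NAS, e.g. via ssh root@nas "md5sum /Data/filename.gz"):

```shell
#!/bin/sh
# Build a small archive, make a second copy standing in for the
# NAS round trip, and compare checksums; a mismatch means the
# copy was corrupted in transit. All paths are illustrative.
mkdir -p /tmp/demo
echo "payload" > /tmp/demo/file.txt
tar czf /tmp/filename.gz -C /tmp/demo .   # original archive
cp /tmp/filename.gz /tmp/copy.gz          # stands in for the NAS round trip
md5sum /tmp/filename.gz /tmp/copy.gz      # the two hashes should match
```

Had a checksum been recorded at archive time, it would now be easy to tell whether the 38 GB file on the server still matches what was originally sent.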
When I copied the file back to the server to uncompress it, I received the following errors:
root@secant:/Mo3# tar -xzf 9-2.gz
gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error exit delayed from previous errors
Then I tried to rename it.
root@secant:/Mo3# mv 9-2.gz 9-2.tar.gz
root@secant:/Mo3# tar -xzf 9-2.tar.gz
gzip: stdin: not in gzip format
tar: Child returned status 1
tar: Error exit delayed from previous errors
When I ran the file command, it reported the type as "data":
root@secant:/Mo3# file 9-2.tar.tgz
9-2.tar.tgz: data
I have been hunting around for a solution, but have not been successful. Any thoughts?
I don't think it is an image file; I believe it was compressed, but I can't get it extracted.
Perhaps it was mangled on the way into the NAS via SSH, or mangled when copied from the NAS. Here are a couple of things to try:
Run this:
od -c 9-2.gz | head
and tell us what it says. Maybe it will reveal that it wasn't compressed. (Forgot the tar 'z' option when created?)
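For comparison, every gzip stream begins with the magic bytes 0x1f 0x8b, which `od -c` prints in octal as 037 213. A quick way to see what a healthy file looks like (using a throwaway file):

```shell
# Make a known-good gzip file and look at its first bytes.
# A valid gzip stream always starts with \037 \213 (0x1f 0x8b).
printf 'hello\n' | gzip > /tmp/demo.gz
od -c /tmp/demo.gz | head -n 1
```

A plain uncompressed tar, by contrast, would begin with the first member's filename, so readable text at the start of the od output would suggest trying tar without the 'z' option.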
Copy the histogramming (character frequency counting) perl script below into "hist.pl" and run:
perl hist.pl 9-2.gz
and tell us what it says. For a gzip file of any substantial size, you should see some occurrences of every character value from 0 to 255. If your copy into the NAS or out of the NAS messes with line terminators, you may see things like all CR deleted, or all CR mapped to LF. (If this is the problem, it is possible to recover -- I have done it before -- but it requires extreme effort, only worthwhile if it's truly critical data. There's a discussion of how at http://bukys.com/services/recovery/examples/.)
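To see how this kind of line-terminator mangling destroys a compressed file, here is a small demonstration. It compresses random data so the resulting stream is essentially certain to contain some CR bytes; the paths are throwaway:

```shell
# Compress some random data, then strip every CR byte (\r) the way
# a text-mode transfer might. gzip's built-in integrity test
# (gzip -t) passes on the original and fails on the mangled copy.
head -c 10000 /dev/urandom | gzip > /tmp/good.gz
tr -d '\r' < /tmp/good.gz > /tmp/mangled.gz
gzip -t /tmp/good.gz && echo "good.gz: OK"
gzip -t /tmp/mangled.gz 2>/dev/null || echo "mangled.gz: corrupt"
```

Running `gzip -t` on your 9-2.gz is also a quick way to confirm whether gzip can make any sense of it at all.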
How are you copying your files from the NAS? NFS mount? Something else?
Does this command:
ssh root@NAS "cat /Data/filename.gz"
produce the same file?
Here's my hist.pl perl script:
Code:
#!/usr/bin/perl -w
use strict;

die "filename arguments expected\n" if ($#ARGV < 0);

foreach my $filename (@ARGV) {
    if (!open IN, "<$filename") {
        warn "can't open '$filename'\n";
        next;
    }
    print "$filename\n";
    binmode(IN);

    # Count how often each byte value 0..255 occurs.
    my @hist = ();
    my $total = 0;
    while (read IN, my $buf, 1024) {
        foreach my $octet (unpack "C*", $buf) {
            $hist[$octet]++;
            $total++;
        }
    }
    close(IN);

    # Print the count and relative frequency of every byte value.
    for (my $i = 0; $i < 256; $i++) {
        my $count = $hist[$i] || 0;
        my $p = sprintf("%.5f", $count/$total);
        print "[$i] $count $p\n";
    }
    print "total $total\n\n";
}
exit 0;