.tar.gz 111GB extracting fails
Hi guys
We are trying to move an Oracle applications database tier archive of 111GB from a Linux server to a USB external drive using cp. The file copies to the external drive successfully, but extracting it on a second machine always fails with an error saying the archive is corrupt. We have checked the integrity of the archive using 7-zip, which reports no errors, so our last few attempts have been completely frustrating. The interesting part is, if we transfer the file with scp instead, the extraction doesn't fail. Please let us know how we can successfully move this archive to the second machine, which is at a remote location with no possibility of setting up FTP for a file of this size. Both the source and destination systems run RHEL 5 Enterprise, 64-bit, with ext3 file systems. regards, |
I guess there is a problem with the filesystem on that usb stick.
|
What format was the USB drive?
The single file is, after all, over 100GB, so the manufacturer's default FAT32 cannot be used (FAT32 caps individual files at 4GB). For RHEL 5.11, use ext3 on the USB drive. |
Check the size of the USB drive. Is it greater than 111GB?
Run the 'badblocks' command to find bad sectors on the USB drive. |
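The badblocks check suggested above can be run read-only; a minimal sketch, assuming the USB disk shows up as /dev/sdb (confirm the device name with `fdisk -l` or `lsblk` before running it):

```shell
# Read-only scan of the USB disk for bad sectors (non-destructive).
# /dev/sdb is an assumed device name -- confirm it with `fdisk -l` first.
DEV=${1:-/dev/sdb}
if [ -b "$DEV" ] && [ -r "$DEV" ]; then
    # -s shows progress, -v lists each bad block found on stdout
    badblocks -sv "$DEV"
else
    echo "skipping: $DEV is not a readable block device here"
fi
```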
There have been problems with 7-zip corruption; I suggest you use something like bzip2 (or even tar and gzip).
|
Get the md5sum of the file.
After the transfer, check it again. If the checksums match, then there's nothing wrong with the file. Also check the versions of 7-zip on the two machines. |
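A minimal sketch of that checksum comparison; the archive name and the /mnt/usb mount point are illustrative assumptions, not taken from the thread:

```shell
# Compare checksums of the source archive and the copy on the USB disk.
# dbtier.tar.gz and /mnt/usb are assumed names.
SRC=${1:-dbtier.tar.gz}
DST=${2:-/mnt/usb/dbtier.tar.gz}
if [ -f "$SRC" ] && [ -f "$DST" ]; then
    a=$(md5sum "$SRC" | awk '{print $1}')
    b=$(md5sum "$DST" | awk '{print $1}')
    if [ "$a" = "$b" ]; then
        echo "OK: checksums match"
    else
        echo "MISMATCH: the copy differs from the source"
    fi
fi
```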
Quote:
Both the external HDDs are newly bought and checked thoroughly for errors. Most importantly, I don't have the same issue with the application tier archive, which is just 25GB in size. regards, ---------- Post added 05-21-15 at 03:22 PM ---------- Quote:
The USB HDD is formatted with ext3 (the same as the source system). regards, |
Quote:
And the other point was also to check the versions of whatever extraction tool you're using. |
I still think there is an issue either with that filesystem (I do not mean ext3 in general, but that particular one on the USB drive) or with the underlying hardware. You can try copying another big file (for example a video) and checking its integrity.
|
Quote:
I am copying the file to the external HDD once again now, and will post the md5sum results soon. regards, |
Quote:
This is how I built the external HDD:
1. Connected the new HDD to a Red Hat live system
2. Deleted the NTFS partition using fdisk
3. Created a new partition
4. Formatted it with mkfs.ext3
As I mentioned in my earlier posts, files as big as 25GB extract without any issues. Please let me know whether I am doing something wrong in the partitioning or formatting part. regards, |
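The steps described above correspond roughly to the following commands. These are destructive and /dev/sdb is an assumed device name, so this is a sketch of the procedure rather than something to paste blindly:

```shell
# DESTRUCTIVE: repartitions and reformats the disk. /dev/sdb is assumed;
# verify the real device name with `fdisk -l` before running anything like this.
fdisk /dev/sdb        # d = delete the NTFS partition, n = new partition, w = write
mkfs.ext3 /dev/sdb1   # format the new partition as ext3
mount /dev/sdb1 /mnt/usb
df -T /mnt/usb        # confirm the mounted filesystem really is ext3
```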
Quote:
I ran md5sum against both the source and destination files, and the checksums are the same. However, while going through the tar output, I could see multiple errors like the one quoted below Quote:
Other than the above, I can't see any other issues. regards, |
Probably there are different users (user IDs) configured, and the user/group that created the tar does not exist on the target host, so it has no rights to create the files/dirs.
Tar stores only numeric IDs, so the same username with different IDs on the two hosts can cause such conflicts. |
Quote:
As you can see, I posted this question in the beginners area, as my exposure to Linux is very limited. I interact with it regularly as a developer for Oracle applications, but I don't know much about how users and their IDs are involved in these things. Coming from a purely Windows environment, I understand that root has the highest level of privileges, and tar is only extracting the files! Does tar care which user created the tar file and who is extracting it? How about creating the same oraprod user on the target system and changing the owner of the tar.gz to oraprod? regards, |
First, just check whether that is really the case. You can use the command: id <username> and you will see the user IDs (numbers) on both hosts.
Next, you can try to extract using tar with the options --no-same-owner --no-same-permissions. But it would be nice if you told us the result of that check, and also which user was used and which commands were executed (exactly) to create and extract that tar file. |
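Those extraction flags can be demonstrated end to end. This self-contained sketch builds a small throwaway archive in a temp directory; in practice you would run `id oraprod` on both hosts and then extract the real archive the same way:

```shell
# Demo of --no-same-owner / --no-same-permissions on a throwaway archive.
# With these options tar ignores the uid/gid and mode bits stored in the
# archive, so a username that exists only on the source host cannot make
# the extraction fail.
tmp=$(mktemp -d)
cd "$tmp"
echo data > file
tar -czf archive.tar.gz file
mkdir extract
cd extract
tar --no-same-owner --no-same-permissions -xzf ../archive.tar.gz
cat file   # the extracted copy, owned by whoever ran tar
```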
Quote:
I created the user with the same name, changed the ownership of the archive, and started extracting once again. It looks like all the errors thrown were due to some read/write permissions. Once I get an output I will update you. regards, and thank you very much for the patience :) |
Quote:
I don't have a clue; the disk was remounted and I was able to extract the files. What I observed over almost 48 hours of effort:
if I move the tar.gz file using "cp", the md5sums are different;
if I move the same file using "scp", the md5sums are the same and I can extract the files without any problems using tar.
It looks kind of strange; however, I was finally able to extract the files on the second server, even though the process took around 834 minutes to extract the 111GB tar file. Thanks to everyone who contributed to the question. regards, |
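The cp-then-mismatch behaviour, and the fact that a remount fixed it, is consistent with cached writes not having reached the disk before it was unplugged. A hedged sketch of a safer copy routine (the archive name and mount point are illustrative assumptions):

```shell
# Copy, then force the data out of the page cache before trusting it.
# dbtier.tar.gz and /mnt/usb are assumed names.
SRC=${1:-dbtier.tar.gz}
DST=${2:-/mnt/usb/dbtier.tar.gz}
if [ -f "$SRC" ]; then
    cp "$SRC" "$DST"
    sync                    # flush cached writes to the USB disk
    # Remounting (umount /mnt/usb; mount /dev/sdb1 /mnt/usb) forces the
    # checksum below to be read back from the disk rather than the cache.
    md5sum "$SRC" "$DST"    # the two checksums must match before unplugging
fi
```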