01-28-2014, 04:47 AM | #1
sajalmalhotra
LQ Newbie | Registered: Jul 2011 | Posts: 3
Using the tar utility to untar (extract) from a continuous input stream
Hi,
I was exploring whether, by using "tar -xvf -" or some other option, I could untar a live stream of tar input.
Currently in my application:
- A ".tar" file is fetched by my application from a remote server.
- I receive this file in multiple chunks of 256 bytes each.
- I assemble these chunks to form the original ".tar" file and then feed it to the "tar" utility to extract its contents to my local flash drive.
I am looking for the following change in my application:
Is there a way, using the "tar" utility, to feed the 256-byte chunks into it directly and have it start extracting the contents as they arrive?
That way I would not have to wait for all the chunks to be received from the server and assembled before feeding them to "tar".
The pseudo-algorithm I want to achieve:
1. Get a chunk of the master .tar file from the server.
2. Feed this chunk to the "tar" utility to extract the contents.
3. Discard the chunk.
4. Go to step 1 until all chunks are downloaded.
I tried "cat <chunk file name> | tar -xvf -", but this command needs all the file chunks to be provided in one go. My current flow is sketched below.
Last edited by sajalmalhotra; 01-28-2014 at 04:49 AM.
01-28-2014, 05:02 AM | #2
druuna
LQ Veteran | Registered: Sep 2003 | Posts: 10,532
Quote:
Originally Posted by sajalmalhotra
I was exploring whether, by using "tar -xvf -" or some other option, I could untar a live stream of tar input.
Currently in my application:
- A ".tar" file is fetched by my application from a remote server.
- I receive this file in multiple chunks of 256 bytes each.
- I assemble these chunks to form the original ".tar" file and then feed it to the "tar" utility to extract its contents to my local flash drive.
Maybe I'm reading it wrong, but this looks like a hard way to do it.
Normally you tar something, which creates one tar file (say foo.tar). This file can be fetched or sent to another machine using ftp, scp, rsync, etc. Once the complete file has been transferred you can untar it. There is no need to reassemble chunks at all.
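For example, something along these lines (host and file names are just placeholders):
Code:
# Fetch the complete archive in one go, then extract it locally.
scp user@remotehost:/path/foo.tar /tmp/
tar -xvf /tmp/foo.tar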
Tar does need a valid/complete file in order to do its thing.
Could you explain what you are actually trying to do?
01-28-2014, 07:03 AM | #3
Guttorm
Senior Member | Registered: Dec 2003 | Location: Trondheim, Norway | Distribution: Debian and Ubuntu | Posts: 1,466
Hi
You don't say how you get those 256-byte chunks. Is there some command that will fetch a chunk? If so, you could do something like this:
Code:
(
  tempfile=/tmp/chunk.file
  while true; do
    # Fetch the next chunk (replace scp with whatever actually delivers a chunk)
    scp remotecomputer:/chunk "$tempfile"
    if [ -s "$tempfile" ]; then
      cat "$tempfile"   # write the chunk's bytes into the pipe
    else
      break             # empty/missing chunk: transfer finished
    fi
  done
) | tar -xvf -
Last edited by Guttorm; 01-28-2014 at 07:05 AM.
01-28-2014, 10:04 PM | #4
sajalmalhotra (Original Poster)
LQ Newbie | Registered: Jul 2011 | Posts: 3
Hi druuna, Guttorm,
Thanks a lot for the quick replies.
My problem is that the platform I am using has very little RAM left (8 MB).
There are occasions when I get a tar file larger than 8 MB, which I then have to extract and store in flash.
The remote server sends this tar file in multiple chunks of 256 bytes each, which I reassemble in RAM and then feed to the tar utility to extract to flash. However, since I am now short of RAM and the files being downloaded from the server are getting bigger, I cannot wait for the entire tar file to be reassembled before I extract it.
An alternative would be to store the chunks in flash and, once all chunks are downloaded, feed them to tar. But this would slow down the whole process.
Hence I was looking for a solution where I need not wait for the entire file to be reassembled before feeding it to the tar utility: as soon as I receive a chunk, I feed it to tar and drop that chunk from RAM.
I am quite pessimistic about the possibility of this, but so far I have not found a documented reason anywhere on the internet why it should not be possible.
@Guttorm: In your solution as well, the entire while loop has to terminate before the chunks are fed into the tar utility through the pipe, i.e. all the chunks would first be downloaded and then fed to tar.
01-29-2014, 02:12 AM | #5
Guttorm
Senior Member | Registered: Dec 2003 | Location: Trondheim, Norway | Distribution: Debian and Ubuntu | Posts: 1,466
Hi
I think you are mistaken. When the entire while loop is redirected into the tar command, it sends the chunks one by one to tar, which unpacks as soon as it has enough data and then waits for more. There is a pipe buffer involved, so it will probably take a few chunks before the buffer fills and tar starts receiving data.
But the scp command is obviously wrong. You will have to replace it with a command that gets a chunk and saves it to the temp file. If you get the data into a shell variable, you could use echo instead of cat, but you would have to prevent it from adding a newline or mangling the binary data. Bash has some echo options that can help with this, but they are not standard.
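For example, assuming a hypothetical get_next_chunk command that writes one 256-byte chunk to the given file (and writes nothing once the transfer is finished), the loop could look like this:
Code:
(
  tempfile=/tmp/chunk.bin
  # get_next_chunk is a placeholder for whatever actually fetches one chunk
  while get_next_chunk "$tempfile" && [ -s "$tempfile" ]; do
    cat "$tempfile"     # stream the raw bytes into the pipe
    rm -f "$tempfile"   # the chunk can be discarded as soon as it is written
  done
) | tar -xvf - -C /mnt/flash
Here tar reads from the pipe as the chunks arrive, so at any moment only one chunk plus the kernel pipe buffer (typically 64 KB on Linux) sits in RAM; /mnt/flash is just a stand-in for wherever the flash drive is mounted.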