LinuxQuestions.org (/questions/)
-   Linux - General (https://www.linuxquestions.org/questions/linux-general-1/)
-   -   Unpack resume feature, rar, zip, gz, tar, 7z (https://www.linuxquestions.org/questions/linux-general-1/unpack-resume-feature-rar-zip-gz-tar-7z-743710/)

Z505 07-29-2009 11:08 AM

Unpack resume feature, rar, zip, gz, tar, 7z
 
Is there any command-line unpacker with a resume feature? I am having problems unpacking some large multi-gigabyte files on a shared web server. The server seems to kill the command after a certain amount of time, because they monitor CPU usage or have some detection in place. It would be great if I could do a certain amount of unpacking, wait for a while, and then do more unpacking.

An example of a "resume" feature is FTP: when you upload files to an FTP server, there is often a resume feature in case the connection breaks. What I need is a little different: a simple command-line unpacker that allows resuming.

Any unpacker is fine: rar, gz, tar, 7zip, zip, etc. I just need one with resume!


Since my problem is CPU usage over a certain amount of time, the last resort is to recompile the unpacker and insert some nanosleep() or sleep() or select() calls into the right loops to reduce CPU load. I would hate to do this if there is an easier alternative. Or, as another alternative, I could build a resume feature into an existing unpacker - but I don't want to code that sh*t if I don't have to.

By the way, I have tried using the nice command to reduce CPU usage, but the process still takes up over 30 percent CPU, so the web server still doesn't like it for extended periods and still cuts the unpacking process off. I tried cron jobs, and executing it from my web browser through a nifty CGI program. The same thing happens: the server kills the process after a while, before it finishes unpacking.
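For reference, here is roughly what the nice approach looks like; the unpacker name and archive name are placeholders, and a trivial command stands in so the example actually runs:

```shell
# -n 19 asks for the lowest CPU-scheduling priority. In real use this
# would be something like: nice -n 19 unrar x archive.rar
nice -n 19 sh -c 'echo "running at low priority"'

# util-linux ionice can additionally drop disk-I/O priority to the
# "idle" class (-c 3), which may matter as much as CPU priority on a
# busy shared host:
#   nice -n 19 ionice -c 3 unrar x archive.rar
```

Note that nice only lowers scheduling priority; an otherwise idle machine will still show the process using lots of CPU, which is consistent with what is described above.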

Note: I will not buy a dedicated server for this small project (so please don't suggest that, I considered it ;-))

Guttorm 07-30-2009 09:11 AM

Hi

I've never seen any "resume" on unpacking, other than unpacking little by little. (You can usually specify which files to unpack, so that not everything is unpacked at once.)

So one solution would be to write a script that lists the contents of the archive and then unpacks it file by file. But I guess it won't be easy if some of the files are very big.
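A minimal sketch of that idea, using tar (the same pattern works with `unzip -Z1` / `unzip` for zip files). For the demo a small sample archive is built first; in reality archive.tar would be the big upload. Note the cost Guttorm mentions: extracting one member per invocation rescans the archive every time, which gets slow for huge files.

```shell
# Demo setup: build a small sample archive (stand-in for the real one).
mkdir -p src out
echo one > src/a.txt
echo two > src/b.txt
tar -cf archive.tar -C src a.txt b.txt

# List the entries, then extract them one at a time with a pause in
# between, so CPU load comes in short bursts instead of one long run.
tar -tf archive.tar | while IFS= read -r entry ; do
    tar -xf archive.tar -C out "$entry"
    sleep 1        # much longer in practice, e.g. sleep 30
done
```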

Another way would be to pipe the output of the unpacker into a very slow reader script, so that once the pipe buffer fills, the unpacker blocks and is throttled. For example:

tar -xvf archive.tar | while IFS= read -r line ; do
    echo "$line"
    sleep 10
done

Yet another way would be to not use compression at all - maybe just a plain tar? It will use a lot more I/O, both for the file transfer and the disk access, but maybe that's what the admins of the server want?
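Plain tar also allows a crude resume, assuming GNU tar: its --skip-old-files option leaves files that already exist on disk untouched, so if the host kills the extraction, simply re-running the same command continues roughly where it left off. A small self-contained demonstration:

```shell
# Demo setup: build a small archive (stand-in for the real upload).
mkdir -p src out
echo first > src/a.txt
echo second > src/b.txt
tar -cf archive.tar -C src a.txt b.txt

# Simulate a first run that was killed after extracting only a.txt,
# then mark the file so we can prove it is not rewritten.
tar -xf archive.tar -C out a.txt
echo "kept from first run" > out/a.txt

# The "resumed" run: existing files are skipped, missing ones extracted.
tar -x --skip-old-files -f archive.tar -C out
```

The granularity is whole files, so a partially written large file would still be restarted from scratch (or skipped half-finished), but for archives of many files it is a cheap resume.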

nowonmai 07-30-2009 09:54 AM

Because of how pretty much all compressors work (they are variations on Lempel-Ziv), a file can only be decompressed from the top down. The decompression dictionary is built on the fly by the decompressor as it runs, so a resumed run would have to start again from the top and rebuild the dictionary every time.
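A quick way to see this, assuming gzip is installed: chop the front off a gzip stream and the remainder is undecodable, because the header and all the decoder state built from the earlier bytes are gone.

```shell
# Build a small gzip file.
seq 1 2000 > data.txt
gzip -c data.txt > data.gz

# Try to decompress starting from byte 101. gunzip fails: there is no
# way to "resume" mid-stream without the state from the bytes before it.
if tail -c +101 data.gz | gunzip > /dev/null 2>&1 ; then
    echo "decoded mid-stream (unexpected)"
else
    echo "cannot decode mid-stream"
fi
```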
