LinuxQuestions.org


Xeratul 09-08-2012 02:39 PM

Alternative to rar to pack a hard disk image?
 
Hi,

In the past, I used rar to split my hard disk images for burning to CD-ROMs. I would like a method that is more Linux / open source.

Which solution would you use to do this?

Code:

# 'a' adds files to an archive, '-r' recurses into directories,
# '-v650000k' splits the output into CD-sized (~650 MB) volumes
rar a -r -v650000k harddisk_hdd2_tar.gz.rar harddisk_hdd2_tar.gz
thanks
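
(For reference, a rough open-source equivalent of the rar command above, assuming GNU tar and coreutils' split, with /mnt/hdd2 standing in for the data being archived:)

Code:

# pack and compress to stdout, then cut the stream into ~650 MB pieces
tar -czf - /mnt/hdd2 | split -b 650m - harddisk_hdd2.tar.gz.

# later: reassemble the pieces and unpack
cat harddisk_hdd2.tar.gz.* | tar -xzpf -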

John VV 09-08-2012 05:29 PM

How about the current fad, "xz"?

Though I do not think it will stay just a fad: it does CRUNCH those bits. A 334 MB ppm photo ends up as a 41 MB zip, or a 33 MB xz at the default level 6 (much like png does with images).
And for a disk, a "tar.xz".
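
A minimal sketch of the tar.xz route, assuming GNU tar built with xz support (the paths are placeholders):

Code:

# -J routes the archive through xz; -c create, -v verbose, -f output file
tar -cJvf harddisk_hdd2.tar.xz /mnt/hdd2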

jefro 09-09-2012 12:08 PM

The old way, or maybe even the common way, is to use bzip2 or gzip.

http://www.cyberciti.biz/howto/quest...heat-sheet.php

http://www.ghacks.net/2010/01/22/get...e-compression/

It also depends on the types of files and their general sizes; some compression formats work better on some data than others. The command-line 7-Zip (p7zip) is a pretty good choice for best compression.

Mr. Alex 09-09-2012 01:00 PM

Use tar with the preserve-permissions option, and then you can use bzip2 to compress; that is much more the UNIX way than rar. Also, compressing with any algorithm will waste your time if you are planning on compressing mixed/binary files, because rar/zip/gzip/bzip2/lzma/lzma2 are aimed at text-like data: they will not compress other types well, and it's very time-consuming. So you can just use good old tar.
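
A sketch of that, assuming GNU tar (-j routes through bzip2; -p preserves permissions on extraction; names are placeholders):

Code:

# create a bzip2-compressed tar
tar -cjvf harddisk_hdd2.tar.bz2 /mnt/hdd2

# extract it later, preserving permissions
tar -xpjvf harddisk_hdd2.tar.bz2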

Xeratul 09-23-2012 04:55 AM

Quote:

Originally Posted by jefro (Post 4776248)
The old way, or maybe even the common way, is to use bzip2 or gzip.

http://www.cyberciti.biz/howto/quest...heat-sheet.php

http://www.ghacks.net/2010/01/22/get...e-compression/

It also depends on the types of files and their general sizes; some compression formats work better on some data than others. The command-line 7-Zip (p7zip) is a pretty good choice for best compression.

Well, but bzip2 or gzip does not allow splitting into several parts the way rar does.
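
(As later replies point out, the cutting can be done by a separate tool; a sketch with coreutils' split, file names chosen for illustration:)

Code:

# cut a finished .gz into ~650 MB pieces named .aa, .ab, ...
split -b 650m harddisk_hdd2.tar.gz harddisk_hdd2.tar.gz.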

konsolebox 09-23-2012 05:06 AM

Hmm, 7za might do well too, and it also supports file splitting, though I haven't tried it yet. To be safe I'd still choose tar, with gz, or, if I'm patient, xz. (tar -cpv --xz -f ...)

Sometimes cpio would be a choice as well, but it depends. (cpio ... | xz -c ... > ...)
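
A sketch of the 7za route with volumes, assuming p7zip's -v switch (sizes and names are placeholders):

Code:

# create a 7z archive split into 650 MB volumes (.7z.001, .7z.002, ...)
7za a -v650m harddisk_hdd2.7z harddisk_hdd2.img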

414N 09-23-2012 05:09 AM

If the files to be compressed are hard disk images (i.e. single files), I see no point in "encapsulating" the file with tar first and then compressing it with gzip, bzip2, lzma (xz) etc. Am I wrong?
Personally, I'd go with p7zip as already suggested.
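
In that case the compressor can simply be pointed at the image itself; a minimal sketch with xz (the file name is a placeholder):

Code:

# compress the single image directly; -k keeps the original file
xz -k harddisk_hdd2.img    # produces harddisk_hdd2.img.xz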

konsolebox 09-23-2012 05:24 AM

Quote:

Originally Posted by 414N (Post 4787015)
If the files to be compressed are hard disk images (i.e. single files), I see no point in "encapsulating" the file with tar first and then compressing it with gzip, bzip2, lzma (xz) etc. Am I wrong?
Personally, I'd go with p7zip as already suggested.

What actually matters is how the output file is handled. Personally, I'd prefer something stream-like, where only a limited buffer is used to process the data and nothing is held back for a header placed at the beginning of the output file.

To make things clearer: I don't want the archiver to seek back to its header after writing all the data just to finalize the file. I'd prefer a stream-like format.

That isn't much of a problem for small archives, but I wonder about extremely large ones: could it cause hang-ups or extreme system slowdown? I don't want my hard disk grinding away at its swap partition.
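
(For what it's worth, gzip, bzip2 and xz all behave that way: they write sequentially with a bounded buffer and never seek back into the output, so a pipeline like the following sketch, with placeholder names, stays stream-like end to end:)

Code:

# every stage writes forward only; nothing seeks back into the output file
tar -cpf - /mnt/hdd2 | xz -c | split -b 650m - harddisk_hdd2.tar.xz.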

Xeratul 09-23-2012 05:58 AM

I am not so sure that p7zip is as much of a solution as tar, gz, bz...

p7zip is somewhat like using rar then...

Open source rocks. Well, if you want to use tar, then you have to fight to combine it with split... a built-in split option for tar would be long awaited.

jefro 09-23-2012 10:55 AM

There is a technical point to tar, actually, though it doesn't affect this situation: tar packs all the files into one single file, which is the old way to defrag. It is still one of the best ways to ensure all files are contiguous.

In any case, if you want to cut up a large file, you can use the aptly named tool split; then you can cat the pieces back together when needed.
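
A sketch of that round trip (numeric suffixes and an integrity check added for illustration):

Code:

# cut into .00, .01, ... pieces
split -d -b 650m harddisk_hdd2.tar.gz harddisk_hdd2.tar.gz.

# put them back together and test the result
cat harddisk_hdd2.tar.gz.* > rejoined.tar.gz
gzip -t rejoined.tar.gz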

