Alternative to rar for packing a hard disk image?
Hi,
In the past, I used rar to pack my hard disk images for burning to CD-ROMs. I would like a method that is more Linux / open source. Which solution would you choose for this? Code:
rar a -r -v650000k harddisk_hdd2_tar.gz.rar harddisk_hdd2_tar.gz |
How about the current fad,
"xz"? Though I do not think it will be just a fad; it does CRUNCH those bits. A 334 MB ppm photo ends up as a 41 MB zip, or a 33 MB xz at the default level 6 (much like png images), and for a disk, a "tar.xz". |
The old way, or maybe even the most common way, is to use bz2 or gz.
http://www.cyberciti.biz/howto/quest...heat-sheet.php http://www.ghacks.net/2010/01/22/get...e-compression/ It also kind of depends on the types of files and their general sizes; some compression types work better on some files than others. The command-line 7-zip is a pretty good choice for best compression. |
Use tar with the preserve-permissions option, and then you can use bzip2 to compress. That is much more the UNIX way than rar. Also, compressing with any kind of algorithm will waste your time if you are planning on compressing mixed/binary files, because rar/zip/gzip/bzip2/lzma/lzma2 are aimed at text-like data. They will not compress other types well, and it's very time-consuming. So you can just use good old tar.
|
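A minimal sketch of the tar + bzip2 route; /mnt/hdd2 is a placeholder path. Note that tar records ownership and permissions at creation time regardless; the -p flag matters at extraction, where it restores them exactly as stored.

```shell
# Create and bzip2-compress in one step (-j selects bzip2):
tar -cjf hdd2.tar.bz2 /mnt/hdd2

# Extract later, preserving the stored permissions:
tar -xpjf hdd2.tar.bz2
```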
Hmm.. 7za might do well too, but I haven't tried it yet; it also supports file splitting. I'd still choose tar to be safe, with gz, or, if I'm patient, xz. (tar -cpv --xz -f ...)
Sometimes cpio would be a choice as well, but it depends. (cpio ... | xz -c ... > ...) |
If the files to be compressed are hard disk images (i.e. single files), I see no point in "encapsulating" the file with tar first and then compressing it with gzip, bzip2, lzma (xz) etc. Am I wrong?
Personally, I'd go with p7zip as already suggested. |
Quote:
To make things clearer: I don't want the archiver to go back to its header after writing all the data just to finalize the file. I'd prefer a stream-like format. That isn't much of a problem for small archives, but I wonder about extremely large ones. I wonder whether it could somehow cause hang-ups or extreme system slow-down. I don't want to risk my hard disk thrashing in its swap partition. |
I am not so sure that p7zip is a solution in the same sense as tar, gz, bz...
p7zip is somehow like using rar then.... Open source rocks. Well, if you want to use tar, then you have to fight to combine it with split... a built-in split feature in tar would be long awaited |
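The "fight" with split is really just a pipe. A minimal sketch, streaming tar+gzip straight through split so no oversized temporary file is ever written; 650m mirrors the -v650000k rar volumes from the first post, and /mnt/hdd2 is a placeholder path.

```shell
# Archive, compress, and cut into CD-sized pieces in one pass:
tar -cpzf - /mnt/hdd2 | split -b 650m - hdd2.tar.gz.part-

# Reassemble and extract later:
cat hdd2.tar.gz.part-* | tar -xpzf -
```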
There is actually a technical point to tar, but it doesn't affect this situation. Tar reduces wasted space and puts all the files into a single file. This is the old way to defrag, and it is still one of the best ways to ensure all files are contiguous.
In any case, if you want to break up a large file you can use the aptly named tool called split. Then you can cat it back together when needed. |
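A minimal sketch of that split/cat roundtrip on an already-compressed image; hdd2.img.xz is a placeholder name, and 650m suits a 700 MB CD-R.

```shell
# Cut the file into CD-sized chunks:
split -b 650m hdd2.img.xz hdd2.img.xz.part-

# Later, join the pieces and check the result byte for byte:
cat hdd2.img.xz.part-* > hdd2.img.xz.joined
cmp hdd2.img.xz hdd2.img.xz.joined
```

split's default alphabetical suffixes (aa, ab, ...) sort correctly, so the shell glob feeds cat the pieces in the right order.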