LinuxQuestions.org
Old 09-08-2012, 03:39 PM   #1
Xeratul
Senior Member
 
Registered: Jun 2006
Location: Debian Land
Posts: 1,666

Rep: Reputation: 127
Alternative to rar to pack a hard disk image?


Hi,

In the past, I was using rar to split my hard disk images and burn them to CD-ROMs. I would like a method that is more Linux / open source.

Which tool would you use to replace this?

Code:
rar a -r -v650000k harddisk_hdd2_tar.gz.rar  harddisk_hdd2_tar.gz
thanks
 
Old 09-08-2012, 06:29 PM   #2
John VV
LQ Muse
 
Registered: Aug 2005
Location: A2 area Mi.
Posts: 16,817

Rep: Reputation: 2408
how about the current fad, "xz"?

though i do not think it will stay a fad
it does CRUNCH those bits
a 334 MB ppm photo ends up a 41 MB zip or a 33 MB xz at default level 6 (much like png does for images)
and for a disk, a "tar.xz"
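A tar.xz split into CD-sized volumes can be done in one pipeline. A minimal sketch, using a throwaway demo directory and tiny 64 KB volumes so it runs quickly (use -b 650m for real CD-Rs; all paths are placeholders):

```shell
set -e
cd "$(mktemp -d)"

# stand-in for the real data to back up
mkdir data
head -c 200000 /dev/urandom > data/disk.img

# pack: tar (preserving permissions) -> xz -> 64 KB volumes
tar -cpf - data | xz -6c | split -b 64k - backup.tar.xz.

# unpack: concatenate the volumes in order, decompress, extract
mkdir restore
cat backup.tar.xz.* | xz -dc | tar -xpf - -C restore

cmp data/disk.img restore/data/disk.img && echo OK
```

The split volumes are named backup.tar.xz.aa, backup.tar.xz.ab, and so on, so a plain shell glob restores them in the right order.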
 
Old 09-09-2012, 01:08 PM   #3
jefro
Moderator
 
Registered: Mar 2008
Posts: 15,372

Rep: Reputation: 2198
The old way or maybe even common way is to use bz or gz

http://www.cyberciti.biz/howto/quest...heat-sheet.php

http://www.ghacks.net/2010/01/22/get...e-compression/

Kind of also depends on the types of files and general sizes. Some compression types work better on some than others. The command line 7-zip is a pretty good choice for best compression.

Last edited by jefro; 09-09-2012 at 01:11 PM.
 
Old 09-09-2012, 02:00 PM   #4
Mr. Alex
Senior Member
 
Registered: May 2010
Distribution: No more Linux. Done with it.
Posts: 1,238

Rep: Reputation: Disabled
tar with the preserve-permissions option, and then you can use bzip2 to compress. That is much more the UNIX way than rar. Also, compressing with any of these algorithms will waste your time if you are planning on compressing files that are already compressed (media files, existing archives). rar/zip/gzip/bzip2/lzma/lzma2 do best on redundant data such as text; they will not shrink already-compressed files much, and the attempt is very time-consuming. In that case you can just use good old tar.
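The tar-then-bzip2 route looks roughly like this; the directory and file names are just placeholders:

```shell
set -e
cd "$(mktemp -d)"
mkdir myfiles
echo "hello" > myfiles/note.txt

# -c create, -p preserve permissions, -j filter through bzip2
tar -cpjf backup.tar.bz2 myfiles

# restore into a separate directory
mkdir restore
tar -xpjf backup.tar.bz2 -C restore

cmp myfiles/note.txt restore/myfiles/note.txt && echo OK
```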
 
Old 09-23-2012, 05:55 AM   #5
Xeratul
Senior Member
 
Registered: Jun 2006
Location: Debian Land
Posts: 1,666

Original Poster
Rep: Reputation: 127
Quote:
Originally Posted by jefro View Post
The old way or maybe even common way is to use bz or gz

http://www.cyberciti.biz/howto/quest...heat-sheet.php

http://www.ghacks.net/2010/01/22/get...e-compression/

Kind of also depends on the types of files and general sizes. Some compression types work better on some than others. The command line 7-zip is a pretty good choice for best compression.
Well, but bz2 or gz does not allow splitting into several parts like rar does.
 
Old 09-23-2012, 06:06 AM   #6
konsolebox
Senior Member
 
Registered: Oct 2005
Distribution: Gentoo, Slackware, LFS
Posts: 2,248
Blog Entries: 8

Rep: Reputation: 235
Hmm.. 7za might do well too, though I haven't tried it yet, and it also supports file splitting. I'd still choose tar to be safe, with gz, or if I'm patient I'd choose xz (tar -cpv --xz -f ...).

Sometimes cpio would be a choice as well, but it depends (cpio ... | xz -c ... > ...).

Last edited by konsolebox; 09-23-2012 at 06:07 AM.
 
Old 09-23-2012, 06:09 AM   #7
414N
Member
 
Registered: Sep 2011
Location: Italy
Distribution: Slackware
Posts: 635

Rep: Reputation: 186
If the files to be compressed are hard disk images (i.e. single files), I see no point in "encapsulating" the file with tar first and then compressing it with gzip, bzip2, lzma (xz) etc. Am I wrong?
Personally, I'd go with p7zip as already suggested.
 
Old 09-23-2012, 06:24 AM   #8
konsolebox
Senior Member
 
Registered: Oct 2005
Distribution: Gentoo, Slackware, LFS
Posts: 2,248
Blog Entries: 8

Rep: Reputation: 235
Quote:
Originally Posted by 414N View Post
If the files to be compressed are hard disk images (i.e. single files), I see no point in "encapsulating" the file with tar first and then compressing it with gzip, bzip2, lzma (xz) etc. Am I wrong?
Personally, I'd go with p7zip as already suggested.
What actually matters is how the output file is handled. Personally, I'd prefer something stream-like: only a limited buffer is used to process the file, and the archiver doesn't hold back a header that has to be placed at the beginning of the output file.

To make things clearer: I don't want the archiver to seek back to its header after writing all the data just to finalize the file. I'd prefer a stream-like format.

This isn't much of a problem with small archives, but I wonder about extremely large ones: whether it could cause hang-ups or extreme system slowdown. I don't want to risk my HD burning up in its swap partition.

Last edited by konsolebox; 09-23-2012 at 06:28 AM.
 
Old 09-23-2012, 06:58 AM   #9
Xeratul
Senior Member
 
Registered: Jun 2006
Location: Debian Land
Posts: 1,666

Original Poster
Rep: Reputation: 127
I am not so sure that p7zip is as much of a solution as tar, gz, bz...

p7zip is somewhat like using rar, then...

Open source rocks. Well, if you want to use tar, then you have to fight to use split with it... a built-in split option in tar would be long awaited
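For what it's worth, tar chains to split without much of a fight, since tar can write its archive to stdout. A sketch with demo data and tiny 64 KB parts (use 650m-sized parts for CD-Rs):

```shell
set -e
cd "$(mktemp -d)"
mkdir stuff
head -c 150000 /dev/urandom > stuff/blob.bin

# tar straight into split: no intermediate full-size archive on disk
tar -czpf - stuff | split -b 64k - stuff.tgz.part.

# restore: cat the parts back into one stream and untar it
mkdir restore
cat stuff.tgz.part.* | tar -xzpf - -C restore

cmp stuff/blob.bin restore/stuff/blob.bin && echo OK
```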
 
Old 09-23-2012, 11:55 AM   #10
jefro
Moderator
 
Registered: Mar 2008
Posts: 15,372

Rep: Reputation: 2198
There is a technical point to tar, actually, though it doesn't affect this situation. Tar does not compress by itself, but it does put all the files into a single archive file. This is the old way to defrag: it is still one of the best ways to ensure all the data is contiguous.

In any case, if you want to cut up a large file you can use the aptly named tool called split. Then you can cat the pieces back together when needed.
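The split/cat round trip looks like this (sizes shrunk for demonstration; use -b 650m for CD-R-sized pieces):

```shell
set -e
cd "$(mktemp -d)"
head -c 200000 /dev/urandom > big.img

# cut into 64 KB pieces: big.img.aa, big.img.ab, ...
split -b 64k big.img big.img.

# later: glue them back together (the glob sorts lexically)
cat big.img.* > restored.img

cmp big.img restored.img && echo OK
```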
 
  

