LinuxQuestions.org


Nuvious 05-21-2006 06:38 PM

Maximum Rar Efficiency
 
I've noticed that several people split their rar archives so that instead of having one big file, they have several 20-40 MB rar chunks. Is this more efficient? If it is, how do I know what chunk size is most efficient in terms of file space? Thanks in advance for any reply!

Nuvious

jschiwal 05-21-2006 07:28 PM

An archive will be more space-efficient for lots of small files, because a 2 KB file, for example, won't occupy an entire disk block by itself once it's inside the archive. Splitting the archive into several rar volumes doesn't add to that efficiency, though. The real benefit of volumes is that if you use par or par2 you can create recovery blocks in case one of the pieces gets corrupted.
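Something along these lines would make 40 MB rar volumes plus 10% par2 recovery data (the archive and directory names here are just placeholders, and the glob assumes rar's default .partN.rar volume naming):

rar a -m5 -v40m backup.rar bigdirectory/
par2 create -r10 backup.par2 backup.part*.rar

If a volume later gets damaged, par2 repair backup.par2 can rebuild it from the recovery blocks.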
I just made a fresh install of SuSE 10.1 on my laptop. One of the files I backed up was the SUSE DVD iso that I produced from the CDs and burned to a DVD. However, it was too large for the USB drive because of its FAT32 filesystem, so I used the split program to break it up and save the pieces to the USB drive. I then used par2 to add parity files.
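Roughly like this, assuming GNU split and par2cmdline; the iso filename and piece size are only examples (FAT32 caps individual files at 4 GB, so 2000 MB pieces stay well under the limit):

split -b 2000m SUSE-10.1-DVD.iso SUSE-10.1-DVD.iso.
par2 create -r5 SUSE-10.1-DVD.iso.par2 SUSE-10.1-DVD.iso.??

Reassembling is just cat SUSE-10.1-DVD.iso.?? > SUSE-10.1-DVD.iso, and par2 verify/repair can check the pieces first.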

I also archived a large directory that was over 2 GB in total size, so I used split to back that up to the USB drive as well.
I can restore individual files and directories without reassembling the tar file first, like this:
cat /media/netdisk/documents.tar.gz.0?? | tar xzvf - <filename(s) or dirname(s) to restore>
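For reference, a split gzipped tar like that can be produced in one pass; something like this (the directory name and 1000 MB piece size are just examples, and split -d -a 3 gives the numeric .000, .001, ... suffixes that the 0?? glob above expects):

tar czvf - documents/ | split -d -a 3 -b 1000m - /media/netdisk/documents.tar.gz.

To restore everything rather than single files, cat the pieces back together the same way and pipe them through tar, or write them into one file first with cat /media/netdisk/documents.tar.gz.* > documents.tar.gz.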

