Changes 01-19-2013 08:37 AM

Error-tolerant non-compressing archives
 
I just had the unfortunate experience of having a single bad sector ruin an entire 32GB uncompressed rar containing a whole lot of stuff, fortunately of secondary importance. I thought keeping the archive uncompressed would let extraction simply skip the affected files if an error turned up, but I was obviously wrong.

Is there an equivalent format that would skip over a small amount of bad data instead of aborting the whole extraction? What about uncompressed tar files?

I ask because I often have to back up many gigabytes of very small files, and a normal copy takes three forevers since it has to write each tiny file individually, while a single big archive operation takes a lot less time.

ruario 01-19-2013 09:37 AM

Try afio. It defaults to the cpio odc format (but uses extensions to handle large files). It also compresses per file, if you do want compression, which means a damaged file doesn't take the rest of the archive with it.
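A minimal sketch (the paths here are just placeholders):

    # create: afio reads the file list from stdin; -Z gzips each
    # file individually, so damage stays local to the file it hits
    find /data -print | afio -oZ /backup/archive.afio

    # list the contents, then extract (keep -Z if it was used at create time)
    afio -tZ /backup/archive.afio
    afio -iZ /backup/archive.afio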

ruario 01-19-2013 09:58 AM

Some redundancy data is a good idea for important backups and safeguards against various levels of corruption. You could use parity files with any archive format. The advantage of parity files is that you can set, say, 10% redundancy (allowing you to fix corruption of up to 10% anywhere in your file) without having to store a full extra copy. Check out parchive/par2cmdline (an implementation of the PAR v2.0 specification).
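A quick sketch with par2cmdline (the file name and the 10% figure are just examples):

    # create parity files with 10% redundancy alongside the archive
    par2 create -r10 backup.tar

    # later: check the archive, and repair it if it is damaged
    par2 verify backup.tar.par2
    par2 repair backup.tar.par2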

unSpawn 01-19-2013 10:08 AM

...also note RAR (let's skip OSS vs non-OSS for now) can recover undamaged files, keep extracted files that fail the CRC check, and create recovery records / volumes as well. Just enable the switches you need.
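For example (archive name and paths are placeholders):

    # add a 10% recovery record when creating the archive
    rar a -rr10% backup.rar /data

    # extraction: -kb keeps extracted files that fail the CRC check
    unrar x -kb backup.rar

    # attempt a repair using the recovery record
    rar r backup.rar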

ruario 01-19-2013 10:46 AM

RAR is a nice tool but IIRC the format does not store all UNIX metadata (ownership, groups), nor support all file types (links, device files), so using a *nix-native format (tar, cpio, dar, etc.) might be better if you care about these kinds of things.
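For comparison, GNU tar keeps this metadata (paths are placeholders; extract as root if you want ownership and device nodes restored):

    # create, preserving permissions; ownership and symlinks are stored too
    tar -cpf backup.tar /data

    # extract with permissions restored
    tar -xpf backup.tar -C /restore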

H_TeXMeX_H 01-19-2013 11:32 AM

What I use for backups:
http://dvdisaster.net/en/index.html
It's mostly for DVDs.
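From memory, its command line can also protect plain ISO images (check the manual, but it is something like this):

    # create a separate error-correction (.ecc) file for an image
    dvdisaster -i backup.iso -e backup.ecc -c

    # repair a damaged copy of the image using the ecc file
    dvdisaster -i backup.iso -e backup.ecc -f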

ruario 01-19-2013 12:06 PM

This is also interesting: http://www.mondorescue.org/ (it uses afio as a backend).

linuxbird 02-15-2013 09:53 PM

This is an interesting discussion.

I am trying to back up selected areas, and I would like to use cpio, except it fails with 7 or 8 GB files.

Ideally, I would like to find the files I want backed up and sort them by size so that they pack nearly optimally on the media. Then I would like cpio run against one or more media, so that I can save to BD, DVD, or CD.

I am using Slackware 14.0, 32 bit. Any pointers or ideas?

ruario 02-16-2013 09:06 AM

afio will overcome the size limitations; it extends the cpio format when such limits are reached. For the other stuff, spend some time reading man find.
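A rough sketch of the size-sorting part (paths and the volume size are placeholders, and it assumes GNU find plus no newlines in file names; afio's -s switch splits the archive across volumes of a given size and pauses so you can change media):

    # list files with sizes, biggest first, strip the size column,
    # and feed the result to afio as the archiving order
    find /data -type f -printf '%s %p\n' | sort -rn | cut -d' ' -f2- |
        afio -o -s 4300m /backup/volume.afio

Note this only orders the input; true bin-packing across discs would need extra scripting.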

