I just had the unfortunate experience of having a single bad sector ruin an entire 32GB uncompressed RAR archive containing a whole lot of stuff, fortunately of secondary importance. I thought keeping the archive uncompressed would make extraction just skip the affected files in case of an error, but I was obviously wrong.
Is there any format that would do that, losing only the files touched by a small amount of bad data instead of aborting the whole procedure? What about uncompressed tar files?
I ask because I often have to back up many gigabytes of very small files, and a normal copy takes three forevers as it has to write each tiny file individually, while an archive operation producing a single huge file takes a lot less.
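For reference, the kind of thing I mean is roughly this (paths made up, archive kept uncompressed so damage cannot cascade through a compression stream):

    # pack many small files into one uncompressed tar archive;
    # far fewer writes than copying each tiny file individually
    tar -cf /mnt/backup/photos.tar /data/photos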
Some redundancy data is a good idea for important backups and does safeguard against various levels of corruption. You could use parity files with any archive format. The advantage of parity files is that you can set, say, 10% redundancy (allowing you to fix corruption of up to 10% anywhere in your file) without having to store a full extra copy. Check out parchive/par2cmdline (an implementation of the PAR v2.0 specification).
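A rough sketch with par2cmdline (the backup.tar name is just a placeholder for your archive):

    # create recovery data with 10% redundancy
    par2 create -r10 backup.tar.par2 backup.tar
    # later: verify the archive against the recovery data
    par2 verify backup.tar.par2
    # repair it if bad sectors have damaged it
    par2 repair backup.tar.par2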
...also note that RAR (let's skip OSS vs. non-OSS for now) can recover undamaged files, keep extracted files that fail the CRC check, and create recovery records/volumes as well. Just enable the switches you need.
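For example (check rar's built-in help for the exact switch syntax in your version):

    # store (no compression) with a 10% recovery record
    rar a -m0 -rr10% backup.rar /data/stuff
    # on extraction, keep files that fail the CRC check instead of deleting them
    rar x -kb backup.rar
    # attempt a repair using the recovery record
    rar r backup.rar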
RAR is a nice tool, but IIRC the format does not store all UNIX metadata (ownership, groups), nor does it support all file types (symlinks, device files), so using a *nix-native format (tar, cpio, dar, etc.) might be better if you care about these kinds of things.
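For instance, a plain GNU tar round trip keeps that metadata (assuming you restore as root, since only root may restore ownership):

    # create: tar records ownership, permissions, symlinks and device nodes
    tar -cf backup.tar /etc /home
    # restore as root; -p preserves permissions, ownership is restored by default
    tar -xpf backup.tar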
I am trying to back up selected areas, and I would like to use cpio, except it fails with 7 or 8 GB files.
Ideally, I would like to find the files I want backed up, sort them by size so that they pack nearly optimally onto the media, and then run cpio against one or more media, so that I can save to BD, DVD, or CD.
I am using Slackware 14.0, 32-bit. Any pointers or ideas?
afio will overcome the size limitations; it extends the cpio format when such limits are reached. For the other stuff, spend some time reading man find.
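Something along these lines might get you started (untested; the 4600m volume size is just an example for single-layer DVDs, it assumes no newlines in filenames, so check man afio and man find before relying on it):

    # list the selected files largest-first so big files are placed early
    find /data -type f -printf '%s\t%p\n' | sort -rn | cut -f2- > filelist
    # archive from the list, splitting into DVD-sized volumes
    afio -o -s 4600m backup.afio < filelist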