Slackware: This forum is for the discussion of Slackware Linux.
Shows that it does have a compression option! And it supports pretty much every type! This makes it superior or equal to backup-manager! Thanks, you gave me a new tool to work with. I'll update later on which one I find works better.
Last edited by Mercury305; 03-16-2014 at 10:02 AM.
That sounds like a bad idea (see: Fault tolerant backup archives). If you need to use an archive as a container at least use one that can do internal compression, like afio or dar.
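For illustration (archive paths hypothetical), per-file internal compression with each of those tools looks roughly like this:
Code:
# dar: -z compresses each file inside the archive individually (gzip)
dar -c /mnt/backup/home_full -R /home -z
# afio: -Z compresses each file individually while archiving
find /home -print | afio -oZ /mnt/backup/home_full.afio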
Those are some good and solid points. So the solution to your problem is to simply do a checksum after the file is created and compressed. I think both tools have that built in so that shouldn't be a problem.
EDIT: and of course use a good filesystem like EXT4 and a good hard drive to prevent corruption of your data from happening, plus store multiple copies at intervals. For example, make a copy at the end of each day; that way, if day 3 is corrupted then day 2 might still be good (a sketch of this is below).
But I agree that sticking to bzip2 might be a good idea as well.
Oh, and finally, to top that off: the best way to secure your files is "Redundancy". I might go grab another 3TB drive for this.
Last edited by Mercury305; 03-16-2014 at 10:43 AM.
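For illustration (file names hypothetical), recording a checksum right after the archive is written, then verifying it later, plus keeping dated daily copies:
Code:
# record a checksum as soon as the archive is created
sha256sum backup.tar.bz2 > backup.tar.bz2.sha256
# later, verify the archive before restoring from it
sha256sum -c backup.tar.bz2.sha256
# keep dated daily copies so an older good copy survives corruption
cp backup.tar.bz2 /mnt/spare/backup-$(date +%F).tar.bz2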
Quote:
Originally Posted by Mercury305
So the solution to your problem is to simply do a checksum after the file is created and compressed. I think both tools have that built in so that shouldn't be a problem.
A checksum will tell you if a file is corrupt but it won't let you fix it. So, no, that does not solve the problem at all. Did you read my linked post?
Fixing the problem involves having some redundancy, either in the form of multiple backups on different media or at the very least parity archives. That way, if you get some corruption due to media failure, you can fall back on a different backup or use the parity files to have a chance of correcting it (see the sketch after this post).
Additionally, I advise using internal compression on a file-by-file basis rather than external compression across the entire archive, so that in the worst case you can recover some of your files (hopefully most of them). With compression across the entire archive, a minor corruption near the start of the file will probably mean that entire backup is a write-off, due to the way compression works.
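A rough sketch of the parity-archive approach using par2 (file names hypothetical):
Code:
# create parity files with 10% redundancy alongside the archive
par2 create -r10 backup.tar.bz2
# after suspected media trouble, check the archive and try a repair
par2 verify backup.tar.bz2.par2
par2 repair backup.tar.bz2.par2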
These are obvious things, ruario. I only use my external on my MBP. It doesn't make sense to use an external for your desktop unless you are out of drives and too dumb to put one in. But I think you are the one not reading what I wrote (or maybe you didn't check my last edit to go more into detail on what I meant). A checksum is the cure to the problem (as in knowing if your data can get backed up without problems). Also, your data corruption due to compression argument is a small risk and can also happen to a non-compressed tar (although one that is much more easily repaired). The better and more costly solution is, like you said, redundancy, but I also added that in my last edit. Maybe you didn't get a chance to read my edited part? If you actually read what I wrote in there, I'm pretty much on your side of what you wrote. I even agreed with what you wrote about bzip2.
Also, using an SSD is pretty much the best option for this type of stuff, both in terms of speed and avoiding loss from fragmentation or corruption.
But I appreciate your help.
EDIT: before you correct me, I realize what you meant by "internal vs. external compression". But honestly this doesn't make a difference because of the "CHECKSUM".
BTW this is my other account, which I recently opened because I had forgotten the password to the other one and lost access to that email. So JD = Mercury305
Quote:
Originally Posted by J.D.
or maybe you didn't check my last edit to go more into detail on what I meant
Hmm ... perhaps I did not reload the thread between your initial reply and mine.
Quote:
Originally Posted by J.D.
A checksum is the cure to the problem (as in knowing if your data can get backed up without problems)
That is not a problem that I raised.
Quote:
Originally Posted by J.D.
Also, your data corruption due to compression argument is a small risk
You need only a single byte corruption near the start of a compressed archive to ruin everything if you have no extra backups or other forms of redundancy.
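To illustrate (archive name hypothetical), a single overwritten byte near the front of a gzip-compressed tarball typically makes everything after it unreadable:
Code:
# overwrite one byte near the start of the archive (simulated media fault)
dd if=/dev/zero of=backup.tar.gz bs=1 seek=100 count=1 conv=notrunc
# listing now fails with "invalid compressed data"; the rest is lost
tar tzf backup.tar.gz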
Quote:
Originally Posted by J.D.
BTW this is my other account, which I recently opened because I had forgotten the password to the other one and lost access to that email. So JD = Mercury305