
LinuxQuestions.org (/questions/)
-   Slackware (http://www.linuxquestions.org/questions/slackware-14/)
-   -   Ideal Backup Slack 14.1 (http://www.linuxquestions.org/questions/slackware-14/ideal-backup-slack-14-1-a-4175498335/)

Mercury305 03-15-2014 08:47 PM

Ideal Backup Slack 14.1
 
What's the ideal backup tool you guys use, for convenience's sake, to back up your data?

I use backup-manager and am wondering if there are better options.

rouvas 03-16-2014 03:36 AM

I use this:

Code:

#!/bin/bash

#
# CONFIG
#
HIST_BASE=/mnt/hd/backup   # where the snapshots live
HIST_PREFIX=history        # snapshot directory prefix; history.0 is the newest
BACK_IN_TIME=10            # number of older snapshots to keep
#
# END OF CONFIG
#

# Dump the databases first so they end up inside the snapshot.
./mysql_export.sh

pushd "$HOME/bin" &> /dev/null

  mkdir -p "$HIST_BASE"

  # Rotate the snapshots: delete the oldest, shift the rest up by one.
  for (( i=$BACK_IN_TIME; i>=0; i-- )); do
    if [ -d "$HIST_BASE/$HIST_PREFIX.$i" ]; then
      if [[ $i -eq $BACK_IN_TIME ]]; then
        rm -rf "$HIST_BASE/$HIST_PREFIX.$i"
      else
        mv "$HIST_BASE/$HIST_PREFIX.$i" "$HIST_BASE/$HIST_PREFIX.$((i+1))"
      fi
    fi
  done

  # New snapshot: unchanged files are hard-linked against history.1 via
  # --link-dest, so only changed files take up additional space.
  mkdir -p "$HIST_BASE/$HIST_PREFIX.0"
  /usr/bin/rsync -avx --delete --force --ignore-errors \
                 --delete-excluded --exclude-from="$HOME/usb-excluded" \
                 --stats --link-dest="$HIST_BASE/$HIST_PREFIX.1" \
                 / "$HIST_BASE/$HIST_PREFIX.0"

popd &> /dev/null

Simple and efficient.

I didn't come up with the technique myself, but I can't find the link to the original right now.
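
The key piece is rsync's --link-dest option: files that have not changed since the previous snapshot are hard-linked to it rather than copied again, so every history.N directory looks like a complete backup while only changed files consume new space. A couple of quick sanity checks on the snapshot chain (paths as in the script above; the file used for the inode comparison is just an example):

Code:

# GNU du counts each hard-linked file only once per invocation, so the
# snapshots after the first one listed show roughly the space taken by
# their changed files only.
du -sh /mnt/hd/backup/history.*

# Identical inode numbers mean the two entries are hard links to the same
# file, i.e. it did not change between the snapshots.
ls -li /mnt/hd/backup/history.0/etc/fstab /mnt/hd/backup/history.1/etc/fstab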

Diantre 03-16-2014 04:20 AM

Hmm... I use a very similar technique with rsync. Found it here:

http://www.mikerubel.org/computers/rsync_snapshots/
http://webgnuru.com/linux/rsync_incremental.php

jheengut 03-16-2014 04:40 AM

What is the use of pushd and popd?
 
Quote:

Originally Posted by rouvas (Post 5135402)

Code:

#!/bin/bash

pushd "$HOME/bin" &> /dev/null

popd &> /dev/null



What is the use of pushd and popd here?

moisespedro 03-16-2014 06:31 AM

I use this https://wiki.archlinux.org/index.php...kup_with_rsync

Mercury305 03-16-2014 10:48 AM

Quote:

Originally Posted by moisespedro (Post 5135445)

Thanks Pedro, that's a nice one. I like the switches. Very convenient and flexible.

Mine spits out a bzip2 tarball. Does rsync do compression as well? If so, what type? Does it do any of the really good ones like lzma2/xz?

Mercury305 03-16-2014 10:50 AM

Quote:

Originally Posted by Diantre (Post 5135417)
Hmm... I use a very similar technique with rsync. Found it here:

http://www.mikerubel.org/computers/rsync_snapshots/
http://webgnuru.com/linux/rsync_incremental.php

Thanks, great links! It seems rsync has everything I want except automatic compression (if I'm not mistaken).

EDIT

http://www.evbackup.com/support-comm...ync-arguments/

It shows that rsync does have a compression option, and it features pretty much every type! This makes it superior or equal to backup-manager. Thanks, you gave me a new tool to work with. I'll update later on which one I find works better.
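
One caveat worth noting: rsync's -z/--compress option compresses data only while it is in transit; the files written on the destination are stored uncompressed. A minimal sketch of the flag in use (host and paths are examples):

Code:

# -z compresses the data stream during transfer, which helps over slow
# links; the destination ends up as a normal, uncompressed mirror.
rsync -avz --delete /home/user/ backuphost:/srv/backups/user/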

Richard Cranium 03-16-2014 11:01 AM

Quote:

Originally Posted by jheengut (Post 5135421)
What is the use of pushd and popd here?

I assume the OP has "." at the start of his/her PATH and wants to ensure that any scripts in ~/bin are found first.
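
For anyone unfamiliar with the two builtins, a minimal illustration using the same directory and script name as the backup script above:

Code:

# pushd switches to a directory and saves the one you came from on a stack;
# popd returns to it. Output is silenced just as in the backup script.
pushd "$HOME/bin" &> /dev/null   # now in ~/bin, previous directory remembered
./mysql_export.sh                # relative paths are now resolved from ~/bin
popd &> /dev/null                # back to wherever we started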

Mercury305 03-16-2014 11:16 AM

WOW, very nice, full of features. I like it.

http://linux.die.net/man/1/rsync

moisespedro 03-16-2014 11:26 AM

Quote:

Originally Posted by Mercury305 (Post 5135562)
Thanks Pedro, that's a nice one. I like the switches. Very convenient and flexible.

Mine spits out a bzip2 tarball. Does rsync do compression as well? If so, what type? Does it do any of the really good ones like lzma2/xz?

I don't think rsync can compress files, but I am not really sure.

ruario 03-16-2014 11:26 AM

Quote:

Originally Posted by Mercury305 (Post 5135562)
Mine spits out a bzip2 tarball

That sounds like a bad idea (see: Fault tolerant backup archives). If you need to use an archive as a container at least use one that can do internal compression, like afio or dar.

Mercury305 03-16-2014 11:33 AM

Quote:

Originally Posted by ruario (Post 5135581)
That sounds like a bad idea (see: Fault tolerant backup archives). If you need to use an archive as a container at least use one that can do internal compression, like afio or dar.

Those are some good, solid points. So the solution to your problem is simply to do a checksum after the file is created and compressed. I think both tools have that built in, so it shouldn't be a problem.

EDIT: and of course use a good filesystem like ext4 and a good hard drive to keep your data from getting corrupted in the first place, plus store multiple copies at intervals. For example, make a copy at the end of each day; then if day 3 is corrupted, day 2 might still be good.

But I agree that sticking to bzip2 might be a good idea as well.

Oh, and finally, to top that off: the best way to secure your files is redundancy. I might go grab another 3TB drive for this.
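
For the checksum step mentioned above, a minimal sketch (the archive name is just an example). As ruario notes below, this detects corruption but cannot repair it:

Code:

# Record a checksum right after the archive is written...
sha256sum backup-2014-03-16.tar.bz2 > backup-2014-03-16.tar.bz2.sha256

# ...and verify it later, e.g. before relying on the backup for a restore.
sha256sum -c backup-2014-03-16.tar.bz2.sha256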

ruario 03-16-2014 01:56 PM

Quote:

Originally Posted by Mercury305 (Post 5135583)
So the solution to your problem is simply to do a checksum after the file is created and compressed. I think both tools have that built in, so it shouldn't be a problem.

A checksum will tell you if a file is corrupt but it won't let you fix it. So, no that does not solve the problem at all. Did you read my linked post?

Fixing the problem involves having some redundancy, either in the form of multiple backups on different media or at the very least parity archives. That way, if you get some corruption due to media failure, you may restore from a different backup or use the parity files to have a chance of correcting it.

Additionally, I advise using internal compression on a file-by-file basis rather than external compression across the entire archive, so that in the worst case you can recover some of your files (hopefully most of them). With compression across the entire archive, a minor corruption near the start of the file will probably mean that entire backup is a write-off, due to the way compression works.
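
To make the distinction concrete, a rough sketch of the two approaches (paths and names are examples; afio and dar do per-file compression natively and far more efficiently, and the parity step assumes the par2 tool is installed):

Code:

SRC=/home/user/data
DEST=/mnt/backup

# External compression: the whole tar stream runs through one compressor.
# A single corrupted byte early in the .tar.xz can make everything after
# it unrecoverable.
tar -cf - "$SRC" | xz -6 > "$DEST/backup-full.tar.xz"

# Per-file compression: every file is compressed on its own before going
# into a plain tar container, so damage to one member leaves the rest
# readable. The staging copy is only there to illustrate the idea.
STAGE=$(mktemp -d)
cp -a "$SRC" "$STAGE/data"
find "$STAGE/data" -type f -exec xz -6 {} \;
tar -cf "$DEST/backup-perfile.tar" -C "$STAGE" data
rm -rf "$STAGE"

# Parity files add real redundancy: they can repair, not just detect,
# limited corruption in the archive afterwards.
par2 create -r10 "$DEST/backup-perfile.tar"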

J.D. 03-16-2014 02:09 PM

Quote:

Originally Posted by ruario (Post 5135658)
A checksum will tell you if a file is corrupt but it won't let you fix it. So, no that does not solve the problem at all. Did you read my linked post?

Fixing the problem involves having some redundancy either in the form of multiple backups on different media or at the very least parity archives. That way if you get some corruption due to media failure you may a different backup or use the parity files to have a chance to correct it.

Additionally I advise using internal compression on a file by file basis rather than external compression across the entire archive so that in the worst case you can recover some of your files (hopefully most of them). With ompression across the entire archive a minor corruption near the start of the file will probably mean that entire backup is a write off, due to the way compression works.

These are obvious things, ruario. I only use my external on my MBP; it doesn't make sense to use an external drive for your desktop unless you are out of drives and too dumb to put one in. But I think you are the one not reading what I wrote (or maybe you didn't check my last edit, which goes into more detail on what I meant). A checksum is the cure to the problem (as in knowing whether your data can get backed up without problems). Also, your argument about data corruption due to compression is a small risk and can also happen to a non-compressed tar (although that is much more easily repaired). The better and more costly solution is, as you said, redundancy, but I also added that in my last edit. Maybe you didn't get a chance to read the edited part? If you actually read what I wrote there, I'm pretty much on your side. I even agreed with what you wrote about bzip2.

Also, using an SSD is pretty much the best option for this type of stuff, both for speed and for avoiding loss from fragmentation or corruption.

But I appreciate your help.

EDIT: before you correct me, I realize what you meant by external vs. internal compression. But honestly, this doesn't make a difference because of the checksum.

BTW, this is my other account; I recently opened it because I had forgotten the password to the other one and lost access to that email. So JD = Mercury305.

ruario 03-16-2014 02:28 PM

Quote:

Originally Posted by J.D. (Post 5135669)
or maybe you didn't check my last edit, which goes into more detail on what I meant

Hmm ... perhaps I did not reload the thread between your initial reply and mine.

Quote:

Originally Posted by J.D. (Post 5135669)
A checksum is the cure to the problem (as in knowing whether your data can get backed up without problems)

That is not a problem that I raised.

Quote:

Originally Posted by J.D. (Post 5135669)
Also, your argument about data corruption due to compression is a small risk

You need only a single byte corruption near the start of a compressed archive to ruin everything if you have no extra backups or other forms of redundancy.


Quote:

Originally Posted by J.D. (Post 5135669)
BTW, this is my other account; I recently opened it because I had forgotten the password to the other one and lost access to that email. So JD = Mercury305.

OK, hope you manage to get it back.

