Old 03-15-2014, 07:47 PM   #1
Mercury305
Member
 
Registered: Jul 2012
Location: Rockville, MD
Distribution: CrunchBang / Ubuntu
Posts: 540

Rep: Reputation: Disabled
Ideal Backup Slack 14.1


What's the ideal backup tool you guys use, for convenience's sake, to back up your data?

I use backup-manager; wondering if there are better options?
 
Old 03-16-2014, 02:36 AM   #2
rouvas
Member
 
Registered: Aug 2006
Location: Greece
Distribution: Slackware.12.2
Posts: 104
Blog Entries: 3

Rep: Reputation: 21
I use this:

Code:
#!/bin/bash

#
# CONFIG
#

HIST_BASE=/mnt/hd/backup    # where the snapshots are kept
HIST_PREFIX=history         # snapshot directory prefix
BACK_IN_TIME=10             # how many old snapshots to keep

#
# END OF CONFIG
#

pushd "$HOME/bin" &> /dev/null

  # dump the databases first (mysql_export.sh lives in ~/bin,
  # so call it after changing there)
  ./mysql_export.sh

  mkdir -p "$HIST_BASE"

  # rotate: drop the oldest snapshot, shift the rest back by one
  for (( i=BACK_IN_TIME; i>=0; i-- )); do
    if [ -d "$HIST_BASE/$HIST_PREFIX.$i" ]; then
      if [ "$i" -eq "$BACK_IN_TIME" ]; then
        rm -rf "$HIST_BASE/$HIST_PREFIX.$i"
      else
        mv "$HIST_BASE/$HIST_PREFIX.$i" "$HIST_BASE/$HIST_PREFIX.$((i+1))"
      fi
    fi
  done

  # take a fresh snapshot; --link-dest hard-links unchanged files
  # against the previous snapshot, so each one only costs the space
  # of what actually changed
  mkdir -p "$HIST_BASE/$HIST_PREFIX.0"
  /usr/bin/rsync -avx --delete --force --ignore-errors \
                 --delete-excluded --exclude-from="$HOME/usb-excluded" \
                 --stats --link-dest="$HIST_BASE/$HIST_PREFIX.1" \
                 / "$HIST_BASE/$HIST_PREFIX.0"

popd &> /dev/null
Simple and efficient.

The technique isn't mine, but I can't find the link to the original right now.
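
A script like this is usually run unattended; a typical way to drive it (the path and time here are just examples) is a root crontab entry:

Code:
# hypothetical crontab entry: take a snapshot at 03:00 every night
0 3 * * * /root/bin/backup.sh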
 
Old 03-16-2014, 03:20 AM   #3
Diantre
Member
 
Registered: Jun 2011
Distribution: Slackware
Posts: 515

Rep: Reputation: 234
Hmm... I use a very similar technique with rsync. Found it here:

http://www.mikerubel.org/computers/rsync_snapshots/
http://webgnuru.com/linux/rsync_incremental.php
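
The core of the trick from those links is only a few commands. A minimal sketch of the rotation (paths are made up; the real articles add error handling):

Code:
# rotate three hard-linked snapshots of /home into /backup (example paths)
rm -rf /backup/snapshot.3                      # drop the oldest
mv /backup/snapshot.2 /backup/snapshot.3       # shift the rest back
mv /backup/snapshot.1 /backup/snapshot.2
cp -al /backup/snapshot.0 /backup/snapshot.1   # hard-link copy, nearly free
rsync -a --delete /home/ /backup/snapshot.0/   # update the newest in place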
 
1 member found this post helpful.
Old 03-16-2014, 03:40 AM   #4
jheengut
Member
 
Registered: Sep 2006
Location: Providence, Moka Mauritius
Distribution: Slackware, Lubuntu
Posts: 352
Blog Entries: 16

Rep: Reputation: 51
What is the use of pushd and popd?

Quote:
Originally Posted by rouvas View Post

Code:
#!/bin/bash

pushd "$HOME/bin" &> /dev/null

popd &> /dev/null

What is the use of pushd and popd here?
 
Old 03-16-2014, 05:31 AM   #5
moisespedro
Senior Member
 
Registered: Nov 2013
Location: Brazil
Distribution: Slackware
Posts: 1,223

Rep: Reputation: 195
I use this https://wiki.archlinux.org/index.php...kup_with_rsync
 
1 member found this post helpful.
Old 03-16-2014, 09:48 AM   #6
Mercury305
Member
 
Registered: Jul 2012
Location: Rockville, MD
Distribution: CrunchBang / Ubuntu
Posts: 540

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by moisespedro View Post
I use this https://wiki.archlinux.org/index.php...kup_with_rsync
Thanks Pedro, that's a nice one. I like the switches. Very convenient and flexible.

Mine spits out a bzip2 tarball. Does rsync do compression as well? If so, what type? Does it do any of the really good ones like lzma2/xz?
 
Old 03-16-2014, 09:50 AM   #7
Mercury305
Member
 
Registered: Jul 2012
Location: Rockville, MD
Distribution: CrunchBang / Ubuntu
Posts: 540

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by Diantre View Post
Hmm... I use a very similar technique with rsync. Found it here:

http://www.mikerubel.org/computers/rsync_snapshots/
http://webgnuru.com/linux/rsync_incremental.php
Thanks! Great links. It seems rsync has everything I want, except for automatic compression (if I'm not mistaken).

EDIT

http://www.evbackup.com/support-comm...ync-arguments/

Shows that it does have a compression option! This makes it superior or equal to backup-manager! Thanks, you gave me a new tool to work with. I'll update later on which one I find works better.
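
One caveat, though: as far as I can tell, rsync's -z/--compress only compresses the data stream in transit (zlib), so it saves bandwidth rather than disk space; the files land uncompressed on the receiving end. For example (the host and paths are made up):

Code:
# -z compresses during transfer only; the stored copy matches the source
rsync -avz --delete /home/ backupbox:/backups/home/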

Last edited by Mercury305; 03-16-2014 at 10:02 AM.
 
Old 03-16-2014, 10:01 AM   #8
Richard Cranium
Senior Member
 
Registered: Apr 2009
Location: McKinney, Texas
Distribution: Slackware64 15.0
Posts: 3,858

Rep: Reputation: 2225
Quote:
Originally Posted by jheengut View Post
what is the use of pushd and popd here??
I assume the OP has "." at the start of his/her PATH and wants to ensure that any scripts in ~/bin are found first.
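
For reference, pushd/popd just maintain a directory stack: pushd saves the current directory and changes to the target, popd returns to the saved one. A quick demo:

Code:
cd /etc
pushd /tmp   # now in /tmp; /etc is saved on the stack
pwd          # /tmp
popd         # back to /etc
pwd          # /etc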
 
Old 03-16-2014, 10:16 AM   #9
Mercury305
Member
 
Registered: Jul 2012
Location: Rockville, MD
Distribution: CrunchBang / Ubuntu
Posts: 540

Original Poster
Rep: Reputation: Disabled
Wow, very nice and full of features. I like it.

http://linux.die.net/man/1/rsync
 
Old 03-16-2014, 10:26 AM   #10
moisespedro
Senior Member
 
Registered: Nov 2013
Location: Brazil
Distribution: Slackware
Posts: 1,223

Rep: Reputation: 195
Quote:
Originally Posted by Mercury305 View Post
Thanks Pedro, that's a nice one. I like the switches. Very convenient and flexible.

Mine spits out a bzip2 tarball. Does rsync do compression as well? If so, what type? Does it do any of the really good ones like lzma2/xz?
I don't think rsync can compress files but I am not really sure.
 
Old 03-16-2014, 10:26 AM   #11
ruario
Senior Member
 
Registered: Jan 2011
Location: Oslo, Norway
Distribution: Slackware
Posts: 2,557

Rep: Reputation: 1761
Quote:
Originally Posted by Mercury305 View Post
Mine spits out a bzip2 tarball
That sounds like a bad idea (see: Fault tolerant backup archives). If you need to use an archive as a container at least use one that can do internal compression, like afio or dar.
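
With afio, for instance, each file is compressed individually before it goes into the archive, so a corrupt block only takes out the files it lands in. A sketch (the paths are examples):

Code:
# -o writes an archive, -Z gzips each file individually on the way in
find /home -print | afio -o -Z /mnt/backup/home.afio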

Last edited by ruario; 03-16-2014 at 10:29 AM.
 
Old 03-16-2014, 10:33 AM   #12
Mercury305
Member
 
Registered: Jul 2012
Location: Rockville, MD
Distribution: CrunchBang / Ubuntu
Posts: 540

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by ruario View Post
That sounds like a bad idea (see: Fault tolerant backup archives). If you need to use an archive as a container at least use one that can do internal compression, like afio or dar.
Those are some good and solid points. So the solution to your problem is simply to do a checksum after the file is created and compressed. I think both tools have that built in, so that shouldn't be a problem.

EDIT: And of course, use a good filesystem like ext4 and a good hard drive to keep your data from getting corrupted in the first place, plus store multiple copies at intervals. For example, make a copy at the end of each day; that way, if day 3 is corrupted, day 2 might still be good.

But I agree that sticking to bzip2 might be a good idea as well.

Oh, and finally, to top that off: the best way to secure your files is redundancy. I might go grab another 3TB drive for this.
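
For the checksum step, something as simple as this does it (the filename is just an example):

Code:
# record a checksum next to the archive, then verify it later
sha256sum backup.tar.bz2 > backup.tar.bz2.sha256
sha256sum -c backup.tar.bz2.sha256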

Last edited by Mercury305; 03-16-2014 at 10:43 AM.
 
Old 03-16-2014, 12:56 PM   #13
ruario
Senior Member
 
Registered: Jan 2011
Location: Oslo, Norway
Distribution: Slackware
Posts: 2,557

Rep: Reputation: 1761
Quote:
Originally Posted by Mercury305 View Post
So the solution to your problem is to simply do a checksum after the file is created and compressed. I think both tools have that built in so that shouldn't be a problem.
A checksum will tell you if a file is corrupt, but it won't let you fix it. So no, that does not solve the problem at all. Did you read my linked post?

Fixing the problem involves having some redundancy, either in the form of multiple backups on different media or at the very least parity archives. That way, if you get some corruption due to media failure, you can use a different backup or the parity files to have a chance of correcting it.

Additionally, I advise using internal compression on a file-by-file basis rather than external compression across the entire archive, so that in the worst case you can still recover some of your files (hopefully most of them). With compression across the entire archive, a minor corruption near the start of the file will probably mean the entire backup is a write-off, due to the way compression works.
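
Parity archives can be created with par2, for example; with 10% redundancy you can repair up to roughly 10% worth of damage (the filename is an example):

Code:
# create recovery files alongside the backup, repair from them if needed
par2 create -r10 backup.tar.bz2
par2 repair backup.tar.bz2.par2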
 
Old 03-16-2014, 01:09 PM   #14
J.D.
LQ Newbie
 
Registered: Jun 2013
Posts: 9

Rep: Reputation: Disabled
Quote:
Originally Posted by ruario View Post
A checksum will tell you if a file is corrupt, but it won't let you fix it. So no, that does not solve the problem at all. Did you read my linked post?

Fixing the problem involves having some redundancy, either in the form of multiple backups on different media or at the very least parity archives. That way, if you get some corruption due to media failure, you can use a different backup or the parity files to have a chance of correcting it.

Additionally, I advise using internal compression on a file-by-file basis rather than external compression across the entire archive, so that in the worst case you can still recover some of your files (hopefully most of them). With compression across the entire archive, a minor corruption near the start of the file will probably mean the entire backup is a write-off, due to the way compression works.
These are obvious things, ruario. I only use my external on my MBP; it doesn't make sense to use an external for your desktop unless you are out of drives and too dumb to put one in. But I think you are the one not reading what I wrote (or maybe you didn't check my last edit, which goes into more detail on what I meant). A checksum is the cure to the problem (as in knowing if your data got backed up without problems). Also, your data-corruption-due-to-compression argument is a small risk, and it can also happen to a non-compressed tar (although that's much more easily repaired). The better and more costly solution is, like you said, redundancy, but I also added that in my last edit. Maybe you didn't get a chance to read the edited part? If you actually read what I wrote in there, I'm pretty much on your side. I even agreed with what you wrote about bzip2.

Also, using an SSD is pretty much the best option for this type of stuff, both for speed and for avoiding loss from fragmentation or corruption.

But I appreciate your help.

EDIT: Before you correct me, I realize what you meant by external vs. internal compression. But honestly this doesn't make a difference, because of the checksum.

BTW, this is my other account; I opened it recently because I forgot the password to the other one and lost access to that email. So JD = Mercury305.

Last edited by J.D.; 03-16-2014 at 01:28 PM.
 
Old 03-16-2014, 01:28 PM   #15
ruario
Senior Member
 
Registered: Jan 2011
Location: Oslo, Norway
Distribution: Slackware
Posts: 2,557

Rep: Reputation: 1761
Quote:
Originally Posted by J.D. View Post
or maybe you didn't check my last edit, which goes into more detail on what I meant
Hmm ... perhaps I did not reload the thread between your initial reply and mine.

Quote:
Originally Posted by J.D. View Post
A checksum is the cure to the problem (as in knowing if your data got backed up without problems)
That is not a problem that I raised.

Quote:
Originally Posted by J.D. View Post
Also, your data-corruption-due-to-compression argument is a small risk
You need only a single byte corruption near the start of a compressed archive to ruin everything if you have no extra backups or other forms of redundancy.


Quote:
Originally Posted by J.D. View Post
BTW, this is my other account; I opened it recently because I forgot the password to the other one and lost access to that email. So JD = Mercury305.
OK, hope you manage to get it back.
 
  

