LinuxQuestions.org > Forums > Linux Forums > Linux - General
View Poll Results: What backup solution do you use?
rsync 29 96.67%
Amanda/Zmanda 1 3.33%
Déjà Dup 0 0%
Bacula 1 3.33%
Duplicati 2 6.67%
Multiple Choice Poll. Voters: 30.

Old 08-16-2012, 12:38 AM   #1
workent
LQ Newbie
 
Registered: May 2012
Posts: 6

Rep: Reputation: Disabled
Backup strategy & solutions


Hi all,

I've been thinking recently that I need to develop & deploy a more comprehensive backup strategy than the ad-hoc "solution" I use at present. I've been putting feelers out for opinions on IRC, but thought I should do this in a more structured (& shared) manner.

Some of my current setup:
* FreeNAS with RAIDZ2 (soon RAIDZ3) - RAID 6
* Ubuntu (currently 12.04.x) server running KVM on LVM2 partitions
* A mix of clients, including various POSIX desktops (Ubuntu, CentOS/Fedora, Mac, Android, etc), and a few Windows instances for testing
* Gigabit ethernet

I have a few developments in the works, and a *lot* of "dogfooding":
* In the process of building a redundant FreeNAS with RAIDZ3 for testing
* Moving my VM data onto my NAS & connecting via iSCSI
* Spinning up some SQL VMs for replication
* PXE to simplify triage & cloning
* Scale down the use of heavy clients & start using devices like the Hackberry/Raspberry Pi as thin clients for VDI or terminal services (remote X, VNC, RDP, NX, SPICE, etc), so that I can nuke any client & be back at my previous state over PXE within a day

What I want to implement at the end of the day:
* Incremental client backups of hosts/data to the server
* SQL replication to a dedicated (sandboxed) VM
* Host pauses & makes weekly snapshots of VMs
* Redundant NAS does a daily rsync of data on primary NAS
* NAS makes weekly/monthly archives of backed-up data & pushes it to the secondary NAS, to removable HDDs/media, and/or to an off-site facility, such as TarSnap (Amazon, peer, colo, whatever)
* Periodically test & nuke any archives older than a month

Now I know that rsync (+tar) is the de facto standard for POSIX, and short of getting jiggy with some bash scripting & cron (& probably making some really catastrophic typos), I'm looking for some established systems (& methodologies) I can use for the purpose.
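For reference, the rsync-plus-cron approach can be sketched in a few lines using rsync's --link-dest option, which hard-links unchanged files against the previous run so each snapshot only costs the space of what changed. This is a minimal sketch, not a hardened tool; the function name and all paths are hypothetical:

```shell
#!/bin/sh
# snapshot SRC DEST: make one incremental snapshot of SRC under DEST.
# Unchanged files are hard-linked against the previous snapshot via
# --link-dest; only new or changed files consume additional space.
snapshot() {
    src=$1 dest=$2
    stamp=$(date +%Y-%m-%d_%H%M%S)
    new="$dest/$stamp"
    mkdir -p "$dest"
    if [ -e "$dest/latest" ]; then
        rsync -a --delete --link-dest="$dest/latest" "$src/" "$new/"
    else
        rsync -a "$src/" "$new/"   # first run: plain full copy
    fi
    ln -sfn "$new" "$dest/latest"  # repoint "latest" at the new snapshot
}
```

Hung off cron (e.g. `0 2 * * * snapshot /srv/data /mnt/backup`), that is essentially what tools like rsnapshot and Back-in-Time automate and wrap in a nicer interface.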
There are a few good references available on the subject:
* http://wiki.linuxquestions.org/wiki/Backup
* https://help.ubuntu.com/community/BackupYourSystem
Both are good resources, but do not delve into much detail on the systems listed.

There are a few criteria that I'm trying to keep in mind (please note that these are not only for myself, but stem from interactions & observations with non-techs):
* FLOSS (& active) - I try to encourage people to make use of Open Source Software whenever possible, irrespective of their current OS.
* I've been asked for FLOSS backup solutions on a number of occasions, not always by Linux users, so the system, or at least an agent, needs to have support for multiple OSes (.deb, .rpm, Win, Mac, etc)
** GUI or WebGUI is a big selling-point (probably don't need it myself, but sweetens the deal for newcomers)
* Network support - a client-side agent is OK; not needing one is even better.
* The actual backups/archives need to make sense. I know that a few systems archive (& encrypt) data into monolithic volumes, but if a layman user were to dig into the data to try & locate a file, it could be off-putting.
* Support for (encrypted) on-line/off-site backup, as well as removable disks (i.e. if they plug in a 1TB USB HDD every Friday/weekend)

I've been looking into a few systems, but have not come to any conclusion wrt any of them (just sharing my list):
* rsync & tar
** Grsync
** DeltaCopy
** Synametrics
** cwRsync
** rsnapshot
* Déjà Dup
* Simple Backup & Restore
* Amanda & Zmanda
* Bacula
* Duplicity & Duplicati
* BackupPC
* Synkron
* TimeVault
* FlyBack
* Back-in-Time
* rdiff-backup
* Simple Backup


What I'd like to know:
* Which of these systems have community/forum members been using, & what have your observations been?
* Do you know of systems that meet the aforementioned criteria (including any on the list provided)?
* Are there any excellent systems, that meet the criteria, that I do not have on the list?
* Any wisdom you'd like to share?

Any help & insights would be greatly appreciated.

- J
 
Old 08-17-2012, 08:30 AM   #2
GATTACA
Member
 
Registered: Feb 2002
Location: USA
Distribution: Fedora, CENTOS
Posts: 209

Rep: Reputation: 32
Are you 100% committed to FLOSS?

It sounds from your post that you are looking for a backup solution for a company (correct me if I'm wrong).
If you are dealing with company data, you can't risk its loss due to a software problem.

I use Crashplan as a service http://www.crashplan.com/.
They have enterprise-level support if you are willing to pony up the $$$.

The nice thing about them is that you can back up all 3 major OS types: Winblows, Linux, Mac OS X.
Also, you can access the data from anywhere and they take daily snapshots.

Just a commercial solution that works really well.
(just my 0.02)
 
Old 08-17-2012, 04:51 PM   #3
workent
LQ Newbie
 
Registered: May 2012
Posts: 6

Original Poster
Rep: Reputation: Disabled
Hi GATTACA,

I try to make use of FLOSS wherever & whenever I can (with a few exceptions - mostly dogfooding for support purposes).
In principle I'm 100% committed to FLOSS; in practice, maybe not entirely (close to 99%).

My SoHo setup is mostly FLOSS, as I like to have 1st-hand experience of the systems I recommend to others.

Crashplan is a compelling solution, and is one of the hosted services that I can put forward to interested parties who need off-site backup.
Other services/solutions in that space I've encountered include (AlternativeTo has a good list of these):
* a co-lo server, partner/peer site
* Amazon
* Dropbox
* ZumoDrive
* ADrive
* Ubuntu One
* SpiderOak
* Wuala
* Jungle Disk
* TarSnap
* RetroShare

The remote/online/"cloud" stuff is only the last link in the chain (still an important one, granted) - it's the internal backup regimen & system that I'm looking at at present.
 
Old 08-21-2012, 03:35 PM   #4
linux999
LQ Newbie
 
Registered: Aug 2012
Posts: 9

Rep: Reputation: Disabled
I use rsync for backing up miscellaneous stuff, and an encrypted USB stick for the more sensitive stuff.
 
Old 09-04-2012, 09:45 PM   #5
KenJackson
Member
 
Registered: Jul 2006
Location: Maryland, USA
Distribution: Fedora and others
Posts: 757

Rep: Reputation: 145
I haven't used any of those. I use a cron script which tars, encrypts and copies to my gaggle of USB sticks.

One reason I don't want to use an existing solution is that I expect there is a higher chance that it could be attacked. The more knowledge the enemy has, the more he can hurt you. But I do it all my own way, so attackers would have to shoot in the dark.

Another reason is that I enjoy writing shell scripts.
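A script like that doesn't need to be elaborate. Here is a minimal sketch of the tar-then-encrypt step, using openssl for the symmetric encryption; this is not KenJackson's actual script, and the function name and arguments are made up for illustration:

```shell
#!/bin/sh
# backup_encrypted SRC KEYFILE OUT:
# tar+gzip the SRC directory and symmetrically encrypt the stream
# with the passphrase stored in KEYFILE, writing the result to OUT.
backup_encrypted() {
    tar czf - -C "$(dirname "$1")" "$(basename "$1")" \
      | openssl enc -aes-256-cbc -pbkdf2 -salt -pass "file:$2" -out "$3"
}
```

Copying the resulting file to each USB stick and restoring with `openssl enc -d ... | tar xzf -` completes the loop; cron handles the scheduling.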
 
1 member found this post helpful.
Old 09-04-2012, 09:57 PM   #6
suicidaleggroll
LQ Guru
 
Registered: Nov 2010
Location: Colorado
Distribution: OpenSUSE, CentOS
Posts: 5,573

Rep: Reputation: 2142
I wrote a script to do incremental backups a while back. Basically the first time you run it it just copies everything over to your other drive. Whenever you run it after that first time, it compares each file you want to back up to the version in the previous backup. If they're the same, it hard links the file from the previous backup to the new backup. If they're different, it copies the file into the new backup.

The end result is you have a set of backups, one for each time you run the code. Each backup contains the active version of every file at the time of the backup, however the hard drive space used is only that of the files that changed since the last backup. Since they're all hard links, you can remove any backup you want without affecting the other backups. This will easily let you do say daily backups, then when a backup is more than a month old you can decimate them to weekly backups, then monthly, and so on...all without using any more hard drive space than a single backup plus extra copies of whatever files changed between backups.
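The scheme described above can be sketched in plain shell. This is a simplified stand-in (regular files only, no special-file or attribute handling), not suicidaleggroll's actual script, and the function name and arguments are hypothetical:

```shell
#!/bin/sh
# incr_backup SRC PREV NEW: hard-link incremental backup.
# Files unchanged since the PREV snapshot are hard-linked into NEW;
# new or changed files are copied. Pass "" as PREV for the first run.
incr_backup() {
    src=$1 prev=$2 new=$3
    ( cd "$src" && find . -type f ) | while IFS= read -r f; do
        mkdir -p "$new/$(dirname "$f")"
        if [ -n "$prev" ] && cmp -s "$src/$f" "$prev/$f" 2>/dev/null; then
            ln "$prev/$f" "$new/$f"    # unchanged: share the inode
        else
            cp -p "$src/$f" "$new/$f"  # new or changed: real copy
        fi
    done
}
```

Deleting an old snapshot just drops its links; data shared with other snapshots survives, which is what makes the daily-to-weekly-to-monthly thinning so cheap.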

Last edited by suicidaleggroll; 09-04-2012 at 10:00 PM.
 
Old 09-04-2012, 10:06 PM   #7
workent
LQ Newbie
 
Registered: May 2012
Posts: 6

Original Poster
Rep: Reputation: Disabled
Quote:
Originally Posted by suicidaleggroll View Post
I wrote a script to do incremental backups a while back. Basically the first time you run it it just copies everything over to your other drive. Whenever you run it after that first time, it compares each file you want to back up to the version in the previous backup. If they're the same, it hard links the file from the previous backup to the new backup. If they're different, it copies the file into the new backup.

The end result is you have a set of backups, one for each time you run the code. Each backup contains the active version of every file at the time of the backup, however the hard drive space used is only that of the files that changed since the last backup. Since they're all hard links, you can remove any backup you want without affecting the other backups. This will easily let you do say daily backups, then when a backup is more than a month old you can decimate them to weekly backups, then monthly, and so on...all without using any more hard drive space than a single backup plus extra copies of whatever files changed between backups.
This is almost identical to what I have in mind for my own setup, or rather the proposed one.

This seems to be the most logical & robust manner to keep incremental backups in a relatively accessible manner, without too much resource overhead.

Basically, what I'm looking for is such a system implemented in a nice package/GUI that I can then suggest to non-techs (& techs) to keep their house in order.
 
Old 09-05-2012, 03:08 AM   #8
tigger908
LQ Newbie
 
Registered: Jul 2012
Location: Wales
Distribution: Fedora
Posts: 4

Rep: Reputation: Disabled
I'm using Lucky Backup http://luckybackup.sourceforge.net/ to put stuff on a remote server via ssh. It uses delta compression and should allow automation through cron (I can't seem to get that to work, however!).
 
Old 09-18-2012, 06:40 AM   #9
JeremyBoden
Senior Member
 
Registered: Nov 2011
Location: London, UK
Distribution: Debian
Posts: 1,947

Rep: Reputation: 511
I use rsync to an external disk periodically (using NFS) and
"Simple Backup" on a daily basis to an internal disk.
 
Old 09-18-2012, 07:54 AM   #10
Habitual
LQ Veteran
 
Registered: Jan 2011
Location: Abingdon, VA
Distribution: Catalina
Posts: 9,374
Blog Entries: 37

Rep: Reputation: Disabled
Quote:
Originally Posted by KenJackson View Post
...I use a cron script which tars, encrypts and copies to my gaggle of USB sticks....
I personally find mounting our (internal) AWS s3:// buckets the easiest to "maintain". I use mysqldump for the entirety of our Zabbix database (historical data in every dump), plus date-stamped .tar.gz archives of the DocumentRoot of every Apache virtualhost (and the applicable MySQL db, if one is installed for that same virtualhost). It's probably not ideal, but it works for me.
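For illustration, the per-vhost part of such a routine might look something like the sketch below. This is not Habitual's actual script; every function name, path, and database name is hypothetical:

```shell
#!/bin/sh
# backup_vhost NAME DOCROOT DEST: date-stamped tar.gz of one vhost's
# DocumentRoot, written under DEST.
backup_vhost() {
    tar czf "$3/$1-$(date +%F).tar.gz" \
        -C "$(dirname "$2")" "$(basename "$2")"
}

# dump_db DB DEST: matching gzipped mysqldump of the vhost's database.
dump_db() {
    mysqldump --single-transaction "$1" | gzip > "$2/$1-$(date +%F).sql.gz"
}
```

A nightly cron loop over the vhost list, followed by a copy onto the mounted s3:// bucket, would round out the routine described above.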

Quote:
Originally Posted by KenJackson View Post
Another reason is that I enjoy writing shell scripts.
Ditto.

I have a Bacula server installed, but I quickly got away from it when I was disappointed by the "proprietary format" of the archive.

I can 'tar zxf *.tar.gz' faster than I can re-install and re-configure Bacula, so it just sits there. I had issues using it to grab some SQLExpress dbs on Windows hosts, where I didn't want to enable any Shadow Copy services (it's not my Windows server), and the server is on a closed and proprietary grid environment.

I don't know if that helps but there are going to be many replies to your inquiry, and probably just as many creative solutions.

You sound prepared, so you should land on your feet. Good luck!
 
Old 09-19-2012, 07:41 PM   #11
chrism01
LQ Guru
 
Registered: Aug 2004
Location: Sydney
Distribution: Rocky 9.2
Posts: 18,359

Rep: Reputation: 2751
Quote:
I have a Bacula Server installed but I quickly got away from that when I was disappointed by the "proprietary format" of the archive.
For anyone else who's worried about that issue: Amanda/Zmanda uses native tools in the background.
 
Old 09-19-2012, 07:47 PM   #12
jefro
Moderator
 
Registered: Mar 2008
Posts: 21,982

Rep: Reputation: 3625
I keep wanting to try Fog.

I still use G4U a lot. Guess I could just use dd over ftp or nc.

A real-time or live-state backup in open source might be an idea.
 
  

