LinuxQuestions.org (/questions/)
-   2009 LinuxQuestions.org Members Choice Awards (https://www.linuxquestions.org/questions/2009-linuxquestions-org-members-choice-awards-91/)
-   -   Backup Application of the Year (https://www.linuxquestions.org/questions/2009-linuxquestions-org-members-choice-awards-91/backup-application-of-the-year-780674/)

jeremy 01-07-2010 04:00 PM

Backup Application of the Year
 
What's your preferred tool for backups?

--jeremy

Lee_Ball 01-07-2010 04:54 PM

Clonezilla. I love it, although I wouldn't really call it a backup tool. More a snapshot tool.

diilbert 01-08-2010 07:16 PM

I have just always used rdiff-backup.

jlinkels 01-08-2010 07:56 PM

rsync, couldn't be easier. Other "real" backup suites are too complicated. The more complicated a backup solution is, the more opaque the mechanism, and I'm afraid I'll be pulling my hair out when I actually have to perform a full restore.
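For illustration, a minimal mirror-style backup of the kind being described can be a single command (paths are just examples):

# mirror /home to an external drive; only files that changed are transferred
rsync -av --delete /home/ /mnt/backup/home/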

jlinkels

~sHyLoCk~ 01-08-2010 08:25 PM

A custom rsync script, it's in my sig. ;-)

geek745 01-10-2010 05:23 PM

I use full versioned backup with a local "server" replicating my repo, currently using Mercurial for all but a couple of directories. It is really easy to sync across desktop and laptop, on any operating system, and to keep both copies up to date, even just by myself :)
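For anyone unfamiliar with the workflow, the Mercurial side of such a setup might look roughly like this (host and paths are hypothetical):

# one-time: replicate the repository onto the local backup "server"
hg clone ~/projects ssh://backupbox//srv/hg/projects

# day to day: record a snapshot locally, then push it to the replica
hg commit -A -m "daily snapshot"
hg push ssh://backupbox//srv/hg/projects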

gotfw 01-10-2010 07:26 PM

For an enterprise-class backup solution in a mixed MS and *nix environment, Bacula is my preferred choice. Note that setting such a solution up correctly will require some investment in actually reading the manual, analysis, capacity planning, and tuning, but once you've done your due diligence and rolled to production, Bacula does its thing with minimal hassle. Role-based access, encryption, d2d2t, etc. Bacula offers a comprehensive, truly open source solution, not just a crippled "community edition" teaser calculated to be a conduit to a costly, full-featured "enterprise edition". Amanda aficionados will cite database issues with Bacula, but I've never experienced any -- but then I also allocate database hardware appropriate for the task at hand. And having the SQL lookup capability is a godsend on networks with more than a few systems.

Otherwise, if we're talking about backup for a couple of systems, then dump gives me all I need.

jlinkels 01-10-2010 08:22 PM

Gotfw, a question about Bacula. I did RTFM, even twice. But when I came to the bare metal restore chapter I started to doubt. To me it seemed restore was easy as long as you have your complete Bacula setup and infrastructure running. But what happens if you just have a tape and a new server fresh from Dell, with RAID arrays and all? Have you ever done a bare metal restore, and how did it go?

jlinkels

portamenteff 01-13-2010 09:49 AM

rsync, so easy even I can do it.

linus72 01-13-2010 10:49 AM

Remastersys
which is only for Debian/Ubuntu, but it is great and has many options

Super TWiT 01-13-2010 01:30 PM

Rsync rocks! Especially if you back up to slow USB devices, since only the changes are transferred.

raju.mopidevi 01-13-2010 08:38 PM

tar

choogendyk 01-18-2010 02:34 PM

Amanda -- see Ten Things I Like About Amanda.

Of course, Amanda is network backup software, intended to coordinate the backup of a number of computers to a centralized backup server. Most of the items listed in the poll are not in the same class. They are intended as tools to be used in scripts, one-off backup events, or incorporated into other software. Amanda, for example, uses gnu-tar as a backup tool.

rsync is another of those basic tools that ends up getting used in lots of other scripts and backup software.

In the realm of just backing up your own user files, I might vote for rsync or one of its offshoots. But, forced to make a choice between basic tools and enterprise-level network backup software, I vote for Amanda for all the reasons listed in the article linked above.

acousticm 01-19-2010 03:31 AM

Quote:

Originally Posted by jlinkels (Post 3821923)
Gotfw, a question about Bacula. I did RTFM, even twice. But when I came to the bare metal restore chapter I started to doubt. To me it seemed restore was easy as long as you have your complete Bacula setup and infrastructure running. But what happens if you just have a tape and a new server fresh from Dell, with RAID arrays and all? Have you ever done a bare metal restore, and how did it go?

jlinkels

Which is why I like amanda :)

I have done a couple of bare metal restores with Amanda, including one late in the evening on the central server for the entire office, when the RAID had two HD failures in half an hour.

So not only can you (relatively easily) do it, it's even doable when under (very) high pressure :)

jlinkels 01-19-2010 06:03 AM

Any docs on such a bare metal restore? User manual from amanda.org?

jlinkels

choogendyk 01-19-2010 08:16 PM

Quote:

Originally Posted by gotfw (Post 3821873)
... not just a crippled "community edition" teaser calculated to be a conduit to a costly, full-featured "enterprise edition". Amanda aficionados ...

That's an inaccurate and unfair jab at Amanda.

The Amanda community edition has seen a constant stream of improvements and additions since Zmanda came into existence and hired full time programmers to work on Amanda. There are a great many users who support large network backup environments using the community edition. There are relatively few features that are held back for the Enterprise edition, and those are things that were developed exclusively by Zmanda's paid programmers. The main feature, I think, is the Zmanda Management Console, and most sysadmins who work extensively with open source software get along just fine without that. It's an addition that is important for the corporate environments that Zmanda is pushing into, and where their income comes from support contracts. The benefit to us in the open source community is that those efforts push the overall development of Amanda, and most functionality features that we really care about go directly into the open source code.

Efforts by Zmanda over recent years that have gone directly into the open source code include: (1) Closing all the security holes found by automated source code scanning, resulting in Amanda being one of a handful of open source projects to be security certified by the Department of Homeland Security; (2) A new perl interface allowing perl modules to link directly to Amanda, to support writing Amanda applications in Perl; (3) A completely new device API -- a pluggable interface to storage devices, supporting tapes, vtapes, RAIT, and Amazon S3; (4) New encryption plugins based on gpg; (5) A new set of configuration tools to make the initial installation and setup of Amanda easier; (6) The addition of almost 200 unit tests to check the installation of Amanda; (7) Incorporation of in depth support for ZFS; (8) New APIs for scripting, for a changer interface supporting concurrent use of multiple devices and changers, and for moving and filtering data with maximal efficiency (the transfer API); and so on.

I hardly think that is a teaser. All the core code is there.

choogendyk 01-19-2010 08:36 PM

Quote:

Originally Posted by jlinkels (Post 3832009)
Any docs on such a bare metal restore? User manual from amanda.org?

You'll find most of the useful documentation on the wiki.

The tools that Amanda uses are the native tools for the platform. So, in my case, that ends up being ufsdump and ufsrestore. The backup stream is broken into pieces, gzipped if you request it, and dd'd to tape if you are using tape. The first file on the tape gives you the unix commands required to read the tape. The directions for reading and using those are on the wiki.
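As a rough illustration of what that ends up looking like in practice (the device name, block size, and restore program here are only examples; the exact commands for a given dump are printed in its tape header, as described on the wiki):

# read the 32 KB Amanda header of a backup image; it shows the suggested restore command
dd if=/dev/nst0 bs=32k count=1

# skip the header, decompress, and hand the stream to the native restore tool
dd if=/dev/nst0 bs=32k skip=1 | gzip -dc | ufsrestore -ivf -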

Bare metal restore is going to be platform and environment specific. I've done it a couple of times and it is different each time. I had a worst case scenario where a server had been improperly housed for years in a room that chronically overheated. This year they turned the building A/C off in early November, and then we had an extraordinary (for our region) heat wave in late November. The server finally died. We had already replaced a number of parts, and this time the boot drive failed, something on the motherboard failed, and some other stuff we couldn't figure out. We had some comparable spare hand-me-down servers that had served their lives in properly cooled server rooms. We snagged one, laid them both out on adjacent tables, and proceeded to do surgery and transplant. After several hours we had put together a new server that had most of the drives and external arrays, the tape library, PCI cards exchanged, identity chips swapped, and so on. We then booted off CD, reconfigured the boot drive, and began recovering from tape backups. It was messy in part because the tape drive and library aren't part of the stock OS configuration, and we didn't have a spare DDS/3 drive of the kind these servers typically have.

In other cases, I was able to get a quick elementary recovery from a boot drive backup on DDS/3 and then follow up with up-to-date recovery from AIT5 backup tapes once the base configuration was recovered (including device drivers necessary for the AIT5).

If someone were running x86 with Linux, they might be using something like a Knoppix CD for the bare metal step, but I don't really know anything about that (I've just heard of it). Anyway, it's all based on whatever native tools your environment works with.

gotfw 01-20-2010 12:03 AM

Quote:

Originally Posted by jlinkels (Post 3821923)
Gotfw, a question about Bacula. I did RTFM, even twice. But when I came to the bare metal restore chapter I started to doubt. To me it seemed restore was easy as long as you have your complete Bacula setup and infrastructure running. But what happens if you just have a tape and a new server fresh from Dell, with RAID arrays and all? Have you ever done a bare metal restore, and how did it go?

jlinkels

Yes, both simulated and real. They went just fine, as long as you have read the fine manual and prepared appropriately. I have also simulated failure of the database machine. That's a bit more work, but not much, because that is a dedicated box.

To others who've subsequently commented upon my comments: to reiterate my original post, for just a few boxes I'd use dump and friends. Not much point in anything more complex than that unless you're just in it for the exercise. For a larger environment of a couple hundred boxes or more, Bacula's database helps immensely when some PHB type wants you to restore file bar from 15 weeks ago to client foo. If I had a big install where I wasn't using Bacula, Amanda would be high on my list as second choice. I haven't touched Amanda in a good 5 years (or more?), however, so I'd have to spend some time re-evaluating and getting up to speed.

I've been in this game too many years to turn this into a religious war. Rather, I will simply suggest that anyone rolling out an enterprise-grade backup solution in a mixed environment of a few hundred boxes, autochangers, and requisite compliance with specific retention policies dictated from on high should spend some weeks evaluating Bacula AND Amanda, simulating bare metal restores, etc. Then make your decision accordingly. Last time I did this, Bacula was the clear winner. Reportedly Amanda has improved significantly in recent years, however, so if it works for you, great.

P.S.: I will concede that the Bacula manual is extensive and non-trivial to wrap your mind around the first or even second time through, particularly if it is one's first exposure to an enterprise-grade backup solution. For me it was not, and I still had to make a few passes through it. Bacula allows for some complex possibilities, and also some pretty simple ones. It all depends on needs. If all you're looking to do is follow examples to get quick solutions, you're not going to like Bacula. But then you're not going to like Amanda either....


Peace ;)

jlinkels 01-20-2010 01:16 AM

choogendyk and gotfw, thank both of you for the additional explanation.

jlinkels

anticapitalista 01-20-2010 08:49 AM

luckybackup which uses rsync

http://luckybackup.sourceforge.net/

saltyp 01-21-2010 05:37 AM

choogendyk and gotfw, that was exceptional and professional rhetoric. Thank you both. Stuck in a M$ + Symantec world myself. But thank you both for showing some light. jlinkels should also be praised for a very relevant question.

BLuFeNiX 01-21-2010 10:49 PM

dd if=/dev/sda of=/dev/sdb
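(Purely as a suggestion: a larger block size and some error handling are commonly added so a whole-disk copy runs faster and survives bad sectors.)

dd if=/dev/sda of=/dev/sdb bs=4M conv=noerror,sync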

sswam 01-22-2010 03:05 AM

I use git, with some wrapper program I wrote for it, to back up and sync my projects, sites, etc. It's very good :)
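The wrapper itself isn't posted, but a minimal script in that spirit might be little more than this (directory and remote name are hypothetical):

#!/bin/sh
# snapshot the working tree and push it to an off-machine mirror
cd "$HOME/projects" || exit 1
git add -A
git commit -m "backup $(date +%F)" || true   # "nothing to commit" is fine
git push backupbox master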

chicken76 01-22-2010 10:25 AM

Areca also deserves to be mentioned.

jeremy 01-22-2010 10:28 AM

Areca-Backup has been added.

--jeremy

diaco 01-22-2010 10:48 AM

sbackup in ubuntu

FredGSanford 01-22-2010 10:56 PM

I vote for sbackup and partimage which are not included.

r1d3r 01-23-2010 12:52 AM

the good old rsync.

schneidz 01-27-2010 10:39 AM

i wish dd was an option.

landroni 01-27-2010 10:51 AM

What about Back In Time?

SCerovec 01-29-2010 03:17 PM

Clonezilla saved my a$$ a couple of times this year and last.
My vote is "thank you, Clonezilla team"
:)

wan3 02-01-2010 05:04 AM

I vote for dump!
It is about twice as fast as every other solution (It is very near to the drive speed) like tar, cpio, rsync. Those other tools works on the filesystem. dump works directly on the inode-structure. Only dd is faster (if you do not have to save to much free space). But dd can do no incremental backup. There is no free solution, which is faster as an incremental backup with dump. Most backup tools do an incremental backup of a whole directory if you only change its name or permissions. dump saves these changes correctly but only has to save this few bytes not the whole contents of the directorys. Most other backup software wasts a lot of time and memory on such things. Also a test showed up that dump is very good on the linux/unix special files and metadata. dump can work over network but you will loose a lot of speed.
Ok dump has some disadvantages:
It is dangerous to run on a file system, which is mounted writable. So to backup your operating system root you need another system on an additional partition or you have to boot from CD or something like this (like clonezilla does the job). But because dump is so fast I see no problem to temporally remount your data readonly and do the backup job. Originally dump was written for tape backup. But you can write the dumps to files or DVD as well. If a disk gets full, dump can switch to another disk or DVD, because it was designed to ask for another tape.
dump is not good for long time archival!
It is a backup tool not an archive tool. The problem is that the file format is system and file system dependent and not very well documented (you will have the same platform dependence problem with other low level file images). But I was used for years and years for cross platform transfer and worked. But it was never designed for this.
My recommendation:
Do regular incremental backup on a daily or weekly schedule with a super fast incremental dump. And do an archive snapshot for example monthly or quarterly with an archival tool like tar or cpio ...
If speed does not mater rsync does a good job as well.
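To make the recommendation concrete, a weekly full plus daily incremental cycle with dump could look something like this (device and file names are just examples):

# weekly level-0 (full) dump of the root filesystem, recording it in /etc/dumpdates
dump -0u -f /backup/root.full.dump /dev/sda1

# daily level-1 incremental: only what changed since the last lower-level dump
dump -1u -f /backup/root.incr.dump /dev/sda1

# restoring (run from inside the freshly created target filesystem):
restore -rf /backup/root.full.dump
restore -rf /backup/root.incr.dump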

Gene Heskett 02-04-2010 09:21 AM

I voted for Amanda because once it is set up and running, you can essentially forget it; the user's crontab entry takes care of all the work. This of course is for machines that run 24/7, so Amanda gets run in the wee hours here.
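For reference, that crontab entry is typically just a nightly amdump run under the backup user, along these lines (configuration name and time are examples):

# run the nightly Amanda backup of the "DailySet1" configuration at 2:45 am
45 2 * * * /usr/sbin/amdump DailySet1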

And I wrote some scripts that wrap it up and manage the database so that a bare metal recovery actually makes you current with the last backup run.

Using virtual tapes on a big hard drive, in my case a terabyte drive, the recovery process is many times faster than with real tape, since the hard drive is truly random access. And it's a heck of a lot more dependable than I found tapes to be; they were always becoming unreadable for some reason or other, and you can invest thousands in a tape library and tapes. Terabyte drives are commodity drives today, and have head-flying lifetimes thousands of hours longer than any tape drive will ever have. Once I had switched to vtapes, I wondered why I had wasted so much time and almost daily aggravation trying to do it the classical way with tapes. Now I just read the morning email from Amanda and put the printout away. What's not to like?

Cheers, Gene.

gotfw 02-09-2010 11:38 AM

Wow! Never would have thought rsync would have taken this one by a landslide. Makes sense, I guess when you take into account that there's likely lots more hobbyists at LQ than enterprise sysadmin types. Still, I am surprised.

raju.mopidevi 02-09-2010 04:53 PM

Never thought that rsync would be first!

jlinkels 02-09-2010 06:31 PM

Remarkable that two very basic tools (rsync and tar) are among the favorites. Would this be because these tools are simple to use and many (most?) users prefer to use a tool rather than read a manual? Although the man page of rsync isn't really simple, one or two basic commands are all you need. Or is it because many users prefer a simple tool so nothing is hidden or automatic, giving the user a sense of being in control and therefore more confidence?

Quote:

Originally Posted by gotfw (Post 3858012)
Makes sense, I guess when you take into account that there's likely lots more hobbyists at LQ than enterprise sysadmin types. Still, I am surprised.

I hear what you say :). However, is that true? I thought that Linux has a poor share of the desktop market, but the share in the server market (web servers, databases) is significant. Haven't looked at statistics for a long time though.

jlinkels

Linux.tar.gz 03-08-2010 03:49 AM

g4l !!!
Ghost for Linux !

SilentSam 03-08-2010 07:35 AM

I'm surprised that neither dd nor partimage even made it to the list.

tallship 03-10-2010 06:50 AM

Someone gave me an odd look once when they fished for a good backup solution from me and I said, "tar". I still get a giggle remembering it to this day.

Really though, tar and rsync are the best backup solutions for simplicity and ease, but had I made it to the poll in time to vote I would have actually chosen Bacula.

Not because a large suite of utilities is better than tar, or rsync, but because people will still look at you funny when you mention them in an enterprise setting.

That's why I tell people to use Bacula or Backup Express by Syncsort (hey, it's not my fault they'll feel better when they pay a bunch of money for a commercial solution), and then turn around and give them all their restores from NFS mounts that were rsync'd, or from gzipped tarballs I make anyway with cron.

Of course, I'm not usually asked how I got their restored files, so they just naturally assume that all that money they spent on the pricey solution paid for itself.

But I do like Bacula, and have been using it for years. It's good stuff.

Oh, and I agree with the guy who said he can't believe that dd didn't make it to the list either ;)

upengan78 03-14-2011 09:50 AM

Hey,

I know this is late, but I just registered on this site ;)

I see people have voted Clonezilla highly, but I am surprised. Although I agree Clonezilla is a terrific tool for imaging drives or partitions on the same machine or on a remote machine, you always need to shut down your system, and I don't think you can afford to take your servers down just to make a full disk or partition image.

I would have voted for Amanda. It works quite nicely with virtual tapes as well, on multiple platforms such as Linux, Solaris, etc.

If I were allowed to vote twice, BackupPC would have been my other choice; it has a user-friendly web interface.

