Forums > 2009 Members Choice Awards
This forum is for the 2009 Members Choice Awards.
You can now vote for your favorite products of 2009. This is your chance to be heard! Voting ends on February 9th.


View Poll Results: Backup Application of the Year
rsync          170 votes  (48.99%)
AMANDA          23 votes   (6.63%)
Bacula          11 votes   (3.17%)
BackupPC        15 votes   (4.32%)
dump             6 votes   (1.73%)
DAR              3 votes   (0.86%)
tar             50 votes  (14.41%)
Mondo Rescue     2 votes   (0.58%)
Time Vault       3 votes   (0.86%)
Clonezilla      31 votes   (8.93%)
Duplicity        2 votes   (0.58%)
FlyBack          2 votes   (0.58%)
cpio             5 votes   (1.44%)
rsnapshot       11 votes   (3.17%)
rdiff-backup    11 votes   (3.17%)
Areca-Backup     2 votes   (0.58%)
Voters: 347. You may not vote on this poll

Old 01-19-2010, 08:16 PM   #16
Senior Member
Registered: Aug 2007
Location: Massachusetts, USA
Distribution: Solaris 9 & 10, Mac OS X, Ubuntu Server
Posts: 1,194

Rep: Reputation: 105

Originally Posted by gotfw View Post
... not just a crippled "community edition" teaser calculated to be conduit to costly full featured "enterprise edition". Amanda aficionados ...
That's an inaccurate and unfair jab at Amanda.

The Amanda community edition has seen a constant stream of improvements and additions since Zmanda came into existence and hired full-time programmers to work on Amanda. A great many users support large network backup environments using the community edition. Relatively few features are held back for the Enterprise edition, and those are things that were developed exclusively by Zmanda's paid programmers. The main one, I think, is the Zmanda Management Console, and most sysadmins who work extensively with open source software get along just fine without it. It's an addition that matters for the corporate environments Zmanda is pushing into, where its income comes from support contracts. The benefit to us in the open source community is that those efforts push the overall development of Amanda forward, and most of the features we really care about go directly into the open source code.

Efforts by Zmanda over recent years that have gone directly into the open source code include: (1) Closing all the security holes found by automated source code scanning, making Amanda one of a handful of open source projects to be security certified by the Department of Homeland Security; (2) A new Perl interface allowing Perl modules to link directly to Amanda, to support writing Amanda applications in Perl; (3) A completely new device API, a pluggable interface to storage devices supporting tapes, vtapes, RAIT, and Amazon S3; (4) New encryption plugins based on gpg; (5) A new set of configuration tools to make the initial installation and setup of Amanda easier; (6) The addition of almost 200 unit tests to check the installation of Amanda; (7) Incorporation of in-depth support for ZFS; (8) New APIs for scripting, for a changer interface supporting concurrent use of multiple devices and changers, and for moving and filtering data with maximal efficiency (the transfer API); and so on.

I hardly think that is a teaser. All the core code is there.

Last edited by choogendyk; 01-19-2010 at 08:17 PM.
Old 01-19-2010, 08:36 PM   #17
Senior Member
Registered: Aug 2007
Location: Massachusetts, USA
Distribution: Solaris 9 & 10, Mac OS X, Ubuntu Server
Posts: 1,194

Rep: Reputation: 105
Originally Posted by jlinkels View Post
Any docs on such a bare metal restore? User manual from
You'll find most of the useful documentation on the wiki.

The tools that Amanda uses are the native tools for the platform. So, in my case, that ends up being ufsdump and ufsrestore. The backup stream is broken into pieces, gzipped if you request it, and dd'd to tape if you are using tape. The first file on the tape gives you the Unix commands required to read it. The directions for reading and using those are on the wiki.
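The recovery path described above can be sketched roughly as follows. The tape device name, block size, and ufsrestore flags are assumptions from a typical Solaris/Amanda setup, not details from this thread, and in the runnable part a plain file stands in for the tape so the dd step can be tried anywhere:

```shell
# On a real tape, manual recovery would look roughly like this
# (device name /dev/rmt/0n and bs=32k are assumptions):
#   mt -f /dev/rmt/0n rewind
#   dd if=/dev/rmt/0n bs=32k count=1      # first tape file: plain-text restore instructions
#   mt -f /dev/rmt/0n fsf 1               # skip forward to the dump image
#   dd if=/dev/rmt/0n bs=32k | gzip -dc | ufsrestore -ivf -
# Demo: a plain file stands in for the tape device.
TAPE=$(mktemp)
printf 'AMANDA: to restore, pipe the image through gzip -dc and ufsrestore\n' > "$TAPE"
dd if="$TAPE" bs=32k count=1 2>/dev/null   # reading the "first file" prints the instructions
rm -f "$TAPE"
```

The point the post makes is that the first file on the tape is self-describing, so recovery needs nothing beyond stock Unix tools.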

Bare metal restore is going to be platform and environment specific. I've done it a couple of times and it is different each time. I had a worst-case scenario where a server had been improperly housed for years in a room that chronically overheated. This year they turned the building A/C off in early November, and then we had an extraordinary (for our region) heat wave in late November. The server finally died. We had already replaced a number of parts, and this time the boot drive failed, something on the motherboard failed, and some other stuff we couldn't figure out. We had some comparable spare hand-me-down servers that had served their lives in properly cooled server rooms. We snagged one, laid them both out on adjacent tables, and proceeded to do surgery and transplant. After several hours we had put together a new server with most of the drives and external arrays, the tape library, PCI cards exchanged, identity chips swapped, and so on. We then booted off CD, reconfigured the boot drive, and began recovering from tape backups. It was messy, in part because the tape drive and library aren't part of the stock OS configuration, and we didn't have a spare of the DDS/3 drive these servers typically have.

In other cases, I was able to get a quick elementary recovery from a boot drive backup on DDS/3 and then follow up with up-to-date recovery from AIT5 backup tapes once the base configuration was recovered (including device drivers necessary for the AIT5).

If someone were running x86 with Linux, they might use something like a Knoppix CD for the bare metal step, but I don't really know anything about that (I've only heard of it). Anyway, it's all based on whatever native tools your environment works with.
Old 01-20-2010, 12:03 AM   #18
Registered: Jan 2007
Posts: 416

Rep: Reputation: 70
Originally Posted by jlinkels View Post
Gotfw, a question about Bacula. I did RTFM, even twice. But when I came to the bare metal restore chapter I started to doubt. To me it seemed restore was easy if you have your complete Bacula setup and infrastructure running. But what happens if you just have a tape and a new server just coming in from Dell with RAID arrays and all? Have you ever done a bare metal restore, and how did it go?

Yes, both simulated and real. They went just fine, as long as you have read the fine manual and prepared appropriately. I have also simulated failure of the database machine. That's a bit more work, but not much, because that is a dedicated box.

To others who've subsequently commented upon my comments, to reiterate my original post: for just a few boxes I'd use dump and friends. Not much point in anything more complex than that unless you're just in it for the exercise. For a larger environment of a couple hundred boxes or more, Bacula's database helps immensely when some PHB type wants you to restore file bar from 15 weeks ago to client foo. If I had a big install where I wasn't using Bacula, Amanda would be high on my list as second choice. I haven't touched Amanda in a good 5 years (or more?), however, so I'd have to spend some time re-evaluating and getting up to speed.

I've been in this game too many years to turn this into a religious war. Rather, I'll simply suggest that anyone rolling out an enterprise-grade backup solution in a mixed environment of a few hundred boxes, autochangers, and requisite compliance with specific retention policies dictated from on high should spend some weeks evaluating Bacula AND Amanda, simulating bare metal restores, etc. Then make your decision accordingly. Last time I did this, Bacula was the clear winner. Reportedly Amanda has improved significantly in recent years, however, so if it works for you, great.

P.S.: I will concede that the Bacula manual is extensive and non-trivial to wrap your mind around the first or even second time through, particularly if it's one's first exposure to an enterprise-grade backup solution. For me it was not, and I still had to make a few passes through it. Bacula allows for some complex possibilities, and also some pretty simple ones. It all depends on needs. If all you're looking to do is follow examples to get quick solutions, you're not going to like Bacula. But then you're not going to like Amanda either....


Last edited by gotfw; 01-20-2010 at 12:12 AM.
Old 01-20-2010, 01:16 AM   #19
LQ Guru
Registered: Oct 2003
Location: Bonaire, Leeuwarden
Distribution: Debian /Jessie/Stretch/Sid, Linux Mint DE
Posts: 5,182

Rep: Reputation: 1017
choogendyk and gotfw, thank both of you for the additional explanation.

Old 01-20-2010, 08:49 AM   #20
Registered: May 2005
Location: Greece
Distribution: antiX using herbstluftwm, fluxbox, IceWM and jwm.
Posts: 400

Rep: Reputation: 112
luckybackup, which uses rsync.
Old 01-21-2010, 05:37 AM   #21
LQ Newbie
Registered: Dec 2006
Location: Australia
Distribution: OpenSuSE 11, Linux Mint
Posts: 16

Rep: Reputation: 0
choogendyk and gotfw, that was exceptional and professional rhetoric. Thank you both. Stuck in a M$ + Symantec world myself. But thank you both for showing some light. jlinkels should also be praised for a very relevant question.

Last edited by saltyp; 01-21-2010 at 05:39 AM.
Old 01-21-2010, 10:49 PM   #22
LQ Newbie
Registered: Dec 2009
Posts: 8

Rep: Reputation: 0
dd if=/dev/sda of=/dev/sdb
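A raw dd clone like the one above is usually run with an explicit block size, plus conv=noerror,sync when the source disk is failing. A minimal sketch, with plain files standing in for /dev/sda and /dev/sdb so it can be tried safely:

```shell
# SRC and DST are hypothetical stand-ins for /dev/sda and /dev/sdb.
SRC=$(mktemp); DST=$(mktemp)
printf 'pretend this is a whole disk' > "$SRC"
# bs=64k speeds up the copy; on a dying source disk you would add
# conv=noerror,sync to keep going past read errors (note that sync
# pads short blocks with zeros, so the copy can end up larger).
dd if="$SRC" of="$DST" bs=64k 2>/dev/null
cmp -s "$SRC" "$DST" && echo 'clone verified'
rm -f "$SRC" "$DST"
```

On real devices, both disks must be unmounted (or the system booted from live media), and the target must be at least as large as the source.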
Old 01-22-2010, 03:05 AM   #23
LQ Newbie
Registered: Dec 2009
Posts: 10

Rep: Reputation: 1
I use git, with a wrapper program I wrote for it, to back up and sync my projects, sites, etc. It's very good.
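A git-as-backup workflow like the one described usually amounts to committing and pushing to a bare repository elsewhere. A minimal sketch, where all paths and names are made up and a local directory stands in for the remote backup host:

```shell
# WORK is the project being backed up; REMOTE is a bare repo standing in
# for a backup server (normally it would live on another machine).
WORK=$(mktemp -d); REMOTE=$(mktemp -d)
git init -q --bare "$REMOTE"
git init -q "$WORK"
cd "$WORK"
echo 'project file' > notes.txt
git add notes.txt
# -c flags supply an identity so the commit works on an unconfigured box
git -c user.name=backup -c user.email=backup@example.invalid commit -qm 'snapshot'
git push -q "$REMOTE" HEAD    # the bare repo now holds the full history
```

The appeal over plain file copies is that every snapshot is kept, deduplicated, and recoverable with ordinary git commands.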
Old 01-22-2010, 10:25 AM   #24
Registered: Mar 2009
Distribution: Slackware
Posts: 116

Rep: Reputation: 2
Areca also deserves to be mentioned.
Old 01-22-2010, 10:28 AM   #25
Registered: Jun 2000
Distribution: Debian, Red Hat, Slackware, Fedora, Ubuntu
Posts: 12,738

Original Poster
Rep: Reputation: 3523
Areca-Backup has been added.

Old 01-22-2010, 10:48 AM   #26
LQ Newbie
Registered: Oct 2009
Location: In a one-bed hotel room
Distribution: Ubuntu Karmic
Posts: 7

Rep: Reputation: 0
sbackup in Ubuntu
Old 01-22-2010, 10:56 PM   #27
Senior Member
Registered: Nov 2005
Location: USA
Distribution: Mageia Cauldron - VoidLinux - Devuan
Posts: 1,059
Blog Entries: 5

Rep: Reputation: 165
I vote for sbackup and partimage which are not included.
Old 01-23-2010, 12:52 AM   #28
Registered: May 2008
Location: Glendale, CA
Distribution: ubuntu 12.04
Posts: 146

Rep: Reputation: 22
The good old rsync.
Old 01-27-2010, 10:39 AM   #29
LQ Guru
Registered: May 2005
Location: boston, usa
Distribution: fc-15/ fc-20-live-usb/ aix
Posts: 5,235

Rep: Reputation: 914
I wish dd were an option.
Old 01-27-2010, 10:51 AM   #30
Registered: Feb 2005
Distribution: Xubuntu 12.04 LTS
Posts: 154

Rep: Reputation: 16
What about Back In Time?



All times are GMT -5.
