2009 LinuxQuestions.org Members Choice Awards
This forum is for the 2009 LinuxQuestions.org Members Choice Awards.
You can now vote for your favorite products of 2009. This is your chance to be heard! Voting ends on February 9th.
I vote for dump!
It is about twice as fast as every other solution such as tar, cpio, or rsync (it runs very near the drive's raw speed). Those other tools work on the filesystem; dump works directly on the inode structure. Only dd is faster (and only if you do not have too much free space to copy along with the data), but dd cannot do incremental backups. There is no free solution faster than an incremental backup with dump. Most backup tools re-save a whole directory if you only change its name or permissions; dump records such changes correctly but only has to save those few bytes, not the whole contents of the directories. Most other backup software wastes a lot of time and space on such things. A test also showed that dump handles Linux/Unix special files and metadata very well. dump can work over the network, but you will lose a lot of speed.
OK, dump has some disadvantages:
It is dangerous to run on a filesystem that is mounted writable. So to back up your operating system's root, you need another system on an additional partition, or you have to boot from a CD or the like (which is how clonezilla does the job). But because dump is so fast, I see no problem with temporarily remounting your data read-only and doing the backup then. dump was originally written for tape backup, but you can write the dumps to files or DVDs as well. If a disk gets full, dump can switch to another disk or DVD, because it was designed to ask for another tape.
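For the read-only trick, something like this is all it takes (just a sketch; that /home sits on its own partition and that /backup is a separate disk are my assumptions):

  mount -o remount,ro /home                     # stop writers while we dump
  dump -0u -f /backup/home.0.dump /home         # level 0 = full; -u records the date in /etc/dumpdates
  mount -o remount,rw /home                     # back to normal

Restoring is the mirror image: run restore -rf /backup/home.0.dump from inside an empty target filesystem.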
dump is not good for long-term archival!
It is a backup tool, not an archive tool. The problem is that the file format is system- and filesystem-dependent and not very well documented (you will have the same platform-dependence problem with other low-level file images). I used it for years for cross-platform transfers and it worked, but it was never designed for that.
Do regular incremental backups on a daily or weekly schedule with a super-fast incremental dump, and take an archive snapshot, say monthly or quarterly, with an archival tool like tar or cpio.
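As a concrete sketch of that schedule in cron (paths, levels, and times are all hypothetical, and note that each weekday run here overwrites the previous level-1 file):

  # root's crontab
  0 2 * * 0    dump -0u -f /backup/home.0.dump /home          # Sunday: full backup (level 0)
  0 2 * * 1-6  dump -1u -f /backup/home.1.dump /home          # Mon-Sat: changes since the last full
  0 3 1 * *    tar -cpzf /backup/home-archive.tar.gz /home    # monthly tar snapshot for archival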
If speed does not matter, rsync does a good job as well.
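The basic mirror really is a one-liner (the destination path is my own example; -a preserves permissions, ownership, and timestamps, -H keeps hard links, --delete prunes files removed from the source):

  rsync -aH --delete /home/ /backup/home/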
I voted for amanda because once set up and running, you can essentially forget it; the user's crontab entry takes care of all the work. This of course is for machines that run 24/7, so amanda gets run in the wee hours here.
And I wrote some scripts that wrap it up and manage the database so that a bare metal recovery actually makes you current with the last backup run.
Using virtual tapes on a big hard drive, in my case a terabyte drive, the recovery process is many times faster than with real tape, since the hard drive is truly random access. And it's a heck of a lot more dependable than I found tapes to be; they were always becoming unreadable for one reason or another, and you can invest thousands in a tape library and tapes. Terabyte drives are commodity drives today, and have head-flying lifetimes thousands of hours longer than any tape drive ever will. Once I switched to vtapes, I wondered why I had wasted so much time and almost daily aggravation trying to do it the classical way with tapes. Now I just read the morning email from amanda and put the printout away. What's not to like?
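For anyone curious, the vtape side is mostly a matter of pointing amanda's tape changer at a directory, plus the cron entries. A minimal sketch (the config name "daily" and all paths are made up):

  # in /etc/amanda/daily/amanda.conf
  tpchanger "chg-disk:/var/amanda/vtapes"     # virtual tapes are just subdirectories here

  # the backup user's crontab
  0 1 * * 1-5   /usr/sbin/amcheck -m daily    # mails a warning if tonight's run would fail
  45 1 * * 1-5  /usr/sbin/amdump daily        # the actual nightly run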
Wow! Never would have thought rsync would have taken this one by a landslide. Makes sense, I guess when you take into account that there's likely lots more hobbyists at LQ than enterprise sysadmin types. Still, I am surprised.
Remarkable that two very basic tools (rsync and tar) are among the favorites. Would this be because these tools are simple to use and many (most?) users prefer to use a tool rather than read a manual? Although the man page of rsync isn't really simple, one or two basic commands are all you need. Or is it because many users prefer a simple tool where nothing is hidden or automatic, giving them the sense of being in control and therefore more confidence?
Originally Posted by gotfw
Makes sense, I guess when you take into account that there's likely lots more hobbyists at LQ than enterprise sysadmin types. Still, I am surprised.
I hear what you say. However, is that true? I thought Linux had a poor share of the desktop market, but its share of the server market (web servers, databases) is significant. I haven't looked at statistics for a long time though.
Someone gave me an odd look once when they fished for a good backup solution from me and I said, "tar". I still get a giggle remembering it to this day.
Really though, tar and rsync are the best backup solutions for simplicity and ease, but had I made it to the poll in time to vote I would have actually chosen Bacula.
Not because a large suite of utilities is better than tar or rsync, but because people will still look at you funny when you mention them in an enterprise setting.
That's why I tell people to use Bacula or Backup Express by Syncsort (hey, it's not my fault they feel better when they pay a bunch of money for a commercial solution), and then turn around and give them all their restores from NFS mounts that were rsync'd, or from gzipped tarballs I make anyway with cron.
Of course, I'm not usually asked how I got their restored files, so they just naturally assume that all that money they spent on the pricey solution paid for itself.
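The cron side of that is nothing fancy; something in this spirit (the NFS mount point is made up, and the \% is because cron treats a bare % specially):

  30 3 * * * tar -czf /mnt/nfs/backups/etc-$(date +\%F).tar.gz /etc    # dated gzipped tarball of /etc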
But I do like Bacula, and have been using it for years. It's good stuff.
Oh, and I agree with the guy who said he can't believe that dd didn't make it to the list either.
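Whole-disk imaging with dd is about as simple as it gets (the device and destination are my assumptions; the source should be unmounted or mounted read-only while you copy):

  dd if=/dev/sda of=/backup/sda.img bs=4M conv=noerror    # conv=noerror keeps going past read errors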
I know this is late, but I just registered on this site.
I see people have voted clonezilla highly, but I am surprised. Although I agree clonezilla is a terrific tool for imaging drives or partitions on the same machine or on a remote one, you always need to shut the system down, and I don't think you can afford to take your servers down to do a full disk or partition image.
I would have voted for Amanda. It works quite nicely with virtual tapes as well, on multiple platforms such as Linux, Solaris, etc.
If I were allowed to vote twice, BackupPC would have been my other choice; it has a user-friendly web interface.