
jeremy 12-16-2013 09:57 PM

Backup Application of the Year
 
What's your preferred tool for backups?

--jeremy

kooru 12-17-2013 01:32 AM

I would have to give multiple answers here :)
Anyway, I say tar

Tux! 12-17-2013 07:31 AM

tar over ssh to a remote linux box with a lot of storage
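
Something along these lines (user, host and paths are placeholders):

Code:

# make a gzipped tar of /home and stream it straight to the remote box
tar -czf - /home | ssh user@backuphost "cat > /srv/backups/home-$(date +%F).tar.gz"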

ozar 12-17-2013 07:38 AM

fsarchiver

frtorres 12-17-2013 01:16 PM

luckybackup.

TerryP 12-17-2013 03:40 PM

Dump because it's been saving my bacon for years, and I ain't seen anything that isn't Just Another Dump At Heart plus window dressing and/or helpers.

Janus_Hyperion 12-17-2013 04:43 PM

rsync is awesome! All my backups are done using scripts that use rsync! :)
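
A minimal sketch of such a script (the paths and host are placeholders):

Code:

#!/bin/sh
# -a preserves permissions, ownership and timestamps;
# --delete drops files from the mirror that no longer exist locally
rsync -a --delete /home/ user@backuphost:/srv/backups/home/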

trosdejos 12-18-2013 03:00 AM

Bacula works great.

daishi 12-19-2013 03:44 AM

burp is missing! A great tool for private and office use.
http://burp.grke.org/

jeremy 12-19-2013 06:44 AM

Burp has been added.

--jeremy

firekage 12-19-2013 09:03 AM

I use Clonezilla. It is a great tool with many options; it supports cloning using quad-core processors (useful when we want to compress the image file). As a matter of fact, I use it for all my systems (Arch, Slackware, Ubuntu... and even Windows). Clonezilla gives me great control over the backup process, and there is also a built-in terminal with a command line available. I can burn a CD and start using it. I don't like backup apps that work with shadow copies (on a running system) like the Symantec backup apps.

landroni 12-22-2013 10:53 AM

You should include Dropbox and SpiderOak in this list, or create an appropriate new category.

divyashree 12-24-2013 12:52 AM

Quote:

Originally Posted by jeremy (Post 5081941)
What's your preferred tool for backups?

--jeremy

Hi Jeremy, dd can also be used as a backup tool.
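
For example (the device and output path are placeholders; the source disk should be unmounted or idle):

Code:

# raw image of a whole disk
dd if=/dev/sda of=/mnt/backup/sda.img bs=4M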

anticapitalista 12-27-2013 02:40 PM

luckybackup

FeyFre 12-27-2013 06:09 PM

cp

damn, the forum doesn't allow posting the message "cp" - too short

allend 01-01-2014 12:13 AM

My backup solution is a custom bash script based around piping the results from find to cpio, so cpio it is.
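
A minimal sketch of that pattern (the path and age test are placeholders, not the actual script):

Code:

# archive files modified within the last day
find /data -type f -mtime -1 | cpio -o > /mnt/backup/daily.cpio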

Medievalist 01-02-2014 11:47 AM

I use shootsnap, with a restricted keyset and a large AOE array.

http://typinganimal.net/code/textify.php?f=shootsnap.sh

ruario 01-02-2014 03:46 PM

Quote:

Originally Posted by allend (Post 5089969)
My backup solution is a custom bash script based around piping the results from find to cpio, so cpio it is.

Filesize limitations with the bin, odc or newc file formats would make GNU cpio a poor choice for many. If you like cpio, I would use bsdcpio (or heirloom cpio) with the pax format, or use afio.

You could also get GNU tar to read its file list from the pipe, e.g. "tar --no-recursion -T- -cvf archive.tar", again with the pax file format or even just the gnutar file format. Neither has any serious limitations.
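
Put together, that looks something like this (/data is a placeholder):

Code:

# feed the file list from find to tar and write a pax format archive
find /data -type f | tar --no-recursion --format=pax -T- -cvf archive.tar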

Pax is another option, but make sure it is heirloom pax, as the pax util provided by most distros cannot actually make pax-formatted archives, only ustar, which is really no better than newc.

allend 01-02-2014 05:41 PM

MAS (to use Tim Minchin's suggestion of "mildly amused smirk" rather than LOL :) ).

Thanks for the technical advice, but for the situation I have, backups of recently acquired files on a suite of Windows machines, I like cpio for the ability to manipulate path names. It has been working fine for years. I do not want to use a packed archive format, as what is required is to be able to read files directly from the Windows machines. It is also easier to demonstrate the recovery process to the occasional external auditor when our quality system is being assessed. The disaster recovery plan for these Windows machines (basically dedicated instrument controllers) is to restore a known good disk image, then restore any needed recently acquired files from backup. The file size issue will not occur due to the way these Windows machines are used.

I think what I am really trying to say is just the truism that the best choice of backup system heavily depends on what taking backups is trying to achieve.

ruario 01-03-2014 02:19 AM

Quote:

Originally Posted by allend (Post 5090865)
Thanks for the technical advice, but for the situation I have, backups of recently acquired files on a suite of Windows machines, I like cpio for the ability to manipulate path names.

You can do the same with numerous other utils, including those I mentioned (e.g. GNU tar has the --xform switch).
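
For instance (a sketch; the paths and the substitution pattern are placeholders):

Code:

# prefix every stored path with archive/ while creating the archive
find /data -type f | tar --no-recursion --format=pax --xform='s|^|archive/|' -T- -cvf archive.tar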

Quote:

Originally Posted by allend (Post 5090865)
It has been working fine for years.

Fair enough, but as computing changes and common file sizes increase, you may find a time when you hit the individual file size limitations of the old cpio formats (bin = 2GB, odc = 8GB, newc = 4GB). I know that I personally have multimedia files and Linux distro ISO images in these size ranges lying around on my disks. afio extends the odc format (only for entries that need it) past these limits, and pax has no real, practical limits right from the get-go (9 EB file sizes are possible).

Quote:

Originally Posted by allend (Post 5090865)
I do not want to use a packed archive format, as what is required is to be able to read files directly from the Windows machines.

I'm not sure what you mean by this, but the pax file format is the POSIX.1-2001 standard file format and an extension of tar. There are numerous utilities available on Windows that will open its contents just fine. Additionally, as afio works with odc by default and only extends the header on entries that exceed its limitations, its archives should also be readable by common Windows archiving tools, particularly as you state that no files currently exceed the limits.

Quote:

Originally Posted by allend (Post 5090865)
It is also easier to demonstrate the recovery process to the occasional external auditor when our quality system is being assessed. The disaster recovery plan for these Windows machines (basically dedicated instrument controllers) is to restore a known good disk image, then restore any needed recently acquired files from backup.

I fail to see how this would be different if you used pax or afio.

Quote:

Originally Posted by allend (Post 5090865)
The file size issue will not occur due to the way these Windows machines are used.

Fair enough, I did not know your specific use case until you just stated it, so it could well have been an issue. In any case you can just take the information (assuming you were not already aware) as something to keep in mind for the future.

Quote:

Originally Posted by allend (Post 5090865)
I think what I am really trying to say is just the truism that the best choice of backup system heavily depends on what taking backups is trying to achieve.

Sure, I agree with that. It was just a warning that (IMHO at least) cpio (the GNU implementation in particular) has run out of steam, as it is no longer being actively developed to handle the types of files that are ever more common in the modern world; a heads up so that you don't get bitten in the future. Like all advice on the internet, you are free to just ignore it! ;)

acampbell 01-03-2014 05:13 AM

Should include tarsnap - definitely my choice.

rjleaf 01-06-2014 01:38 PM

Having a single backup isn't exactly the best idea for protecting your data. If that one backup medium gets damaged or destroyed, it's very difficult to restore your data.

I follow the 3-2-1 backup philosophy quite strongly: 3 copies, 2 media, 1 off-site.

I'm using CrashPlan for my off-site backup solution and Deja-Dup for my local backup. Having a local backup is immensely useful, since there are many times when you might need to restore your data from a backup but not have the time, patience, or bandwidth to restore from an online backup service; the online backup service comes in handy when something catastrophic occurs, such as a fire, theft, or natural disaster.

I also store most of my media files on a dedicated file server at my friend's house in another state (since he has a fast FiOS connection).

acampbell 01-07-2014 05:46 AM

I agree about the value of a local backup. I was thinking about off-site backups when I mentioned tarsnap. I think it's good to have both.

chrisretusn 01-13-2014 07:33 PM

I use rsnapshot as part of my automated plan. I also use grsync on a regular basis for selective backups. FSArchiver is also quite useful.

savotije 01-15-2014 03:16 AM

tar

metalaarif 01-15-2014 03:23 AM

There are many good ones, such as Bacula, Amanda, rsync, etc., but I will go for the Legendary TAR.

mariuz 01-22-2014 09:28 AM

git-annex
http://git-annex.branchable.com/

Hans-Michael 02-03-2014 12:25 AM

CloneZilla

hal_tux 02-03-2014 06:22 AM

tar with bzip2
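
For example (the path is a placeholder):

Code:

# -j selects bzip2 compression
tar -cjf backup.tar.bz2 /home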

alldoug 02-03-2014 08:44 AM

rsync

axel112 02-03-2014 12:05 PM

rsync

gotfw 02-04-2014 02:51 AM

I think Bacula takes the win here. True enterprise grade, and it sports clients for both *nix and Winwoes, so it works in mixed environments. Then probably Amanda. Although admittedly I have not used either much as of late. Both would be massive overkill if you're just backing up a single workstation.

My $0.02

Peace-- :)

javaunixsolaris 03-04-2014 05:43 PM

Quote:

Originally Posted by rjleaf (Post 5093026)
I follow the 3-2-1 backup philosophy quite strongly: 3 copies, 2 media, 1 off-site.

You're hard core! Most companies aren't even that diligent.

catkin 03-05-2014 09:11 AM

I didn't vote because none is best -- rather it is a question of "horses for courses", that is which best meets the requirements.

We used to use amanda, rdiff-backup and rsync. I personally use Bacula, rsync and SpiderOak (a cloud backup service).

We have now almost completed migrating away from amanda and rdiff-backup to rsync run by a script that adds retention. The motivation is that neither amanda nor rdiff-backup is robust when interrupted while running; that is a problem because some of our clients shut down their systems while backups are running and our Internet connections are not reliable.
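
One common way to add retention on top of rsync is hard-linked, dated snapshots via --link-dest (a sketch of the general technique, not our actual script; paths are placeholders and pruning of old snapshots is left out):

Code:

#!/bin/sh
# dated snapshot directories; unchanged files are hard-linked to the
# previous snapshot, so each day looks complete but costs little space
today=$(date +%F)
rsync -a --delete --link-dest=/srv/backups/latest /home/ "/srv/backups/$today/"
ln -sfn "/srv/backups/$today" /srv/backups/latest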

Comparing the tools I know:
  • amanda
    • Compressed backup files: yes
    • Ease of configuration: yes
    • Graphical interface: no
    • Network loading: not known
    • Point in time restore: yes
    • Poor cousin to the commercial edition: yes
    • Retention: yes
    • Robust when interrupted: no
  • bacula
    • Compressed backup files: yes
    • Ease of configuration: no
    • Graphical interface: yes
    • Network loading: not known
    • Point in time restore: yes
    • Poor cousin to the commercial edition: no
    • Retention: yes
    • Robust when interrupted: not known
  • rdiff-backup
    • Compressed backup files: no
    • Ease of configuration: yes
    • Graphical interface: no
    • Network loading: minimal
    • Point in time restore: yes
    • Poor cousin to the commercial edition: not applicable
    • Retention: yes
    • Robust when interrupted: no
  • rsync
    • Compressed backup files: no
    • Ease of configuration: yes
    • Graphical interface: no
    • Network loading: minimal
    • Point in time restore: no
    • Poor cousin to the commercial edition: not applicable
    • Retention: no
    • Robust when interrupted: yes
Amanda's error handling seems less robust than bacula's and, being written in perl rather than compiled, it is at the mercy of changes in the perl interpreter. The last point is not academic; on ubuntu 12.04 a recent perl upgrade which includes stricter syntax checking has broken amanda.

jefro 03-05-2014 08:16 PM

Forgot G4U and dd.

gotfw 03-05-2014 11:33 PM

Bacula may be a bit of a pain to set up - not really, you just need to actually read the docs - but it excels at doing what it says it will do. Regarding the poor cousin consideration, this is why I never really got behind Amanda. Bacula is overkill for a home PC, but great for actual enterprise use.

Oh, yeah, it has a gui now? Ooh, la, la!!

Peace :cool:

Steve R. 03-06-2014 10:48 PM

Simple Backup
 
Simple Backup is proving to be excellent now. When I first installed the program it seemed quite fickle with external drives, but it still seemed superior to the other programs: it worked as expected with the main drive but was unreliable with external networked drives. Whether the bugs were fixed or I learned how to adapt, I don't know, but it has been working very well now. I am backing up three computers to one external USB hard drive attached to a router.

Based on my experience, Simple Backup appears to be more reliable when the following factors are taken into account:
  1. When the USB drive is mounted through fstab (see the example below).
  2. When the USB drive is formatted for Linux. My router does not recognize ext4, so I had to format it with ext3. Other routers with USB ports may recognize ext4.
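
For reference, a hypothetical fstab line for a drive shared by a router, assuming the router exports it over SMB (the names, paths and options are placeholders):

Code:

# /etc/fstab - mount the router-attached backup share at boot
//router/backup  /mnt/backup  cifs  credentials=/root/.backup-cred,_netdev  0  0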

