Hi.
I consider my time to be very valuable, so I use a number of strategies to reduce risk.

For example, I use virtual machines most of the time. This lets me take a snapshot, which can later be used to restore the entire system in under five minutes (usually much less). I take a snapshot just before a big update; after a reboot and a few days of smooth running, I fold the snapshot into the running system. I have used VMware and VirtualBox for this. It requires learning about virtual machines.

Currently, I have a Linux host system on my (newish) workstation. I rarely use the host directly; instead I installed a guest machine and keep almost all of my working files there. Every now and then I update the guest and also the host, but the guest far more frequently. I do the same on a small laptop, but with a Windows host and a Linux guest (a version of Debian testing, because the then-current stable did not support the newer Intel CPU). Our server is a Linux host with a number of guests (see below) -- usually no more than 10 or so running simultaneously.
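With VirtualBox, for instance, that snapshot cycle looks roughly like this from the host (the VM name "debian-guest" and snapshot name "pre-update" are placeholders):

    # Take a snapshot just before a big update (the VM may be running):
    VBoxManage snapshot "debian-guest" take "pre-update" --description "before dist-upgrade"

    # If the update goes badly, power off and roll back:
    VBoxManage controlvm "debian-guest" poweroff
    VBoxManage snapshot "debian-guest" restore "pre-update"

    # After a few days of smooth running, fold the snapshot
    # into the running system:
    VBoxManage snapshot "debian-guest" delete "pre-update"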
We also have an external machine that contacts the other machines periodically, say every 4 hours, and runs rsnapshot to capture changed files, rotating them into daily, weekly, and monthly backups. Rsnapshot knows about LVM volumes and can create LVM snapshot volumes (not to be confused with virtual machine snapshots, though both use the idea of copy-on-write, https://en.wikipedia.org/wiki/Copy-on-write), so a system can be captured without stopping the machine. This requires learning, setup, an external machine, and a big disk (I use a large-capacity disk in a raw disk dock).
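A minimal sketch of that setup, using rsnapshot's stock option names (the hostnames, volume names, and sizes here are placeholders, and note that rsnapshot.conf fields must be TAB-separated):

    # /etc/rsnapshot.conf (fragment; fields must be TAB-separated)
    snapshot_root   /backup/snapshots/

    retain  hourly   6
    retain  daily    7
    retain  weekly   4
    retain  monthly  6

    # LVM support: back up a live volume via a copy-on-write snapshot volume
    linux_lvm_snapshotsize  2G
    linux_lvm_snapshotname  rsnapshot
    linux_lvm_vgpath        /dev
    linux_lvm_mountpath     /mnt/lvm-snapshot

    # Sources: a remote host over ssh, and a local LVM volume
    backup  root@server:/etc/    server/
    backup  lvm://vg0/home/      localhost/home/

    # /etc/cron.d/rsnapshot: the external machine pulls every 4 hours
    0 */4 * * *  root  /usr/bin/rsnapshot hourly
    30 3 * * *   root  /usr/bin/rsnapshot daily

Only the most frequent run does the actual rsync; the larger intervals just rotate hard-linked copies, so unchanged files cost essentially no extra space.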
During development work, I use a version control system, like rcs, bzr, etc., to keep a local (to the directory) copy of files in development. Before I make a substantial change, I save the current instance, which usually takes less than a second or two. If something goes horribly wrong, I restore the previous instance. This requires learning a bit about version control. More featureful systems include Subversion, Git, etc.
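With plain rcs, for instance, that save/restore cycle is just (the file name is illustrative):

    # Save the current instance before a substantial change
    # (-l checks in a revision and keeps a locked working copy):
    ci -l -m"before refactor" myscript.sh

    # See what has changed since the last saved revision:
    rcsdiff -u myscript.sh

    # Something went horribly wrong: overwrite the working copy
    # with the last saved revision:
    co -l -f myscript.sh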
Every week or so, I tar up the home directory on my main workstation and store it on a (slow, but large) external drive. This is the easiest approach.
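Something like the following, with the mount point and user name as placeholders:

    # Weekly dated archive of the home directory onto the external drive;
    # -C /home makes the paths inside the archive relative to /home:
    tar -czf /mnt/external/home-$(date +%Y%m%d).tar.gz -C /home makyo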
On a Windows box I recently tried cloud backup with Acronis. It was agonizingly slow; it seemed better to back up to a local drive (again in a USB/SATA dock).
With these habits, I have rarely lost a file outright in recent years.
Good luck ... cheers, makyo
# Uptimes for available virtual machines 2016.08.03:
up 82 days 5 users: CentOS 6.4 (Final)
up 54 days 2 users: Fedora 23 (Workstation)
up 82 days 9 users: openSUSE 13.2 (ext4, "Harlequin")
up 82 days 5 users: Slackware 14.1
up 82 days 2 users: Solaris 11.3 X86
up 46 days 6 users: Ubuntu 14.04.2 (KDE, Trusty Tahr)