[SOLVED] what is the best SIMPLE one-to-one direct copy backup system?
I've been using BackInTime, which uses compression, but it was pointed out to me that with the inexpensive storage available nowadays a direct copy backup system is simpler.
I've looked at Grsync and Unison and things like that, and I get lost with some of their terminology. (And wonder if I'm really doing what I want to be doing.) What's a simple system for backing up my most important folders (under 30GB) in my home folder on a daily basis, one that doesn't take a long time?
I find that FreeFileSync works for me as a handy GUI solution.
Note that, under normal "mirror" operation, FreeFileSync doesn't maintain multiple backup versions (e.g. several versions of one file if you've changed it multiple times); it simply creates one mirror image of all your files on the destination drive. So if you've added a file, it adds it to the backup drive; if you've deleted a file, it deletes it; and if you've modified a file, it copies the new version over the older one on the backup drive.
You would have to decide if that suits you. If not, you might want to find a solution that stores all previous copies of files (those that have been modified or deleted).
There are several considerations in backups which mean that simplicity is a relative term.
Half of the considerations in backup have to do with restores. You have to be able to do different kinds of restores and do so on a system that may be crippled and only partially functional. If you use compression then decompressing during a restore is a complication that you should avoid if you have the space to keep uncompressed backups.
You should keep your backup in a file structure as close as possible to your live file structure. That makes it easier to select whatever subset of your backup you need to restore. Tar files are probably the worst backup structure in this regard.
You should be able to log in as root. Running a restore as root is far less complicated than trying to do it as an ordinary user, or than running a restore from a live DVD or USB stick.
Quote:
Originally Posted by Gregg Bell
What's a simple system for just simply backing up my most important folders (under 30GB) in my Home folder on a daily basis--that doesn't take a long time?
The best way to keep the copy time down is to copy only the files that have changed since the last backup. The first backup copies everything and takes a long time; the incremental backups after that take only a fraction of the time of a full backup.
You need to keep several generations of backup. You may not notice that you have lost an important file until some time has passed. How many generations of backup you keep depends on how much space you have available. The time interval between generations is a guess as to how long a missing file will be unnoticed.
So I suggest that you first make a backup plan to suit your needs and then go back and tackle the various backup programs again to find one that will do things your way.
The best is one that you use and like and that works. Not sure you need daily backups of full files. You might look at rsync or another tool that only copies changes?
Compression isn't bad if network transfer or disk access is slow and the processor is fast.
You might be able to adjust the compression level if you feel that's the bottleneck.
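To illustrate the level trade-off: gzip, for example, takes -1 (fastest, biggest output) through -9 (slowest, smallest output). A small runnable demo on throwaway data; the input here is just zeroes, so real-world ratios will differ:

```shell
#!/bin/sh
# Compare gzip's fastest and slowest compression levels.
f=$(mktemp)
head -c 1000000 /dev/zero > "$f"    # ~1 MB of sample data

gzip -1 -c "$f" > fast.gz           # quick, larger archive
gzip -9 -c "$f" > small.gz          # slow, smaller archive

wc -c fast.gz small.gz
```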
I use rsync. As I am using it at home, I back up ~/ manually when I've done something worth backing up. Here's my little script, in case it helps.
Code:
rsync -a /home/[username] [username]@[serverIP]:/path/to/backup/directory
I must say, it took me a bit of research to get the script configured in a way that worked for me. There's a lot of material on the web about rsync, but much of it is rather impenetrable, especially if you are as green as I was when I started learning the command.
I used G4U a few years back; it was simple-ish. Although I was trying to make a not-so-good install of Win98 use the disk better: FAT16, so a 2GB limit on a 10GB disk. And other things my brother did when handing tech down to the nephews (he could have at least brought the install media). The only tech I had brought with me on that trip was a 2GB USB stick. But I got it done.
It depends on what you want from a backup, though. If you just want access to your data, and not system recovery of an OS, rsync is pretty simple. If you don't care much about drive space or time, dd is pretty simple too, at least until you want to get at the data inside an image of the whole drive. But partx and similar tools make that doable without using split to separate out the partitions, or restoring the image to a drive just to access the data.
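A hedged sketch of that dd-plus-loop-device workflow: the image here is a small throwaway file so the example is safe to run, while the /dev/sdX device name and the loop/mount commands in the comments are assumptions you would adapt and run as root on a real system:

```shell
#!/bin/sh
# dd clones at the block level, so a real image holds the whole disk,
# partition table included. For a real disk you would use something
# like if=/dev/sdX (check the device name with lsblk first!).
dd if=/dev/zero of=disk.img bs=1M count=16 status=none

# To reach files inside a whole-disk image without restoring it,
# attach it to a loop device and let the kernel scan the partition
# table (-P), then mount one partition (both steps need root):
#   sudo losetup -P /dev/loop0 disk.img
#   sudo mount /dev/loop0p1 /mnt

ls -l disk.img
```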
Compression can help save space and time, depending on the speed of your computer versus the speed of I/O to the various devices. Over a network, most definitely use compression. Even gigabit Ethernet is horridly slow compared to USB3 speeds or better.
Quote:
Originally Posted by hydrurga
I find that FreeFileSync works for me as a handy GUI solution.
Thanks hydrurga. I just heard of FreeFileSync a few days ago and I think it's pretty amazing. I was just testing it. I think I'll stay with BackInTime but use this as an additional backup.
And as to the multiple backup versions, if it's important enough, I can just date each day's version.
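That dated-folder idea is easy to script by hand: copy into a directory named after today's date. A minimal sketch with temp directories standing in for the source folder and the backup drive:

```shell
#!/bin/sh
# Keep one dated copy per backup day.
SRC=$(mktemp -d); DST=$(mktemp -d)
echo "draft" > "$SRC/essay.txt"

TODAY=$(date +%F)            # ISO date, e.g. 2024-05-01
cp -a "$SRC" "$DST/$TODAY"   # -a preserves times and permissions

ls "$DST"                    # one folder per backup day
```

Run daily (by hand or from cron), this gives simple date-stamped generations, at the cost of a full copy each time.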
Quote:
Originally Posted by Steve Stites
There are several considerations in backups which mean that simplicity is a relative term.
Thanks Steve. That's solid advice. Although, like you say, it would be better to do the restore as root, I just don't have the skills for that yet. I've got BackInTime for the older files, and the restore does work quite nicely. And now I'm experimenting with FreeFileSync. I'm feeling pretty comfortable at this point.
Quote:
Originally Posted by jefro
The best is one that you use and like and works.
Thanks jefro. The GUI for BackInTime shows that rsync is the power behind it, and it is pretty quick. FreeFileSync handles the incremental changes.
Quote:
Originally Posted by Frank
I use rsync. As I am using it at home, I back up ~/ manually when I've done something worth backing up.
Thanks Frank. I really should do something like that but I still need my security blanket (GUI) at this point!
Quote:
Originally Posted by Shadow
I used G4U a few years back, it was simple-ish.
Thanks Shadow. The BackInTime I'm using does the compression and it's pretty quick. I'm thinking I'm pretty safe (and satisfied) at this point. Appreciate your input.