LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Linux - Newbie (https://www.linuxquestions.org/questions/linux-newbie-8/)
-   -   what is the best SIMPLE one-to-one direct copy backup system? (https://www.linuxquestions.org/questions/linux-newbie-8/what-is-the-best-simple-one-to-one-direct-copy-backup-system-4175605325/)

Gregg Bell 05-05-2017 01:06 PM

what is the best SIMPLE one-to-one direct copy backup system?
 
I've been using BackInTime, which uses compression, but it was pointed out to me that with the inexpensive storage available nowadays a direct copy backup system is simpler.

I've looked at grsync and Unison and things like that and I get lost with some of their terminology. (And wonder if I'm really doing what I want to be doing.) What's a simple system for just simply backing up my most important folders (under 30GB) in my Home folder on a daily basis--that doesn't take a long time?

Thanks.

hydrurga 05-05-2017 01:19 PM

I find that FreeFileSync works for me as a handy GUI solution.

Note that, under normal "mirror" operation, FreeFileSync doesn't maintain multiple backup versions (e.g. multiple versions of one file if you've changed it multiple times), it purely creates one mirror image of all your files on the destination drive. So, if you've added a file, it adds it to the backup drive, if you've deleted a file then it deletes it, and if you've modified a file then it copies the new file over the older one on the backup drive.

You would have to decide if that suits you. If not, you might want to find a solution that stores all previous copies of files (those that have been modified or deleted).

jailbait 05-05-2017 01:48 PM

There are several considerations in backups which mean that simplicity is a relative term.

Half of the considerations in backup have to do with restores. You have to be able to do different kinds of restores and do so on a system that may be crippled and only partially functional. If you use compression then decompressing during a restore is a complication that you should avoid if you have the space to keep uncompressed backups.

You should keep your backup in a file structure as close as possible to your live file structure. That makes it easier to select whatever subset of your backup you need to restore. Tar files are probably the worst backup structure in this regard.

You should be able to log in as root. Running a restore as root is far less complicated than trying to do a restore as an ordinary user or via sudo, or running a restore from a live DVD or USB stick.

Quote:

Originally Posted by Gregg Bell (Post 5706677)
What's a simple system for just simply backing up my most important folders (under 30GB) in my Home folder on a daily basis--that doesn't take a long time?

The best way to keep the copy time down is to only copy files that have changed since the last backup. The first copy copies everything and takes a long time. Then the incremental backups from then on only take a fraction of the time of a full backup.

You need to keep several generations of backup. You may not notice that you have lost an important file until some time has passed. How many generations of backup you keep depends on how much space you have available. The time interval between generations is a guess as to how long a missing file will be unnoticed.

So I suggest that you first make a backup plan to suit your needs and then go back and tackle the various backup programs again to find one that will do things your way.

----------------------------
Steve Stites

jefro 05-05-2017 03:08 PM

The best is the one that you use and like and that works. Not sure you need daily backups of full files. Might look at rsync or another tool that only copies changes?

Compression isn't bad if network transfer or disk access is slow and processor is faster.

You might be able to adjust compression levels if you feel that's the issue.

frankbell 05-05-2017 09:06 PM

I use rsync. Since I'm using it at home, I back up ~/ manually when I've done something worth backing up. Here's my little script, in case it helps.

Code:

rsync -a /home/[username] [username]@[serverIP]:/path/to/backup/directory

I must say, it took me a bit of research to get the script configured in a way that worked for me. There's a lot of material on the web about rsync, but much of it is rather impenetrable, especially if you are as green as I was when I started learning the command.

Shadow_7 05-05-2017 11:16 PM

I used G4U a few years back; it was simple-ish. Although I was trying to make a not-so-good install of Win98 make better use of the disk. FAT16, so a 2GB limit on a 10GB disk. And other things my brother did when handing down tech to the nephews (he could have at least brought the install media). The only tech I had brought with me on that trip was a 2GB USB stick. But I got it done.

Depends on what you want from a backup though. If you just want access to your data, and not system recovery of an OS, rsync is pretty simple. If you don't care much about drive space usage or time, dd is pretty simple. At least until you want to access the data on an image of the whole drive. But partx and other tools make it doable without using split to part out the partitions, or restoring the drive to access the data.
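Reading a single partition out of a whole-drive dd image, as Shadow_7 describes, boils down to finding the partition's start sector and turning it into a byte offset for the loop device. A rough sketch, assuming a hypothetical `drive.img` and the common (but not universal) 2048-sector alignment; the mount/losetup steps need root, so they are shown as comments only:

```shell
#!/bin/sh
set -e
# Byte offset of a partition inside a raw disk image.
# The start sector would come from `fdisk -l drive.img`; 2048 is only
# the typical default alignment, not a guaranteed value.
START_SECTOR=2048
SECTOR_SIZE=512
OFFSET=$((START_SECTOR * SECTOR_SIZE))
echo "mount offset: $OFFSET bytes"

# With root, either of these exposes the partition (not run here):
#   mount -o ro,loop,offset=$OFFSET drive.img /mnt/img
#   losetup -f -P drive.img    # then mount the /dev/loopXp1 it creates
```

`losetup -P` is the partx-style route: the kernel scans the partition table on the loop device and creates per-partition nodes, so no splitting or full restore is needed.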

Compression can help save space and time, depending on the speed of your computer versus the speed of I/O to various devices. Over a network, most definitely use compression. Even gigabit Ethernet is horridly slow compared to USB3 speeds or better.

Gregg Bell 05-06-2017 12:31 AM

Quote:

Originally Posted by hydrurga (Post 5706681)
I find that FreeFileSync works for me as a handy GUI solution.

Note that, under normal "mirror" operation, FreeFileSync doesn't maintain multiple backup versions (e.g. multiple versions of one file if you've changed it multiple times), it purely creates one mirror image of all your files on the destination drive. So, if you've added a file, it adds it to the backup drive, if you've deleted a file then it deletes it, and if you've modified a file then it copies the new file over the older one on the backup drive.

You would have to decide if that suits you. If not, you might want to find a solution that stores all previous copies of files (those that have been modified or deleted).

Thanks hydrurga. I just heard of FreeFileSync a few days ago and I think it's pretty amazing. I was just testing it. I think I'll stay with BackInTime but use this as an additional backup.

And as to the multiple backup versions, if it's important enough, I can just date each day's version.

Gregg Bell 05-06-2017 12:35 AM

Quote:

Originally Posted by jailbait (Post 5706697)
There are several considerations in backups which mean that simplicity is a relative term.

Half of the considerations in backup have to do with restores. You have to be able to do different kinds of restores and do so on a system that may be crippled and only partially functional. If you use compression then decompressing during a restore is a complication that you should avoid if you have the space to keep uncompressed backups.

You should keep your backup in a file structure as close as possible to your live file structure. That makes it easier to select whatever subset of your backup you need to restore. Tar files are probably the worst backup structure in this regard.

You should be able to log in as root. Running a restore as root is far less complicated than trying to do a restore as an ordinary user or via sudo, or running a restore from a live DVD or USB stick.



The best way to keep the copy time down is to only copy files that have changed since the last backup. The first copy copies everything and takes a long time. Then the incremental backups from then on only take a fraction of the time of a full backup.

You need to keep several generations of backup. You may not notice that you have lost an important file until some time has passed. How many generations of backup you keep depends on how much space you have available. The time interval between generations is a guess as to how long a missing file will be unnoticed.

So I suggest that you first make a backup plan to suit your needs and then go back and tackle the various backup programs again to find one that will do things your way.

----------------------------
Steve Stites

Thanks Steve. That's solid advice. Although, like you say, it would be better to do the restore as root, I just don't have the skills for that yet. I've got BackInTime for the older files, and the restore does work quite nicely. And now I'm experimenting with FreeFileSync. I'm feeling pretty comfortable at this point.

Gregg Bell 05-06-2017 12:37 AM

Quote:

Originally Posted by jefro (Post 5706738)
The best is the one that you use and like and that works. Not sure you need daily backups of full files. Might look at rsync or another tool that only copies changes?

Compression isn't bad if network transfer or disk access is slow and processor is faster.

You might be able to adjust compression levels if you feel that's the issue.

Thanks jefro. The GUI for BackInTime shows that rsync is the power behind it. And it is pretty quick. FreeFileSync handles the incremental changes.

Gregg Bell 05-06-2017 12:38 AM

Quote:

Originally Posted by frankbell (Post 5706839)
I use rsync. Since I'm using it at home, I back up ~/ manually when I've done something worth backing up. Here's my little script, in case it helps.

Code:

rsync -a /home/[username] [username]@[serverIP]:/path/to/backup/directory

I must say, it took me a bit of research to get the script configured in a way that worked for me. There's a lot of material on the web about rsync, but much of it is rather impenetrable, especially if you are as green as I was when I started learning the command.

Thanks Frank. I really should do something like that but I still need my security blanket (GUI) at this point!

Gregg Bell 05-06-2017 12:40 AM

Quote:

Originally Posted by Shadow_7 (Post 5706865)
I used G4U a few years back; it was simple-ish. Although I was trying to make a not-so-good install of Win98 make better use of the disk. FAT16, so a 2GB limit on a 10GB disk. And other things my brother did when handing down tech to the nephews (he could have at least brought the install media). The only tech I had brought with me on that trip was a 2GB USB stick. But I got it done.

Depends on what you want from a backup though. If you just want access to your data, and not system recovery of an OS, rsync is pretty simple. If you don't care much about drive space usage or time, dd is pretty simple. At least until you want to access the data on an image of the whole drive. But partx and other tools make it doable without using split to part out the partitions, or restoring the drive to access the data.

Compression can help save space and time, depending on the speed of your computer versus the speed of I/O to various devices. Over a network, most definitely use compression. Even gigabit Ethernet is horridly slow compared to USB3 speeds or better.

Thanks Shadow. The BackInTime I'm using does the compression and it's pretty quick. I'm thinking I'm pretty safe (and satisfied) at this point. Appreciate your input.

