LinuxQuestions.org
Forums > Linux Forums > Linux - Newbie
Old 05-05-2017, 01:06 PM   #1
Gregg Bell
Senior Member
 
Registered: Mar 2014
Location: Illinois
Distribution: Xubuntu
Posts: 2,034

Rep: Reputation: 176
What is the best SIMPLE one-to-one direct-copy backup system?


I've been using BackInTime, which uses compression, but it was pointed out to me that with the inexpensive storage available nowadays a direct copy backup system is simpler.

I've looked at Grsync and Unison and things like that, and I get lost with some of their terminology. (And wonder if I'm really doing what I want to be doing.) What's a simple system for backing up my most important folders (under 30GB) in my Home folder on a daily basis--one that doesn't take a long time?

Thanks.
 
Old 05-05-2017, 01:19 PM   #2
hydrurga
LQ Guru
 
Registered: Nov 2008
Location: Pictland
Distribution: Linux Mint 21 MATE
Posts: 8,048
Blog Entries: 5

Rep: Reputation: 2925
I find that FreeFileSync works for me as a handy GUI solution.

Note that, under normal "mirror" operation, FreeFileSync doesn't maintain multiple backup versions (e.g. several versions of one file you've changed repeatedly); it simply creates a single mirror image of all your files on the destination drive. So if you've added a file, it adds it to the backup drive; if you've deleted a file, it deletes it; and if you've modified a file, it copies the new version over the older one on the backup drive.

You would have to decide if that suits you. If not, you might want to find a solution that stores all previous copies of files (those that have been modified or deleted).

Last edited by hydrurga; 05-05-2017 at 01:21 PM.
 
2 members found this post helpful.
Old 05-05-2017, 01:48 PM   #3
jailbait
LQ Guru
 
Registered: Feb 2003
Location: Virginia, USA
Distribution: Debian 12
Posts: 8,337

Rep: Reputation: 548
There are several considerations in backups which make simplicity a relative term.

Half of the considerations in backup have to do with restores. You have to be able to do different kinds of restores and do so on a system that may be crippled and only partially functional. If you use compression then decompressing during a restore is a complication that you should avoid if you have the space to keep uncompressed backups.

You should keep your backup in a file structure as close as possible to your live file structure. That makes it easier to select whatever subset of your backup you need to restore. Tar files are probably the worst backup structure in this regard.

You should be able to log in as root. Running a restore as root is far less complicated than trying to restore as an ordinary user or via sudo, or running a restore from a live DVD or USB stick.

Quote:
Originally Posted by Gregg Bell
What's a simple system for backing up my most important folders (under 30GB) in my Home folder on a daily basis--one that doesn't take a long time?
The best way to keep the copy time down is to only copy files that have changed since the last backup. The first copy copies everything and takes a long time. Then the incremental backups from then on only take a fraction of the time of a full backup.

You need to keep several generations of backup. You may not notice that you have lost an important file until some time has passed. How many generations of backup you keep depends on how much space you have available. The time interval between generations is a guess as to how long a missing file will be unnoticed.

So I suggest that you first make a backup plan to suit your needs and then go back and tackle the various backup programs again to find one that will do things your way.
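The plan described above (incremental daily copies plus several pruned generations) can be sketched with plain shell tools. This is a minimal illustration, not any particular backup program: all directories here are throwaway temp dirs so it is safe to run, and in real use SRC would be your data and DEST a mounted backup drive (both names are hypothetical).

```shell
#!/bin/sh
# Sketch of a generational backup plan: one dated snapshot per day,
# pruned to a fixed number of generations. Temp dirs stand in for
# the real source data and backup drive.
SRC=$(mktemp -d)
DEST=$(mktemp -d)
echo "important notes" > "$SRC/notes.txt"

snapshot() {
    mkdir -p "$DEST/$1"
    cp -au "$SRC/." "$DEST/$1/"   # -a keep attributes, -u copy only changed files
}

snapshot 2017-05-04
snapshot 2017-05-05
snapshot 2017-05-06

# Keep only the two most recent generations (oldest dated dirs are removed).
ls -1d "$DEST"/????-??-?? | head -n -2 | xargs -r rm -rf

ls "$DEST"
```

Because the snapshots are plain directories mirroring the live file structure, restoring a subset is just a `cp` back the other way, which is exactly the property argued for above.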

----------------------------
Steve Stites

Last edited by jailbait; 05-05-2017 at 01:51 PM.
 
1 member found this post helpful.
Old 05-05-2017, 03:08 PM   #4
jefro
Moderator
 
Registered: Mar 2008
Posts: 21,980

Rep: Reputation: 3624
The best is one that you use and like and that works. Not sure you need daily backups of full files. Might look at rsync or another tool that only copies changes?

Compression isn't bad if network transfer or disk access is slow and the processor is fast.

You might be able to adjust compression levels if you feel that's the bottleneck.
 
1 member found this post helpful.
Old 05-05-2017, 09:06 PM   #5
frankbell
LQ Guru
 
Registered: Jan 2006
Location: Virginia, USA
Distribution: Slackware, Ubuntu MATE, Mageia, and whatever VMs I happen to be playing with
Posts: 19,323
Blog Entries: 28

Rep: Reputation: 6141
I use rsync. Since I'm using it at home, I back up ~/ manually when I've done something worth backing up. Here's my little script, in case it helps.

Code:
rsync -a /home/[username] [username]@[serverIP]:/path/to/backup/directory
I must say, it took me a bit of research to get the script configured in a way that worked for me. There's a lot of material on the web about rsync, but much of it is rather impenetrable, especially if you are as green as I was when I started learning the command.

Last edited by frankbell; 05-05-2017 at 09:09 PM.
 
1 member found this post helpful.
Old 05-05-2017, 11:16 PM   #6
Shadow_7
Senior Member
 
Registered: Feb 2003
Distribution: debian
Posts: 4,137
Blog Entries: 1

Rep: Reputation: 874
I used G4U a few years back; it was simple-ish. Although I was trying to make a not-so-good install of Win98 use the disk better. FAT16, so a 2GB limit on a 10GB disk. And other things my brother did when handing down tech to the nephews (he could have at least brought the install media). The only tech I had brought with me on that trip was a 2GB USB stick. But I got it done.

It depends on what you want from a backup, though. If you just want access to your data, and not full system recovery of an OS, rsync is pretty simple. If you don't care much about drive space usage or time, dd is pretty simple. At least until you want to access the data on an image of the whole drive. But partx and other tools make that doable without using split to part out the partitions, or restoring the drive to access the data.

Compression can help save space and time, depending on the speed of your computer versus the speed of I/O to various devices. Over a network, definitely use compression. Even gigabit Ethernet is horridly slow compared to USB3 speeds or better.
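The compress-before-transfer point can be illustrated with standard tools: bundle a directory into a gzipped tar archive, which is what you'd push over a slow link instead of the raw files. Everything here is a temp path created just for the demonstration.

```shell
#!/bin/sh
# Compress a directory into a gzipped tarball before transfer.
# Temp paths stand in for real source data and an archive destination.
SRC=$(mktemp -d)
ARCHIVE=$(mktemp -u).tgz
printf 'some text\n' > "$SRC/a.txt"

tar -C "$SRC" -czf "$ARCHIVE" .   # -z gzip-compresses while archiving
tar -tzf "$ARCHIVE"               # verify: list the archive's members
```

The trade-off mentioned above applies: as the thread's earlier posts note, a tar archive is the least convenient structure for partial restores, so this suits transfer over a slow network more than day-to-day backup browsing.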
 
1 member found this post helpful.
Old 05-06-2017, 12:31 AM   #7
Gregg Bell
Senior Member
 
Registered: Mar 2014
Location: Illinois
Distribution: Xubuntu
Posts: 2,034

Original Poster
Rep: Reputation: 176
Quote:
Originally Posted by hydrurga
I find that FreeFileSync works for me as a handy GUI solution.

Note that, under normal "mirror" operation, FreeFileSync doesn't maintain multiple backup versions (e.g. several versions of one file you've changed repeatedly); it simply creates a single mirror image of all your files on the destination drive. So if you've added a file, it adds it to the backup drive; if you've deleted a file, it deletes it; and if you've modified a file, it copies the new version over the older one on the backup drive.

You would have to decide if that suits you. If not, you might want to find a solution that stores all previous copies of files (those that have been modified or deleted).
Thanks hydrurga. I just heard of FreeFileSync a few days ago and I think it's pretty amazing. I was just testing it. I think I'll stay with BackInTime but use this as an additional backup.

And as to the multiple backup versions, if it's important enough, I can just date each day's version.
 
Old 05-06-2017, 12:35 AM   #8
Gregg Bell
Senior Member
 
Registered: Mar 2014
Location: Illinois
Distribution: Xubuntu
Posts: 2,034

Original Poster
Rep: Reputation: 176
Quote:
Originally Posted by jailbait
There are several considerations in backups which make simplicity a relative term.

Half of the considerations in backup have to do with restores. You have to be able to do different kinds of restores and do so on a system that may be crippled and only partially functional. If you use compression then decompressing during a restore is a complication that you should avoid if you have the space to keep uncompressed backups.

You should keep your backup in a file structure as close as possible to your live file structure. That makes it easier to select whatever subset of your backup you need to restore. Tar files are probably the worst backup structure in this regard.

You should be able to log in as root. Running a restore as root is far less complicated than trying to restore as an ordinary user or via sudo, or running a restore from a live DVD or USB stick.



The best way to keep the copy time down is to only copy files that have changed since the last backup. The first copy copies everything and takes a long time. Then the incremental backups from then on only take a fraction of the time of a full backup.

You need to keep several generations of backup. You may not notice that you have lost an important file until some time has passed. How many generations of backup you keep depends on how much space you have available. The time interval between generations is a guess as to how long a missing file will be unnoticed.

So I suggest that you first make a backup plan to suit your needs and then go back and tackle the various backup programs again to find one that will do things your way.

----------------------------
Steve Stites
Thanks Steve. That's solid advice. Although, like you say, it would be better to do the restore as root, I just don't have the skills for that yet. I've got BackInTime for the older files, and the restore does work quite nicely. And now I'm experimenting with FreeFileSync. I'm feeling pretty comfortable at this point.
 
Old 05-06-2017, 12:37 AM   #9
Gregg Bell
Senior Member
 
Registered: Mar 2014
Location: Illinois
Distribution: Xubuntu
Posts: 2,034

Original Poster
Rep: Reputation: 176
Quote:
Originally Posted by jefro
The best is one that you use and like and that works. Not sure you need daily backups of full files. Might look at rsync or another tool that only copies changes?

Compression isn't bad if network transfer or disk access is slow and the processor is fast.

You might be able to adjust compression levels if you feel that's the bottleneck.
Thanks jefro. The GUI for BackInTime shows that rsync is the power behind it. And it is pretty quick. FreeFileSync handles the incremental changes.
 
Old 05-06-2017, 12:38 AM   #10
Gregg Bell
Senior Member
 
Registered: Mar 2014
Location: Illinois
Distribution: Xubuntu
Posts: 2,034

Original Poster
Rep: Reputation: 176
Quote:
Originally Posted by frankbell
I use rsync. Since I'm using it at home, I back up ~/ manually when I've done something worth backing up. Here's my little script, in case it helps.

Code:
rsync -a /home/[username] [username]@[serverIP]:/path/to/backup/directory
I must say, it took me a bit of research to get the script configured in a way that worked for me. There's a lot of material on the web about rsync, but much of it is rather impenetrable, especially if you are as green as I was when I started learning the command.
Thanks Frank. I really should do something like that but I still need my security blanket (GUI) at this point!
 
Old 05-06-2017, 12:40 AM   #11
Gregg Bell
Senior Member
 
Registered: Mar 2014
Location: Illinois
Distribution: Xubuntu
Posts: 2,034

Original Poster
Rep: Reputation: 176
Quote:
Originally Posted by Shadow_7
I used G4U a few years back; it was simple-ish. Although I was trying to make a not-so-good install of Win98 use the disk better. FAT16, so a 2GB limit on a 10GB disk. And other things my brother did when handing down tech to the nephews (he could have at least brought the install media). The only tech I had brought with me on that trip was a 2GB USB stick. But I got it done.

It depends on what you want from a backup, though. If you just want access to your data, and not full system recovery of an OS, rsync is pretty simple. If you don't care much about drive space usage or time, dd is pretty simple. At least until you want to access the data on an image of the whole drive. But partx and other tools make that doable without using split to part out the partitions, or restoring the drive to access the data.

Compression can help save space and time, depending on the speed of your computer versus the speed of I/O to various devices. Over a network, definitely use compression. Even gigabit Ethernet is horridly slow compared to USB3 speeds or better.
Thanks Shadow. The BackInTime I'm using does the compression and it's pretty quick. I'm thinking I'm pretty safe (and satisfied) at this point. Appreciate your input.
 
  

