Linux - Newbie
This Linux forum is for members that are new to Linux. Just starting out and have a question? If it is not in the man pages or the how-tos, this is the place!
I had Back In Time and it works great until it doesn't. Free File Sync seems like it would be good, but it often finds files it can't sync for some reason. I tried Kbackup. They said it was easy. It's not. At least not for me. I ran it for hours, and when the Home folder was finally copied the screen froze. Then I finally learned how to untar the tar file, and there was nothing in it. (Admittedly, I may have done something wrong.) Grsync seems powerful, but I get confused with "verbose" and all that stuff. Anyway, there's got to be something simple. All I would ask is that it be able to do incremental changes. I don't need old versions or anything like that. Thank you.
I use rsync with the -a (for archive) argument. Here's my little script:
Code:
rsync -a [/path/to/home] /media/sdb1/backups
I understand that Back In Time is a front end for rsync, so it might help if you told us more about how Back In Time has misbehaved.
Thanks Frank. Back In Time will just throw errors at me sometimes. It's a really good program, though. I don't mean to dis it. The latest thing was it was saying the receiving USB drive didn't have enough space.
Anyway, I was looking over the directions for grsync and I'm starting to think that might work for me. (It's not as complicated as it seems.)
So in your command sdb1 is a usb drive? What does your command actually do? (I'm still not real comfortable doing stuff like that in the terminal, though.)
Quote:
Originally Posted by Gregg Bell
Thanks Frank. Back In Time will just throw errors at me sometimes. It's a really good program, though. I don't mean to dis it. The latest thing was it was saying the receiving USB drive didn't have enough space.
Well, that last one sounds like a fairly straightforward error to understand.
Quote:
Originally Posted by Gregg Bell
Anyway, I was looking over the directions for grsync and I'm starting to think that might work for me. (It's not as complicated as it seems.)
So in your command sdb1 is a usb drive? What does your command actually do? (I'm still not real comfortable doing stuff like that in the terminal, though.)
It is wherever you want to back up your files to - so in your case probably a USB drive.
If you were mostly happy with Back In Time, perhaps people here could help you with whatever problems you had with it.
Rolling your own backup solution if you're not comfortable with the commandline could be a bit of a headache.
Quote:
Originally Posted by Gregg Bell
Back In Time will just throw errors at me sometimes. It's a really good program, though. I don't mean to dis it. The latest thing was it was saying the receiving USB drive didn't have enough space.
What did "df <receiving-usb-device>" return after that message was seen?
I've never used "Back In Time" so I wonder: does it save copies of the source trees in point-in-time directories? (Ah... I checked the documentation and it looks like this may very well be the case.) If so, perhaps there are too many older snapshots and insufficient space for another. Did that error message appear immediately or early in the backup process? (Perhaps BIT scanned your hard disk, determined that the number of files to be backed up exceeded the remaining space on the USB drive, and threw up its hands before doing anything.) After it had been running for a while? (It was backing up but simply ran out of space while performing the backup.)
All of that would be helpful to know.
Also, I'd try cleaning up old cruft like browser caches and thumbnails and trying again. If that works, perhaps BIT allows you to specify "back up everything except that... and that... and..." (You get the idea.)
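For reference, this is a quick way to check the free space rnturn is asking about. The path below is an assumption for the demo; for a USB backup drive you'd point df at its mount point instead:

```shell
# Show size, used and available space (human-readable units) for the
# filesystem containing the given path. Substitute your USB drive's
# mount point, e.g. "df -h /media/sdb1" -- /tmp is just a demo path.
df -h /tmp
```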
Quote:
Originally Posted by Gregg Bell
I had Back In Time and it works great until it doesn't. Free File Sync seems like it would be good, but it often finds files it can't sync for some reason. I tried Kbackup. They said it was easy. It's not. At least not for me. I ran it for hours, and when the Home folder was finally copied the screen froze. Then I finally learned how to untar the tar file, and there was nothing in it. (Admittedly, I may have done something wrong.) Grsync seems powerful, but I get confused with "verbose" and all that stuff. Anyway, there's got to be something simple. All I would ask is that it be able to do incremental changes. I don't need old versions or anything like that. Thank you.
All this sounds like you need to solve the problems you listed instead of dismissing them.
In other words, what if we recommend "something else" and it "works until it doesn't" (great problem description btw)?
Quote:
Originally Posted by evo2
Well, that last one sounds like a fairly straightforward error to understand.
Thanks Evo2. Yeah, it seems that way, but it wasn't. There was plenty of room on the USB drive. What happened is that as soon as I took another "snapshot," black squares started filling the screen saying there wasn't enough room. I'd run into that problem before, and as I recall I had to go to the dev on GitHub to solve it.
Quote:
Originally Posted by evo2
It is wherever you want to back up your files to - so in your case probably a USB drive.
If you were mostly happy with Back In Time, perhaps people here could help you with whatever problems you had with it.
Rolling your own backup solution if you're not comfortable with the commandline could be a bit of a headache.
Evo2.
Yeah, I'm staying away from the commandline. It's like Universal replace in LO or Word. Too risky. (Too much power in the hands of somebody as impatient as me.)
Quote:
Originally Posted by rnturn
What did "df <receiving-usb-device>" return after that message was seen?
I've never used "Back In Time" so I wonder: does it save copies of the source trees in point-in-time directories? (Ah... I checked the documentation and it looks like this may very well be the case.) If so, perhaps there are too many older snapshots and insufficient space for another. Did that error message appear immediately or early in the backup process? (Perhaps BIT scanned your hard disk, determined that the number of files to be backed up exceeded the remaining space on the USB drive, and threw up its hands before doing anything.) After it had been running for a while? (It was backing up but simply ran out of space while performing the backup.)
All of that would be helpful to know.
Also, I'd try cleaning up old cruft like browser caches and thumbnails and trying again. If that works, perhaps BIT allows you to specify "back up everything except that... and that... and..." (You get the idea.)
HTH...
Thanks rnturn. I was definitely short on trying to figure it out. There was only one snapshot on the USB drive, so there was plenty of room (I've had up to seven). I had run into the problem before, knew that the fix was pretty involved, and just wasn't up for it. The thing is, I don't really need anything more than the current Home folder backed up. In fact, when BIT gets a few snapshots on it, it gets really slow. I'm trying Grsync right now. I think this might be my answer.
Quote:
All this sounds like you need to solve the problems you listed instead of dismissing them.
In other words, what if we recommend "something else" and it "works until it doesn't" (great problem description btw)?
LOL You mean you needed a better description than that?
I'm running Grsync right now. All I need is one backup. BIT specialized in going back in time. I just want now. We'll see how this goes. I can go back (no pun intended) and figure out BIT if I need to.
I've been using Duplicity globally via the root crontab. It works fine and I've been able to do simple restorations. My problem is how to restore if I lose the existing /home/"$USER"/.cache/duplicity directory. Here is my script.
Code:
#!/bin/sh
# Back up every user's home directory with duplicity. $POOLLOC is an
# NFS-mounted pool from the server and must be set (and the function
# called) elsewhere.
desktop_home_dir_func() {
    if [ -n "$POOLLOC" ] ; then
        TARGET="$POOLLOC"/backups/homebackups
        if [ "$USER" = root ] ; then
            HOSTNAME=$(uname -n)
            for user in /home/* ; do
                BUSER=$(basename "$user")
                # Create the per-user target directory on first run.
                if [ ! -d "$TARGET"/"$HOSTNAME"/"$BUSER" ] ; then
                    mkdir -p "$TARGET"/"$HOSTNAME"/"$BUSER" && chown -R \
                        "$BUSER":"$BUSER" "$TARGET"/"$HOSTNAME"/"$BUSER"
                fi
                # Incremental backup as the user; new full backup monthly.
                su -c "duplicity \
                    --exclude-if-present .nobackup \
                    --no-encryption \
                    --full-if-older-than 1M \
                    /home/${BUSER} file://${TARGET}/${HOSTNAME}/${BUSER}" "$BUSER"
                # Keep only the four most recent full backup chains.
                su -c \
                    "duplicity remove-all-but-n-full 4 --force \
                    file://${TARGET}/${HOSTNAME}/${BUSER}" "$BUSER"
            done
        else
            # Non-root: restore this user's backup to a folder on the Desktop.
            lsblk | head -n 2 | grep ltsp && HOSTNAME=server || HOSTNAME=$(uname -n)
            RESTFOLDER=/home/"$USER"/Desktop/RESTORE_DELETE_ASAP
            duplicity \
                --no-encryption \
                --progress \
                file://"$TARGET"/"$HOSTNAME"/"$USER"/ "$RESTFOLDER"
            # Mark the restored copy so it is excluded from future backups.
            touch "$RESTFOLDER"/.nobackup
        fi
    fi
}
Last edited by jmgibson1981; 11-25-2020 at 12:55 PM.
Putting a backup of /home in /home is totally useless, just a waste of time and storage space. Backups should go to an external drive, either local, network, or cloud, preferably more than one. I know nothing about duplicity.
Quote:
Putting a backup of /home in /home is totally useless, just a waste of time and storage space.
If that was directed at me, might you point out the problem in my script? I'd like it to be perfect. My "$POOLLOC" is an NFS-mounted directory from my server.
.cache should not be considered permanent. Anything there is subject to loss. I suggest backing up /home/"$USER"/.cache/duplicity to somewhere outside /home. That could be done separately or as part of the normal backup.
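Following that suggestion, here's a minimal sketch of copying duplicity's cache out of /home alongside the regular backup. The destination path is an assumption for the demo; in the script above it would live somewhere under "$POOLLOC":

```shell
# Sketch: preserve duplicity's local cache outside /home so a restore
# stays straightforward if /home is lost. DEST is an assumed demo path.
DEST=/tmp/demo_pool/duplicity-cache
mkdir -p "$HOME/.cache/duplicity"   # ensure the cache dir exists (demo only)
mkdir -p "$DEST"
rsync -a "$HOME/.cache/duplicity/" "$DEST/"
```

This could be appended to the root branch of the backup function, run once per user, so the cache is refreshed every time the backup runs.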