Need suggestions - best way to backup computers across LAN
I want centralized, unattended backup of 5 computers on my LAN but have no idea how to implement it. One of the computers will have a USB 3.0 to SATA dual-bay external hard drive docking station holding two 2 TB SSDs; that machine will be online 24x7 and is on a UPS.
There are a number of ways to do this. I personally use duplicity on the command line, set to run from a script at a fixed time of day. Combine this with Wake-on-LAN to wake the machines and you have a good setup. Here is my script for the home directories of a given machine.
Code:
#!/bin/bash
# POOLLOC must be set before this runs; it points at the mount point
# of the backup pool.
if [[ "$USER" == root ]] ; then
    for user in /home/* ; do
        BACKUPUSER=$(basename "$user")
        TARGETDIR="$POOLLOC"/backups/homebackups/"$HOSTNAME"/"$BACKUPUSER"
        if [[ ! -d "$TARGETDIR" ]] ; then
            mkdir -p "$TARGETDIR"
            chown -R "$BACKUPUSER":"$BACKUPUSER" "$TARGETDIR"
        fi
        # Incremental backup as the user; a new full chain every month,
        # skipping any directory that contains a .nobackup marker file.
        su -c "duplicity \
            --exclude-if-present .nobackup \
            --no-encryption \
            --full-if-older-than 1M \
            /home/${BACKUPUSER} file://${TARGETDIR}" "$BACKUPUSER"
        # Keep only the last 4 full chains.
        su -c \
            "duplicity remove-all-but-n-full 4 --force file://${TARGETDIR}" \
            "$BACKUPUSER"
    done
fi
It's rough, probably not the best way, but it works for me. I have a single script shared via NFS to all machines in my home.
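For the "x time of day" part, a root crontab entry on each machine is enough. A sketch, assuming the script above is saved at /usr/local/bin/homebackup.sh (that path is hypothetical):

Code:
```
# m  h  dom mon dow   command        (root's crontab)
30   2  *   *   *     /usr/local/bin/homebackup.sh
```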
Last edited by jmgibson1981; 09-01-2020 at 09:13 AM.
@jmgibson1981 - Thanks for the quick reply. I've never used duplicity, I need to check it out. The script looks like it might be the solution I'm looking for.
Quote:
I want to have centralized unattended backup of 5 computers on my LAN but have no idea how to implement it. One of the computers will have a USB 3.0 to SATA Dual Bay External Hard Drive Docking Station with two 2TB SSD's in it that will be online 24x7 that is on a UPS.
On that machine, you might consider using the OpenZFS file system for your storage drives. OpenZFS has file level error checking and correction.
However, as far as the logistics go, if the machine is online, it holds merely copies, not backups. You'll need something you can take offline or remove for it to count as a backup, and for the minutes such a device is online, it too is only a copy. So you'll really want two removable devices you can rotate.
@Turbocapitalist What I was envisioning was running rsync against the clients from the backup machine, not imaging the drives, although that's not a bad idea. All the clients are laptops, so they have a built-in UPS.
I'm all for the original plan, copying the data, just with the two additional requirements.
The OpenZFS file system would be for the destination drives used by rsync, duplicity, or any similar tool handling the data copying. You'd still transfer the data as before; it's just that, once it's on your 24/7 UPS machine, the files at rest get some protection against the physical medium flipping a bit or two.
@Turbocapitalist - thanks for that, it verifies the concept I had envisioned. I guess the next question is how to make rsync target a client machine from the backup machine, but that's a question for another area in the forums.
As with most things here, there are several ways to do it. Since the clients are not on 24/7 but the server is, the clients are the first to know when they are active, and they can rely on the server being on. So I would have the clients contact the server. Maybe choose something like anacron, which lets a client catch up on jobs that were scheduled while it was off.
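To sketch the anacron side (the job name and script path below are hypothetical), an entry in /etc/anacrontab runs at most once per period, even if the machine was off at the scheduled time:

Code:
```
# period(days)  delay(min)  job-id         command
1               15          daily-backup   /usr/local/bin/homebackup.sh
```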
As for the rsync options, it has run over SSH for nearly forever, and that allows keys to be used for authentication. (SSH certificates would also work.)
Code:
rsync -a -e 'ssh -i /home/larryhale/.ssh/somekey_ed25519' \
/home/larryhale/ lh@example.com:/home/larryhale/
Once you have the basic script working, you can lock down the key on the server side by prepending command="..." to the public key in the authorized_keys file.
The exact formula to put there can be found by using the -v option with the SSH client. Also see "man sshd" and scroll down to the AUTHORIZED_KEYS FILE FORMAT section for the command="..." part.
Note that the authorized_keys file can be relocated on the SSH server to be outside of the home directory. See "man sshd_config" and scroll down to AuthorizedKeysFile. That would allow you to copy a home directory as-is yet still have additional keys or controls on the login process.
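As a sketch of what a locked-down key can look like in authorized_keys: the exact command string should come from your own ssh -v run, so the line below is only illustrative, and rrsync is the restricted-rsync wrapper shipped with the rsync package (its path varies by distro):

Code:
```
command="rrsync /exports/backup",no-agent-forwarding,no-port-forwarding,no-pty,no-X11-forwarding ssh-ed25519 AAAA... lh@example.com
```

That confines the key to rsync transfers under /exports/backup and nothing else.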
Last edited by Turbocapitalist; 09-01-2020 at 10:34 AM.
I use Linux Mint and there is no option to format a drive as OpenZFS or read from an OpenZFS partition. The closest I can get is Btrfs or XFS. Which do you think I should use?
Hopefully others will chime in here over time. I've not used Btrfs, though I can't remember why. So of the two, I would try XFS with copy-on-write (reflink) enabled. But that aspect of XFS might not be so relevant if you have a UPS.
OK, Linux Mint 20 should be able to add OpenZFS support, since you can add it to Debian and its derivatives, and the very closely related Ubuntu 20.04 supports it out of the box. Fortunately you only need a data partition or two on OpenZFS, not the root partition, so that is easier. You do have to install a few packages and then reboot, though.
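A rough sketch of that setup on Mint, assuming the two dock drives show up as /dev/sda and /dev/sdb (the device names and pool name here are hypothetical, so check yours with lsblk first):

Code:
```
sudo apt install zfsutils-linux
sudo reboot
# After the reboot, mirror the two 2 TB SSDs into one pool:
sudo zpool create backuppool mirror /dev/sda /dev/sdb
# Periodically verify checksums and repair from the mirror copy:
sudo zpool scrub backuppool
```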
Hey there, there are many online cloud services you can back your data up to, with different sizes and prices. Some companies also have offers for large storage; for example, Google Drive gives you 15 GB free and can be extended cheaply, and there is also Dropbox. Some services are faster than others at uploading and downloading. I had no experience with cloud services until I had to replace my old Linux computer, which was getting really slow, with another PC. It held millions of files of my important work data from over 10 years and I was afraid to lose them, so I had to find a way to save them all. The number of services left me confused at first, but in the end a comparison site pointed me to a provider matching my data size at a low price, so you could try one of those.
Last edited by yourfriend007; 09-11-2020 at 05:03 AM.
I use the code below to back up my laptops and workstations to my file server. Replace the values of RUSER and RHOST to match the user name and IP address of the backup destination. This script uses SSH keys (RSA in my case) to function.
This will create a folder named for the day of the week (DOW) and populate a log file for you to monitor. Log files older than 30 days are removed to preserve space.
This script also uses the --exclude-from= option. The file it names is a simple text file listing the file types, files, and directories to skip, one pattern per line, e.g. *.conf
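An excludes.txt might look like the following; these entries are just illustrations, so list whatever you don't want copied:

Code:
```
*.conf
*.iso
.cache/
Downloads/
```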
Good luck.
Code:
#!/bin/bash
#
###########################################################
# Created by Ray Brunkow January 11, 2015
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 2 or version 3 of the
# license, at your option.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#############################################################
#
#
#############################################################
#
# Backup program for laptops to Linux server using
# rsync -aviS
#
#############################################################
# Variables
RUSER=user
RHOST=1.1.1.1
# RHOST=destination
WHO=$(whoami)
DOW=$(date +%A)
RDIR=/exports/backup
HOMEDIR="$HOME"
dtstamp=$(date +%Y-%m-%d-%H:%M)
log="${HOMEDIR}/logs/${dtstamp}-rsync.log"
###############################################################
### Check for a logs directory, if not found create one
###############################################################
[ ! -d "${HOMEDIR}/logs" ] && mkdir -p "${HOMEDIR}/logs" > /dev/null 2>&1
###############################################################
#
# Rsync preserving permissions
#
###############################################################
rsync -aviS --exclude-from="${HOMEDIR}/excludes.txt" "${HOMEDIR}/" "${RUSER}@${RHOST}:${RDIR}/${WHO}/${DOW}/" >> "${log}" 2>&1
### Cleaning up log files
###############################
find "${HOMEDIR}/logs" -name '*.log' -mtime +30 -exec rm '{}' \;
exit 0