LinuxQuestions.org
Linux - Networking forum
Old 09-01-2020, 09:03 AM   #1
LarryHale
LQ Newbie
 
Registered: Mar 2016
Location: Coastal South Carolina
Distribution: LM 20.4
Posts: 21

Rep: Reputation: Disabled
Need suggestions - best way to backup computers across LAN


I want centralized, unattended backups of 5 computers on my LAN but have no idea how to implement it. One of the computers will have a USB 3.0-to-SATA dual-bay external hard drive docking station holding two 2 TB SSDs; that machine will be online 24x7 and is on a UPS.

I'd appreciate any ideas.
 
Old 09-01-2020, 09:12 AM   #2
jmgibson1981
Senior Member
 
Registered: Jun 2015
Location: Tucson, AZ USA
Distribution: Debian
Posts: 1,149

Rep: Reputation: 393
There are a number of ways to do this. I personally use duplicity on the command line, set to run from a script at a given time of day. Combine this with network waking (Wake-on-LAN) of the system and you have a good setup. Here is my script for the home directories of a given machine.

Code:
# POOLLOC is set earlier in the full script to the backup pool's mount point.
if [[ "$USER" == root ]] ; then
    for user in /home/* ; do
        BACKUPUSER=$(basename "$user")
        TARGETDIR="$POOLLOC"/backups/homebackups/"$HOSTNAME"/"$BACKUPUSER"
        if [[ ! -d "$TARGETDIR" ]] ; then
            mkdir -p "$TARGETDIR"
            chown -R "$BACKUPUSER":"$BACKUPUSER" "$TARGETDIR"
        fi
        # back up as the user, taking a fresh full backup monthly
        su -c "duplicity \
            --exclude-if-present .nobackup \
            --no-encryption \
            --full-if-older-than 1M \
            /home/${BACKUPUSER} file://${TARGETDIR}" "$BACKUPUSER"
        # keep only the four most recent full backup chains
        su -c \
            "duplicity remove-all-but-n-full 4 --force file://${TARGETDIR}" \
            "$BACKUPUSER"
    done
fi
It's rough, and probably not the best way, but it works for me. I have a single script shared via NFS to all machines in my home.
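For the "run at x time of day" part, a root crontab entry pointing at the NFS-shared script is one common way to wire it up. The path and time below are made-up placeholders, not anything from the thread:

```
# hypothetical root crontab entry (crontab -e as root):
# run the shared backup script at 02:30 every night
30 2 * * *  /mnt/nfs/scripts/home-backup.sh
```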

Last edited by jmgibson1981; 09-01-2020 at 09:13 AM.
 
Old 09-01-2020, 09:22 AM   #3
LarryHale
LQ Newbie
 
Registered: Mar 2016
Location: Coastal South Carolina
Distribution: LM 20.4
Posts: 21

Original Poster
Rep: Reputation: Disabled
@jmgibson1981 - Thanks for the quick reply. I've never used duplicity; I'll need to check it out. The script looks like it might be the solution I'm looking for.
 
Old 09-01-2020, 09:26 AM   #4
Turbocapitalist
LQ Guru
 
Registered: Apr 2005
Distribution: Linux Mint, Devuan, OpenBSD
Posts: 7,333
Blog Entries: 3

Rep: Reputation: 3730
Quote:
Originally Posted by LarryHale View Post
I want to have centralized unattended backup of 5 computers on my LAN but have no idea how to implement it. One of the computers will have a USB 3.0 to SATA Dual Bay External Hard Drive Docking Station with two 2TB SSD's in it that will be online 24x7 that is on a UPS.
On that machine, you might consider using the OpenZFS file system for your storage drives. OpenZFS checksums stored data and, given redundancy such as a mirrored pair, can detect and repair corruption.
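To make that concrete, here's a rough admin-command sketch of how OpenZFS checking works on a mirrored pair of drives. The pool name "backup" and the device names are placeholders; check your actual devices with lsblk first:

```
# create a mirrored pool across the two dock SSDs (device names are examples)
sudo zpool create backup mirror /dev/sdb /dev/sdc
# verify every block against its checksum, repairing from the mirror copy where possible
sudo zpool scrub backup
# reports any checksum errors found and repaired
sudo zpool status backup
```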

However, as far as the logistics go, if the machine is always online, it holds merely copies, not backups. You'll need something you can take offline or remove for it to count as a backup, and for the minutes it is plugged in it is again only a copy. So you'll really need two removable devices, rotated.
 
Old 09-01-2020, 09:49 AM   #5
LarryHale
LQ Newbie
 
Registered: Mar 2016
Location: Coastal South Carolina
Distribution: LM 20.4
Posts: 21

Original Poster
Rep: Reputation: Disabled
@Turbocapitalist What I was envisioning was running rsync against the clients from the backup machine, not imaging the drives, although that's not a bad idea. All the clients are laptops, so their batteries effectively give them a built-in UPS.

Last edited by LarryHale; 09-01-2020 at 09:51 AM.
 
Old 09-01-2020, 09:56 AM   #6
Turbocapitalist
LQ Guru
 
Registered: Apr 2005
Distribution: Linux Mint, Devuan, OpenBSD
Posts: 7,333
Blog Entries: 3

Rep: Reputation: 3730
I'm all for the original plan, copying the data, just with the two additional requirements.

The OpenZFS file system would be for the destination drives that rsync, duplicity, or any similar copying tool writes to. You'd still transfer the data as before; it's just that once it lands on your 24/7 UPS machine, the files at rest get some protection against the physical medium flipping a bit or two.
 
Old 09-01-2020, 10:16 AM   #7
LarryHale
LQ Newbie
 
Registered: Mar 2016
Location: Coastal South Carolina
Distribution: LM 20.4
Posts: 21

Original Poster
Rep: Reputation: Disabled
@Turbocapitalist - thanks for that; it confirms the concept I had envisioned. I guess the next question is how to make rsync target a client machine from the backup machine, but that's a question for another area of the forums.
 
Old 09-01-2020, 10:33 AM   #8
Turbocapitalist
LQ Guru
 
Registered: Apr 2005
Distribution: Linux Mint, Devuan, OpenBSD
Posts: 7,333
Blog Entries: 3

Rep: Reputation: 3730
This thread is probably fine.

As with most things here, there are several ways to do it. Since the clients are not on 24/7 but the server is, the clients are the first to know when they are active, and they can rely on the server being up. So I would have the clients contact the server. Consider something like anacron so a client can catch up on cron jobs that were scheduled while it was off.
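An anacron entry on each client could look like this sketch: the fields are period in days, delay in minutes after boot, a job identifier, then the command. The script path is a placeholder:

```
# /etc/anacrontab -- run the backup at most once a day,
# 10 minutes after boot if the scheduled run was missed
1	10	nightly-backup	/usr/local/bin/backup-to-server.sh
```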

As for the rsync options: rsync has run over SSH almost forever, and that allows keys to be used for authentication. (SSH certificates would also work.)

Code:
rsync -a -e 'ssh -i /home/larryhale/.ssh/somekey_ed25519' \
        /home/larryhale/ lh@example.com:/home/larryhale/
Once you have the basic script, you can lock down the key on the server side by prepending command="..." to the public key in the authorized_keys file:

Code:
command="/usr/bin/rsync --server --sender -vlogDtpre.iLsfxC . /home/larryhale/"  ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGIaAGNj74eo/Dt4B4qwUT+z7Wyyw3oqDUL1VmMLZviv
The exact formula to put there can be found by using the -v option with the SSH client. Also see "man sshd" and scroll down to the AUTHORIZED_KEYS FILE FORMAT section for details on the command="..." part.

Code:
rsync -a -v -e 'ssh -v -i /home/larryhale/.ssh/somekey_ed25519' \
        /home/larryhale/ lh@example.com:/home/larryhale/
Note that the authorized_keys file can be relocated on the SSH server to be outside of the home directory. See "man sshd_config" and scroll down to AuthorizedKeysFile. That would allow you to copy a home directory as-is yet still have additional keys or controls on the login process.
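For example, a line like this in /etc/ssh/sshd_config would do it (the /etc/ssh/authorized_keys path here is a common convention, not something from the thread; %u expands to the user name):

```
# look for keys in an admin-controlled directory instead of the user's home
AuthorizedKeysFile /etc/ssh/authorized_keys/%u
```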

Last edited by Turbocapitalist; 09-01-2020 at 10:34 AM.
 
Old 09-01-2020, 10:46 AM   #9
LarryHale
LQ Newbie
 
Registered: Mar 2016
Location: Coastal South Carolina
Distribution: LM 20.4
Posts: 21

Original Poster
Rep: Reputation: Disabled
I use Linux Mint and there is no option to format a drive as OpenZFS or read from an OpenZFS partition. The closest I can get is Btrfs or XFS. Which do you think I should use?
 
Old 09-01-2020, 10:48 AM   #10
Turbocapitalist
LQ Guru
 
Registered: Apr 2005
Distribution: Linux Mint, Devuan, OpenBSD
Posts: 7,333
Blog Entries: 3

Rep: Reputation: 3730
Hopefully others will chime in here over time. I've not used Btrfs, though I can't remember why. So of the two, I would try XFS with copy-on-write (reflink) enabled. But that aspect of XFS might not be so relevant if you have a UPS.

Which version of Linux Mint?
 
Old 09-01-2020, 10:52 AM   #11
LarryHale
LQ Newbie
 
Registered: Mar 2016
Location: Coastal South Carolina
Distribution: LM 20.4
Posts: 21

Original Poster
Rep: Reputation: Disabled
LM 20. I've been using LM since LM 14.
 
Old 09-01-2020, 10:56 AM   #12
Turbocapitalist
LQ Guru
 
Registered: Apr 2005
Distribution: Linux Mint, Devuan, OpenBSD
Posts: 7,333
Blog Entries: 3

Rep: Reputation: 3730
Ok, version 20 might be able to add OpenZFS support: you can add it to Debian and derivatives, and the very closely related Ubuntu 20.04 supports it out of the box. Fortunately you only need OpenZFS for a data partition or two, not the root partition, which is much easier. You do have to install a few packages and then reboot to get it, though.
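On Mint 20 the packages come from the Ubuntu 20.04 repositories, so the usual sequence is roughly as follows. The package name zfsutils-linux is the Ubuntu one; double-check it in your package manager:

```
sudo apt update
sudo apt install zfsutils-linux
sudo reboot    # loads the zfs kernel module cleanly
```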
 
Old 09-01-2020, 10:59 AM   #13
LarryHale
LQ Newbie
 
Registered: Mar 2016
Location: Coastal South Carolina
Distribution: LM 20.4
Posts: 21

Original Poster
Rep: Reputation: Disabled
I'll have to check the repositories. Edit: yep, it's available.

Last edited by LarryHale; 09-01-2020 at 11:02 AM.
 
Old 09-07-2020, 05:03 PM   #14
yourfriend007
LQ Newbie
 
Registered: Jul 2020
Posts: 3

Rep: Reputation: Disabled
Hey there, there are many online cloud services for backing up your data, with different storage sizes and prices; some companies also have offers for large plans. For example, Google Drive gives you some free space that can be extended cheaply, and there's also Dropbox. Some services upload and download faster than others. I hadn't used cloud services until I had to move from my old Linux computer, which was getting really slow and held over 10 years of important work data I was afraid to lose, to a new PC. Comparing all the options was confusing at first, but I eventually found a comparison site for data backup services that pointed me to a provider suited to my data size at a low price, so you could try something like that.

Last edited by yourfriend007; 09-11-2020 at 05:03 AM.
 
Old 09-09-2020, 09:57 AM   #15
lleb
Senior Member
 
Registered: Dec 2005
Location: Florida
Distribution: CentOS/Fedora/Pop!_OS
Posts: 2,983

Rep: Reputation: 551
I use the code below to back up my laptops and workstations to my file server. Replace the values of RUSER and RHOST with your user name and the IP address of the backup destination. This script uses RSA keys to authenticate.

It creates a folder named for the day of the week (DOW) and writes a log file for you to monitor. Log files older than 30 days are removed to preserve space.

The script also uses the --exclude-from= option, which points at a plain text file listing file types, files, and directories to skip, one pattern per line.

EX: *.conf
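A minimal excludes.txt might look like this; the entries are purely illustrative, one rsync pattern per line:

```
*.tmp
.cache/
Downloads/
*.iso
```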

Good luck.

Code:
#!/bin/bash
#
###########################################################
# Created by Ray Brunkow January 11, 2015
#
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 2 or version 3 of the
# license, at your option.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
#############################################################
#
#############################################################
#
#	Backup program for laptops to Linux server using
#	rsync -aviS
#
#############################################################

#	Variables

RUSER=user
RHOST=1.1.1.1
# RHOST=destination
WHO=$(whoami)
DOW=$(date +%A)
RDIR=/exports/backup
HOMEDIR="$HOME"
dtstamp=$(date +%Y-%m-%d-%H:%M)
log="${HOMEDIR}/logs/${dtstamp}-rsync.log"

###############################################################
#	Check for a logs directory; if not found, create one
###############################################################

[ ! -d "${HOMEDIR}/logs" ] && mkdir -p "${HOMEDIR}/logs" > /dev/null 2>&1

###############################################################
#	Rsync preserving permissions
###############################################################

rsync -aviS --exclude-from="${HOMEDIR}/excludes.txt" \
	"${HOMEDIR}/" "${RUSER}@${RHOST}:${RDIR}/${WHO}/${DOW}/" >> "${log}" 2>&1

###############################################################
#	Clean up log files older than 30 days
###############################################################

find "${HOMEDIR}/logs" -name '*.log' -mtime +30 -exec rm '{}' \;

exit 0
 