LinuxQuestions.org
Old 02-19-2019, 09:21 AM   #1
mkrco
LQ Newbie
 
Registered: Feb 2019
Posts: 5

Rep: Reputation: Disabled
How do you deal with backups on multiple computers?


Hi everyone,

I'm just curious what strategies people use to maintain backups across multiple computers. I'm a scientist and typically need to work on 4 or 5 different computers (often in the same day), and I need access to my libraries on all of them. I'd like a setup where, if I write a new piece of code or add a new document on one machine, it shows up (or is accessible) on all the other machines as well.

I've tried rsync, but since one of my machines is a Windows laptop (running Ubuntu through VirtualBox), the owner of all my files there is "root", which tends to mess things up. Either way, with rsync there are so many files that it often takes hours just to walk all the directories and check what needs updating.

I've also tried using sshfs to mount a directory from one machine onto all the others, but that can be quite slow, especially when accessing many files in a short time.

So I was just wondering if someone has a clever strategy they might suggest.
 
Old 02-19-2019, 09:30 AM   #2
TB0ne
LQ Guru
 
Registered: Jul 2003
Location: Birmingham, Alabama
Distribution: SuSE, RedHat, Slack,CentOS
Posts: 26,805

Rep: Reputation: 8003
Quote:
Originally Posted by mkrco View Post
[quoted post snipped]
The SSHFS solution is the easiest, and it can be faster if you organize your files. Having folders/sub-folders makes things quicker to load: instead of listing 1,000 files in a single directory, you load ten folder names and drill down to 100 files each. Easier to navigate and faster over SSHFS. If they're large files, you can copy them locally, work on them, and copy them back up. Low-tech, but simple.
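For what it's worth, a typical mount looks something like this (the hostname and paths are placeholders, not anything from your setup, and this obviously needs a reachable SSH server to run):

```shell
# Mount the remote project tree locally over SSHFS.
# "fileserver" and the paths are example names -- substitute your own.
mkdir -p ~/work
sshfs user@fileserver:/home/user/projects ~/work -o reconnect -C

# Work on small files in place; for a large file, pull a local copy,
# edit it, then push it back:
cp ~/work/data/results.h5 /tmp/results.h5
#   ... edit /tmp/results.h5 locally ...
cp /tmp/results.h5 ~/work/data/results.h5

# When finished:
fusermount -u ~/work
```

`-o reconnect` re-establishes the mount after a dropped connection, and `-C` turns on SSH compression, which helps on slower links.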

Past that, consider using SVN or Git. Set up a repository somewhere and check your files in/out. They have versioning and other goodies to help you keep track of what you're doing, too. Lots of editors (KDevelop, for example) have plugins that talk to SVN and Git directly, letting you check files in/out automatically.
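To make the Git option concrete, here's a minimal sketch using a bare repository as the central copy. Everything below runs locally under /tmp for illustration; on a real setup the bare repo would sit on a server and the clone URL would be something like user@fileserver:/srv/git/research.git (all names here are made up):

```shell
# Create a bare "central" repository. On a real server this would live
# somewhere like /srv/git/research.git and be cloned over SSH.
git init --bare /tmp/research.git

# On workstation 1: clone, add a file, commit, push.
git clone /tmp/research.git /tmp/ws1
cd /tmp/ws1
git checkout -B main     # pin an explicit branch name
echo 'print("hello")' > analysis.py
git add analysis.py
git -c user.email=you@example.com -c user.name="You" commit -m "add analysis script"
git push origin main

# On any other machine: clone (or pull) to pick up the change.
git clone --branch main /tmp/research.git /tmp/ws2
cat /tmp/ws2/analysis.py
```

After the first clone on each machine, keeping in sync is just `git pull` before you start and `git push` when you're done.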

All these things depend on your network speed, of course, and on what kind of data and files you're working with. Google Drive and the like are also a possibility.
 
Old 02-19-2019, 10:19 AM   #3
mralk3
Slackware Contributor
 
Registered: May 2015
Distribution: Slackware
Posts: 1,904

Rep: Reputation: 1053
How do you deal with backups on multiple computers?

I have a Raspberry Pi that uses autofs to automatically mount external disks attached via USB, and NFS to share those drives over my LAN. Autofs helps lower power consumption because the disks spin down when not in use. My home is wired with Cat 6 and Gigabit switches, so file syncs do not take long. I found NFS to be faster than Samba or SSHFS; the only downside is that it is not as secure as either, so I only store work files on the NFS share. I have two 1 TB drives: one is networked, and the second is used to back up the network drive locally on the Pi.
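For anyone wanting to build something similar, the moving parts look roughly like this; the device label, paths, and subnet below are generic examples, not my actual configuration:

```shell
# /etc/exports on the Pi -- share the work directory with the LAN,
# then apply with: sudo exportfs -ra
#   /srv/work  192.168.1.0/24(rw,sync,no_subtree_check)

# /etc/auto.master -- let autofs manage mounts under /mnt,
# unmounting (and letting the disk spin down) after 60s idle:
#   /mnt  /etc/auto.usb  --timeout=60

# /etc/auto.usb -- the USB disk mounts on first access:
#   work  -fstype=ext4  :/dev/disk/by-label/WORKDISK

# On each client, mount the share:
#   sudo mount -t nfs pi.local:/srv/work /mnt/work
```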

For music and pictures I use Google Drive. I have a G Suite account for $5 a month, with the option of a storage upgrade at any time. I find Google Drive to be the better solution there because my mobile devices run Android.

My work files are stored on my NFS share.
 
Old 02-19-2019, 10:52 AM   #4
fatmac
LQ Guru
 
Registered: Sep 2011
Location: Upper Hale, Surrey/Hants Border, UK
Distribution: Mainly Devuan, antiX, & Void, with Tiny Core, Fatdog, & BSD thrown in.
Posts: 5,536

Rep: Reputation: Disabled
External media(?) - USB hard drive - pendrives

Maybe - ssh/sftp - rsync

Set one up as a file server(?)
 
Old 02-19-2019, 12:24 PM   #5
Turbocapitalist
LQ Guru
 
Registered: Apr 2005
Distribution: Linux Mint, Devuan, OpenBSD
Posts: 7,382
Blog Entries: 3

Rep: Reputation: 3773
Yes, set one up as an SFTP server. That is very easy to access and even legacy operating systems have ways of getting files to and from an SFTP server.
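Setting that up is mostly a matter of OpenSSH configuration; a rough sketch (the group name and paths are placeholders):

```shell
# /etc/ssh/sshd_config -- the SFTP subsystem is normally enabled already:
#   Subsystem sftp /usr/lib/openssh/sftp-server
# Optionally restrict a group of accounts to SFTP only, chrooted to a share
# (the chroot directory must be root-owned and not group/world-writable):
#   Match Group sftponly
#       ChrootDirectory /srv/sftp
#       ForceCommand internal-sftp
# Apply with: sudo systemctl reload sshd

# From any client -- Linux sftp, Windows WinSCP, etc.:
#   sftp user@fileserver
#   sftp> put results.csv
#   sftp> get library/paper.pdf
```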

Or else rent access to something encrypted like Tarsnap.
 
Old 02-27-2019, 06:16 AM   #6
mkrco
LQ Newbie
 
Registered: Feb 2019
Posts: 5

Original Poster
Rep: Reputation: Disabled
thanks for the replies

Looks like sshfs might be the best option for me after all. That way, for low-bandwidth stuff I can just keep everything in one location, and for high-bandwidth stuff I can copy it locally and then copy it back. Thank you.
 
Old 02-27-2019, 03:54 PM   #7
jefro
Moderator
 
Registered: Mar 2008
Posts: 22,023

Rep: Reputation: 3632
Hope it works. Let us know how it goes, maybe?

Generally rsync has fairly low overhead.

I was trying to find where I read about using ZFS for this task. Pretty sure Sun had a page about how it would maintain copies and adjust overhead to match work. Still looking for that.
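If memory serves, the ZFS version of this boils down to snapshots plus incremental send/receive; roughly (the pool and dataset names are invented, and both ends need ZFS):

```shell
# Snapshot the working dataset:
#   zfs snapshot tank/work@2019-02-27
# First full copy to the backup machine:
#   zfs send tank/work@2019-02-27 | ssh backuphost zfs receive backup/work
# Later, send only what changed since the previous snapshot:
#   zfs snapshot tank/work@2019-02-28
#   zfs send -i tank/work@2019-02-27 tank/work@2019-02-28 | \
#       ssh backuphost zfs receive backup/work
```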
 
  

