[SOLVED] How do you deal with backups on multiple computers?
Hi everyone,
I'm curious what strategies people use to maintain backups across multiple computers. I'm a scientist and typically work on 4 or 5 different machines, often in the same day, and I need access to my libraries on all of them. I'd like a system where, if I write a new piece of code or add a new document on one machine, it shows up (or is accessible) on all the others as well.
I've tried rsync, but since one of my machines is a Windows laptop (running Ubuntu through VirtualBox), the owner of all my files there ends up as "root", which tends to mess things up. Either way, with rsync there are so many files that it often takes hours just to walk all the directories and work out what needs updating.
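For reference, the kind of invocation I mean looks roughly like this (the paths are made up); dropping the owner/group flags at least keeps "root" from taking over everything:
Code:
# -rlpt instead of -a: keep permissions and times, but don't try to
# preserve owner/group (which the VirtualBox share mangles anyway)
rsync -rlptv --delete ~/work/ you@otherbox:work/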
I've also tried using sshfs to mount a directory from one machine onto all the others, but that can be quite slow, especially when accessing many files in a short time.
So I was just wondering if someone has a clever strategy they might suggest.
The SSHFS solution is the easiest, and it can be faster if you organize your files. Folders and sub-folders make things quicker to load: instead of listing 1,000 files in a single directory, you load ten folder names and drill down to the 100 files in each. Easier to navigate and faster over SSHFS. If the files are large, copy them locally, work on them, and copy them back up. Low-tech, but simple.
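Something along these lines, with hostnames and paths (workstation, /home/you/work) standing in for your own:
Code:
# Mount the remote work directory over SSH (once per session)
mkdir -p ~/work
sshfs you@workstation:/home/you/work ~/work

# For large files: copy locally, edit, copy back
cp ~/work/bigdata.h5 /tmp/
# ... work on /tmp/bigdata.h5 ...
cp /tmp/bigdata.h5 ~/work/

# Unmount when done
fusermount -u ~/work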
Past that, consider using SVN or Git. Set up a repository somewhere and check your files in and out. They have versioning and other goodies to help you keep track of what you're doing, too. Lots of editors (KDevelop, for example) have plugins that deal directly with SVN and Git, letting you check files in and out automatically.
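A rough sketch of the Git route, with the host and path names made up for illustration:
Code:
# One-time: create a bare repository on a machine all the others can reach
ssh you@server 'git init --bare ~/repos/work.git'

# On each computer: clone it once
git clone you@server:repos/work.git ~/work

# Daily cycle on any machine
cd ~/work
git pull                          # grab changes made elsewhere
git add new_script.py notes.txt
git commit -m "describe the change"
git push                          # publish so the other machines see it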
All of this depends on your network speed, of course, and on what kind of data and files you're working with. Google Drive and the like are also a possibility.
I have a Raspberry Pi that uses autofs to automatically mount external disks via USB, and NFS to share those drives over my LAN. Autofs helps lower power consumption because the disks spin down when not in use. My home is wired with Cat 6 and gigabit switches, so file syncs do not take long. I found NFS to be faster than Samba or sshfs; the only downside is that NFS is not as secure, so I only store work files on the NFS share. I have two 1 TB drives: one is networked and the second is not. The second is used to back up the network drive locally on the Pi.
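A sketch of that setup; the device name, mount point, and subnet below are assumptions, not my actual config:
Code:
# /etc/auto.master -- hand /mnt/usb over to autofs, spin down after 5 min idle
/mnt/usb  /etc/auto.usb  --timeout=300

# /etc/auto.usb -- map the key "backup" to the first USB disk
backup  -fstype=ext4  :/dev/sda1

# /etc/exports -- share the mount over NFS to the LAN
/mnt/usb/backup  192.168.1.0/24(rw,sync,no_subtree_check)

# Apply the export, then mount from a client
sudo exportfs -ra
sudo mount -t nfs pi:/mnt/usb/backup /mnt/work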
For music and pictures I use Google Drive. I have a G Suite account for $5 a month, with the option of a storage upgrade at any time. I find Google Drive to be the better solution there because my mobile devices run Android.
Yes, set one up as an SFTP server. That is very easy to access and even legacy operating systems have ways of getting files to and from an SFTP server.
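Any box running OpenSSH already speaks SFTP; a quick sketch, with the host and file names as placeholders:
Code:
# Interactive session against the machine holding the files
sftp you@fileserver
sftp> get projects/analysis.R      # pull a file down
sftp> put results.csv projects/    # push one back
sftp> exit

# Or a one-shot, scriptable transfer
scp you@fileserver:projects/analysis.R .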
Or else rent access to something encrypted like Tarsnap.
Looks like sshfs might be the best option for me after all. That way, for low-bandwidth stuff I can just keep everything in one location, and for high-bandwidth stuff I can copy it locally and then copy it back. Thank you.
I was trying to find where I read about using ZFS for this task. I'm pretty sure Sun had a page about how it would maintain copies and adjust overhead to match the workload. Still looking for it.
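If I recall correctly, the mechanism is snapshots plus send/receive; a sketch with made-up pool and dataset names:
Code:
# Snapshot the working dataset, then replicate it to another machine
zfs snapshot tank/work@today
zfs send tank/work@today | ssh backuphost zfs receive -F backup/work

# Later syncs only ship the delta between snapshots
zfs snapshot tank/work@later
zfs send -i tank/work@today tank/work@later | ssh backuphost zfs receive backup/work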