Linux - Networking
This forum is for any issue related to networks or networking. Routing, network cards, OSI, etc. Anything is fair game.
Not a newbie to Linux, but a newbie to more advanced networking here.
Now that Dropbox is limiting the number of devices with which one can use their service, I'd really like to move to an easy way of simply hosting my own files, well, at home.
I've looked into some more complex options, such as a NAS or an ownCloud server, but those seem fairly involved for a relatively simple personal-use solution.
I'm already quite familiar with accessing my home "server" machine remotely via ssh and transferring files via rsync, so I keep thinking: shouldn't there be a relatively easy way to create a personal "cloud" simply by automating rsync to keep certain folders synced across machines, including my laptop when I'm out working? It seems like there ought to be an easy way to do this, or at least an easier one than manually running rsync every time I switch machines.
How much data are you talking about, both the total to be stored and the amount changed daily?
For a generic answer, I'd just go with a home-built NAS and sshfs and/or rsync. For the NAS I'd also make sure that the underlying file system is OpenZFS using RAID-Z1 or RAID-Z2. However, because of OpenZFS's licensing (CDDL, which is incompatible with the GPL), most distributions can't ship it with the kernel, so you usually have to install it separately.
There are other options like Syncthing and IPFS (the InterPlanetary File System), but I have no direct experience with them.
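As a rough sketch of the RAID-Z setup, the pool creation looks like this. These are system-setup commands, not something to run blindly: the pool name, dataset name, and disk paths are all placeholders, and on a real machine you'd use stable /dev/disk/by-id/ names.

```shell
# Hypothetical sketch: a RAID-Z1 pool named "tank" from three disks
# (disk paths are placeholders -- use /dev/disk/by-id/ names in practice).
zpool create tank raidz1 \
    /dev/disk/by-id/ata-DISK1 \
    /dev/disk/by-id/ata-DISK2 \
    /dev/disk/by-id/ata-DISK3

# A dataset for the synced files, with cheap compression enabled:
zfs create -o compression=lz4 tank/sync

# Periodic integrity check; with RAID-Z redundancy, detected
# checksum errors can be repaired from the other disks:
zpool scrub tank
```

RAID-Z1 survives one failed disk, RAID-Z2 two; the scrub is what turns the checksumming into actual early detection, so it's worth scheduling (e.g. monthly).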
The Coda file system was almost an option a long time ago. I read about someone considering rejuvenating it but don't know where it stands now.
Thanks for the reply!
As to your question, I'm not talking about tons of data. I have a lot stored, but the amount I actually want constantly updated across devices isn't that huge: certainly less than 5 GB, and once the folders in question are initially synced, probably less than 1 GB of changes on a daily basis.
Which is why I'm thinking there's got to be an easier way than going through the rigmarole of setting up a NAS. I already have a Linux box that I use for storage, with port 22 forwarded so that I can ssh in or use rsync via my public IP address. Very often I just use rsync to grab an individual file remotely from that "storage" machine (which also runs a Minecraft server). I'm just thinking there must be a simpler way to, for example, run a script on startup that compares and updates a directory. I'm guessing I'm after a very simple, personal, peer-to-peer setup rather than some kind of client-server setup. But I'm new enough at this that I'm unsure of the best method.
Unison is another option, but I'm not sure whether it's still being developed.
Your existing setup might be OK. Though for the central machine, I'd still suggest considering a move to OpenZFS RAID-Z1 or -Z2 for the file system. OpenZFS checksums all data, so when a drive starts going bad the errors are detected and, with redundancy, hopefully corrected.
rsync can be automated easily enough with SSH keys, but be sure to make frequent incremental backups from your central machine. Adding the --update option alongside --archive will go a long way toward reducing the risk of overwriting the wrong file.
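The key setup plus a cron entry is a one-time job, roughly like this. Hostname and paths are placeholders; the passphrase-less key is what lets cron run rsync unattended, so keep that key on a machine you trust.

```shell
# One-time setup (hypothetical hostname): create a dedicated key with no
# passphrase and install it on the central machine for unattended logins.
ssh-keygen -t ed25519 -f ~/.ssh/id_sync -N ""
ssh-copy-id -i ~/.ssh/id_sync.pub user@example.home

# Then a crontab entry (edit with: crontab -e) to push changes every
# 15 minutes, using that key:
# */15 * * * * rsync --archive --update -e "ssh -i $HOME/.ssh/id_sync" \
#     "$HOME/sync/" user@example.home:/srv/sync/
```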
Don't make this more complicated than it needs to be. A server is, at the heart of it, a computer that stores and serves data for other computers in a network.
What you need is a computer on your network with adequate hard drive space, network access, and file sharing. It could even be a Raspberry Pi with a beefy external HDD. You could indeed use ssh, scp, rsync, and Samba shares or NFS.
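For the NFS route, the setup is a one-line export plus a mount. This is a sketch with placeholder subnet, paths, and hostname; on Debian/Raspbian the server side needs the nfs-kernel-server package installed first.

```shell
# Server side (hypothetical path and subnet): declare the export,
# then tell the NFS server to reload /etc/exports.
echo '/srv/share 192.168.1.0/24(rw,sync,no_subtree_check)' | sudo tee -a /etc/exports
sudo exportfs -ra

# Client side (hypothetical hostname): mount the share.
sudo mkdir -p /mnt/share
sudo mount -t nfs pi.local:/srv/share /mnt/share
```

Restricting the export to your LAN subnet matters here; unlike the ssh/rsync approach, plain NFS shouldn't be exposed through the router to the internet.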
A web search for "raspberry pi as file server" will turn up a number of articles which I think you will find helpful, even if you end up using something other than a Pi for this.