Does anyone know of a program for Linux systems that will maintain/update a set of files on a group of around 50 PCs? The files need to be updated daily, but in an efficient manner. All PCs must stay in sync with the same files from a server on the same network.
I could mount and copy with NFS, but it is an old protocol and I had a bad experience with it on a SCO OpenServer implementation (we are migrating to Linux). I want to keep the files identical on the PCs, as they provide a standard set of tools and data references for the apps we develop.
I want a fast deployment method that is good at maintaining data integrity, since any data contention problem would be amplified across all the systems that need to be maintained.
For example, our old network deploys via NFS using an automated set of Bourne shell scripts. In some instances, a PC ends up with the updated file, but all the other files in the directory go missing (except the running program).
I was going through some of my old threads and decided to reply to my own in case any poor soul comes across this problem. First of all, the NFS issue where it wipes out the directory is due to file locking and the crappy implementation of NFS on SCO.
Secondly, if you want a data deployment system that will maintain duplicate data files across a network of systems, use rsync. It is Andrew Tridgell's (also the creator of Samba) gift to network admins like me.
All you have to do is set up the rsync server and then push/pull files with rsync -avz <from_file> <to_file>. I have a shell script that runs from cron on the clients and rsyncs every couple of minutes. rsync only transfers modified files and compresses data while sending it. It's the best thing since sliced bread, and the overhead on the server is low!
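For anyone setting this up from scratch, here is a minimal sketch of what I mean. The module name "tools", the server name "fileserver", and the paths are just placeholders; adjust them for your own setup.

On the server, a bare-bones /etc/rsyncd.conf (start the daemon with "rsync --daemon"):

Code:
# /etc/rsyncd.conf - export one read-only module with the shared files
uid = nobody
gid = nobody
read only = yes

[tools]
    path = /srv/tools
    comment = shared tools and reference data

On each client, a small script run from cron. Note that --delete makes the client an exact mirror of the server, so it will remove local files that no longer exist in the module:

Code:
#!/bin/sh
# Pull the shared files from the rsync daemon on the server.
# -a preserves permissions/times, -z compresses in transit,
# and only changed files are actually transferred.
rsync -avz --delete fileserver::tools/ /usr/local/tools/

Code:
# client crontab entry: sync every five minutes
*/5 * * * * /usr/local/bin/sync_tools.sh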