Old 01-18-2008, 05:05 AM   #1
johann_p
LQ Newbie
 
Registered: Jan 2004
Location: Vienna
Posts: 15

Rep: Reputation: 0
How to administer many identical desktops?


OK, this is not really a server question in the narrow sense, but I did not know which forum would be more appropriate (tell me if there is one).

What I would like to know is how to go about administering many computers that should have identical software installations (OS packages, CPAN packages, Ruby gems, R packages, etc.), should all have the same users with the same permissions defined, should all be able to access new printers on the network, and so on.

This would mean that
1) all machines should be easily installed in very similar ways (identical when it comes to software, apart from slight hardware differences).
2) administrative commands, e.g. installing a package, should be carried out on all computers in some automatic, systematic, well-documented way instead of having the administrator do this manually or through some dirty scripting hack.
3) configuration changes should also get distributed to all computers, e.g. a newly defined printer should be accessible identically on all of them.
4) administrative stuff should not be visible or accessible to the end users: no security update notifications, no access to configuration changes via sudo, etc.

So ... how is this usually done? Are there ways to automate and organize this kind of work independent of the Linux distro used?
Or are there distros which support these things in their own way? Which (free as in free beer) distro provides the best support?

We are in the planning phase of having everyone use a Linux desktop here, but we need to sort these kinds of things out beforehand.

Going by the personal preferences of many, Ubuntu would be the ideal choice, but that would mean that Ubuntu would have to support the actions described above in some way.
 
Old 01-18-2008, 06:30 AM   #2
ilikejam
Senior Member
 
Registered: Aug 2003
Location: Glasgow
Distribution: Fedora / Solaris
Posts: 3,109

Rep: Reputation: 97
Hi.

There are two ways to do this that I can think of (rough sketches of both are below).
1) Use NFS to have a centralised file server that exports /usr/local or /opt or whatever to all the desktops - change the software on the NFS server and the changes will appear on the desktops automatically (since they're sharing the filesystem).

2) Set up your own apt/yum/whatever repository on a server, configure apt/yum/whatever on all the desktops to point to that, and set up a cron job that runs the package manager's 'update' every night to see if there have been any changes rolled out to the repository.
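
As a rough sketch (the server name, network, and paths here are made up): for (1), the server exports the tree and every desktop mounts it read-only:

# on the file server, in /etc/exports:
/export/usr-local 192.168.1.0/24(ro,root_squash)

# on each desktop, in /etc/fstab:
fileserver:/export/usr-local  /usr/local  nfs  ro,hard,intr  0 0

For (2), on apt-based desktops a nightly system cron entry would do it:

# /etc/cron.d/nightly-upgrade (path and schedule are just an example)
30 2 * * * root apt-get -qq update && apt-get -qq -y dist-upgrade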

Dave
 
Old 01-18-2008, 07:03 AM   #3
rupertwh
Member
 
Registered: Sep 2006
Location: Munich, Germany
Distribution: Debian / Ubuntu
Posts: 297

Rep: Reputation: 49
Hi,

I am currently planning something similar and thought about creating one 'master' package which consists mainly of dependencies and possibly some configuration script(s).
When I want some new software installed on the workstations, I'd just add it as a new dependency. apt-get dist-upgrade on the workstations via cron would do the rest.
(So far it's just an idea -- I have zero experience with creating packages yet. Comments, anyone? A rough sketch of what I mean is below.)
(EDIT: That seems to be pretty much what Dave suggested.)
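
To make the idea concrete, the control file of such a metapackage might look roughly like this (the package name and dependency list are invented; the 'equivs' package is one easy way to turn a file like this into a .deb):

Package: site-desktop
Version: 1.0
Architecture: all
Maintainer: Admin Team <root@example.com>
Depends: perl, ruby, r-base
Description: metapackage for our standard desktop software
 Add a package to Depends and bump Version; the nightly
 dist-upgrade then pulls it onto every workstation.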

For centralized user accounts (and much more), LDAP is your friend.
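
On the client side, once the LDAP NSS/PAM modules (e.g. libnss-ldap and pam_ldap) are configured, switching account lookups over to the directory is mostly a matter of /etc/nsswitch.conf:

passwd: files ldap
group:  files ldap
shadow: files ldap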

Last edited by rupertwh; 01-18-2008 at 07:04 AM.
 
Old 01-18-2008, 11:41 AM   #4
johann_p
LQ Newbie
 
Registered: Jan 2004
Location: Vienna
Posts: 15

Original Poster
Rep: Reputation: 0
Quote:
Originally Posted by ilikejam
Hi.

There are two ways to do this that I can think of.
1) Use NFS to have a centralised file server that exports /usr/local or /opt or whatever to all the desktops - change the software on the NFS server and the changes will appear on the desktops automatically (since they're sharing the filesystem).

2) Set up your own apt/yum/whatever repository on a server, configure apt/yum/whatever on all the desktops to point to that, and set up a cron job that runs the package manager's 'update' every night to see if there have been any changes rolled out to the repository.

Dave
Thanks -- we actually have some experience with solution 1), and the problems with it are one reason why we are looking for something more like what I described earlier.
Having a central, NFS-mounted /usr/local has many problems, e.g. when administering Perl packages. I have also simplified the situation a bit by saying the computers will all have the same configuration.

As to solution 2), this is more or less what we would like to avoid if possible: a self-written hack.
Since OS packages would not be the only thing to administer centrally for groups of computers, we would need similar cron jobs or other scripts for CPAN packages, plugin updates, configuration file changes, etc.

What I was mainly wondering is this: all these tasks seem so common to everyone who has to deal with dozens or hundreds of Linux desktops that I was expecting some software solution that would be more general and more organized than solving things with self-written scripts.

Are there packages, already hacked-together scripts, or software solutions that help in doing this?
 
Old 01-18-2008, 01:39 PM   #5
ilikejam
Senior Member
 
Registered: Aug 2003
Location: Glasgow
Distribution: Fedora / Solaris
Posts: 3,109

Rep: Reputation: 97
There's a commercial offering from Sun called Update Connection Enterprise (used to be called 'Aduva Onstage') which was designed to do what you want (it's RPM/Solaris only), but it's not cheap.

Personally, I would just create your own .deb or .rpm packages and go with an apt/yum repository - .rpm files can just be containers for scripts that get run, so you can roll out a package which does a CPAN download when it's 'installed'.
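
For instance, a spec file along these lines builds an RPM whose only real payload is the %post scriptlet (the package and module names here are made up):

Name:      site-cpan-foo
Version:   1.0
Release:   1
Summary:   Installs Foo::Bar from CPAN
License:   GPL
BuildArch: noarch

%description
Empty package; all the work happens in %post.

%post
/usr/bin/perl -MCPAN -e 'install("Foo::Bar")'

%files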

Dave
 
Old 01-18-2008, 02:29 PM   #6
forrestt
Senior Member
 
Registered: Mar 2004
Location: Cary, NC, USA
Distribution: Fedora, Kubuntu, RedHat, CentOS, SuSe
Posts: 1,288

Rep: Reputation: 99
Check out cfengine (http://www.cfengine.org/). It has a fairly steep learning curve, but it will do everything you want.
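
To give a (heavily simplified) taste of cfengine 2 syntax -- server and file names made up -- this copies a master resolv.conf to every host that runs cfagent:

control:
   actionsequence = ( copy )

copy:
   /masterfiles/etc/resolv.conf dest=/etc/resolv.conf
                                server=cfmaster.example.com
                                mode=644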

HTH

Forrest
 
Old 01-18-2008, 04:45 PM   #7
johann_p
LQ Newbie
 
Registered: Jan 2004
Location: Vienna
Posts: 15

Original Poster
Rep: Reputation: 0

Quote:
Originally Posted by forrestt
Check out cfengine (http://www.cfengine.org/). It has a fairly steep learning curve, but it will do everything you want.

HTH

Forrest
This looks very interesting indeed. It will take a while to figure out and try, but we will definitely look at this!
Thanks!
 
Old 01-19-2008, 03:07 AM   #8
systemnotes
LQ Newbie
 
Registered: Apr 2007
Distribution: RHEL
Posts: 29

Rep: Reputation: 15
Aside from cfengine, we use a customized system of cron scripts so that we can force changes where needed. This works great for configuration files -- printers, /etc/resolv.conf, NIS or LDAP files, etc. -- anything that all machines should have.

Basically, there is a central repository that all machines can access. This can be NFS or AFS, depending on how secure you want it.

Machines are configured to be part of a "cluster" or "duty" through a local configuration file.

Every machine has a set of cron scripts that are called up at appropriate times through crontab.
rc.1_per_boot
rc.1_per_hour
rc.1_per_day
rc.1_per_month
rc.4_per_hour
... etc...
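
On Linux (Vixie cron), the crontab wiring for these might look something like the following -- paths and schedules are only an example; rc.4_per_hour runs four times an hour:

@reboot      /usr/local/adm/rc.1_per_boot
0 * * * *    /usr/local/adm/rc.1_per_hour
15 4 * * *   /usr/local/adm/rc.1_per_day
30 5 1 * *   /usr/local/adm/rc.1_per_month
*/15 * * * * /usr/local/adm/rc.4_per_hour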

The scripts check for the existence of a file in the shared directory structure. The machine knows which cluster it belongs to and what duties it should perform.

If there is no file in the repository, they run at their scheduled time, find nothing, and quit. If there is a file, they either copy it into place or run it as a script, depending on how you set things up.
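
Stripped down to its core, each such script amounts to something like this (the repository layout is invented for the example):

#!/bin/sh
# e.g. rc.1_per_day: act on whatever has been dropped in the repository
CLUSTER=`cat /etc/local/cluster`       # which cluster this machine belongs to
DIR=/repo/$CLUSTER/per_day             # shared (NFS/AFS) drop directory

for f in "$DIR"/*; do
    [ -f "$f" ] || exit 0              # nothing rolled out - quit
    case "$f" in
        *.sh) sh "$f" ;;               # run it as a script ...
        *)    cp "$f" /etc/ ;;         # ... or copy it into place (destination is site-specific)
    esac
done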

This works on multiple platforms (HP-UX, Solaris, Linux) if done properly.

Rather than install software, it is easier to access it from NFS using rc.1_per_day, e.g.

rm -f /usr/bin/perl
ln -s /pkgs/perl/5.6.1/bin/perl /usr/bin/perl

You probably wouldn't do this with perl, but you get the idea of how easy it is to "upgrade" the local version to a centrally located file. The problem is that you can't do this for laptops, since they require local files.

-----
As for upgrading on multiple machines: this is not an easy task. I wouldn't recommend a global yum update, because it is difficult to know what each machine has and to limit the repository. If you do this, be prepared to reimage some machines if everything breaks.

yum is a good way to do updates, but it has to be controlled and probably shouldn't be left running automatically. If someone modifies their yum.conf and an update runs, it can mess up the machine.
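
One way to keep it controlled is to make an internal repository the only one the desktops know about, e.g. (the URL is invented):

# /etc/yum.repos.d/internal.repo - the only enabled repo on the desktops
[internal]
name=Internal desktop repository
baseurl=http://repo.example.com/dist/$releasever/$basearch/
enabled=1
gpgcheck=1

An update can then only ever pull in packages you have tested and placed there yourself.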
 
  

