Old 12-31-2005, 05:56 PM   #1
CodyDH
LQ Newbie
 
Registered: Dec 2005
Location: Wisconsin
Distribution: Fedora Core
Posts: 12

Rep: Reputation: 0
Linux for Public Library?


Hi. I work at a public library and presently "manage" our network. We have 6 systems for public internet use, 2 systems for public word processing, and 5 systems for catalog access (basically, a small set of webpages needs to be accessed).

Right now we have a Windows Server 2003 box with XP clients, on which we use GPOs to limit user activities, etc. This works, but not well at all.

I have recently looked into implementing Linux for our public computing solution. I use Linux at home on my laptop, but have NO idea how this would be implemented in a networked situation. Basically, I want these systems to be as simple as possible, with only the features we want available.

I would like the computers to update on a schedule defined in a central location, and to be able to shut them down all at once, or individually as I choose.

I have looked into Userful DiscoverStation. Would this be a better solution than tweaking some other software package?

Sorry for the plethora of questions. Any kick in the right direction is gratefully appreciated.

Cody
 
Old 12-31-2005, 08:04 PM   #2
GrueMaster
Member
 
Registered: Aug 2005
Location: Oregon
Distribution: Kubuntu.
Posts: 848

Rep: Reputation: 30
There are several ways to do what you want (that's the beauty of Linux, right?). The easiest way is to install Linux on each machine with a default user account and set up autologin. By setting up autologin, you can restrict who has access to the console, and it limits hacking. For the web access machines, you can run a proxy server like Squid, which will limit the content users can get to (like NetNanny).
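
For illustration, a minimal Squid ACL fragment might look something like this (the ACL names, subnet and blocklist path are just examples, not a tested config):

  # fragment of /etc/squid/squid.conf
  acl library_lan src 192.168.1.0/24                             # the library's LAN
  acl blocked_sites dstdomain "/etc/squid/blocked-domains.txt"   # one domain per line
  http_access deny blocked_sites
  http_access allow library_lan
  http_access deny all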

If you really want to make administration easier in the long run, it will take a lot of up-front work, but you can set up each system to network boot and go from there. This takes more planning and a lot of server-side configuration, but in the end you have only one installation to update, as the other systems pick up new updates simply by rebooting.
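
As a rough sketch of the network-boot side, the DHCP server just has to point the clients at a TFTP server carrying a PXE bootloader (addresses and filenames below are examples only):

  # fragment of /etc/dhcpd.conf on the boot server
  subnet 192.168.1.0 netmask 255.255.255.0 {
      range 192.168.1.100 192.168.1.150;
      next-server 192.168.1.1;      # TFTP server holding the boot files
      filename "pxelinux.0";        # PXELINUX bootloader served over TFTP
  }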

Unfortunately, the magnitude of your project is a bit much to really cover through this forum alone. It will take a lot of planning and administrative know-how (true for any OS).
 
Old 01-01-2006, 12:21 AM   #3
cs-cam
Senior Member
 
Registered: May 2004
Location: Australia
Distribution: Gentoo
Posts: 3,544
Blog Entries: 4

Rep: Reputation: 56
You could look at perhaps using a thin-client setup. Basically, the server is the only "real" computer; the clients just use its resources etc. over the network. That would centralise everything nicely for you. Ubuntu also has thin-client technology integrated into its OS with some purty GUI admin tools and stuff; it might be worth having a look at, and it might have a lot of the hard work done for you. Not sure, never tried it.

https://wiki.ubuntu.com/ThinClientHowto
 
Old 01-01-2006, 12:33 PM   #4
CodyDH
LQ Newbie
 
Registered: Dec 2005
Location: Wisconsin
Distribution: Fedora Core
Posts: 12

Original Poster
Rep: Reputation: 0
Thanks for the kick in the right direction. I realize this is going to be an undertaking, just as setting up our 2003 server was, and I realize it's not going to be a 3-step process outlined in this forum. However, because I don't have a lot of networking experience (outside of what I've described, and using OS X, Linux and XP on a network and sharing at home), I am not really familiar with ANY of the technologies to do this.

Note that this does NOT have to be a completely free solution. I had been looking at Red Hat Enterprise setups, and Novell SuSE Linux setups... are any of these subscription/pay distributions worthwhile in making my life easier if we choose to pursue a Linux network, or would I do the same amount of work and setup with a FREE distribution?

Thank you!

Cody
 
Old 01-01-2006, 01:01 PM   #5
stress_junkie
Senior Member
 
Registered: Dec 2005
Location: Massachusetts, USA
Distribution: Ubuntu 10.04 and CentOS 5.5
Posts: 3,873

Rep: Reputation: 331
Here is a place to look for integrating Linux into a Windows network. As you will see, it is a career in itself. The thrust of it is that a Linux machine can participate in a Windows AD domain in the same way that an NT4 machine can. If you are using the older Windows network protocol, then the Linux machine can fully participate in the domain, including acting as a domain controller.

http://us2.samba.org/samba/docs/man/...TO-Collection/
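
For a taste of what the HOWTO covers, joining an NT4-style domain with Samba 3 boils down to something like the following (workgroup name and account are placeholders; the HOWTO has the full details):

  # fragment of /etc/samba/smb.conf on the Linux machine
  [global]
      workgroup = LIBRARY
      security = domain
      password server = *

  # then join the domain using a domain administrator account
  net rpc join -U Administrator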

Last edited by stress_junkie; 01-01-2006 at 01:02 PM.
 
Old 01-01-2006, 10:35 PM   #6
cs-cam
Senior Member
 
Registered: May 2004
Location: Australia
Distribution: Gentoo
Posts: 3,544
Blog Entries: 4

Rep: Reputation: 56
I don't know for sure, but I don't think using a distro like RHEL would make any difference. With pay distributions like that, the only bonus you get is official support, not specialised apps to make this sort of thing easier. If you want to try RHEL for free, download CentOS. It's RHEL with the Red Hat logos removed, all perfectly legal under the GPL.

It won't be easy, but the good thing is there are a lot of people here who are willing to help, and a few have probably done it before. I'd suggest doing some research, picking a technology, and then giving it a go somewhere. Boot them all from live CDs and practise sharing an X display over a network.
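
If you just want to experiment with running a program on one box and displaying it on another, SSH X forwarding is the quickest way to try it (hostname and user below are made up; the server needs X11Forwarding enabled in sshd_config):

  # run Firefox on the server, display it on the machine you're sitting at
  ssh -X someuser@server.library.lan firefox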
 
Old 01-02-2006, 09:10 PM   #7
CodyDH
LQ Newbie
 
Registered: Dec 2005
Location: Wisconsin
Distribution: Fedora Core
Posts: 12

Original Poster
Rep: Reputation: 0
Well, I have had little luck finding any additional information on this topic. I don't really even know what to search for. Would I be looking for an application server? Here are two models that I am presently thinking of.

1) Each client has the OS installed on it. Somehow the client computers authenticate with the server (I assume LDAP?), which gives them a permissions number (the word is escaping me now... something like 0500). Each of the client machines would have the necessary applications, and the user would only be allowed to run the programs that I allow. Somehow updates would be pushed to the clients, or the configuration to make them do it themselves would be pushed. The part of this method I'm confused about is how you determine which programs can and cannot be run. Is this purely program permissions, or what?

2) Each client has a VERY basic OS installed on it. Somehow the applications are launched from the server, such that only ONE copy of the program needs to be present, and only one set of permissions needs to be set.

Any suggestions as to which is better, etc.?

Thank you once again for helping me foray into a new frontier (for me).

Cody
 
Old 01-29-2006, 08:41 AM   #8
CodyDH
LQ Newbie
 
Registered: Dec 2005
Location: Wisconsin
Distribution: Fedora Core
Posts: 12

Original Poster
Rep: Reputation: 0
Alright, resurrecting this thread from the dead after more work-week frustrations with XP/2003.

I am taking a new look at the thin-client setup. This seems like a great model, as far as one copy of all software and only one machine to upgrade. However, I have a few concerns:

1) Server dies. What would happen if we came in one day to find our server's HD had crashed, and we were stuck? Realizing that it's necessary to have constant backups, would there be a way to clone the HD, and even possibly only clone the important parts of the HD, and quickly move them to a new machine?

2) Resources. With 15 computers, all running a web browser, some running videos, music, etc., this would probably put a heavy tax on our server, no? It's a 3GHz P4, a very nice, fast machine, but definitely not a dual-processor server.

3) Computers. All of our computers are fairly new (by that I mean 1GHz+, 256MB RAM+, 20GB HD+)... In other words, plenty of computing power and resources for each user. I'm afraid that by cutting each of these computers' advantages out, and making them all run off of one server, we would be slowing things down for everyone. Unless, perhaps, I am confused. I have been assuming that, with a thin-client setup, the server would be running the apps and simply forwarding the X data to the client, which is really just acting like a dumb terminal. Perhaps it is the case that the server forwards the program to the client, which uses its own resources to run it?

Thanks for the guidance!
 
Old 01-29-2006, 08:48 AM   #9
cs-cam
Senior Member
 
Registered: May 2004
Location: Australia
Distribution: Gentoo
Posts: 3,544
Blog Entries: 4

Rep: Reputation: 56
Quote:
Perhaps it is the case that the server forwards the program to the client, which uses its own resources to run it?
I've never done a thin-client setup, but I'm pretty sure you can do it this way. I don't have the money for enough hardware to try it. The answer to the crashing hard disk is simple: buy another and use a RAID mirror. The two disks (preferably the same size) will appear as one disk, so it's like having half the space (bad), but if one drive mysteriously dies you won't experience any downtime, just a note in your logs letting you know it happened, so you can buy a replacement drive and rebuild the array on it while the library is closed.
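
For what it's worth, a bare-bones software mirror with mdadm looks roughly like this (device names are examples; double-check them against your own disks, since this is destructive):

  # create a two-disk RAID1 mirror from two empty partitions
  mdadm --create /dev/md0 --level=1 --raid-devices=2 /dev/sda1 /dev/sdb1
  mkfs.ext3 /dev/md0      # put a filesystem on the new mirror
  cat /proc/mdstat        # watch the initial sync and check array health later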
 
Old 01-29-2006, 09:12 AM   #10
jschiwal
Guru
 
Registered: Aug 2001
Location: Fargo, ND
Distribution: SuSE AMD64
Posts: 15,733

Rep: Reputation: 655
You could have the system partitions located on the server and mounted over the network. This would centralize installation and upgrading, but wouldn't tax the server as much, because it would be working as a fileserver and not have to execute the code itself.
I think this might require that the hosts be identical. If not, perhaps the /etc, /lib and /usr/lib partitions should be local. Having /tmp and /var local is a no-brainer too. I would recommend reading the Linux Filesystem Hierarchy Standard for ideas on which partitions can be shared/static. (It can be found on the www.tldp.org website.) If done correctly, and if the hosts are similar enough, you could perform security upgrades on the central server and the programs would be updated everywhere. Also, if a web browser is the only software that should be running, you might want to google for "linux kiosk".
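
As a rough sketch of sharing a static partition this way (paths and addresses are examples, not a tested recipe):

  # on the server, /etc/exports -- export /usr read-only to the LAN
  /usr  192.168.1.0/24(ro,sync)

  # on each client, /etc/fstab -- mount the shared /usr at boot
  server:/usr  /usr  nfs  ro,hard,intr  0  0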

Also consider using a web proxy like squid to control access to the web.

This link may be helpful in this respect: http://www.faqs.org/docs/Linux-HOWTO...art-HOWTO.html

I had read about libraries that use a thin-client solution. One problem they have is with printing: printing to a GDI printer, for example, overtaxes the network. A program at a thin-client terminal is actually running on the server. The server is the X client, and the terminals are the X servers, so the graphics are traveling down the network. I think that for normal office applications like database terminals or word processing, the thin-wire solution would work better than in your case, where the users are web browsing.

One other thought is that you may want the public terminals kept separate from the network that the library uses for its normal work. Being connected to the internet presents dangers, especially if the public terminals are Windows hosts. Someone picking up malware on one of the public terminals could turn it into a zombie computer which could attack the other hosts.
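
One simple way to enforce that separation, assuming the public and staff machines sit on different subnets behind a Linux gateway (subnets below are examples), is a firewall rule on the gateway:

  # drop any forwarded traffic from the public subnet to the staff subnet
  iptables -A FORWARD -s 192.168.2.0/24 -d 192.168.1.0/24 -j DROP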
 
Old 01-29-2006, 09:13 AM   #11
CodyDH
LQ Newbie
 
Registered: Dec 2005
Location: Wisconsin
Distribution: Fedora Core
Posts: 12

Original Poster
Rep: Reputation: 0
Cool. The RAID idea is definitely what I was thinking of. (Funny, I know what RAID is and what it does, but it never occurred to me.)

Anyone have any information on what resources are really used to run a program when using this thin-client setup?

P.S. Thanks, cs-cam, for all of your help thus far.

Cody
 
Old 01-29-2006, 11:17 AM   #12
CodyDH
LQ Newbie
 
Registered: Dec 2005
Location: Wisconsin
Distribution: Fedora Core
Posts: 12

Original Poster
Rep: Reputation: 0
Also, how does video played with MPlayer in Firefox perform over a 100Mbit network? Will it be smooth?

Cody
 
Old 01-29-2006, 12:10 PM   #13
sundialsvcs
Guru
 
Registered: Feb 2004
Location: SE Tennessee, USA
Distribution: Gentoo, LFS
Posts: 5,452

Rep: Reputation: 1172
Since you have a small number of workstations to deal with, and almost no "user accounts," you can simply set up a standard Linux distribution on each machine. It should be a minimalist configuration, with only the applications you intend the public to have.

Each workstation automatically logs in to a single, very limited account. It boots and goes straight into that account and cannot go into any other. Ctrl+Alt+Del and other key sequences are disabled. X also ignores magic sequences like Ctrl+Alt+Backspace.
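
With GDM, for instance, the autologin part is just a couple of lines in its config file (the path varies by distribution, e.g. /etc/X11/gdm/gdm.conf; "kiosk" is a placeholder account name):

  # [daemon] section of gdm.conf -- log the restricted account in automatically
  [daemon]
  AutomaticLoginEnable=true
  AutomaticLogin=kiosk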

When a user session ends, it logs-out, and is immediately logged-back in again, fresh and new. The critical files in the user's home-directory are created by another user and are read-only to this one. Upon login, all other files are erased.

So, how do you log in? Through an alternate runlevel. Booting to any other runlevel, or changing the boot sequence in any way, requires a password. The BIOS likewise refuses to boot from any other device without a password.
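
On the boot-loader side, GRUB legacy can do this with a password line (the hash shown is a placeholder, generated with grub-md5-crypt; the menu file is /boot/grub/menu.lst or grub.conf depending on the distribution):

  # require a password to edit entries at boot or to start locked entries
  password --md5 $1$PLACEHOLDERHASH
  title Maintenance (runlevel 3)
      lock
      kernel /vmlinuz root=/dev/hda1 ro 3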
 
Old 01-29-2006, 01:45 PM   #14
CodyDH
LQ Newbie
 
Registered: Dec 2005
Location: Wisconsin
Distribution: Fedora Core
Posts: 12

Original Poster
Rep: Reputation: 0
This sounds like what I had originally had in mind.

A few more questions

1) What kind of script would erase "all other files"? (i.e., is there a ready-made one; if not, what would make such a script run, etc.)

2) How would I restrict what programs users can run?

Cody
 
Old 01-29-2006, 02:20 PM   #15
GrueMaster
Member
 
Registered: Aug 2005
Location: Oregon
Distribution: Kubuntu.
Posts: 848

Rep: Reputation: 30
I think that what sundialsvcs was referring to, as far as files being auto-deleted, is that you would have the home directory created at bootup in memory (a ramdisk). Rebooting would of course completely flush this out. This could also be accomplished with a login script that would just run "rm -Rf ~; cp -Rfd /etc/skel ~" (as an example).
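
Spelled out a little more, a reset script along those lines might look like this (untested sketch; "public" is a placeholder account name, and it would run with root privileges from the display manager's session or logout hook):

  #!/bin/sh
  # wipe the public account's home directory and rebuild it from /etc/skel
  PUBUSER=public
  rm -rf /home/$PUBUSER
  cp -a /etc/skel /home/$PUBUSER
  chown -R $PUBUSER:$PUBUSER /home/$PUBUSER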

As to restricting which programs a user can run, I would assume that you would create a user menu with only the programs needed (OpenOffice, Firefox, xine, etc.). I don't think users would need shell access, so there should be no terminal windows. I'd use either GNOME or KDE as the desktop environment for interoperability and ease of use (they can both be configured to resemble better-known environments, like Mac & Windows, plus they are both highly configurable).

If the thought of having all the systems running off the server is too complex, you could build an image on one system (call it a gold image), then use partimage to save it to the server and replicate it to the other systems. This works especially well if all systems are essentially the same basic hardware (same video vendor, network, IDE controller, etc.). In other words, if all the "client" machines are running the same base video (nVidia GeForce 2MX or better), the same type of hard drive (IDE with a ribbon cable, aka PATA), and the same basic monitor settings (1024x768 @ 60Hz, for example), then you take the lowest system in the pool to build the master image, and it should work on the rest without a problem. You could even get really advanced and have them auto-reimage nightly if you wanted to. This way, though, you only have one system to upgrade the image on, then propagate it to the rest.
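
The partimage round trip is roughly the following (device and file names are examples; partimage normally runs interactively, so check its options before scripting it, and note that it may split the image into numbered volumes):

  # on the gold machine: save the root partition to an image on a server share
  partimage save /dev/hda1 /mnt/server/gold.partimg.gz

  # on a client being (re)imaged: restore from the saved image
  partimage restore /dev/hda1 /mnt/server/gold.partimg.gz.000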

Also, streaming video over 100Mbit Ethernet is fine unless you are doing HD-quality video. Most internet video is either 320x200 or 640x480, so it should work with no problem. Use network switches instead of hubs if possible, as they will perform far faster (full duplex vs. half duplex, etc.).
 
  

