Linux - Newbie
This Linux forum is for members that are new to Linux.
Just starting out and have a question?
If it is not in the man pages or the how-to's this is the place!
I'd like to know what the process is by which the contents of a Linux distribution are determined. I've been a software engineer for 44 years and it's a little disingenuous of me to post in a newbie forum since I've been flirting with Linux for about 5 years now as an avocation in the hopes I could find a way out of the Redmond world in my numerous home systems.
But with the advent of Fedora Core 12, I saw what I interpret as a kind of step backward in the evolution of usability in Linux. To me, the viability of Linux as a replacement for the products offered by the birdbrains of Bellevue (no offense intended to our avian cohabitants, merely a comparison on brain volume) will depend on its ability to do everything that the competition does.
My principal Linux box has an Nvidia FX5200 AGP card, which requires the .173. version of the Nvidia drivers, and in the generations of Fedora Core leading up to 11 I didn't have any serious issues replacing the default video drivers with the drivers Nvidia provides for Linux. I felt that was necessary because one of my test cases for Linux viability as a desktop replacement is video games - particularly real-time 3D graphics such as those employed in flight simulators and FPS games. Up through FC11 it was an experience not really different from a Redmond environment: load the drivers supplied by the manufacturer. And that process has become easier over time, as Linux systems have made such installations possible inside the GUI rather than making the user type arcane commands in a shell.
But what I saw happen with FC12 and now FC13 is a decision by the packagers of a Linux distribution - in this case Fedora, though according to many websites it was a common decision amongst distributors - to employ an open source project called "nouveau" as the driver of choice for an Nvidia graphics card. The nouveau driver does not support the 3D functionality of the Nvidia hardware. But it is bundled into the distribution in such a way that you have to go back to "expert" mode in a shell to disentangle nouveau before you can even begin to replace it with the drivers from Nvidia - which actually work just fine and provide the 3D functionality. It only cost me 3 days of spare time tracking down how to fix the problems, and since I'm comfortable with shell scripting, making the necessary fixes wasn't really all that big a deal. But what bothered me was that previously working capabilities were deliberately changed, apparently amongst a larger group of Linux packagers/distributors, to switch en masse to a product that was clearly not ready for primetime. In the discussion on several sites, Linux mavens were defending the choice to go to nouveau, claiming that the "needed" functionality was there and completely ignoring that they had disabled the ability to play games requiring 3D hardware graphics. Prominent applications like FlightGear won't run with the nouveau driver.
And so I come to the question, in that context of trying to make Linux a viable desktop replacement for the products of the evil empire: who makes the decisions about what goes into a distribution? More particularly, who decides on the architecture of the core such that installing one video driver precludes the use of any other, so that you need to be competent in shell scripting to restore functionality that had worked relatively easily in previous versions? To me, the issue isn't specifically the use of nouveau so much as how the decision process works that determines the contents of the distribution. Once I know that, maybe I need to get more involved in that process. Linux has come a long way and I'd like to see it fully succeed. And when 3D game-playing can be done painlessly, whether by emulation of the evil empire or by convincing vendors to develop for it, Linux will, at last, be ready for primetime - everything else is there.
Let's see what the Fedora project says about their goals:
Fedora is a Linux-based operating system that showcases the latest in free and open source software.
Nvidia drivers are not "the latest in free and open source software;" in fact they are "proprietary" or "closed-source." This is the reason why Fedora does not "showcase" them.
Nouveau however is "the latest in free and open source software" which is why it is included with Fedora.
You may argue that, in many cases, the closed-source software is superior to its open-source alternative. This may very well be true, however, it does not change the mission and goals of the Fedora project.
If you are concerned about the direction of the Fedora project, I encourage you to get involved. That is how Open Source works. It sounds like you have some good knowledge and expertise to share.
But with the advent of Fedora Core 12, I saw what I interpret as a kind of step backward in the evolution of usability in Linux.
No, you see a step backward in the evolution of usability in Fedora. Many distros continue making it easy to install the proprietary nvidia drivers.
To me, the viability of Linux as a replacement for the products offered by the birdbrains of Bellevue (no offense intended to our avian cohabitants, merely a comparison on brain volume) will depend on its ability to do everything that the competition does.
No OS can beat Windows by merely imitating Windows. The only way to be better is to be different. GNU/Linux has a reputation as being "not for games" - and indeed, while it does have a not-bad number of games that number pales in comparison to what Windows has. As such, gaming is perhaps not a focus of many Linux distributions. (The expected release of Steam for Linux may change that though).
Also, you assume that Linux distributions necessarily care about market share. Often they do, but most Linux distributions are non-commercial, so they don't have the need for market share that commercial software does, and their developers are free to pursue other goals if they want to. Many distributions place the ideology of Free Software above matters like market share and usability - and before you criticise that, bear in mind that without people with a commitment to Free Software and a goal of a 100% Free system, the rest of us probably wouldn't have a 99% Free system.
It is also worth mentioning that Fedora is, first and foremost, the public testing ground for Red Hat Enterprise Linux. Red Hat has a strict hardware certification system and does not support hardware that requires additional 3rd-party drivers to function correctly.
In other words, Fedora users are "guinea pigs" for new features that will "trickle down" and improve the usability and compatibility of future Red Hat releases. The ideal situation is that, through active development and widespread testing, Nouveau continues to improve and eventually becomes as good as (or better than?) the 3rd-party drivers.
Thanks for the level-set - reminds me I need to read a few things more carefully that are readily available to see. I will look a little deeper to see which distro emphasizes an attempt to become a fully functioning replacement for the evil empire.
I've found the Open Office product to work well enough for primetime.
MySQL has worked for the small data problems I've given it, but I want to work with some much larger datasets in the TB size range to see how it handles that. And I guess I'll need to look at postgres now that there has been a threat to the future of MySQL. I haven't followed the most recent news, but somehow tangled up in Oracle's purchase of Sun was a veiled threat to MySQL, last I paid attention, despite Oracle's claims to the contrary.
For a media player I've been happy with VLC over most others for handling every format I've thrown at it so far without any real problems - to the point of wondering why the default player in Fedora seems to be Totem which chokes on several formats I've given it.
The SANE functionality is almost there - I had that working with a network scanner in FC11 but that's been problematic with FC12 and FC13 - not sure whose problem it is yet but it's another convoluted install that goes easily with the evil empire but is a hair-puller in FC12 and FC13. The SANE community seems to have done a good job keeping up with the variety of scanners out there and I found mine fully supported - but again have had problems with the latest incarnations of Fedora. And this is another one complicated by Canon choosing to ignore Linux in the US Market - you can find help overseas but nothing from Canon USA.
Anyhow - thanks for the cogent comments and for completely ignoring my flamebait references wrt the evil empire. Hardly worth the typing for that part but they've been a thorn in my paw for years - should try to get over it!
If it took you 3 days as a developer, perhaps you could spend 2 more writing the functionality to make it easy. Perhaps add a GUI too. Post a link to the result or blog on relevant websites. Maybe petition Fedora to adopt your code and be done with it. Why? Because sometimes it takes longer to figure it out (3 days times thousands of users) than to influence change. Perhaps not in alignment with your goals and time. But if you want things to be better, and have the ability to make it better, choosing not to contradicts your post. Who knows, maybe you could petition nVidia to PAY you to make that change on their behalf.
A good question - I'll not waste your time with a long-winded dissertation on the precise reasons other than having a 6-day-a-week job and a daughter starting college in 3 weeks - but a lot of people have those distractions and still manage to find a way. And perhaps it's the perfectionist in me that doesn't want to do a half-fast job. With my daughter leaving home I will have more time on my hands, and that's why I'm starting to dig in a little instead of pushing the problems to the side for another day. What I now know about how to solve the nVidia driver problem I got from multiple websites, where I found every piece I needed but never the whole answer in one place. It's not a 3-day task to do it - it's a 30-second task every time a new kernel update comes out from Fedora. Since first solving the problem a month or two ago there have been several kernel updates - I think 87, 109, 124, and now 147 (but don't quote me - I didn't fire up the main Linux box to check). All the steps have been described by others: blacklist nouveau in at least two places, make sure your system starts at init level 3 and not 5, and tell the X system to ignore an ABI error, since nVidia's drivers were designed for the V1.8 version of X and not the V1.9 that ships with FC12 and FC13.
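For anyone following along, those three steps sketch out roughly like this. This is an assumption-laden outline, not a recipe: the file paths are the usual Fedora 12/13 locations, but verify them on your own box before editing anything, and back up each file first.

```shell
# Sketch of the three steps above (Fedora 12/13, legacy nVidia driver).
# Paths are the typical Fedora locations -- verify on your system first.

# 1. Blacklist nouveau for modprobe...
echo "blacklist nouveau" >> /etc/modprobe.d/blacklist.conf
#    ...and again on the kernel command line, so the initramfs skips it too.
#    Append to the kernel line in /boot/grub/grub.conf:
#        rdblacklist=nouveau

# 2. Boot to runlevel 3 (text mode) so X isn't holding the console while
#    the driver installs. Edit the initdefault line in /etc/inittab:
#        id:3:initdefault:

# 3. Tell X to ignore the ABI mismatch (the legacy driver targets the
#    older X server ABI). In the ServerFlags section of /etc/X11/xorg.conf:
#        Option "IgnoreABI" "True"
```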
To do a package for that means I'll have to study the RPM system to see what's involved - probably only take a couple of hours as there's plenty of documentation floating around and I've done dozens if not hundreds of 'make' files - but to do it right I'll need to make some assumptions about how users want to operate their systems - do they want to come up in terminal mode or always go all the way to GUI - should I alias 'startx' - and about how to detect a kernel update since every time the kernel changes I have to recompile the nvidia driver package. And to get around that I'll have to study why nVidia chose to use a kernel module method rather than a dynamically loaded one - and I'm sure there's 1000 people out there who already know why and can help with that - so you're right, if I'm such a whiz why don't I sit right down now and do that.
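On the "detect a kernel update" point, one simple approach is a stamp file: remember which kernel the module was last built against and compare it to `uname -r` at boot or login. A minimal sketch, where the stamp-file path and the installer filename are my own assumptions, not anything Fedora or nVidia ships:

```shell
#!/bin/sh
# Sketch: rebuild the NVIDIA kernel module only when the running kernel
# differs from the one we last built against. STAMP is a hypothetical
# location for remembering the last build; override it as you like.

STAMP="${STAMP:-/tmp/nvidia-built-for.$$}"
current="$(uname -r)"
last="$(cat "$STAMP" 2>/dev/null || true)"

if [ "$current" != "$last" ]; then
    echo "kernel changed ('$last' -> '$current'); NVIDIA module rebuild needed"
    # This is where you'd re-run the legacy installer, e.g. (filename is
    # illustrative only):
    #   sh NVIDIA-Linux-x86-173.14.xx-pkg1.run --ui=none
    echo "$current" > "$STAMP"
else
    echo "kernel unchanged; nothing to do"
fi
```

Wrapping that in an init script (or a cron @reboot job) would get the "30-second task" down to zero keystrokes, leaving only the actual driver compile.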
But to me, that all starts by listening to what other folks are doing right now so I don't wind up doing something that someone else is already working on. And that's why I'm starting to post actively after lurking in the shadows for 5 or more years. Maybe there is something I can do that's useful to the larger community. I'm just setting out on that road and we'll see where it goes. After I drop my daughter off 700 miles from home in mid-August I'll dig deeper.
There has been an awful lot of good work done by a lot of people working on core functionality of Linux who've made the time to make it happen.
I've barely scratched the surface so far, looking at it only as a user hoping to replace my many Windows systems with something I can actually control - something that doesn't insult me at every turn with meaningless error messages or treat me as too stupid to understand the problem. The Linux community has so far proven to be vastly different, with most replies to posts on troubleshooting forums providing real answers instead of snide references to vague documentation in infinitely-looped KB articles.
We'll see how this all develops. You know the old saw about the infinite number of monkeys at the infinite number of typewriters - maybe there's a seat for me in there somewhere!
If you've managed to get that fix/update down to a few cmds that take 30 secs to run, at least post that in a few places and send to a few distros eg Fedora (especially if it's likely Fedora specific).
Some one else may get inspired and get it included in distro(s).
Share your knowledge, however little; if it works it's worth posting.
As above, Fedora is RedHat's R&D bleeding edge distro; do not expect stability or predictability.
OTOH, it can be interesting to spin up occasionally to see what people are thinking.
If you want a close-to-MS look, try Ubuntu or Mint.
A good solid distro would be Centos (free version of RedHat Enterprise Linux aka RHEL).
Take a look at any of the top 10 or so at distrowatch.com.
Also, the true Linux enthusiasts (as opposed to fan boys) just want a good free open source system; they're not interested in taking over the world, although a large user base would help with eg getting HW & big SW App manufacturers to support Linux.
unSpawn - You Dawg!!! That RFC rocked! I've had to read (and re-read) so many RFCs over the years but I have never seen that one - and it's 10 years old!!! Never knew it existed. I've got to work that into a spec!!!! I work for one of the telecomm giants and I now know the protocol they must be using for the Architecture Committee! Thanks - made my day!
chrism01, et al, I will go find the right place in this forum to put in writing the basic setup and the few steps it takes once you have things setup to keep nVidia current with new kernel releases. The method I use seems to be required for the older driver sets which nVidia no longer supports, like if you're using the .173. drivers. And I think I have an old RIVA TNT card I could resurrect and try the method out on the even older set of drivers nVidia produced for Linux that are no longer maintained. I think the install of later model drivers isn't as problematic based on what I've read in various forums but I don't have Linux running on anything more modern from nVidia - YET! We'll see.
I do have one box running Ubuntu for about a year now and I like it - it was a very painless install on an older generation machine with a Cyrix processor, 550MHz and 384MB of RAM and it hums right along - a lot faster in getting things done like typical word processing and spreadsheet work than the XP environment that was on it originally. I hadn't heard of the CentOS distribution so I'll look into that one.
As to the market share proposition - I think what I'm really driving at is that I'd like to see Linux succeed and take market share because it is not predicated on the concept that users are idiots who can't be trusted - so stupid that we have to make them do what we want, whether that's in their best interests or not, while ignoring standards and putting our own arrogant opinions into code to deliberately undermine standards efforts. I can't say that the following is literally true - it's more an opinion based on observation over the years - but it appears Microsoft's efforts to own the web were predicated on the idea that the real money lay in giving websites the power to take over the desktop so as to market to people - to become the commercial TV supplier, so to speak, by building widgets that website developers would use to throw commercials at you, as if your desktop really belonged to Microsoft the way your TV used to belong to the old TV networks. And despite the warnings of the computing community, and their fairly early experience with security problems, they continued to emphasize the marketing widgets at the expense of data privacy and reasonable security. And (conjecture - not provable) that attitude of supporting marketing capabilities rather than robust security spilled over into the operating system product line, resulting in poor quality and even poorer performance over the years despite the near-quantum leaps in computing power.
Now I'm probably a little naive here - but I think it comes from having grown up with computing - and perhaps the idealism of those people drawn to engineering - but with 20 years to get Windows stable they really haven't done a very good job. I'd really like to see the results of the questionnaires they have to have circulated and the people they interviewed to make them think they have had to change windows the way they have. It looks to me more like change for the sake of change rather than an architected plan to achieve perfection and stability.
It's in that context that I'd like to see Linux succeed - to get into the hands of the users of computing a system which serves their needs rather than tries to control their behavior and limit their choices.
But in the long run, the home computer will be a passing fad and probably gone from collective consciousness in another generation. In the end, our cell phones and our home entertainment systems will be our gateways to information we need when we want it. And petty squabbles over which OS/GUI succeeds and dominates will simply fade away. But for the next 10 years it'll be a fun playground to be in. The future, if you listen to all the server hardware marketeers, is in the cloud, where speed, security, and efficiency will dominate - areas where Windows has never done particularly well compared to Linux and UNIX and even mainframes. An old friend opined back in the 70s that he would buy a personal computer when he could walk up to one and say "What is the capital city of Idaho?" and it answered "Boise!" With the advent of the 'net, the power of handheld devices, and high-resolution displays projected or hanging on the wall of your home, who really needs a computer heating up the house and requiring upgrades and other maintenance when you could ask your TV or your cellphone a question, or ask for a piece of entertainment, and have it respond immediately with what you wanted or needed?
That tech, while mostly existing tech, is still another 50+ years out IMO. Too much patent and copyright side-stepping to make it an affordable reality today - for anyone other than Uncle Sam, who seems to write his own rules as needed.
Personal computers are not going away IMO. The home studio is becoming the new mainstream. And we're far from making that happen in realtime, much less by voice recognition. Factor in hitting upon the limits of physics with current technology and probably not in my lifetime for what you describe. Unfortunately you almost can't exist these days without a computer. Even if you only use it for the sake of mapquest.
The home phone is no longer in everyone's home, so what is that going to do for the printed phone books that we used to rely upon so much? In a crisis we don't even have power, much less internet to know who to call for help. Sure, 911, but that'd be a bit like trying to use a phone on New Year's Eve in the 1980s: all circuits are busy, please try again later. In short, we're nowhere near that kind of workflow, even though all of the pieces currently exist to one degree or another.
Perhaps when your newspaper comes on DVD or Blu-ray and not a stack of colored wood (for all intents). Perhaps when dialup isn't the ONLY option in some regions. Perhaps when every household generates its own power via solar, wind, geothermal, or any number of options. Perhaps when cars drive themselves and fly through the air. We were selling tickets to space 30+ years ago, and who has actually been able to use their tickets yet? Perhaps when there's a colony on the moon - tech that we've had since 1969, for all intents. From a certain POV, that's where we should be, could be, and have otherwise NOT gone (yet). Meanwhile, where are we currently at? Down here on this blue ball, trying to keep religious zealots from blowing up our toys.