Some help for anyone having trouble with OpenGL, nVidia cards and Xorg
I could get flamed for stating the obvious here, but in my Gentoo 64 installation this was giving me the screaming heebies until I fixed it 5 minutes ago.
No matter what I did, emerges out the wazoo, whatever, I couldn't get OpenGL working with my GF5200 card under Xorg. glxinfo complained that it couldn't find an RGB visual, and so on. I finally found the problem in /var/log/Xorg.0.log.
NVidia's OpenGL apparently only works at 16- or 24-bit depth. When my KDE session started up, it would change resolutions on me before settling down, and nothing GL-based would work. The reason is that no matter what Depth values you list in the Display subsections, the Screen section of xorg.conf needs a DefaultDepth entry of 16 or 24 for OpenGL to load properly. The relevant part of my Screen section looks something like this:
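(The identifier names here are just examples; keep whatever your own Device and Monitor sections are called. The line that matters is DefaultDepth.)

Section "Screen"
    Identifier   "Screen0"
    Device       "nvidia0"      # your Device section's Identifier
    Monitor      "Monitor0"     # your Monitor section's Identifier
    DefaultDepth 24             # 16 or 24 -- this is what lets OpenGL load
    SubSection "Display"
        Depth    24
        Modes    "1280x1024" "1024x768"
    EndSubSection
EndSection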
That is true. You see, 32-bit color isn't really 32-bit color; it is 24-bit color with an 8-bit Z-buffer (if you don't believe me, load up Windows sometime and compare the actual number of colors in 24-bit and 32-bit mode... yes, they are the same). It is my understanding that the nVidia driver assumes that buffer is there, so it uses a maximum of 24-bit color.
OK... I need to clarify: some graphics card vendors use the extra 8 bits in "32-bit" color for a Z-buffer (or sometimes other buffers). I believe nVidia does this, but I'm not totally sure.
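To make that concrete, here is a little C sketch (my own illustration, nothing driver-specific): a padded "32-bit" pixel carries the same 24 bits of color as a packed 24-bit one, with the top byte left over for whatever the driver wants to do with it.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* A "32-bit" pixel: the top byte is spare, the low 24 bits are color. */
    uint32_t pixel32 = 0x00336699;        /* xx RR GG BB */
    uint8_t r = (pixel32 >> 16) & 0xFF;   /* 0x33 */
    uint8_t g = (pixel32 >> 8)  & 0xFF;   /* 0x66 */
    uint8_t b =  pixel32        & 0xFF;   /* 0x99 */

    /* Either way there are only 2^24 = 16,777,216 distinct colors. */
    printf("r=%02X g=%02X b=%02X, distinct colors = %lu\n",
           r, g, b, 1UL << 24);
    return 0;
}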
The biggest reason for 32-bit color originally was so that the data would always be aligned to a single word. For example, if you have 24-bit color (24-bit color = true color = 16,777,216 colors), then three colors held in memory back to back would look something like this:
word 0 (bytes 0x00-0x03):  11 11 11 22
word 1 (bytes 0x04-0x07):  22 22 33 33
word 2 (bytes 0x08-0x0B):  33 xx xx xx
I hope this wasn't too confusing, but the whole point is that the 2nd and 3rd colors overlap the word boundaries, so it is much harder to get at the data on a machine with 32-bit words (you need both words in memory instead of just one, which means you often end up loading 64 bits of data just to access 24 bits).
In 32-bit color you ensure each pixel gets a full word, and you never have to deal with anything crossing a word boundary. This trick also works on 64-bit machines because the word size is an exact multiple of the color size (one 64-bit word holds exactly two 32-bit pixels).
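If you want to see the pattern, here is a rough C sketch (my own, just to illustrate the arithmetic): a packed 24-bit pixel n starts at byte offset 3n, so half of them straddle a 4-byte word boundary, while padded 32-bit pixels always start on one.

#include <stdio.h>

int main(void)
{
    for (unsigned n = 0; n < 8; n++) {
        unsigned off24 = n * 3;   /* byte offset of packed 24-bit pixel n */
        unsigned off32 = n * 4;   /* byte offset of padded 32-bit pixel n */
        /* A 3-byte pixel crosses a word boundary if its first and last
           bytes fall in different 4-byte words. */
        int crosses = (off24 / 4) != ((off24 + 2) / 4);
        printf("pixel %u: 24-bit at byte %2u (%s), 32-bit at byte %2u (always aligned)\n",
               n, off24, crosses ? "crosses a word boundary" : "fits in one word",
               off32);
    }
    return 0;
}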
My understanding from nVidia tech support is that they byte-align their colors by default, so when you set the display mode all you are really changing is the reported number.