Old 06-21-2004, 06:00 AM   #1
nsX
LQ Newbie
 
Registered: Nov 2003
Location: Germany
Posts: 16

Rep: Reputation: 0
Color depth on Fedora Core 2 and nvidia


Hi there!

I changed my default color depth from 16 bits to 24 bits in /etc/X11/xorg.conf.
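For reference, the change is in the "Screen" section; mine looks roughly like this (the identifiers are just examples and will differ between systems):

Code:
Section "Screen"
    Identifier   "Screen0"
    Device       "Videocard0"
    Monitor      "Monitor0"
    DefaultDepth 24              # was 16 before
    SubSection "Display"
        Depth    24              # must match DefaultDepth
        Modes    "1024x768" "800x600"
    EndSubSection
EndSection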

32 bits are not supported by the nvidia driver.

So far, X runs very well, but if I start an OpenGL application (a screensaver, Quake, or Rune), nothing happens. Quake, for example, resizes my screen to 1600x1024 and then exits. Rune only makes my screen brighter and exits too.

With 16-bit color, OpenGL worked well.

As far as I can see, there are no error messages in the Xorg log files.

Any suggestions?


-nsX
 
Old 06-21-2004, 07:08 AM   #2
qwijibow
LQ Guru
 
Registered: Apr 2003
Location: nottingham england
Distribution: Gentoo
Posts: 2,672

Rep: Reputation: 47
Quote:
32 bits are not supported by the nvidia driver.
32 bits ARE supported by the nvidia driver. It is just the graphical environment (X) that does not support a 32-bit colour depth.

MS Windows, for that matter, does not really support 32-bit colour either.

In 32-bit colour there are 8 bits of red, 8 bits of green, 8 bits of blue, and 8 bits of alpha.

(Alpha is transparency.)

In Windows, the colour depth is set to 32, but because the desktop doesn't support hardware-accelerated transparency, the 8 alpha bits are permanently set to 0.

In Linux, the colour depth is simply 24 bits.

It's basically like Windows saying it has 5 fingers and Linux saying it has 4 fingers and a thumb. Same thing.
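To put numbers on it: both modes give exactly the same number of visible colours, since the alpha byte is padding/transparency rather than extra colour. A quick sanity check in bash (just illustrative):

Code:
# 8 bits red x 8 bits green x 8 bits blue = 2^24 distinct colours,
# whether the pixel is stored in 24 bits or padded out to 32.
printf '%d\n' $((2**24))    # prints 16777216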

Anyway...

What happens when you run the command glxgears? This will test OpenGL.

Also, what's the output of "glxinfo"?
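Something like this is the usual quick check (exact output varies by driver version):

Code:
glxgears                            # should open a window of spinning gears and print frames per second
glxinfo | grep "direct rendering"   # should say: direct rendering: Yes
glxinfo | grep "OpenGL renderer"    # should name the nvidia card, not the software Mesa renderer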

Run Quake as follows:

strace -o /home/username/quakeError.log /path/to/quakeBinary

After it crashes, open quakeError.log and read up from the bottom. The errors will tell you whether you are missing any file dependencies or whether your /dev/nvidia* permissions are too restrictive.
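As a shortcut, something like this should surface the usual suspects (the log path is the one from the strace command above):

Code:
# failed opens near the end of the trace are the interesting ones
grep -E 'EACCES|ENOENT' /home/username/quakeError.log | tail -n 20

# the device nodes should be readable and writable by your user
ls -l /dev/nvidia*
# crw-rw-rw- is fine; if they are more restrictive, as root: chmod 666 /dev/nvidia*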
 
Old 06-21-2004, 07:35 AM   #3
nsX
LQ Newbie
 
Registered: Nov 2003
Location: Germany
Posts: 16

Original Poster
Rep: Reputation: 0
Quote:
32 bits ARE supported by the nvidia driver. It is just the graphical environment (X) that does not support a 32-bit colour depth.


Hmm... I don't know. When I try to run X in 32-bit color depth mode, the nvidia module says 32 bit is not supported. Anyway... it doesn't matter.

I checked glxgears and glxinfo. Both tools run without any errors; about 3000 fps in glxgears on my laptop.

My screensavers work now too, but Quake and Rune still do not. Quake tries to set the screen resolution to 1600x1024, but my maximum resolution is 1024x768 (laptop), so I have to scroll with my mouse. At the top left of the screen there is a black rectangle 1024x768 pixels in size.
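(Side note: to see which modes X actually has available, and to snap back when a game leaves the screen in a strange state, xrandr helps, assuming the RandR extension is enabled, which it normally is with X.org:)

Code:
xrandr        # lists the available resolutions and marks the current one
xrandr -s 0   # switches back to the first (usually the native) mode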

I don't know what the problem is, because I only changed the Depth value in /etc/X11/xorg.conf from 16 to 24. If I change it back, the games don't work either...


-nsx
 
Old 06-22-2004, 08:05 AM   #4
qwijibow
LQ Guru
 
Registered: Apr 2003
Location: nottingham england
Distribution: Gentoo
Posts: 2,672

Rep: Reputation: 47
Quote:
Hmm... I don't know. When I try to run X in 32-bit color depth mode, the nvidia module says 32 bit is not supported. Anyway... it doesn't matter.
Yeah, X doesn't support a 32-bit colour depth.

I have a GeForce4 (64 MB) Nvidia graphics card, and Unreal Tournament 2004 runs excellently in 32-bit colour.

So is everything fixed and working now?

If not, what did glxinfo say?
What errors were reported by strace?
 
Old 06-22-2004, 09:12 AM   #5
nsX
LQ Newbie
 
Registered: Nov 2003
Location: Germany
Posts: 16

Original Poster
Rep: Reputation: 0
Yeah, it works now!

I do not know what I did, but it seems to have been the right thing.

You say X does not support 32-bit color depth, but you play UT in 32-bit color depth. How does that work? Or does X have nothing to do with the games? I can play Rune in 32-bit mode now, and it looks much nicer. Still, it is a little confusing that X does not support 32 bit, because I always thought games just create a screen-sized widget which uses the same color depth as X does. That seems to be wrong, huh?

OK, thanks for your help, guys!


-nsX
 
Old 06-22-2004, 09:59 AM   #6
qwijibow
LQ Guru
 
Registered: Apr 2003
Location: nottingham england
Distribution: Gentoo
Posts: 2,672

Rep: Reputation: 47
Maybe the alpha channel (8 bits of the 32) is only used internally, within the OpenGL binaries and the game. Maybe after the alpha channel has been used to blend the other 24 bits of colour, it can be discarded?

Or maybe X hands over control of that part of the screen to the nvidia binaries.

I'm not sure, just guessing.
 
  

