NVidia GeForce 8400 GS Framerate
The following is my glxgears output:
29761 frames in 5.0 seconds = 5949.690 FPS
30303 frames in 5.0 seconds = 6046.559 FPS
30287 frames in 5.0 seconds = 6056.364 FPS
30266 frames in 5.0 seconds = 6052.519 FPS
30280 frames in 5.0 seconds = 6055.456 FPS

Is this a pretty standard framerate for this card (an 8 series GeForce)? This is my glxinfo: Code:
name of display: :0.0

Is there any reason why DRI shouldn't be running? I have Desktop Effects enabled through GNOME...

JF |
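For anyone checking the same thing: whether direct rendering (DRI) is active can be read straight out of glxinfo. Here is a minimal sketch that parses a sample capture; the sample lines below are illustrative assumptions, not the original poster's actual output (on a live system you would simply run `glxinfo | grep -E "direct rendering|renderer"`):

```shell
#!/bin/sh
# Embed a sample glxinfo capture so the snippet is self-contained.
# (These lines are illustrative, not real output from the poster's card.)
sample_glxinfo() {
cat <<'EOF'
name of display: :0.0
direct rendering: Yes
OpenGL renderer string: GeForce 8400 GS/PCI/SSE2
EOF
}

# Report whether direct rendering is enabled in the captured output.
if sample_glxinfo | grep -q '^direct rendering: Yes'; then
    echo "direct rendering enabled"
else
    echo "direct rendering disabled (indirect or software path likely)"
fi

# Show which renderer the GLX stack reports.
sample_glxinfo | grep '^OpenGL renderer string'
```

If the "direct rendering" line says "No", 3D is going through the slow indirect path regardless of what Desktop Effects reports.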
Using glxgears to benchmark a video card is inaccurate and stupid. The glxgears utility was designed to show that OpenGL is working properly, so programmers and users can create or run OpenGL programs. A better test is to download benchmark scripts for Unreal Tournament and Quake, then compare the results with others on the internet.
SUSE has its own way of setting up video cards; read the documentation that nVidia provides for SUSE. nVidia's driver does not use DRI because it uses its own library for 3D rendering. You are not using their library; you are using the X11 3D rendering library. This is OK, but it will not support all the 3D features that your card can handle. On my setup, I get between 30 and 80 FPS in Unreal Tournament 2004 using a GeForce 8400M GS and an Intel Core 2 Duo (T7300) 2x2.0 GHz, at 1440x900. I have the settings at low because I prefer those settings over high. It is a Dell Inspiron 1520. |
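To tell which library is actually doing the rendering, the vendor string from glxinfo distinguishes nVidia's proprietary stack from the X11/Mesa one. A hedged helper sketch follows; the sample strings passed in are illustrative assumptions (on a live system the input would come from `glxinfo | grep "OpenGL vendor string"`):

```shell
#!/bin/sh
# Classify an OpenGL vendor string as nVidia's proprietary stack
# or the Mesa/X11 stack.
classify_gl_stack() {
    case "$1" in
        *NVIDIA*) echo "proprietary nVidia libGL" ;;
        *Mesa*)   echo "Mesa (X11/DRI or software) libGL" ;;
        *)        echo "unknown GL stack" ;;
    esac
}

# Illustrative sample strings, not captured from a real machine:
classify_gl_stack "OpenGL vendor string: NVIDIA Corporation"
classify_gl_stack "OpenGL vendor string: Mesa project: www.mesa3d.org"
```

If the vendor string mentions Mesa rather than NVIDIA, the proprietary driver's libGL is not the one being used, whatever is installed.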
Well, first of all... I appreciate your input and information.
Second, asking whether the framerate from glxgears was normal (considering the fact that I showed the output from glxinfo) was not stupid. Assuming that I was using it as a benchmark was stupid. glxgears just happens to be what most people have available to them, and its output is directly related to the type of drivers you have installed on your system as well as the X configuration. Just for more information (in case the rest of the system makes any difference), I'm running an AMD AM2 Athlon 64 2x2.8 GHz, 4 GB of 667 MHz DDR2, and of course my 256 MB GeForce 8400 GS PCI-E graphics card. Instead of just telling me that I'm not using the nVidia rendering library, why don't you tell me where I can get information on installing it? I checked nVidia's website and all I could find was information on their latest drivers.

JF |
The glxgears utility is provided ONLY to test whether OpenGL is working. If it crashes, the GUI crashes, or the whole computer crashes while running it, something is wrong with the modules and libraries that handle OpenGL. Again, it is not a benchmark program, because a software 3D renderer such as Mesa3D can render its frames as fast as hardware 3D rendering can.
Please read http://wiki.cchtml.com/index.php/Glx...ot_a_Benchmark. No offense, but I should not be doing your work of finding the link or documentation on nVidia's site. It is on nVidia's page when you download the Linux driver. If you do not want to follow the special procedure for your distribution, use another distribution like Gentoo; Gentoo makes installing the nVidia module (drivers) easier than nVidia's own installer does. HINT: Do a text search for "SUSE" on nVidia's web page titled Linux Display Driver. |
Quote:
http://en.opensuse.org/Nvidia#Installation |
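For reference, the repository method that page describes boils down to adding the nVidia repository and installing the matching driver package with zypper. The lines below are a sketch only; the repository URL and package name are placeholders that vary by openSUSE release and card generation, so check the wiki page above for the real values:

```
# Sketch only -- <nvidia-repository-url> and <x11-video-nvidia-package>
# are placeholders; see http://en.opensuse.org/Nvidia for your release.
zypper addrepo --refresh <nvidia-repository-url> NVIDIA
zypper install <x11-video-nvidia-package>   # pick the package for your GPU generation
```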
I went ahead and used the one-click install and got the drivers loaded, but ran into another problem altogether: overscanning.
It appears that the NVIDIA utility for Windows has taken care of that by including the ability to use underscanning and to select the digital output type (1080i, 720p, etc.), and the version for XP x64 even has an HDTV resizing feature to get the most out of your screen. This, however, has not yet been implemented (from what I have seen) in the Linux version of the utility. I know I should start a new thread on this, but do any of you know how to take care of this overscanning problem on HDTVs?

BTW, I wasn't asking for anyone to do the work for me. As I stated, I looked on nVidia's website and only found information on their drivers... no explanation as to how they happen to use a different rendering library or anything.

Thanks, JF |
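One thing worth checking while waiting for a proper answer: some versions of the nVidia Linux driver README document a "TVOverScan" X config option (for TV-class outputs) that adjusts overscan in the range 0.0 to 1.0. Whether it applies to a particular HDTV connection depends on the driver version and output type, so treat the xorg.conf fragment below as a hedged sketch to test against your driver's README, not a confirmed fix:

```
# Hypothetical xorg.conf fragment; "TVOverScan" is documented in some
# nVidia Linux driver READMEs for TV outputs (0.0 = least overscan).
Section "Device"
    Identifier "Videocard0"
    Driver     "nvidia"
    Option     "TVOverScan" "0.0"
EndSection
```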