new nVidia FX5200, crappy picture, lots of screen noise
Hello everybody.
I just installed an nVidia FX5200. I'm running X.org 6.8, and I've tried it on kernels 2.4.26 and 2.6.8.1 with the most recent nvidia drivers, and on 2.6.9 with the patched driver. No matter what setup I use, there's a lot of screen noise, including a faint flickering everywhere and a couple of lines down the right side of the screen. I've tried 800x600, 1024x768, and 1280x1024 at both 16 and 24 bit depths. I tried using the nvidia-settings tool to clear it up, and nothing I can change there helps.
I just upgraded from a Radeon 7000VE, and I never had distortion like this, so I know it's not a monitor problem, although it does seem to have something to do with the refresh rates. The strange thing is that the same distortion occurs in the framebuffer console too, but the BIOS and LILO screens are perfectly clear; it doesn't get bad until the framebuffer starts. I checked the Xorg log, and the only thing I noticed was a failure to load the "GLcore" module, saying that the module doesn't exist. I doubt this has anything to do with the distortion, as it's present in the framebuffer--well before any GL stuff loads.
If anyone can help me out here, I'd really appreciate it--I don't know what else to try.
Yeah, I strongly recommend posting your xorg.conf file. I have an nVidia FX5200 as well, just installed the driver for it, and had to modify my xorg.conf file.
Though upon installing it while outside of X, it did say something about needing to remove the rivafb module, which I can't seem to find. I think maybe the system took it out for me.
I could use some input on that from someone.
RivaFB doesn't come enabled by default in Slackware so you shouldn't worry about it.
Can you try the video card on some other OS? Try using vga=normal to avoid the framebuffer, which actually uses a graphics mode.
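A sketch of the lilo.conf change being suggested (the kernel image and root paths here are placeholders--use whatever your existing entry already has):

```
# /etc/lilo.conf -- illustrative fragment only
image = /boot/vmlinuz      # placeholder: your kernel image
  root = /dev/hda1         # placeholder: your root partition
  label = Linux
  vga = normal             # plain text mode, no framebuffer console
```

Remember to rerun /sbin/lilo after editing, or the change won't take effect at the next boot.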
Thanks for all the input so far.
Here's my xorg.conf.
For some reason, the GLcore module (as specified by nVidia's README) will not load--see the warning below. I know that fast writes are not supported by my motherboard chipset, and I'm not sure how to set PageFlips and AGP mode. For some reason the Xorg log shows warnings when using the standard Xorg options for setting these values (they're commented out in my xorg.conf now). The warnings are as follows:
Code:
kris@bucksnort:~$ grep '(EE)' /var/log/Xorg.0.log
(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
(EE) Failed to load module "GLcore" (module does not exist, 0)
kris@bucksnort:~$ grep '(WW)' /var/log/Xorg.0.log
(WW) warning, (EE) error, (NI) not implemented, (??) unknown.
(WW) `fonts.dir' not found (or not valid) in "/usr/X11R6/lib/X11/fonts/local/".
(WW) Open APM failed (/dev/apm_bios) (No such device)
(WW) Warning, couldn't open module GLcore
(WW) NVIDIA(0): config file hsync range 30-69kHz not within DDC hsync ranges.
(WW) NVIDIA(0): config file vrefresh range 50-110Hz not within DDC vrefresh ranges.
(WW) NVIDIA(0): Not using mode "700x525" (height 1050 is larger than
(WW) NVIDIA(0): EDID-specified maximum 1024)
(WW) NVIDIA(0): Not using mode "1152x768":
(WW) NVIDIA(0): horizontal sync start (1178) not a multiple of 8
(WW) NVIDIA(0): Not using mode "576x384":
(WW) NVIDIA(0): horizontal sync start (589) not a multiple of 8
(WW) NVIDIA(0): Not using mode "360x200":
(WW) NVIDIA(0): horizontal sync start (378) not a multiple of 8
(WW) NVIDIA(0): Option "EnablePageFlip" is not used
(WW) NVIDIA(0): Option "AGPMode" is not used
My entire Xorg log is here if you want to see it, but everything else is normal...except for the aforementioned warnings and errors. (Note that the "GLCore" module name in the log is incorrect--I accidentally capitalized the "C". That has been changed, and I still get the same error: module does not exist.)
I did change lilo.conf to use "vga=normal". I guess you could say it "fixed" the framebuffer problem, as it uses the same standard VGA settings as the BIOS, but...big fonts suck, and no Tux splash at the top. I'm still getting the same distortion once I start X, no matter what resolution or depth I use.
Any suggestions?
I have the same card, and don't even have that line.
Since it just reports that the module does not exist, comment it out:
# Load "GLcore"
As to the distortion, my best guess is it's a refresh rate issue.
Check the exact refresh rates of your monitor.
Make sure your screen res & refresh rate for it are right.
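For example, the sync ranges from the monitor's manual go in the Monitor section of xorg.conf. The numbers below are just the ranges already quoted in this thread's log output, not values to copy blindly:

```
Section "Monitor"
    Identifier  "Monitor0"
    # Take these ranges straight from the monitor's manual.
    HorizSync   30.0 - 69.0     # kHz
    VertRefresh 50.0 - 110.0    # Hz
EndSection
```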
Only weird thing, I have found that with this card, I need to add some ram under Device..
Section "Device"
Identifier "VESA Framebuffer"
Driver "nvidia"
VideoRam 4096
EndSection
If I change or comment out the VideoRam setting, it doesn't work as well.
Not really sure why, and I have been unable to find an answer via Google.
Last edited by nick_th_fury; 10-30-2004 at 07:19 AM.
My Xorg log detects the correct amount of video memory (128 meg), so I haven't tried defining it explicitly in my xorg.conf....I'll have to try that when I get home.
As for the refresh rates, I specified the ranges as they were defined in the manual for my monitor. For some reason, my Xorg log shows a warning about the refresh rates, but I've heard that in this situation it's best to go with the monitor's specs and not worry about what Xorg says. Is it possible to define specific refresh rates for each resolution/depth? Or, in the case of my monitor, where the specs only list a range, do you just have to let Xorg figure it out? I think the refresh rates could be causing the distortion problem, and if you can specify a particular horizontal and vertical refresh rate for a particular resolution, that may be the fix I need. If it's possible, then I'm faced with the task of finding out the correct rates for my monitor at 24-bit 1280x1024...and 1024x768 (my framebuffer).
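For reference, one way to tie a resolution to a specific refresh rate is an explicit Modeline in the Monitor section; the gtf utility shipped with Xorg can generate the timing numbers. A sketch--the numbers below are the standard VESA timings for 1280x1024 at 75Hz, and gtf will produce similar (not identical) values, so verify against your monitor's specs before using them:

```
# Generate timings for a given width, height, and refresh rate:
#   gtf 1280 1024 75
# Then paste the resulting line into the Monitor section:
Modeline "1280x1024@75" 135.00 1280 1296 1440 1688 1024 1025 1028 1066 +HSync +VSync
```

The mode can then be listed by name in the Modes line of the Screen section.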
Quote:
Originally posted by nick_th_fury
Only weird thing, I have found that with this card, I need to add some ram under Device..
Section "Device"
Identifier "VESA Framebuffer"
Driver "nvidia"
VideoRam 4096
EndSection
This seems odd to me. Do you have your monitor referring to the "VESA Framebuffer" device somewhere? It doesn't seem like this should make any difference unless you had another monitor section that only used the "VESA Framebuffer" device. And then I don't know how that would have any effect on the monitor using the card's regular device entry. nick_th_fury, would you mind posting the other device/screen sections of your xorg.conf?
You should also comment out Load "dri"...I believe the nvidia docs tell you to have only "glx" loaded for it to work properly. If after that you are still having problems in the framebuffer, it might be that conflicting modules are being loaded.
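A sketch of what the Module section looks like with that change (the exact set of other Load lines will vary with your setup; "extmod" and "freetype" here are just typical defaults):

```
Section "Module"
    Load  "glx"        # the nvidia driver supplies its own GLX
    # Load "dri"       # commented out per the nvidia README
    # Load "GLcore"    # not needed with the nvidia driver
    Load  "extmod"
    Load  "freetype"
EndSection
```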
Thanks for all the help everybody....turns out my old Gateway EV700 is a hunk of sh**. I swapped it out for a ViewSonic I had sitting in the closet, and BOOM, the distortion is gone. For some reason the Gateway maxed out at 63.9kHz H and 60Hz V at 1280x1024--that was the problem. I reinstalled the driver for 2.6.9, added a monitor section for the ViewSonic in my xorg.conf, rebooted, and the problem is solved. Even the framebuffer is fixed. Now running 1280x1024 at 80kHz H and 75Hz V. No distortion.
Thanks again. Now I just need to figure out how to fully tweak the FX5200.
Glad to hear you have solved your issue.
I have one thing to add though.
I also use a Gateway EV700 monitor and don't have any issues with the framebuffer or refresh rates.
I think it is your particular monitor that was going bad.
Check out the nvidia README file (it should be in your doc folder somewhere); they list all the options for tweaking the nvidia driver in an appendix at the end--there's some pretty nifty stuff in there.
SG_1, it is entirely possible that my monitor was going bad...the thing that I can't figure out is why the EV700 on my old Radeon 7000 worked fine at 1280x1024 with those refresh rates, and the nvidia crapped out. It's a mystery... I remember something was screwy with this ViewSonic and my Radeon--that's why I was using the EV700. Now I guess it's the ViewSonic's turn to shine!
And thanks, typho...I read the README a bit, but I will definitely be tearing through it now to get everything config'd.
Thanks again for the help ppl.
Quote:
Originally posted by Cedrik
VideoRam 4096 seems a little small to me (4MB). Did you try to adjust it to your card's memory spec? For a 64MB card:
VideoRam 65536
(64 * 1024 = 65536)
I have tried that, but when I tell it how much ram my card has X won't start.
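For reference, the arithmetic behind those VideoRam values--the option takes kilobytes, so multiply the card's megabytes by 1024:

```shell
# VideoRam is specified in kilobytes
echo $((64 * 1024))    # 64MB card  -> 65536
echo $((128 * 1024))   # 128MB card -> 131072
```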
Quote:
Originally posted by kersten78
This seems odd to me. Do you have your monitor referring to the "VESA Framebuffer" device somewhere? It doesn't seem like this should make any difference unless you had another monitor section that only used the "VESA Framebuffer" device. And then I don't know how that would have any effect on the monitor using the card's regular device entry. nick_th_fury, would you mind posting the other device/screen sections of your xorg.conf?
Sure. I have an Nvidia FX5200 PCI card w/128MB.
This was originally generated under XFree86 back in Slack 9.0.
Copied it to my xorg.conf when I replaced the X server.
Funny thing is, I have better performance in Quake3 & Unreal tournament after upgrading to Xorg.
They both run smooth as glass, although I now have a stutter in UT2003 that wasn't there under XFree86 and that I haven't figured out yet.