Missing the perimeter of my desktop - Video Card / DVI-HDMI
Hi,
I am using an NV 9400 GT video card with Ubuntu 8. Whenever I use the DVI or HDMI outputs to an HDTV, no matter what resolution I set the card to, the Linux desktop is missing about an inch around the perimeter. I assumed I needed to adjust my xorg.conf file, and I have, but nothing produces the correct screen size. The problem shows up when I use an nvidia driver (version 177); with a default driver my desktop looks fine.

So do I have the correct modelines? And why do I need these at all? Does that mean I will have to change my xorg.conf file every time I switch to a different monitor? I tried working with the option Option "ModeValidation" "NoDFPNativeResolutionCheck", but it did not seem to make a difference.

One other point that might show the problem: on other machines connected to this HDTV over DVI, when I first log into my desktop the TV displays the resolution it is running at, and in the past it has matched the resolution I set in the xorg.conf file. Now it always says 1680x1050, the max resolution of the TV screen. Here is the xorg.conf file I am using: Code:
Overscan
HDTVs have OVERSCAN, which means the borders of the picture are pushed OFF screen to hide the dirty edges. It's a function of it being an HDTV. There are ways to sometimes tame the beast. I can get my 1080p HDTV to go up to 1776x1000, although 1080p is offscreen (~10%). And it only looks "good" (and on screen) at 720p (1280x720).
Check /var/log/Xorg.0.log for EDID information. Sometimes what the TV reports to the OS is wrong. Changing the HorizSync and VertRefresh rates in your xorg.conf can help, as can specific modelines for specific TVs and/or cards.

I was never able to get 1080p working on mine. And it's too much of a power hog to have the 450W desktop, 225W studio monitors (x2), AND the 450W HDTV drawing current from the same plug. So it's back to TV duties.

MythTV.org has a number of modelines for some known HDTVs that seem to work for some people. None of them worked for me, though. I spent about a month experimenting with modelines and eventually gave up. Powerstrip in Windows didn't help much either, since I never figured out the magic values to tame my beast, or whether it's even possible. My VisionTek/ATI HD 4550 might also be the culprit, since it didn't seem to actually use any of the modelines I tried, even the ones that xvidtune -show said I was using for a particular working video mode.
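For reference, the pieces you end up editing are shaped roughly like this. This is a sketch only: the sync ranges are illustrative (take the real values from your TV's manual or its EDID), the modeline is the standard CEA-861 720p60 timing, and identifiers like "HDTV" are placeholders, not values from my actual config:

```
Section "Monitor"
    Identifier  "HDTV"
    # Illustrative ranges only; use what the TV's manual or EDID reports
    HorizSync   26.0 - 81.0
    VertRefresh 24.0 - 75.0
    # Standard CEA-861 720p60 timing, often a safe mode for HDTVs
    Modeline "1280x720_60" 74.25 1280 1390 1430 1650 720 725 730 750 +hsync +vsync
EndSection

Section "Screen"
    Identifier "Screen0"
    Monitor    "HDTV"
    # nvidia driver option to stop it forcing the panel's native mode
    Option "ModeValidation" "NoDFPNativeResolutionCheck"
    SubSection "Display"
        Modes "1280x720_60"
    EndSubSection
EndSection
```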
Thanks Shadow_7,
this info helps! One strange thing about what you have told me: I have used this HDTV on two other machines, with the same xorg.conf file and the same TV but a different video card, and all was fine. The only thing different in my current setup is the video card, and now I am having OVERSCAN issues. So would you say the video card affects the overscanning too? Thanks, waffe
Quote:
The only viable way to use my existing card was to get some sort of intermediate box to convert the video signal to an HDTV-compatible signal, i.e. 60Hz EXACTLY, not 59.97 or 60.01. But I didn't want to invest since there's no guarantee that overscan won't still be an issue.
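Whether a given modeline actually lands on exactly 60Hz can be checked from its own numbers: the vertical refresh is the pixel clock divided by the total (blanking-inclusive) horizontal and vertical counts. A quick sketch using the standard CEA 720p60 timings:

```shell
# Refresh rate implied by a modeline: pixel clock / (htotal * vtotal).
# Standard CEA-861 720p60 timings (pixel clock in MHz, then h and v values):
modeline='74.25 1280 1390 1430 1650 720 725 730 750'
set -- $modeline
pclk=$1 htotal=$5 vtotal=$9
refresh=$(awk -v c="$pclk" -v h="$htotal" -v v="$vtotal" \
    'BEGIN { printf "%.2f", c * 1000000 / (h * v) }')
echo "vertical refresh: ${refresh} Hz"   # 74.25e6 / (1650 * 750) = 60.00
```

If the result for your modeline comes out at 59.97 or similar rather than a flat 60.00, that mismatch is exactly what such a converter box would be papering over.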