LinuxQuestions.org (/questions/)
-   Linux - Hardware (https://www.linuxquestions.org/questions/linux-hardware-18/)
-   -   Missing the perimeter of my desktop - Video Card / DVI-HDMI (https://www.linuxquestions.org/questions/linux-hardware-18/missing-the-perimeter-of-my-desktop-video-card-dvi-hdmi-720325/)

waffe 04-20-2009 12:15 AM

Missing the perimeter of my desktop - Video Card / DVI-HDMI
 
Hi,

I am using an NV 9400 GT video card with Ubuntu 8. Whenever I use the DVI or HDMI outputs to an HDTV, no matter what resolution I set the card to, the Linux desktop is missing about an inch around its perimeter. I assume I need to adjust my xorg.conf file, and I have, but nothing produces the correct screen size.

This problem shows up when I use the proprietary nvidia driver (version 177). If I use the default driver, my desktop looks fine.

So do I have the correct modelines? And why do I need them at all? Does that mean I will have to change my xorg.conf file every time I switch to a different monitor? I also tried Option "ModeValidation" "NoDFPNativeResolutionCheck", but it did not seem to make a difference.

One other point that might show the problem: on other machines using DVI to connect to my HDTV, the TV displays its current resolution when I first log into my desktop. In the past it matched the resolution set in the xorg.conf file, but now it always says 1680x1050, the TV's maximum resolution.

Here is the xorg.conf file I am using:

Code:


Section "InputDevice"
    # generated from default
    Identifier    "Mouse0"
    Driver        "mouse"
    Option        "Protocol" "auto"
    Option        "Device" "/dev/psaux"
    Option        "Emulate3Buttons" "no"
    Option        "ZAxisMapping" "4 5"
EndSection
 
Section "Monitor"
    Identifier    "Computer Monitor"
    HorizSync      15.0 - 80.0
    VertRefresh    48.0 - 76.0
    ModeLine      "1280x1024" 81.80 1024 1080 1192 1360 768 769 772 802 -hsync +vsync
EndSection
 
Section "Device"
    Identifier    "Configured Video Device"
    Driver        "nvidia"
EndSection
 
Section "Screen"
 
    #Option        "ModeValidation" "NoDFPNativeResolutionCheck"
    Identifier    "Default Screen"
    Device        "Configured Video Device"
    Monitor        "Computer Monitor"
    DefaultDepth    24
    SubSection    "Display"
        Depth      24
        Modes      "1280x1024"
    EndSubSection
EndSection


Shadow_7 04-20-2009 07:09 AM

Overscan
 
HDTVs have OVERSCAN, which means the borders of the picture are pushed off-screen to hide the dirty edges of a broadcast signal. It's a function of it being an HDTV. There are ways to sometimes tame the beast. I can get my 1080p HDTV to go up to 1776x1000, although 1080p itself is off-screen (~10%). And it only looks "good" (and fully on-screen) at 720p (1280x720).

Check /var/log/Xorg.0.log for EDID information. Sometimes what the TV reports to the OS is wrong. Adjusting the HorizSync and VertRefresh ranges in your xorg.conf can help, as can specific modelines for specific TV/card combinations. I was never able to get 1080p working on mine. And it's too much of a power hog to have the 450W desktop, 225W studio monitors (x2) AND the 450W HDTV drawing current from the same plug. So it's back to TV duties.
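A quick way to pull those EDID and mode-validation lines out of the log (the path below is the usual X.Org default; adjust it if your distro logs elsewhere):

```shell
# Show what the TV reported via EDID and which modes X accepted or rejected.
# /var/log/Xorg.0.log is the usual location; yours may differ.
grep -iE 'EDID|Modeline|DFP|valid' /var/log/Xorg.0.log | head -n 40
```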

MythTV.org has a number of modelines for known HDTVs that seem to work for some people, though none of them worked for me. I spent about a month experimenting with modelines and eventually gave up. PowerStrip in Windows didn't help much either, since I didn't know the magic values to tame my beast, or whether it's even possible. My VisionTek/ATI HD 4550 might also be the culprit, since it didn't seem to actually use any of the modelines I tried, even the ones that xvidtune -show said I was using for a particular working video mode.
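As a concrete example (hedged: the section identifiers are copied from the xorg.conf posted above, and this assumes the TV actually accepts a 720p input), the standard CEA-861 timing for 720p at exactly 60 Hz would slot in like this, with the name in the Modes entry matching the ModeLine name exactly, including case:

```
Section "Monitor"
    Identifier    "Computer Monitor"
    HorizSync     15.0 - 80.0
    VertRefresh   48.0 - 76.0
    # CEA-861 720p60: 74.25 MHz pixel clock, 1650x750 total raster = exactly 60 Hz
    ModeLine      "1280x720" 74.25 1280 1390 1430 1650 720 725 730 750 +hsync +vsync
EndSection

Section "Screen"
    Identifier    "Default Screen"
    Device        "Configured Video Device"
    Monitor       "Computer Monitor"
    DefaultDepth  24
    SubSection    "Display"
        Depth     24
        Modes     "1280x720"
    EndSubSection
EndSection
```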

waffe 04-20-2009 10:45 AM

Thanks Shadow_7,

this info helps! One strange thing about what you have told me: I have used this HDTV on two other machines with the same xorg.conf file and TV but a different video card, and all was fine. The only thing different in my current setup is the video card, and now I am having OVERSCAN issues. So would you say the video card affects the overscan too?

Thanks,
waffe

Shadow_7 04-20-2009 01:23 PM

Quote:

Originally Posted by waffe (Post 3515178)
Thanks Shadow_7,

this info helps! One strange thing about what you have told me: I have used this HDTV on two other machines with the same xorg.conf file and TV but a different video card, and all was fine. The only thing different in my current setup is the video card, and now I am having OVERSCAN issues. So would you say the video card affects the overscan too?

Thanks,
waffe

It appears to in my case. I think my second aftermarket graphics card purchase will be nVidia. In my case, if I try to set a modeline, it forces overscan, even at relatively low resolutions. And some modes (e.g. 1024x768) are only available if I let everything default, i.e. comment out the HorizSync, VertRefresh, ModeLines, and other tweaks. I needed HDMI out to use my HDTV, and this cheapo ($100-ish) card had it. But the VisionTek variant of an ATI card doesn't appear to be all that Linux compatible, although I did recently upgrade the driver and haven't rechecked for improvement.

The only viable way to use my existing card was to get some sort of intermediate box to convert the video signal to an HDTV-compatible one, i.e. 60 Hz EXACTLY, not 59.97 or 60.01. But I didn't want to invest, since there's no guarantee that overscan won't still be an issue.
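On the exact-60Hz point: the refresh rate a modeline produces is just the pixel clock divided by the total raster (horizontal total times vertical total). Checking the modeline from the xorg.conf earlier in the thread against the standard CEA 720p timing:

```shell
# refresh (Hz) = pixel_clock / (horizontal_total * vertical_total)
# Modeline posted above: 81.80 MHz clock, 1360x802 total raster
awk 'BEGIN { printf "%.3f Hz\n", 81.80e6 / (1360 * 802) }'
# CEA-861 720p timing: 74.25 MHz clock, 1650x750 total raster
awk 'BEGIN { printf "%.3f Hz\n", 74.25e6 / (1650 * 750) }'
```

The first works out to about 74.996 Hz, nowhere near the 60 Hz an HDTV expects; the second comes out at exactly 60.000 Hz.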

