LinuxQuestions.org

LinuxQuestions.org (/questions/)
-   Linux - Hardware (http://www.linuxquestions.org/questions/linux-hardware-18/)
-   -   advice on graphics cards - 2 TFT monitors via DVI cheaply? (http://www.linuxquestions.org/questions/linux-hardware-18/advice-on-graphics-cards-2-tft-monitors-via-dvi-cheaply-650579/)

mr_git 06-20-2008 01:02 PM

advice on graphics cards - 2 TFT monitors via DVI cheaply?
 
Sorry if this is a stupid question, but I'm struggling to find a clear answer despite a lot of searching and reading.

I've just acquired a new 22" widescreen TFT, which has a native resolution of 1680x1050. I'd like to add this to my 19" TFT, which is 1280x1024... both displays have DVI and VGA inputs.

I need a new graphics card (at present I'm running dual screen with a 15" CRT on an old PCI card with 4MB(!), alongside my aged GeForce 2 feeding the TFT via VGA), and it'll need to be AGP to fit my motherboard.

I want to know whether I can expect to run both of these monitors by DVI from one graphics card, but without buying something boutique (and therefore expensive).

One reason for my confusion is seeing so many cards showing a max digital resolution of 2560x2048. Is this in total, or per screen?

I'm pretty sure I could buy a fairly entry-level card (e.g. nvidia 6200, say), and run the 22" from the DVI, and the other by VGA, but my question is would it be possible to run both by DVI?

Would I need dual DVI outputs on the card (e.g. a Quadro card)? A lot of these still seem to quote a max digital res of 2560x2048 - does that mean both DVI outputs together?

Then there's dual-link DVI, and whether I need that plus a DVI Y-cable? The Wikipedia page on DVI is very detailed, but I'm just not finding the answer to my questions there.

I'm a bit confused as to what I should be looking for in my new card - two DVI outputs? dual-link DVI and a Y-cable? Or is one DVI and one VGA the best dual-screen setup I can hope for without raiding the piggy bank?

TIA for any light anyone can shed on this for me - I'm really looking forward to seeing compiz going across these two screens!

TB0ne 06-20-2008 02:20 PM

I believe that's per monitor resolution.

I've done dual display with a Y-cable, with two DVI cables, and with one DVI and one analog VGA, all off single cards, and they all work fine. You don't have to spend a fortune on a card either, provided you're not trying to drive some huge resolution with massive 3D framerates. I spent a whopping $60 on a card and use it at the office driving two 20" monitors, each at 1280x1024x32, so my desktop winds up being 2560x1024.

FWIW, I'd make sure that the resolutions on both screens match...it's doable with them at different settings, but it looks very strange. Matching monitors helps too, if you've got the cash. :)

lazlow 06-20-2008 02:42 PM

If you are trying to run both monitors as one big desktop then the physical size and resolution match is important, but if you are running them independently it is not an issue. We run some with one monitor in CLI and the other in gui. You can monitor edits live that way.

Edit: You also have to be careful with the video card. Just because a card has multiple outputs does not mean you can use them at the same time.

johnsfine 06-20-2008 03:19 PM

I wish I understood this stuff myself.

I've run two displays on one card, usually two VGA CRTs each at 1920x1440 for a total of 3840x1440. I've done that with a bunch of different low cost display cards.

Sometimes (including on the system I'm typing on now) the card had a single connector: dual-link DVI-I going to a DVI Y-cable, going to a pair of DVI-to-VGA adapters. I just read that Wikipedia article and now don't understand why this even works. The pinout for dual-link DVI-I seems to have two digital channels but only one analog. I thought a DVI-to-VGA adapter only passes the analog signal, so how do I have two CRTs working?

I didn't look at the specs on any of the several cards on which I run a pair of 1920x1440 displays, but they are all low price cards.

I'm considering the purchase of an LCD that needs 2560x1600 over dual-link DVI, so I would need a new card for that. Obviously it needs to support dual-link. (The specs on some cards say they do; others don't say. I haven't found a card spec saying it doesn't - does not mentioning it mean it doesn't?) Most cards I looked at (even dual-link ones) quote a max resolution lower than 2560x1600, but they don't quote separate single-link and dual-link maximums, which should be distinct numbers. From the info in the Wikipedia article you'd expect the max dual-link resolution could be even more than double the single-link maximum (but it could also be less than double).
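For what it's worth, a rough back-of-envelope check shows why 2560x1600@60 can't be a single-link mode. This assumes the usual 165 MHz single-link TMDS limit and CVT-RB total timings of 2720x1646 for that mode (both figures are my assumptions, not from a card spec):

```python
# Rough pixel-clock check: does 2560x1600@60 fit in a single DVI link?
# Assumptions: 165 MHz single-link TMDS limit, CVT-RB total timings
# of 2720x1646 for 2560x1600@60.

SINGLE_LINK_MAX_MHZ = 165.0

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock needed for the given total timings, in MHz."""
    return h_total * v_total * refresh_hz / 1e6

clock = pixel_clock_mhz(2720, 1646, 60)
print(f"2560x1600@60 needs ~{clock:.1f} MHz")             # ~268.6 MHz
print("fits single link?", clock <= SINGLE_LINK_MAX_MHZ)  # False -> dual link required
```

So that mode needs roughly 268 MHz of pixel clock, well past what one link can carry, which is why it's a dual-link-only resolution.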

The OP asked about the relationship between the spec'ed max res and the total res of two independent signals, which may be different from the res of one dual link signal.

So I have lots of questions myself. The only contribution I can make to the OP's question is that several low-cost cards were each able to drive a pair of 1920x1440 signals, and I haven't yet tried a card that could generate two signals and had enough RAM for a pair of 1920x1440 but lacked the speed to drive them both at 60Hz.

mr_git 06-29-2008 09:34 AM

got my two monitors working from a single card
 
...just to update this thread in case it's any use to anyone else...

I ended up going for a very cheap nvidia card - an FX5200 which has a DVI (single link) and a D-SUB / VGA output.

At first I struggled to get the 22" up to 1680x1050 on the DVI output, even without the other monitor plugged in.

It turns out this is a limitation of these cards, but there are workarounds: either disable the pixel clock check in the driver with a setting in xorg.conf, or use a particular modeline that sets a reduced-blanking DVI pixel clock (an example of this modeline - the one that I'm using - is lower down the first thread I linked to).
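As a sanity check on why reduced blanking helps: the modeline's dotclock and total timings give the effective refresh rate, and the reduced-blanking clock comes in under the card's DVI limit where the standard CVT clock (~146 MHz for 1680x1050@60) would not. The ~135 MHz single-link limit on these cards is my understanding, so treat it as an assumption:

```python
# Sanity-check the reduced-blanking modeline used later in this post:
# Modeline "1680x1050rb" 119.00 1680 1728 1760 1840 1050 1053 1059 1080
# Assumption: the FX5200's DVI output tops out around a 135 MHz pixel clock.

DVI_LIMIT_MHZ = 135.0   # assumed single-link limit on this card

clock_mhz = 119.00      # dotclock from the modeline
h_total   = 1840        # last horizontal timing value
v_total   = 1080        # last vertical timing value

refresh_hz = clock_mhz * 1e6 / (h_total * v_total)
print(f"refresh: {refresh_hz:.2f} Hz")                  # ~59.88 Hz
print("under DVI limit?", clock_mhz <= DVI_LIMIT_MHZ)   # True
```

In other words, the reduced-blanking mode still gives a proper ~60 Hz refresh while keeping the pixel clock inside what the card's DVI transmitter can manage.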

I went with the reduced-blanking option rather than overclocking the card, which has a nice quiet passive heatsink that I wouldn't want running any hotter than it already does.

So, I've now got 1680x1050 over DVI plus 1280x1024 over VGA, using twinview.

This works very nicely - compiz is working great. Because of the slight vertical resolution mismatch, it seems possible to lose the mouse pointer in the 'missing' 26 pixels below the smaller screen, but it doesn't really cause any problems; my gnome panels are on the other monitor, and windows snap to maximise on the smaller screen to fit its visible resolution fine.

So - not quite both screens running on DVI, but a solution I'm very happy with that cost me less than £20 for the new card.

Here's my xorg.conf if it's any use to anyone:

Code:

# nvidia-settings: X configuration file generated by nvidia-settings
# nvidia-settings:  version 1.0  (buildmeister@builder3)  Mon Apr 16 20:38:05 PDT 2007

Section "ServerLayout"
        Identifier        "Layout0"
        Screen        0        "Screen0"        0 0
        InputDevice        "Keyboard0"        "CoreKeyboard"
        InputDevice        "Mouse0"        "CorePointer"
EndSection

Section "Files"
        FontPath        "/usr/share/fonts/X11/misc"
        FontPath        "/usr/share/fonts/X11/100dpi/:unscaled"
        FontPath        "/usr/share/fonts/X11/75dpi/:unscaled"
        FontPath        "/usr/share/fonts/X11/Type1"
        FontPath        "/usr/share/fonts/X11/100dpi"
        FontPath        "/usr/share/fonts/X11/75dpi"
        # path to defoma fonts
        FontPath        "/var/lib/defoma/x-ttcidfont-conf.d/dirs/TrueType"
        RgbPath                "/usr/X11R6/lib/X11/rgb"
EndSection

Section "Module"
        Load                "dbe"
        Load                "extmod"
        Load                "freetype"
        Load                "glx"
EndSection

Section "ServerFlags"
        Option                "Xinerama"        "0"
EndSection

Section "InputDevice"
        # generated from default
        Identifier        "Mouse0"
        Driver                "mouse"
        Option                "Protocol"        "auto"
        Option                "Device"        "/dev/psaux"
        Option                "Emulate3Buttons"        "no"
        Option                "ZAxisMapping"        "4 5"
EndSection

Section "InputDevice"
        # generated from default
        Identifier        "Keyboard0"
        Driver                "kbd"
        Option                "CoreKeyboard"
        Option                "XkbRules"        "xorg"
        Option                "XkbModel"        "pc105"
        Option                "XkbLayout"        "gb"
EndSection

Section "Monitor"
        # HorizSync source: edid, VertRefresh source: edid
        Identifier        "Monitor0"
        VendorName        "Unknown"
        ModelName        "Dell SP2208WFP"
        HorizSync        30.0 - 83.0
        VertRefresh        56.0 - 76.0
        Option                "DPMS"
        Option                "ExactModeTimingsDVI"        "True"
        Modeline        "1680x1050rb" 119.00 1680 1728 1760 1840 1050 1053 1059 1080 -hsync +vsync
EndSection

Section "Device"
        Identifier        "Videocard0"
        Driver                "nvidia"
        VendorName        "NVIDIA Corporation"
        BoardName        "GeForce FX 5200"
        BusID                "PCI:1:0:0"
        Option                "ModeValidation"        "NoDFPNativeResolutionCheck"
        Option                "AddARGBGLXVisuals"        "True"        # for compiz
        Option                "NoLogo"        "True"
        # Option        "ModeValidation"        "NoMaxPClkCheck"        # required to get 1680x1050 on my GeForce FX 5200
EndSection

Section "Screen"
        Identifier        "Screen0"
        Device                "Videocard0"
        Monitor                "Monitor0"
        DefaultDepth        24
        Option                "TwinView"        "1"
        Option                "metamodes"        "DFP: 1680x1050rb +0+0, CRT: nvidia-auto-select +1680+0; DFP: 1280x1024 +0+0, CRT: nvidia-auto-select +1280+0; DFP: 1024x768 +0+0, CRT: nvidia-auto-select +1024+0; DFP: 800x600 +0+0, CRT: nvidia-auto-select +800+0; DFP: 640x480 +0+0, CRT: nvidia-auto-select +640+0"
        #Option        "metamodes" "DFP: 1680x1050 +0+0, CRT: nvidia-auto-select +1680+0; DFP: 1280x1024 +0+0, CRT: nvidia-auto-select +1280+0; DFP: 1024x768 +0+0, CRT: nvidia-auto-select +1024+0; DFP: 800x600 +0+0, CRT: nvidia-auto-select +800+0; DFP: 640x480 +0+0, CRT: nvidia-auto-select +640+0"
        SubSection "Display"
                Depth        24
                Modes                "1600x1200"        "1280x1024"        "1024x768"        "800x600"        "640x480"
        EndSubSection
EndSection
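As an aside, the first metamode in that Screen section implies a virtual desktop of 2960x1050: each display's width plus its +X offset, taking the maximum across displays. A quick sketch of that arithmetic (substituting the CRT's actual 1280x1024 for "nvidia-auto-select", which is my assumption about what the driver picks):

```python
import re

# Compute the virtual desktop size implied by a TwinView metamode string.
# Based on the first metamode in the xorg.conf above, with the CRT's
# assumed resolution (1280x1024) written out in place of nvidia-auto-select.
metamode = "DFP: 1680x1050rb +0+0, CRT: 1280x1024 +1680+0"

width = height = 0
for m in re.finditer(r"(\d+)x(\d+)\w*\s+\+(\d+)\+(\d+)", metamode):
    w, h, x, y = map(int, m.groups())
    width = max(width, x + w)    # rightmost edge of any display
    height = max(height, y + h)  # bottom edge of any display

print(f"virtual desktop: {width}x{height}")   # 2960x1050
```

That 2960x1050 total is also why the smaller monitor ends up with 26 "missing" pixels of desktop below its visible area.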


ajarmoniuk 04-01-2009 02:58 PM

Thanks, mr_git! It works for me now! :D

