LinuxQuestions.org
Linux - Hardware This forum is for Hardware issues.
Having trouble installing a piece of hardware? Want to know if that peripheral is compatible with Linux?

Old 06-20-2008, 01:02 PM   #1
mr_git
LQ Newbie
 
Registered: Dec 2005
Distribution: Ubuntu
Posts: 10

Rep: Reputation: 1
advice on graphics cards - 2 TFT monitors via DVI cheaply?


Sorry if this is a stupid question, but I'm struggling to find a clear answer despite a lot of searching and reading.

I've just acquired a new 22" widescreen TFT, which has a native resolution of 1680x1050. I'd like to add this to my 19" TFT, which is 1280x1024... both displays have DVI and VGA inputs.

I need a new graphics card (at present I'm running dual screen with a 15" CRT on an old PCI card with 4MB! alongside my aged GeForce 2 feeding the TFT via VGA), and it'll need to be AGP to fit my motherboard.

I want to know whether I can expect to run both of these monitors by DVI from one graphics card, but without buying something boutique (and therefore expensive).

One reason for my confusion is seeing so many cards showing a max digital resolution of 2560x2048. Is this in total, or per screen?

I'm pretty sure I could buy a fairly entry-level card (e.g. nvidia 6200, say), and run the 22" from the DVI, and the other by VGA, but my question is would it be possible to run both by DVI?

Would I need dual DVI outputs on the card (e.g. a Quadro card)? A lot of these still seem to quote a max digital res of 2560x2048 - does that mean both DVI outputs together?

Then there's dual-link DVI, and whether I need that plus a DVI Y-cable? The Wikipedia page on DVI is very detailed, but I'm just not finding the answer to my questions there.

I'm a bit confused as to what I should be looking for in my new card - two DVI outputs? dual-link DVI and a Y-cable? Or is one DVI and one VGA the best dual-screen setup I can hope for without raiding the piggy-bank?

TIA for any light anyone can shed on this for me - I'm really looking forward to seeing compiz going across these two screens!
 
Old 06-20-2008, 02:20 PM   #2
TB0ne
Guru
 
Registered: Jul 2003
Location: Birmingham, Alabama
Distribution: SuSE, RedHat, Slack,CentOS
Posts: 14,223

Rep: Reputation: 2474
I believe that's the per-monitor resolution.

I've done dual display with a Y-cable, with two DVI cables, and with one DVI and one analog VGA, all off single cards, and they all work fine. You don't have to spend a fortune on a card either, provided you're not after some huge resolution with massive 3D framerates. I spent a whopping $60 on a card and use it at the office, driving two 20" monitors, each at 1280x1024x32, so my desktop winds up being 2560x1024.
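The spanning arithmetic above can be sketched in a few lines (an illustrative Python sketch, not anything the driver actually runs): a side-by-side desktop adds the widths and keeps the tallest height.

```python
# Side-by-side spanning: the virtual desktop is as wide as the monitors
# combined and as tall as the tallest one.
monitors = [(1280, 1024), (1280, 1024)]  # two 20" screens, as above
virtual_w = sum(w for w, _ in monitors)
virtual_h = max(h for _, h in monitors)
print(f"virtual desktop: {virtual_w}x{virtual_h}")  # virtual desktop: 2560x1024
```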

FWIW, I'd make sure that the resolutions on both screens match...it's doable with them at different settings, but it looks very strange. Matching monitors helps too, if you've got the cash.
 
Old 06-20-2008, 02:42 PM   #3
lazlow
Senior Member
 
Registered: Jan 2006
Posts: 4,362

Rep: Reputation: 171
If you are trying to run both monitors as one big desktop then matching physical size and resolution is important, but if you are running them independently it is not an issue. We run some machines with one monitor in a CLI and the other in a GUI; you can monitor edits live that way.

Edit: You also have to be careful with the video card. Just because a card has multiple outputs does not mean that you can use them at the same time.

Last edited by lazlow; 06-20-2008 at 02:45 PM.
 
Old 06-20-2008, 03:19 PM   #4
johnsfine
Guru
 
Registered: Dec 2007
Distribution: Centos
Posts: 5,052

Rep: Reputation: 1101
I wish I understood this stuff myself.

I've run two displays on one card, usually two VGA CRTs each at 1920x1440 for a total of 3840x1440. I've done that with a bunch of different low cost display cards.

Sometimes (including on the system I'm typing on now) the card had a single connector: dual-link DVI-I going to a DVI Y-cable, going to a pair of DVI-to-VGA adapters. I just read that Wikipedia article and now don't understand why this even works. The pinout for dual-link DVI-I seems to have two digital channels but only one analog, and I thought a DVI-to-VGA adapter only passes the analog signal - so how do I have two CRTs working?

I didn't look at the specs on any of the several cards on which I run a pair of 1920x1440 displays, but they are all low price cards.

I'm considering the purchase of an LCD that uses 2560x1600 dual-link DVI. I would need a new card for that, and obviously it needs to support dual link. (The specs on some cards say they do; other cards don't say. I haven't found a card spec saying it doesn't - does not mentioning it mean it doesn't?) Most cards I looked at (even dual link) spec a max resolution lower than 2560x1600, but they don't spec separate single-link and dual-link maximums, which should be distinct numbers. By the info in the Wikipedia article, you'd expect the max dual-link resolution could be even more than double the single-link max (but it could also be less than double).

The OP asked about the relationship between the spec'ed max res and the total res of two independent signals, which may be different from the res of one dual link signal.

So I have lots of questions myself. My only contribution to answering the OP's question is that several low-cost cards were each able to drive a pair of 1920x1440 signals, and every card I've tried that can generate two signals at all, and has enough RAM for a pair of 1920x1440, also had enough speed to drive that pair (at 60Hz).
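To put rough numbers on the single-link vs dual-link question, here is a sketch (my own arithmetic, not from the thread) that estimates the pixel clock a mode needs with CVT reduced blanking and compares it to the 165 MHz single-link DVI limit. The constants (fixed 160-pixel horizontal blanking, minimum 460 µs vertical blanking, clock rounded down to a multiple of 0.25 MHz) follow the VESA CVT-RB timing rules.

```python
def cvt_rb_pixel_clock(h_active, v_active, refresh_hz=60):
    """Approximate CVT reduced-blanking pixel clock in MHz."""
    H_BLANK = 160           # pixels, fixed in CVT-RB
    MIN_VBLANK_US = 460.0   # microseconds, CVT-RB minimum vertical blanking

    h_total = h_active + H_BLANK
    # Estimate the line period from the frame period minus the vblank time,
    # then turn the 460 us vblank requirement into a whole number of lines.
    frame_period_us = 1e6 / refresh_hz
    h_period_est_us = (frame_period_us - MIN_VBLANK_US) / v_active
    v_blank_lines = int(MIN_VBLANK_US / h_period_est_us) + 1
    v_total = v_active + v_blank_lines

    clock_hz = refresh_hz * h_total * v_total
    # CVT rounds the clock down to a multiple of 0.25 MHz.
    return (clock_hz // 250_000) * 0.25

SINGLE_LINK_LIMIT_MHZ = 165.0  # DVI single-link maximum pixel clock

for w, h in [(1280, 1024), (1680, 1050), (2560, 1600)]:
    clk = cvt_rb_pixel_clock(w, h)
    verdict = "single link ok" if clk <= SINGLE_LINK_LIMIT_MHZ else "needs dual link"
    print(f"{w}x{h}@60 (RB): {clk:.2f} MHz -> {verdict}")
```

For what it's worth, this reproduces the 119.00 MHz clock of the reduced-blanking 1680x1050 modeline that turns up later in the thread, and puts 2560x1600@60 at 268.50 MHz, well past the single-link limit.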
 
Old 06-29-2008, 09:34 AM   #5
mr_git
LQ Newbie
 
Registered: Dec 2005
Distribution: Ubuntu
Posts: 10

Original Poster
Rep: Reputation: 1
got my two monitors working from a single card

...just to update this thread in case it's any use to anyone else...

I ended up going for a very cheap nvidia card - an FX5200, which has a single-link DVI output and a D-SUB/VGA output.

At first I struggled to get the 22" up to 1680x1050 on the DVI output, even without the other monitor plugged in.

It turns out this is a limitation of these cards, but there are workarounds: either disable the pixel-clock check in the driver with a setting in xorg.conf, or use a particular modeline to set a reduced-blanking DVI pixel clock (an example of this modeline - the one I'm using - is lower down the first thread I linked to.)

I went with the reduced-blanking option rather than overclocking the card, which has a nice quiet passive heatsink that I wouldn't want running any hotter than it already does.
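As a quick sanity check (my arithmetic, not part of the thread): plugging the numbers from the reduced-blanking modeline in the xorg.conf below into the standard refresh formula shows the trade - fewer blanking pixels means a lower clock, while the refresh rate stays essentially at 60 Hz.

```python
# Timings copied from the "1680x1050rb" modeline in the xorg.conf below:
#   Modeline "1680x1050rb" 119.00  1680 1728 1760 1840  1050 1053 1059 1080
PCLK_HZ = 119.00e6
H_ACTIVE, H_TOTAL = 1680, 1840
V_ACTIVE, V_TOTAL = 1050, 1080

# Refresh rate = pixel clock / (total pixels per frame, blanking included)
refresh_hz = PCLK_HZ / (H_TOTAL * V_TOTAL)
print(f"refresh:    {refresh_hz:.2f} Hz")          # 59.88 Hz
print(f"h blanking: {H_TOTAL - H_ACTIVE} pixels")  # 160, the CVT-RB fixed value
print(f"v blanking: {V_TOTAL - V_ACTIVE} lines")   # 30
```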

So, I've now got 1680x1050 over DVI plus 1280x1024 over VGA, using TwinView.

This works very nicely - compiz is working great. The slight mismatch in vertical resolution means it's possible to lose the mouse pointer in the 'missing' 26 pixels below my second (smaller) screen, but it doesn't really cause any problems; my GNOME panels are on the other monitor, and windows on the smaller screen snap to maximise to fit the visible resolution fine.
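The 26-pixel dead zone is just the difference in vertical resolutions (trivial, but for completeness):

```python
dvi_height, vga_height = 1050, 1024  # the 22" and 19" panels
dead_rows = dvi_height - vga_height
print(dead_rows)  # 26 rows below the smaller screen where the pointer can hide
```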

So - not quite both screens running on DVI, but a solution I'm very happy with that cost me less than £20 for the new card.

Here's my xorg.conf if it's any use to anyone:

Code:
# nvidia-settings: X configuration file generated by nvidia-settings
# nvidia-settings:  version 1.0  (buildmeister@builder3)  Mon Apr 16 20:38:05 PDT 2007

Section "ServerLayout"
    Identifier     "Layout0"
    Screen      0  "Screen0" 0 0
    InputDevice    "Keyboard0" "CoreKeyboard"
    InputDevice    "Mouse0" "CorePointer"
EndSection

Section "Files"
    FontPath    "/usr/share/fonts/X11/misc"
    FontPath    "/usr/share/fonts/X11/100dpi/:unscaled"
    FontPath    "/usr/share/fonts/X11/75dpi/:unscaled"
    FontPath    "/usr/share/fonts/X11/Type1"
    FontPath    "/usr/share/fonts/X11/100dpi"
    FontPath    "/usr/share/fonts/X11/75dpi"
    # path to defoma fonts
    FontPath    "/var/lib/defoma/x-ttcidfont-conf.d/dirs/TrueType"
    RgbPath     "/usr/X11R6/lib/X11/rgb"
EndSection

Section "Module"
    Load    "dbe"
    Load    "extmod"
    Load    "freetype"
    Load    "glx"
EndSection

Section "ServerFlags"
    Option    "Xinerama" "0"
EndSection

Section "InputDevice"
    # generated from default
    Identifier    "Mouse0"
    Driver        "mouse"
    Option        "Protocol" "auto"
    Option        "Device" "/dev/psaux"
    Option        "Emulate3Buttons" "no"
    Option        "ZAxisMapping" "4 5"
EndSection

Section "InputDevice"
    # generated from default
    Identifier    "Keyboard0"
    Driver        "kbd"
    Option        "CoreKeyboard"
    Option        "XkbRules" "xorg"
    Option        "XkbModel" "pc105"
    Option        "XkbLayout" "gb"
EndSection

Section "Monitor"
    # HorizSync source: edid, VertRefresh source: edid
    Identifier     "Monitor0"
    VendorName     "Unknown"
    ModelName      "Dell SP2208WFP"
    HorizSync      30.0 - 83.0
    VertRefresh    56.0 - 76.0
    Option         "DPMS"
    Option         "ExactModeTimingsDVI" "True"
    # 1680x1050 with reduced blanking, to keep the DVI pixel clock at 119 MHz
    Modeline       "1680x1050rb" 119.00 1680 1728 1760 1840 1050 1053 1059 1080 -hsync +vsync
EndSection

Section "Device"
    Identifier    "Videocard0"
    Driver        "nvidia"
    VendorName    "NVIDIA Corporation"
    BoardName     "GeForce FX 5200"
    BusID         "PCI:1:0:0"
    Option        "ModeValidation" "NoDFPNativeResolutionCheck"
    Option        "AddARGBGLXVisuals" "True"    # for compiz
    Option        "NoLogo" "True"
    # Option      "ModeValidation" "NoMaxPClkCheck"    # required to get 1680x1050 on my GeForce FX 5200
EndSection

Section "Screen"
    Identifier      "Screen0"
    Device          "Videocard0"
    Monitor         "Monitor0"
    DefaultDepth    24
    Option          "TwinView" "1"
    Option          "metamodes" "DFP: 1680x1050rb +0+0, CRT: nvidia-auto-select +1680+0; DFP: 1280x1024 +0+0, CRT: nvidia-auto-select +1280+0; DFP: 1024x768 +0+0, CRT: nvidia-auto-select +1024+0; DFP: 800x600 +0+0, CRT: nvidia-auto-select +800+0; DFP: 640x480 +0+0, CRT: nvidia-auto-select +640+0"
    # Option        "metamodes" "DFP: 1680x1050 +0+0, CRT: nvidia-auto-select +1680+0; DFP: 1280x1024 +0+0, CRT: nvidia-auto-select +1280+0; DFP: 1024x768 +0+0, CRT: nvidia-auto-select +1024+0; DFP: 800x600 +0+0, CRT: nvidia-auto-select +800+0; DFP: 640x480 +0+0, CRT: nvidia-auto-select +640+0"
    SubSection "Display"
        Depth    24
        Modes    "1600x1200" "1280x1024" "1024x768" "800x600" "640x480"
    EndSubSection
EndSection
 
Old 04-01-2009, 02:58 PM   #6
ajarmoniuk
LQ Newbie
 
Registered: Aug 2005
Posts: 2

Rep: Reputation: 0

Thanks, mr_git! It works for me now!
 
  




