image quality, cables, mixing old & new hardware
The card: a PNY Nvidia Quadro FX 1300, which has two DVI outputs.
Monitor #1: an old Silicon Graphics behemoth of a CRT with both VGA (HD15) and 13W3 inputs.
Monitor #2: an old ViewSonic CRT with both VGA (HD15) and 5-BNC inputs.
Question: Is there any difference in image quality whatsoever when using one cable/connector type over another, or are the multiple possible connections on the back of each of these monitors merely a result of their manufacture during a period of technology transition? With the hardware mentioned above, is there any reason to use anything besides DVI-to-VGA adapters and a couple of standard cables? |
Big difference. DVI is sweet, where VGA is sour: even using an adapter will degrade the quality, though it's still better than standard. I've dealt with a few plasma TVs; trust me. |
I was asking about the display side: I have to use the DVI outputs on the graphics card, but I'd like to know whether there is any perceptible difference between feeding the monitors through their BNC and 13W3 inputs versus their HD15 inputs. |
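Side note for anyone comparing these inputs: HD15, 13W3, and 5-BNC all carry the same analog RGB video; they differ mainly in sync wiring (SGI 13W3 ports often expect sync-on-green) and in the physical cabling, so any visible difference comes down to cable construction and shielding rather than the connector itself. As a rough sketch of the signal bandwidth involved, here is a small Python calculation; the blanking overheads are illustrative assumptions, not exact VESA timings:

[code]
# Rough pixel-clock estimate for analog CRT modes.
# Assumed blanking overheads (~25% horizontal, ~5% vertical) are
# illustrative; real VESA timings vary by mode.

def pixel_clock_mhz(h_active, v_active, refresh_hz,
                    h_blank_frac=0.25, v_blank_frac=0.05):
    h_total = h_active * (1 + h_blank_frac)   # active + blanking pixels per line
    v_total = v_active * (1 + v_blank_frac)   # active + blanking lines per frame
    return h_total * v_total * refresh_hz / 1e6

for w, h, hz in [(1280, 1024, 85), (1600, 1200, 85), (1920, 1440, 75)]:
    print(f"{w}x{h}@{hz}Hz: ~{pixel_clock_mhz(w, h, hz):.0f} MHz pixel clock")
[/code]

At the 200+ MHz pixel clocks these CRTs can run, a well-shielded, properly impedance-matched cable matters far more than which connector is on the end, which is the usual argument for the 5-BNC input at very high resolutions.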