image quality, cables, mixing old & new hardware
The card:
A PNY Nvidia Quadro FX 1300, which has two DVI outputs.
Monitor #1:
An old Silicon Graphics behemoth of a CRT with both VGA (HD15) and 13W3 inputs.
Monitor #2:
An old Viewsonic CRT with both VGA (HD15) and 5-BNC inputs.
Question:
Is there any difference in image quality when using one cable / connector type over another? Or are the multiple inputs on the back of each of these monitors merely a result of their having been manufactured during a period of technology transition?
With the hardware mentioned above, is there any reason to use anything besides DVI to VGA adapters and a couple of standard cables?