- May 11, 2004
An increasing number of graphics cards have dual DVI connectors instead of one 15-pin VGA and one DVI. At the moment I have a card with one VGA output and one DVI output, and I drive a 22" CRT from the VGA output at 1600x1200 @ 100 Hz.
If I got a card with two DVI connectors and connected my monitor through a DVI->VGA dongle, would I still be able to run it at 1600x1200 @ 100 Hz, or would I need to drop the refresh rate to an unacceptable 75 Hz, which is the maximum a single-link digital DVI connection can output?
I assume the dongle takes its signal from the analogue (DVI-A) pins of the DVI-I connector. Does that allow it to run at the full range of refresh rates and resolutions supported by the card's RAMDAC, including those that exceed the bandwidth of the digital single-link (DVI-D) pins? If it doesn't, then these dual-DVI cards are a serious problem.
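
Here's a rough back-of-the-envelope sketch of the bandwidth involved (the ~25% blanking overhead and the 400 MHz RAMDAC rating are my own assumptions, not figures from a spec sheet):

```python
# Rough pixel-clock check: single-link DVI's TMDS link is capped at a 165 MHz
# pixel clock, while the analogue RAMDAC on cards of this era is typically
# rated somewhere around 350-400 MHz (assumed here). Blanking overhead is
# assumed to be roughly 25%, as with conventional CRT timings.

SINGLE_LINK_DVI_MHZ = 165   # single-link TMDS pixel clock limit
ASSUMED_RAMDAC_MHZ = 400    # assumption: typical analogue RAMDAC rating, ca. 2004

def required_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=0.25):
    """Approximate pixel clock (MHz) needed for a mode, given a blanking overhead."""
    total_pixels_per_frame = width * height * (1 + blanking_overhead)
    return total_pixels_per_frame * refresh_hz / 1e6

clock = required_pixel_clock_mhz(1600, 1200, 100)
print(f"1600x1200 @ 100 Hz needs roughly {clock:.0f} MHz")
print(f"Single-link DVI-D limit: {SINGLE_LINK_DVI_MHZ} MHz -> "
      f"{'OK' if clock <= SINGLE_LINK_DVI_MHZ else 'too slow'}")
print(f"Assumed RAMDAC limit:    {ASSUMED_RAMDAC_MHZ} MHz -> "
      f"{'OK' if clock <= ASSUMED_RAMDAC_MHZ else 'too slow'}")
```

If that arithmetic is roughly right, 1600x1200 @ 100 Hz needs on the order of 240 MHz: far beyond single-link DVI-D, but comfortably within a typical RAMDAC, which is exactly why I care whether the dongle is fed from the DVI-A pins.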
