DVI to VGA and vice versa adapter questions

Trizik

Senior member
Jun 17, 2005
Hi, I am just wondering what modes (VGA or DVI) would be used in the following scenarios?

Scenario 1:
monitor DVI port > adapter > VGA wire > adapter > video card DVI port

Scenario 2:
monitor DVI port > adapter > VGA wire > video card VGA port

Scenario 3:
monitor VGA port > VGA wire > adapter > video card DVI port
 

Aluvus

Platinum Member
Apr 27, 2006
1. VGA for the computer, DVI for the monitor. The (expensive) VGA-to-DVI adapter will be the man in the middle.

2. VGA for both.

3. Same as 1.

The DVI-VGA adapters that come with video cards are actually just pass-throughs. Computers [normally] have DVI-I ports, which output both a DVI-D signal and a DVI-A signal, where DVI-A is essentially a condensed form of VGA. Using one is the same as just using a VGA port. However, the adapters that go the other way are in fact more complicated and more expensive, and the dinky little pass-throughs cannot perform this task.
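
Just to make "pass-through" concrete: one of those cheap adapters is nothing but wiring from the analog pins of the DVI-I connector to the pins of a VGA (DE-15) plug. A rough sketch, with pin numbers from the DVI-I/VGA pinouts as best I recall them, so treat this as illustrative rather than a wiring guide:

Code:
# A passive DVI-I -> VGA adapter is just a pin-for-pin wiring map; there are no
# electronics and no signal conversion involved.
DVI_I_TO_VGA = {
    "C1 (analog red)":    "1 (red)",
    "C2 (analog green)":  "2 (green)",
    "C3 (analog blue)":   "3 (blue)",
    "C4 (analog h-sync)": "13 (h-sync)",
    "8 (analog v-sync)":  "14 (v-sync)",
    "C5 (analog ground)": "6/7/8 (RGB returns)",
}

# The digital TMDS pins of the DVI-I port simply aren't connected, which is why
# this kind of adapter can't drive anything that only accepts DVI-D.
for dvi_pin, vga_pin in DVI_I_TO_VGA.items():
    print(f"DVI-I {dvi_pin}  ->  VGA {vga_pin}")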
 

Lord Evermore

Diamond Member
Oct 10, 1999
I think those answers are slightly off. It depends on exactly what adapters are used, and what modes the hardware can accept. All 3 scenarios could have analog at both ends, or DVI at both ends.

There are two possibilities for each case, depending on whether the DVI port on the monitor is capable of accepting an analog signal (which I don't think any of them do).

#1: if the monitor DVI can accept analog input, then you'd have VGA analog signalling on both ends. The DVI port on video cards is usually DVI-I, and so it sends out the analog signal on the appropriate pins. The adapter just connects those pins to the appropriate pins on a VGA plug; it's the same actual signal as if the card had a VGA port. The cable carries it to an adapter, which just reverses the pin conversion to send the signals to the analog pins on the monitor port. You lose a bit of signal quality due to the multiple connections being made, but that's only an issue at high resolutions and refresh rates.
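
If it helps, here's a toy way to picture the "reverses the pin conversion" part: the adapter at the card maps DVI-I analog pins onto VGA pins, and the adapter at the monitor applies the inverse map, so every analog line ends up back on the matching DVI-I pin. Pin labels here are simplified and the names are just mine:

Code:
# Passive adapter at the video card: DVI-I analog pin -> VGA pin (simplified labels).
card_to_cable = {
    "C1": "VGA-1",   # red
    "C2": "VGA-2",   # green
    "C3": "VGA-3",   # blue
    "C4": "VGA-13",  # horizontal sync
    "8":  "VGA-14",  # vertical sync
}

# Passive adapter at the monitor: just the inverse wiring.
cable_to_monitor = {vga: dvi for dvi, vga in card_to_cable.items()}

for dvi_pin, vga_pin in card_to_cable.items():
    # Each analog line comes out on the same DVI-I pin at the monitor end.
    assert cable_to_monitor[vga_pin] == dvi_pin

print("Analog VGA signalling end to end; the adapters never touch the signal itself.")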

If the monitor CANNOT accept a VGA input, then you have to have the more expensive active converter connected to the monitor's DVI port to convert the signal to digital. In this case, it is analog VGA at the video card and DVI at the monitor.

If you really wanted to, you could have digital output from the video card, an expensive converter connected to the VGA cable, and then another expensive converter at the other end to make it digital again. That'd cause a lot of image quality issues from all the conversions, though.

Scenario 2 is essentially the same as 1, except that you don't have the adapter at the video card port. Again, it's an analog signal from the video card, and at the monitor end you need either a cheap adapter, to keep it an analog VGA signal, or an active converter to make it a DVI signal.

#3 is the only one that will almost certainly be analog at both ends: analog output from the DVI-I connector, an adapter to make the pin connection to the cable, then analog input on the monitor. You could, however, again make it digital output from the video card and use an active converter to make it an analog signal for the cable. Obviously that'd be stupid. However, if you had a digital-only video output, DVI-D, then you'd have to do it that way.
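
To sum up the rule I'm applying to all three scenarios: passive adapters (and the cable) only change the connector, active converters flip the signal between analog and digital, and whatever arrives at the monitor has to be something its port can accept. A quick sketch of that logic, where the function and names are just my own shorthand:

Code:
def signal_at_monitor(card_output, hops, monitor_accepts_analog):
    """card_output is 'analog' or 'digital'; hops is a list of 'passive'/'active' steps."""
    signal = card_output
    for hop in hops:
        if hop == "active":
            # Active converters change analog <-> digital (and cost real money).
            signal = "digital" if signal == "analog" else "analog"
        # Passive adapters and the VGA cable itself leave the signal untouched.
    if signal == "analog" and not monitor_accepts_analog:
        return signal + " (monitor can't accept this; an active converter is needed)"
    return signal

# Scenario 1, monitor DVI accepts analog: passive adapter, VGA cable, passive adapter.
print(signal_at_monitor("analog", ["passive", "passive", "passive"], True))   # analog end to end
# Scenario 1, monitor DVI is digital-only: the adapter at the monitor must be active.
print(signal_at_monitor("analog", ["passive", "passive", "active"], False))   # digital at the monitor
# Scenario 3: analog out of the card's DVI-I port, straight into the monitor's VGA input.
print(signal_at_monitor("analog", ["passive", "passive"], True))              # analog end to end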