DVI to VGA adaptor

life24

Senior member
Mar 25, 2014
Hi,
When we use a DVI to VGA adaptor, is the output signal ultimately analog or digital? And does the conversion to VGA cause any loss of quality?
Thanks
 

VirtualLarry

No Lifer
Aug 25, 2001
The most common DVI-to-VGA adapter actually only mechanically splits out the analog signal already present on some of the pins of the DVI-I connector.
 

mikeymikec

Lifer
May 19, 2011
I sometimes wonder whether VGA 1080p is quite as clear as digital 1080p, but I wouldn't want to make that assertion without identical kit to compare side by side, and if there is a difference it can't be that huge IMO.

Elaborating on others' points, a DVI-I output sends out both analog and digital signals, so plugging in a simple DVI -> VGA adapter just connects the analog one. No conversion goes on. The only potential loss therefore is due to the drawbacks of using analog signals such as with VGA cables/monitors.

There is at least one variant of DVI that is digital only: DVI-D. You can't use a standard cheap DVI -> VGA adapter with that because a) the adapter won't fit and b) there's no analog signal coming from the host through that socket.

For more information:
https://en.wikipedia.org/wiki/Digital_Visual_Interface#Connector
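
To make the compatibility logic concrete, here is a rough Python sketch of it. The variant table follows the DVI article linked above; the helper function and its name are just made up for illustration, not any real API:

```python
# Minimal sketch: which DVI variants carry the analog (VGA) signal,
# and whether a cheap passive DVI -> VGA adapter can work on them.
# Variant table per the Wikipedia DVI article; helper is illustrative only.

DVI_VARIANTS = {
    "DVI-A": {"analog": True,  "digital": False},  # analog only (rare)
    "DVI-I": {"analog": True,  "digital": True},   # "integrated": both signals
    "DVI-D": {"analog": False, "digital": True},   # digital only
}

def passive_adapter_works(variant: str) -> bool:
    """A passive DVI -> VGA adapter only re-routes pins, so it helps
    only if the port actually carries the analog signal."""
    return DVI_VARIANTS[variant]["analog"]

if __name__ == "__main__":
    for name in DVI_VARIANTS:
        verdict = "passive adapter OK" if passive_adapter_works(name) else "needs an active converter"
        print(f"{name}: {verdict}")
```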
 

Aikouka

Lifer
Nov 27, 2001
Hi,
When we use a DVI to VGA adaptor, is the output signal ultimately analog or digital? And does the conversion to VGA cause any loss of quality?
Thanks

It will always output as analog, since VGA uses an analog signal. How that happens depends on what you're using. If you're using an actual DVI-to-VGA (digital-to-analog) converter, it may have some impact on quality, but I doubt it would be substantial. If you're using one of the passive adapters that used to be thrown in the box with video cards (like this one), then there's no conversion at all. On a DVI port, you'll see something that looks like a giant plus sign '+'. If the port is analog-capable, it should have a pin in each quadrant of that plus sign. However, most cards have been moving away from being analog-capable. You can see an example of the different connectors here.
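
If you happen to be on Linux/X11 and want a quick hint about what your card exposes, you can also list the connector names the driver reports with xrandr. This is only a rough sketch under that assumption; connector naming varies by driver, so treat the DVI-I/DVI-D hint as a clue, not a guarantee:

```python
# Rough sketch: list output connector names reported by `xrandr --query`
# on Linux/X11. Ports named "DVI-I-*" generally carry the analog pins
# (passive adapter friendly); "DVI-D-*" ports are digital only.
import re
import subprocess

def list_outputs():
    out = subprocess.run(["xrandr", "--query"], capture_output=True, text=True).stdout
    # Output lines look like: "DVI-I-1 connected 1920x1080+0+0 ..."
    pattern = re.compile(r"^(\S+) (connected|disconnected)", re.MULTILINE)
    return pattern.findall(out)

if __name__ == "__main__":
    for name, state in list_outputs():
        hint = ""
        if name.startswith("DVI-I"):
            hint = "  <- likely has the analog pins"
        elif name.startswith("DVI-D"):
            hint = "  <- digital only, a passive adapter won't help"
        print(f"{name}: {state}{hint}")
```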
 

SamMaster

Member
Jun 26, 2010
At work, we have hundreds and hundreds of screens with a turnaround of 8 years for desktop computers, all with a mix of VGA, DVI, HDMI, and DP. We also have setups with projectors and splitters that require VGA.

With that in mind, the answer is: it depends on the quality and age of the cables, the screens, the adapters, the PCs, and the surrounding interference. The difference can range from barely discernible with a quality screen and cable, to looking like washed-out drunken vision on hard drugs at the other end of the spectrum. Analog can be finicky.

Try it yourself, see if you can see or live with the differences or not.
 

Leeea

Diamond Member
Apr 3, 2020
Hi,
When we use a DVI to VGA adaptor, is the output signal ultimately analog or digital? And does the conversion to VGA cause any loss of quality?
Thanks
VGA is always analog.


The best DVI to VGA adaptors are the simple passive ones that just route the analog VGA signal already present on most DVI outputs to a VGA connector. These tend to work very well, with no conversion and no loss of quality.

If you run into the rare DVI port that does not have the analog VGA signal pins present, you might be required to get something more complicated.

Now, on the subject of quality, a DVI/HDMI/DisplayPort device or implementation will frequently be superior to its VGA equivalent, simply because it is typically newer than the VGA device it replaces.