What's the difference between DVI-I, DVI-A, DVI-D, and D-sub?

tigersty1e

Golden Member
Dec 13, 2004
1,963
0
76
15 pin D sub
D sub
DVI-I
DVI-A
DVI-D
Sometimes it says just DVI


I've seen monitors come in all these different flavors for connections. Somebody want to explain the differences? Thanks.
 

vhx

Golden Member
Jul 19, 2006
1,151
0
0
Most video cards (if not all) use DVI-I, which means they carry both a digital and an analog signal. So DVI-D monitors will work on them, and you can convert DVI-I to VGA with an adapter for the analog connection.

I don't think anyone uses DVI-A at all anymore.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
DVI-I transmits both DVI-A (analog) and DVI-D (digital) on the same connector. DVI-A is just VGA in another form factor (electrically the same, as far as I know). DVI-D is the purely digital flavor, the kind LCDs use.

15-pin D-sub (usually just called D-sub) is VGA. It's an analog connection best suited to CRTs, but it can work well on LCDs too. With a poor graphics card output, the VGA signal can degrade considerably, especially on LCDs; hooking a high-res LCD up to a laptop's VGA output can sometimes yield very disappointing results. DVI is generally perfect unless your video card is very weak, in which case noise and jittering can occur, but that's rare. It's most common when the bandwidth requirement (refresh rate * resolution) is high.
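To put a rough number on that "refresh rate * resolution" point: the pixel clock a mode needs is roughly width * height * refresh rate, plus blanking overhead, and single-link DVI tops out at a 165 MHz pixel clock. Here's a minimal sketch; the 20% blanking overhead is my own simplifying assumption (real monitors use CVT/GTF timings, and borderline modes like 1920x1200@60 squeeze under the limit with reduced blanking):

```python
# Rough single-link DVI bandwidth check.
# Assumes ~20% blanking overhead (a simplification, not a real timing formula).
SINGLE_LINK_DVI_MAX_MHZ = 165  # single-link DVI pixel-clock limit

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.2):
    """Approximate pixel clock in MHz for a given display mode."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for mode in [(1280, 1024, 60), (1600, 1200, 60), (2560, 1600, 60)]:
    clock = approx_pixel_clock_mhz(*mode)
    verdict = "fits single-link" if clock <= SINGLE_LINK_DVI_MAX_MHZ else "needs dual-link"
    print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz ~ {clock:.0f} MHz ({verdict})")
```

So a 2560x1600 30" panel is well past what single-link DVI can carry, which is why those monitors require dual-link DVI cables and cards.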

Honestly, Wikipedia has great explanations of the DVI connector types: http://en.wikipedia.org/wiki/DVI-A