D-Sub or DVI connection: which is better?

cirocorby

Member
Apr 21, 2002
Hi, can you visually see a difference between connecting over D-Sub or over DVI-D? My Hitachi LCD monitor has both connections, but my video card only has D-Sub, and I am thinking about upgrading from an Asus Ti-200 card to an ATI Radeon 9700 Pro. Would I see any difference using the DVI connection? Thanks for the input.

ProviaFan

Lifer
Mar 17, 2001
I haven't had a chance to observe an LCD up close for an extended period of time (I do too much Photoshop work to be able to use an LCD, with its inherent color-rendition problems), but in theory you should get better 2D image quality with DVI. With a card that already has good analog 2D image quality (such as a built-by-ATI Radeon 9700 Pro or a Matrox G550), the difference likely wouldn't be worth the money; but since you have an older GeForce (many of which were notorious for bad 2D image quality), it would probably be worth the upgrade if you can afford it.

kylef

Golden Member
Jan 25, 2000
If you have an LCD flat panel that has a DVI input, then you should definitely go that route.

Otherwise, the RAMDAC (digital-to-analog converter) in your video card takes a DIGITAL frame at your desktop's resolution (say, 1024x768), converts that digital frame to an analog scan signal, and sends it over the D-Sub VGA cable; your LCD panel then has to convert this analog signal BACK to a digital signal to display it on the panel.

In the process of converting the signal from digital -> analog -> digital, some signal fidelity is lost.

If you use a DVI connection, the RAMDAC conversion never happens: the LCD panel sees the EXACT same digital representation that your video card produces. So DVI is technically superior in this respect.
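To put a number on that digital -> analog -> digital loss, here's a toy Python sketch (purely illustrative, not any real driver or panel code; the half-LSB of analog noise is an assumed figure standing in for cable and converter error). It pushes one scanline through a simulated VGA round trip and compares it with a bit-exact DVI-style copy:

```python
import random

random.seed(42)

# One scanline's worth of 8-bit pixel values.
frame = [random.randint(0, 255) for _ in range(768)]

def vga_round_trip(pixels, noise_lsb=0.5):
    """RAMDAC -> analog cable -> panel ADC, with an assumed +/-0.5 LSB of noise."""
    out = []
    for p in pixels:
        volts = p / 255.0                                        # DAC: code -> voltage
        volts += random.uniform(-noise_lsb, noise_lsb) / 255.0   # cable/converter noise
        out.append(min(255, max(0, round(volts * 255.0))))       # ADC: voltage -> code
    return out

dvi = list(frame)              # DVI path: the digital frame arrives bit-exact
vga = vga_round_trip(frame)    # VGA path: digital -> analog -> digital

print("DVI pixels changed:", sum(a != b for a, b in zip(frame, dvi)))
print("VGA pixels changed:", sum(a != b for a, b in zip(frame, vga)), "of", len(frame))
```

On a real link the effective noise depends on cable quality and the converters on both ends, which is part of why analog 2D image quality varied so much from card to card.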

Also, in my experience, using the D-Sub input on flat-panel displays leads to "signal jitter": the image is sharp, but the entire picture on the flat panel "jumps" around a bit. It's not really noticeable unless you look at the edges of the display. Most high-quality LCD panels have signal-processing A->D converters that can correct for this jitter, but I have seen it nonetheless. DVI connections do not have this problem.
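For the curious, here's a toy model of where that jitter comes from (illustrative only; real panels recover the pixel clock from H-sync with much more sophistication). If the panel's regenerated sampling clock drifts in phase from frame to frame, a sharp edge in the analog scanline lands on different sample points, so edge pixels flicker:

```python
import random

random.seed(1)

def scanline_voltage(x):
    """Analog luminance along one line: a sharp black-to-white edge at x = 10.0."""
    return 1.0 if x >= 10.0 else 0.0

def sample_line(phase, n=20):
    """Panel ADC: sample n pixels with a given clock-phase offset (in pixels)."""
    return [scanline_voltage(i + phase) for i in range(n)]

# Assumed frame-to-frame phase drift of up to +/-0.4 pixel, for illustration.
for frame_no in range(4):
    phase = random.uniform(-0.4, 0.4)
    line = sample_line(phase)
    edge = next(i for i, v in enumerate(line) if v == 1.0)
    print(f"frame {frame_no}: clock phase {phase:+.2f} px -> edge lands at pixel {edge}")
```

This is essentially what a panel's "auto-adjust" function tries to tune out by picking the best sampling phase, and why the jitter is worse on cheap or badly adjusted panels.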

The ONLY disadvantage to using DVI cables is usually their cost. Anyone disagree?