DVI -> VGA adapter causing high res quality loss?

brinstar117

Senior member
Mar 28, 2001
Good day fellow AnandTech'ers!


I recently repaired my Iiyama 22" Vision Master Pro 510 (22/20" Viewable) CRT and now I have it hooked up to my main computer.

My main computer's video card is an ATI Radeon 8500DV All-In-Wonder. The AIW only has DVI and TV-out. My monitor uses VGA connectors only. So, understandably I must use the DVI to VGA adapter bundled with the video card.

However, at high resolutions such as 1600x1200@32bpp @ 85 Hz and above, the 2D quality is very poor. Text is not sharp and there's a noticeable shimmering effect on the borders of windows (most noticeable on borders, since contrasting colors sit right next to each other).

I tested the very same monitor on another computer of mine with a Voodoo 5 5500 PCI, and when I set the resolution to 1600x1200@32bpp @ 85 Hz the 2D quality was very good. Text was crisp and there was no noticeable shimmering whatsoever.

I used to own a retail Radeon 8500 64MB paired with this same monitor, and I don't recall it having this shimmering problem. However, I used its VGA connector, not the DVI port with the adapter.

What I need are suggestions :)

Either I get a new card (I'd rather not, since the 8500DV serves all my needs except displaying high-res 2D well), or I somehow "fix" the DVI to VGA adapter (or find a better one), or I fall back to an older video card (the V5 PCI) — but I'd rather not do that either, since the V5 is a bit slow for my tastes when it comes to modern gaming.
 

Gosharkss

Senior member
Nov 10, 2000
Anytime you add an adapter in the video path you risk degrading the signal. With any connector there is a possibility of an impedance mismatch that can send ripples (reflections) up and down the video cable. These ripples can be amplified by the video amp and show up as shadows or ghosts after light-to-dark or dark-to-light transitions on the screen, and those shadows make text look out of focus.

It is always best to run video directly from the card to the monitor (no adapters, switch boxes, etc.). I doubt you will find a better adapter, since odds are it is the connector itself that causes the problem.
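For anyone curious how big those reflections can get: analog video is nominally a 75-ohm system, and the fraction of voltage reflected at a discontinuity follows the standard transmission-line formula. A quick illustrative sketch (the 100-ohm figure is just a made-up example of a sloppy adapter, not a measured value):

```python
def reflection_coefficient(z_load, z_line=75.0):
    """Fraction of the incident voltage reflected at an impedance
    discontinuity (e.g. a poorly made adapter or connector) in a
    nominally 75-ohm analog video line."""
    return (z_load - z_line) / (z_load + z_line)

# A perfectly matched 75-ohm path reflects nothing:
print(reflection_coefficient(75.0))             # 0.0

# A hypothetical 100-ohm bump reflects about 14% of the voltage --
# plenty to smear sharp light/dark edges at 1600x1200 pixel clocks:
print(round(reflection_coefficient(100.0), 3))  # 0.143
```

At 1600x1200@85 Hz the pixel clock is over 200 MHz, so even a small reflection arriving a few pixels later shows up as visible ghosting on high-contrast edges like window borders.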

Unfortunately, if you have a card with DVI-I only, you are forced to use a pin-to-pin adapter.
 

jarsoffart

Golden Member
Jan 11, 2002
I don't see why the signal quality would degrade. For a VGA port, the video card's DAC converts the signal from digital to analog; on the DVI port's digital pins, it just stays digital. But a DVI-I connector also carries the analog RGB signals on separate pins, so the adapter simply routes those analog pins to the VGA connector — the same thing that happens with a native VGA port. That's unlike the case where the card converts digital to analog and some LCD monitor converts it back to digital again. I guess it could happen if the adapter is poor quality.