DVI and "sharpness" of 2D display

programmer

Senior member
Mar 12, 2003
Back in the day of analog VGA connections, people generally preferred ATI-based cards over nVidia chipsets for their sharper 2D output.

With DVI, will all GPUs have the same quality of 2D? Or is there still a difference?

[edit: in general, I mean among the low-end solutions: ATI X300/X550/X600, or nVidia 6200/6600]
 

Peter

Elite Member
Oct 15, 1999
Pixel data are transmitted digitally, so a DVI link either works 100% or is completely unusable - there is no in-between where image quality can differ from card to card, although some graphics card and cable vendors will try to tell you otherwise.
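To illustrate the point, here is a quick Python sketch - all toy numbers, nothing vendor-specific; the send_dvi helper is made up, and real TMDS encoding is more involved. The 8-bit codes either arrive exactly as sent, or a failing link corrupts a bit and you get an obvious glitch, never a subtle loss of sharpness.

# Toy model of a digital (DVI-style) link: all-or-nothing behaviour.

def send_dvi(pixels, corrupt_index=None, bit=7):
    """Pass 8-bit pixel codes through unchanged, optionally flipping one bit."""
    received = list(pixels)
    if corrupt_index is not None:
        received[corrupt_index] ^= 1 << bit   # a gross, clearly visible error
    return received

scanline = [0, 255, 0, 255, 0, 255]            # 1-pixel black/white pattern

print(send_dvi(scanline))                      # bit-exact: [0, 255, 0, 255, 0, 255]
print(send_dvi(scanline, corrupt_index=2))     # obvious glitch: [0, 255, 128, 255, 0, 255]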
 

Insomniak

Banned
Sep 11, 2003
DVI is nice because the path from GPU to cable to monitor is digital - digital - digital. With analog VGA into an LCD, it's digital - analog - digital: the card's RAMDAC converts to analog and the monitor's ADC converts back.

To this day, the argument continues over whether color reproduction and clarity are lost when the signal is converted to analog and then back to digital again.
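Just to show the kind of loss people argue about, here's a rough Python sketch of that VGA round trip: 8-bit code to a roughly 0.7 V analog level, a slightly band-limited cable, then the monitor's ADC. The dac/cable/adc helpers and the 0.15 "bleed" figure are made up for illustration, not real hardware numbers.

def dac(codes, full_scale=0.7):
    # Card's RAMDAC: 8-bit code -> analog voltage (VGA video is roughly 0-0.7 V)
    return [c / 255 * full_scale for c in codes]

def cable(voltages, bleed=0.15):
    # Crude low-pass: each sample picks up a fraction of its neighbours,
    # standing in for limited bandwidth in the cable and connectors.
    out = []
    for i, v in enumerate(voltages):
        prev = voltages[i - 1] if i > 0 else v
        nxt = voltages[i + 1] if i < len(voltages) - 1 else v
        out.append((1 - 2 * bleed) * v + bleed * (prev + nxt))
    return out

def adc(voltages, full_scale=0.7):
    # Monitor's ADC: sample the analog level back to an 8-bit code
    return [round(min(max(v / full_scale, 0.0), 1.0) * 255) for v in voltages]

text_edge = [0, 255, 0, 255, 0, 255]   # alternating black/white columns, like small text
print(adc(cable(dac(text_edge))))      # the 0/255 swings shrink toward grey - the "soft" look

Over DVI those 0 and 255 codes would arrive exactly as sent.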

For my two cents, every DVI connection I have seen is MUCH sharper than VGA, not just in 2D, but in all apps. Take that as you will.
 

programmer

Senior member
Mar 12, 2003
Thanks for the replies. I had been leaning towards ATI based on past experience, but was considering an nVidia 6200. As it turns out, though, I can find only one card that fits all my criteria: DVI, fanless, PCIe, low profile, *and* comes with the low-profile bracket -- the ABIT RX300SE 256HM ($60 at Newegg).

Which raises the question -- why don't low-profile cards typically include a low-profile bracket?!

EDIT: Now I'm not so sure. The pic at Newegg shows the low-profile bracket among the card accessories, but the info on Abit's site doesn't mention it. Argh!