
DVI and "sharpness" of 2D display

programmer

Senior member
Back in the day of VGA analog connections, people generally preferred ATI-based video for sharper 2D over nVidia chipsets.

With DVI, will all GPUs have the same quality of 2D? Or is there still a difference?

[edit: in general, I mean from the low end solutions: ATI X300/X550/X600, or nVidia 6200/6600]
 
Pixel data are transmitted digitally. A digital link either works 100% or is completely unusable; there is no in-between where one card's output looks sharper than another's, although some graphics card and cable vendors will try to tell you otherwise.
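To picture the "works 100% or completely unusable" point, here's a minimal sketch (a toy noise model, not real TMDS signaling): each bit is sent as a voltage and the receiver simply thresholds it, so as long as noise stays under the margin every bit comes back exact, and once it exceeds the margin the link just breaks.

```python
# Sketch: why a digital link is all-or-nothing (hypothetical noise model).
# Each pixel bit is sent as a voltage: 0 -> 0.0 V, 1 -> 1.0 V.
# The receiver thresholds at 0.5 V. Below that margin every bit, and so
# every pixel, is recovered exactly; there is no "slightly softer" image.

def send_digital(bits, noise):
    """Transmit bits as voltages with a fixed noise offset, then threshold."""
    received = []
    for b in bits:
        voltage = float(b) + noise          # ideal level plus noise
        received.append(1 if voltage > 0.5 else 0)
    return received

bits = [1, 0, 1, 1, 0, 0, 1, 0]

print(send_digital(bits, 0.3))   # noise under margin: bits come back exact
print(send_digital(bits, 0.6))   # noise past margin: link is simply broken
```

With 0.3 V of noise the received bits match the sent bits exactly; at 0.6 V the threshold is swamped and every bit reads as 1, i.e. the link fails outright rather than degrading gracefully.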
 
DVI is nice because the path from GPU to cable to monitor is digital - digital - digital. With VGA analog, it's digital - analog - digital.

To this day, the argument continues over whether color reproduction and clarity are lost in the conversion of the signal to analog and then back to digital again.
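A rough sketch of why that conversion can cost something (illustrative numbers only, not real DVI/VGA electrical specs): on the DVI path an 8-bit pixel value arrives unchanged, while on the VGA path it goes through a DAC, picks up a little cable noise, and gets re-quantized by the monitor's ADC, so small errors can creep in.

```python
# Sketch of the two signal paths (hypothetical noise figure, not a real spec).
# DVI: 8-bit pixel values stay digital end to end and arrive bit-for-bit.
# VGA to an LCD: pixel -> analog voltage (DAC) -> noisy cable -> ADC.

def vga_roundtrip(pixels, noise):
    """8-bit pixel -> analog voltage (0..1 V) -> noisy cable -> back to 8-bit."""
    out = []
    for p in pixels:
        volts = p / 255.0 + noise                         # DAC plus cable noise
        out.append(max(0, min(255, round(volts * 255))))  # ADC re-quantizes
    return out

pixels = [0, 64, 128, 200, 255]

print(pixels)                        # DVI path: delivered unchanged
print(vga_roundtrip(pixels, 0.01))  # VGA path: values drift by a few counts
```

Even a small noise offset shifts the re-digitized values by a few counts, which is the kind of subtle softening people argue about; the digital path sidesteps the question entirely.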

For my two cents, every DVI connection I have seen is MUCH sharper than VGA, not just in 2D, but in all apps. Take that as you will.
 
Thx for the replies. I had been leaning towards ATI due to experience, but was considering an nVidia 6200. As it turns out, though, I can find only one card which fits all my criteria: DVI, fanless, PCIe, low profile, *and* comes with the low profile bracket -- the ABIT RX300SE 256HM ($60 at newegg).

Which raises the question -- why don't low profile cards typically include a low profile bracket?!

EDIT: Now I am not so sure. The pic at newegg shows the low profile bracket with the card accessories, but the info at Abit's site doesn't mention it. argh!
 