
DVI Specification Compliance by ATI and Nvidia

easy123

Member
I seem to remember that ATI implemented the specification perfectly on the
9800 Pro, while Nvidia failed to do so, notably on the 5200 and other models.
As a user of a flat panel at 1600 x 1200 native via DVI, should I weigh this
in my decision of which brand to go with, or has Nvidia corrected its non-compliance
issues with its latest and greatest cards?

I would appreciate this forum's input. Thanks.
 
The "compliance to standards" for DVI is more up to the OEM than the chip manufacturer; if you avoid the brands that use crap color replication filters, you should be fine. (Abit, Gigabyte, Albatron, Asus, MSI, and Leadtek are all good; AVOID Evga and Jetway.)

Leadtek has the highest quality in both VGA and DVI, hands down.
 
On DVI, where the signal path is digital, there is no "crap color replication filter" component issue.
 
Originally posted by: Peter
On DVI, where the signal path is digital, there is no "crap color replication filter" component issue.

Well, there's something Evga and Jetway manage to do to degrade the IQ; I've seen it with my own eyes.
 
Thanks Peter and Acanthus,

Speed notwithstanding, it would seem to me that if the DVI standard were not implemented
correctly, some of the new and improved "eye candy" offered by Nvidia might not
make it to the screen properly.
I am not worried about ATI, because they have shown perfect compliance in the past.
My problem is, I can't find information that Nvidia has improved or ever achieved compliance.
 
As long as you don't get any more precise about exactly where the violation is in NVidia, I'm not quite willing to believe any of it. Particularly since I've seen enough NVidia cards work just fine with all sorts of DVI gear - including 1920x1200 displays from Sony and Apple.
 
So they can only see these problems using this tool? Or is this an actual issue people face? Because I've never heard of a problem like this before.

This article also mentions filters, which, as Peter said, only matter for an analog signal, not a digital one.
 
Acanthus,

The article states "Being out of compliance substantially increases the risk that the overall image quality may be poor, but it isn't a guarantee it will. Still, it's something to think about as you make your graphics card-buying decision."

After spending a lot of money buying my flat-panel (native 1600x1200), then
contemplating spending $300-500 more on the latest graphic card, I don't feel
like gambling on Nvidia. I believe my money is more safely spent on ATI.

I was hoping there might be someone here in this forum that could show me
Nvidia is now making their cards compliant with the DVI standards and then
I would feel like I had a choice between Nvidia and ATI.

Maybe someone still will.
 
It's not Nvidia's fault if the makers of the card don't follow the reference design and instead use cheaper components so they can make more profit.
 
Ouch. That analysis is rather consistent in that NVidia's chip-internal transmitters look rather bad in signal quality, while ATi's internal ones, as well as everyone's discrete transmitter chips, are somewhere between good enough and really good.

This defeats Viper's point ... if you're just using what the NVidia chip gives you, it'll be no good. Use an external transmitter chip from a company that got it right, and you're there. ATi, on the other hand, demonstrates that a TMDS transmitter inside the main graphics chip can be done properly.
 
Originally posted by: Acanthus
So they can only see these problems using this tool? Or is this an actual issue people face? Because I've never heard of a problem like this before.

This article also mentions filters, which, as Peter said, only matter for an analog signal, not a digital one.

That eye diagram method is a neat way to pinpoint the quality of a differential-pair digital transmission. The worse the margins are, the more likely it is you'll get data corruption at certain cable lengths - not necessarily with long cables, but at any random cable length that happens to bring the "wrong" capacitance into the game.

Filters? Well yes, there are. But they don't affect image quality like they do on an analog VGA line. There, you'd see poor contrast, washed-out edges, general blurriness, low color gamut, etc. On a digital TMDS line, it's about shaping the digital signal as well as possible so the pixel data reach their destination. Effects of getting that wrong are not in the area of image quality, but rather on the not-working-at-all side of things.
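To put numbers on why those eye margins get tight: here's a back-of-the-envelope sketch for the 1600x1200 panel discussed in this thread, assuming the standard VESA DMT timing for that mode (2160 x 1250 total including blanking, 60 Hz refresh) and TMDS's 10 bits per 8-bit color component per clock on each differential pair.

```python
# Back-of-the-envelope: TMDS link speed for 1600x1200 @ 60 Hz over single-link DVI.
# Timing numbers assume the VESA DMT mode (totals include blanking intervals).
H_TOTAL, V_TOTAL, REFRESH = 2160, 1250, 60

pixel_clock = H_TOTAL * V_TOTAL * REFRESH  # Hz -> 162 MHz
bit_rate = pixel_clock * 10                # TMDS sends 10 bits per pixel clock, per pair
ui_ps = 1e12 / bit_rate                    # unit interval: one bit period, in picoseconds

print(f"pixel clock: {pixel_clock/1e6:.0f} MHz")
print(f"per-pair bit rate: {bit_rate/1e9:.2f} Gbit/s")
print(f"bit period: {ui_ps:.0f} ps")
```

That's 1.62 Gbit/s on each of the three data pairs, with each bit lasting only about 617 picoseconds, which is close to the 165 MHz single-link DVI ceiling. At those speeds, a transmitter with a marginal eye leaves very little room for cable capacitance before bits start getting corrupted, which is exactly the "works with one cable, fails with another" behavior described above.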
 
Both the NVidia reference card and the MSI use an external transmitter chip. (And yes, poor power stability on the card does contribute. This might be part of why it's often bad with NVidia chips; perhaps it's attributable to their power hunger, paired with not-really-good power distribution and stabilization in the card design.)
 