interesting DVI comparison between nvidia and ati chips

MercenaryForHire

Jan 31, 2002
40,819
2
0
Another hardware site (Extremetech?) did this better a while ago, and with 90% Less FUD.

Proper Conclusion: "You mean that different manufacturers have different standards of quality? No sh!t!"

THG Conclusion: "OMG, TEH NVEEDIA IS SUX0R!"

:p

- M4H
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
extremetech

Actually, ExtremeTech trashed the NVIDIA cards more...

Not a GPU problem, but poor component choices by the card manufacturers.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: MercenaryForHire
Another hardware site (Extremetech?) did this better a while ago, and with 90% Less FUD.

Proper Conclusion: "You mean that different manufacturers have different standards of quality? No sh!t!"

THG Conclusion: "OMG, TEH NVEEDIA IS SUX0R!"

:p

- M4H

Conclusion from the THG article:

The result of our DVI compliance test is positive across the board, with all six cards reaching DVI compliance. However, while the three ATI based cards provided by ABIT and ATi turned in exemplary results, MSI's NVIDIA based cards are only able to reach DVI compliance in UXGA at a reduced frequency of 141 MHz and using a reduced blanking interval. This greatly limits the NVIDIA cards' "upward mobility" - since they don't have enough reserves for TFT displays with higher native resolutions than UXGA (1600x1200). The MSI NX6800 card only reached compliance at 162MHz when a separate TMDS transmitter chip was used. Counting these results, it seems that ATi's integrated TMDS transmitter is superior to NVIDIA's implementation. Yet the MSI cards' eye diagrams displayed a turbulent distribution of the data even when the SiL 164 TMDS transmitter was used. This, in turn, limits the maximum usable cable length, especially when cheaper cables are used.

Conclusion from the ExtremeTech article:

What's clear from our testing is that ATI has developed a robust internal DVI transmitter. Even the failure we noted with the Tyan board was borderline - and occurred at 162MHz. nVidia-based boards are considerably more problematic. We're not sure why, but in our conversations with Silicon Image, they've indicated that circuit board design issues may be a factor. That, and the use of lower-cost filters and capacitors on some boards, may come into play. Some manufacturers may be making a calculated decision that no one would attach a $1,200 flat panel to a $60 graphics card. That may be a faulty assumption. Corporate IT departments may not want to invest big bucks in the graphics hardware, but users may want - and buy - high resolution displays. We're hoping the manufacturers of nVidia-based solutions get the message: DVI compliance will become more important as UXGA panels become more common. As we've seen from the nVidia reference boards, this is not an insurmountable problem.

Yeah, clearly THG is nuts. :disgust:

Poking fun at THG is definitely the 'in' thing around AT, but they're hardly one-sided idiots. The 6800 they tested, whether by fault in the reference design or cheapness on the part of MSI, was not up to snuff in terms of DVI compliance. Extremetech found much the same thing with various GeForceFX boards.
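
For context on where those MHz figures come from, here's a rough back-of-the-envelope calculation (the blanking totals are approximations, so treat the results as ballpark numbers):

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    # Pixel clock needed to scan h_total x v_total pixels, refresh_hz times per second.
    return h_total * v_total * refresh_hz / 1e6

# 1600x1200 @ 60Hz with conventional CRT-style blanking (~2160 x 1250 total pixels):
print(pixel_clock_mhz(2160, 1250, 60))   # ~162 MHz - right at the single-link DVI ceiling

# The same mode with a reduced blanking interval (~1760 x 1235 total pixels):
print(pixel_clock_mhz(1760, 1235, 60))   # ~130 MHz - fits under the 141 MHz the MSI cards managed

So a transmitter that tops out at 141 MHz can still do UXGA at 60Hz with reduced blanking, but it has nothing left over for panels above 1600x1200 - which is exactly the "upward mobility" point THG makes.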
 
MercenaryForHire

Jan 31, 2002
40,819
2
0
Originally posted by: Matthias99
Poking fun at THG is definitely the 'in' thing around AT, but they're hardly one-sided idiots. The 6800 they tested, whether by fault in the reference design or cheapness on the part of MSI, was not up to snuff in terms of DVI compliance. Extremetech found much the same thing with various GeForceFX boards.

Yeah, ET found that the reference FX5700, FX5900U, and Quadro FX2000 all passed the tests fine. The other manufacturers had a bit more trouble. :p

- M4H
 

Pete

Diamond Member
Oct 10, 1999
4,953
0
0
Er, that ET article shows all FX cards (borderline) failing with the internal TMDS, and usually at <=141MHz. THG seems to show the same thing. nV's internal TMDS transmitter appears to be generally subpar, their external SI transmitter appears fine, and ATi's internal transmitters appear "exemplary." Not sure if this will affect most people (who are probably not running 16x12+ LCDs via DVI), but two sites appear to have reached the same conclusion.

I don't see the FUD, M4H.
 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Whatever the emotions, from professional experience (including signal analysis with expensive engineering tools) I must second that NVidia's chip internal TMDS transmitter is impressively far from good. Obvious signal quality degradation starts at 1280x1024, and I've even seen visible signalling issues at 1024x768 (on an AOpen FX5200). Don't even think about 16x12 let alone 19x12. On the other hand, I've seen nothing but rock solid DVI from Radeon cards.
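
To put some rough numbers on that (a sketch using approximate standard pixel clocks; TMDS sends a 10-bit symbol per pixel clock on each of the three data pairs):

def tmds_gbps_per_pair(pixel_clock_mhz):
    # Approximate TMDS bit rate per differential pair: 10 bits per pixel clock.
    return pixel_clock_mhz * 10 / 1000.0

print(tmds_gbps_per_pair(108))   # 1280x1024 @ 60Hz -> ~1.08 Gbit/s per pair
print(tmds_gbps_per_pair(162))   # 1600x1200 @ 60Hz -> ~1.62 Gbit/s per pair

Gigabit-per-second signalling over a cheap cable is exactly where a marginal transmitter and sloppy board layout start to show.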
 
darkswordsman

Mar 11, 2004
23,444
5,849
146
Well, this is all fine and good, but the real question is: does it actually matter? I find the quality and color from my 6800GT on my 2001FP to be very good, so I couldn't care less what some tests found. I doubt an ATi card would really look much different; in fact, I'd probably find it worse given the lack of DVC.

 

Peter

Elite Member
Oct 15, 1999
9,640
1
0
Does your 6800GT actually use the main NVidia chip's TMDS transmitter, or is there a discrete unit on the board? If the latter, then no wonder it's working for you ;)
Unlike with analog VGA, failing to give digital signal quality (TMDS in this case) proper attention in chip and board design will not degrade image quality. Digital signalling simply stops working altogether once the signal disappears into the noise - and until it does, the data get there intact and the image is not impaired at all. When you're right on the edge of working (like that AOpen card I mentioned above), you'll get brief dropouts of the entire image until the next frame comes through cleanly again.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Originally posted by: gururu
extremetech

Actually, ExtremeTech trashed the NVIDIA cards more...

Not a GPU problem, but poor component choices by the card manufacturers.
But the TMDS transmitter is integrated in both the NVIDIA and ATi chips, so no external hardware comes into play for the primary DVI connection. Only on the second DVI connection does the possibility of crappy components come into play.

The results are pretty bad in my opinion. NVidia chips aren't able to deliver more than about 73 Hz at 1600x1200. That's not much of a problem as it stands, but there are newfangled LCDs coming out with 8 ms response times, which means there might be UXGA LCDs that can exceed 75 Hz in a year or two.
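
Back-of-the-envelope (using approximate reduced-blanking totals of about 1760 x 1235 pixels per 1600x1200 frame; the exact numbers shift a little depending on the blanking you assume):

def max_refresh_hz(pixel_clock_mhz, h_total=1760, v_total=1235):
    # Rough refresh-rate ceiling for a given pixel clock and frame size.
    return pixel_clock_mhz * 1e6 / (h_total * v_total)

print(max_refresh_hz(141))   # ~65 Hz at the 141 MHz the MSI boards passed at
print(max_refresh_hz(165))   # ~76 Hz at the full single-link DVI limit

So even a fully compliant single-link DVI port doesn't have much headroom above 75 Hz at UXGA, and a transmitter that only manages 141 MHz has essentially none.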

 

ZombieJesus

Member
Feb 12, 2004
170
0
0
Yeah, NVIDIA often turns out to have transmitters that aren't as good as ATI's, but different board manufacturers may use different transmitters (I hope) - after all, MSI is always cheap garbage.
 
MercenaryForHire

Jan 31, 2002
40,819
2
0
Originally posted by: Pete
Er, that ET article shows all FX cards (borderline) failing with the internal TMDS, and usually at <=141MHz. THG seems to show the same thing. nV's internal TMDS transmitter appears to be generally subpar, their external SI transmitter appears fine, and ATi's internal transmitters appear "exemplary." Not sure if this will affect most people (who are probably not running 16x12+ LCDs via DVI), but two sites appear to have reached the same conclusion.

I don't see the FUD, M4H.

All FX cards, you say?

nVidia GeForceFX 5900 Ultra Reference Board
Result: Passes at 162MHz.

nVidia GeForceFX 5700 Reference Board
Result: Passes at 141MHz.

nVidia Quadro FX 2000 Professional Graphics Board
Result: Both Quadro FX 2000 ports pass at 162MHz.

Funny, all the nVidia reference boards pass just fine. :p

- M4H
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: MercenaryForHire

Funny, all the nVidia reference boards pass just fine. :p

- M4H

Of course, you can't *buy* NVIDIA reference boards... and all the third-party ones failed. And one of the reference boards only passed with a reduced blanking interval (141MHz instead of 162MHz bandwidth). But I'm sure that's just a fluke. :p
 

Slaimus

Senior member
Sep 24, 2000
985
0
76
This is probably why no card manufacturers used NVidia's internal transmitter for the GF2-GF4 generations.
And a high-res DVI shootout without at least one Matrox card just isn't right. Those are the cards that people with those expensive displays actually use.

Has anyone seen those IBM displays with so many pixels that they require 2 DVI inputs to drive the whole thing?
 
darkswordsman

Mar 11, 2004
23,444
5,849
146
Well, I have a Gainward, so it's very possible that they used different transmitters than nVidia does.

Even so, there are plenty of people using a GT from another company and the 2001FP, and I haven't heard them saying they've been having a problem with it.

I'm not trying to say that what was found was wrong, but it doesn't appear to be affecting anyone that I can tell. I could understand if there were people having problems, like say when using those 23" widescreens from Apple. If I remember correctly, people were having problems with those displays and ATi cards, but nVidia ones seemed to be fine.

I can understand there being a problem with workstation cards, as those are expensive cards meant for business use, so they had better work. And if people were actually having issues at higher resolutions with nVidia cards, then I'd be all for nVidia answering for it, but it just doesn't look like that's the situation.

It's an interesting find, and it's good that it points out nVidia using weak components (which would hopefully get them to change). Personally, I would like more in-depth reviews of graphics cards: test non-gaming image quality, make sure the card is drawing power like it should, test the other components on the card, and don't accept an "it'll get fixed later" answer (maybe make some negative comments about a missing feature, like the video processor, for instance).

I don't know if what I wrote makes sense, and I was trying not to aggravate the situation.
 

gururu

Platinum Member
Jul 16, 2002
2,402
0
0
I agree, darkswordsman.

I think it would be great if more thorough reviews were done as well. I think we get so excited about 3D speed that we take all the other features for granted. Video performance, 2D quality, S-video out quality, DVI output quality, heat generation, loudness, and other things should all be known. I guess these things often change with driver releases, making them hard to track, but with DVI output like this review covers, it's clear that some companies either have better quality control or use much better components.