Testing Nvidia vs. AMD Image Quality

Page 7

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
I noticed a difference in 2D image quality going from an nVidia 7600GS to an ATi 4870. Nice and crisp image on the ATI card.

Yea, when I jumped from a GeForce 7900 to a Radeon 2900 I noticed a significant improvement. I had almost always used Nvidia cards up to that point, so I thought what I saw was how things were. Once I went to the 2900 I found how much better and crisper things could actually be.

But, I think this was fairly well known at the time, that the GeForce cards before the 8xxx were all of lower image quality than AMD's cards. I believe after the 8xxx things were pretty even.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Yeah. I think it's being blown way out of proportion. Both companies do it. But how many are attempts to cheat versus legitimate attempts to optimize speed without hurting image quality (even if those optimizations sometimes aren't as image-quality neutral as hoped)? I'm willing to give both sides the benefit of the doubt. Certain posters (who coincidentally are also given free hardware by nVidia) seem quicker to accuse ATI of cheating.

From what I've seen so far a few older games slipped through the cracks during a large overhaul of how these sorts of quality<->performance tradeoffs are presented to the user on ATI's side. If you disagree with the default choices ATI has made, bump the quality up or down a notch in the driver control panel and you'll be happy.

On nVidia's side, they made a silent change in the driver which degraded the image quality of TrAA. Can you undo this "optimization" like you can with ATI cards? Maybe, if you download a third-party app and know where to look to hack it back in.

An honest thread on AMD vs NV image quality would discuss both problems. Instead, it looks like NV is trying to get reviewers to disable IQ-neutral optimizations on ATI's cards to make the competition more "fair".

I own a GTX 275 and like it, even if it's getting a bit old. I just wish NV would focus on making their product better rather than whining about their competition doing exactly what NV themselves do.


Can you personally offer some examples with your GTX 275? With Fermi, there are some changes with transparency multi-sampling because it has the ability to work in conjunction with CSAA.
 

peonyu

Platinum Member
Mar 12, 2003
2,038
23
81
It would help if brands were listed for 2D quality, too. If it's a non-reference card, it's possible the third party was being cheap and cutting corners, whereas the reference design has the good 2D quality left intact.

The reference Nvidia cards I have used have always been as good as the ATI/AMD ones [not counting GeForce 3 and earlier products, which did have poor 2D]. But maybe my eyes are going bad... I'm using a Radeon card right now but plan on getting a faster card soon. If anyone has in-depth comparisons of 2D quality between ATI and Nvidia, and not just opinions on it, that would be nice to see.

Also take a look at GTX 460 prices on Newegg. Some of the cards listed have the exact same specs as each other, yet there is a price difference of $50 or more... Perhaps it is those cheapo cards that took a hit to their 2D to make them that cheap, through a shoddy RAMDAC or capacitors, etc. [/conspiracytheory]
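For anyone who wants data rather than opinions, here is a minimal sketch of how you could quantify 2D output differences yourself: capture the same test image losslessly on each card (same monitor, same resolution, digital connection) and diff the captures. The file names are placeholders, and this assumes Pillow and NumPy are installed:

```python
# Hypothetical sketch: quantify 2D output differences between two cards
# by diffing lossless screenshots of the same test image captured on each.
import numpy as np
from PIL import Image

def compare_captures(path_a, path_b):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    assert a.shape == b.shape, "captures must share resolution"
    diff = np.abs(a - b)
    mse = float(np.mean(diff.astype(np.float64) ** 2))
    psnr = float("inf") if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)
    changed = float(np.mean(np.any(diff > 0, axis=-1)) * 100)
    print(f"pixels differing: {changed:.2f}%  MSE: {mse:.3f}  PSNR: {psnr:.1f} dB")

# File names are placeholders for your own captures.
compare_captures("radeon_capture.png", "geforce_capture.png")
```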
 

KCfromNC

Senior member
Mar 17, 2007
208
0
76
Can you personally offer some examples with your GTX 275? With Fermi, there are some changes with transparency multi-sampling because it has the ability to work in conjunction with CSAA.

Nope, I haven't looked for it in person. Did you find anything wrong with the data posted earlier in this thread?
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Yea, when I jumped from a GeForce 7900 to a Radeon 2900 I noticed a significant improvement. I had almost always used Nvidia cards up to that point, so I thought what I saw was how things were. Once I went to the 2900 I found how much better and crisper things could actually be.

But, I think this was fairly well known at the time, that the GeForce cards before the 8xxx were all of lower image quality than AMD's cards. I believe after the 8xxx things were pretty even.

Yeah, my 7800 GTX had major shimmering... I remember those days. My 8800 GTS 320MB was much improved.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Nope, I haven't looked for it in person. Did you find anything wrong with the data posted earlier in this thread?

From the Guru3d thread:

or this lame trick in nv control panel: transparency antialiasing - multisampling mode with older gpus G80, g92, g200.. with new 2xx.xx drivers it forces them to use some weak supersampling method instead of true multisampling like with older drivers, but Fermi g100, 104, 110 is allowed and is still using this normal multisampling method.



I don't understand. Transparency multi-sampling is a very small performance hit, and yet this swaps in a weak super-sampled method, when super-sampling is much more of a performance hit. What would be nice is for someone to offer examples with performance, too. If this is an optimization to improve frame-rate, it's very odd -- the performance hit of transparency multi-sampling is so small, and always has been.
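To put rough numbers on why that swap is odd, here is a back-of-envelope sketch. The sample count and alpha-coverage fraction are assumed for illustration, not measured: multisampled transparency shades alpha-tested fragments once per pixel, while supersampled transparency shades them once per sample.

```python
# Rough cost sketch (assumed numbers, not measured data): why transparency
# supersampling costs more than transparency multisampling.
samples = 4          # 4x AA (assumed)
alpha_frac = 0.15    # assumed share of shaded pixels covered by alpha-tested geometry

# TrMSAA: alpha-tested fragments are still shaded once per pixel; coverage
# is derived from alpha, so the extra work is nearly free.
trmsaa = 1.0

# TrSSAA: alpha-tested fragments are shaded once per *sample*.
trssaa = (1 - alpha_frac) * 1 + alpha_frac * samples

print(f"relative shading work -- TrMSAA: {trmsaa:.2f}x, TrSSAA: {trssaa:.2f}x")
# -> 1.00x vs 1.45x under these assumptions: the supersampled path does
#    noticeably more work, which is why forcing it in place of TrMSAA is
#    such a strange "optimization".
```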
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
That is odd. Digital 2D output is Digital 2D output. As long as you're using DVI or HDMI and not Analog or any DVI to RGB adapters.

It's not odd at all. Just because the signal is digital, that does not mean the output is the same. The output can and does differ from what a perfectly represented reference image should be. And like I said, anyone that has actually compared the cards on a high end monitor will notice the difference within a few seconds.

AnandThenMan has a good point though. "Even though the difference isn't mind blowing, it's subtle but noticeable, and once you see it you will not want to go back" applies perfectly to this thread topic. If AMD cuts subtle corners lowering image quality, and you look closely and see this, you may not want to go back to them now that you know they are doing this. It makes sense. Good comparison, ATM.
You're comparing apples to oranges. Plus, AMD has the potential to improve the image quality in drivers, or you can change the driver settings to your liking. The Nvidia hardware is limited in the quality of the output in hardware and it cannot be brought up to the level of the Radeon cards without a hardware redesign.

This has been going on for a very long time actually, since ATI started making cards. Matrox has always been very very strong in image quality, perhaps even better than ATI. The output stages of the hardware do not change all that much over the years, so the tradition continues.
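One concrete way a fully digital path can still alter output, offered purely as an illustration of the principle and not a claim about any specific card: a full-range to limited-range RGB conversion and back quantizes away gray levels even though every bit arrives intact.

```python
# Illustrative sketch: a bit-exact digital link can still carry degraded
# pixels. Example mechanism: full-range (0-255) -> limited-range (16-235)
# RGB conversion and back, which collapses distinct levels together.
def full_to_limited(v):
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):
    return round((v - 16) * 255 / (235 - 16))

survivors = {limited_to_full(full_to_limited(v)) for v in range(256)}
print(f"distinct levels after round trip: {len(survivors)} of 256")
# -> ~220 of 256: the transport is perfectly "digital", yet the image has
#    visibly fewer gray levels than the reference.
```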
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The output stages of the hardware do not change all that much over the years, so the tradition continues.

Not sure about that. I mentioned already in this thread that my GTX 470 was vastly superior to the 8800 GTS 320MB in 2D image quality (text specifically).
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
It's not odd at all. Just because the signal is digital, that does not mean the output is the same. The output can and does differ from what a perfectly represented reference image should be. And like I said, anyone that has actually compared the cards on a high end monitor will notice the difference within a few seconds.

You're comparing apples to oranges. Plus, AMD has the potential to improve the image quality in drivers, or you can change the driver settings to your liking. The Nvidia hardware is limited in the quality of the output in hardware and it cannot be brought up to the level of the Radeon cards without a hardware redesign.

This has been going on for a very long time actually, since ATI started making cards. Matrox has always been very very strong in image quality, perhaps even better than ATI. The output stages of the hardware do not change all that much over the years, so the tradition continues.

Ok, now you're just making stuff up. AMD can improve, but Nvidia is limited and cannot. Okey Dokey.
Your argument for 2D clarity is equivalent to the 3D argument about lower image quality for AMD compared to Nvidia using the 10.10 and 10.11 drivers for the latest round of benches. There isn't really any way around that.
As I was reading your post, I was like, "How can he say it matters here, but doesn't matter there? Not apples to apples? Apples to oranges? WTF?"

See what I mean?
 
Darkswordsman

Mar 11, 2004
23,444
5,852
146
Ok, now you're just making stuff up. AMD can improve, but Nvidia is limited and cannot. Okey Dokey.
Your argument for 2D clarity is equivalent to the 3D argument about lower image quality for AMD compared to Nvidia using the 10.10 and 10.11 drivers for the latest round of benches. There isn't really any way around that.
As I was reading your post, I was like, "How can he say it matters here, but doesn't matter there? Not apples to apples? Apples to oranges? WTF?"

See what I mean?

Don't be so willfully ignorant. He's saying the reason the Nvidia cards were worse (in 2D image quality) is the hardware they used. That's not something that can be magically fixed via software. Image quality set by software optimizations, however, can be changed. In fact, you can just change the settings yourself to make this a non-issue, right now.

See what people mean?
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
TH did an article that exposed all sorts of 2D problems in the 5-series cards last year, which I think AMD eventually fixed through drivers.
http://www.tomshardware.com/reviews/2d-windows-gdi,2539.html
We wanted to stay away from emotional overtones in this article, even though devoted occupants of the red or green camps might need to rub their eyes as they read through this material. Because we ourselves didn't want to believe the results of our own tests, we took extra time and care in this story's preparation, in the interests of all affected parties, to produce results that are as objective and defensible as possible. We also worked hard to create the most objective possible bases for comparing graphics cards against one another. We will also avoid pointing fingers: rather, it's important to understand this article as a contribution and an aid to those users who not only use their PCs for gaming, but also for those who use their PCs to get real work done.
In this context, it's important to observe that, currently, it can be quite vexing to work productively with 2D graphics in Windows 7. For example, using a Radeon HD 5870 and the latest drivers, we found it difficult to produce simple vector-based graphics, to render simple or complex CAD designs, or even to play 2D games in higher graphics quality modes. We mention this not as a criticism, but instead as an approach to a definite problem that we sought to analyze and understand as fully as possible.
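For context, the kind of GDI micro-benchmark behind articles like that one can be sketched in a few lines. This is an illustration of the methodology only, not TH's actual test harness; it is Windows-only and assumes pywin32 is installed:

```python
# Sketch of a GDI micro-benchmark in the spirit of the TH article:
# time a burst of GDI line draws on the screen device context.
import time
import win32gui

def bench_gdi_lines(n=100_000):
    dc = win32gui.GetDC(0)  # device context for the whole screen
    try:
        start = time.perf_counter()
        for i in range(n):
            win32gui.MoveToEx(dc, 0, i % 500)
            win32gui.LineTo(dc, 800, (i * 7) % 500)
        elapsed = time.perf_counter() - start
        print(f"{n} lines in {elapsed:.3f}s -> {n / elapsed:,.0f} lines/s")
    finally:
        win32gui.ReleaseDC(0, dc)

bench_gdi_lines()
```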
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
No, I don't at all TBH. You're comparing software to hardware, and saying that is apples to apples. Darkswordsman's post explained it very well, no need to elaborate.

And so I'll direct you to Notty's linked article above.

Not everything is so cut and dry as you would have us believe.
 

Outrage

Senior member
Oct 9, 1999
217
1
0
And I point you to this, using the 10.4 drivers: http://www.tomshardware.com/reviews/ati-2d-performance-radeon-hd-5000,2635.html

Both ATI and Nvidia currently offer their cards with Windows 7 drivers that serve up enough 2D performance for using GDI/GDI+. ATI has almost completely fixed the issues pointed out by our testing with an average performance improvement by almost 100%. This is really a clear answer.

Despite significant improvements made by the ATI driver developers and Nvidia, the two graphics manufacturers are not exactly soiling themselves with glory. The performance level that Windows XP sported in 2D is still very much missing.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
And so I'll direct you to Notty's linked article above.

Not everything is so cut and dry as you would have us believe.
That article is almost a year old, and it has nothing to do with image quality. Plus those issues were fixed months ago.
 

Outrage

Senior member
Oct 9, 1999
217
1
0
So, has Nvidia discovered any games released this year, or games being used in benchmarks, that are affected by this AF "cheat" they are throwing a fit about? If they are so upset about this, they must have found it in games that would put their cards at a disadvantage in recent benchmarks?
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
Terrific, now have you anything to say about the actual thread title?

And so I'll direct you to Notty's linked article above.

You directed me to the article, which has nothing to do with this thread.


And to anyone that has concerns over Radeon image quality, if you find it unacceptable, go with Nvidia. That's what choice is all about. I personally have not had one single person complain about the image quality of either Nvidia or AMD hardware in all my years of building systems.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
So, has Nvidia discovered any games released this year, or games being used in benchmarks, that are affected by this AF "cheat" they are throwing a fit about? If they are so upset about this, they must have found it in games that would put their cards at a disadvantage in recent benchmarks?

As you might see, it isn't Nvidia doing the discovering or the reporting.
More titles are being tested now, I can assure you.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
So, has Nvidia discovered any games released this year, or games being used in benchmarks, that are affected by this AF "cheat" they are throwing a fit about? If they are so upset about this, they must have found it in games that would put their cards at a disadvantage in recent benchmarks?

Third-party web-sites investigated these issues, and somehow fingers are being pointed at nVidia for some odd reason. The amazing part is that posters were showing off the wonderful 5-series AF flower to claim anisotropy supremacy, and fell victim to disingenuous data based on areas of texturing that were actually broken. Posters from all over the world offered threads about this and got the same old downplaying and "I can't tell the difference in games" diatribe.

There are some posters and gamers who care about texture quality, and thankfully some third-party web-sites investigate texturing on these products and share their findings.

This reminds me so much of the 6 and 7 series of products from nVidia, where nVidia had some issues with filtering and aggressive optimizations, and the same views -- I can't tell the difference -- why get so upset -- I play games and don't have time to notice these things.
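For those who weren't around for that era: the 6/7-series filtering complaints mostly came down to so-called "brilinear" filtering, where the driver narrows the trilinear blend between mip levels to a thin band and point-samples elsewhere. A minimal sketch of the idea (the band width here is an assumed illustration, not a measured driver value):

```python
# Sketch of why "brilinear" filtering shimmers: full trilinear blends two
# mip levels smoothly across the whole LOD fraction, while brilinear only
# blends inside a narrow band around the transition.
def trilinear_weight(lod_frac):
    return lod_frac                      # smooth 0..1 blend between mips

def brilinear_weight(lod_frac, band=0.25):  # band width is an assumption
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_frac < lo:
        return 0.0                       # snap to the finer mip
    if lod_frac > hi:
        return 1.0                       # snap to the coarser mip
    return (lod_frac - lo) / band        # steep blend inside the band

for f in [0.0, 0.2, 0.4, 0.5, 0.6, 0.8, 1.0]:
    print(f"lod_frac {f:.1f}: trilinear {trilinear_weight(f):.2f}  "
          f"brilinear {brilinear_weight(f):.2f}")
# The blend concentrated in a small band is what shows up as visible mip
# transition lines in motion -- the shimmering posters complained about.
```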