Testing Nvidia vs. AMD Image Quality


notty22

Diamond Member
Jan 1, 2010
If people want something to read, here is an article (Sept 15) that explains one of the last issues Nvidia raises in the current blog article.
ATI ‘cheating’ benchmarks and degrading game quality, says NVIDIA
http://blogs.nvidia.com/ntersect/2010/11/testing-nvidia-vs-amd-image-quality.html
FP16 Render Observations
In addition to the above recent findings, for months AMD had been performing a background optimization for certain DX9 applications where FP16 render targets are demoted to R11G11B10 render targets, which are half the size and less accurate. When recently exposed publicly, AMD finally provided a user-visible control panel setting to enable/disable it, but the demotion is enabled by default. Reviewers and users testing DX9 applications such as Need for Speed Shift or Dawn of War 2 should uncheck the “Enable Surface Format Optimization” checkbox in the Catalyst AI settings area of the AMD control panel to turn off FP16 demotion when conducting comparative performance testing.
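For context, the "half the size" claim can be checked with some quick arithmetic. This is just an illustrative sketch: the bit counts come from the public format definitions, and the resolution is an arbitrary example.

```python
# Back-of-the-envelope comparison of the two render-target formats
# mentioned above. FP16 RGBA uses 16 bits per channel across 4 channels;
# R11G11B10 packs three small unsigned floats (no alpha, no sign bits)
# into a single 32-bit word.

def framebuffer_bytes(width, height, bits_per_pixel):
    """Size of one render target in bytes."""
    return width * height * bits_per_pixel // 8

fp16_bpp = 4 * 16          # 64 bits per pixel (RGBA, 16-bit float each)
r11g11b10_bpp = 11 + 11 + 10  # 32 bits per pixel

w, h = 1920, 1200  # example resolution
fp16_size = framebuffer_bytes(w, h, fp16_bpp)        # ~17.6 MB
demoted_size = framebuffer_bytes(w, h, r11g11b10_bpp)  # ~8.8 MB

print(fp16_size // demoted_size)  # -> 2: exactly half the storage/bandwidth

# Precision: each FP16 channel carries a 10-bit mantissa, while the 11-bit
# floats carry 6 mantissa bits and the 10-bit float only 5, so the demoted
# format is noticeably coarser, and it drops the alpha channel entirely.
```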
 

Stoneburner

Diamond Member
May 29, 2003
If people want something to read, here is an article (Sept 15) that explains one of the last issues Nvidia raises in the current blog article.
ATI ‘cheating’ benchmarks and degrading game quality, says NVIDIA

And here is their conclusion:
So does FP16 Demotion actually do anything at all?
From our testing it's clear that ATI's implementation of FP16 Demotion does affect image quality in games, as challenged by NVIDIA. However, the extent and prevalence of it is not universal - of four games, only one showed any real visible influence. On the other side of the coin are performance improvements, which are plentiful in two games out of four: boosting performance by 17 per cent at high resolutions, and a still-impressive 10 per cent at lower resolutions.

Ultimately the decision to run FP16 Demotion has always been up to the end-user, as the use of Catalyst A.I has been optional since its first inclusion in Catalyst drivers - now many years ago - and as it appears predominantly beneficial or benign in the majority of cases, we'd suggest you leave it enabled.

We also can't help but wonder why NVIDIA attempted to make such an issue of this rendering technique; one that they wholeheartedly support and simultaneously condemn. In the two titles that their hack functioned with we saw neither image quality nor performance loss, but rather the opposite: a performance boost. Why haven't NVIDIA included support for FP16 Demotion in their drivers until now for specific titles? Why choose now to kick up a fuss?

The answer, as always, will never really be known.
 

busydude

Diamond Member
Feb 5, 2010
This had me rolling on the floor.

Here in the snowy town of Bruma, near the East Gate, we see one very slight change - the torch to the far left of both pictures. With Catalyst A.I enabled the flame is quite average, and with it disabled, the flame is slightly higher. That wouldn't be so weird, until we saw...
(Screenshots: Catalyst A.I Enabled vs. Catalyst A.I Disabled)

That inside the town of Bravil we had the same torch flame height difference. Does this mean that the extra half-frame performance boost comes from a slight reduction in flame height? It would appear so!
 

Keysplayr

Elite Member
Jan 16, 2003
This is the Glenn Beck method of "not" saying something.

"Now people! I can't prove that Keysplayer has gout. But I can't disprove it either. I'm just asking questions here!"

I mean, you posted this thread claiming to hold a neutral "let's wait for more information" stance, but everybody else who has actually stated anything neutral has gotten shot down (by you) without any technical explanation.

In the end, this is a technical forum. We want answers, not spin. And definitely not fried rice. Not today.

Don't know what your problem is, but you definitely need to get a handle on it. And if you think you do, you don't.
Me having gout has nothing to do with this thread!! And how did you find out about my gout anyway!!!!

Just relax dude and "focus" less on me, and more on the subject. "focus", get it? :D
 

RussianSensation

Elite Member
Sep 5, 2003
Even if some people disagree that there is a material difference in filtering image quality between both camps, the fact is that a lot of websites have changed their testing methodologies. As independent reviewers, they have every right to independently test image quality and arrive at their own conclusions. We have seen FX5800/5900 series penalized for trying to diminish image quality and some of the same websites took notice (Xbitlabs for example). This example clearly highlights they will revise their testing if they feel the workload is not comparable, regardless of the brand.

The irony of this thread is that some people who dismiss image quality differences unless they are tested objectively find completely subjective HardOCP benchmarking to be the best method for testing videocards... Kyle Bennett at HardOCP comes up with his own "smooth gameplay" benchmarks, which he claims are the only right way to test videocards. The problem is that Kyle is the only one who determines what "smooth" is.

So why do we dismiss subjective opinions of some people about image quality and yet accept subjective opinions of others on what constitutes "smooth frames per second" as facts?
 

Stoneburner

Diamond Member
May 29, 2003
Don't know what your problem is, but you definitely need to get a handle on it. And if you think you do, you don't.
Me having gout has nothing to do with this thread!! And how did you find out about my gout anyway!!!!

Just relax dude and "focus" less on me, and more on the subject. "focus", get it? :D

We have a multi page thread with almost no concrete information (until recently)

You are solely responsible for this. Make this thread better. Answer my questions. Send me free stuff.
 

Keysplayr

Elite Member
Jan 16, 2003
We have a multi page thread with almost no concrete information (until recently)

You are solely responsible for this. Make this thread better. Answer my questions. Send me free stuff.

Well, then I'd say do yourself a favor and stay out of the thread if you feel it offers you nothing. How would you suggest "I" make this thread better? I only linked to the blog which linked to multiple German sites with this info. If I commented that all 4 German sites were posting false info, would that make it better for you? What makes you think I have the answers to your questions?
What free stuff would you like?

LOL, I am solely responsible for WHAT? Exactly?
 

Xarick

Golden Member
May 17, 2006
Why has this turned into an "attack Keysplayr" thread instead of a "let's find out for ourselves and get answers" thread?
 

busydude

Diamond Member
Feb 5, 2010
Why has this turned into an "attack Keysplayr" thread instead of a "let's find out for ourselves and get answers" thread?

Well... there aren't any answers to be found, discounting a couple of 5+ year old games, that is. We should be seeing a few more games than that if AMD were really trying to gain performance by degrading IQ.
 

Xarick

Golden Member
May 17, 2006
Here is my general theory right now, based on what I have seen from my 5850 vs. my old 8800GTS. Both shimmer. ATI doesn't seem to introduce any new shimmering, but it appears to me that where they both shimmer, ATI now shimmers worse. I realize in some older games this is not the case, but we still don't know about the LOD settings in those older games either.
The open question, I think, is whether the additional shimmer comes from AMD's new optimizations or from a driver bug they need to work on. Of course most people will not notice, because as I said it happens in the same spots. But this is just my theory. There are exceptions to everything.
 

thilanliyan

Lifer
Jun 21, 2005
Can't this all be put to rest by testing newer games to see if it is actually a "cheat"? How hard is that really? If I had the cards I would test it myself.
 

Skurge

Diamond Member
Aug 17, 2009
Can't this all be put to rest by testing newer games to see if it is actually a "cheat"? How hard is that really? If I had the cards I would test it myself.

That is what has been asked throughout this thread. Nothing yet.
 

Teizo

Golden Member
Oct 28, 2010
So why do we dismiss subjective opinions of some people about image quality and yet accept subjective opinions of others on what constitutes "smooth frames per second" as facts?
I think you already know the answer to that question. AMD vs. Nvidia is like rooting for your favorite team on Sunday. When the 'no call' that went against your team angered you to no end, and the same play is later committed by your team, you somehow find a way to justify why it wasn't called and was 'legal', even though the same thing occurred.

In regards to the issue at hand, if AMD had provided a way to disable the optimization they enabled by default and been up front about it, this would be a non-issue imho. Of course, now that they have, it is somewhat of a non-issue... though reviewers need to double-check that the driver settings are indeed the same with each driver release when benchmarks are run, to be sure accurate comparisons are being made, because trusting either AMD or Nvidia to do the right thing is a bit naive at this point imho.
 

dust

Golden Member
Oct 13, 2008
1,328
2
71
I think you already know the answer to that question. AMD vs. Nvidia is like rooting for your favorite team on Sunday. When the 'no call' that went against your team angered you to no end, and the same play is later committed by your team, you somehow find a way to justify why it wasn't called and was 'legal', even though the same thing occurred.

In regards to the issue at hand, if AMD had provided a way to disable the optimization they enabled by default and been up front about it, this would be a non-issue imho. Of course, now that they have, it is somewhat of a non-issue... though reviewers need to double-check that the driver settings are indeed the same with each driver release when benchmarks are run, to be sure accurate comparisons are being made, because trusting either AMD or Nvidia to do the right thing is a bit naive at this point imho.

Good post!
On the same note I find that the same fanboys tend to attack every single move of the competitor, while saying "It's just good business" when the same applies to their preferred company. Both companies are in the business to make money obviously, so yeah, it's childish to assume they have the customers at heart.
 

RussianSensation

Elite Member
Sep 5, 2003
That is what has been asked throughout this thread. Nothing yet.

Not everyone has HD6850/70 cards on hand, so asking people to mysteriously come up with benchmarks without cards in their possession is unrealistic, don't you think? In the previous Catalyst 10.9 vs. 10.10 thread, I asked for owners of HD6850/70 cards to test image quality with Q and HQ and make comments, but no one volunteered.

Most importantly, all of these websites found that the only way for HD6850/70 series to have superior filtering quality to HD5870 was to enable HQ in the first place; and even AnandTech made note of the changes in the driver control panel. It's difficult to claim that HD68xx series has superior image quality to the previous generation of AMD cards without testing it at HQ in reviews. Therefore, the "increased image quality over HD58xx series" as it is being touted by AMD as a feature is somewhat misleading since it's not enabled by default. Without even bringing NV into the equation, this alone is a valid reason to test HD68xx series at HQ only.
 

Skurge

Diamond Member
Aug 17, 2009
Not everyone has HD6850/70 cards on hand, so asking people to mysteriously come up with benchmarks without cards in their possession is unrealistic, don't you think? In the previous Catalyst 10.9 vs. 10.10 thread, I asked for owners of HD6850/70 cards to test image quality with Q and HQ and make comments, but no one volunteered.

Most importantly, all of these websites found that the only way for HD6850/70 series to have superior filtering quality to HD5870 was to enable HQ in the first place; and even AnandTech made note of the changes in the driver control panel. It's difficult to claim that HD68xx series has superior image quality to the previous generation of AMD cards without testing it at HQ in reviews. Therefore, the "increased image quality over HD58xx series" as it is being touted by AMD as a feature is somewhat misleading since it's not enabled by default. Without even bringing NV into the equation, this alone is a valid reason to test HD68xx series at HQ only.

I was asking members. I was wondering why none of these websites tested newer games for image quality degradation.

Edit: I'm going from a 5770 to a 6850 in a few days, I could do some testing.
 

GaiaHunter

Diamond Member
Jul 13, 2008
The irony of this thread is that some people who dismiss image quality differences unless they are tested objectively find completely subjective HardOCP benchmarking to be the best method for testing videocards... Kyle Bennett at HardOCP comes up with his own "smooth gameplay" benchmarks, which he claims are the only right way to test videocards. The problem is that Kyle is the only one who determines what "smooth" is.

The irony of this is you not understanding why people like [H] testing.

It has nothing to do with "smooth gameplay" and at what resolutions/settings a card will offer that gameplay. Which is interesting, because you like to dismiss benchmarks run at settings so demanding they are unplayable (clearly a subjective quality) and/or that make all the cards perform at similar levels.

People like it because it doesn't run canned benchmarks.

Canned benchmarks can suffer from "3DMark optimization" syndrome. That isn't bad at all if those optimizations apply to every part of the game, but it isn't very informative if one completely optimized scene gives you X performance while the rest of the game runs below X.
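The pitfall described above can be sketched with some made-up numbers (every scene name and FPS figure here is hypothetical, purely for illustration):

```python
# A driver optimized only for the scene the canned benchmark renders:
# the benchmark shows X, but the rest of the game runs below X.

scene_fps = {
    "benchmark_flyby": 80,   # the canned scene, heavily optimized
    "town":            42,
    "dungeon":         45,
    "open_field":      40,
}

canned_result = scene_fps["benchmark_flyby"]
whole_game = sum(scene_fps.values()) / len(scene_fps)

print(canned_result)   # what the review chart shows
print(whole_game)      # closer to what the player actually experiences
```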

Additionally, for those who feel the [H] crew's "feelings" aren't accurate or similar to their own, there is generally an apples-to-apples comparison.

The only downside of the [H] approach is the limited number of games tested; a second resolution in apples-to-apples would also be nice.
 

thilanliyan

Lifer
Jun 21, 2005
I actually find myself checking out the [H] reviews more now. Of course I always check the apples-to-apples comparison moreso than anything else.
 

tannat

Member
Jun 5, 2010
Even if some people disagree that there is a material difference in filtering image quality between both camps, the fact is that a lot of websites have changed their testing methodologies. As independent reviewers, they have every right to independently test image quality and arrive at their own conclusions. We have seen FX5800/5900 series penalized for trying to diminish image quality and some of the same websites took notice (Xbitlabs for example). This example clearly highlights they will revise their testing if they feel the workload is not comparable, regardless of the brand.

The irony of this thread is that some people who dismiss image quality differences unless they are tested objectively find completely subjective HardOCP benchmarking to be the best method for testing videocards... Kyle Bennett at HardOCP comes up with his own "smooth gameplay" benchmarks, which he claims are the only right way to test videocards. The problem is that Kyle is the only one who determines what "smooth" is.

So why do we dismiss subjective opinions of some people about image quality and yet accept subjective opinions of others on what constitutes "smooth frames per second" as facts?

You're mixing things. The differences in IQ are, according to several reports, minimal and usually not noticeable. And you can adjust it with a slider if you want. We are talking about 5% in some old games.

I don't think you can expect people to become very upset over this. Especially when Nvidia is the key source and Keysplayr is handing out the torches, indignant and with only the truth in mind...

It would be stupid to join a lynch mob so obviously manipulated by Nvidia. If the IQ effects were large and the intent was obvious, then maybe...

And I don't understand the comparison with [H]. I know that some people like to call their testing procedure subjective, but that's definitely not why it is so valuable...

They focus on relevant parameters (gameplay) instead of easily manipulated noise (canned benches). Even if they had to use a stopwatch instead of a LabVIEW setup, this is not really a question of subjective versus objective; it is a choice of relevant parameter.

Are you saying that the people who like to check videocard tests at [H] are the same people who do not let themselves be easily incited by obvious spin? That may be true; I never thought about it that way.
 

NoQuarter

Golden Member
Jan 1, 2001
You just need to think of [H]'s testing as showing which card provides better IQ not which card provides better FPS. They pick a target FPS and see which card can turn on more settings, rather than picking a target setting and see which can provide more FPS.

Either method suffers when a card is exceptionally slow or fast at a specific effect or combination of effects, so you can never get an all-encompassing measure of performance anyway; no review site has enough time to test a game with every combination of resolution, AA, AF, DoF, tessellation, HDR, etc.

Most sites only change AA settings between benchmarks, which means you only learn how much of a hit AA causes on a card. If a card crawls at 4xAA and max settings, but turning off just DoF suddenly makes it playable, that's not something you'll usually find out in a non-[H] review.

By testing with AA on and off it's designated as the least important effect. Personally I find DoF and HDR the least important and would rather see tests with 4x AA and optional DoF/HDR.
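The two methodologies contrasted above (fix the settings and report FPS, vs. fix a target FPS and report the heaviest playable settings) can be sketched like this; all game settings and FPS numbers below are made up for illustration:

```python
# Hypothetical measured FPS for one card, ordered lightest to heaviest.
settings_ladder = [
    ("Medium, no AA",      92),
    ("High, no AA",        71),
    ("High, 4xAA",         55),
    ("Max, 4xAA, HDR",     38),
    ("Max, 4xAA, HDR+DoF", 24),
]

def apples_to_apples(fixed_setting):
    """Traditional method: fix the settings, report the FPS."""
    return dict(settings_ladder)[fixed_setting]

def highest_playable(target_fps):
    """[H]-style method: fix a target FPS, report the heaviest
    settings combination that still reaches it."""
    best = None
    for name, fps in settings_ladder:
        if fps >= target_fps:
            best = name
    return best

print(apples_to_apples("High, 4xAA"))  # -> 55
print(highest_playable(35))            # -> Max, 4xAA, HDR
```

Comparing two cards with the second method answers "which card lets me turn on more eye candy at playable framerates" rather than "which card posts the bigger number at one fixed setting".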