Testing Nvidia vs. AMD Image Quality


Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Essentially here's the issue in a nutshell:
1. AMD tried to slightly simplify CCC and make it less confusing.
2. They made the default settings for optimizations the ones that the vast majority of regular end-users will be best served by using (higher performance, imperceptible image quality compromises).
3. A small minority of people jump on it as "omg it's a conspiracy" because they are for some reason incapable of going into CCC to change settings.

Hint: If people have to go onto the internet, take screenshots, and then zoom in on tiny portions of those screenshots to find where an optimization "might be affecting something, somewhere"... then the image quality impact is a non-issue.

EXTRA: How do you think AMD vs. NVIDIA cards should be tested by reviewers?
Considering AMD can gain up to a 10% performance advantage over NV by lowering IQ, how would you have reviewers test NV vs. AMD for a fair comparison?
 

Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
It is indeed an optimization. For higher framerates at the cost of image quality.

He counters your statement that this is not a driver bug (you claim a bug would affect all models, not just selected ones - you are wrong about that). Driver tweaks are also done for particular models, and a tweak for those 3 families could have introduced a bug in a few older games on those 3 families. You paint a picture of this being not a bug but a deliberate optimization made to increase speed at the cost of IQ. All we actually know is that the newest drivers introduce a speed increase and cause IQ problems in some old games. There is more than one possible conclusion from that:

1) This may very well be a driver bug
2) This is an optimization but the 3 older games weren't tested for any abnormalities hence no further tweaks were done
3) This is an optimization and AMD is well aware of the lowered IQ in some old games, but since there are ways to disable it and the old games run lightning-fast anyway, who cares? New games don't seem to be impacted IQ-wise - there is NO IQ difference for me between Cat 10.9 and Cat 10.11 in Fallout: New Vegas (runs on Gamebryo, same as Oblivion). And I run the game on a 40" HDTV 1.5m away from the screen.

The only thing I don't like about the whole situation is AMD not informing anyone about the change.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Come on - ATI releases a brand new range of cards, and that just *happens* to coincide with a driver bug that gives them a 5-6% performance boost at a cost to image quality. Yeah, right, pull the other one.

It's quite simple: they cheated in the benchmarks to make their cards look better. It is quite noticeable that on sites that force high IQ and disable optimisations (such as the well-trusted Xbitlabs), Nvidia cards do better. This isn't something anyone should defend, unless you want both companies to go into an image-quality-lowering performance war.

That said, the 6800 reviews are past - ATI got away with that one - but I expect review sites to be a lot more critical when the 6900s come out. Unfortunately, the only ones on the ball seem to be the German sites - there has been no comment from [H] at all, for example.
 

Outrage

Senior member
Oct 9, 1999
217
1
0
Since Nvidia has tested this, did they find any games other than Oblivion and Half-Life that had an IQ problem?
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Since Nvidia has tested this, did they find any games other than Oblivion and Half-Life that had an IQ problem?

Nope. Xbit Labs also mentioned that the 6800s have better filtering quality than the 400 series on high quality, so it's the Nvidia cards that have lower image quality in those tests. Yet I don't see that mentioned here.

Not to mention the fact that this is a repost. It clearly shows an agenda.
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
Since Nvidia has tested this, did they find any games other than Oblivion and Half-Life that had an IQ problem?

This is what I'd like to know. If there are IQ problems in other games (more importantly, games released in the last 5 years), then I really want to see them shown. I expected the NVIDIA blog post to have every bit of visual evidence. If these IQ problems are limited to Oblivion and Half-Life, what exactly is the problem here? I seem to recall BFG's testing on older games being discredited because he was testing "thousand year old games". Oblivion was released nearly five years ago. Any modern AMD card can run that game just fine without Catalyst AI.
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
The banding bug is a hardware deficiency, not a software thing, and it is present in modern games. The rest I am unsure of, as even Anand has stated he sees no difference in visual quality in modern games. Oh, and I do not agree with the shimmering thing at all. I ran through a bunch of my games, including significantly buggy ones like Gothic 3, and do not see any significant level of shimmering with Cat AI on or off on a 5850.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
How do you think AMD vs NVIDIA cards should be tested by reviewers? Question remains.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
The same way they always have been: on default settings, for both vendors.


That's just it... a few sites have started using the default Nvidia setting vs. the HQ setting for AMD cards, which might be unfair to AMD, or might be justified if these things are true.

So far it doesn't seem justified though, because it's only something we've seen in screenshots from a single old game like Oblivion and a few old benchmarks/tests.
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
How do you think AMD vs NVIDIA cards should be tested by reviewers? Question remains.

Either test both on default settings and watch IQ carefully (reviewers should be doing this anyway) or test both on HQ. Testing one on Q and the other on HQ seems skewed IMO, especially since this phenomenon seems to be limited to a couple of old games.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
Come on - ATI releases a brand new range of cards, and that just *happens* to coincide with a driver bug that gives them a 5-6% performance boost at a cost to image quality. Yeah, right, pull the other one.

Because Half-Life 2 and Oblivion are games that are really reviewed today, right?

And actually, your claim is false.

The 5-6% difference is the performance difference between running HQ and Q on the 6800 series in the games ComputerBase.de reviewed.

But it doesn't say anything about the games that actually showed the IQ problems, Oblivion and Half-Life, since those games aren't, and weren't, reviewed.

Answering Keys' question - disabling Cat AI is definitely not the way to do it.

The interesting part is that I don't think you can do that any more for the 6800 series.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
How do you think AMD vs NVIDIA cards should be tested by reviewers? Question remains.

How about they test AMD hardware twice, once with standard, once on High Quality, and let readers decide?

Or, if that takes too much time, test them the way NV insists they should be tested (which is on High Quality, I think?), but with a short comment from the reviewer about how users can gain a small performance boost with zero to minimal loss in image quality. That way everybody is happy?
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
How do you think AMD vs NVIDIA cards should be tested by reviewers? Question remains.

Both control panels set to:

  • high quality textures
  • all optimizations off
  • vertical sync forced off


  • CCC AI "standard"
note: 8xQ in Nvidia's CP is closest to 8xAA in CCC


Did i miss anything?
:confused:


Is anything else really a fair "apples-to-apples" comparison?
. . . . This is how i have benched for a long time. ():)
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Both control panels set to:

  • high quality textures
  • all optimizations off
  • vertical sync forced off


  • CCC AI "standard"
note: 8xQ in Nvidia's CP is closest to 8xAA in CCC


Did i miss anything?
:confused:


Is anything else really a fair "apples-to-apples" comparison?
. . . . This is how i have benched for a long time. ():)

I think this sounds fair, but does 8xQ = 8xAA? Stuff like that, plus http://www.anandtech.com/show/2918/5 , kind of confuses me as to what the comparable settings are, especially since some people think CSAA isn't as effective as MSAA, even though it might not be portrayed that way in reviews.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Both control panels set to:

  • high quality textures
  • all optimizations off
  • vertical sync forced off
  • CCC AI "standard"
Did i miss anything?
:confused:

Ya, you did. The testing methodology you provided above is exactly what the German websites and Xbitlabs use for AMD cards. However, other websites have optimizations ON and test at Quality, not High Quality, texture settings.

Your testing methodology makes the most sense imo (and HQ should also be used for NV, for that matter). I don't spend $200+ on a videocard to play on default texture settings when the HQ option is there from both camps. When I set my previous Radeon 4890 to "Performance", the visual quality drop was not worth the 5% performance boost I got.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,812
1,550
136
Thank God Nvidia is making a stink about this. I don't particularly like it when they harp on stuff that makes no noticeable difference, like FP16 demotion -- free performance is a good thing. However, AF is another matter altogether. *Immediately* after getting my HD 5870 I noticed the worse AF, and now the new Radeons are supposed to be even worse. Nvidia's whining about FP16 actually managed to get AMD to include a toggle in Catalyst; perhaps their valid complaints about AF will also get AMD to respond.
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
How do you think AMD vs NVIDIA cards should be tested by reviewers? Question remains.

I think they should both be tested on the highest quality settings possible in the drivers. Then screen shots should be taken to compare which cards offer the best IQ. If I'm not mistaken, at the highest quality settings the latest AMD (6xxx series) cards actually have the upper hand.

Honestly, I would take better IQ over a few frames per second any day. I wish Anand would delve deeper into this and write an article comparing IQ across a wide range of games between NV and AMD. Maybe they'll do that when the 69xx series is released next month :)
 

German_IQfreak

Junior Member
Nov 20, 2010
9
0
0
If somebody is interested in more video and picture comparisons, then please take a look at this thread:
http://www.forum-3dcenter.org/vbulletin/showthread.php?t=482069

The last pages also have some comparisons between Cypress, Barts, R520 and Nvidia cards, which show that Nvidia offers far better IQ. But the thing AMD should especially be ashamed of is that the R520 with AI off offers much better filtering than Cypress and Barts.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Ya, you did. The testing methodology you provided above is exactly what the German websites and Xbitlabs use for AMD cards. However, other websites have optimizations ON and test at Quality, not High Quality, texture settings.

Your testing methodology makes the most sense imo (and HQ should also be used for NV, for that matter). I don't spend $200+ on a videocard to play on default texture settings when the HQ option is there from both camps. When I set my previous Radeon 4890 to "Performance", the visual quality drop was not worth the 5% performance boost I got.
So .. what did i miss?
:confused:

i am not concerned about *other* websites except to know what their methodology is; the ones i trust, and my own benching, use High Quality texture settings in both vendors' control panels


The issue was that Cat 10.10 evidently brought an unannounced change in default settings... but then i never used default settings before, so it didn't matter to me. i DO like the option to disable some optimizations in CCC that i couldn't adjust before.

Frankly, i'd like to see more checkbox IQ settings, not less
^_^
 

Rezist

Senior member
Jun 20, 2009
726
0
71
Well you can't turn off driver optimizations on nVidia cards.

Fair settings would be for both to be set to AF = HQ. But in those situations AMD cards never took a big hit; the bigger hit comes from turning off driver optimizations.