AMD 58/68xx series - Reduced image quality starting with Catalyst 10.10 at Default?

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
ComputerBase investigates Catalyst 10.9 vs. 10.10 image quality and performance impact.

Original article with pictures

Translated

Morphological AA (MLAA):

Call of Duty 2 - blurry textures
Mass Effect 2 - entire picture content is blurred
Crysis Warhead - The blur effect is very pronounced, and MLAA has difficulties with polygon edges

Conclusion: Surprisingly, in each case the performance drop was larger with MLAA than with 4xMSAA; the article provides performance charts to show this. (Note: the latest 10.10d driver may fix MLAA performance, so I am not sure their findings are conclusive here.)
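
For readers unfamiliar with how post-process AA works, here is a minimal, purely illustrative Python/NumPy sketch (not AMD's actual MLAA implementation, which detects L- and Z-shaped edge patterns and computes coverage-based blend weights). It shows why this class of filter can soften textures: it runs on the finished frame, so it cannot tell a polygon edge from sharp texture or HUD detail and will smooth both.

import numpy as np

def toy_post_process_aa(rgb, edge_threshold=0.1):
    """rgb: float array of shape (H, W, 3) in [0, 1]. Hypothetical toy filter."""
    luma = rgb @ np.array([0.299, 0.587, 0.114])          # perceptual luma
    # Flag discontinuities against the left/top neighbours.
    dx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))
    dy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))
    edges = (dx > edge_threshold) | (dy > edge_threshold)
    # Blend flagged pixels with a 3x3 neighbourhood average.
    padded = np.pad(rgb, ((1, 1), (1, 1), (0, 0)), mode="edge")
    h, w = luma.shape
    blurred = sum(padded[1 + di:h + 1 + di, 1 + dj:w + 1 + dj]
                  for di in (-1, 0, 1) for dj in (-1, 0, 1)) / 9.0
    out = rgb.copy()
    out[edges] = blurred[edges]    # polygon edges AND sharp texture detail get blended alike
    return out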

Anisotropic Filtering (AF):

- Starting with Catalyst 10.10, Radeon HD 6800 cards filter worse at the standard driver settings than the previous ATI generation! The default image quality of Catalyst 10.9 is only reached in Catalyst 10.10 when Catalyst AI is turned OFF and "High Quality" filtering is set on a Radeon HD 6800 card. This also applies to the HD 58xx series: they too run at reduced image quality at the default control panel settings compared to Catalyst 10.9.

- Only the banding problem has been fixed on the HD 6800 series. The annoying texture flicker has not been fixed.

- AMD's "High Quality" texture filtering image quality is only comparable to "Standard" setting in NV control panel. AMD's High Quality texture filtering is still superior to AMD's High Quality settings.

Conclusion: AMD's Catalyst 10.10 standard image quality settings are not comparable to Catalyst 10.9's standard image quality. The only way to reproduce similar image quality is to disable Catalyst AI (Off) and set the AMD driver to "High Quality" when testing. AMD's "High Quality" image quality in the control panel is only equivalent to "Standard" for NV. The texture flickering in TrackMania has not been fixed in the HD 68xx series.
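
For context on what the "Quality"/"Standard" texture optimizations typically change, the classic trick is "brilinear" filtering: narrowing the LOD band over which two adjacent mip levels are blended so that most lookups fall back to cheaper bilinear sampling. This is an assumption about the general technique, not a confirmed description of AMD's driver internals; a small Python sketch of the idea:

def mip_blend_weight(lod_fraction, blend_window=1.0):
    """lod_fraction: fractional part of the texture LOD (0..1).
    blend_window=1.0 -> full trilinear; smaller values -> 'brilinear'."""
    lo = 0.5 - blend_window / 2.0
    hi = 0.5 + blend_window / 2.0
    if lod_fraction <= lo:
        return 0.0                                # sample only the finer mip
    if lod_fraction >= hi:
        return 1.0                                # sample only the coarser mip
    return (lod_fraction - lo) / (hi - lo)        # blend inside the narrowed band

# Full trilinear blends at every fractional LOD; with a window of 0.3 only ~30%
# of lookups blend two mip levels, saving bandwidth but leaving more visible mip
# transitions (the kind of banding/shimmering complaints discussed above).
for f in (0.1, 0.4, 0.5, 0.6, 0.9):
    print(f, mip_blend_weight(f, 1.0), mip_blend_weight(f, 0.3))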

Performance with "High Quality" in Cats 10.10 investigated vs. "Standard/Default" settings in Cats 10.10:

Anno 1404
HD6870 = 8-9% slower with AA/AF enabled at High Quality (depending on 4xAA vs. 8xAA)
HD6850 = 9-10% slower

BF:BC2
HD6870 = 7-9% slower with AA/AF enabled
HD6850 = 8% slower

BattleForge
HD6870 = 6-7% slower with AA/AF enabled
HD6850 = 5-7% slower

CoD: MW2
HD6870 = 4% slower with AA/AF
HD6850 = 3-4% slower

Crysis Warhead
HD6870 = 7-10% slower with AA/AF
HD6850 = 7-8% slower

Dirt 2
HD6870 = 6-8% slower with AA/AF
HD6850 = 6-9% slower

Mass Effect 2
HD6870 = 1-2% slower with AA/AF
HD6850 = 2-3% slower

Metro 2033
HD6870 = 7% slower with AA/AF
HD6850 = 6% slower

Risen
HD6870 = 4% slower with AA/AF
HD6850 = 6% slower

Splinter Cell Conviction
HD6870 = 2-3% slower with AA/AF
HD6850 = 2-3% slower

S.T.A.L.K.E.R.: CoP
HD6870 = 6% slower with AA/AF
HD6850 = 4% slower

Overall Performance Drop when running HD6800 series with comparable image quality to Catalyst 10.9:

4AA/16AF
HD6850 & 6870 = 6%

8AA/16AF
HD6850 & 6870 = 5%
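
As a rough sanity check of that overall figure, here is my own back-of-the-envelope average of the HD 6870 midpoints from the per-game list above (not ComputerBase's weighted index, just a ballpark):

# Midpoints of the HD 6870 slowdown ranges (Standard/Default vs. High Quality, Cats 10.10).
hd6870_ranges = {
    "Anno 1404": (8, 9), "BF:BC2": (7, 9), "BattleForge": (6, 7),
    "CoD: MW2": (4, 4), "Crysis Warhead": (7, 10), "Dirt 2": (6, 8),
    "Mass Effect 2": (1, 2), "Metro 2033": (7, 7), "Risen": (4, 4),
    "Splinter Cell Conviction": (2, 3), "S.T.A.L.K.E.R.: CoP": (6, 6),
}
midpoints = [(lo + hi) / 2 for lo, hi in hd6870_ranges.values()]
print(round(sum(midpoints) / len(midpoints), 1))   # -> 5.8, in line with the 5-6% reported above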

Conclusions:

Starting with Catalyst 10.10+, ComputerBase will now test all Radeon HD 58xx/68xx cards by manually switching texture filtering quality from Standard to High Quality, with Catalyst AI turned "Off". Otherwise the results are not comparable, because the HD 5xxx and HD 6xxx series run at reduced image quality compared to Catalyst 10.9 and earlier.


NOTE: Can someone with an HD 6850/6870 volunteer to make a 15-20 second video of a couple of modern DX10/11 games to see if there are any texture filtering (AF) differences / mipmap transitions between "Quality/Default" and "High Quality" in Cats 10.10?
 
Last edited:

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I thought this was a known issue with the 5000 series as well? And that when looking at reviews, the only way to get an apples-to-apples comparison is when ATI was forced to the highest filtering setting while leaving Nvidia at their standard filtering.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I thought this was a known issue with the 5000 series as well? And that when looking at reviews, the only way to get an apples-to-apples comparison is when ATI was forced to the highest filtering setting while leaving Nvidia at their standard filtering.

Yes, you are correct. This is also true for the HD 58xx series: they too run at reduced image quality at the default Quality/Standard control panel settings compared to Catalyst 10.9. Since ComputerBase wasn't aware of this prior to their investigation, their HD 68xx launch review benchmark results were inflated by 5-6% on average. How many other websites are going to strictly use High Quality in their reviews? :oops:

With modern videocards becoming faster and faster, perhaps gamers will want to see increased image quality at a slight loss in performance rather than reduced image quality for a 5-6% performance gain. Not sure I like this trend of faster performance at any cost.
 
Last edited:

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
It's looking like ATI intentionally cheated to get their 6850/6870s 5-6% higher scores in the reviews :(
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
True. For my part, I'm not entirely worried about the level of quality in the reviews provided both vendors are at the same level - meaning ATI at highest while Nvidia is at standard.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
1) Question: if the image quality settings match the Nvidia ones, what is the problem? Unless the AMD cards are using less AA or something than the Nvidia ones?

Yes, they lowered the "default" setting... but wasn't that done so they would be using the same AA etc. as the Nvidia cards?
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
1) Question: if the image quality settings match the Nvidia ones, what is the problem? Unless the AMD cards are using less AA or something than the Nvidia ones?

Yes, they lowered the "default" setting... but wasn't that done so they would be using the same AA etc. as the Nvidia cards?

Is this something that you believe happened? Or are you just throwing it out there as a possible reason or for some reasonable doubt?
Because in a nutshell, by default, Nvidia's "Standard" quality setting is the same as AMD's "High Quality" setting. Obviously when you set the AMD quality to "Standard", you'd be setting it to a lower IQ than Nvidia's default.
Now run benches with those settings and let me know if you think that's fair. Whether the human eye can perceive it or not isn't the issue. Don't pretend that could make any difference. We are talking numbers here.
You know what the problem is.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
1) Question: if the image quality settings match the Nvidia ones, what is the problem?

The problem is starting with Cats 10.10, AMD cards don't match NV's image quality w/ default settings. So if you buy a brand new NV or AMD card and install it and change nothing in the control panel, the AMD card is filtering at a reduced image quality. To set the 2 cards to produce similar image quality, you now have to set AMD cards to High Quality.

Why did the reviewers have to go out of their way to do their own investigation to come to this conclusion? Why didn't AMD release a statement saying "For gamers who want even higher performance, we have revised our image quality at default starting with Cats 10.10. However, if you feel you want the same image quality as before, please Disable Catalyst AI and set texture filtering to High Quality." I am not going to say it's "cheating", because you can still get great image quality by moving the slider back to High Quality.
 
Last edited:

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Eh, it's not too big of a deal. Reviewers have had to change driver settings for years to match image quality settings between different cards. It's their job to find settings that make the tests as accurate as possible.

For half a decade at least, Nvidia's control panel defaulted to a "quality" setting when first installed instead of the "high quality" setting that matched ATI's then-default IQ setting. The performance difference for Nvidia was larger than 5-6% too.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
The problem is starting with Cats 10.10, AMD cards don't match NV's image quality w/ default settings. So if you buy a brand new NV or AMD card and install it and change nothing in the control panel, the AMD card is filtering at a reduced image quality. To set the 2 cards to produce similar image quality, you now have to set AMD cards to High Quality.

Why did the reviewers have to go out of their way to do their own investigation to come to this conclusion? Why didn't AMD release a statement saying "For gamers who want even higher performance, we have revised our image quality at default starting with Cats 10.10. However, if you feel you want the same image quality as before, please Disable Catalyst AI and set texture filtering to High Quality." I am not going to say it's "cheating", but perhaps a disclaimer would have helped.

Reviewers knew about this and they simply decided to ignore it since most of them couldn't see any difference.

This was already talked about when the 6800 series launched. It is exactly the same situation.

What AMD apparently did was shift some optimizations that previously only happened with Cat AI on Advanced down to the Standard setting.

Now, when reviewers turn Cat AI off, ALL THE OPTIMIZATIONS IN THE DRIVERS GO OFF!

Are the reviewers turning all the NVIDIA optimizations off as well? Actually, they don't even have the option to.

http://www.guru3d.com/article/radeon-hd-6850-6870-review/

The new AMD Catalyst A.I. options can be found by opening the 3D settings page, selecting the “All” tab and scrolling down. The Texture Filtering Quality slider has three settings – High Quality, Quality (default), and Performance. The High Quality setting disables all texture optimizations. The Quality setting enables a trilinear optimization as well as an anisotropic sample optimization, which are designed to have no visible impact on image quality while offering improved performance.

The Performance setting enables more aggressive versions of these optimizations that offer further performance improvement, but may have a small impact on image quality. The AMD Radeon HD 6800 series continues to support fully angle invariant anisotropic filtering, and incorporates further improvements in LOD precision relative to the ATI Radeon HD 5000 Series.

These image quality benefits come with no additional performance cost and remain enabled at all Texture Filtering Quality settings.

Here at Guru3D when comparing performance and image quality against NVIDIA products, we use the same Texture Filtering Quality setting in the NVIDIA Control Panel to ensure the most direct and fair comparison. However, as of this writing NVIDIA products do not offer a comparable anisotropic filtering option with full angle independence.

The Surface Format Optimization checkbox allows improved performance in selected games that use 16-bit floating point surfaces for HDR rendering. It is designed to have no discernable effect on image quality, and therefore we recommend it be left enabled as your default.

But I guess we can wait on BFG10K to do some IQ analysis or at least to chime in.

This, on the other hand, would explain the differences in that Xbitlabs review if they turned off all the optimizations for the AMD cards.
 
Last edited:

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Could this simply be a bug with the 10.10 drivers? Don't automatically assume malice when ignorance offers a valid explanation.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
For half a decade at least, Nvidia's control panel defaulted to a "quality" setting when first installed instead of the "high quality" setting that matched ATI's then-default IQ setting. The performance difference for Nvidia was larger than 5-6% too.

You are correct. However, NV reducing image quality at default was brought up back then, so why shouldn't it be brought up when AMD does it? Whenever NV or AMD reduce image quality, gamers should know about it. It's up to them to choose whether they want more performance or higher image quality. Reviews, however, should be done apples to apples.

Xbitlabs FX5900 Review:

"It is evident that further degradation of tri-linear filtering in 43.80 driver is a way to increase GeForce FX chips performance when both: anisotropic filtering and tri-linear filtering are used. So there is only one option left: to disable tri-linear filtering [optimizations]."

"On the first two screenshots taken in OpenGL you can see texture compression artifacts: in Performance mode with OpenGL the driver forces texture compression."
 
Last edited:

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Who wins when both are set to highest? I'm talking just the Image Quality setting. I've been using highest for the past two or three generations on everything I buy, AMD or nVidia.

Also, any good screenshots of the difference to see what people are getting worked up over? Is it noteworthy? (Edit: ok, I see the article has pics)
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Are the reviewers turning all the NVIDIA optimizations off as well? Actually, they don't even have the option to.

http://www.guru3d.com/article/radeon-hd-6850-6870-review/

Optimizations are perfectly acceptable if they don't produce a noticeable reduction in image quality. Xbitlabs and ComputerBase both note that AMD's Quality is no longer comparable to NV's Standard. This was not the case with Cats <=10.9, which is why both tested with HQ for AMD. If Guru3D didn't notice the image quality differences, that doesn't mean they aren't there when other reviewers have noted them.
 

CitanUzuki

Senior member
Jan 8, 2009
464
0
0
Any reviewer worth their salt should make a point of ensuring that the playing field is level.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Optimizations are perfectly acceptable if they don't produce a noticeable reduction in image quality. Xbitlabs and ComputerBase both note that AMD's Quality is no longer comparable to NV's Standard. This was not the case with Cats <=10.9, which is why both tested with HQ for AMD. If Guru3D didn't notice the image quality differences, that doesn't mean they aren't there when other reviewers have noted them.

And why did the other reviewers not say anything about that?

It isn't as if you can't change the settings.

They are there in the open - there's a tab for them.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
"AMD's "High Quality" texture filtering image quality is only comparable to "Standard" setting in NV control panel"

Is there any proof of this? Because yeah, if that's true, then reviewers might have missed it. Why is it only "comparable"? Aren't they using the same techniques? "Only comparable" sounds subjective (the opinion of one man). Ideally you would want to have both cards run the same settings.


Anyway, you can see they tested the AMD cards in both High Quality and Quality. Example:
Battlefield BC2 - 1920x1200

6870 Quality: 64.3 fps
6870 High Quality: 64.2 fps
460 Standard: 47.1 fps

Most examples are like the above if you look throughout the benchmarks on that German site.
In most cases it's a 0.1-0.4 fps difference. Oh well... not the end of the world.
 
Last edited:

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
The problem is starting with Cats 10.10, AMD cards don't match NV's image quality w/ default settings. So if you buy a brand new NV or AMD card and install it and change nothing in the control panel, the AMD card is filtering at a reduced image quality. To set the 2 cards to produce similar image quality, you now have to set AMD cards to High Quality.

Why did the reviewers have to go out of their way to do their own investigation to come to this conclusion? Why didn't AMD release a statement saying "For gamers who want even higher performance, we have revised our image quality at default starting with Cats 10.10. However, if you feel you want the same image quality as before, please Disable Catalyst AI and set texture filtering to High Quality." I am not going to say it's "cheating", because you can still get great image quality by moving the slider back to High Quality.

Waiting for the AMD fanboy spin response in 3......2......1......

"Bbbbbbut AMD cares about ME as a person!!!!! They do!"
 

Absolution75

Senior member
Dec 3, 2007
983
3
81
Could account for xbitlabs' seemingly outlier data where the HD6850 loses to the Nvidia 470 768MB - xbit did it right like they normally do, and a lot of sites missed it.

Check the other thread on xbitlabs' review.
xbitlabs testing methods said:
ATI Catalyst:

• Anti-Aliasing: Use application settings/Standard Filter
• Morphological filtering: Off
• Texture Filtering Quality: High Quality
• Surface Format Optimization: Off
• Wait for vertical refresh: Always Off
• Anti-Aliasing Mode: Quality
• Other settings: default
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
"AMD's "High Quality" texture filtering image quality is only comparable to "Standard" setting in NV control panel"

Yes for Cats 10.9. After Cats 10.10, the image quality is worse.

Is there any proof of this? Because yeah, if that's true, then reviewers might have missed it.

Straight from the review:

Cats 10.9 High Quality:

cats109hq.png


Cats 10.10 High Quality:

cats1010hq.png


Nvidia "Standard/Default" Quality:

nvstandard.png


How many websites did you see test Oblivion or Half Life 2??? Maybe that's why they didn't notice. ;)

Cats 10.9 High Quality:

trackmaniahq109.jpg


Cats 10.10 High Quality:

trackmaniahq1010.jpg


NV "Standard/Default" Quality:

nvstandardtrack.jpg
 
Last edited:

Absolution75

Senior member
Dec 3, 2007
983
3
81
Yes for Cats 10.9. After Cats 10.10, the image quality is worse.



Straight from the review:
How many websites did you see test Oblivion or Half Life 2??? Maybe that's why they didn't notice. ;)

The 10.9 and the Nvidia screenshots clearly show forms of AA, while the 10.10 one shows clear signs of aliasing - not necessarily texture filtering issues.

AA was clearly not used on 10.10.

Edit after RS's edit: this post only applies to Oblivion.
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
RussianSensation, thanks for the pics, that makes it easier to compare.

Looks like:
10.9 HQ > NV standard > 10.10 HQ

Yeah, the NV standard has higher quality than 10.10 but less than 10.9.

Could this be a bug with 10.10 plus the Oblivion game, though?
Or is it like this with ALL games?

If it's JUST one game, then this is a non-issue, right?


Edit: looks like it happens in Track Mania too.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Yes for Cats 10.9. After Cats 10.10, the image quality is worse.

How many websites did you see test Oblivion or Half Life 2??? Maybe that's why they didn't notice. ;)

Basically they couldn't produce an example in newer games.

Curiously, computerbase.de and xbitlabs didn't test HL2 or Oblivion either.

And of course they didn't show the performance difference in the games for which they actually showed the pictures.
 
Last edited: