AMD 58/68xx series - Reduced image quality starting with Catalyst 10.10 at Default?


GaiaHunter

Diamond Member
Jul 13, 2008
With Cats 10.9, AMD's Quality Mode = NV's Quality mode, but with the HD6800 and Cats 10.10, AMD's Quality mode is below both. That's what these reviewers are basically saying. So they have decided to up the quality to compensate.

Actually, reviewers generally used AMD High Quality, which used to be the default setting (or is it the default for the 5800 series Quality?), and upped the NVIDIA setting to High Quality.

Look here at Xbitlabs' review of the 5870:

http://www.xbitlabs.com/articles/video/display/radeon-hd5870_8.html#sect0

ATI Catalyst:

* Smoothvision HD: Anti-Aliasing: Use application settings/Box Filter
* Catalyst A.I.: Standard
* Mipmap Detail Level: High Quality
* Wait for vertical refresh: Always Off
* Enable Adaptive Anti-Aliasing: On/Quality
* Other settings: default

Nvidia GeForce:

* Texture filtering – Quality: High quality
* Texture filtering – Trilinear optimization: Off
* Texture filtering – Anisotropic sample optimization: Off
* Threaded optimization: Auto
* Vertical sync: Force off
* Antialiasing - Gamma correction: On
* Antialiasing - Transparency: Multisampling
* Multi-GPU performance mode: NVIDIA recommended
* Multi-display mixed-GPU acceleration: Multiple display performance mode
* Set PhysX GPU acceleration: Enabled
* Ambient Occlusion: Off
* Other settings: default

With the 6800 series AMD changed their driver: the Mipmap setting is now fused into the Cat AI tab.

Additionally, with this change there is the "Surface Format Optimization" option, which before was part of the Cat AI Standard optimizations and can now be turned off. For example, Xbitlabs turned it off in their 6800 series review. The 5800 series would have those optimizations applied by default, although I'm uncertain whether they involve more than the FP16->FP11 optimization in DX9.
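
As an aside, here is a rough sketch of what that FP16->FP11 demotion means for the actual numbers. This is just my own illustration in Python, not anything from AMD's driver, and real hardware may round rather than truncate, but it shows the idea: an R11G11B10 render target keeps the half-float's 5-bit exponent while the mantissa shrinks from 10 bits to 6 and the sign bit disappears.

Code:
import numpy as np

def demote_fp16_to_fp11(value):
    """Emulate storing an FP16 colour channel in an 11-bit float slot
    (5-bit exponent, 6-bit mantissa, no sign) by truncating the mantissa."""
    bits = np.array([max(value, 0.0)], dtype=np.float16).view(np.uint16)
    bits &= np.uint16(0xFFF0)  # keep the exponent and top 6 mantissa bits, drop the low 4
    return float(bits.view(np.float16)[0])

for v in [0.51, 1.337, 100.06]:
    print(v, "-> fp16:", float(np.float16(v)), "-> fp11-ish:", demote_fp16_to_fp11(v))

The colour steps get roughly 16 times coarser, which is why the difference, when it is visible at all, tends to show up as banding in smooth HDR gradients.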

What I've been unable to understand is whether you can still turn Cat AI off.

When you read the Computerbase.de article, they talk about turning Cat AI off. If they did indeed turn Cat AI off, I would expect part of the performance delta to be caused by that.

For example, on the Beyond3D forums Dave Baumann stated that what AMD did was match the default quality of the NVIDIA driver setting, which is Quality and not High Quality.
 

RussianSensation

Elite Member
Sep 5, 2003
Actually, reviewers generally used AMD High Quality, which used to be the default setting, and upped the NVIDIA setting to High Quality.

Ya, I mentioned in this thread that Xbitlabs used High Quality from the very beginning (so they are unaffected). However, the others apparently were using default texture filtering only.

The 5800 series would have those optimizations applied by default, although I'm uncertain if those optimization involve more than the FP16->FP11 optimization in DX9.

Check out the response AMD provided to PCGamesHardware in post #58. It appears the HD58xx had no such optimizations when run in Catalyst AI: Standard mode. This is why these guys keep saying it was unfair to test the HD68xx in Cat AI: Standard (Quality) mode, since with the new drivers Cat AI Standard was actually equivalent to Cat AI: Advanced for the HD5870.
 

GaiaHunter

Diamond Member
Jul 13, 2008
Ya, I mentioned in this thread that Xbitlabs used High Quality from the very beginning (so they are unaffected). However, the others apparently were using default texture filtering only.

But just to be clear, the 5800 series default is High Quality.

So before this situation, everyone using the Cat drivers' defaults would have been using High Quality.

Because there are two groups. One group is saying the default quality is worse now than it was before, because High Quality used to be the default and now the default is Quality; not because the Quality setting of the 6800 series is worse than the Quality setting of the 5800 series, or because the HQ setting of the 6800 is inferior to the Quality or High Quality settings of the 5800 series.

Xbitlabs didn't say that the "HQ setting of the 6800 series is worse than the HQ of the 5800 series". They actually said there are improvements.

Then there is a second group, including some of these German sites, that says the 6800 HQ setting is worse than both the HQ and Q settings of the 5800 series, and they base this claim on the shimmering in HL2 and TrackMania plus some screenshots of Oblivion/The Witcher where I don't see anything.


We even have some websites that think that the new Q setting is acceptable compared to the HQ setting of the 5800 series.

You keep including Xbitlabs in the group of German sites, but Xbitlabs doesn't say anything of the sort.

To recap (and I'm sorry if you already know this, but it seems to me we aren't exactly talking about the same thing here), we have:

1) A group of reviewers that think the new default, Quality, of the 6800 series is comparable to the 5800 series default HQ setting;

2) A group of reviewers that always turn the settings to max quality, where Xbitlabs is included, and say that the new series quality is equal or even superior to the 5800 series;

3) A group of reviewers that say the new HQ and Q settings of the 6800 series are worse than even the Q setting of the 5800 series.

You keep including Xbitlabs in the 3rd group, but clearly they are part of group 2:

http://www.xbitlabs.com/articles/video/display/radeon-hd6870-hd6850_5.html#sect1

The improved anisotropic filtering algorithm must be noted, too.

Anisotropic filtering doesn't have a serious effect on the performance of modern GPUs, so algorithms can be used whose filtering quality doesn't depend on the angle of inclination of the plane. AMD and Nvidia have both transitioned to high-quality anisotropic filtering already and the Radeon HD 6800 just improves the algorithm further to smooth out transitions between mipmaps so that they would be less conspicuous in textures with lots of small details.

As opposed to MLAA, the benefits of the new AF algorithm are easy to see. They won't be so clear in the process of actual gaming, yet a sharp-eyed gamer will surely spot the difference, especially as modern games offer a lot of scenes suitable for that.

Thus, the Radeon HD 6800 doesn't look like a completely new product. It is rather a steady and evolutionary development of the successful Radeon HD 5800 architecture.

Sorry, but this doesn't read as "the new HQ setting sucks balls compared to the 5800 series Q and HQ settings" like the German sites are saying.

Check out the response AMD provided to PCGamesHardware in post #58. It appears the HD58xx had no such optimizations when run in Catalyst AI: Standard mode. This is why these guys keep saying it was unfair to test the HD68xx in Cat AI: Standard (Quality) mode, since with the new drivers Cat AI Standard was actually equivalent to Cat AI: Advanced for the HD5870.

But those optimizations can be turned off now even if you keep Quality instead of High Quality.

Can someone with a 6870/6850 please provide a screenshot of the complete Cat AI tab?
 

RussianSensation

Elite Member
Sep 5, 2003
I think the German sites are what's confusing us. I quote Computerbase.de:

Google translate:

"With the Radeon HD 6000 series is the standard quality of texture filtering has been reduced over the previous generation, which is completely incomprehensible. Although the Barts-GPU enabled by the Enable High-Quality setting still the same level as on a Radeon HD 5000 or in some ways a better AF (see banding), standard grade, but has become worse."

But if you look at their screenshots, they show worse quality, which is contradictory to their own conclusion.

I think we beat this topic to the end, unless some HD68xx users are going to provide videos.

The bottom line is that all these sites are using HQ. So their results are not comparable with other websites that only use defaults. That's why I included Xbitlabs in this too, because some people complained that their numbers looked too low for the HD68xx series.
 

GaiaHunter

Diamond Member
Jul 13, 2008
I think the German sites are what's confusing us. I quote Computerbase.de:

Google translate:

"With the Radeon HD 6000 series is the standard quality of texture filtering has been reduced over the previous generation, which is completely incomprehensible. Although the Barts-GPU enabled by the Enable High-Quality setting still the same level as on a Radeon HD 5000 or in some ways a better AF (see banding), standard grade, but has become worse."

But if you look at their screenshots, they show worse quality, which is contradictory to their own conclusion.

I think we beat this topic to the end, unless some HD68xx users are going to provide videos.

The bottom line is that all these sites are using HQ. So their results are not comparable with other websites that only use defaults. That's why I included Xbitlabs in this too, because some people complained that their numbers looked too low for the HD68xx series.

From your link:

We can give a small all-clear in the sense that the "High Quality" switch returns the quality of the anisotropic filter to the level of Catalyst AI Standard. (At least mostly, because in Half-Life 2 the sand textures seem to be so "AF-murdering" that even High Quality cannot reach the quality of AI Disabled. Why this is so is not currently apparent to us, especially since that first-person shooter is the only game in which we have noticed the behavior.) A better picture is not possible.

The TrackMania problem already happened with the 5800 series, and the Oblivion screenshots don't prove anything to me. Maybe you can see the difference between them and say one is better looking, but I can't, and neither can others.

The HL2 problem doesn't go away with the HQ setting. And the HL2 texture shimmering is the thing everyone can see, and it is what the videos and everything focus on. That is what makes me think it is some other problem.

In the end it seems to come down to one game (HL2) whose problem apparently isn't even related to the HQ or Q settings, TrackMania, which had similar problems with the 5800 series, and some screenshots where there are only minute differences.
 

GaiaHunter

Diamond Member
Jul 13, 2008
Hey RS, take a look at this.

Is this what you want to see?


I'm waiting for WaTaGuMp to confirm that the settings are Performance, Quality and High Quality, but if they are, you can't turn Cat AI off anymore.

Before, you had Off, Standard and Advanced. Is HQ the same as Off? Or is HQ just referring to the Mipmap setting, which disappeared as an independent tab? And were the Mipmap settings before High Performance, Performance, Quality and High Quality, while now they are just Performance, Quality and High Quality? What kind of optimizations are grouped under Surface Format Optimization? Is turning Surface Format Optimization off equivalent to turning Cat AI off, and does that explain some of the performance difference?

Loads of questions.
 

WelshBloke

Lifer
Jan 12, 2005
Texture filtering deficiency is almost impossible to see in screenshots because it's something that is much more clear in motion. You can't take a screenshot of texture flickering either. Therefore, besides TrackMania, it is often too difficult to discern in screenshots. But if you watch videos, it's obvious right away. BTW, the reviewers are testing NV with "Texture filtering – Quality: High quality", not Quality as you have at default in your system. So you are actually running your card in Performance image quality mode.

Just download the first video here. As the character is moving in HL2, you can see distinct areas of texture/mipmap transitions in front of him. This is not how high quality filtering should work. With Cats 10.9, AMD's Quality Mode = NV's Quality mode, but with the HD6800 and Cats 10.10, AMD's Quality mode is below both. That's what these reviewers are basically saying. So they have decided to up the quality to compensate.

Those are the settings that the NV control panel defaults to if I reset it or do a clean install of new drivers. That's what you're complaining about AMD doing, yes?
 

RussianSensation

Elite Member
Sep 5, 2003
Those are the settings that the NV control panel defaults to if I reset it or do a clean install of new drivers. That's what you're complaining about AMD doing, yes?

#1 - Catalyst 10.10 defaults for HD68xx cards do not equal the same defaults for HD58xx cards. Under default quality settings, the HD68xx actually has worse image quality than the HD58xx series. Therefore, AMD's claims of superior AF/texture quality can only be realized in High Quality mode. Well, reviewers initially thought the improved image quality that AMD told them about was enabled by default. After investigating, their view, and AMD's confirmation, is that it isn't. Therefore, what's the point of touting these image quality improvements if you are not going to be testing with High Quality mode at all times? Hence the new testing methodology. The impact is a 5-6% performance drop from the initial tests.

#2 - I am not complaining. I am only providing information on our forum about some notable websites which have noted differences in image quality and, as a result, changed their testing methodologies. The result was a drop in performance, which changed their original reviews. Going forward, this new testing methodology means that reviews from these 5 websites are not going to be comparable to other websites that simply use default AMD settings.

#3 - You are still not understanding. This has little to do with Nvidia graphics cards. It is about testing a similar workload for all graphics cards, including older AMD cards.

For example, if the GTX580 were released and NV suddenly reduced image quality at its default Quality setting compared to the default Quality setting on the GTX480, then it would not really be reasonable to test the GTX580 against the 480 under the same defaults. And if somehow the GTX580 had superior texture image quality under High Quality settings, then all reviews should test the GTX580 with HQ on.

Think about it: let's say Toyota releases a new model and claims that it can achieve 41 mpg. Reviewers are under the impression that the car can achieve this with standard fuel, much like the previous model. However, when they actually conduct the test, they end up with 38 mpg on standard gasoline. They test the old model and end up with 39 mpg on standard gasoline. So they scratch their heads, since Toyota just claimed that the new model is more fuel efficient. Then, after asking Toyota, they get an official response that you need premium fuel to achieve the 41 mpg rating; otherwise your fuel economy with standard gasoline will actually be worse than the previous model's. Armed with this information, the testers redo the testing and get the 41 mpg as stated. However, the cost for the identical trip distance has now gone up, since premium fuel costs more $$ (just like added image quality costs graphics card performance). The automotive journalists then update their original findings and explain to the readers why they did that.
 

WelshBloke

Lifer
Jan 12, 2005
#1 - Catalyst 10.10 defaults for HD68xx cards do not equal the same defaults for HD58xx cards. Under default quality settings, the HD68xx actually has worse image quality than the HD58xx series. Therefore, AMD's claims of superior AF/texture quality can only be realized in High Quality mode. Well, reviewers initially thought the improved image quality that AMD told them about was enabled by default. After investigating, their view, and AMD's confirmation, is that it isn't. Therefore, what's the point of touting these image quality improvements if you are not going to be testing with High Quality mode at all times? Hence the new testing methodology. The impact is a 5-6% performance drop from the initial tests.

That's like saying there's no point in saying high levels of AA improve image quality unless you test with them at all times.
It's a choice for the user: increased performance or increased image quality. It's always been that way.



#2 - I am not complaining. I am only providing information on our forum about some noteable websites which have noted differences in image quality and as a result changed their testing methodologies. The result was a drop in performance which changed their original reviews. Going forward, this new testing methodology means that reviews from these 5 websites are not going to be comparable to other website that simply use default AMD settings.

Did they notice it or was it brought to their attention by some PR spods?

#3 - You are still not understanding. This has little to do with Nvidia graphics cards. It is about testing similar workload for all graphics cards, including older AMD cards.

For example, if GTX580 was released and NV suddenly reduced image quality at its default Quality settings compared to default Quality setting on GTX480, then it would not really be reasonable to test GTX580 against the 480 under the same defaults. And if somehow GTX580 has superior texture image quality under High Quality settings, then all reviews should test GTX580 with HQ on.

Of course it's relevant to Nvidia cards as well. When people read new card reviews they are most interested in how they perform with respect to the competition, which is nearly always from a different vendor.

You have already stated that the settings in my screen capture were "not Quality as you have at default in your system. So you are actually running your card in Performance image quality mode" - which is exactly the same as the AMD ones.
 

notty22

Diamond Member
Jan 1, 2010
I believe it was noticed because the 5 series cards suddenly picked up 5-6% in across-the-board results vs 10.9. So they investigated. AMD chose to time the change to the Standard driver setting with the release of the new hardware.
At least that's how I understand it.
 

RussianSensation

Elite Member
Sep 5, 2003
So you are actually running your card in Performance image quality mode." Which is exactly the same as the AMD ones.

No, it's not the same. Default Q for NV > Q for HD68xx series. I only pointed this out to you because I thought you should up it to HQ for yourself anyway :)

BTW, Xbitlabs took all their benchmarks down. Their HD6800 series has no numbers :whiste:
 

Keysplayr

Elite Member
Jan 16, 2003
No, it's not the same. Default Q for NV > Q for HD68xx series. I only pointed this out to you because I thought you should up it to HQ for yourself anyway :)

BTW, Xbitlabs took all their benchmarks down. Their HD6800 series has no numbers :whiste:

NOOO WAAYYYYYY!!!!! LMAOOOOO. I guess AMD didn't like their article???
 

GaiaHunter

Diamond Member
Jul 13, 2008
No, it's not the same. Default Q for NV > Q for HD68xx series. I only pointed this out to you because I thought you should up it to HQ for yourself anyway :)

And this conclusion is based on the TrackMania mipmap problems, the HL2 shimmering that doesn't go away with HQ, and the Oblivion screenshots.

For example, this guy (http://hardforum.com/showpost.php?p=1036385448&postcount=112) says:
These are some shots from AoC using the HD 6850's default quality settings. 4xAA|16xAF was set in game. I chose that title because I've also played it on GTX 470 SLi and HD 5870 Crossfire. I saw no differences in IQ running about between any of the cards I've named.

But I hope we will get some more detailed IQ analysis that doesn't rely exclusively on 4+ year old games.
 

RussianSensation

Elite Member
Sep 5, 2003
Is there somewhere that has tested this objectively (using reference images)?

As Gaia and others noted, it was tested with older games (Crysis, HL2, The Witcher, Oblivion, TrackMania). By "objectively", do you mean based on facts/quantifiable measurements? Image quality is both subjective (what is a large image quality difference to some is unnoticeable to others) and objective (you can render a nearly identical image with a completely different workload as a result of optimizations). For example, 99.9% of people would never notice FP16 demotion in games; still, the image quality difference is there and actually changes the way the developer intended the game to look.

NOOO WAAYYYYYY!!!!! LMAOOOOO. I guess AMD didn't like their article???

Click:
Xbitlabs HD6800 series review.

Nothing after Page 12... :oops:

Maybe they are redoing all the tests with different image quality settings? Who knows.
 

WelshBloke

Lifer
Jan 12, 2005
As Gaia and others noted, it was tested with older games (Crysis, HL2, The Witcher, Oblivion, TrackMania). By "reference images", do you mean standard filtering software as opposed to games?


A long time ago, when this issue came up, I remember some website rendered an in-game image with an ATI card and an Nvidia card, then "subtracted" each from a reference image and displayed the results.

Both cards were pretty far from the reference image if I remember correctly.
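
If anyone wants to reproduce that kind of comparison, something along these lines would do it. This is only a rough sketch using Pillow and NumPy, with made-up file names, not whatever tool that site actually used: subtract each card's screenshot from a reference render pixel by pixel and amplify the result so the deviations become visible.

Code:
from PIL import Image
import numpy as np

def diff_against_reference(card_png, reference_png, out_png, amplify=4):
    """Per-pixel absolute difference between a card's screenshot and a
    reference render, amplified so small deviations become visible."""
    card = np.asarray(Image.open(card_png).convert("RGB"), dtype=np.int16)
    ref = np.asarray(Image.open(reference_png).convert("RGB"), dtype=np.int16)
    if card.shape != ref.shape:
        raise ValueError("screenshots must be the same resolution")
    diff = np.abs(card - ref)  # 0 wherever the card matches the reference exactly
    Image.fromarray(np.clip(diff * amplify, 0, 255).astype(np.uint8)).save(out_png)
    return float(diff.mean())  # one number per card, lower = closer to the reference

# Hypothetical file names:
# print(diff_against_reference("hd6870.png", "refrast.png", "diff_amd.png"))
# print(diff_against_reference("gtx480.png", "refrast.png", "diff_nv.png"))

The screenshots would have to be captured from exactly the same frame and camera position, which is why this sort of test is usually done with a canned benchmark or timedemo.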
 

RussianSensation

Elite Member
Sep 5, 2003
A long time ago, when this issue came up, I remember some website rendered an in game image with an ATI card and an Nvidia card and then "subtracted" the image from a reference image and displayed the results.

Both cards were pretty far from the reference image if I remember correctly.

That's a very cool test. I've never seen something like that done. They did run the standard "cylinder" test. AnandTech has shown the HD5870/HD6870/GF100 side by side, and you can easily see the HD6870 is a lot better than the 5870. What these sites keep saying is that to achieve what you see in that ^^ comparison, they had to enable HQ for the HD6800 series.
 

Arkadrel

Diamond Member
Oct 19, 2010
[Image: 2.png - the 480]

[Image: 1.png - the 6870]



Meh, to me they look enough alike to not matter, maybe with a slight edge to the 6870.
It's easier to see on their own page:
http://www.anandtech.com/show/3987/...enewing-competition-in-the-midrange-market/5#