AMD 58/68xx series - Reduced image quality starting with Catalyst 10.10 at Default?


Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
GaiaHunter, are you saying this only happens in older games? Like non-DX11 games?
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
This is good information to have since my 6850 is supposed to arrive tomorrow -- it sounds like I should turn off Catalyst AI for at least some games.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,731
428
126
GaiaHunter, are you saying this only happens in older games? Like non-DX11 games?

I'm saying if it is a widespread problem then it should be easily reproduced in all the games.

Otherwise what you have is an optimization that created some bug in some older titles.

Or do you think AMD needs the 6850/6870 to have more performance in Half-Life 2 or Oblivion, at the expense of IQ, to look good in reviews that don't even bench those games anymore?

It still is an issue, although users have the chance to fix it by choosing their settings.

If these optimizations do indeed lower IQ to all games, then the reviewers only need to take some screenshots of the games they actually benchmarked.
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
I'm saying if it is a widespread problem then it should be easily reproduced in all the games.

Otherwise what you have is an optimization that created some bug in some older titles.

Or do you think AMD needs the 6850/6870 to have more performance in Half-Life 2 or Oblivion, at the expense of IQ, to look good in reviews that don't even bench those games anymore?

It still is an issue, although users have the chance to fix it by choosing their settings.

It sounds like that is what people are making it out to be.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I'm saying if it is a widespread problem then it should be easily reproduced in all the games.

It's unrealistic to expect them to test 1000 games. If you want more performance, leave Cats AI on and texture filtering to Quality. If you don't notice the difference in the games you are playing, leave Cats AI on and filtering to Quality. If you want the best image quality, disable Cats AI and set filtering to HQ.

With older drivers, the image quality was better at default. This is pretty important to gamers (forget about benchmarks for a second).

However, when doing reviews, settings should utilize comparable filtering quality, which means HQ for AMD in reviews (something that Xbitlabs and ComputerBase are doing). If you don't agree with this methodology, then don't put any weight into their reviews. The choice is always yours in the end in how you want to play your games. :thumbsup:
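For readers who want the advice above at a glance, here is a tiny, purely illustrative Python summary; the goal names are invented for this sketch, and the settings still have to be applied by hand in the Catalyst Control Center (no driver API is assumed):

```python
# Purely illustrative lookup of the advice in the post above -- not a driver API.
# Settings must still be changed manually in the Catalyst Control Center.
RECOMMENDED_SETTINGS = {
    "max_performance": {"catalyst_ai": "On", "texture_filtering": "Quality"},
    "cant_tell_the_difference": {"catalyst_ai": "On", "texture_filtering": "Quality"},
    "best_image_quality": {"catalyst_ai": "Disabled", "texture_filtering": "High Quality"},
}

def recommend(goal: str) -> dict:
    """Return the Catalyst AI / texture filtering combination suggested above."""
    return RECOMMENDED_SETTINGS[goal]

if __name__ == "__main__":
    print(recommend("best_image_quality"))
```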
 
Last edited:

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
I'm saying if it is a widespread problem then it should be easily reproduced in all the games.

Otherwise what you have is an optimization that created some bug in some older titles.

Or do you think AMD needs the 6850/6870 to have more performance in Half-Life 2 or Oblivion, at the expense of IQ, to look good in reviews that don't even bench those games anymore?

It still is an issue, although users have the chance to fix it by choosing their settings.

If these optimizations do indeed lower IQ to all games, then the reviewers only need to take some screenshots of the games they actually benchmarked.


Does this only apply to DX9 titles? If so, big whoop. DX11 will be the standard going forward, and if some old DX9 game is a bit buggy, oh well.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,731
428
126
It's unrealistic to expect them to test 1000 games. If you want more performance, leave Cats AI on and texture filtering to Quality. If you don't notice the difference in the games you are playing, leave Cats AI on and filtering to Quality. If you want the best image quality, disable Cats AI and set filtering to HQ.

However, when doing reviews, settings should utilize comparable filtering quality, which means HQ for AMD in reviews (something that Xbitlabs and ComputerBase are doing). If you don't agree with this methodology, then don't put any weight into their reviews. The choice is always yours in the end in how you want to play your games. :thumbsup:

What I don't agree with is the methodology to reach this conclusion.

I just want them to show me 1 game of the ones they actually benchmark.

The implication here is that the games are faster due to low IQ.

They show a couple of games where there seems to be some lower IQ.

Then they turn off the driver optimizations, AF ones or otherwise, and test completely different games to show the lower speed.

They don't show the cause->effect relation.

This would be like saying that SLI is broken because it doesn't work in CIV V.

What they have actually shown is:

a) there seem to be side effects/bugs from this new optimization in a few older games;

b) the driver optimizations give around 5-6% more performance.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
What I don't agree with is the methodology to reach this conclusion.

I just want them to show me 1 game of the ones they actually benchmark.

Does this only apply to DX9 titles? If so, big whoop. DX11 will be the standard going forward, and if some old DX9 game is a bit buggy, oh well.

More websites are reporting the same findings. Different games were investigated too.

HT4U

TweakPC (with videos)

Clearly you need to compare AMD's HQ mode with Catalyst AI disabled against NVIDIA's default quality to get more or less similar IQ. That's three separate websites now reporting the same thing, with different games (The Witcher, Crysis, Oblivion, HL2).

Perhaps BFG can investigate the mipmap/texture filtering transitions in his Image Quality Analysis.
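For anyone unsure what "mipmap/texture filtering transitions" refers to, here is a minimal, hypothetical Python sketch (not actual driver code from AMD or NVIDIA) of how full trilinear blending differs from a narrowed blend band of the kind filtering optimizations are often accused of using; the `band` parameter is an invented knob for illustration only:

```python
# Illustrative only: how a narrowed trilinear blend band makes mipmap
# transitions more abrupt. 'band' is a hypothetical stand-in for a
# driver-side filtering optimization, not a real driver setting.

def trilinear_weight(lod_fraction, band=1.0):
    """Blend weight between mip level N and N+1.

    band=1.0 -> full trilinear: smooth linear blend across the transition.
    band<1.0 -> 'brilinear'-style shortcut: mostly bilinear, with a narrow
                blend zone around the transition, which can show up as
                visible mip banding on long, high-contrast surfaces.
    """
    lo = 0.5 - band / 2.0
    hi = 0.5 + band / 2.0
    if lod_fraction <= lo:
        return 0.0            # sample only the higher-resolution mip
    if lod_fraction >= hi:
        return 1.0            # sample only the lower-resolution mip
    return (lod_fraction - lo) / (hi - lo)  # linear blend inside the band

if __name__ == "__main__":
    for f in (0.0, 0.2, 0.4, 0.5, 0.6, 0.8, 1.0):
        print(f"lod_frac={f:.1f}  full={trilinear_weight(f):.2f}  "
              f"narrow_band={trilinear_weight(f, band=0.3):.2f}")
```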
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
@RussianSensation, GaiaHunter asked for in-game results.

The two links don't show any of the games they benchmarked; those are old synthetic benchmarks.
How come none of those guys show it where it has any meaning, inside a modern game?

Like AvP, DiRT 2, etc.

Take one screenshot of HQ in 10.9, one of HQ with 10.10, and then show an NVIDIA card at standard settings.

That way you could rule out it being just an issue with old DX games.
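A minimal sketch of how that screenshot comparison could be quantified rather than eyeballed, assuming matched captures of the same frame saved as PNGs (the filenames below are placeholders, and Pillow/NumPy are assumed to be installed):

```python
# Minimal sketch: quantify the per-pixel difference between two captures of
# the same frame (e.g. Cat 10.9 HQ vs Cat 10.10 HQ, or AMD HQ vs NVIDIA
# default). Filenames are placeholders, not real captures from the thread.

from PIL import Image
import numpy as np

def image_diff_stats(path_a, path_b):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        raise ValueError("Screenshots must be captured at the same resolution")
    diff = np.abs(a - b)
    return {
        "mean_abs_diff": float(diff.mean()),              # average per-channel error
        "max_abs_diff": int(diff.max()),                  # worst single-channel delta
        "pct_pixels_changed": float((diff.sum(axis=2) > 0).mean() * 100),
    }

if __name__ == "__main__":
    print(image_diff_stats("cat109_hq.png", "cat1010_hq.png"))
```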
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
Is this something that you believe happened? Or are you just throwing it out there as a possible reason or for some reasonable doubt?
Because in a nutshell, by default, NVIDIA's "Standard" quality setting is the same as AMD's "High Quality" setting. Obviously when you set the AMD quality to "Standard", you'd be setting it to a lower IQ than NVIDIA's default.
Now run benches with those settings and let me know if you think that's fair. Whether the human eye can perceive it or not isn't the issue. Don't pretend that could make any difference. We are talking numbers here.
You know what the problem is.
Precisely this. In order to do a fair performance comparison, it is essential that each card renders textures at the same quality level. If AMD Radeon cards at their default setting are putting out lower image quality, and the only way to match NVIDIA's default image quality is to move the Radeon card to its High Quality setting, then that is the only way to do a straight-up, fair comparison. It doesn't take much intellectual capacity to understand that any card rendering lower texture quality is going to perform faster. Plus, given that it only takes a frame or two for reviewers to laughingly call one card a 'clear' winner over another (based on the scaling of their graphs), which affects people's buying decisions, that is all the more reason to make sure everything is set to the same image quality.
 
Last edited:

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Wow AMD....this is about as low as it gets. Even the fanbois are having a hard time with this one.

I think getting another 5850 is out of the question at this point.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Teizo, no one is against that... we're just asking if you have proof that the same thing happens in all games, or at least in newer DX11 ones. Can you show 10.9 HQ + 10.10 HQ + NVIDIA standard in a newer game, so we can compare quality and see if it happens in DX11 titles too (best if you can show the same thing happening across a few games, so it's not just an isolated incident)?

If it happens in 1 or 2 old DX9 games, it's not an issue. No one benchmarks old DX9 games.

If it does happen in all games <.< yeah, AMD is playing dirty then, and should change it back to how it was in 10.9.
Until someone proves that's the case, I'm inclined to be skeptical.


Also, has anyone seen the difference in fps? It's like a 0.1-0.4 fps loss at most from changing it to the higher setting.
People make all this fuss over 0.1 fps?
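For context on how small that is, a quick back-of-the-envelope calculation; the 60 fps baseline is an assumed figure for illustration, not a measurement from the thread:

```python
# Back-of-the-envelope arithmetic for the quoted loss: a 0.1-0.4 fps drop at
# an assumed 60 fps average works out to well under 1%.
baseline_fps = 60.0
for loss in (0.1, 0.4):
    pct = loss / baseline_fps * 100
    print(f"{loss:.1f} fps loss at {baseline_fps:.0f} fps = {pct:.2f}% slower")
```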
 
Last edited:

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
More websites are reporting the same findings. Different games were investigated too.

HT4U

TweakPC (with videos)

Clearly you need to compare AMD's HQ mode with Catalyst AI disabled against NVIDIA's default quality to get more or less similar IQ. That's three separate websites now reporting the same thing, with different games (The Witcher, Crysis, Oblivion, HL2).

Perhaps BFG can investigate the mipmap/texture filtering transitions in his Image Quality Analysis.


I have absolutely spot-on, perfect 20/20 vision. I have to really stare at most of those images to see the IQ difference, and in game I doubt I would have enough focus to care.

Did it occur to anyone that changes in the driver code itself might be causing some generic rendering issues? At what point is IQ really going to matter with the rasterization/shader model anyway? Essentially this type of image generation is quickly hitting a brick wall.

Also, I really don't care. I have a hard time discerning the IQ differences and, as I said, I have PERFECT 20/20 vision with excellent detail. I would be glad to pass along the ophthalmology test I had done 6 months ago as part of renewing my CDL.

Either way, I can hardly see any difference at standard screen resolutions.

What this really is, is a red herring.

Enjoy building your strawman. I am going to go back to enjoying my games.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Precisely this. In order to do a fair performance comparison, it is essential that each card renders textures at the same quality level. If AMD Radeon cards at their default setting are putting out lower image quality, and the only way to match NVIDIA's default image quality is to move the Radeon card to its High Quality setting, then that is the only way to do a straight-up, fair comparison.

Exactly! If the end user can't tell the difference in the games he/she is playing at home, they can leave Catalyst Optimizations and the filtering quality at default settings. However, professional reviewers should strictly test with HQ on. Granted, this probably won't affect most people's gaming experience since I imagine anyone spending $200+ on a videocard is already using HQ texture filtering in the first place.

Also, I really don't care. I have a hard time discerning the IQ differences and, as I said, I have PERFECT 20/20 vision with excellent detail. Either way, I can hardly see any difference at standard screen resolutions.

What this really is, is a red herring.

Enjoy building your strawman. I am going to go back to enjoying my games.

If you don't find image quality differences material for you, not a big deal. Leave the optimizations and enjoy the added performance :). This doesn't make my post biased/strawman. Thank you.
 
Last edited:

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
^

So the AMD fan response is "Who cares, I can't tell the difference in IQ anyway!"


Talk about grasping at straws.

As Russian mentions above, it absolutely matters when comparing graphics cards.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Why don't they show these issues in the titles that they use in their benching? I'm sure there are plenty of instances of repeating texture patterns in current titles to test these things out, or am I missing something?
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
I agree it matters... however, PLEASE SHOW FACTS!

Go find a few new DX11 games, test them, and show that... so far we haven't seen any of that! Only old synthetic DX9 tests / old DX9 games, etc.

It might be a DX9-only issue... and not be present in DX11.


Don't see any difference either.

There is a difference; it's not huge, but it is there!

10.9 = prettier than NVIDIA standard.
10.10 = worse than NVIDIA standard.

Just out of curiosity... did the AMD guys give the NVIDIA guys crap about using WORSE texture quality before 10.10? If so, it's only fair they call foul on this.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,731
428
126
Exactly! If the end user can't tell the difference in the games he/she is playing at home, they can leave Catalyst Optimizations and the filtering quality at default settings. However, reviewers should strictly test with HQ on. Granted, this probably won't affect most people's gaming experience since I imagine anyone spending $200+ on a videocard is already using HQ texture filtering in the first place.



If you don't find image quality differences material for you, not a big deal. Leave the optimizations and enjoy the added performance :). This doesn't make my post biased/strawman. Thank you.
I guess that is what the reviewers have been doing too - leaving the optimizations on as they don't affect IQ.

Of course NVIDIA keeps fighting for Catalyst AI to be turned off.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Why don't they show these issues in the titles that they use in their benching? I'm sure there are plenty of instances of repeating texture patterns in current titles to test these things out, or am I missing something?

Back in the day, the FX 5900 series sacrificed image quality in some games (not all games) with trilinear texture filtering optimizations. Since it was impossible for every review website to test each game to figure out where they could leave the optimizations ON vs. OFF without adversely impacting image quality, respected websites like Anandtech and Xbitlabs started testing with trilinear optimizations off at all times to level the playing field.

This is a similar scenario, except now AMD is doing it. Of course some websites are not buying it and are voicing their view that in order to have apples vs. apples comparison, HQ should be applied for AMD and Cats AI: Disabled (like Computerbase/Xbitlabs, etc.). The end user still has the option to leave these optimizations ON at home.

Just out of curiosity... did the AMD guys give the NVIDIA guys crap about using WORSE texture quality before 10.10? If so, it's only fair they call foul on this.

No, because when setting NV to HQ, the image quality > AMD HQ. Of course reviewers are already using HQ for NV, not Standard.

You seem to have missed the whole point: AMD's Q with Cats 10.10 < AMD's Q with Cats 10.9 when comparing AMD cards with each other (take NV out of the picture for a moment). So now AMD's HQ with Cats 10.10 = AMD's Q with Cats 10.9. Therefore, to keep the apples-to-apples comparison like before, HQ should be used from now on, since it now equals the previous Quality setting.

I guess that is what the reviewers have been doing too - leaving the optimizations on as they don't affect IQ.

Of course NVIDIA keeps fighting for Catalyst AI to be turned off.

If you don't agree with ComputerBase/Xbitlabs and other websites that choose to go the HQ route, then don't read their reviews. Again, that doesn't square with the assertion that image quality is unaffected.
 
Last edited:

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
Teizo, no one is against that... we're just asking if you have proof that the same thing happens in all games, or at least in newer DX11 ones. Can you show 10.9 HQ + 10.10 HQ + NVIDIA standard in a newer game, so we can compare quality and see if it happens in DX11 titles too?

If it happens in 1 or 2 old DX9 games, it's not an issue. No one benchmarks old DX9 games.
Ok, I see what you are saying here. Testing each game would be incredibly time-consuming. However, this needs to be looked into. Don't you think?

@ Russian Sensation

Yeah, I remember NVIDIA being the first to do this, I believe, as they were getting waxed by the ATI 9800 Pro (of which I still own one) and the 9800 XT back in the day. It's a damned shame if this is true, given how much mud ATI slung at NVIDIA for doing it, only to turn around and do the same later on.
 
Last edited:

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
^

So the AMD fan response is "Who cares, I can't tell the difference in IQ anyway!"


Talk about grasping at straws.

As Russian mentions above, it absolutely matters when comparing graphics cards.


Can you see the difference, or has your brain been convinced to perceive one?

Ask yourself this: is there really a difference?

Where's Waldo?

A true IQ issue would be visible across the whole scene. This only seems to imperceptibly affect a few random objects with nearly zero consistency. Did it occur to anyone that it's not a quality issue in terms of driver settings, but more likely a code path that just isn't jiving well?

BTW, on my new big 31-inch monitor I still can't see a difference unless I zoom way in, and then the image is so pixelated it's not even discernible.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Ok, I see what you are saying here. Testing each game would be incredibly time-consuming. However, this needs to be looked into. Don't you think?

@ Russian Sensation

Yeah, I remember NVIDIA being the first to do this, I believe, as they were getting waxed by the ATI 9800 Pro (of which I still own one) and the 9800 XT back in the day. It's a damned shame if this is true, given how much mud ATI slung at NVIDIA for doing it, only to turn around and do the same later on.

Yes, this needs to be looked into.
We need to see screenshots of 3-4 different new DX11 games.

We need 10.9 HQ, 10.10 HQ, and NVIDIA standard screenshots for each of the 3-4 games so we can draw a valid conclusion.