AMD 58/68xx series - Reduced image quality starting with Catalyst 10.10 at Default?


GaiaHunter

Diamond Member
Jul 13, 2008
Are you sure? There was no compression at all on the pictures. None whatsoever.

That's great that you see a difference, BTW.

They are both the Nvidia images.

I just copied them for both sides and removed all indications of what they are.

Good to know that there is no difference, yet you see one.

Surprise.

I was hoping that you had simply cut the AMD and NVIDIA tags...
 

ModestGamer

Banned
Jun 30, 2010
Wow, after all that hyperbole about how great these pictures are and stuff, when I plant the easter egg, everybody shuts up. Wow. Huh, I was hoping this topic might have actually had some merit. Guess not.
 

RussianSensation

Elite Member
Sep 5, 2003
Wow, after all that hyperbole about how great these pictures are and stuff, when I plant the easter egg, everybody shuts up. Wow. Huh, I was hoping this topic might have actually had some merit. Guess not.

What are you talking about? What "easter egg"? Everyone stopped talking about this topic because it's not going anywhere. There are clearly 3 sides: (1) people who can see the difference (in videos) and think HQ should be used (as shown by the trend of websites strictly switching to this new testing methodology), (2) people who can't see the difference or just don't care enough, and (3) those who didn't even bother watching any videos but are still discussing pictures... and throwing out "compression artifacts" arguments even though ORIGINALS are available on the websites. :whistle:

Even BFG confirmed that current default settings in Cats 10.10 result in reduced image quality compared to default settings in Cats 10.9. Also, he implied that AlienBabeltech already tests at the Highest quality anyway. So obviously for some "Advanced" reviewers, this will never be an issue. But the point still stands that for every gamer, out of the box, there will be reduced image quality with Cats 10.10 compared to prior Cats versions.
 
Last edited:

ModestGamer

Banned
Jun 30, 2010
What are you talking about? Everyone stopped talking about this topic because it's not going anywhere. There are clearly 3 sides: (1) people who can see the difference (in videos), (2) people who can't see the difference or just don't care enough, and (3) those who didn't even bother watching any videos but are still discussing pictures... :whistle:


People can see a difference between the same image. That in itself implies that:

1. there is little to no difference,
2. the topic is entirely hyperbole, and
3. bias infers a difference where one does not exist.

So no, the topic is crap. Yes, there are a few buggy image renders; maybe try emailing AMD to let them know so they can fix the driver optimizations.

Stop whining.
 

RussianSensation

Elite Member
Sep 5, 2003
So no, the topic is crap.
Stop whining.

Is that why 5 websites changed their testing methodologies/results? Because they are "stupid", "blind" and "biased"? You realize this issue came about when comparing image quality between Radeon cards.

If you have nothing interesting to contribute to this discussion, please move along.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
Even BFG confirmed that current default settings in Cats 10.10 result in reduced image quality compared to default settings in Cats 10.9. Also, he implied that AlienBabeltech already tests at the Highest quality anyway. So obviously for some "Advanced" reviewers, this will never be an issue. But the point still stands that for every gamer, out of the box, there will be reduced image quality with Cats 10.10 compared to prior Cats versions.

BFG also stated that he tests NVIDIA in HQ and that it is hard/impossible to do a direct comparison of AF.

The other sites are doing HQ on the 6800 vs. Q on NVIDIA and/or turning Cat AI OFF!
 

RussianSensation

Elite Member
Sep 5, 2003
BFG also stated that he tests NVIDIA in HQ and that it is hard/impossible to do a direct comparison of AF.

For the 100th time, this is NOT ABOUT NVIDIA VS. AMD.
It's about running HD68xx at the same image quality as HD58xx series. HD58xx at Q settings > HD68xx at Q.

The other sites are doing HQ on the 6800 vs. Q on NVIDIA and/or turning Cat AI OFF!


Website #6 (GPU-Tech.org):

"That would also mean, that apart from the harsh banding the Evergreens texture filter is as good as it gets - at least with AI disabled - and HD6k is step back with regard to texture quality."

Also, these websites are actually using NVidia High Quality, not Standard Q, against HQ for AMD (which always used to be their standard Q setting).

"Supposedly, with the slider set to High Quality all texture optimizations are turned off as was the case on HD 5800 with Catalyst 10.9 and AI set to default already" - GPU-Tech.org

Therefore, new testing will be HD58xx Q = HD68xx HQ.

NV shots are only there to show that HD58xx and NV produce nearly identical image quality compared to inferior HD68xx series in Q.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
No, they are actually using NVidia High Quality, not Standard, against HQ for AMD (which always used to be their standard Q setting).

Also, you aren't getting this - this only affected website reviews which tested everything at default, not the ones that already used HQ for both camps. This has NOTHING to do with NVDA. It's about running HD68xx at the same image quality as the HD58xx series. HD58xx ~ NV at Q settings > HD68xx at Q.

Therefore, new testing will be HD58xx Q = HD68xx HQ.

NV shots are only there to show that HD58xx and NV produce nearly identical image quality compared to inferior HD68xx series in Q.

Again, there are no shots of commonly/recently used benchmark games.

If the IQ difference is everywhere, then it should be easy to show on a recent title too.

And again, the videos show shimmering on old titles, which can be attributed to LoD bias settings.
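
(For illustration only - a minimal Python sketch, assuming the textbook log2-of-footprint mip selection rule, of why a negative LoD bias looks sharper in stills but shimmers in motion; the numbers are made up.)

Code:
import math

def mip_level(texels_per_pixel, lod_bias=0.0, num_mips=10):
    # Textbook approximation of GPU mip selection:
    # lambda = log2(texel footprint per pixel) + LoD bias, clamped to the mip chain.
    lam = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return min(max(lam, 0.0), num_mips - 1)

# A surface where each screen pixel covers ~4x4 texels normally samples mip 2.
print(mip_level(4.0))                 # 2.0
# A negative bias forces a more detailed mip: crisper in a screenshot,
# but under-filtered in motion, which shows up as shimmering/aliasing.
print(mip_level(4.0, lod_bias=-1.0))  # 1.0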

It is true that AMD shifted optimizations from AI Advanced to AI Standard, but once more: for all these games that get boosted performance numbers, show screenshots/videos of the reduced IQ.

Because I've also seen all these reviews from AT, [H], Guru3D, etc., where they haven't noticed a difference in IQ.

Are you telling me they are biased, stupid or blind?

EDIT: And by the way, Xbitlabs always used high quality.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
Because I've also seen all these reviews from AT, [H], Guru3D, etc., where they haven't noticed a difference in IQ.

Are you telling me they are biased, stupid or blind?

No, but not a single one of these did an image quality investigation/comparison. HardOCP recently concluded that Morphological AA looks better than 8x MSAA. I am not going to take anything they say seriously after that article when all their screenshots show massive blurring of textures with MLAA. HardOCP also concluded that HD5870 CF is no faster than GTX460 1GB SLI, which is obviously untrue.

The bottom line is that when NV produced inferior texture quality with trilinear optimizations, Xbitlabs quickly noted the issue. They have shown no mercy for any unfair optimizations in their reviews over the last 7-8 years. Xbitlabs was one of the first to point out that the usual AA reduces texture image quality in Metro 2033. Again, AT, Guru3D and HardOCP never mentioned that. If you don't agree, then feel free to disregard all PCGamesHardware, Xbitlabs and Computerbase.de reviews.
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
Fact is, there is a difference in an OLD game using DX9 or something.

If anyone could just show us the same issue in a newer DX11 game, there would be no doubt. NO one is doing that... 1 picture from 1 old game, where it might be a coding bug or something?

I question old synthetic tests for IQ comparisons... why isn't there a newer game with the issues showing? Like GaiaHunter says, it could be that it's not something that happens in newer games that use DX11 but only in that 1 older game. Maybe that's why no one noticed it in reviews, and no one has found any pictures of it in a newer game?
 

Genx87

Lifer
Apr 8, 2002
Fact is, there is a difference in an OLD game using DX9 or something.

If anyone could just show us the same issue in a newer DX11 game, there would be no doubt. NO one is doing that... 1 picture from 1 old game, where it might be a coding bug or something?

I question old synthetic tests for IQ comparisons... why isn't there a newer game with the issues showing? Like GaiaHunter says, it could be that it's not something that happens in newer games that use DX11 but only in that 1 older game. Maybe that's why no one noticed it in reviews, and no one has found any pictures of it in a newer game?

Why would texture filtering vary between APIs?
 

GaiaHunter

Diamond Member
Jul 13, 2008
The bottom line is that when NV produced inferior texture quality with trilinear optimizations, Xbitlabs quickly noted the issue. They have shown no mercy for any unfair optimizations in their reviews over the last 7-8 years.

And I was thinking you were saying this wasn't about NVIDIA at all.

Xbitlabs was one of the first to point out that the usual AA reduces texture image quality in Metro 2033. Again, AT, Guru3D and HardOCP never mentioned that. If you don't agree, then feel free to disregard all PCGamesHardware, Xbitlabs and Computerbase.de reviews.

And AT mentioned mipmap transition problems in their 6800 review in TrackMania.

Xbitlabs didn't say anything about AF problems though: http://www.xbitlabs.com/articles/video/display/radeon-hd6870-hd6850_5.html#sect1


As opposed to MLAA, the benefits of the new AF algorithm are easy to see. They won't be so clear in the process of actual gaming, yet a sharp-eyed gamer will surely spot the difference, especially as modern games offer a lot of scenes suitable for that.

Thus, the Radeon HD 6800 doesn’t look like a completely new product. It is rather a steady and evolutionary development of the successful Radeon HD 5800 architecture.

Even Computerbase.de says the only shooter where they noticed this problem was HL2, and HQ and disabling AI weren't enough to fix the problem!

If they want to change their test methods it is fine.

I just wish they would show proof of their claims of lower IQ based on 1 single frigging game they actually BENCHMARK!

"The new AF IQ is lower because we see some minor problems* and/or flickering in a few old games! So we're going to test with Cat AI off and HQ, even though in HL2 turning Cat AI OFF didn't solve the problem!"

*which are small differences in some zoomed-in screenshots of old games that are even open to subjectivity.

It would be fun if the next driver fixed the shimmering in HL2. Would that mean the AF quality isn't lower?
 
Last edited:

Arkadrel

Diamond Member
Oct 19, 2010
Why does a 480 get 5-10 fps in an old game like FFXI, when any older 2xx card can run it capped at 30 fps? What is it now, 8 months and no driver fixes for the issues for 4xx cards?
(On the NVIDIA forums it has like 84,000 views and 1,000 replies, and a driver dude kept saying for 6 months that a fix would come within 1 week or something, and just kept pushing it back and back.)

It's an old game... hardware changes over time, what do I know (apparently sometimes they aren't as backwards compatible as they should be)... I just know it's not uncommon for new hardware to eventually have problems with old stuff.
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
1 picture from 1 old game, where it might be a coding bug or something?

If you followed this thread, it was found by reviewers in at least 5 games: Oblivion, HL2, Crysis, The Witcher, Trackmania.

And AT mentioned mipmap transition problems in their 6800 review in TrackMania.

No, they said mipmap transition problems happened with the HD5800 series, not the 6800. Once you set the HD6800 to HQ, these issues are improved compared to the HD58xx series. The problem is with websites that tested in Q only (of course, they have corrected their reviews since then). I am pretty sure AT tested with HQ, so their review is perfectly valid.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
Why would texture filtering vary between APIs?

Don't know.

Why are all the examples games that are several years old?

Someone just decided to test IQ on games they NEVER BENCHMARK?

If they had come forward and said "we found some IQ problems in some of the games we generally test" and then proceeded to show the problems, that would be natural.

But no - a problem in a few old games, and they don't even care enough about these games to benchmark them at both default settings and their new methodology to see/show the performance difference.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Why are there no pictures of Crysis or The Witcher? Are those games running in DX11 or DX9 when he took the screenshots?
 

RussianSensation

Elite Member
Sep 5, 2003
But no - a problem in a few old games, and they don't even care enough about these games to benchmark them at both default settings and their new methodology to see/show the performance difference.

At least now we know that performance for the HD68xx series will be 5-6% slower in Xbitlabs, Computerbase and PCGamesHardware reviews compared to reviews which continue to use the Q setting only. This will probably also be true for the HD69xx series going forward. So at the very least, this new information regarding their testing methodologies is important to us, the readers. Personally, I will continue to read Guru3D, AT and Xbit/ComputerBase/PCGamesHardware, but now it will be easier to explain why the numbers don't exactly match.

I think you and I can agree or disagree with the approach used (and that's cool esp. since you always approach differences of opinion with respect :thumbsup:). However, we have more up-to-date information on how their reviews will change - and that's important. ;)
 
Last edited:

BathroomFeeling

Senior member
Apr 26, 2007
Wow, after all that hyperbole about how great these pictures are and stuff, when I plant the easter egg, everybody shuts up. Wow. Huh, I was hoping this topic might have actually had some merit. Guess not.
'Shuts up'? There's this thing called sleep, you know, which you yourself partook in, yet no one here said it meant your deafening silence.

As for merit, no, you didn't. You flat-out lied about using nVidia images when you clearly used AMD ones. For the record, there was a difference between the images you posted; pixel values were measured and compared.
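
(For what it's worth, a per-pixel check like that is easy to reproduce - here is a rough Python/Pillow sketch of one way pixel values could be measured and compared; the filenames are hypothetical and this is not necessarily how it was done here.)

Code:
import numpy as np
from PIL import Image  # requires the Pillow package

def compare_screenshots(path_a, path_b):
    # Load both screenshots as RGB arrays and report how many pixels differ.
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        return "images have different dimensions"
    diff = np.abs(a - b)
    changed = int(np.count_nonzero(diff.max(axis=-1)))
    total = a.shape[0] * a.shape[1]
    return f"{changed} of {total} pixels differ (max channel delta {int(diff.max())})"

# Hypothetical filenames, for illustration only.
print(compare_screenshots("shot_a.png", "shot_b.png"))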

In any case, folks, my point has gone beyond the bounds of sense. It was originally intended to address Arkadrel's statement & to serve my curiosity, in a somewhat passing fashion, nothing more. It has served its purpose with his mature & calm response, so I'll leave it be. It's clear, though, that some here are not interested in a mature discussion and would rather exaggerate it to the point of absurdity (you know, 'hyperbole'). I'll leave them to apologize for their own behavior.

Thanks.
 

GaiaHunter

Diamond Member
Jul 13, 2008
At least now we know that performance for HD68xx series will be 5-6% slower in Xbitlabs, Computerbase, PCgameshardware compared to every other review, and probably the same is going to be true for HD69xx series going forward. So at the very least, this new information regarding their testing methodologies is important to us the readers. Personally, I will continue to review Guru3D, AT and Xbit/ComputerBase/PCGamesHardware, but now it will be easier to explain why the numbers don't match. ;)

The problem with canned/synthetic benches like 3DMark is not that NVIDIA/AMD can optimize. Optimizing performance by getting rid of unnecessary/wasteful work isn't a bad thing. The problem with canned/synthetic benches is that they don't represent actual games, or even entire games. I generally skip all the synthetic benchmarks in any review.

Someone who is a scientist might be annoyed with companies creating shortcuts, but what matters for the consumer is the image he sees while playing, regardless of whether the GPU is doing all the work the game is asking for or just enough work to present an image identical to what it would have produced doing the full work.

Of course optimizations have downsides - sometimes they create an error/bug.

So if there is an optimization that reduces IQ but I can't see the difference unless I'm standing still and have zoomed in I don't know how many times, I'll disregard it. Of course there will always be someone who just has incredible eyesight and spots the most minute difference, but they are the exception.

Are there shimmering textures in that HL2 video? Yes, there are. And they aren't corrected by turning HQ on or even disabling Cat AI. To me that seems like a bug, but I might be wrong. In TrackMania and HL2 you can even see the mipmap transition in a screenshot (it happens for the 5800 series in TrackMania too, as the AT 6800 review shows).

The Oblivion screenshots and The Witcher screenshot, on the other hand, don't show much, if anything, at all.

If the IQ is really lower, then it should be visible in all games, so there will be no problem in picking a game like SC2 or AvP or BFBC2 or whatever and showing this, is there?

EDIT:
I think you and I can agree or disagree with the approach used (and that's cool esp. since you always approach differences of opinion with respect ). However, we have more up-to-date information on how their reviews will change - and that's important.

I guess I fit in the category of "I don't really care about this minute difference" and I'm just debating because I like to do so.

I would be worried if someone just went ahead and showed huge differences that couldn't be denied by even the biggest fanboy of either company (hmm, maybe they would still deny them); then I would lose all trust in the review sites and would have to start asking/demanding demonstrations in the PC shops. But these are at best minute differences, with the shimmering being the exception.

Maybe I should - "I want to see this game with a 6870 and with a GTX 460".
 
Last edited:

RussianSensation

Elite Member
Sep 5, 2003
If the IQ is really lower, then it should be visible in all games, so there will be no problem in picking a game like SC2 or AvP or BFBC2 or whatever and showing this, is there?

I don't think it's as simple as noticing the difference in all games, though, because the driver will optimize shaders/textures in different games to a different extent. So what may result in reduced image quality in one game may have no visual impact in another game.

For example, AMD already admitted to "FP16 Demotion" in several games, but only after NV pointed it out. FP16 demotion allows the Catalyst driver to boost 3D rendering performance in benchmarks or games a bit, but without a significant reduction in image quality. It is simply the use of a 32-bit render target instead of a 64-bit render target. However, since the actual workload is still decreased, this makes it more difficult to provide fair benchmarking results. FP16 Demotion only affected several games, but not all games by any means. As a result, these issues may have no effect on SC2, AvP or BFBC2 at all. What about the next 10 future games? It would be too time-consuming for reviewers to keep checking for these issues. So their more realistic solution would be to test with the highest quality available at all times, and then comment on any perceived differences.
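
(A back-of-the-envelope Python sketch of where that small boost comes from, assuming a single 1920x1080 render target written once per frame at 60 fps; real savings depend on how many HDR targets and passes a game actually uses.)

Code:
# Rough write-bandwidth estimate for one full-screen render target.
WIDTH, HEIGHT, FPS = 1920, 1080, 60  # assumed resolution and frame rate

def target_write_bw_mb_s(bits_per_pixel):
    return WIDTH * HEIGHT * (bits_per_pixel / 8) * FPS / 1e6

fp16_target = target_write_bw_mb_s(64)     # RGBA16F: 4 channels x 16-bit float
demoted_target = target_write_bw_mb_s(32)  # e.g. R11G11B10F or RGBA8
print(f"64-bit target: {fp16_target:.0f} MB/s, 32-bit target: {demoted_target:.0f} MB/s")
# Halving the pixel size halves that target's write bandwidth, which is where the
# small performance gain (and the benchmarking fairness debate) comes from.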

In a perfect world, I agree with you. They should test each game one by one and then decide if the optimizations negatively affect the game in question. I just don't think it's feasible to do such a study. Personally, I think NV and AMD cards should always be tested with the highest image quality. That's what $200 graphics cards are for, not another 4 fps!
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
I don't think it's as simple as having the difference in all games, though, because the driver will optimize different code in different games. So what may result in reduced image quality in one game may have no impact on another game.

For example, AMD already admitted to "FP16 Demotion" in several games, but only after NV pointed it out. FP16 demotion allows the Catalyst driver to boost 3D rendering performance in benchmarks or games a bit, but without a significant reduction in image quality. It is simply the use of a 32-bit render target instead of a 64-bit render target. However, since the actual workload is still decreased, this makes it more difficult to provide fair benchmarking results. FP16 Demotion only affected several games, but not all games by any means. As a result, these issues may have no effect on SC2, AvP or BFBC2 at all. What about the next 10 future games?

Have you noticed that now you can disable that optimization?

It is the "Surface Format Optimization". Kinda cool by AMD.

It would be too time-consuming for reviewers to keep checking for these issues. So their more realistic solution would be to test with the highest quality available at all times, and then comment on any perceived differences.

In a perfect world, I agree with you. They should test each game one by one and then decide if the optimizations negatively affect the game in question. I just think it's not feasible to do such a study.

While I understand that point, driver optimizations are a big part of this industry - we just need to look at all those performance increases for games in new driver releases.

Then there are the optimizations you can't disable.

And the thing here is that the Cat AI tab got merged with the Mipmap tab - before we had High Performance, Performance, Quality and High Quality. Now there are Performance, Quality and High Quality. Dunno exactly how that maps.

And all this is complicated by the fact that subjectivity can affect the perception of what looks better when comparing small details - does that sign look better a bit blurrier or sharper, etc.
 

ModestGamer

Banned
Jun 30, 2010
'Shuts up'? There's this thing called sleep, you know, which you yourself partook in, yet no one here said it meant your deafening silence.

As for merit, no, you didn't. You flat-out lied about using nVidia images when you clearly used AMD ones. For the record, there was a difference between the images you posted; pixel values were measured and compared.

In any case, folks, my point has gone beyond the bounds of sense. It was originally intended to address Arkadrel's statement & to serve my curiosity, in a somewhat passing fashion, nothing more. It has served its purpose with his mature & calm response, so I'll leave it be. It's clear, though, that some here are not interested in a mature discussion and would rather exaggerate it to the point of absurdity (you know, 'hyperbole'). I'll leave them to apologize for their own behavior.

Thanks.


You said the images were different, and now you're going to claim that I didn't use the ATI images.

Dude, your credibility is ZERO.

You even claimed that one had this or that.

The truth is:

THERE IS NO PERCEPTIBLE DIFFERENCE BETWEEN THOSE PHOTOS.

What's also very telling is that none of you picked up on the shade variances and halftone differences in the colors.

My painter did.

lol.
 

RussianSensation

Elite Member
Sep 5, 2003
The truth is:

THERE IS NO PERCEPTIBLE DIFFERENCE BETWEEN THOSE PHOTOS.

I think you also have to review the videos. Sometimes texture filtering/mipmap transitions are too difficult to detect in screenshots alone. Again, just because you can't tell the difference doesn't mean it's not there in motion. Plus, you can always keep the optimizations on for yourself in games. All we are trying to do is understand why these websites decided to change their testing methodologies. It appears there are differences in older games. Whether HQ should be enabled in all games or just in older games is, as always, debatable and ultimately up to the end user.
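
(To illustrate why motion matters: a minimal Python/NumPy sketch - purely hypothetical, not something any review site published - that takes frames captured from a fixed camera position and measures frame-to-frame change; shimmering shows up as residual change even when each individual frame looks fine.)

Code:
import numpy as np

def temporal_change(frames):
    # frames: list of HxWx3 uint8 arrays captured with the camera held still.
    # Returns the mean absolute per-pixel change between consecutive frames;
    # stable texture filtering stays near 0, shimmering pushes it up.
    diffs = [np.abs(a.astype(np.int16) - b.astype(np.int16)).mean()
             for a, b in zip(frames[:-1], frames[1:])]
    return float(np.mean(diffs))

# Toy example: two identical frames vs. two frames where one texel flickers.
still = np.zeros((4, 4, 3), dtype=np.uint8)
flicker = still.copy()
flicker[0, 0] = 255  # one texel changes brightness between frames
print(temporal_change([still, still]))    # 0.0
print(temporal_change([still, flicker]))  # > 0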
 
Last edited:

Skurge

Diamond Member
Aug 17, 2009
No, but not a single one of these did an image quality investigation/comparison. HardOCP recently concluded that Morphological AA looks better than 8x MSAA. I am not going to take anything they say seriously after that article when all their screenshots show massive blurring of textures with MLAA.

They also picked a game that has virtually no drop going from no AA to 8xAA. I know, I have the game.