Exploring ATI Image Quality Optimizations by Guru3D

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
Actually AMD should be given credit for giving ME (an AMD user) 5-10% extra performance for free.
NV users can have it too if they decide to enable ASO. The only thing is, yours is set that way by default.

It was nice to have a civilised discussion in the video forum. :thumbsup:
Likewise. After all, they are the only ones worth having :thumbsup:
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
I see absolutely zero difference between the images at Guru3D; all the Mass Effect screens look identical.

The apocalyptic tone of some of these replies is hilarious.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Guru3D pretty much reached the same conclusion. They estimate that 99.99% of people won't see an actual difference.

I lied. There is a difference. The character is facing different directions and the camera is shifted a fraction of a degree!
 

Mistwalker

Senior member
Feb 9, 2007
343
0
71
I can only assume AMD decided the vast majority of gamers would never notice the difference in visuals but appreciate the extra speed, and from what I've seen in comparison screenshots, I agree with their judgment. I think the change will end up giving gamers a superior experience, but the underhanded way it was implemented can't be helping their credibility. They should have known better.

That said, the onus should always be on review sites to check driver settings before benchmarking hardware. In this case it's simply one slider setting away from NVidia's defaults, hardly a complex conspiracy.

Demanding that either company keep tabs on the others' optimizations to keep things as lock-step as possible is kind of insane, frankly. What if NVidia had decided first that ASO was worth enabling by default, that the benefits outweighed the minute visual differences? Would ATI then be free to call foul, or should they just follow suit in a sense of "fairness"? Is neither company allowed to have its own perceived "optimal experience settings"?

So long as any such changes are 1. transparent and clearly mentioned in change logs, and 2. do not remove or restrict access to manually overriding the default optimizations, I don't see a reason to fault anyone. In this case AMD clearly dropped the ball on number 1, which is why they were called out, and very rightly so. The change itself is much ado about nothing IMHO.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
I couldn't tell the difference between the pics, and spent 5 minutes zoomed in on them comparing various areas of the room. Meh. Yeah, it's slimy, but it's not enough to make me care.
 

badb0y

Diamond Member
Feb 22, 2010
4,015
30
91
Can we put this issue to rest now? If we have to pause and magnify the picture like 20x just to see the graphical difference, then it's pretty much a non-issue. On top of that, it's optional... Seriously, can we end this now?
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
You're not getting what I'm saying. It's cheating even if you need an instant replay to see it. So what's next? Nvidia should lower image quality to gain FPS just enough that we "really can't tell"? That's not good for any gamer. Then what? AMD does it again, then Nvidia, then AMD, then Nvidia.

Get what I'm saying?

I get what you're saying... you're saying you don't ever use MP3s, or even lossless compression, and keep all songs in WAV format, because any possible loss of information will cause your golden ear to twinge with pain when you think you hear something you really can't.

Reviewers who are smart enough to compare performance on an equal level will get kudos. But optimizations that improve performance by reducing quality in ways that cannot be perceived in-game ultimately lead to a better consumer experience, because that processing power can be spent on things you can actually notice in games, or consumer cards can be made cheaper by doing more with less.

Personally, I don't game by staring at stills. If that's the only way to perceive a difference, then I'm fine with optimizations, as long as there's a way to turn them off so that reviewers can compare the two on an equal level.
 

Attic

Diamond Member
Jan 9, 2010
4,282
2
76
Looked at the pics... Seriously, this is what the uproar is about? Haha... ok, continue the AMD hate. Get 'em Happy! :) How dare AMD give an option in their drivers that makes the image look identical but speeds up performance and can be turned off! Grrrr..!


lol, that nicely sums up the lunacy of the critics' uproar over this (non)issue.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Looked at the pics... Seriously, this is what the uproar is about? Haha... ok, continue the AMD hate. Get 'em Happy! :) How dare AMD give an option in their drivers that makes the image look identical but speeds up performance and can be turned off! Grrrr..!

A comparison like this should probably be "in motion", not a static pic. I don't know why static pics are used for IQ comparisons except zooming in to see AA levels.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Agree with the general consensus... If even after you stare at the pics you still can't find a difference, imagine actually playing the game, where you're worried about something other than looking at walls.

The whole thing is just sad... If a company can give us better performance for the same quality, what's not to like? Instead of NV fans complaining, why don't you ask Nvidia to do the same? I'd be annoyed if my card was underperforming because of some stuff I can't see.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Agree with the general consensus... If even after you stare at the pics you still can't find a difference, imagine actually playing the game, where you're worried about something other than looking at walls.

The whole thing is just sad... If a company can give us better performance for the same quality, what's not to like? Instead of NV fans complaining, why don't you ask Nvidia to do the same? I'd be annoyed if my card was underperforming because of some stuff I can't see.

Actually, as I said in the post RIGHT above yours, you're more likely to see any degradation while actually playing and moving than in a still screenshot. And which general consensus are you agreeing with? The one that sees no difference and says everyone should keep on going as always with default settings? Or the consensus that maintains that when benchmarking, all things should be equal? Image quality degradation for the sake of a few FPS isn't worth it (in the words of Guru3D).
 

Aristotelian

Golden Member
Jan 30, 2010
1,246
11
76
So if Nvidia lowers its quality and we compare them, that makes it better for us? So say AMD does it again and we compare it, then Nvidia does the same and we compare. Soon we will all have console-like graphics.

If we could all just put our fanboyism aside for a minute and put what's better for PC gamers first, you might see what I'm saying.

Hello? It's the default setting that you are referring to on ATI's cards here. Where does this imply an overall reduction in IQ? That's news to me.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Hello? It's the default setting that you are referring to on ATI's cards here. Where does this imply an overall reduction in IQ? That's news to me.

Ah, ok, maybe you've missed the whole premise of this situation AMD has created for itself.

You could start by viewing (translate if need be) the 4 German websites that have made some discoveries regarding the AMD 10.10 and 10.11 drivers and default image quality settings.

Just to be clear, all these conversations have to do with the IQ of AMD as compared to Nvidia. So if I mention Nvidia in here, it's still on topic, because in the end, what do you think Guru3D is comparing AMD IQ to?

Anyway, back on track. What is being maintained is that AMD's default image quality settings in the 10.10 and 10.11 drivers produce lower image quality than Nvidia's default settings in current drivers. They in fact state that even after upping the image quality to High Quality on the AMD driver, it still doesn't equal NV's default image quality. It comes closer, and looks better, but is still behind.
This is what all the hubbub is about.
AMD releases 10.10 for the 68xx series cards and sends this driver to review sites for benching about a week before the NDA expires. All sites conduct their benches and write up nice shiny reviews, getting all ready for launch day.
The NDA expires. Reviews go up, all of which are using lower IQ on the AMD cards. This gives AMD an advantage and boosts their numbers a bit. Every little bit helps, right?

Ok, now all those reviews are out of the gate with wrong numbers. AMD has an out either way. If nobody picks up on the IQ difference, all is warm and fuzzy and nobody is the wiser. But if, say, a few German websites dig into things a bit more and, as you can see, cause some heat in the tech community, AMD can use their other out: "Oops, it was a bug in both drivers, 10.10 and 10.11, and we didn't pick up on it 'til it was brought to our attention."
At this point, AMD doesn't care, because all those benches are out in the wild with incorrect numbers.
I know now that those 4 German sites have been conducting a much wider investigation since the original blog about them was released. And from this point forward, any review site that doesn't first ensure image quality is the best it can be for both camps shouldn't be taken seriously as a referential review site any longer. What's the point of reviewing when all things aren't equal?

Anyway, I hope this clears up the whole situation for you.

Keys
 

heflys

Member
Sep 13, 2010
72
0
0
Anyway, back on track. What is being maintained is that AMD's default image quality settings in the 10.10 and 10.11 drivers produce lower image quality than Nvidia's default settings in current drivers. They in fact state that even after upping the image quality to High Quality on the AMD driver, it still doesn't equal NV's default image quality. It comes closer, and looks better, but is still behind.
This is what all the hubbub is about.
AMD releases 10.10 for the 68xx series cards and sends this driver to review sites for benching about a week before the NDA expires. All sites conduct their benches and write up nice shiny reviews, getting all ready for launch day.

I'm still trying to figure out how they were able to reach such a conclusion so easily. Heck, one of the sites I saw even provided a chart about the percentage of change. How the hell can someone even gauge such a thing, particularly when it's impossible (or nearly impossible) to discern a real difference?

The only solution to this so-called "problem" is for reviewers to review games at HQ only.
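
For what it's worth, a "percentage of change" like that can be produced mechanically even when nobody could spot the difference by eye. Here's a rough sketch in Python of how such a number might be computed (just my own illustration, assuming the Pillow and NumPy libraries and made-up screenshot file names, not whatever method the German sites actually used):

Code:
# Rough sketch: count how many pixels differ between two screenshots.
# Assumes Pillow and NumPy are installed; the file names below are hypothetical.
from PIL import Image
import numpy as np

def percent_changed(path_a, path_b, threshold=0):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        raise ValueError("screenshots must have identical resolutions")
    # A pixel counts as "changed" if any channel differs by more than the threshold.
    changed = np.abs(a - b).max(axis=2) > threshold
    return 100.0 * changed.mean()

print(f"{percent_changed('amd_default.png', 'nv_default.png'):.2f}% of pixels differ")

Of course, a per-pixel count like that says nothing about whether anyone could actually perceive the change in-game, which is exactly my point.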
 

heflys

Member
Sep 13, 2010
72
0
0
Oh, yeah, the guys over at guru3d posted this.

In it, they say letters on the sign are more visible for AMD (right side), and that the metal bars are more visible. To see the pic more clearly, press the zoom button.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Oh, yeah, the guys over at guru3d posted this.

In it, they say letters on the sign are more visible for AMD (right side), and that the metal bars are more visible. To see the pic more clearly, press the zoom button.

What guys? Editors?
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
Ah, ok, maybe you've missed the whole premise of this situation AMD has created for itself.

You could start by viewing (translate if need be) the 4 German websites that have made some discoveries regarding the AMD 10.10 and 10.11 drivers and default image quality settings.

Just to be clear, all these conversations have to do with the IQ of AMD as compared to Nvidia. So if I mention Nvidia in here, it's still on topic, because in the end, what do you think Guru3D is comparing AMD IQ to?

Anyway, back on track. What is being maintained is that AMD's default image quality settings in the 10.10 and 10.11 drivers produce lower image quality than Nvidia's default settings in current drivers. They in fact state that even after upping the image quality to High Quality on the AMD driver, it still doesn't equal NV's default image quality. It comes closer, and looks better, but is still behind.
This is what all the hubbub is about.
AMD releases 10.10 for the 68xx series cards and sends this driver to review sites for benching about a week before the NDA expires. All sites conduct their benches and write up nice shiny reviews, getting all ready for launch day.
The NDA expires. Reviews go up, all of which are using lower IQ on the AMD cards. This gives AMD an advantage and boosts their numbers a bit. Every little bit helps, right?

Ok, now all those reviews are out of the gate with wrong numbers. AMD has an out either way. If nobody picks up on the IQ difference, all is warm and fuzzy and nobody is the wiser. But if, say, a few German websites dig into things a bit more and, as you can see, cause some heat in the tech community, AMD can use their other out: "Oops, it was a bug in both drivers, 10.10 and 10.11, and we didn't pick up on it 'til it was brought to our attention."
At this point, AMD doesn't care, because all those benches are out in the wild with incorrect numbers.
I know now that those 4 German sites have been conducting a much wider investigation since the original blog about them was released. And from this point forward, any review site that doesn't first ensure image quality is the best it can be for both camps shouldn't be taken seriously as a referential review site any longer. What's the point of reviewing when all things aren't equal?

Anyway, I hope this clears up the whole situation for you.

Keys

I hate to say it, but given their driver quality history I am inclined to believe this is actually the case.
 


Qbah

Diamond Member
Oct 18, 2005
3,754
10
81
I still don't understand how this can be called lowered IQ "in general" if nobody can notice a difference in a modern game. Yes, Trackmania has a clear difference. Trackmania is an old game (not to mention some other stuff in it looks better on AMD...). What was the other game? Half-Life 2? Is there any difference in L4D/L4D2? Portal? Any other Source games?

So... can someone enlighten me? How is that cheating? Is there a perceivable difference in IQ when running benchmarks? No. Any recent games? No. Speed increase? Yes. I don't think you can get a better definition of optimization than that.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Seriously? You can clearly see the Trackmania sign on AMD's side looks better than the one on Nvidia's.

It's blatant.

What I need to know is: why are you drawing attention away from the huge thing on the road that isn't circled, and toward tiny text on a flapping, waving banner, and some girders that actually look clearer and less blurred on the left side than the right? And which is going to give you a headache when moving?