
Exploring ATI Image Quality Optimizations by Guru3D

Do you spot the difference? Probably not, and that is the rule we live by here at Guru3D: if you cannot see it without blowing up the image or altering gamma settings and whatnot, it's not a cheat.
http://archive.sunet.se/pub/games/PC/guru3d/img-cache/6800/amd-hq.bmp

http://archive.sunet.se/pub/games/PC/guru3d/img-cache/6800/amd-default.bmp
vs
http://archive.sunet.se/pub/games/PC/guru3d/img-cache/6800/nv-default.bmp



If you compare the AMD default vs the NV default and look at them side by side, it's impossible for me to tell them apart; if I didn't know from the file names, I couldn't say which was which.
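One way to back up the eyeball test is a programmatic per-pixel comparison of the two screenshots. A minimal sketch in Python, using plain nested lists of RGB tuples rather than an imaging library (in practice you would first load the BMPs linked above with something like Pillow):

```python
def max_pixel_diff(img_a, img_b):
    """Largest absolute per-channel difference between two same-sized
    images, each given as a list of rows of (r, g, b) tuples.
    Returns 0 only if the images are bit-identical."""
    diff = 0
    for row_a, row_b in zip(img_a, img_b):
        for (ra, ga, ba), (rb, gb, bb) in zip(row_a, row_b):
            diff = max(diff, abs(ra - rb), abs(ga - gb), abs(ba - bb))
    return diff

# Two tiny hypothetical 2x2 images differing by 3 in one green channel.
a = [[(10, 20, 30), (40, 50, 60)],
     [(70, 80, 90), (100, 110, 120)]]
b = [[(10, 23, 30), (40, 50, 60)],
     [(70, 80, 90), (100, 110, 120)]]

print(max_pixel_diff(a, b))  # → 3
print(max_pixel_diff(a, a))  # → 0
```

A nonzero result proves the renders differ even when no one can see it by eye, which is exactly the gap this thread is arguing about.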
 
He states clearly that the optimization is there and that it does not match Nvidia's default. So if he says it is there, it is not some Nvidia PR stunt.

It doesn't matter whether it is hard to discern, or whether people are willing to game with it on. What matters is that the default quality does not match Nvidia's default quality, so when it comes to reviewing, AMD should remove the optimization to match Nvidia and give a fairer comparison.

We urge and recommend AMD/ATI to disable the optimization by default in future driver releases and accept the performance loss, as in the end everything is about objectivity, and when you lose consumer trust, which (as little as it is) has been endangered, that in the end is going to do more harm than good. A drop of 3-4 FPS on average is much more acceptable than getting a reputation as a company that compromises on image quality. And sure, it raises other questions: does ATI compromise on other things as well? See, the costs already outweigh the benefits.
So the morally right thing for AMD/ATI to do is to make the High Quality setting the standard default. But again, we have to acknowledge that this remains a hard-to-recognize series of optimizations; still, it is there and it can be detected.

Doesn't get more definitive than that from a very reputable source.
 
The cards tested should be at equal (or as close as possible) IQ settings. If that means switching ATI cards to HQ, then it should be done. Hopefully BFG10k can do an in-depth analysis when he gets his 68xx card.
 
Image quality = fps, and this is a form of cheating. It has nothing to do with whether you can see it. We see it in reviews when one card is claimed to be faster while lowering our game quality.
 
Why do we buy videocards?

You're not getting what I'm saying. It's cheating even if you need an instant replay to see it. So what's next, Nvidia lowering image quality to gain FPS just enough that we "really can't tell"? That's not good for any gamer. Then what? AMD does it again, then Nvidia, then AMD, then Nvidia.

Get what I'm saying?
 
I have read the conclusion of the article and it basically comes down to this "It is not a cheat, but because it will cause people to scream CHEAT! you best just put it back to the way it was..." A 3% performance increase is not worth potential loss in sales...

I liked the article.
 
I have read the conclusion of the article and it basically comes down to this "It is not a cheat, but because it will cause people to scream CHEAT! you best just put it back to the way it was..." A 3% performance increase is not worth potential loss in sales...

I liked the article.

I agree.
 
You're not getting what I'm saying. It's cheating even if you need an instant replay to see it. So what's next, Nvidia lowering image quality to gain FPS just enough that we "really can't tell"? That's not good for any gamer. Then what? AMD does it again, then Nvidia, then AMD, then Nvidia.

Get what I'm saying?

No, I don't think you understand graphics card optimizations to begin with. Both camps are always trying to improve efficiency. In fact, both camps have already done this to get where they are today!
 
You're not getting what I'm saying. It's cheating even if you need an instant replay to see it. So what's next, Nvidia lowering image quality to gain FPS just enough that we "really can't tell"? That's not good for any gamer. Then what? AMD does it again, then Nvidia, then AMD, then Nvidia.

Get what I'm saying?

The games I play with my videocard involve images on the screen.

With this talk of 'cheat' and the like it seems some of you are using your cards for a different kind of gaming.

How is image quality being lowered if you can't perceive it?
 
Nvidia does not have the anisotropic sample optimization enabled by default, which, according to what I am reading, AMD does with this new driver for the 68's. Thus, when the two cards are left at default, the AMD card is going to have a performance advantage, and that is going to tilt review results.

So, you either have to move AMD's quality to high quality in order to remove the optimization, or perhaps enable Nvidia's ASO before doing a comparison between the two.

As mentioned, if people are willing to game with the optimization, then that is up to them. But from a reviewer's standpoint, when it comes to making accurate and fair comparisons and then making recommendations off those comparisons, you need to be sure the drivers are set as close to equal as possible. AMD enabled an optimization with the 68's that was not there before and did not tell anyone until they got caught, which is a bit underhanded imho.
 
You're not getting what I'm saying. It's cheating even if you need an instant replay to see it. So what's next, Nvidia lowering image quality to gain FPS just enough that we "really can't tell"? That's not good for any gamer. Then what? AMD does it again, then Nvidia, then AMD, then Nvidia.

Get what I'm saying?

They've added a third setting.

[screenshot: the driver's texture filtering quality slider]

See the middle slider? Move it to the right if you want to. You have the choice.
 
The games I play with my videocard involve images on the screen.

With this talk of 'cheat' and the like it seems some of you are using your cards for a different kind of gaming.

How is image quality being lowered if you can't perceive it?

So if Nvidia lowers its quality and we compare them, does that make it better for us? Say AMD does it again and we compare, then Nvidia does the same and we compare. Soon we will all have console-like graphics.

If we could all just put our fanboyism aside for a minute and put what's better for PC gamers first, you might see what I'm saying.
 
Image quality = fps, and this is a form of cheating. It has nothing to do with whether you can see it. We see it in reviews when one card is claimed to be faster while lowering our game quality.

So if Nvidia lowers its quality and we compare them, does that make it better for us? Say AMD does it again and we compare, then Nvidia does the same and we compare. Soon we will all have console-like graphics.

If we could all just put our fanboyism aside for a minute and put what's better for PC gamers first, you might see what I'm saying.


You said it has nothing to do with what we can see. So if performance gets faster with no perceivable image quality loss we all win.
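For what it's worth, "perceivable" can be put on a rough numeric footing. One common proxy is PSNR (peak signal-to-noise ratio); higher values mean the images are closer, and very high values are generally invisible to the eye. A minimal pure-Python sketch on hypothetical grayscale pixel lists:

```python
import math

def psnr(img_a, img_b, peak=255):
    """Peak signal-to-noise ratio in dB between two same-sized
    grayscale images given as flat lists of 0-255 values.
    Identical images give infinity; higher = more similar."""
    mse = sum((a - b) ** 2 for a, b in zip(img_a, img_b)) / len(img_a)
    if mse == 0:
        return math.inf
    return 10 * math.log10(peak ** 2 / mse)

# Hypothetical 4-pixel images: identical, then off by 1 everywhere.
base = [100, 150, 200, 250]
near = [101, 151, 201, 251]

print(psnr(base, base))            # infinite (bit-identical)
print(round(psnr(near, base), 1))  # → 48.1 (dB)
```

A metric like this lets a reviewer say "the default and HQ renders differ, but by N dB" instead of the unresolvable "I can/can't see it" argument running through this thread.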
 
Then it's up to review sites to choose the proper setting? Is that what you're saying? Hey, if the quality is the same, I'm all for it.

Please, don't twist my words into something that I didn't say. All I'm saying is it's a 3rd level of optimization. Better performance with minimal quality loss. It's up to you, the consumer, what you want to set it at. Seems like a good option, because nobody likes the performance setting anyway.
 
You said it has nothing to do with what we can see. So if performance gets faster with no perceivable image quality loss we all win.
Except when one company does it and doesn't inform reviewers of it until getting called out and exposed, and thus having an entire set of reviews skewed because an optimization was enabled that the other did not have enabled.

When that happens, the community loses.

That is what this is all about. It is not about whether or not you are willing to play games with the optimizations. It is about getting accurate information out to the gaming community about true performance when reviews are done. Obviously that was not done with the 68's and the original 10.10 driver before the hot fixes as AMD has ASO enabled at default which Nvidia does not...and it wasn't that way with 10.9 and the 5XXX series AMD cards.
 
So if Nvidia lowers its quality and we compare them, does that make it better for us? Say AMD does it again and we compare, then Nvidia does the same and we compare. Soon we will all have console-like graphics.

If we could all just put our fanboyism aside for a minute and put what's better for PC gamers first, you might see what I'm saying.

Nvidia does plenty of optimizing that you cannot see. They would be crazy not to.

The real question is: how does a card perform for the money you pay? Performance includes many factors, from IQ, the number of monitors it will run, fps, power, noise, heat, and durability, to features such as PhysX. Then of course you must factor in your computer, your monitor, and the games you play. If someone provides a better package for you, it turns into a pretty nice kind of cheat.

My 6770 is certainly a good enough mid-range solution for my needs. It's quiet, and the games I run look good on my modest monitor.
 
Except when one company does it and doesn't inform reviewers of it until getting called out and exposed, and thus having an entire set of reviews skewed because an optimization was enabled that the other did not have enabled.

When that happens, the community loses.

That is what this is all about. It is not about whether or not you are willing to play games with the optimizations. It is about getting accurate information out to the gaming community about true performance when reviews are done. Obviously that was not done with the 68's and the original 10.10 driver before the hot fixes as AMD has ASO enabled at default which Nvidia does not...and it wasn't that way with 10.9 and the 5XXX series AMD cards.

Why? Both companies optimise in their drivers anyway; as long as the image quality doesn't go down, it's all good.

This issue seems more about fanboys worrying that their company might be at a disadvantage rather than a concern about image quality.
 
Nvidia does not have the anisotropic sample optimization enabled by default, which, according to what I am reading, AMD does with this new driver for the 68's. Thus, when the two cards are left at default, the AMD card is going to have a performance advantage, and that is going to tilt review results.

So, you either have to move AMD's quality to high quality in order to remove the optimization, or perhaps enable Nvidia's ASO before doing a comparison between the two.

As mentioned, if people are willing to game with the optimization, then that is up to them. But from a reviewer's standpoint, when it comes to making accurate and fair comparisons and then making recommendations off those comparisons, you need to be sure the drivers are set as close to equal as possible. AMD enabled an optimization with the 68's that was not there before and did not tell anyone until they got caught, which is a bit underhanded imho.

What optimizations does nV have turned on by default? Just wondering, because you didn't say they don't have any on.
 
Except when one company does it and doesn't inform reviewers of it until getting called out and exposed, and thus having an entire set of reviews skewed because an optimization was enabled that the other did not have enabled.

When that happens, the community loses.

That is what this is all about. It is not about whether or not you are willing to play games with the optimizations. It is about getting accurate information out to the gaming community about true performance when reviews are done. Obviously that was not done with the 68's and the original 10.10 driver before the hot fixes as AMD has ASO enabled at default which Nvidia does not...and it wasn't that way with 10.9 and the 5XXX series AMD cards.

Nvidia doesn't have its drivers set to high-quality by default either. Do they always alert reviewers to this fact? Regardless, one can't discern an actual difference, so I don't see what the real issue is. If reviewers have an issue with it, they need to just review all games in high-quality. Simple.
 
http://archive.sunet.se/pub/games/PC/guru3d/img-cache/6800/amd-hq.bmp

http://archive.sunet.se/pub/games/PC/guru3d/img-cache/6800/amd-default.bmp
vs
http://archive.sunet.se/pub/games/PC/guru3d/img-cache/6800/nv-default.bmp



If you compare the AMD default vs the NV default and look at them side by side, it's impossible for me to tell them apart; if I didn't know from the file names, I couldn't say which was which.


Looked at the pics... Seriously this is what the uproar is about? Haha.. ok, continue the AMD hate. Get 'em Happy! 🙂 How dare AMD give an option in their drivers that makes the image look identical but speeds up performance and can be turned off! Grrrr..!
 