I figure this needs a thread of its own....
http://www.guru3d.com/article/exploring-ati-image-quality-optimizations/
http://archive.sunet.se/pub/games/PC/guru3d/img-cache/6800/amd-hq.bmp
Do you spot the difference? Probably not. That is the rule we live by here at Guru3D: if you cannot see it without blowing up the image or altering gamma settings and whatnot, it's not a cheat.
We urge and recommend AMD/ATI to disable the optimization by default in future driver releases and accept the performance loss, because in the end everything is about objectivity, and losing consumer trust, which (as little as it is) has already been endangered, will in the end do more harm than good. A drop of 3-4 FPS on average is far more acceptable than getting a reputation as a company that compromises on image quality. And sure, it raises other questions: does ATI compromise on other things as well? See, the costs already outweigh the benefits.
So the morally right thing for AMD/ATI to do is to make the High Quality setting the standard default. But again, we have to acknowledge that this remains a hard-to-spot series of optimizations; still, it is there and it can be detected.
Image quality = fps, and this is a form of cheating. It has nothing to do with whether you can see it. We can see it in reviews when one card is said to be faster because it lowered our game quality.
Image quality = fps, and this is a form of cheating. It has nothing to do with whether you can see it.
Why do we buy videocards?
I have read the conclusion of the article and it basically comes down to this: "It is not a cheat, but because it will cause people to scream CHEAT! you'd best just put it back to the way it was..." A 3% performance increase is not worth the potential loss in sales...
I liked the article.
You're not getting what I'm saying. It's cheating even if you need an instant replay to see it. So what's next, Nvidia should lower image quality to gain FPS just enough that we "really can't tell"? That's not good for any gamer. Then what? AMD does it again, then Nvidia, then AMD, then Nvidia.
Get what I'm saying?
The games I play with my videocard involve images on the screen.
With this talk of 'cheat' and the like it seems some of you are using your cards for a different kind of gaming.
How is image quality being lowered if you can't perceive it?
They've added a third setting.
See the middle slider? Move it to the right if you want to. You have the choice.
Image quality = fps, and this is a form of cheating. It has nothing to do with whether you can see it. We can see it in reviews when one card is said to be faster because it lowered our game quality.
So if Nvidia lowers its quality and we compare them, that makes it better for us? Say AMD does it again and we compare, then Nvidia does the same and we compare. Soon we will all have console-like graphics.
If we could all just put our fanboyism aside for a minute and put what's better for PC gamers first, you might see what I'm saying.
Then it's up to review sites to choose the proper setting? Is this what you're saying? Hey, if the quality is the same, I'm all for it.
Except when one company does it and doesn't inform reviewers of it until getting called out and exposed, and thus having an entire set of reviews skewed because an optimization was enabled that the other did not have enabled.
You said it has nothing to do with what we can see. So if performance gets faster with no perceivable image quality loss, we all win.
Except when one company does it and doesn't inform reviewers of it until getting called out and exposed, and thus having an entire set of reviews skewed because an optimization was enabled that the other did not have enabled.
When that happens, the community loses.
That is what this is all about. It is not about whether or not you are willing to play games with the optimizations. It is about getting accurate information out to the gaming community about true performance when reviews are done. Obviously that was not done with the 68's and the original 10.10 driver before the hotfixes, as AMD has ASO enabled by default, which Nvidia does not...and it wasn't that way with 10.9 and the 5XXX series AMD cards.
This issue seems more about fanboys worrying that their company might be at a disadvantage rather than a concern about image quality.
Nvidia does not have anisotropic sample optimization enabled by default, which, according to what I am reading, AMD does with this new driver for the 68's. Thus, when the two cards are left at default, the AMD card is going to have a performance advantage, and this is going to tilt review results.
So, you either have to move AMD's quality to high quality in order to remove the optimization, or perhaps enable Nvidia's ASO before doing a comparison between the two.
As mentioned, if people are willing to game with the optimization, then that is up to them. But from a reviewer's standpoint, when it comes to making accurate and fair comparisons and then making recommendations off those comparisons, you need to be sure the drivers are set as close to equal as possible. AMD enabled an optimization with the 68's that was not there before and did not tell anyone until they got caught, which is a bit underhanded imho.
http://archive.sunet.se/pub/games/PC/guru3d/img-cache/6800/amd-hq.bmp
http://archive.sunet.se/pub/games/PC/guru3d/img-cache/6800/amd-default.bmp
vs
http://archive.sunet.se/pub/games/PC/guru3d/img-cache/6800/nv-default.bmp
If you compare the AMD default vs. the NV default and look at them side by side, it's impossible for me to tell them apart if I didn't know which was which from the name of the pic.
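For what it's worth, screenshots that look identical to the eye can still be told apart programmatically. A minimal sketch of a pixel-level diff — using plain lists of RGB tuples as stand-in "images" here, since loading the actual BMPs would need an image library:

```python
def pixel_diff(img_a, img_b):
    """Compare two same-size images given as lists of (r, g, b) tuples.

    Returns (max_channel_delta, fraction_of_pixels_that_differ).
    A nonzero result means the images are NOT identical, even if no
    human could spot the difference on screen.
    """
    assert len(img_a) == len(img_b), "images must be the same size"
    max_delta = 0
    differing = 0
    for pa, pb in zip(img_a, img_b):
        # Largest per-channel difference for this pixel
        delta = max(abs(a - b) for a, b in zip(pa, pb))
        if delta > 0:
            differing += 1
        max_delta = max(max_delta, delta)
    return max_delta, differing / len(img_a)

# Two tiny "screenshots" that would look identical on screen,
# but one channel of one pixel is off by 2:
ref = [(128, 128, 128)] * 4
opt = [(128, 128, 128)] * 3 + [(126, 128, 128)]
print(pixel_diff(ref, opt))  # → (2, 0.25)
```

This is exactly why "I can't see it" and "the output is identical" are different claims: a diff like this (or a tool such as Pillow's ImageChops.difference on the real BMPs) will flag filtering optimizations that are invisible at normal viewing distance.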