Exploring ATI Image Quality Optimizations by Guru3D

Page 6 - AnandTech Forums

ronnn

Diamond Member
May 22, 2003
3,918
0
71
I think you can use your imagination on this one.

Guess I will have to, since I don't have the game. My 6870 looks and runs pretty darn good on the games I have, but maybe my eyes aren't trained.


I was hoping that, as you usually do, you could add some information on what is happening here. Not sure I could tell a LOD from a BSOD.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Hilbert at Guru3D, whose reputation in the community is rock solid, agrees that it is there, and even though it is hard to see (just the same as when NV has their ASO enabled) and doesn't degrade IQ, it does make giving a proper, objective, comparative review impossible when driver defaults are loaded. Thus AMD needs to disable the ASO and go back to how they were before with Cat 10.9, and that is how it should be, and the community should demand it as well, so as to maintain the integrity of the reviews; else no one will be able to look at a review and trust what they are looking at. No one who cares about objectivity, that is. Unfortunately, these discussions always get mired in the hands of Atidiots and Nvidiots, and the information always gets misconstrued.
That's what the "high quality" setting is for, in both companies (unless it can be proven that some optimizations still aren't disabled with the high quality setting). Are you actually campaigning that these card companies should not work to improve the performance of their cards through their drivers? Personally, I think reviews should leave the drivers at default with optimizations on, that way, the cards are run as the companies intended. Most users are going to run them that way anyway, so why wouldn't the reviews? If reviewers notice something odd in the IQ while testing, then they can comment on it. If they don't, then the point is moot, no?
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
That's what the "high quality" setting is for, in both companies (unless it can be proven that some optimizations still aren't disabled with the high quality setting). Are you actually campaigning that these card companies should not work to improve the performance of their cards through their drivers? Personally, I think reviews should leave the drivers at default with optimizations on, that way, the cards are run as the companies intended. Most users are going to run them that way anyway, so why wouldn't the reviews? If reviewers notice something odd in the IQ while testing, then they can comment on it. If they don't, then the point is moot, no?

I would agree with this. Most people only seem to care about FPS. Image quality is rarely thrown around (unless someone is accused of cheating, it seems, and only because *gasp* it increases FPS).

I still think 90% of the people who buy video cards won't tinker with the settings, and the ones that do will know what they want (FPS vs. IQ).

AMD should have just said it's a bug, like nVidia said about HAWX. Haha.

"Yeah, we had a hand in making the software, but trust us - it's a bug that we didn't know about, nor did we know it gave our product an edge on benches. Scout's honor!"
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
I still think 90% of the people who buy video cards won't tinker with the settings, and the ones that do will know what they want (FPS vs. IQ).

That's why I think they should keep the new default setting, people that are aware enough of IQ to notice any difference know how to change the default to get the IQ they want. The people that don't mess with settings will benefit from the new setting and not be able to tell the difference.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
That's what the "high quality" setting is for, in both companies (unless it can be proven that some optimizations still aren't disabled with the high quality setting). Are you actually campaigning that these card companies should not work to improve the performance of their cards through their drivers? Personally, I think reviews should leave the drivers at default with optimizations on, that way, the cards are run as the companies intended. Most users are going to run them that way anyway, so why wouldn't the reviews? If reviewers notice something odd in the IQ while testing, then they can comment on it. If they don't, then the point is moot, no?
I don't have any issues with driver optimizations done by either company in order to increase FPS, even if IQ is degraded slightly. All that I ask is that the default settings that review sites use for comparative benchmarks keep these equivalent optimization settings the same, so I can trust I am looking at accurate comparative results when reviews are done.

Before Cat 10.10, this is what was happening, as I understand it. With, and after, Cat 10.10, this was not being done, due to AMD enabling ASO, which before was disabled with 10.9. So, for instance, when Nvidia's defaults are loaded, Trilinear Optimization is on, but ASO is off. With Cat 10.9 and before, AMD's equivalent settings were the same. However, with Cat 10.10, ASO was turned on by default, which meant that Nvidia had a comparative disadvantage, since the extra optimization was enabled on AMD's side. The problem also is that they didn't bother to tell anyone until it was exposed and they were asked directly about it, and that gave off the impression of cheating, although maybe that is too strong a word to use.

I was on the fence about the whole issue, debating mostly theory before, but I trust Hilbert's opinion, and I don't think someone with his standing in the community would take the time to write such an article and make the plea he did for AMD to go back to their previous settings if it were not the case that something had changed in AMD's driver settings that compromised a review site's ability to give the objective comparisons on which their sites depend.
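The trilinear-versus-optimization trade-off being argued in this thread can be sketched in a few lines. This is a deliberately simplified, hypothetical model (not either vendor's actual driver or hardware logic): full trilinear filtering blends the two nearest mip levels by the fractional LOD, while a "brilinear"-style trilinear optimization snaps to the nearest mip except in a narrow band around the transition, trading a subtle IQ change for fewer blended texture samples.

```python
import math

def trilinear_weight(lod):
    """Full trilinear: blend the two nearest mip levels
    using the fractional part of the LOD."""
    return lod - math.floor(lod)

def optimized_weight(lod, band=0.5):
    """Hypothetical 'brilinear'-style optimization: snap to the
    nearest mip level except within a narrow band around the
    transition, so most texels need no mip blending at all."""
    f = lod - math.floor(lod)
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if f <= lo:
        return 0.0                   # use only the finer mip
    if f >= hi:
        return 1.0                   # use only the coarser mip
    return (f - lo) / (hi - lo)      # blend only inside the band

# Mid-transition the two agree...
print(trilinear_weight(2.5), optimized_weight(2.5))    # 0.5 0.5
# ...but away from it the optimization skips the blend entirely.
print(trilinear_weight(2.25), optimized_weight(2.25))  # 0.25 0.0
```

Note that with `band=1.0` the optimized path degenerates to full trilinear, which is roughly what switching the driver to its "High Quality" setting is meant to restore.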
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I don't have any issues with driver optimizations done by either company in order to increase FPS, even if IQ is degraded slightly. All that I ask is that the default settings that review sites use for comparative benchmarks keep these equivalent optimization settings the same, so I can trust I am looking at accurate comparative results when reviews are done.

Before Cat 10.10, this is what was happening, as I understand it. With, and after, Cat 10.10, this was not being done, due to AMD enabling ASO, which before was disabled with 10.9. So, for instance, when Nvidia's defaults are loaded, Trilinear Optimization is on, but ASO is off. With Cat 10.9 and before, AMD's equivalent settings were the same. However, with Cat 10.10, ASO was turned on by default, which meant that Nvidia had a comparative disadvantage, since the extra optimization was enabled on AMD's side. The problem also is that they didn't bother to tell anyone until it was exposed and they were asked directly about it, and that gave off the impression of cheating, although maybe that is too strong a word to use.

I was on the fence about the whole issue, debating mostly theory before, but I trust Hilbert's opinion, and I don't think someone with his standing in the community would take the time to write such an article and make the plea he did for AMD to go back to their previous settings if it were not the case that something had changed in AMD's driver settings that compromised a review site's ability to give the objective comparisons on which their sites depend.

My main issue with this is that it is an unknown.

What are the default optimizations that nVidia uses on their default setting? Are they 1:1 to AMD's? The same for AMD: are they 1:1 to nVidia's?

The fact that there are optimizations by default on both sides defeats an apples-to-apples comparison. I like BFG/App's suggestion of running everything on HIGH. Remove all optimizations if you want a true comparison.

Otherwise, crying foul because one side used one more optimization than the other seems... I don't know, pointless. I'd love for someone to list the optimizations that are used by default. I'd wager one side uses something a little more than the other, and vice versa.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
That's why I think they should keep the new default setting, people that are aware enough of IQ to notice any difference know how to change the default to get the IQ they want. The people that don't mess with settings will benefit from the new setting and not be able to tell the difference.
I would agree with this. Most people only seem to care about FPS. Image quality is rarely thrown around (unless someone is accused of cheating, it seems, and only because *gasp* it increases FPS).

I still think 90% of the people who buy video cards won't tinker with the settings, and the ones that do will know what they want (FPS vs. IQ).
Exactly. :thumbsup:

AMD should have just said it's a bug, like nVidia said about HAWX. haha.

"Yeah, we had a hand in making the software, but trust us - it's a bug that we didn't know about, nor did we know it gave our product an edge on benches. Scout's honor!"
As I've said many times, it sounds like sour grapes from NVIDIA over AMD offering newer, better products and maybe also finally getting their act together in the driver department. I don't think what you see in Trackmania is so much a bug as it is a hardware limitation. I can't remember where I saw it addressed before, but basically the algorithm(s) used in the 5xxx series hit certain limitations rendering scenes with low-res/simple/repeated textures, and you can see the filtering transitions. Furthermore, I believe it was addressed and corrected in the design of the 6xxx series, so really, it's a dead issue from here on out.
I don't have any issues with driver optimizations done by either company in order to increase FPS, even if IQ is degraded slightly. All that I ask is that the default settings that review sites use for comparative benchmarks keep these equivalent optimization settings the same, so I can trust I am looking at accurate comparative results when reviews are done.
And what exactly would that be? I'm with you that the cards should be tested equally, and, to me, that means with their default settings using their default optimizations. These optimizations, described by both companies, offer better performance with no noticeable loss in image quality. If one company gets a significant boost over the other when running their respective optimizations, it just shows one company has superior driver development, no? That would factor into my decision to buy a card, especially if I knew I could expect better performance simply due to software.

Before Cat 10.10, this is what was happening, as I understand it. With, and after, Cat 10.10, this was not being done, due to AMD enabling ASO, which before was disabled with 10.9. So, for instance, when Nvidia's defaults are loaded, Trilinear Optimization is on, but ASO is off. With Cat 10.9 and before, AMD's equivalent settings were the same. However, with Cat 10.10, ASO was turned on by default, which meant that Nvidia had a comparative disadvantage, since the extra optimization was enabled on AMD's side. The problem also is that they didn't bother to tell anyone until it was exposed and they were asked directly about it, and that gave off the impression of cheating, although maybe that is too strong a word to use.

I was on the fence about the whole issue, debating mostly theory before, but I trust Hilbert's opinion, and I don't think someone with his standing in the community would take the time to write such an article and make the plea he did for AMD to go back to their previous settings if it were not the case that something had changed in AMD's driver settings that compromised a review site's ability to give the objective comparisons on which their sites depend.
OK, so ASO is on. What's the problem with having ASO "on"? Also, why doesn't NVIDIA offer it to be "on" by default?

I'm not doing the Socratic method here, I'm just curious as I don't know a whole lot about the issue.
My main issue with this is that it is an unknown.

What are the default optimizations that nVidia uses on their default setting? Are they 1:1 to AMD's? The same for AMD: are they 1:1 to nVidia's?

The fact that there are optimizations by default on both sides defeats an apples-to-apples comparison. I like BFG/App's suggestion of running everything on HIGH. Remove all optimizations if you want a true comparison.

Otherwise, crying foul because one side used one more optimization than the other seems... I don't know, pointless. I'd love for someone to list the optimizations that are used by default. I'd wager one side uses something a little more than the other, and vice versa.
Why though? If a company can take the time to tweak its drivers so that its hardware runs faster, why wouldn't you use their hard work? I would think reviewers would be doing a disservice to their readers, since they not only wouldn't be testing the cards as the companies intended, but also not as most users would use them. So far, there have been no reasonable arguments for not running optimizations besides the fact that they are somehow inherently bad.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Why though? If a company can take the time to tweak its drivers so that its hardware runs faster, why wouldn't you use their hard work? I would think reviewers would be doing a disservice to their readers, since they not only wouldn't be testing the cards as the companies intended, but also not as most users would use them. So far, there have been no reasonable arguments for not running optimizations besides the fact that they are somehow inherently bad.

I may have worded my comment badly, but my issue is with people crying foul at "unfair" representation when using default settings in benches, given that both sides already use optimizations.

To try to simplify my original comment: they are already using optimizations by default, which I have no issues with, as long as they aren't visually impairing on one side (see Quake 3 for AMD, or 3DMark for nVidia). Saying it is cheating, underhanded, or whatever because it isn't disclosed out of the gate is foolish, as I doubt anyone knows all the optimizations that are used.

I don't see people asking: "How is it that with the new Detonators/Catalysts, Metro got a 15% performance increase? What optimizations did they use? Is the IQ the same? OMG, CHEATING!" No, it's "with newer drivers, they'll get more FPS, thus my e-peen will be bigger than yours."
 

Mistwalker

Senior member
Feb 9, 2007
343
0
71
It has been stated over and over again, which some choose to deliberately ignore or just for some reason fail to understand, that the optimization was not there before with 10.9 on the 5XXX AMD cards, and that this approach was the agreed-upon stance between AMD and Nvidia when sending samples out for review
This is the first I have ever heard of AMD and Nvidia having an agreement on card settings. Do you have a source for this?

Either way, as others have stated, what optimizations are enabled by default should be at the sole discretion of the companies producing the cards. If AMD decides that something Nvidia has chosen not to turn on by default will in fact provide a better experience to the vast majority of gamers, let alone with such a negligible loss of quality, it would be a travesty to forbid it just because "no one else is using it." Again, what percent of users ever open the control panel, let alone understand the settings, let alone change them?

If your target is your average customer, turning on ASO in your default settings is the right choice, period.

Hilbert at Guru3D, whose reputation in the community is rock solid, agrees that it is there, and even though it is hard to see (just the same as when NV has their ASO enabled) and doesn't degrade IQ, it does make giving a proper, objective, comparative review impossible when driver defaults are loaded. Thus AMD needs to disable the ASO and go back to how they were before with Cat 10.9, and that is how it should be, and the community should demand it as well, so as to maintain the integrity of the reviews; else no one will be able to look at a review and trust what they are looking at. No one who cares about objectivity, that is.
I agree with the need for objective comparison, but the tools are there to make this happen in the High Quality settings. Further, demanding a change to default settings that will affect the entire userbase to satisfy a handful of tech sites is something I very, very strongly disagree with. AMD needs to provide a setting where all optimizations are off for benchmarking, and that is all. Telling ATI to turn default settings on or off depending on what Nvidia does or doesn't do is the most inane thing I've ever heard.

AMD should have been way more forthcoming about this, no question. But these tech sites demanding changes to defaults, when equivalent settings already exist, is ridiculous as well. If card makers shipped everything with benchmarking and a handful of reviewers in mind rather than their general userbase, well... I don't see how anyone can argue that that would be preferable. It's such a backwards thing to suggest, especially when the image comparison shots prove there are obviously very different optimizations being used under the hood by both camps already, setting ASO completely aside.

tl;dr - Basically what MrK6 and railven said.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
I 100% agree with this statement from the article and it's why I continue to stick with NVIDIA.

Forfeiting on image quality will cost the manufacturer business as end-users want the best product. Especially in the high-end performance graphics area people really care about image quality.
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
I 100% agree with this statement from the article and it's why I continue to stick with NVIDIA.
You got that backwards. AMD has the superior image quality; NVIDIA blurs everything to hide their flaws :). This was already proven, using the very same Trackmania game.

Just in case you missed the post:

So I saw these posts and thought they were interesting. I don't know why these sites love Trackmania, but it seems that AMD does have much sharper filtering than NVIDIA:
http://forum.beyond3d.com/showpost.php?p=1499768&postcount=176
http://hardforum.com/showpost.php?p=1036521622&postcount=17
My guess is there's a problem with their filtering algorithm in some older games and how it handles low res/repeat textures. However, since it doesn't seem to be a problem in new games, I'll definitely take the superior image quality/sharpness. :thumbsup:
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
You got that backwards. AMD has the superior image quality; NVIDIA blurs everything to hide their flaws :). This was already proven, using the very same Trackmania game.

Just in case you missed the post:

I'm going by a number of sites that have been stating image quality sacrifices. In general, linking to other forum posts holds little validity. As for Trackmania... wow, that's a stretch, eh?

It seems the Germans with their high standards have already covered this and now the American sites are jumping in.

http://www.computerbase.de/artikel/...5/#abschnitt_anisotrope_filterung_auf_hd_6800

http://www.pcgameshardware.de/aid,7...irectX-11-Generation/Grafikkarte/Test/?page=4

It may just be personal preference, but I like to get the best image quality I can. Some may want a few extra fps. :shrug:

My new card seems to give me both :thumbsup:
 

MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
I'm going by a number of sites that have been stating image quality sacrifices. In general, linking to other forum posts holds little validity. As for Trackmania... wow, that's a stretch, eh?

It seems the Germans with their high standards have already covered this and now the American sites are jumping in.

http://www.computerbase.de/artikel/...5/#abschnitt_anisotrope_filterung_auf_hd_6800

http://www.pcgameshardware.de/aid,7...irectX-11-Generation/Grafikkarte/Test/?page=4

It may just be personal preference, but I like to get the best image quality I can. Some may want a few extra fps. :shrug:

My new card seems to give me both :thumbsup:
I know you've been away for a while, but those sites are already old news and have been debunked. Would you care to point out the differences in the pictures? Or better yet, would you care to explain why NVIDIA's IQ falls so short of AMD's in Trackmania? NVIDIA's IQ probably falls short generally as well, since Trackmania seems to be used to generalize about a card's performance. Unfortunately, you seem to have purchased the wrong card for your tastes :(.
 

Will Robinson

Golden Member
Dec 19, 2009
1,408
0
0
He's "all about image quality" and he's going to "stick with NVDA"
wtf lol:confused:

Fail me Elmo for XMAS:sneaky:
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I know you've been away for a while, but those sites are already old news and have been debunked. Would you care to point out the differences in the pictures? Or better yet, would you care to explain why NVIDIA's IQ falls so short of AMD's in Trackmania? NVIDIA's IQ probably falls short generally as well, since Trackmania seems to be used to generalize about a card's performance. Unfortunately, you seem to have purchased the wrong card for your tastes :(.

Debunked? AMD themselves offered that they have addressed the mip-map issues with new hardware.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
So company A sacrifices image quality and gets crucified, and company B does the same and is heralded as "boosting FPS for us gamers." Some of the responses in this thread are a joke. Plain and simple.

Sacrificing IQ to gain FPS is ALWAYS suspect and a cheap maneuver. Don't apologize for any company when they do it.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
So company A sacrifices image quality and gets crucified, and company B does the same and is heralded as "boosting FPS for us gamers." Some of the responses in this thread are a joke. Plain and simple.

Sacrificing IQ to gain FPS is ALWAYS suspect and a cheap maneuver. Don't apologize for any company when they do it.

What are you talking about? Both companies do it, to different degrees. They use different optimizations; we all knew that. We know one of the ones AMD uses now, and we know one of the ones nV doesn't use, but that's about it. As you can see, AMD has problems with the road; nV has problems on the sign.

If you have a problem with either, then turn it off, or run both in HQ and be done with it.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
What are you talking about? Both companies do it, to different degrees. They use different optimizations; we all knew that. We know one of the ones AMD uses now, and we know one of the ones nV doesn't use, but that's about it. As you can see, AMD has problems with the road; nV has problems on the sign.

If you have a problem with either, then turn it off, or run both in HQ and be done with it.

It's not about what *I* can do; it's about trying to improve FPS for the people who nit-pick performance (like us here at AT). Now comparing reviews from site to site is no longer an apples-to-apples comparison, because not everyone is using the same settings. It's shady, no way around it.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
So company A sacrifices image quality and gets crucified, and company B does the same and is heralded as "boosting FPS for us gamers." Some of the responses in this thread are a joke. Plain and simple.

Sacrificing IQ to gain FPS is ALWAYS suspect and a cheap maneuver. Don't apologize for any company when they do it.

I'm all for optimizations, though, because some of them may be subtle, or some users may not even notice them, and the added frame rate may help someone at a given resolution setting. Gaming has always been about pros and cons and trade-offs: gain in this area but lose in another, and worth it to the individual.

One of the reasons I left AMD was the lack of flexibility: the hand-holding, being locked into Cat AI. The upshot of all this awareness, the cheap shots and the constructive views, the defenders and the attackers, is that AMD is trying much harder to bring flexibility and more tools for their end users to decide what is best for their own eyes -- not a website's eyes or a vocal poster's eyes, but an individual's set of eyes.

For me this is absolutely wonderful to see.