AMD 58/68xx series - Reduced image quality starting with Catalyst 10.10 at Default?


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
They also picked a game that has virtually no drop going from no AA to 8xAA. I know; I have the game.

I think it's always great to have more options in terms of anti-aliasing filters. MLAA has huge potential in games where normal AA doesn't work. However, other reviews have shown worse performance with it on and degraded image quality. So I am not sure why HardOCP stopped at testing only F1.

BTW, is it a good game? I love racing games. I am prob. going to pick this one up.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
I think you have to review the videos as well. Sometimes texture filtering/mipmap transitions are too difficult to detect in screenshots alone. Again, just because you can't tell the difference doesn't mean it's not there in motion. Plus, you can always keep the optimizations on for yourself in games. All we are trying to do is understand why these websites decided to change their testing methodologies. It appears there are differences in older games. Whether HQ should be enabled in all games or just in older games is, as always, debatable and up to the end user in the end.

I agree to some extent. I can't see it, but that doesn't mean it's not there. So I keep the optimizations on; in fact, after a driver update I always set Cat-AI to Advanced. I think the problem is that some people are making this out to be more than it actually is.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Actually, if you go over those Oblivion screenshots carefully, you'll see that the nVidia Standard shots are clearly better than the AMD 10.09 ones.

On the first row, the green moss-covered texture on the column immediately next to the blond man is worse in 10.09 compared to nVidia. On the second row, blurring is evident on the green roof texture of the building. On the third row, the details of the Corinthian-like column are less defined compared to nVidia. And on the last row, the moss-covered texture as well as the lower stone texture are clearly less detailed compared to nVidia's 'standard' quality level.


The problem is without access to a reference render it's impossible to say which one is "correct". Different people may find different images more pleasing than others, but we do not have all the information available to say that one image is wrong.

Heck, they may all be wrong.
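
Purely to illustrate the point, and not something any of the review sites did: if a reference render existed, quantifying how far each vendor's screenshot deviates from it would be easy. A minimal sketch, assuming Pillow and NumPy are installed and using made-up file names:

import numpy as np
from PIL import Image

def rmse(path_a: str, path_b: str) -> float:
    # Root-mean-square error between two same-sized RGB images.
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float64)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float64)
    if a.shape != b.shape:
        raise ValueError("screenshots must have identical resolutions")
    return float(np.sqrt(np.mean((a - b) ** 2)))

# Lower RMSE = closer to the reference. Without a reference.png, neither
# vendor's output can be called "correct", which is exactly the problem.
print("AMD 10.10       :", rmse("amd_10_10.png", "reference.png"))
print("NVIDIA standard :", rmse("nvidia_standard.png", "reference.png"))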
 

Tsavo

Platinum Member
Sep 29, 2009
2,645
37
91
BTW, is it a good game? I love racing games. I am prob. going to pick this one up.

F1 2010? Lots of bugs; the recent patch didn't fix them. Lots of ire on the forums about this game.

Dirt 2 is excellent if you like rally cars; Need for Speed: Shift is excellent if you like stock to highly modified street cars on tracks. Can't go wrong with either.
 

Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
Nothing to see here. As someone who actually uses his videocards to play actual games on a frequent basis, this issue with the 10.10s has not affected me. I didn't even notice, nor do I care as a paying customer. Seems it's mostly the Nvidia guys who care.

This sort of thing goes back to the '90s; there really is nothing to see here... old hat. NV does it and will keep doing it ad nauseam into the future... if you don't believe me, just wait and see.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
BTW, is it a good game? I love racing games. I am prob. going to pick this one up.

It is a good game. There are bugs, but nothing like STALKER. It's weird, though; I haven't had many of the bugs that are talked about in the forums, like the AI not pitting, getting held up in the pit lane, etc. The main issue I had with the game was the grip level in wet weather; the patch fixed that. There was too much grip before.

EDIT: Turns out most of the bugs happen when you choose only 20% race distance. I say go ahead and get it. It's cheaper than most new games anyway.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
The problem is without access to a reference render it's impossible to say which one is "correct". Different people may find different images more pleasing than others, but we do not have all the information available to say that one image is wrong.

Heck, they may all be wrong.

Isn't that the implied purpose of a quality setting in the first place?

To guarantee the image is wrong, but to limit the extent of the wrongness so that it falls within each individual's tolerance?

If everybody felt standard quality looked like crap, it would never get used; everyone would set the quality to the highest setting, where the wrongness is as minimal as the driver teams allow.

But apparently there are people who feel lower quality is still just fine for their tastes, as evidenced by there being a lower quality setting to begin with.

But I find it kind of odd to see arguments over IQ when discussing different quality settings, as if there were any question whether IQ changes... of course there is a difference; if there weren't, the driver teams would not have spent their resources creating user-selectable quality options in the first place.

But not everyone is expected to notice, or care to notice, the image degradation that takes place; if everyone did, the option would be useless and would therefore go unused.
 

Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
When money was tighter (stretching out an old Ti4200), I turned quality settings down to the lowest to get games running as well as possible. Otherwise, driver defaults are fine. Triple buffering/AA/AF are set in-game for best results.

If NV/AMD/Intel can give me 5% or more performance without me noticing the changes... please, keep doing it.

I would not buy from a company that does not give me that 5% or more with no noticeable impact. AMD should be commended for this improvement, but scolded for not doing this earlier.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
When money was tighter (stretching out an old Ti4200), I turned quality settings down to the lowest to get games running as well as possible. Otherwise, driver defaults are fine. Triple buffering/AA/AF are set in-game for best results.

If NV/AMD/Intel can give me 5% or more performance without me noticing the changes... please, keep doing it.

I would not buy from a company that does not give me that 5% or more with no noticeable impact. AMD should be commended for this improvement, but scolded for not doing this earlier.
I believe the argument is not that there is a difference to the human eye, but that there is a difference at the GPU level which makes one card appear faster in reviews, and that affects buying decisions, since review sites can easily sway purchasers. Because of this practice, one card is processing less texture detail than the other at the supposedly agreed-upon settings, which gives it an unfair advantage over the other.

Review sites were saying the 6850 was the clear winner over the 460, yet when X-bit Labs, and a few others, pointed out the discrepancy in IQ and benched on an apples-to-apples basis, an entirely different picture emerged.

Now, as others have mentioned, if this only happens in a few old titles, it is a total non-issue. If it is happening in newer titles, then yes, it is definitely an issue at the reviewer level with regard to accurately painting the picture for those who read, and depend on, these reviews to make their buying decisions.
 

Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
If the reviewer didn't notice with the naked eye, then AMD should be commended for improving framerates without IQ loss. NV needs to come up with a driver team that can properly implement the same optimizations; they screwed up on this one.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
I really don't see what is so hard to comprehend about ensuring that the cards tested are rendering the same amount of information, so that an apples-to-apples comparison can support a properly informed buying choice.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
I really don't see what is so hard to comprehend about ensuring that the cards tested are rendering the same amount of information, so that an apples-to-apples comparison can support a properly informed buying choice.

But what's the point of rendering useless data?

If the results are the same to human perception, there is no point in going any further for some kind of irrelevant technical measurement.

Let's flip it around. I can say Nvidia's drivers penalize their card owners because they are cheated out of possible performance. They are spending resources producing data that is essentially wasted, when they could be spending those resources on performance gains.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
But what's the point of rendering useless data?

If the results are the same to human perception, there is no point in going any further for some kind of irrelevant technical measurement.
I guess Nvidia should just slide their setting to 'Performance' then, to match AMD's 'Quality' setting, since both settings would be disregarding useless data to the same degree, and reviewers could bench from there. That is what was being discussed: not whether the eye can catch it. Precisely because the eye can't catch it is why practices like this happen. This practice makes one card appear faster than the other when it is not. All it takes is a 1 fps difference, with the inclusion of a clever graph, to make two cards look miles apart... which in the end affects sales, because people are mindless sheep.

It's not about actual gaming; it's about benching and steering potential buyers in one direction or another based on those results. If I were a reviewer, I would make sure the playing field was level, so that I wasn't accidentally giving slanted reviews.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
I really don't see what is so hard to comprehend about ensuring that the cards tested are rendering the same amount of information, so that an apples-to-apples comparison can support a properly informed buying choice.

Why do they use old titles to determine what the apples are and then go on to bench totally new and different titles with their findings? Anyone care to point me to an IQ comparison using current titles?
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Why do they use old titles to determine what the apples are and then go on to bench totally new and different titles with their findings? Anyone care to point me to an IQ comparison using current titles?


That is the $1,000,000 question that hasn't been answered.

Two possible outcomes:
1) It happens in all games, and people, even review sites, are too lazy to post pics (of newer DX11 games).
2) It doesn't happen in all games, only the old ones, and that German site is using the IQ in a few old games to make AMD look bad.

That German site seemed very pro-Nvidia... that's the reason I'm still not sure which is true.
If they didn't seem biased, 1) would probably be more likely.
 

Obsoleet

Platinum Member
Oct 2, 2007
2,181
1
0
My view is that they are reviewing total products, not specifically "texture rendering chips". I personally use my Radeon for hardcore gaming, all the time, not for examining the details of a texture.

As a gaming product and part of a computer, the Radeon I bought does gaming very well at 1920x1200, and does it with lower heat and power requirements than the competition. I'd be sacrificing far too much to have gone with a 470 instead, even if its default driver settings are slightly better (and I'd have lost a good 6+ months of gaming time).

As far as the drivers go, product vs. product... AMD successfully outsmarted Nvidia: higher frames with no noticeable loss in IQ. No unbiased consumer is going to complain about that, whether it's kept a secret or not.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
55
91
My view is that they are reviewing total products, not specifically "texture rendering chips". I personally use my Radeon for hardcore gaming, all the time, not for examining the details of a texture.

As a gaming product and part of a computer, the Radeon I bought does gaming very well at 1920x1200, and does it with lower heat and power requirements than the competition. I'd be sacrificing far too much to have gone with a 470 instead, even if its default driver settings are slightly better (and I'd have lost a good 6+ months of gaming time).

As far as the drivers go, product vs. product... AMD successfully outsmarted Nvidia: higher frames with no noticeable loss in IQ. No unbiased consumer is going to complain about that, whether it's kept a secret or not.

Yes. Sacrifice. I know what you mean. I have to prick my finger and feed a drop of blood to my GTX480 before each gaming session to ensure high framerates and a godly promise, in return for the sacrifice, that my house won't burn down.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,731
428
126
Yes. Sacrifice. I know what you mean. I have to prick my finger and feed a drop of blood to my GTX480 before each gaming session to ensure high framerates and a godly promise, in return for the sacrifice, that my house won't burn down.

Cool.

The rest of us use money.
 

cusideabelincoln

Diamond Member
Aug 3, 2008
3,275
46
91
Yes. Sacrifice. I know what you mean. I have to prick my finger and feed a drop of blood to my GTX480 before each gaming session to ensure high framerates and a godly promise, in return for the sacrifice, that my house won't burn down.
Your priorities are better than everyone else's; therefore you are right, and everyone else is a blind fool deserving of such repercussions as sarcasm. /wow
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
... AMD successfully outsmarted Nvidia, higher frames with no noticeable loss in IQ. No unbiased consumer is going to complain about that, whether it's kept a secret or not.

It's interesting how many times this thread was derailed into NV vs. AMD when the reviewers were pretty clear that they were comparing image quality at "default settings" against the HD5870 series. That was the primary reason for them changing to High Quality, not NVIDIA. Nvidia screenshots were only provided to show that both the HD5870 and GF100 were producing superior quality. Therefore, to compare cards apples to apples, it's only fair to move the slider to HQ for the HD68xx series (this is the view of the reviewers who changed their methods).

As has been discussed, these websites are only providing their reasoning for their new test settings. This transparency by reviewers is better for the readers. For instance, if you don't agree with their reasoning, just add 5% to every benchmark for the HD68xx series. Everyone else now knows that PCGamesHardware, Xbitlabs, and Computerbase.de test the HD68xx/58xx at the highest image quality, which imo makes sense for someone buying $200+ videocards. I play all my games at the highest quality on NV/AMD and never use Performance textures. If you play with bilinear filtering, Performance textures, and Cats AI: Advanced, that's your choice.
 

WelshBloke

Lifer
Jan 12, 2005
33,256
11,396
136
Is this about what settings the AMD drivers use by default?

'Cos my nvidia ones are like this.

[screenshot of the poster's NVIDIA driver settings]


So given that both sides are using some sort of optimisation, the only way to compare them is to see whether the images produced are comparable.

Looking at the screenshots earlier in the thread I'd say this is much fuss about nothing.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Looking at the screenshots earlier in the thread I'd say this is much fuss about nothing.

Texture filtering deficiency is almost impossible to see in screenshots because it's something that is much clearer in motion. You can't take a screenshot of texture flickering either. Therefore, besides TrackMania, it is often too difficult to discern in screenshots. But if you watch videos, it's obvious right away. BTW, the reviewers are testing NV with "Texture filtering – Quality: High quality", not the Quality you have at default in your system. So you are actually running your card in Performance image quality mode.

Just download the first video here. As the character moves in HL2, you can see distinct bands of texture/mipmap transitions in front of him. This is not how high quality filtering should work. With Cats 10.9, AMD's Quality mode = NV's Quality mode, but with the HD6800 and Cats 10.10, AMD's Quality mode is lower than both. That's what these reviewers are basically saying, so they have decided to up the quality to compensate.
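
For what it's worth, here is a rough sketch (my own illustration, not anything the reviewers published) of why motion exposes what screenshots hide: with the camera held mostly still, differencing consecutive frames of a capture cancels out texture detail that stays put and highlights shimmering/mipmap crawl that changes frame to frame. It assumes OpenCV is installed; "hl2_capture.mp4" is just a hypothetical file name:

import cv2

cap = cv2.VideoCapture("hl2_capture.mp4")  # hypothetical capture of the HL2 walkthrough
ok, prev = cap.read()
frame_idx = 0
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    frame_idx += 1
    # Per-pixel absolute difference between consecutive frames; with a
    # stationary camera, stable texture detail cancels out while flicker/crawl does not.
    diff = cv2.absdiff(frame, prev)
    score = float(cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY).mean())
    if score > 5.0:  # arbitrary threshold for "noticeable" temporal change
        print(f"frame {frame_idx}: mean frame-to-frame difference {score:.2f}")
    prev = frame
cap.release()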
 