RussianSensation, thanks for the pics, that makes it easier to compare.
GaiaHunter, are you saying this only happens in older games? Like non-DX11 games?
I'm saying if it is a widespread problem then it should be easily reproduced in all the games.
Otherwise what you have is an optimization that created some bug in some older titles.
Or do you think AMD needs the 6850/70 to have more performance in Half-Life 2 or Oblivion at the expense of IQ to look good in reviews that don't even bench these games anymore?
It still is an issue, although users have the chance to fix it by choosing their settings.
If these optimizations do indeed lower IQ in all games, then the reviewers only need to take some screenshots of the games they actually benchmarked.
It's unrealistic to expect them to test 1000 games. If you want more performance, leave Cats AI on and texture filtering to Quality. If you don't notice the difference in the games you are playing, leave Cats AI on and filtering to Quality. If you want the best image quality, disable Cats AI and set filtering to HQ.
However, when doing reviews, settings should utilize comparable filtering quality, which means HQ for AMD in reviews (something that Xbitlabs and ComputerBase are doing). If you don't agree with this methodology, then don't put any weight into their reviews. The choice is always yours in the end in how you want to play your games. :thumbsup:
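To make that apples-to-apples setup concrete, here is a minimal sketch in Python (purely illustrative; these dictionaries are not a real driver API, just the reviewer's checklist from this thread written out as data):

```python
# Illustrative only: the roughly-equivalent filtering settings argued for in this
# thread (AMD HQ with Catalyst AI disabled vs. NVIDIA's default Quality).
FAIR_REVIEW_SETTINGS = {
    "AMD":    {"catalyst_ai": "Disabled", "texture_filtering": "High Quality"},
    "NVIDIA": {"texture_filtering": "Quality (default)"},
}

def review_settings(vendor: str) -> dict:
    """Return the filtering settings believed to give comparable IQ for a review run."""
    return FAIR_REVIEW_SETTINGS[vendor.upper()]

if __name__ == "__main__":
    for vendor in ("AMD", "NVIDIA"):
        print(vendor, review_settings(vendor))
```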
What I don't agree with is the methodology to reach this conclusion.
I just want them to show me 1 game of the ones they actually benchmark.
Does this only apply to DX9 titles? If so, big whoop; DX11 will be the standard going forward, and if some old DX9 game is a bit buggy, oh well.
Precisely this. In order to do a fair comparison in regards to performance, it is essential that each card is rendering textures to the same quality level. If AMD Radeon cards at their default level are putting out lower image quality, and the only way to match nVidia's image quality at their default level is to move the setting to High Quality (on the Radeon card), then this is the only way to do a straight-up, fair comparison. It doesn't take too much intellectual capacity to understand that any card rendering less texture quality is going to perform faster. Plus, given the fact that all it takes is a single frame or two for reviewers to laughingly call one card a 'clear' winner over another (based on the scaling of their graphs), which affects people's buying decisions... this is all the more reason to be sure everything is set the same, image quality-wise.

Is this something that you believe happened? Or are you just throwing it out there as a possible reason or for some reasonable doubt?
Because in a nutshell, by default, Nvidia's "Standard" quality setting is the same as AMD's "High Quality" setting. Obviously when you set the AMD quality to "Standard", you'd be setting it to a lower IQ than Nvidia's default.
Now run benches with those settings and let me know if you think that's fair. Whether the human eye can perceive it or not isn't the issue. Don't pretend that could make any difference. We are talking numbers here.
You know what the problem is.
More websites are reporting the same findings. Different games investigated too.
HT4U
TweakPC (with videos)
Clearly you need to compare the HQ modes of AMD with Catalyst AI: Disabled with the default quality of nVidia to get a more or less similar IQ. That's 3 separate websites now reporting the same thing....with different games (The Witcher, Crysis, Oblivion, HL2).
Perhaps BFG can investigate the mipmap/texture filtering transitions in his Image Quality Analysis.
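Something like the following rough Python sketch (Pillow + NumPy; the file names and the "visible difference" threshold are just placeholders I made up) could at least count how many pixels actually differ between two captures of the same frame:

```python
# Rough sketch, not from any review site: count pixels that differ noticeably
# between two screenshots of the same frame (e.g. AMD HQ vs. NVIDIA default Quality).
# Requires Pillow and NumPy; file names and threshold are placeholders.
from PIL import Image
import numpy as np

def differing_pixel_fraction(path_a: str, path_b: str, threshold: int = 8) -> float:
    """Fraction of pixels whose worst-case per-channel difference exceeds `threshold`."""
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        raise ValueError("Screenshots must be the same resolution")
    per_pixel = np.abs(a - b).max(axis=-1)        # worst channel difference per pixel
    return float((per_pixel > threshold).mean())  # share of pixels that differ visibly

# Hypothetical usage: the same in-game frame captured on each card
print(differing_pixel_fraction("amd_hq_frame.png", "nvidia_default_frame.png"))
```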
Precisely this. In order to do a fair comparison in regards to performance, it is essential that each card is rendering textures to the same quality level. If AMD Radeon cards at their default level are putting out lower image quality, and the only way to match nVidia's image quality at their default level is to move the setting to High Quality (on the Radeon card), then this is the only way to do a straight-up, fair comparison.
Also I really don't care. I have a hard time discerning the IQ differences and, as I said, I have PERFECT 20/20 vision with excellent detail. Either way, I can hardly see any difference at standard screen resolutions.
what this really is, is a red herring.
enjoy building your strawman. I am going to go back to enjoying my games.
I have absolutely spot on perfect 20/20 vision. I have to really stare at most of those images to see the IQ difference and in game I doubt I would have enough focus to care.
Don't see any difference either.
I guess that is what the reviewers have been doing too - leaving the optimizations on as they don't affect IQ.

Exactly! If the end user can't tell the difference in the games he/she is playing at home, they can leave Catalyst Optimizations and the filtering quality at default settings. However, reviewers should strictly test with HQ on. Granted, this probably won't affect most people's gaming experience since I imagine anyone spending $200+ on a videocard is already using HQ texture filtering in the first place.
If you don't find image quality differences material for you, not a big deal. Leave the optimizations and enjoy the added performance. This doesn't make my post biased/strawman. Thank you.
Why don't they show these issues in the titles that they use in their benching? I'm sure there're plenty of instances of repeating texture patterns on current titles to test these things out or am I missing something?
Just out of curiosity... did the AMD guys give the nvidia guys crap about using WORSE texture quality before 10.10? If so, it's only fair they call foul on this.
I guess that is what the reviewers have been doing too - leaving the optimizations on as they don't affect IQ.
Of course NVIDIA keeps fighting for Catalyst AI to be turned off.
Ok, I see what you are saying here. Testing each game would be incredibly time consuming. However, this needs to be looked into. Don't you think?

Teizo, no one is against that... we're just asking if you have proof the same thing happens in all games, or at least newer DX11 ones. Can you show 10.09 HQ + 10.10 HQ + Nvidia standard in a newer game, so we can compare quality and see if it happens in DX11 titles too?
If it happens in 1 or 2 old DX9 games, it's not an issue. No one benchmarks old DX9 games.
^
So the AMD fan response is "Who cares, I can't tell the difference in IQ anyway!"
Talk about grasping at straws.
As Russian mentions above, it absolutely matters when comparing graphics cards.
Ok, I see what you are saying here. Testing each game would be incredibly time consuming. However, this needs to be looked into. Don't you think?
@ Russian Sensation
Yeah, I remember nVidia being the first to do this, I believe, as they were getting waxed by the ATI 9800 Pro (of which I still own one) and the 9800XT back in the day. It's a damned shame, if this is true, given how much mud they slung at nVidia for doing so, only to turn around and do the same later on.
