AMD 58/68xx series - Reduced image quality starting with Catalyst 10.10 at Default?



GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Actually, the page I offered has exact examples of motion for HQ and Q with the 5XXX, 6XXX and 4XX series of products, and you can run them all at the same time. No need for static shots.

But those show texture shimmering, not a lack of AF IQ.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Curbing shimmering and texture aliasing artifacts is part of anisotropy, too.
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
Where do you guys even find this high quality setting?
All I have is Cat AI Standard, Advanced, or Disabled. Then the mipmap filtering is set to High Quality by default... that is on the 10.10s with a 5850.
 

BathroomFeeling

Senior member
Apr 26, 2007
210
0
0
It's not as big a difference as between Nvidia standard and 10.10 HQ.
But it is there... the 10.09 is slightly prettier if you look closely at the images posted at the start of the thread.

I had the two pictures side by side and zoomed in on the yellow boxes. It's hardly noticeable otherwise.

The difference between the 10.10 HQ and Nvidia standard is bigger though; you notice that even at normal image size just looking at the two. Nvidia standard > 10.10 HQ.
Actually, if you go over those Oblivion screenshots carefully, you'll see that the nVidia Standard is clearly better compared to the AMD 10.09 ones.

On the first row, the green moss-covered texture on the column immediately next to the blond man is worse in 10.09 compared to nVidia. On the second row, blurring is evident on the green roof texture of the building. On the third row, the details of the Corinthian-like column are less defined compared to nVidia. And on the last row, the moss-covered texture as well as the lower stone texture are clearly less detailed compared to nVidia's 'standard' quality level.

98740571.jpg
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Although this is not exactly news to those who read Beyond3D, I think AMD was irresponsible for changing default settings like this. At the very least they should have had a popup box or something upon installation explaining the change and allowing the user to select his or her preference.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
After you blew them up side by side...

#1 row: Nvidia's crack in the wall is prettier; the bottom of the metal hinge thingy for the "Inn" sign is better on AMD's. The blonde woman looks better on AMD's too (her hair etc.).

#2 row: Nvidia has better roof tiles; AMD has easier-to-read text (no dark spot in between the first 'n' letter). The side of the house (under the Inn sign) looks sharper on AMD.

#3 row: I think both are equally defined. You're right, the bottom of the column is slightly sharper on Nvidia's, but then again at the curve above that, AMD's is smoother.

#4 row: The green patch that's like a line stands out more... not sure if that's a win for Nvidia though; to me the difference here is kind of meh too.


10.09 and Nvidia standard... too close to really call, I guess.
I could have sworn when I first looked I was leaning more toward AMD though <.<
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
That's all fine and practical. But in reality, after a new GPU is released, we have endless debates based on reviews where an 8% difference is deemed by rabid fans of either side a stunning victory, a next-generation product making the competitor obsolete. These conclusions are often argued about by people that don't even
GAME!

You know what this NV vs AMD thing reminds me of? Photography. People endlessly comparing Canon v. Nikon cameras and lenses, and some of the most rabid ones hardly even take photos. The pros just use whatever is reliable. I remember one pro talking about how he got an award-winning shot of an eagle in flight using a relatively lame consumer-grade lens (Nikon 18-200mm), and thinking, "omg, the people in photography forums would have a heart attack over that!"

It also reminds me of the endless 9mm v. 40 S&W debates in gun forums, where some of the most ardent arguers aren't even that good of shots in the first place. An on-target .22 beats misses from EITHER 9mm or .40 S&W. :)

Now, I am just as guilty about arguing over relatively small differences, but at least I game. I just finished crafting the Saxton Hale mask in TF2 and broke 1000 hours played in TF2! :)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
They all look the same to me (I purposely avoided reading your text so I could not pinpoint any area to focus on). But then I don't play games with a pair of magnifying glasses.

Cats 10.9 Quality shouldn't be much different from NV's Standard setting. The issue is with Cats 10.10: if they are set to Quality, their default image quality is worse than Cats 10.9 Quality. To achieve the same image quality with ATI cards as in the past, you now have to enable High Quality on Cats 10.10. Some reviews ran all of their benchmarks at default settings testing HD6870 vs. HD5870 vs. NV. Once they realized the image quality was even lower than on the HD5870 series, they re-ran their benchmarks for the HD68xx series with High Quality. The result was a performance loss of 5-6%. This means that some reviews of HD68xx vs. HD58xx were favoring the HD68xx series, since the HD58xx cards ran at a higher image quality level. Since AMD never made this disclaimer, these review websites decided to do it themselves.
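To put that 5-6% in concrete numbers, here is a rough back-of-the-envelope sketch. The FPS value is hypothetical; only the ~5-6% High Quality cost comes from the re-run reviews described above.

```python
# Hypothetical illustration: the FPS value is made up, only the ~5-6%
# High Quality cost comes from the re-run reviews described above.
default_fps = 60.0   # HD 68xx benchmarked at the new Cats 10.10 default (Quality)
hq_cost = 0.055      # ~5-6% performance cost of switching to High Quality

matched_iq_fps = default_fps * (1 - hq_cost)      # performance at HD 58xx-level image quality
apparent_gain = default_fps / matched_iq_fps - 1  # inflation from benchmarking at defaults

print(f"Default (Quality): {default_fps:.1f} FPS")
print(f"High Quality:      {matched_iq_fps:.1f} FPS")
print(f"Apparent advantage from the lowered default: {apparent_gain:.1%}")
```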

Also, as has been said, if you can't notice the difference, feel free to leave all optimizations on. But for fair benchmark comparisons, this is not acceptable.

People endlessly comparing Canon v. Nikon cameras and lenses
Now imagine a hypothetical scenario. Suddenly Nikon lenses were designed without Image Stabilization. The only way to get image stabilization would be to enable it in the camera. However, Nikon would ship all their new cameras with image stabilization OFF. In 97% of shots, having image stabilization off would produce identical results to previous Nikon cameras or to Canon cameras. However, in the 3% of cases where you needed image stabilization but didn't know you needed it, you'd have artifacts/blur compared to your old Nikon camera. So to fix those 3% of shots, you realize you have to keep image stabilization ON at all times. So now, if you are buying a new Nikon camera, do you want tests to be done with Image Stabilization ON or OFF? Well, that depends on the shots you take. But since we don't know what shots you take, we'll test with Image Stabilization ON to cover all possibilities.

In this example, AMD gets away with 5% faster performance in 97% of the cases. But in a few cases, it produces artifacts while doing so. To remove all artifacts in 100% of the cases requires a loss of 5% in performance. Now you have this new information, and it's your choice whether you want to keep it ON or OFF. This new information can actually help you :)

However, reviewers don't know which games you play or in which games image quality will suffer. So the best solution is to enable maximum quality at all times for both vendors.

Again I think people are missing the point here. For the 10th time, it appears that the image quality at Default settings is worse on HD68xx series vs. HD58xx series. If you want the same image quality as HD58xx series, you have to set High Quality for HD68xx series and thus lose 5% in performance.

So you are saying it's acceptable to compare HD68xx series with reduced image quality vs. AMD's own HD58xx series?

Where do you guys even find this high quality setting?
All I have is Cat AI Standard, Advanced, or Disabled. Then the mipmap filtering is set to High Quality by default... that is on the 10.10s with a 5850.

The control panel looks different for HD68xx series.

For the HD68xx series, AI Quality = AI Advanced on the HD58xx series (basically aggressive optimizations).
For the HD68xx series, AI High Quality = AI Standard on the HD58xx series.

Some reviewers ran the HD58xx series with AI Standard vs. the HD68xx series with AI Quality. You see the issue now?
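This is not official AMD terminology or tooling, just a minimal sketch (assuming the label equivalences are exactly as stated above) of the apples-to-apples check reviewers should be doing:

```python
# Assumption: the equivalences below are exactly as claimed in this post,
# not an official AMD mapping.
HD68XX_TO_HD58XX = {
    "Quality": "Advanced",        # aggressive optimizations on both
    "High Quality": "Standard",   # the old default image-quality level
}

def apples_to_apples(hd68xx_ai: str, hd58xx_ai: str) -> bool:
    """True only if both cards run at an equivalent Catalyst AI level."""
    return HD68XX_TO_HD58XX.get(hd68xx_ai) == hd58xx_ai

# The problematic setup some reviews used:
print(apples_to_apples("Quality", "Standard"))       # False -> HD 68xx gets a lower IQ level
print(apples_to_apples("High Quality", "Standard"))  # True  -> matched image quality
```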
 

BathroomFeeling

Senior member
Apr 26, 2007
210
0
0
After you blew them up side by side...

#1 row: Nvidia's crack in the wall is prettier; the bottom of the metal hinge thingy for the "Inn" sign is better on AMD's. The blonde woman looks better on AMD's too (her hair etc.).

#2 row: Nvidia has better roof tiles; AMD has easier-to-read text (no dark spot in between the first 'n' letter). The side of the house (under the Inn sign) looks sharper on AMD.

#3 row: I think both are equally defined. You're right, the bottom of the column is slightly sharper on Nvidia's, but then again at the curve above that, AMD's is smoother.

#4 row: The green patch that's like a line stands out more... not sure if that's a win for Nvidia though; to me the difference here is kind of meh too.


10.09 and Nvidia standard... too close to really call, I guess.
I could have sworn when I first looked I was leaning more toward AMD though <.<

#1 The lower part of the Inn sign can't really be used for comparison, since it appears to be significantly affected by AA and the distance to the camera. Nonetheless, it still looks a bit better on nVidia, as it's more solid-looking. As for the shading of the blond man, apart from the large shaded area above the ear and the upper part of the leather shoulder (both more defined on nVidia), there is no discernible difference between the two screenshots.

#2 There is no difference between the two texts.

#3 This row is a little harder to discern; it's actually clearer in Photoshop (ensure 'Nearest Neighbor' resize when doing your own comparison). The texture on the curved protrusion is less defined on AMD compared to nVidia's, and around the darker region there is less definition in AMD's screenshot.

#4 Actually, there is a difference in texture quality; it's blurrier on AMD compared to nVidia. I'll highlight the difference in a lossless PNG so it's clearer:
84664635.png


not sure if that's a win for Nvidia though; to me the difference here is kind of meh too.


10.09 and Nvidia standard... too close to really call, I guess.
I could have sworn when I first looked I was leaning more toward AMD though <.<
This isn't about who 'wins' here; I'm just looking for any differences and quality issues between the two implementations as objectively as I can. I was very curious about your initial statement that 10.09 was clearly better than nVidia, as initially my impression was of a perfectly equal render. However, to my surprise, it would appear the gist of this thread is somewhat valid: the 10.09 quality is actually lower than nVidia's standard level.
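For anyone who wants to repeat this kind of check without Photoshop, here is a minimal sketch of the same workflow (nearest-neighbour zoom plus a per-pixel difference) using Pillow; the file names are placeholders, not the actual screenshots from this thread:

```python
# Minimal sketch of the comparison workflow: nearest-neighbour zoom (so the
# resize itself adds no blur) plus an absolute per-pixel difference image.
from PIL import Image, ImageChops

def compare_screenshots(path_a: str, path_b: str, zoom: int = 4) -> None:
    a = Image.open(path_a).convert("RGB")
    b = Image.open(path_b).convert("RGB")

    # Nearest-neighbour keeps each source pixel as a hard-edged block,
    # which is what you want when inspecting texture filtering.
    a_zoom = a.resize((a.width * zoom, a.height * zoom), Image.NEAREST)
    b_zoom = b.resize((b.width * zoom, b.height * zoom), Image.NEAREST)

    # Brighter pixels in the diff = bigger deviation between the two shots.
    diff = ImageChops.difference(a_zoom, b_zoom)

    # Save everything as lossless PNG so the comparison adds no artifacts of its own.
    a_zoom.save("a_zoomed.png")
    b_zoom.save("b_zoomed.png")
    diff.save("difference.png")

# Placeholder file names -- substitute your own captures.
compare_screenshots("amd_1009.png", "nvidia_standard.png")
```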
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
I have no idea what you're seeing, but I'm not seeing anything but a pretty crappy set of game graphics, regardless of whose card this is.

How old is this game?

Secondly, those differences are not really there. How are people capturing these renderings in the first place? A round trip back from the card using screen capture or FRAPS?

You'll get some odd results for sure pulling that crap.

I see small differences, differences that might boil down to differences in how the shaders render the scene.

All this hyperbole is about nothing.

I am not seeing a deliberate distortion of image quality that suggests ATI is pulling shenanigans.

What I see is a lot of piss-poorly written games that have some odd texture issues that are sort of random.



#1 The lower part of the Inn sign can't really be used for comparison, since it appears to be significantly affected by AA and the distance to the camera. Nonetheless, it still looks a bit better on nVidia, as it's more solid-looking. As for the shading of the blond man, apart from the large shaded area above the ear and the upper part of the leather shoulder (both more defined on nVidia), there is no discernible difference between the two screenshots.

#2 There is no difference between the two texts.

#3 This row is a little harder to discern; it's actually clearer in Photoshop (ensure 'Nearest Neighbor' resize when doing your own comparison). The texture on the curved protrusion is less defined on AMD compared to nVidia's, and around the darker region there is less definition in AMD's screenshot.

#4 Actually, there is a difference in texture quality; it's blurrier on AMD compared to nVidia. I'll highlight the difference in a lossless PNG so it's clearer:
84664635.png



This isn't about who 'wins' here; I'm just looking for any differences and quality issues between the two implementations as objectively as I can. I was very curious about your initial statement that 10.09 was clearly better than nVidia, as initially my impression was of a perfectly equal render. However, to my surprise, it would appear the gist of this thread is somewhat valid: the 10.09 quality is actually lower than nVidia's standard level.
 

Tsavo

Platinum Member
Sep 29, 2009
2,645
37
91
I have no idea what you're seeing, but I'm not seeing anything but a pretty crappy set of game graphics, regardless of whose card this is.

How old is this game?

Secondly, those differences are not really there. How are people capturing these renderings in the first place? A round trip back from the card using screen capture or FRAPS?

You'll get some odd results for sure pulling that crap.

I see small differences, differences that might boil down to differences in how the shaders render the scene.

All this hyperbole is about nothing.

I am not seeing a deliberate distortion of image quality that suggests ATI is pulling shenanigans.

What I see is a lot of piss-poorly written games that have some odd texture issues that are sort of random.

I see what you see.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
I see what you see.

Well, then you are missing the point.

In a world where graphics card reviews can swing either way by 5-10 FPS, intentionally degrading IQ is as much of a scandal as anything. It doesn't matter whether you can see the difference or not.

Because sales can hinge on a few FPS percentage points, I would be looking into this if I were the FTC.

BTW, I thought it was just as wrong when Vantage scores were skewed by nV offloading PhysX. ;)


People keep going back to the "I can't tell" argument in this thread, not realizing that it means nothing. We are discussing "apples to apples" comparison issues.
 

Tsavo

Platinum Member
Sep 29, 2009
2,645
37
91
Well, then you are missing the point.

In a world where graphics card reviews can swing either way by 5-10 FPS, intentionally degrading IQ is as much of a scandal as anything. It doesn't matter whether you can see the difference or not.

Because sales can hinge on a few FPS percentage points, I would be looking into this if I were the FTC.

BTW, I thought it was just as wrong when Vantage scores were skewed by nV offloading PhysX. ;)


People keep going back to the "I can't tell" argument in this thread, not realizing that it means nothing. We are discussing "apples to apples" comparison issues.

I didn't miss the point. I read RS's usual lot of well-reasoned posts about the performance issue; that's not my point. People are posting screens between which there's next to no difference... to the point of being undetectable. If the IQ is the same while playing the game (not using the zoom tool in Photoshop), then it doesn't really matter what AMD/ATI fiddled with in their control panel.
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
Well, then you are missing the point.

In a world where graphics card reviews can swing either way by 5-10 FPS, intentionally degrading IQ is as much of a scandal as anything. It doesn't matter whether you can see the difference or not.

Because sales can hinge on a few FPS percentage points, I would be looking into this if I were the FTC.

BTW, I thought it was just as wrong when Vantage scores were skewed by nV offloading PhysX. ;)


People keep going back to the "I can't tell" argument in this thread, not realizing that it means nothing. We are discussing "apples to apples" comparison issues.


Did you find the IQ problems in the photos I pasted above? Please do enlighten me now that I have removed all possible sources of bias. Which one has the IQ problem?
 

BathroomFeeling

Senior member
Apr 26, 2007
210
0
0
Here you tell me which one has worse image quality.


98740571.jpg
Both are equally degraded compared to the original, likely due to heavy JPEG compression artifacts (you should try lossless PNG next time). Between the two, there is a subtle noise difference, with very slightly more noise on the right.

Clearly this is not the same as my original comparison, where the texture blurring is much more pronounced and appears in several places in the frame. Whether this is due to malice or neglect is irrelevant to the point of the comparison.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
This was already discussed a few weeks ago when the 6xxx series launched. Yes, the default settings on newer Catalysts have worse quality than before.

It makes no difference to me because I always run the highest image quality settings when I game, so I benchmark exactly the same way. I told Mark to benchmark the same way, and he does too as far as I’m aware.

I don’t agree with disabling Catalyst AI though, because doing so disables all application specific optimizations (and fixes), so it can stop games from functioning properly. The equivalent action on nVidia’s parts would be to delete all application profiles, which again could cause titles to stop working properly. You would also lose all SLI scaling too, and the ability to force AA in many games.

I’m firmly against both actions, but anyone that advocates disabling Catalyst AI must also advocate deleting nVidia’s profiles too.
Because in a nutshell, by default, Nvidia's "Standard" quality setting is the same as AMD's "High Quality" setting.
No it’s not. nVidia’s standard quality mode has visible mip transitions and also filters fewer angles than ATi. ATi OTOH has more texture aliasing with their stock settings.

The fact is, it’s not possible to get equal AF with the two IHVs, and it’s never been possible given they’ve always implemented AF differently to each other. All you can do is run the highest possible quality levels for both and then comment on the observable differences.
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
This was already discussed a few weeks ago when the 6xxx series launched. Yes, the default settings on newer Catalysts have worse quality than before.

It makes no difference to me because I always run the highest image quality settings when I game, so I benchmark exactly the same way. I told Mark to benchmark the same way, and he does too as far as I’m aware.

I don’t agree with disabling Catalyst AI though, because doing so disables all application specific optimizations (and fixes), so it can stop games from functioning properly. The equivalent action on nVidia’s parts would be to delete all application profiles, which again could cause titles to stop working properly. You would also lose all SLI scaling too, and the ability to force AA in many games.

I’m firmly against both actions, but anyone that advocates disabling Catalyst AI must also advocate deleting nVidia’s profiles too.

No it’s not. nVidia’s standard quality mode has visible mip transitions and also filters fewer angles than ATi. ATi OTOH has more texture aliasing with their stock settings.

The fact is, it’s not possible to get equal AF with the two IHVs, and it’s never been possible given they’ve always implemented AF differently to each other. All you can do is run the highest possible quality levels for both and then comment on the observable differences.
If you are talking about me, I have always set IQ to the highest settings in both CPs (in Cat 10.10, I also disabled surface optimizations, but I always leave Cat AI enabled, i.e. 'Standard').
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
Both are equally degraded compared to the original, likely due to heavy JPEG compression artifacts (you should try lossless PNG next time). Between the two, there is a subtle noise difference, with very slightly more noise on the right.

Clearly this is not the same as my original comparison, where the texture blurring is much more pronounced and appears in several places in the frame. Whether this is due to malice or neglect is irrelevant to the point of the comparison.


Are you sure? There was no compression at all on the pictures. None whatsoever.

That's great that you see a difference, BTW.

They are both the Nvidia images.

I just copied them for both sides and removed all indications of what they are.

Good to know that there is no difference, yet you see one.

Surprise.