AMD 58/68xx series - Reduced image quality starting with Catalyst 10.10 at Default?


tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
RussianSensation, thanks for the pics; that makes it easier to compare.

Looks like:
10.9 HQ > NV standard > 10.10 HQ

Yeah, the NV standard has higher quality than 10.10 but lower than 10.9.

Could this be a bug with 10.10 and Oblivion specifically, though?
Or is it like this with ALL games?

If it's JUST one game, then this is a non-issue, right?


Edit: looks like it happens in TrackMania too.

I'm not sure how you came to the conclusion that ATI 10.9 High > Nvidia standard, because looking over the Nvidia vs. ATI 10.9 High pictures, I honestly cannot see a difference in quality.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
Can you see the difference, or has your brain been convinced to perceive one?

Ask yourself this: is there really a difference?

Where's Waldo?

A true IQ issue would be visible across the whole scene. This only seems to imperceptibly affect a few random objects with nearly zero consistency. Did it occur to anyone that it's not a quality issue in terms of driver settings, but more likely a code path that just isn't jiving well?

BTW, on my new big 31-inch monitor I still can't see a difference unless I zoom way in, and then the image is so pixelated it's not even discernible.
No, you won't be able to tell visually yourself, perhaps, unless you magnify it and test it, but the GPU definitely will... and that is why such practices are used: the GPU has less to render, which increases performance, giving the product the appearance of better performance to the end user via reviews. And it is these reviews that like to declare one product the 'winner' over another when the difference is only 2 fps. If those 2 fps come from not rendering at the same texture quality (i.e. not doing the same level of calculations), we, the end users, need to know it. It is only fair, and there is really no way to justify it not being disclosed.
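To put the "winner by 2 fps" point in perspective, here is a minimal sketch (Python, with hypothetical frame rates rather than anyone's measured data) of how small such a lead is once a review chart expresses it as a percentage:

```python
# Hypothetical frame rates for illustration only -- not measured review data.
def review_margin(fps_a: float, fps_b: float) -> float:
    """Percentage 'win' a bar chart would report for card A over card B."""
    return (fps_a - fps_b) / fps_b * 100.0

# A fixed 2 fps lead is only a low-single-digit margin at typical review frame rates.
for base_fps in (40, 60, 90):
    lead = review_margin(base_fps + 2, base_fps)
    print(f"{base_fps + 2} vs {base_fps} fps -> {lead:.1f}% 'win'")
```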
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
I agree the difference is noticeable in Oblivion, but it's also a good point that they need to show this happens with more modern games, not what I was playing in early 2006.

That doesn't mean "testing 1,000 games"; it means showing it for Mass Effect 2, STALKER: Call of Pripyat, and other games in the AnandTech 68xx review.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Back in the day, the FX5900 series sacrificed image quality in some games (not all) with trilinear texture filtering optimizations. Since it was impossible for every review website to test each game to figure out where they could leave the optimizations ON vs. OFF without adversely impacting image quality, respected websites like AnandTech and Xbitlabs started testing with trilinear optimizations off at all times to level the playing field.

This is a similar scenario, except now AMD is doing it. Of course, some websites are not buying it and are voicing their view that, in order to have an apples-to-apples comparison, HQ should be applied for AMD and Cat AI disabled (like Computerbase, Xbitlabs, etc.). The end user still has the option to leave these optimizations ON at home.
I understand all that, but isn't Computerbase saying they will be benching ATI cards at HQ vs. Nvidia cards at default based on their image quality observations in Half-Life 2, Oblivion, and TrackMania? Why aren't they making the IQ comparisons on current titles? Maybe I'm missing something with Google Translate?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
BTW, on my new big 31-inch monitor I still can't see a difference unless I zoom way in, and then the image is so pixelated it's not even discernible.

You don't have to zoom in. Texture shimmering and mipmap transitions occur while moving, not while stationary. The HL2 video from Computerbase clearly illustrates this. You have to look at the videos, not the pictures alone; otherwise, it's almost impossible to see the worse AF and increased flickering.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
I'm not sure how you came to the conclusion that ATI 10.9 High > Nvidia standard, because looking over the Nvidia vs. ATI 10.9 High pictures, I honestly cannot see a difference in quality.

It's not as big a difference as between Nvidia standard and 10.10 HQ.
But it is there... 10.9 is slightly prettier if you look closely at the images posted at the start of the thread.

I had the two pictures side by side and zoomed in on the yellow boxes. It's hardly noticeable otherwise.

The difference between 10.10 HQ and Nvidia standard is bigger, though; you notice it even at normal image size just looking at the two. Nvidia standard > 10.10 HQ.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I guess the reviewers must be "fans" too, since a majority of them say exactly the same thing.

No, professional reviews are NOT agreeing that image quality is unaffected. So far all of these are disagreeing:

Xbitlabs (already used HQ in all their reviews with Cats AI disabled prior to this)
PCGamesHardware
Computerbase.de
HT4U
TweakPC

^^ all switching from Quality to High Quality starting with Cats 10.10.

PCGamesHardware

"Quality" is the new default setting, also, the render target Replacements active. We noticed at the beginning of the test show that the anisotropic filter of the HD 6800 cards compared with a Radeon HD 5870 flickers stronger.

After consulting with AMD, they informed us that the default "quality" has more aggressive filtering than the previous driver standard (AI standard, where the texture filtering optimizations in the HD 5800 series have already been disabled). Only the "High Quality" brings the improvements to previously unknown levels, and is similar to the previous AI Standard of HD 5800 cards. The HD 6000 cards filter that is standard on the level of a HD 5000 card with AI Advanced / Advanced. AMD gives its new cards a fps advantage at the expense of image quality. Once "High quality" is activated, the AF unsightly banding disappears almost completely, the flicker is also reduced - and the frame rate of course."
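Read as a mapping, the quote boils down to roughly the following (an informal summary of the equivalences PCGamesHardware describes; the labels are shorthand for discussion, not actual driver identifiers):

```python
# Informal summary of the setting equivalences described in the PCGamesHardware quote.
# Labels are shorthand, not actual Catalyst control panel identifiers.
EFFECTIVE_FILTERING = {
    ("HD 6800 series", "Quality (new default)"): "roughly HD 5800 with Cat AI Advanced (optimizations on)",
    ("HD 6800 series", "High Quality"):          "roughly HD 5800 with Cat AI Standard (filtering optimizations off)",
}

for (card, setting), equivalent in EFFECTIVE_FILTERING.items():
    print(f"{card} @ {setting}  ->  {equivalent}")
```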
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
@RussianSensation did you see the fps difference on the German site? 0.1 fps in most games (/.\)

To me it seems silly not to use the best quality setting on AMD cards if all it costs is 0.1-0.4 fps at 1600x resolution. It's a lot of hassle to go through for a 0.1 fps difference... the real kicker is that AMD cards get lower image quality for almost no fps gain (which is sad).


HOWEVER, there is still no proof this happens in new DX11 games as well... maybe old DX9 games have this issue; that doesn't mean newer games do.
Until that's shown, we can't just crucify AMD for no reason.
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
There have been a few people here wanting to see shots of more modern games. I do too. I don't play HL2 or Oblivion anymore. AMD has said it's more aggressive than before but doesn't reduce image quality. Maybe a site like AnandTech or Tom's or HardOCP didn't see the difference because they didn't bother testing games from 2005/2006 that already run at hundreds of fps.

There are a few people here with 6850s; you could ask them to take shots of some newer games and post them here.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
@RussianSensation did you see the fps difference on the German site? 0.1 fps in most games (/.\)

I think you are looking at the 1xAA/1xAF results (of course there will be no difference there, since no AA/AF is applied). The performance differences with 4xAA/8xAA are 5-6% on average. Please double-check.
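For a rough sense of scale (hypothetical frame rates, not the site's actual numbers), a 5-6% gap works out to a few fps at typical review frame rates, well above a 0.1 fps reading:

```python
# Hypothetical frame rates for illustration; the German site's charts have the real numbers.
def hq_cost_fps(percent_slower: float, base_fps: float) -> float:
    """Frame-rate cost if High Quality runs `percent_slower` percent slower than Quality."""
    return base_fps * percent_slower / 100.0

for pct in (5.0, 6.0):
    for base_fps in (40, 60):
        print(f"{pct:.0f}% of {base_fps} fps = {hq_cost_fps(pct, base_fps):.1f} fps")
```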

HOWEVER, there is still no proof this happens in new DX11 games as well... maybe old DX9 games have this issue; that doesn't mean newer games do. Until that's shown, we can't just crucify AMD for no reason.

The texture flickering on the 6800 series looks pretty sub-par in the videos. If you watch them, you'll see that the 6800 series' texture flickering is actually much worse than the 5800's. This is of course fixed once you re-enable HQ on the 6800 series in the same games. There is nothing wrong with the HD 6800 cards themselves.

There are a few people here with 6850s; you could ask them to take shots of some newer games and post them here.

I think we need videos. Screenshots alone aren't conclusive enough.
 

heflys

Member
Sep 13, 2010
72
0
0
Xbitlabs (already used HQ in all their reviews with Cats AI disabled prior to this)

Here's what xbitlabs states in their review:

"As opposed to MLAA, the benefits of the new AF algorithm are easy to see. They won't be so clear in the process of actual gaming, yet a sharp-eyed gamer will surely spot the difference, especially as modern games offer a lot of scenes suitable for that."


http://www.xbitlabs.com/articles/video/display/radeon-hd6870-hd6850_5.html#sect1
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Here's what xbitlabs states in their review:

"As opposed to MLAA, the benefits of the new AF algorithm are easy to see. They won't be so clear in the process of actual gaming, yet a sharp-eyed gamer will surely spot the difference, especially as modern games offer a lot of scenes suitable for that."

Yes, in High Quality mode. AMD drivers ship with High Quality disabled by default. Therefore, to achieve what you just linked, move the slider from Q to HQ. That is what this discussion is about. :thumbsup:
 

heflys

Member
Sep 13, 2010
72
0
0
Yes, in High Quality mode. AMD drivers ship with High Quality disabled by default. Therefore, to achieve what you just linked, move the slider from Q to HQ. That is what this discussion is about. :thumbsup:

No, I wanted you to see the part where they stated "it won't be so clear in the process of actual gaming," which is what this discussion has now become about. :thumbsup:
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
Can someone please post pics of a 68xx in a newer (DX11) game with the 10.9 HQ and 10.10 HQ drivers, plus a 4xx series card (standard quality setting) in the same game?

Until then this won't be resolved.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
No, I wanted you to see the part where they stated "it won't be so clear in the process of actual gaming," which is what this discussion has now become about. :thumbsup:

Vader@xbit:
"We'll check later. I, personally, think, that all these "Super-Duper Ultra Progressive" AA modes are total overkill, if not pointless marketing blah-blah at all. In 99% real cases MSAA 4x is more, than enough, with transparency AA enabled, of course."

I know of at least one person on our forum who will disagree with the notion that "4x MSAA is enough and you can't discern any differences going higher." So again, the subject of image quality itself can get subjective at times. I mean, some people prefer 0-4x AA and 100 fps over 32x AA and 50 fps. Nothing wrong with that. ;)

======

The issue here is a comparison between drivers (10.10 vs. 10.9), not graphics cards. Do HD 5800 cards produce better image quality with Cats 10.9 than HD 6800 cards do when run in the default Quality mode? Yes. Can you get better image quality with HD 6800 cards than HD 5800 cards? Yes, by enabling HQ instead of the default.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
I'm actually one of those people who would rather have a max of 100 fps with 4x AA than a max of 50 fps with 32x AA.

Same principle for tessellation... games need to have acceptable fps before eye candy is worth anything to me.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Can someone please post pics of a 68xx in a newer (DX11) game with the 10.9 HQ and 10.10 HQ drivers, plus a 4xx series card (standard quality setting) in the same game?

Until then this won't be resolved.

A user from XtremeSystems got an HD6870 and ran this:

Just Cause 2 (watch in 1080P full screen)

Look at the texture banding on the vents on the floor on the right of the character and also when he looks down on the ramp with the vents. That's with Cat AI set to High Quality.

EDIT: Looks like it's the inability of the HD 68xx to do supersampling in DX10/11.

BFG, this is your calling card for the next Fermi vs. HD58/68xx image quality analysis!
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
wrong quote

That's all fine and practical. But in reality, after a new GPU is released we have endless debates based on reviews where an 8% difference is deemed by rabid fans of either side a stunning victory, a next-generation product making the competitor obsolete. These conclusions are often argued over by people who don't even GAME!
So they aren't worried about the IQ anyway, but now they have their ammunition in the form of bar charts and graphs!
And THAT is why these companies attempt to get the results, sometimes any way they can.
We also had cards released with extra SPs. We have been arguing over who was right or wrong in AMD asking for certain tessellation-heavy benchmarks to be excluded.
IMO, these things will continue to happen, and too often it's Nvidia that gets accused of these shenanigans.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
A user from XtremeSystems got an HD6870 and ran this:

Just Cause 2 (watch in 1080P full screen)

Look at the texture banding on the vents on the floor on the right of the character and also when he looks down on the ramp with the vents. That's with Cat AI set to High Quality.

Just saw that; a couple of posts further on, another guy says he gets the same issue with his GTX 275.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Just saw that; a couple of posts further on, another guy says he gets the same issue with his GTX 275.

I think I found the answer for the vents.

AlienbabelTech.com:

"The 8000/9000/200 series now supports SSAA too, but only in DX9 and OpenGL. [GF100/104] nVidia's advantage over ATi include the fact that SSAA functions in DX10/DX11, and also that the samples can be decoupled from the base MSAA level. nVidia now allows full scene super-sampling with rotated and sparse grids."

0x SSAA is what you would get on a GTX 275 in a DX10/11 game vs. 8x SSAA as you would get on GF100.

This likely explains why GTX275/HD68xx aren't going to super-sample the vents in Just Cause 2.
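A compact way to restate the quoted support matrix (a sketch based only on the AlienbabelTech description above, not an official capability list):

```python
# Sketch of the SSAA support described in the AlienbabelTech quote; not an official matrix.
SSAA_SUPPORT = {
    "GeForce 8000/9000/200 (e.g. GTX 275)": {"DX9": True, "OpenGL": True, "DX10/DX11": False},
    "GeForce GF100/GF104":                  {"DX9": True, "OpenGL": True, "DX10/DX11": True},
}

def supports_ssaa(card: str, api: str) -> bool:
    return SSAA_SUPPORT[card][api]

# Just Cause 2 is a DX10 title, so a GTX 275 (and, per the posts above, an HD 68xx) gets no SSAA there.
print(supports_ssaa("GeForce 8000/9000/200 (e.g. GTX 275)", "DX10/DX11"))  # False
```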
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
I think I found the answer for the vents.

AlienbabelTech.com:

"The 8000/9000/200 series now supports SSAA too, but only in DX9 and OpenGL. [GF100/104] nVidia's advantage over ATi include the fact that SSAA functions in DX10/DX11, and also that the samples can be decoupled from the base MSAA level. nVidia now allows full scene super-sampling with rotated and sparse grids."

0x SSAA is what you would get on a GTX 275 in a DX10/11 game vs. 8x SSAA as you would get on GF100.

This likely explains why GTX275/HD68xx aren't going to super-sample the vents in Just Cause 2.

Excellent find :thumbsup:
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
I think I found the answer for the vents.

AlienbabelTech.com:

"The 8000/9000/200 series now supports SSAA too, but only in DX9 and OpenGL. [GF100/104] nVidia's advantage over ATi include the fact that SSAA functions in DX10/DX11, and also that the samples can be decoupled from the base MSAA level. nVidia now allows full scene super-sampling with rotated and sparse grids."

0x SSAA is what you would get on a GTX 275 in a DX10/11 game vs. 8x SSAA as you would get on GF100.

This likely explains why GTX275/HD68xx aren't going to super-sample the vents in Just Cause 2.

So the texture banding has nothing to do with texture filtering settings?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
So the texture banding has nothing to do with texture filtering settings?

It looks like I was wrong about the texture banding being related to AF. That particular example is actually a function of SSAA, not AF. We learn something new every day. ;)