Testing Nvidia vs. AMD Image Quality


Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
My 10.11s for my 5870 came via Windows Update and the setting remained on "High Quality"; the slider didn't adjust itself. He further states that even on the 5xxx series there were adjustments and that I would have to disable CAT AI to achieve the same as NV's default quality. It's still on Standard and I have noticed no difference in games versus the previous 9.8s I had installed.

Maybe there is a difference, maybe there isn't. Personally I couldn't tell between the two driver sets. Further, until I see what I consider (and I don't mean to come off the wrong way here) in-depth, English-language image quality testing from reputable sites such as AT or HardOCP, I'm inclined to believe that any differences discovered are negligible or are marketing speak.

I'm looking seriously at a GTX 580 vs. a 6970, and it would be nice to have a seriously in-depth, unbiased review of image quality between the two competitors and their latest and greatest, to get to the exact meat of whether there is any difference, and if there is, whether it really matters. I also find it strange that four German sites are seemingly all over this, while the English-language review sites seem to either not care or not have found an issue so far.

Bold ^
Maybe that is exactly what you and a ton of others should be asking for, or quite frankly, demanding.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
Nothing new here.

Until I see this happening in recent games, it is just a driver bug, plus the fact that AMD changed the way the Cat AI + mipmap settings are presented in the driver compared to older series: the combined Cat AI + mipmap setting is now Performance, Quality, High Quality, while before you had a Mipmap slider with High Performance, Performance, Quality, and High Quality, plus a Cat AI tab with Off, Standard, and Advanced.

AMD made a mistake by using Performance/Quality/High Quality for the new combined tab, because now the default is Quality, which immediately reads as lower IQ: High Quality sounds better than Quality, even if the sliders don't control exactly the same things. Had AMD used different labels for the new tab, it would be a lot harder to say the image quality was reduced without the obvious High Quality -> Quality comparison.
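
For reference, a rough sketch of how the old and new Catalyst control panel options line up, using the option names described above (illustrative only; what each label actually does internally isn't modeled here):

Code:
# Illustrative sketch: the Catalyst control panel options described above,
# old scheme vs. new combined scheme. Labels come from the post; the internal
# behavior behind each label is not modeled.
OLD_SCHEME = {
    "Mipmap detail level": ["High Performance", "Performance", "Quality", "High Quality"],
    "Catalyst A.I.": ["Off", "Standard", "Advanced"],  # "Standard" was the old default
}

NEW_SCHEME = {
    # Cat AI and mipmap quality are now one combined control
    "Texture filtering": ["Performance", "Quality", "High Quality"],  # "Quality" is the new default
}

# The optics problem in one line: the new default label sits one notch below the top.
options = NEW_SCHEME["Texture filtering"]
assert options.index("Quality") < options.index("High Quality")

Nothing in that sketch says anything about what the hardware actually renders; it just shows why the relabeling alone invites the "default IQ was lowered" reading.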
 
Last edited:

ronnn

Diamond Member
May 22, 2003
3,918
0
71
I love PR. Just put Kyle on it, or better yet, show us videos of actual gameplay.
 

extra

Golden Member
Dec 18, 1999
1,947
7
81
While AMD makes a blog post cheering on Intel and ARM for embracing OpenCL, Nvidia makes FUD posts about their competitor--yay?

"AMD promotes “no compromise” enthusiast graphics, but it seems multiple reviewers beg to differ." OH NO!!!!

Really guys? Focus on products instead of sleaze. Any difference in image quality, unless there is a specific bug or something, is semantics at this point. Both teams' drivers are fine. They both have some advantages and disadvantages (AMD's edge detect AA works great in some games, I prefer AMD's video quality and adjustments, and NVIDIA's combined supersampling + multisampling modes that nvidiainspector exposes work great in old games).

Having a nice IPS or PVA monitor impacts picture quality *FAR* more than video card brand does (night/day difference).
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Nothing new here.

Until I see this happening in recent games, it is just a driver bug, plus the fact that AMD changed the way the Cat AI + mipmap settings are presented in the driver compared to older series: the combined Cat AI + mipmap setting is now Performance, Quality, High Quality, while before you had a Mipmap slider with High Performance, Performance, Quality, and High Quality, plus a Cat AI tab with Off, Standard, and Advanced.

AMD made a mistake by using Performance/Quality/High Quality for the new combined tab, because now the default is Quality, which immediately reads as lower IQ: High Quality sounds better than Quality, even if the sliders don't control exactly the same things. Had AMD used different labels for the new tab, it would be a lot harder to say the image quality was reduced without the obvious High Quality -> Quality comparison.

QFT

So it's still not shown in games apart from Oblivion and some AF testers. If anything, you have to give AMD credit for getting it to work so well in games that you can't see a difference. If there is one.
 

extra

Golden Member
Dec 18, 1999
1,947
7
81
Bold ^
Maybe that is exactly what you and a ton of others should be asking for, or quite frankly, demanding.

They already exist...

http://www.techpowerup.com/reviews/HQV/HQV_2.0/8.html (video playback quality comparison) "When comparing ATI against NVIDIA, we see higher scores across the board on ATI and more elaborate options in the driver settings." and "I am also shocked that neither ATI nor NVIDIA provide a good set of defaults that help novices get the best out of their graphics processor in regard to video."

http://www.computerbase.de/artikel/grafikkarten/2010/bericht-radeon-hd-6800/ (found AMD's texture filtering to be slightly worse than Nvidia's, so they used High Quality for AMD and Quality for Nvidia)

There's a lot more out there as well. It seems like right now AMD essentially has slightly higher quality filtering but by default has more optimizations enabled (figuring, probably rightfully, that most people will prefer more fps over quality differences that are imperceptible to most). I can't really tell any difference in filtering quality between my 470 on my box and the 5770 on my gf's box. Texture quality is basically flawless in both cases.

Again, to re-iterate: If you care about picture quality, focus on getting a nice IPS or PVA LCD display. It will do more for your picture quality enjoyment than anything else will.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
Just went over to Google Finance to check Nvidia; it's been going up at a nice pace the last few months.
Top story picked up on the ticker.
http://www.google.com/finance?q=NASDAQ:NVDA&client=news
Review Websites Discover AMD Driver Reduces Image Quality

What Editors Discovered
Getting directly to the point, major German Tech Websites ComputerBase and PC Games Hardware (PCGH) both report that they must use the "High" Catalyst AI texture filtering setting for AMD 6000 series GPUs instead of the default "Quality" setting in order to provide image quality that comes close to NVIDIA's default texture filtering setting. 3DCenter.org has a similar story, as does TweakPC. The behavior was verified in many game scenarios. AMD obtains up to a 10% performance advantage by lowering their default texture filtering quality according to ComputerBase.
AMD's optimizations weren't limited to the Radeon 6800 series. According to the review sites, AMD also lowered the default AF quality of the HD 5800 series when using the Catalyst 10.10 drivers, such that users must disable Catalyst AI altogether to get default image quality closer to NVIDIA's "default" driver settings.
Going forward, ComputerBase and PCGH both said they would test AMD 6800 series boards with Cat AI set to "High", not the default "Quality" mode, and they would disable Cat AI entirely for 5800 series boards (based on their findings, other 5000 series boards do not appear to be affected by the driver change).
 
Last edited:

happy medium

Lifer
Jun 8, 2003
14,387
480
126
http://benchmarkreviews.com/index.php?option=com_content&task=view&id=12845&Itemid=47
EDITOR'S NOTE: This is a disturbing article, and the sources here are critical for legitimacy. NVIDIA is a direct competitor to AMD and is the author of this article, which led some readers to ignore the message. However, it was several independent review websites that first brought this issue to the forefront and proved it exists. I personally trust these websites, particularly 3DCenter.org, and have found them to be unbiased over the years.

Benchmark Reviews can confirm that issues with filtering still exist, and pointed this out in our Radeon HD 6850 and Radeon HD 6870 launch articles. We also made it public that certain AMD partners were sending 'juiced' video card samples to review sites, ours included, with details published in our 1120-Core "Fixed" Radeon HD 6850 Review Samples Shipped to Media article. So could this be AMD's last-ditch effort to compete with NVIDIA by manipulating performance?
 

Red Storm

Lifer
Oct 2, 2005
14,233
234
106
Can someone link any screenshots that visually prove the claims from the articles? I've skimmed through them but I personally am not seeing any noticeable differences. I even went through the Oblivion one, which everyone says is the one game where you can see it, but apparently not me. :(
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Nothing new here.

Until I see this happening in recent games, it is just a driver bug, plus the fact that AMD changed the way the Cat AI + mipmap settings are presented in the driver compared to older series: the combined Cat AI + mipmap setting is now Performance, Quality, High Quality, while before you had a Mipmap slider with High Performance, Performance, Quality, and High Quality, plus a Cat AI tab with Off, Standard, and Advanced.

AMD made a mistake by using Performance/Quality/High Quality for the new combined tab, because now the default is Quality, which immediately reads as lower IQ: High Quality sounds better than Quality, even if the sliders don't control exactly the same things. Had AMD used different labels for the new tab, it would be a lot harder to say the image quality was reduced without the obvious High Quality -> Quality comparison.

This ONLY affects 58xx/5900/68xx series GPUs. Not anything lower powered. If it were a driver bug, it would affect the whole lineup.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
To clarify, this is mainly about benchmarking, correct? That is, the drivers are optimized at the default setting so that when both companies' default settings are used in benchmarks, AMD will have a performance advantage and appear faster than their Nvidia equivalents. This issue is at the level of GPU rendering and needs testing to be verified, as it is hard to discern with the eyes. In other words, if people want to play games with the optimized settings, that is fine. However, when it comes to benchmarking, both drivers need to be set on an apples-to-apples basis in order to make a fair comparison of performance.

Is this correct, Keys?
 
Last edited:

extra

Golden Member
Dec 18, 1999
1,947
7
81
To clarify, this is mainly about benchmarking, correct? That is, the drivers are optimized at the default setting so that when both companies' default settings are used in benchmarks, AMD will have a performance advantage and appear faster than their Nvidia equivalents.

Essentially here's the issue in a nutshell:
1. AMD tried to slightly simplify CCC and make it less confusing.
2. They made the default settings for optimizations the ones that the vast majority of regular end-users will be best served by using (higher performance, imperceptible image quality compromises).
3. Some (small minority) of people jump on it as "omg it's a conspiracy" because they are for some reason incapable of going into CCC to change settings.

Hint: If people are having to go onto the internet, take screenshots, and then zoom in and look at tiny portions of the screenshot to find where an optimization "might be affecting something, somewhere"...then the image quality impact is a non-issue.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I love PR. Just put Kyle on it, or better yet, show us videos of actual gameplay.

I was looking at the HL2 screenshots, couldn't notice anything either.

To be fair, texture shimmering/mip-map deficiencies are very difficult to show in a screenshot. You will see mipmap levels not blending together. The effects are more evident in video format. Here you go.


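To illustrate what "mipmap levels not blending together" looks like in numbers, here is a minimal trilinear filtering sketch (textbook LOD math, purely illustrative; it is not AMD's or NVIDIA's actual filtering code). Trilinear filtering blends the two nearest mip levels; if an optimization narrows or skips that blend, the seam between levels becomes a band that crawls and shimmers as the camera moves, which is why video shows it far better than a still.

Code:
import math

def trilinear_blend(texels_per_pixel: float):
    """Split a fractional level-of-detail into two mip levels plus a blend weight."""
    lod = max(0.0, math.log2(texels_per_pixel))  # how strongly the texture is minified
    mip_lo = int(lod)
    weight = lod - mip_lo  # 0.0 = all mip_lo ... 1.0 = all of the next level
    return mip_lo, mip_lo + 1, weight

# Full trilinear blends smoothly between levels; a cut-down ("brilinear"-style)
# shortcut snaps the weight toward 0 or 1 except near the transition, saving
# work but leaving a visible mip boundary in motion.
for ratio in (1.0, 1.5, 2.0, 3.0, 6.0):
    lo, hi, w = trilinear_blend(ratio)
    print(f"minification {ratio:>3}: mip {lo} -> mip {hi}, blend weight {w:.2f}")
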
===========

On a side note, AMD needs to focus on new desktop product availability. HD6850/70 cards are about $20 more expensive than their respective MSRPs. I think IQ is the least of AMD's "issues" at the moment.

GTX460 768mb is still the best card at $120-160
GTX460 1GB has no competition at $160-190
GTX460 1GB factory pre-overclocked versions are faster than HD6850s at $190-220
HD6870 still can't beat a GTX470 despite a $50 price premium on Newegg.
GTX580 has no single GPU-competition, while HD5970 supplies are dwindling.

Ironically the best gaming cards AMD has at the moment are last generation cards such as the $185 HD5850 and $270 HD5870.

I see AMD continuing to lose market share in the desktop discrete segment unless they can fix their HD6850/70 supply issues in Q4 2010. Also, that HD5770 card is in serious need of a replacement.
 
Last edited:

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
Essentially here's the issue in a nutshell:
1. AMD tried to slightly simplify CCC and make it less confusing.
2. They made the default settings for optimizations the ones that the vast majority of regular end-users will be best served by using (higher performance, imperceptible image quality compromises).
3. Some (small minority) of people jump on it as "omg it's a conspiracy" because they are for some reason incapable of going into CCC to change settings.

Hint: If people are having to go onto the internet, take screenshots, and then zoom in and look at tiny portions of the screenshot to find where an optimization "might be affecting something, somewhere"...then the image quality impact is a non-issue.
The issue is not whether or not people can see the optimizations during game play. The issue is with benchmark testing. I worded my post in a way to try to clarify the matter better for people.

People are confusing the matter with the "I can't tell a difference so I don't care" ramblings.

It's not about that at all, as far as I can tell. It is a matter of looking at a benchmark review and knowing that the results are not being skewed by one company vs. another due to optimizing drivers in a way that makes them faster. The only way to verify this is with testing applications, and then having the cards benched on an apples-to-apples settings basis in order to do a proper analysis of performance.
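
As a concrete, hypothetical example of what apples-to-apples could look like in practice, this is the kind of per-vendor settings table a review could publish up front, based on the settings the German sites reportedly standardized on (see the quoted article earlier in the thread); the exact equivalences are a reviewer's judgment call, not something the drivers guarantee:

Code:
# Hypothetical benchmark settings matrix, based on what ComputerBase/PCGH
# reportedly chose. Publishing it alongside the numbers is what makes the
# comparison auditable.
BENCH_SETTINGS = {
    "AMD HD 6800 series":    {"Catalyst AI texture filtering": "High (not the default Quality)"},
    "AMD HD 5800 series":    {"Catalyst AI": "Disabled"},
    "NVIDIA GTX 400 series": {"Texture filtering": "Quality (driver default)"},
}

for gpu, settings in BENCH_SETTINGS.items():
    print(f"{gpu}: {settings}")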
 

extra

Golden Member
Dec 18, 1999
1,947
7
81
The issue is not whether or not people can see the optimizations during game play. The issue is with benchmark testing. I worded my post in a way to try to clarify the matter better for people.

People are confusing the matter with the "I can't tell a difference so I don't care" ramblings.

It's not about that at all, as far as I can tell. It is a matter of looking at a benchmark review and knowing that the results are not being skewed by one company vs. another due to optimizing drivers in a way that makes them faster. The only way to verify this is with testing applications, and then having the cards benched on an apples-to-apples settings basis in order to do a proper analysis of performance.

I agree with you on that. The thing they need to do is make the driver info clearer on what settings do exactly what (the AMD driver is really bad about this; the NVIDIA driver is better but far from great). Then sites need to be aware that if they want identical optimizations performed (or none at all), they may need to change the default settings (which is okay, just tell the readers that you're doing it).
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Hasn't this already been beaten to death in the other thread? Did this really need a repost?
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
To be fair, texture shimmering/mip-map deficiencies are very difficult to show in a screenshot. You will see mipmap levels not blending together. The effects are more evident in video format. Here you go.


===========

On a side note, AMD needs to focus on new desktop product availability to return to their respective MSRPs and not on IQ at the moment.

GTX460 768mb is still the best card at $120-160
GTX460 1GB has no competition at $160-190
GTX460 1GB factory pre-overclocked versions are faster than HD6850s at $190-220
HD6870 still can't beat a GTX470 despite a $50 price premium on Newegg.
GTX580 has no single GPU-competition, while HD5970 supplies are dwindling.

Ironically the best gaming cards AMD has at the moment are last generation cards such as the $185 HD5850 and $270 HD5870.

I see AMD continuing to lose market share in the desktop discrete segment unless they can fix their HD6850/70 supply issues in Q4 2010. Also, that HD5770 card is in serious need of a replacement.

The thing is people keep buying AMD cards at those prices. Also, AMD cards are generally cheaper than NV cards elsewhere around the world. A 6850 is around $55-$60 cheaper than a stock 460 and the same price as a 460 768MB. A 6870 is the same price as a stock 460 1GB and a massive $120 less than the 470.

In the UK, the gaps in pricing are much smaller, but a 6850 is a little cheaper than the 460 1GB and the 6870 is a little cheaper than the overclocked 460s.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
The thing is people keep buying AMD cards at those prices.

Obviously at the prices you listed, AMD cards are the way to go. :D The US market is still pretty large. AMD had an absolute slam dunk at $179 and $239. Now at $199 and $259 or so, these cards aren't as good of a deal anymore. They really need to fix their supply issues since faster GTX560/570 will soon replace the 460/470/480 cards.
 
Last edited:

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
This ONLY affects 58xx/5900/68xx series GPUs. Not anything lower powered. If it were a driver bug, it would affect the whole lineup.

Not if it is due to optimizations.

Otherwise you wouldn't see driver notes stating "improvements up to x% at certain resolution/x level of AA for series X of cards" from both NVIDIA and AMD.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
The issue is not whether or not people can see the optimizations during game play. The issue is with benchmark testing. I worded my post in a way to try to clarify the matter better for people.

People are confusing the matter with the "I can't tell a difference so I don't care" ramblings.

It's not about that at all, as far as I can tell. It is a matter of looking at a benchmark review and knowing that the results are not being skewed by one company vs. another due to optimizing drivers in a way that makes them faster. The only way to verify this is with testing applications, and then having the cards benched on an apples-to-apples settings basis in order to do a proper analysis of performance.

Yes, AMD is clearly skewing the benchmark results for Oblivion, Half-Life 2, and Trackmania.

Really hot games in recent reviews.

Driver optimizations are part of a graphics card's life, and mostly what these optimizations do is select what information needs to be rendered and what can be discarded, since the user won't see it, in order to improve performance.
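
For what it's worth, here is a trivial sketch of the kind of "don't spend work on what can't be seen" decision being described (a generic visibility cull, purely illustrative; real driver and engine optimizations are far more involved and vendor-specific):

Code:
# Purely illustrative: skip objects that fall entirely behind the camera plane.
# The principle is the one described above -- spend no work on what the user
# won't see -- even though real optimizations are much more sophisticated.
from dataclasses import dataclass

@dataclass
class Sphere:
    x: float
    y: float
    z: float
    radius: float

def visible(obj: Sphere, camera_z: float) -> bool:
    # The camera looks down -z; anything wholly behind it can be discarded.
    return obj.z - obj.radius < camera_z

scene = [Sphere(0, 0, -5, 1), Sphere(0, 0, 3, 1)]
draw_list = [o for o in scene if visible(o, camera_z=0.0)]
print(f"drawing {len(draw_list)} of {len(scene)} objects")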
 

Kuzi

Senior member
Sep 16, 2007
572
0
0
I always have the drivers set to the "High Quality" mipmap detail level with Catalyst A.I. disabled, and I've never noticed any issues with image quality in games. Maybe review sites should use these settings and also force the same on NV drivers when testing, although I don't think there is a setting in NV drivers that allows you to disable game optimizations like Catalyst A.I. does. I haven't used an NV card for over 2.5 years, so I'm not really sure about this.
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
I find that while this is true and disappointing for AMD to do, it is really tacky of Nvidia to post this on their site. Leave it to the review sites. Right now it just looks like Nvidia is so desperate to sell cards that they have stooped to mudslinging. Yeah, it is true, but nonetheless tacky. I hate it when they do it in politics, and I'm not fond of seeing it here.

The one comfort you can take out of this whole thing, if you are like me and own an AMD card, is that it is fixable. You can turn off the optimizations, at the cost of some frames, to get the best image quality.
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Not if it is due to optimizations.

Otherwise you wouldn't see driver notes stating "improvements up to x% at certain resolution/x level of AA for series X of cards" from both NVIDIA and AMD.

It is indeed an optimization. For higher framerates at the cost of image quality.