Testing Nvidia vs. AMD Image Quality

Page 4

German_IQfreak

Junior Member
Nov 20, 2010
9
0
0
Well you can't turn off driver optimizations on nVidia cards.

Fair settings would be for both to be
AF=HQ
But in those situations AMD cards never took a big hit; the bigger hit comes from turning off driver optimizations.

Really?;)


Sometimes I get the feeling that many Americans don't care about IQ at all.
BTW, that is also why it was German users who found options like SGSSAA on GF10x/110 across all APIs, and so on.
 

Outrage

Senior member
Oct 9, 1999
217
1
0
So Nvidia has some AF bugs in older games; we'll just have to hope those Germans do a write-up about that as well.
 
Feb 19, 2009
10,457
10
76
Just test games at 4x AA, both control panels on HQ mode OR default for both. Don't use fancy stuff like CSAA, MSAA, MLAA or QxAA. Do a separate article on perf impacts of those AA features.

Why 4x AA? Its image quality is good and both sides offer comparable quality. It's what most people with mid-range cards use at lower resolutions, and what high-end users use at higher resolutions. Plus, who really notices the difference between 4x and 8x AA anyway, unless they squint and look hard enough?
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,330
126
A Guru3D forums response to Rollo's thread on the same topic over there:

http://forums.guru3d.com/showthread.php?t=332883

Really don't know how to consider image quality from both sides other than manually watching the screens :p

So Rollo is spreading this nvidia spin around on forums as well?

You know nvidia is ramping up the propaganda when they have Rollo, the viral marketer they polluted this forum with, on the FUD-spreading case.

Good thing he was banned from these forums for his viral crap, so we don't have to get inundated with too much of this nvidia PR spin.
 

Elfear

Diamond Member
May 30, 2004
7,165
824
126
From the Guru3D thread linked above:


So what? If they put NV under the magnifying glass, you would be surprised to find they aren't any better, if not even more backstabbing.

Take this transparency (Tr) MSAA: instead of traditional multisampling on older GPUs (prior to GeForce 400), it's now some kind of halfway TrMSAA with a worse image. It started with the CUDA 3.x.x / 256+ drivers: when the driver detects a non-Fermi GPU it switches to this uglier technique, but Fermi uses normal multisampling. If you test older 190+ drivers with CUDA 2.x.x, they use normal transparency multisampling AA.


Normal TrMSAA is the first mode that's disabled; now, with CUDA 3.x.x, it's the second one, ALL_MODE_REPLAY_MODE_ALPHA_TEST, and that has a worse image. Normal Tr supersampling is untouched.


But you wouldn't know about it if you didn't check with NV Inspector, and NV was all quiet about it. All they claimed was that 256+ would be faster (and I'm talking about the whole driver in general), and that was it.

If I'm interpreting that right, Nvidia also recently made their default setting use more optimizations.
 
Nov 20, 2010
1
0
0
Sometimes I get the feeling that many Americans don't care about IQ at all.
Sometimes I get the feeling that many Germans don't care about FPS at all.

Additional account for Ben90 who is taking a little break from posting.
Admin allisolm
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
I think you can quantify it if you are comparing against a target image. The problem comes up when the target image has been developed on one of the competing cards. Personally I think some of it comes down to preference.
Though I did watch those videos, and there is definitely an issue. However, it is pretty hard to notice in actual gameplay. In fact, in some games they had to slowly move backwards to show the issue, and even then it was subtle.
The problem is that they then completely dismissed the issues the other card was showing. For example, in the Witcher shot they show the filtering issue on the boardwalk, which is noticeable, but ignore the issue on the competing card with the man standing at the end of the boardwalk.
So in the end, some of it is subjective. I know there are things I do not like about my ATI card, to the point that I have considered selling it, yet others have no issue with them or don't notice them.
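
To make the "compare against a target image" idea concrete, here is a minimal Python sketch (not a tool any review site actually uses) that scores two screenshots against a reference render with a plain per-pixel RMSE, using Pillow and NumPy; the filenames reference.png, shot_a.png and shot_b.png are placeholders. It also illustrates the limitation noted above: a single-frame metric cannot capture shimmering or other artifacts that only show up in motion.

```python
# Rough sketch: score two screenshots against a reference image by per-pixel RMSE.
# A lower score means the output is closer to the reference render.
# Filenames are placeholders; the images must share the same resolution.
import numpy as np
from PIL import Image

def load_rgb(path):
    # Load as float RGB so the difference math doesn't wrap around at 255.
    return np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)

def rmse(img, ref):
    # Root-mean-square error over all pixels and channels.
    return float(np.sqrt(np.mean((img - ref) ** 2)))

if __name__ == "__main__":
    reference = load_rgb("reference.png")        # e.g. a reference-quality render
    for name in ("shot_a.png", "shot_b.png"):    # screenshots from the two cards
        shot = load_rgb(name)
        if shot.shape != reference.shape:
            raise ValueError(f"{name} must match the reference resolution")
        print(f"{name}: RMSE vs reference = {rmse(shot, reference):.2f}")
```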
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Did our "experts" miss this question? Or, is there no way to objectively measure IQ?

There are some basic MIP-level tests that can be done such as the "cylinder" test. However, just like some people prefer more vibrant tones for their TVs or Digital cameras vs. the reference color spectrum, IQ will always be considered subjective since each person ultimately has their own preferences. Some will notice/care about texture filtering issues and some will not.
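
As an aside, the colored-mipmap idea behind such MIP-level tests is simple enough to sketch. This is not the actual "cylinder" test tool, just an illustration of the principle: give every mip level its own solid color, feed the chain to a texture-filtering test scene, and look at the transitions. Full trilinear blends the colors smoothly, while bilinear or "brilinear" optimizations leave hard or narrow bands. The helper names below (make_colored_mip_chain, MIP_COLORS) are made up for the example.

```python
# Sketch of the colored-mipmap idea used by MIP-level filtering tests:
# each mip level gets a distinct solid color. When these are uploaded as the
# explicit mip chain of a texture in a test scene, smooth color gradients mean
# real trilinear filtering, while sharp or narrow bands reveal bilinear or
# "brilinear" shortcuts between levels.
from PIL import Image

MIP_COLORS = [
    (255, 0, 0), (0, 255, 0), (0, 0, 255),
    (255, 255, 0), (255, 0, 255), (0, 255, 255),
]

def make_colored_mip_chain(base_size=256):
    """Return solid-color images for every mip level from base_size down to 1x1."""
    levels = []
    size, level = base_size, 0
    while size >= 1:
        levels.append(Image.new("RGB", (size, size), MIP_COLORS[level % len(MIP_COLORS)]))
        size //= 2
        level += 1
    return levels

if __name__ == "__main__":
    for i, img in enumerate(make_colored_mip_chain()):
        img.save(f"mip_level_{i}.png")  # load these as explicit mip levels in a test app
```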

If you can't notice the difference, then keep everything at default. If you can notice the difference, feel free to move the slider to HQ. Whatever the gamer does at home is his/her choice. I think the relevant point here is that when testing videocards apples-to-apples, the intention is always to compare similar workloads on the videocards. If one videocard is trying to render something below the game developer's reference image level, the industry generally considers that an unacceptable optimization.

NV has been caught cheating in the past in 3dMark03, as well as with their tri-linear filtering optimizations.

I quote:

"As you can see from the screenshot above, the [NV] quality-performance modes have been renamed again: Performance mode turned into High Performance, while former Balanced mode is now called just Performance.

You can clearly see that in Balanced and Performance modes tri-linear filtering has degraded almost to pure bi-linear filtering: smooth transitions between the MIP-levels got down to narrow bands 2-3 pixels wide. Besides that, the level of detail in Performance mode has been partially lowered: the first MIP-level border has moved closer.

It is evident that further degradation of tri-linear filtering in 43.80 driver is a way to increase GeForce FX chips performance when both: anisotropic filtering and tri-linear filtering are used. So there is only one option left: to disable tri-linear filtering [optimizations]." - Xbitlabs FX5900 review

Notice that this type of texture filtering optimization was not taken lightly. Rather than testing on a game-by-game basis, many websites such as Xbitlabs and AnandTech simply started testing the FX series with tri-linear optimizations OFF, regardless of whether those optimizations affected every game (because it was unfeasible to test every single game in reviews with them on and off).
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
An epic post from Guru3D.com:


lol, or Nvidia with its bilinear anisotropic filtering... same $hit, different story.

You have to manually force trilinear in-game (Fear 1 and 2 and a few other games) and then force anisotropic filtering in the driver to get normal filtering, i.e. trilinear aniso filtering without that bow wash effect with blurred filtering in the middle.

[screenshots: 1.jpg, 2.jpg]


Or force trilinear in the config file (Unreal Engine 3 games), otherwise you get the same bow wash: bilinear aniso filtering.


Or this lame trick in the NV control panel: transparency antialiasing, multisampling mode, on older GPUs (G80, G92, G200). With the new 2xx.xx drivers it forces them to use some weak supersampling method instead of true multisampling like the older drivers did, but Fermi (G100, 104, 110) is allowed to keep using the normal multisampling method.

With old drivers, 195 up through 198, it was normal:

[screenshot: jaawie.jpg]

But now, with the newer 257 up to the most recent 263.xx drivers, it's forced to use supersampling: AA_MODE_REPLAY_MODE_ALPHA_TEST = uglier image.



Don't get me wrong, I like Nvidia; it's just these small things that make me mad, and these small things matter the most since they represent the final product.




Nvidia cards have issues too, and it's possible to find games where that shows as well... yet I don't think the AMD forums (if there even is such a thing) have articles about how poor Nvidia IQ is in those games.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Yeah, Fear was a very buggy game. I think Nvidia shows an exception once in a while, while at the same time AMD shows the norm, the norm being lower image quality.
Hey, how about instead of all the he said, she said, let's get some more review sites to take on this challenge? Why war among ourselves? Let's get some answers.
 

Larries

Member
Mar 3, 2008
96
0
0
Yeah, Fear was a very buggy game. I think Nvidia shows an exception once in a while, while at the same time AMD shows the norm, the norm being lower image quality.
Hey, how about instead of all the he said, she said, let's get some more review sites to take on this challenge? Why war among ourselves? Let's get some answers.

What you apparently missed is that, according to the Guru3D poster, the setting was working up to 198 but no longer works in 2xx. (It's hard to argue this is a bug in the game in that case...)

You cannot just ask the reviewers to simply set the setting in the driver (e.g. in Nvidia's case, the option may not work).

So what option are you left with? Manually and visually checking the IQ when reviewing the cards?

Also, IQ checking is not a one-time exercise. It has to be a continuous check to make sure neither side introduces 'hidden features' or 'bugs' into the drivers (in both the ATI and Nvidia cases, the lower IQ did not appear until after a certain driver version). And, more interestingly, it may only affect certain cards.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
If somebody is interested in more video and picture comparisons, then please take a look at this thread:
http://www.forum-3dcenter.org/vbulletin/showthread.php?t=482069

On the last pages there are also some comparisons between Cypress, Barts, R520 and Nvidia cards, which show that Nvidia offers far better IQ. But the thing AMD should especially be ashamed of is that R520 with AI Off offers much better filtering than Cypress and Barts.

Lol at the Guildwars screenshots.

Now that is a game in which I have over 7,000 hours logged.

Here is my Lion's Arch screenshot of the same place, taken on a 4850 with 4xAA in game, Cat AI on Advanced, 16xAF, and driver AAA on Quality.



Now GW is a bit shimmery, especially in the older campaigns, but it was so on the Ti4200, on the 7600GT and 7900GT, and on the 9400M, and it is on the 4850 and on the GTX 260, which, while I don't own one, I've played on a system with.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,700
406
126
Also, IQ checking is not a one-time exercise. It has to be a continuous check to make sure neither side introduces 'hidden features' or 'bugs' into the drivers (in both the ATI and Nvidia cases, the lower IQ did not appear until after a certain driver version). And, more interestingly, it may only affect certain cards.

Honestly, if you caught AMD or NVIDIA lowering IQ in a game that is constantly benchmarked, that is one thing.

But believing that either AMD or NVIDIA lowers IQ in old games that nobody tests (with the possible exception of BFG10K and TechPowerUp, OK, you too apoppin), just to get better frame rates, is silly.

What do NVIDIA or AMD gain by lowering IQ in games like HL2, GW, Oblivion, FarCry or Fear?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
What you apparently missed is that, according to the Guru3D poster, the setting was working up to 198 but no longer works in 2xx. (It's hard to argue this is a bug in the game in that case...)

You cannot just ask the reviewers to simply set the setting in the driver (e.g. in Nvidia's case, the option may not work).

So what option are you left with? Manually and visually checking the IQ when reviewing the cards?

Also, IQ checking is not a one-time exercise. It has to be a continuous check to make sure neither side introduces 'hidden features' or 'bugs' into the drivers (in both the ATI and Nvidia cases, the lower IQ did not appear until after a certain driver version). And, more interestingly, it may only affect certain cards.

Of course we can ask the reviewers to simply set the setting in the driver, if that setting puts IQ on an equal playing field; specifically, if the cards are doing the same amount of work and actually producing similar quality as a result of that similar work. That's not a hard thing to comprehend, and certainly not out of line to ask for in a tech site's review. If I'm reading benchmarks of cards, I would like to think that it's apples to apples, not apples-with-a-touch-of-cinnamon to apples. It's deceiving: release a new card, set the testing drivers to lower IQ during the week before the NDA expires, wait till the benches are released, and hope nobody checks into it. But if they do, be ready to claim a driver bug and offer hotfixes long after the initial reviews are out there for all to see, with bar charts representing the lower-IQ driver. It doesn't matter by then; it's already out there. So a new user searching for benches gets the shaft.
 

tannat

Member
Jun 5, 2010
111
0
0
So Rollo is spreading this nvidia spin around on forums as well?

You know nvidia is ramping up the propaganda when they have Rollo, the viral marketer they polluted this forum with, on the FUD-spreading case.

Good thing he was banned from these forums for his viral crap, so we don't have to get inundated with too much of this nvidia PR spin.

At least most people still know that Rollo is a member of the Nvidia focus group, and Keysplayr admits his involvement in it out of necessity.

IMO, it's even more disturbing to think of the unknown people who may be part of a viral marketing effort while disguised as ordinary forum members. I'm certain they exist here as well.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
At least most people still know that Rollo is a member of the Nvidia focus group, and Keysplayr admits his involvement in it out of necessity.

IMO, it's even more disturbing to think of the unknown people who may be part of a viral marketing effort while disguised as ordinary forum members. I'm certain they exist here as well.

Oh, don't be so paranoid. There aren't any viral marketers here. None whatsoever.
 

German_IQfreak

Junior Member
Nov 20, 2010
9
0
0
Lol at the Guildwars screenshots.

Now that is a game in which I have over 7,000 hours logged.

Here is my Lion's Arch screenshot of the same place, taken on a 4850 with 4xAA in game, Cat AI on Advanced, 16xAF, and driver AAA on Quality.



Now GW is a bit shimmery, especially in the older campaigns, but it was so on the Ti4200, on the 7600GT and 7900GT, and on the 9400M, and it is on the 4850 and on the GTX 260, which, while I don't own one, I've played on a system with.

Sorry, but what do you want to show me with these shots?
I think it should be clear that only videos can show the real quality of AF.

To clarify this please take a look at the Drakensang 2 shots and videos.
http://www.forum-3dcenter.org/vbulletin/showpost.php?p=8383575&postcount=1704
http://www.forum-3dcenter.org/vbulletin/showpost.php?p=8395327&postcount=1746

The shots on AMD are much sharper than on Nvidia, but the texture shimmering is epic.
 

Xarick

Golden Member
May 17, 2006
1,199
1
76
But isn't it on the review sites to thoroughly inspect and review the options available in the drivers? ATI shouldn't have lowered the default quality, but even I clearly noticed the additional option in the CCC right away. A reviewer should have wanted to understand that before running benches.
 

tannat

Member
Jun 5, 2010
111
0
0
Oh, don't be so paranoid. There aren't any viral marketers here. None whatsoever.

That's reassuring, and maybe I'm paranoid:)

Honestly, I wish I could take your word for it, but I don't understand how you can know. Maybe you have insight and can speak for the Nvidia team, but I've learned not to trust any such statements from Nvidia anyway. As for ATi/AMD, I have no idea whether they are into viral marketing. Some seem convinced of it, though.
 

KARpott

Member
Sep 23, 2010
43
0
0
Do you want to know why some GERMAN sites have come up with this issue?
This might sound stupid, arrogant or bigheaded or whatever, but the motivation for those guys to write about this mainly comes from our efforts at
http://www.3dcenter.org/
in analysing IQ.
Apart from some guys at Beyond3D and here (namely BFG), we were the ones who made the AMD banding issue known among German reviewers. We were the ones who analyzed the IQ changes with Cat 10.10, and so on.
Many writers from German tech sites have accounts on the 3DCenter forums, and we have many 3D "gurus" with expert knowledge of both the software and hardware sides of everything related to 3D. Not that this isn't the case for other forums around the world as well, but it more or less explains why all the GERMAN sites are suddenly bringing up AMD's IQ issues. Just compare it with the Cypress reviews: they were all sure that AMD had the best IQ on the market, just like in the AnandTech review.
Before I posted the TrackMania shots showing the banding, none of these German sites cared about this game; now it seems to be quite a popular example. To understand this, you have to consider that here in Germany the PC is traditionally much more important as a gaming platform than in most other countries. I don't want to sound discriminating, but it seems to me a fact that console players do not pay as much attention to IQ as PC players do. One of the main reasons for playing on the PC is actually IQ, and frankly, I don't want to spend hundreds of euros on crappy image quality, no matter how good the performance is.

Now to the topic:
Computerbase.de : +6% average performance gain when using default quality
http://www.computerbase.de/artikel/grafikkarten/2010/bericht-radeon-hd-6800/13/

My research at 3DCenter: about a 6% IQ loss (if it can be quantified at all) when using default quality
http://www.forum-3dcenter.org/vbulletin/showpost.php?p=8373318&postcount=847

Unfortunately, knowledge of the German language is required.
 