Should hardware sites benchmark all cards, AMD and nVIDIA, on High Quality mode?


What do you think of the IQ issue?

  • Do you believe it's fair to just put AMD on HQ and nVIDIA on default?

  • Do you believe it should be the same on both as much as possible?

  • Are you not sure what to think, or think it's not an issue?

  • Do you think this IQ thing is just "FUD"?

  • Are you fed up with hearing about this issue?



Wreckage

Banned
When StarCraft II was launched it had to be tested without AA because AMD did not support it (just like Batman and some other titles). Same with PhysX, CUDA and ambient occlusion. Now testers have to dumb down AF to reach AMD's level.

I say benchmark at AMD's "level" and then show the readers what you can get when you enable all the image quality that AMD does not have.
 

Lonyo

Lifer
Wreckage said:
When StarCraft II was launched it had to be tested without AA because AMD did not support it (just like Batman and some other titles). Same with PhysX, CUDA and ambient occlusion. Now testers have to dumb down AF to reach AMD's level.

I say benchmark at AMD's "level" and then show the readers what you can get when you enable all the image quality that AMD does not have.

And 99% of people won't notice a difference.
 

Mistwalker

Senior member
Wreckage said:
When StarCraft II was launched it had to be tested without AA because AMD did not support it (just like Batman and some other titles). Same with PhysX, CUDA and ambient occlusion. Now testers have to dumb down AF to reach AMD's level.

I say benchmark at AMD's "level" and then show the readers what you can get when you enable all the image quality that AMD does not have.
As I'm sure you are fully aware (or if not, allow me to educate you), StarCraft II was benched at launch sans AA because Blizzard chose not to support it. Nvidia had an AA method that could be brute-forced, AMD did not. Honestly, Nvidia still walks away the winner here, even if AMD did scramble to get a solution out fairly quickly, but please don't be intentionally misleading.

Still, it sounds like we're overall in agreement: benchmark at the same settings and let sites/users judge the IQ differences for themselves.
 

GaiaHunter

Diamond Member
IQ differences?

[attached image: HQbutt.png]


Image selection taken from the Computerbase images linked here:
http://forums.anandtech.com/showthread.php?t=2117420

(And yes, I'm bad at cropping when I use Paint)

It does seem AMD is going down NVIDIA's path. I always preferred sharper textures.
 

GaiaHunter

Diamond Member
AMD also has a bit more shimmering than NVIDIA does...

These aren't really filtering issues due to drivers, though... going to HQ (instead of default) doesn't fix it... it's a hardware issue.
Which is why it's odd that people use it as a reason for testing default NV vs. HQ AMD settings; it's not a driver issue, it's a hardware issue that's still there with HQ settings.

People are confusing hardware issues with driver optimisations, and using that as an argument when it really isn't.

The shimmering is caused by AMD going with sharper textures while NVIDIA goes for blurrier textures.

But yeah, people disable CAT AI and turn drivers to HQ and accomplish exactly 0 to solve this, other than getting slower performance.
 

arredondo

Senior member
It's ridiculous that so much negativity is made of ATI optimizations that should instead be cheered and encouraged by all. They are maintaining visual fidelity in 99.9% of the games released while improving frame rates for all of them. Really, haters are going to call foul on THAT?

The review standard should reflect real-world logic by showing frame rate differences alongside visual differences. Using frame rates alone skews the practical picture, when both performance and the overall graphical output that reaches our eyes are equally important.

If a card cuts too many corners to get high frame rates at a cost in image quality THAT THE HUMAN EYE CAN EASILY DISCERN, evaluate it negatively. If they can maintain both frame rates and IQ at a top-tier level, then give them praise. I mean, if you need to freeze-frame an obscure or ancient game and zoom in with a marked indicator to show where the offense is to make your point, that's pretty pathetic IMHO.
 

Kuzi

Senior member
The problem is NV doesn't give you the choice of disabling optimizations, while ATI does, so how can you be sure you are testing fairly? There may always be some slight differences in some games, but the majority will look exactly the same on both NV and ATI hardware anyway. But I say just test on HQ for both camps.

I think the whole thing is really blown out of proportion. To me it seems like ATI found a way to increase performance without degrading IQ, but NV and Co. are trying their best to take that few-percent increase away and maybe win back a few percent of lost market share in the process.

Sorry, but I don't spend my day comparing screenshots zoomed in 1000% to see which pixel looks different on this or that card; I actually like to "play" the game :D
 

arredondo

Senior member
If NV can't replicate a feature right now, so what? Do we refrain from comparing two basketball players for MVP consideration because one can dunk and the other can't? That's just silly, since the totality of talent in each individual must be weighed, not just individual skills.
 

Wreckage

Banned
arredondo said:
If NV can't replicate a feature right now, so what? Do we refrain from comparing two basketball players for MVP consideration because one can dunk and the other can't? That's just silly, since the totality of talent in each individual must be weighed, not just individual skills.

To use your basketball reference: should we lower the basket so that AMD can play?
 

Phynaz

Lifer
do we really need another thread on this?


This type of post is neither constructive nor helpful; it is a form of thread-crapping.

If your query is genuine and directed to the OP, then you should refrain from posting it publicly and instead communicate it via PM to the OP.

If your query is genuine and directed to the VC&G community at large then you should refrain from posting it publicly in VC&G and instead you should communicate it via creating a thread on the topic in the PFI sub-forum.

If your query is genuine and directed towards the moderators then you should refrain from posting it publicly and instead you should communicate as much by way of reporting the thread.

Moderator Idontcare
 

Creig

Diamond Member
Wreckage said:
To use your basketball reference: should we lower the basket so that AMD can play?
AMD allows you to disable optimizations if you wish; Nvidia does not. Sounds like Nvidia is the one lowering the basket.
 

arredondo

Senior member
Wreckage said:
To use your basketball reference: should we lower the basket so that AMD can play?

Nah, that doesn't fit. In my example I am saying Card A/Player A can do this particular action, but Card B/Player B can't. If that particular action is valuable in reaching the overall goal, then props to Card A/Player A. Don't lower the bar just to help Card B/Player B, since that isn't the overall goal - in fact it runs counter to the overall goal.

What is the overall goal? In basketball, it is for the team to win; in video cards, it is to get the best possible frame rates with as little degradation in image quality as the human eye can reasonably detect.

Helping the self-esteem of Card B/Player B (and their fans) is not a worthy goal. If we instead celebrate excellence, that encourages the other card maker to step up their game and add optimizations to their line-up, and then we all win.
 

Teizo

Golden Member
Kuzi said:
The problem is NV doesn't give you the choice of disabling optimizations, while ATI does
Ever seen an Nvidia control panel? Anisotropic Sample Optimization is off by default, but you are welcome to enable it. Trilinear Optimization is on by default, but you are welcome to disable it. There is a slider there for each, and they are clearly labeled.

And another thing, it was actually ATI who got caught with their pants down not giving people an option to disable the new optimization without moving the slider to HQ IQ in the 68xx review. It was only added via hotfix after it was exposed, but the reviews were already done. That is what the hoopla was all about. The true performance numbers were not shown since the optimizations from both AMD and Nvidia were not set equal, as they were with previous drivers when their respective default settings were the same.

I guess dumbing down IQ is a new feature to be applauded when AMD does it, but if Nvidia had done it... the very same people praising this would be having a cow right now.
 

arredondo

Senior member
It's not dumbing down if they are visually maintaining the IQ in 99% of the games while improving frame rates. In fact, that's a smart step forwards, not backwards. I'll applaud any card maker that does that.
 

Wreckage

Banned
arredondo said:
It's not dumbing down if they are visually maintaining the IQ in 99% of the games while improving frame rates. In fact, that's a smart step forwards, not backwards. I'll applaud any card maker that does that.

That's a slippery slope I would not want to see. Once they get you to accept lesser quality, they will just keep setting the bar lower and lower.
 

GaiaHunter

Diamond Member
Wreckage said:
That's a slippery slope I would not want to see. Once they get you to accept lesser quality, they will just keep setting the bar lower and lower.

That isn't true, since there is more than one company producing video cards.

If the IQ differences are noticeable to the naked eye then the rival company, NVIDIA, can sell cards based on better IQ. Simple as that.

So why did some websites stop doing IQ comparisons? Because the differences are minimal.
 

MrK6

Diamond Member
I'm going to post what I posted in the previous thread on this topic in the hope that it clears up some discrepancies and blatant misinformation, or, better yet, ends the discussion like it did last time:

So I saw these posts and thought they were interesting. I don't know why these sites love Trackmania, but it seems that AMD does have much sharper filtering than NVIDIA:
http://forum.beyond3d.com/showpost.php?p=1499768&postcount=176
http://hardforum.com/showpost.php?p=1036521622&postcount=17
My guess is there's a problem with their filtering algorithm in some older games and how it handles low res/repeat textures. However, since it doesn't seem to be a problem in new games, I'll definitely take the superior image quality/sharpness. :thumbsup:

Here's a link to the rest of the thread for further reading: http://forum.beyond3d.com/showthread.php?p=1499768#post1499768

It's been this way for years. NVIDIA blurs the heck out of its textures in order to reduce shimmering and get a more seamless scene. AMD reproduces textures with the greatest fidelity and gives the sharpest image, at the cost of some shimmering being produced with certain types of low res/repeat textures. I prefer AMD's sharpness as I generally don't see shimmering in games, but the extra IQ from the texture details makes my games look amazing at 2560x1600. Also, look how far some of these sites have to go to find a game that actually produces some kind of shimmering. Ask yourself why they do this, and furthermore why it's always at the release of a new AMD series, and I think the answer is pretty easy to come up with ;).
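To put the sharp-vs-blurry trade-off in concrete terms, the usual mip-level selection math looks roughly like the sketch below (generic C with made-up names, not either vendor's actual driver code): biasing the LOD negative picks a more detailed mip level than the pixel footprint really covers, which gives the crisper-but-shimmering look described above, while a positive bias blurs but stays temporally stable.

Code:
#include <math.h>

/* Pick a mipmap level from the screen-space texel footprint.
 * texels_per_pixel: how many texels the pixel covers along its longest axis.
 * lod_bias:         negative = sharper (risking aliasing/shimmer),
 *                   positive = blurrier but temporally stable.
 * Illustrative only -- real hardware also does per-sample anisotropic
 * weighting, which is where the vendors' optimizations actually differ. */
static float select_mip_level(float texels_per_pixel, float lod_bias, float max_level)
{
    float lod = log2f(texels_per_pixel) + lod_bias;
    if (lod < 0.0f)      lod = 0.0f;       /* clamp to the base (sharpest) level */
    if (lod > max_level) lod = max_level;  /* clamp to the smallest mip */
    return lod;  /* trilinear filtering blends the two nearest integer levels */
}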
 

Teizo

Golden Member
MrK6 said:
...or, better yet, ends the discussion like it did last time
Obviously it didn't end it last time... And what you are saying here is not really what the issue was about, only what it became after a lot of spinning and blowing smoke. Nvidiots took an inch and went a mile claiming NV had better IQ, which is wrong, and Atidiots chimed in saying AMD should be congratulated for the performance boost, which is somewhat stupid. The fuss isn't over whose AF algorithms are better, etc...

The issue was, is, and always will be that AMD changed their default optimization level, which screwed up the 68xx review results with the original 10.10 driver because an AS optimization was enabled at default that differed from Cat 10.9. They didn't offer a hotfix until asked directly about it after it was discovered, and that is what raised eyebrows. Whether or not they did it on purpose to bench better is hard to say and a slippery slope... and not one really worth going down.
 

MrK6

Diamond Member
Teizo said:
Obviously it didn't end it last time... And what you are saying here is not really what the issue was about, only what it became after a lot of spinning and blowing smoke. Nvidiots took an inch and went a mile claiming NV had better IQ, which is wrong, and Atidiots chimed in saying AMD should be congratulated for the performance boost, which is somewhat stupid. The fuss isn't over whose AF algorithms are better, etc...

The issue was, is, and always will be that AMD changed their default optimization level, which screwed up the 68xx review results with the original 10.10 driver because an AS optimization was enabled at default that differed from Cat 10.9. They didn't offer a hotfix until asked directly about it after it was discovered, and that is what raised eyebrows. Whether or not they did it on purpose to bench better is hard to say and a slippery slope... and not one really worth going down.
You already said all of this. I'm referring to posts like these:
There is currently a problem testing ATI cards vs. Nvidia cards. According to testing conducted by some of the German sites, even when the ATI driver is set to HQ, it still doesn't rival Nvidia's default image quality setting, let alone Nvidia's HQ setting. So it would not be apples to apples if you set both to HQ.

I say just run the benches at default for both, and scrutinize IQ differences in the review, just like they used to do in the "old days".
How is AF quality, how is AA handled and executed, is there shimmering in mipmap transitions, are there missing objects, textures, or incorrectly rendered objects/textures? Things like that.

Another thing is, these reviewers are REALLY pressed for time when they get their samples - like one week before release. I know that isn't enough time for an in-depth review AND an image quality analysis. I think reviewers need a bit more time with the new hardware; even a few days more would be helpful. When not rushed, things have less of a chance to be overlooked.
that have no references and are clearly false.

Overall, there's way too much crap going on with all of this and it's getting ridiculous. If some/any review sites would like to do in-depth IQ comparisons, I think it'll be a great read.
 

Teizo

Golden Member
MrK6 said:
You already said all of this.

Yeah, more times than I probably should, lol... I just get a bit frustrated when the issue gets spun out of proportion like it does so often. Not implying you were trying to spin it, though; it just seems so much about this issue is misunderstood and misconstrued for the sake of trying to make one company look better/worse than the other.

MrK6 said:
Overall, there's way too much crap going on with all of this and it's getting ridiculous.

Yeah, things like this have a bad tendency to spiral downward to the gutter and get quite a bit off course from the original issue.

MrK6 said:
If some/any review sites would like to do in-depth IQ comparisons, I think it'll be a great read.

Given enough time, perhaps a review of how each company does their AF texture filtering could be done, and then images with each setting could be shown to see the differences, if any at all. Not necessarily trying to compare AMD to Nvidia directly per se, since that will spiral downward rather quickly, but just a review of each and what happens to IQ when the optimizations are enabled/disabled, and their respective performance impacts as well. I would enjoy reading something like that quite a bit.
 

sandorski

No Lifer
There's no magic "fair" comparison. AMD/Nvidia will likely always have differences in their output, and just using "Highest" or some other generalized quality setting still won't tell you much. Reviews should certainly do output quality comparisons, and most do, but making the output match exactly for all testing on both AMD/Nvidia would take an absurd amount of tweaking and time.
 

Throckmorton

Lifer
If the optimisation isn't visible to the naked eye, I don't see any reason why the driver teams shouldn't do it. "High", "low" or whatever are just abstract values that don't have to correspond to anything in reality, and sure as hell don't have to correspond to the other team's settings.

So just testing both at high/whatever without also looking at the IQ results is completely useless. I'd test both at default values and then compare the IQ; if there are noticeable differences (just please no gigantic zooms of six-year-old games, where it's just as believable that there's a bug in the driver), increase the settings for the weaker contestant to high and compare again.

That's what I was going to say... the settings are arbitrary.

It's like comparing two digital cameras and assuming the JPEG compression is the same because you set both on "high quality".


And what happens when reviewers start setting both to the highest setting? Why wouldn't nVidia just remove the top option? Why wouldn't NV and AMD just downgrade all the settings to get better framerates in reviews? That's the real slippery slope
 

GaiaHunter

Diamond Member
Teizo said:
The issue was, is, and always will be that AMD changed their default optimization level, which screwed up the 68xx review results with the original 10.10 driver because an AS optimization was enabled at default that differed from Cat 10.9. They didn't offer a hotfix until asked directly about it after it was discovered, and that is what raised eyebrows. Whether or not they did it on purpose to bench better is hard to say and a slippery slope... and not one really worth going down.

So what you are saying is that AMD introduced a better optimization in Cat 10.10 that made the 6800 series benchmark faster.

All the IQ pictures and stuff are just smoke and mirrors that don't have anything to do with the actual optimization, since disabling that optimization didn't remove the shimmering.

From the GTX 580 AnandTech review (http://www.anandtech.com/show/4008/nvidias-geforce-gtx-580/2):

Moving on, GF110’s second trick is brand-new to GF110, and it goes hand-in-hand with NVIDIA’s focus on tessellation: improved Z-culling. As a quick refresher, Z-culling is a method of improving GPU performance by throwing out pixels that will never be seen early in the rendering process. By comparing the depth and transparency of a new pixel to existing pixels in the Z-buffer, it’s possible to determine whether that pixel will be seen or not; pixels that fall behind other opaque objects are discarded rather than rendered any further, saving on compute and memory resources. GPUs have had this feature for ages, and after a spurt of development early last decade under branded names such as HyperZ (AMD) and Lightspeed Memory Architecture (NVIDIA), Z-culling hasn’t been promoted in great detail since then.

For GF110 this is changing somewhat as Z-culling is once again being brought back to the surface, although not with the zeal of past efforts. NVIDIA has improved the efficiency of the Z-cull units in their raster engine, allowing them to retire additional pixels that were not caught in the previous iteration of their Z-cull unit. Without getting too deep into details, internal rasterizing and Z-culling take place in groups of pixels called tiles; we don’t believe NVIDIA has reduced the size of their tiles (which Beyond3D estimates at 4x2); instead we believe NVIDIA has done something to better reject individual pixels within a tile. NVIDIA hasn’t come forth with too many details beyond the fact that their new Z-cull unit supports “finer resolution occluder tracking”, so this will have to remain a mystery for another day.

[sarcasm] Sneaky sneaky NVIDIA, improving their Z-culling so they can reject pixels that don't show up in the image instead of rendering them as the game developers wished, making the GTX 580 appear much faster than the GTX 480 in the tessellation department by lowering the work done.

I demand that NVIDIA introduce a slider to stop this optimization, because I want those unseen pixels rendered! [/sarcasm]
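For reference, stripped of the marketing, the Z-cull idea in that quote boils down to an early depth test done before any shading. A minimal per-pixel sketch in C (hypothetical names, ignoring the tile-based hierarchy real GPUs use):

Code:
#include <stdbool.h>

/* Early-Z sketch: assume a float depth buffer where smaller = closer and
 * only opaque geometry has been drawn so far. Return false to cull the
 * fragment (skip shading entirely), true to shade it and update the buffer.
 * Real hardware does this conservatively on tiles of pixels in fixed
 * function; this only illustrates the idea from the quoted review. */
static bool z_cull_pass(float *depth_buffer, int width, int x, int y, float frag_depth)
{
    float stored = depth_buffer[y * width + x];
    if (frag_depth >= stored)
        return false;                          /* hidden behind an opaque pixel */
    depth_buffer[y * width + x] = frag_depth;  /* new closest surface at (x, y) */
    return true;
}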

The mistake AMD made was that, while combining two CCC tabs, they chose Performance, Quality and High Quality. They should have instead chosen Off, Default, Advanced, and then people couldn't bicker "but it used to be High Quality and now it is Quality".

On a side note, I can't see CAT AI in the 10.12 drivers.

EDIT: Scratch that about not finding CAT AI - I was simply in the standard view instead of the advanced view.
 

SirPauly

Diamond Member
I doubt one can get a totally objective apples-to-apples comparison, but one must try. Use high quality on both, try to eliminate texturing optimizations, and let the hardware itself be the gauge.

Let the optimizations be there for owners to enjoy, not for objective hardware testing -- the optimizations cloud things and may unfairly be used to gain a competitive advantage.

Since filtering may differ, try to find a baseline if possible, and where results still diverge, investigate: point the differences out, talk about them and explain what is going on.

Simply benching default against default, sticking your head in the sand and reporting performance -- do readers want cookie-cutter data, or reviews that at least attempt to offer a baseline and investigate on behalf of consumers?
 

Lonyo

Lifer
Throckmorton said:
That's what I was going to say... the settings are arbitrary.

It's like comparing two digital cameras and assuming the JPEG compression is the same because you set both on "high quality".

And what happens when reviewers start setting both to the highest setting? Why wouldn't nVidia just remove the top option? Why wouldn't NV and AMD just downgrade all the settings to get better framerates in reviews? That's the real slippery slope

Some websites already set both to the highest quality settings available in the drivers, e.g. Xbitlabs.