Should hardware sites benchmark all cards, AMD and nVIDIA, in High Quality mode?

What do you think of the IQ issue?

  • Do you believe it's fair to put AMD on HQ and nVIDIA on default?

  • Do you believe it should be the same on both as much as possible?

  • Are you not sure what to think, or think it's not an issue?

  • Do you think this IQ thing is just "FUD"?

  • Are you fed up with hearing about this issue?



Bearach

Senior member
Dec 11, 2010
In this thread there was some "old news" that I've seen so often that it's getting silly. I am in no way an AMD fan in graphics; in fact, I've never even owned one of their cards!

It just seems unfair to me that one company should have its cards benchmarked with most optimizations off and the other not. No matter what you think, both companies "tweak" and fiddle with their drivers.

Optimizations that allegedly don't impact IQ will also be turned off in most instances with HQ mode on... So to make this fair for now, I believe all cards should be benchmarked at the HQ setting, not one at default and the other at HQ.

I will not even entertain a site that would do that, as it will not reflect actual performance.

What are your opinions and views on this?


The thread that brought this poll about: http://forums.anandtech.com/showthread.php?t=2127494

Grr wish I had read the poll questions fully before posting!
 

dpodblood

Diamond Member
May 20, 2010
Both Nvidia and ATI cards should be tested using the exact same settings. This is the only way to truly test performance and determine which card is the best bang for the buck.
 

happy medium

Lifer
Jun 8, 2003
It's kinda a trick question. The first choice is the same as the second choice.
Default for Nvidia = HQ for AMD.
 

Bearach

Senior member
Dec 11, 2010
I believe at the default setting, NVIDIA's image quality is equal to AMD set to HQ.

Glad you used "believe", as that means it's your opinion. Do you have reasons for this belief?

Do you think nVIDIA don't tweak their drivers at all to get boosts at the different IQ settings?
 

Bearach

Senior member
Dec 11, 2010
It's kinda a trick question. The first choice is the same as the second choice.
Default for Nvidia = HQ for AMD.

How so? HQ on AMD will also disable optimizations that don't affect IQ (that's if this IQ issue is true). Wouldn't it also be fair for nVIDIA to have their equivalent optimizations disabled?

Also, how do I edit the poll? Perhaps make the questions clearer.
 

happy medium

Lifer
Jun 8, 2003
I believe at the default setting, NVIDIA's image quality is equal to AMD set to HQ.

That was my understanding also. It's hard to see, but it's there, and this makes benchmarks unfair.


Look at the join date and please don't take the hook. :)


Re: "Look at the join date and please don't take the hook."

I ask you, which part of the quoted text below is reflected in the last sentence of your post above?

AnandTech Forum Guidelines
We want to give all our members as much freedom as possible while maintaining an environment that encourages productive discussion. It is our desire to encourage our members to share their knowledge and experiences in order to benefit the rest of the community, while also providing a place for people to come and just hang out.

We also intend to encourage respect and responsibility among members in order to maintain order and civility. Our social forums will have a relaxed atmosphere, but other forums will be expected to remain on-topic and posts should be helpful, relevant and professional.

We ask for respect and common decency towards your fellow forum members.

Your moderation staff joins your fellow forum members in asking you to please help us in keeping this forum civil, respectful, and open-minded towards ALL of its members, regardless of race, color, religion, creed, sex, national origin, age, disability, veteran status or sexual orientation...including, but not limited to, post count and join dates.

Moderator Idontcare
 
Last edited by a moderator:

happy medium

Lifer
Jun 8, 2003
How so? HQ on AMD will also disable optimizations that don't affect IQ (that's if this IQ issue is true). Wouldn't it also be fair for nVIDIA to have their equivalent optimizations disabled?

Also, how do I edit the poll? Perhaps make the questions clearer.

I don't think you can.
 

nitromullet

Diamond Member
Jan 7, 2004
Personally, I think both should be tested at their highest available quality settings. It's how I'm going to run them, so it's what I care about.
 

Bearach

Senior member
Dec 11, 2010
Look at the join date and please don't take the hook. :)

What are you saying here? Are you implying anything? I assure you I'm no AMD troll... my computing has never been on ATi hardware.

My first graphics was EGA... can't remember the chipset. The graphics chips I've used, from the first I can remember, in order:

Cirrus Logic GD5420
S3 Trio
S3 Virge
nVIDIA TNT... gah! major issues with this and the VIA chipset...
3DFx Banshee... what I replaced that TNT with.
3DFx Voodoo 3000
nVIDIA GeForce 2 MX
nVIDIA GeForce 2 Ti
nVIDIA GeForce 4 Ti 4600
nVIDIA GeForce 6800
nVIDIA GeForce 7600GT (the card that I'm using atm... forgot this one!)
nVIDIA GeForce 9800GT... My baby that burnt out last week :(
 

KCfromNC

Senior member
Mar 17, 2007
If there were a way to set the image quality the same on both cards, that would be ideal. Until NV catches up with AMD and comes up with a way to disable IQ-reducing optimizations in the driver, though, there's no way to do that.

Until then, NV's not serious about this and is just using FUD to try to distract from the 69xx launch.
 

Keysplayr

Elite Member
Jan 16, 2003
There is currently a problem testing ATI cards vs. Nvidia cards. According to testing conducted by some of the German sites, even when the ATI driver is set to HQ, it still doesn't rival Nvidia's default image quality setting, let alone Nvidia's HQ setting. So it would not be apples to apples if you set both to HQ.

I say just run the benches at default for both, and scrutinize IQ differences in the review, just like they used to do in the "old days".
How is AF quality? How is AA handled and executed? Is there shimmering in mip-map transitions? Are there missing objects, missing textures, or incorrectly rendered objects/textures? Things like that.
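
For what it's worth, part of that screenshot scrutiny could be automated. A rough sketch with Pillow and NumPy, assuming two same-resolution captures of the same frame (the filenames here are made up, not from any actual review pipeline):

# Amplify per-pixel differences between two captures so subtle
# filtering changes become visible; print simple summary stats.
import numpy as np
from PIL import Image

def compare_screens(path_a, path_b, thresh=8):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        raise ValueError("captures must share the same resolution")
    diff = np.abs(a - b)              # per-channel absolute difference
    per_pixel = diff.max(axis=2)      # worst channel at each pixel
    changed = (per_pixel > thresh).mean() * 100
    print(f"mean abs diff: {diff.mean():.2f}")
    print(f"pixels differing by more than {thresh}: {changed:.2f}%")
    # Save an 8x-amplified difference map for eyeballing.
    Image.fromarray(np.clip(per_pixel * 8, 0, 255).astype(np.uint8)).save("diff.png")

compare_screens("card_a_frame.png", "card_b_frame.png")

A diff map like this won't catch shimmering in motion, of course, but it makes static filtering differences pop out instead of relying on squinting at screenshots.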

Another thing is, these reviewers are REALLY pressed for time when they get their samples. Like one week before release. I know that isn't enough time for an in-depth review AND an image quality analysis. I think reviewers need a bit more time with the new hardware. Even a few days more would be helpful. When not rushed, things have less of a chance of being overlooked.
 

NoQuarter

Golden Member
Jan 1, 2001
Unfortunately, the fairest way to test them is with optimizations on *and* off. The optimizations are actually a value-add to the consumer: extra work the driver team does to make the card faster in some situations. If the optimizations didn't matter at all, there'd be no point in the companies expending the extra effort of providing them (even if they are quality trade-offs).

Testing both with and without optimizations, as well as analyzing IQ differences, gives the most complete picture of what you can get from a card, but is entirely unfeasible.
 
Feb 18, 2010
In my opinion, if the differences in IQ are not visible to the naked eye and only appear when zooming to 100%, then all these discussions are really, really dumb.
Both nVidia and ATI have a pass from me to optimize any way they choose, as long as their optimizations are not visible in day-to-day gaming.
 

Mistwalker

Senior member
Feb 9, 2007
I say just run the benches at default for both, and scrutinize IQ differences in the review, just like they used to do in the "old days".
How is AF quality? How is AA handled and executed? Is there shimmering in mip-map transitions? Are there missing objects, missing textures, or incorrectly rendered objects/textures? Things like that.

Another thing is, these reviewers are REALLY pressed for time when they get their samples. Like one week before release. I know that isn't enough time for an in-depth review AND an image quality analysis. I think reviewers need a bit more time with the new hardware. Even a few days more would be helpful. When not rushed, things have less of a chance of being overlooked.
I agree with all of this. What's unfortunate is that once upon a time, review sites did slow down and look carefully at image quality, show comparison shots, all that jazz (or maybe they just had the hardware further in advance). Now they expect card makers to stay as close as possible for easier testing, which is a pretty dramatic shift.

On the one hand, you could point to AMD's recent optimizations and say "the difference is not at all noticeable; not a single review site initially caught them." But if looking at IQ were part of the benching process, and testers had more time with the cards, just maybe it would've come to light earlier.
 

Lonyo

Lifer
Aug 10, 2002
There is currently a problem testing ATI cards vs. Nvidia cards. According to testing conducted by some of the German sites, even when the ATI driver is set to HQ, it still doesn't rival Nvidia's default image quality setting, let alone Nvidia's HQ setting. So it would not be apples to apples if you set both to HQ.

I say just run the benches at default for both, and scrutinize IQ differences in the review, just like they used to do in the "old days".
How is AF quality? How is AA handled and executed? Is there shimmering in mip-map transitions? Are there missing objects, missing textures, or incorrectly rendered objects/textures? Things like that.

IQ differences?

[Attached image: HQbutt.png]


Image selection taken from the Computerbase images linked here:
http://forums.anandtech.com/showthread.php?t=2117420

(And yes, I'm bad at cropping when I use Paint)
 

Avalon

Diamond Member
Jul 16, 2001
Highest quality for both companies is fair and the only way. I didn't see a choice for that?

Yep, this is a good way to do it, to keep it scientific. Highest settings for both, then analyze image quality.

Or just keep both at defaults, and again analyze image quality. As long as IQ is compared and the settings are the same on both, it is fair.
 

PingviN

Golden Member
Nov 3, 2009
It's only "fair" if IQ is exactly the same. Turning both up to high might mean the one looks better, but takes a larger fps hit. Honestly, I don't care about "fair". I'm just sick and tired of reading about this non-issue blown out of proportion by PR-people.
 

KCfromNC

Senior member
Mar 17, 2007
Yep, this is a good way to do it, to keep it scientific. Highest settings for both, then analyze image quality.

That's only useful if both companies agree that HQ disables all optimizations. Until NV promises to add this option to their drivers, it's just unfairly handicapping AMD for giving the user options that NV refuses to provide.

Or, just keep both at defaults, and again analyze image quality. As long as IQ is compared, if both settings are the same, it is fair.

True. The problem then becomes figuring out an objective standard for quality. Look in the other threads to see that each company applies different settings which give different trade-offs. AMD seems to filter textures less(?), which shows more detail but can lead to shimmering. NV seems to filter them more, which reduces the shimmering but gives fuzzy text and blurred details. Which is right?
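
One way a site could approximate an objective yardstick, as a sketch only: render a reference frame with a high-precision software rasterizer and score each card's capture against it with a stock similarity metric such as PSNR. All filenames below are hypothetical, and PSNR is blind to temporal artifacts like shimmering, so this would supplement eyeballing rather than replace it:

# Score a capture against a shared reference render with PSNR
# (higher = closer to the reference). Filenames are hypothetical.
import numpy as np
from PIL import Image

def psnr(reference_path, capture_path):
    ref = np.asarray(Image.open(reference_path).convert("RGB"), dtype=np.float64)
    cap = np.asarray(Image.open(capture_path).convert("RGB"), dtype=np.float64)
    mse = np.mean((ref - cap) ** 2)
    if mse == 0:
        return float("inf")  # bit-identical to the reference
    return 10 * np.log10(255.0 ** 2 / mse)

print(psnr("reference_render.png", "amd_hq.png"))
print(psnr("reference_render.png", "nv_default.png"))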
 

Voo

Golden Member
Feb 27, 2009
If the optimisation isn't visible to the naked eye, I don't see any reason why the driver teams shouldn't do it. "High", "low" or whatever are just abstract values that don't have to correspond to anything in reality, and sure as hell don't have to correspond to the other team's settings.

So just testing both at high/whatever without also looking at the IQ results is completely useless. I'd test both at default values and then compare the IQ. If there are noticeable differences (just please no gigantic zooms of six-year-old games, where it's just as believable that there's a bug in the driver), increase the settings for the weaker contestant to high and compare again.
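
Written out as a sketch, that procedure might look like the following; the four callables are hypothetical stand-ins for a reviewer's benchmark suite and eyeball judgment:

# Bench at defaults first; only if IQ visibly differs does the weaker
# card get bumped to "high" and re-benched. All callables are injected.
def fair_comparison(card_a, card_b, run_benchmark, capture_frame,
                    visibly_different, looks_worse):
    # 1. Bench both cards at their driver-default settings.
    fps = {c: run_benchmark(c, setting="default") for c in (card_a, card_b)}
    # 2. Compare IQ at defaults -- naked eye, no gigantic zooms.
    shot_a = capture_frame(card_a, setting="default")
    shot_b = capture_frame(card_b, setting="default")
    if visibly_different(shot_a, shot_b):
        # 3. Bump only the visibly weaker card to HQ and re-bench it.
        weaker = card_a if looks_worse(shot_a, shot_b) else card_b
        fps[weaker] = run_benchmark(weaker, setting="high")
    return fps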
 

Arkadrel

Diamond Member
Oct 19, 2010
I think both should be benched at default; if one has better quality than the other, then that's just one of the "pros" of buying said product.

Reviewers could do an IQ test... where they take in-game pictures of the same spots on both cards and highlight the issues.

Trackmania shows banding issues for AMD... most games don't have this issue... yet every site seems to love to link to that one German site with the pics. It's odd they don't show shots from any other games. They found a game where it's really extreme, when in other games it doesn't show at all / isn't noticeable (otherwise it wouldn't have taken so long to find this issue and make it common knowledge).

AMD also has a bit more shimmering than nVidia does...

These aren't really filtering issues due to drivers, though... going to HQ (instead of default) doesn't fix it... it's a hardware issue.
Which is why it's odd that people use it as a reason for testing default NV vs. HQ AMD settings; it's not a driver issue, it's a hardware issue that's still there with HQ settings.

People are confusing hardware issues with driver optimizations, and using them as an argument when they really aren't one.
 