Should hardware sites benchmark all cards, AMD and nVIDIA, in High Quality mode?


What do you think of the IQ issue?

  • Do you believe it's fair to just put AMD on HQ and nVIDIA on default?

  • Do you believe it should be the same on both as much as possible?

  • Are you not sure what to think, or think it's not an issue?

  • Do you think this IQ thing is just "FUD"?

  • Are you fed up with hearing about this issue?



MrK6

Diamond Member
Aug 9, 2004
4,458
4
81
Yeah, more times than I probably should lol... I just get a bit frustrated when the issue gets spun out of proportion like it does so often. Not implying you were trying to spin it, though; it just seems so much about this issue is misunderstood and misconstrued for the sake of trying to make one company look better or worse than the other.

Yeah, things like this have a bad tendency to spiral downward to the gutter and get quite a bit off course from the original issue.
Exactly :thumbsup:

I think the best thing an enthusiast can do for him/herself here is to understand the technology, how it's used, and its limitations. Then, each individual can construct a personal opinion on what they prefer for their gaming/computing experience. I've already said a lot of this is very subjective, and the people that want to label one card/method/company "good" or "bad" either clearly don't understand the technology or issues at hand or are purposefully distorting it for some other agenda. Either way, it's a disservice to the community. The objective here is to learn and gain some perspective, not try to shove your opinion down someone else's throat.

Given enough time, perhaps a review of how each company does its AF texture filtering could be done, with images at each setting shown to see the differences, if any at all. Not necessarily trying to compare AMD to Nvidia directly per se, since that will spiral downward rather quickly, but just a review of each: what happens to IQ when the optimizations are enabled/disabled, and their respective performance impacts as well. I would enjoy reading something like that quite a bit.
I agree. Under the "there's more than one way to skin a cat" philosophy, it'd be interesting to see a breakdown of the different approaches available. However, I understand why a lot of review sites wouldn't do it, as it's extremely time-consuming and not really high-yield unless you find something revealing.
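For anyone who wants to try the pixel-level half of that at home, it's fairly easy to automate. Here's a minimal sketch (in Python, assuming Pillow and NumPy are installed) of how one might quantify the difference between two screenshots of the same frame captured at different filtering settings; the file names and threshold are made up for illustration:

```python
# Hypothetical screenshot diff: report how many pixels differ between two
# captures of the same frame (e.g. default vs. high-quality filtering).
import numpy as np
from PIL import Image

def compare_screenshots(path_a, path_b, threshold=8):
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.int16)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.int16)
    if a.shape != b.shape:
        raise ValueError("screenshots must share the same resolution")

    diff = np.abs(a - b)                      # per-channel absolute difference
    per_pixel = diff.max(axis=2)              # worst channel for each pixel
    changed = (per_pixel > threshold).mean()  # fraction of noticeably changed pixels

    print("mean abs diff: %.2f" % diff.mean())
    print("max abs diff:  %d" % per_pixel.max())
    print("pixels over threshold %d: %.2f%%" % (threshold, changed * 100))

# e.g. compare_screenshots("af_default.png", "af_high_quality.png")
```

A near-zero "pixels over threshold" figure would support the "no visible impact" claim; a large one would be exactly the kind of revealing result that makes the time investment worthwhile.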
There's no magic "fair" comparison. AMD and Nvidia will likely always have differences in their output, and just using "highest" or some other generalized quality setting still won't tell you much. Reviews should certainly do output-quality comparisons, and most do, but making the output match exactly across all testing on both AMD and Nvidia would take an absurd amount of tweaking and time.
Time is definitely a factor. Furthermore, I don't want to see review sites remove driver optimizations in their reviews by using "high quality" settings only. These companies invest a hell of a lot of time creating these optimizations, and they're done, in good faith, to improve performance without impacting image quality. Forcing the drivers to run without optimizations really defeats the purpose of good drivers/driver updates, and I'm not paranoid or arrogant enough to think myself better than either company's team of engineers.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
These companies invest a hell of a lot of time creating these optimizations, and they're done, in good faith, to improve performance without impacting image quality. Forcing the drivers to run without optimizations really defeats the purpose of good drivers/driver updates, and I'm not paranoid or arrogant enough to think myself better than either company's team of engineers.

I fully agree with this.

I also agree with the late President Ronald Reagan when he said "trust, but verify" in reference to agreeing to reduce nuclear weapon stockpiles but wanting to ensure there was infrastructure in place to actually verify both sides were honoring the treaty.

I think we should trust the driver teams to have the right intentions in mind when they craft their driver optimizations. I don't think there needs to be some over-reaching, pedantic effort to normalize every last pixel on the screen before an arguably useless (for other reasons) average or minimum FPS value can be captured and reported to the reader in your plain-vanilla, garden-variety "gaming performance review".

(Not saying it doesn't have its place; I very much enjoy BFG10K's articles on the topic. But I don't feel every review article out there needs to become an IQ comparison review article at the expense of actually reviewing gaming performance itself.)

At the same time, I think it is not wise to ignore the IQ story entirely. There should be a "spot check," as we call it in engineering at times: a quick check just to confirm no hi-jinks are going on, then verify and move on without dwelling on the matter.
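In the same spirit, the FPS side of a spot check is trivial to script. A minimal sketch (Python; the frame-time numbers here are invented) of turning per-frame render times into the average and minimum FPS figures reviews usually report:

```python
# Spot-check sketch: derive average and minimum FPS from a frame-time capture.
def fps_summary(frame_times_ms):
    seconds = [t / 1000.0 for t in frame_times_ms]
    avg_fps = len(seconds) / sum(seconds)  # total frames over total time
    min_fps = 1.0 / max(seconds)           # slowest single frame
    return avg_fps, min_fps

frame_times = [16.7, 16.9, 33.4, 16.6, 17.1, 50.0, 16.8]  # sample capture (ms)
avg_fps, min_fps = fps_summary(frame_times)
print("average FPS: %.1f, minimum FPS: %.1f" % (avg_fps, min_fps))
```

Note how a single 50 ms hitch drags the minimum down to 20 FPS while the average stays above 40, which is part of why a lone average or minimum number is "arguably useless" on its own.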
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
These companies invest a hell of a lot of time creating these optimizations, and they're done, in good faith, to improve performance without impacting image quality.

Really? Are you that naive? You might want to look up "quake quack" before using the words "good faith" and "optimizations" in the same sentence.
 

MentalIlness

Platinum Member
Nov 22, 2009
2,383
11
76
Identical in-game settings and resolution. But biased review sites seem to have problems with this.

Maybe I'll just build two identical systems with cards from NV and AMD and compare them myself.

Like one of 6970/6950/5870/5850/580/570/480/470

Probably not, but it would be fun as hell. :) You just never know, though; I can't wait till that "lump sum" of money arrives.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,731
428
126
Really? Are you that naive? You might want to look up "quake quack" before using the words "good faith" and "optimizations" in the same sentence.

And ATI (I guess your signature says RIP ATI, but you can't actually let ATI rest :p) got caught the same way NVIDIA got caught optimizing the 3DMark benchmark.

Once you get people with two PCs next to each other, one using an AMD card and the other an NVIDIA card, all these kinds of foul play become evident (although I don't think NVIDIA not rendering stuff that isn't seen in 3DMark is foul play; it just shows the vulnerabilities of canned benchmarks).
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81