To Anandtech: What is your reasoning behind not using HQ settings on both cards?

Polish3d

Diamond Member
Jul 6, 2005
5,500
0
0
We all know (though some will inevitably dispute) that:


Nvidia's default driver settings are "Quality" rather than "High Quality", with Anisotropic Filtering and Trilinear Filtering optimizations on as well.

We also know that Nvidia loses performance in games by going from the "Quality" to the "High Quality" option in their drivers (the hit seems to range from 1-3% to upwards of 15% or more).


We also know that ATI's default driver settings are "High Quality", with Anisotropic Filtering and Trilinear Filtering optimizations turned off.

We also know that for many people there is a palpable difference between the two settings.

Given this, and that you routinely test cards at high end image settings like 2048x1536, why do you not explore this issue?

Other websites that do test this way show serious changes in their results, as one would expect from taking anywhere from 1-2% to 15%+ off the scores of Nvidia cards.
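To make the stakes concrete, here's a rough sketch with purely made-up numbers (not anyone's actual benchmark results) of how a Quality-to-High-Quality hit can flip a close matchup:

# Purely hypothetical scores to show how a Quality -> High Quality hit
# can flip a close result; these are NOT measured numbers.
nv_quality_fps = 60.0   # Nvidia card benched at its default "Quality" setting
ati_default_fps = 56.0  # ATI card benched at its default (already HQ) setting

for penalty in (0.03, 0.08, 0.15):
    nv_hq_fps = nv_quality_fps * (1 - penalty)
    leader = "NV ahead" if nv_hq_fps > ati_default_fps else "ATI ahead"
    print(f"{penalty:.0%} hit: NV {nv_hq_fps:.1f} fps vs ATI {ati_default_fps:.1f} fps ({leader})")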


Would people find it acceptable if ATI began lowering their "default" driver settings to 'Performance' if it gave them a 20% boost over Nvidia, and suddenly the X1950XTX destroyed the 7950GX2?

Also, if you are able to include minimum framerates in these reviews, it would help immensely!
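By minimum frames I mean something as simple as this - a rough sketch assuming the benchmark tool can dump per-frame times in milliseconds (FRAPS-style); the sample data is made up:

# Rough sketch: average and minimum FPS from a per-frame frametime log.
# Assumes the benchmark tool dumps frame times in milliseconds; sample data is made up.
frametimes_ms = [16.7, 16.9, 17.1, 45.0, 16.8, 16.6, 33.0, 16.7]

avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)
min_fps = 1000.0 / max(frametimes_ms)  # the single slowest frame
print(f"average: {avg_fps:.1f} fps, minimum: {min_fps:.1f} fps")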

Thanks
 

OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
Amen, there needs to be a more standardized method for reviewing different graphics cards. At the moment, it's hard to really know what kind of difference there is because of all the varying driver settings and optimizations.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
I thoroughly agree --- with the kind of horsepower a 7800/7900 or X1800/X1900 card provides, I can't understand why you wouldn't want to run them in High Quality.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
Getting this "fair" is impossible to force on both NV and ATI, unless the government is going to step in and force it.
And video cards are just not that big of a deal to worry about Quality settings to warrant that.

Short of government intervention, this will never happen nor will these things be judged "fairly" (whoever feels that they have the right to decide what that is).






What I would suggest instead of this "test NV on HQ" suggestion.. is to use default settings on both cards, as AT does today.. but include more IQ difference information at the end of each article.


No sense in caring if ATI drops their default quality to "Low Quality" and NV moves theirs up to "Uber High Quality"... let them.
Just make the note in the reviews if the default quality is worse on NV or ATI.


Otherwise, you'll play the hardware vendor settings-changing game constantly, only furthering confusion with readers.. beyond only the most intensive spectators that are way into this sort of thing (like yourself).
 

MrWizzard

Platinum Member
Mar 24, 2002
2,493
0
71
I wouldn't worry about such a small setting. If it causes Nvidia to be behind in all places, so be it.

I would still get an Nvidia card because I like their drivers better.

ATI's are good most of the time, but I have had many more problems with ATI than Nvidia.

And Crusader made a good point too.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Crusader
Getting this "fair" is impossible to force on both NV and ATI, unless the government is going to step in and force it.
And video cards are just not that big of a deal to worry about Quality settings to warrant that.

Short of government intervention, this will never happen nor will these things be judged "fairly" (whoever feels that they have the right to decide what that is).

That was one of the weirdest, most out-there non sequiturs I've heard in a while.

You simply go into the drivers and move the slider all the way to the right, from Quality to High Quality. Is that so hard, Crusader??

The point is: why would you run only "Quality" when you have a super-powerful GPU? Why compromise your graphics in games?

-------------------

What I would suggest instead of this "test NV on HQ" suggestion.. is to use default settings on both cards, as AT does today.. but include more IQ difference information at the end of each article.

That's a possibility - another is to start off with defaults in both sets of drivers and then move on to tests using High Quality for everything. No need to single out Nvidia - run all cards on HQ in the High Quality section.

No sense in caring if ATI drops their default quality to "Low Quality" and NV moves theirs up to "Uber High Quality"... let them.
Just make the note in the reviews if the default quality is worse on NV or ATI.

Non sequitur #2. Also, I think it would be a pretty big deal if either company defaulted to low quality. NV cheats pretty bad on some games -- Far Cry looks like s#!t for me on low quality!


Otherwise, you'll play the hardware vendor settings-changing game constantly, only furthering confusion with readers.. beyond only the most intensive spectators that are way into this sort of thing (like yourself).

Again - go into the driver, set the slider bar to High Quality. That is a universal setting -- all games now run in HQ mode. Problem solved.

Some people just try to make things so difficult!
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Either hit the ceiling with the settings or stay on the ground with them. Only when we bench at the limits of what each card can provide do we have an accurate comparison.

If ATI shipped their cards with a default driver setting that produced excessive shimmering, texture crawling, etc. with all optimizations on, this wouldn't need to happen. That, or if Nvidia actually cared about what the picture looks like, they would give the buyer the best out-of-the-box picture quality they can deliver. Unfortunately, Nvidia's cards can take a performance hit at their higher IQ, but once it's reached, the only thing holding it back from being unmatched is angle-independent AF IMO (and warmer colors).

I agree that Anandtech's benches always seem to lack driver details (settings for each card, optimizations on/off, etc.), but most other benches I see are missing other things, if not the same ones.
Getting this "fair" is impossible to force on both NV and ATI, unless the government is going to step in and force it.
You're right, switching some slider bars on a computer would be more difficult than having the government enforce it............err..........wait..............
And video cards are just not that big of a deal to worry about Quality settings to warrant that.
With periodic image quality comparisons from websites and consumer requests for equal IQ settings, it is hard to ignore the fact that there are differences between the default image quality that ATI and Nvidia provide. You are correct, though, in that the government isn't going to get involved. I don't know why you thought they would have to in the first place.
What I would suggest instead of this "test NV on HQ" suggestion.. is to use default settings on both cards, as AT does today.. but include more IQ difference information at the end of each article.
The idea is to eliminate the bias in a review and let the numbers and settings speak for themselves, not leave it to an individual to decide which card has the better IQ. This isn't a call for IQ discussions, but for frame rates at matching driver settings and for how those frames represent an image quality that is either unmatched or slipping by to milk more performance.
No sense in caring if ATI drops their default quality to "Low Quality" and NV moves theirs up to "Uber High Quality"... let them.
Yeah, inaccuracy FTW. :thumbsdown:
Otherwise, you'll play the hardware vendor settings-changing game constantly, only furthering confusion with readers...
The only game that will be getting played is the game being benched. If the driver settings match, i.e. HQ vs. HQ, both with optimizations off, there is no room for error in a synthetic benchmark.
... beyond only the most intensive spectators that are way into this sort of thing (like yourself).
The same argument could be made to someone who is blind (like yourself).
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,393
8,552
126
ideally they'd do a full image quality comparo, testing all the way from craptastic with max optimizations to super high quality, and then compare that against a software renderer that renders each scene perfectly (does MS provide such a thing for D3D?), give framerates for each card, still images with mouseovers for each (pointing out areas where what the card renders is different from the software) and movies for each (the software one doesn't need to be in real time).
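for what it's worth, the DirectX SDK does include a software reference rasterizer (the REF device), though it runs at a crawl. the per-pixel diff part is the easy bit - a rough Python sketch using the Pillow imaging library, with hypothetical filenames and assuming both screenshots capture the exact same frame at the same resolution:

# Rough sketch of the per-pixel comparison step, assuming you already have
# two screenshots of the exact same frame: one from the card, one from a
# software/reference renderer. Filenames are hypothetical.
from PIL import Image, ImageChops

card = Image.open("card_frame.png").convert("RGB")
reference = Image.open("reference_frame.png").convert("RGB")

diff = ImageChops.difference(card, reference)  # absolute per-channel difference

# Count pixels where any channel differs by more than a small threshold,
# so trivial rounding noise isn't flagged as a rendering error.
threshold = 8
mismatched = sum(1 for px in diff.getdata() if max(px) > threshold)
total = diff.width * diff.height

print(f"{mismatched / total:.2%} of pixels differ noticeably")
diff.save("diff_mask.png")  # brighter areas = bigger deviation from the reference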
 

450R

Senior member
Feb 22, 2005
319
0
0
I agree that all hardware should be on even ground, as much as is possible. Why don't you guys join forces (regardless of brand preference) and start your own site? I'd think there are enough of you here to spread the time/money around to make it happen.
 

BDawg

Lifer
Oct 31, 2000
11,631
2
0
Too bad the Anandtech Article discussion feedback forum was removed. This thread would've been a perfect candidate.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,393
8,552
126
Originally posted by: BDawg
Too bad the Anandtech Article discussion feedback forum was removed. This thread would've been a perfect candidate.

there is article feedback built into the website


here is what Derek Wilson wrote when i posed the driver settings question:
Drivers were run with default quality settings.

Default driver settings between ATI and NVIDIA are generally comparable from an image quality stand point unless shimmering or banding is noticed due to trilinear/anisotropic optimizations. None of the games we tested displayed any such issues during our testing.

At the same time, during our Quad SLI followup we would like to include a series of tests run at the highest possible quality settings for both ATI and NVIDIA -- which would put ATI ahead of NVIDIA in terms of Anisotropic filtering or in chuck patch cases and NVIDIA ahead of ATI in terms of adaptive/transparency AA (which is actually degraded by their gamma correction).


If you have any suggestions on different settings to compare, we are more than willing to run some tests and see what happens.

Thanks,
Derek Wilson
 

Ackmed

Diamond Member
Oct 1, 2003
8,498
560
126
While I would love reviewers to compare cards more directly, most won't. NV suggests that they use the default driver settings, which sadly have worse IQ than ATi's defaults.

Originally posted by: jiffylube1024
I thoroughly agree --- with the kind of horsepower a 7800/7900 or X1800/X1900 card provides, I can't understand why you wouldn't want to run them in High Quality.

NV doesn't want them to, because it hurts their performance. In reviews that do, you can tell a rather large difference.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Ackmed
While I would love reviewers to compare cards more directly, most won't. NV suggests that they use the default driver settings, which sadly have worse IQ than ATi's defaults.

Originally posted by: jiffylube1024
I thoroughly agree --- with the kind of horsepower a 7800/7900 or X1800/X1900 card provides, I can't understand why you wouldn't want to run them in High Quality.

NV doesn't want them to, because it hurts their performance. In reviews that do, you can tell a rather large difference.

I know -- reviewers should make Nvidia pay by exposing this fact. It's as simple as that!

Before, we had tards like Rollo coming up with excuses from every region of their arses for why not to do this. Quite frankly, I can't think of a reason more sites don't review in HQ and illustrate the differences, other than NV most likely leveraging their power to prevent this from happening. And now that AMD has ATI in their boat, they should use their power to turn things around.

Originally posted by: ElFenix
ideally they'd do a full image quality comparo, testing all the way from craptastic with max optimizations to super high quality, and then compare that against a software renderer that renders each scene perfectly (does MS provide such a thing for D3D?), give framerates for each card, still images with mouseovers for each (pointing out areas where what the card renders is different from the software) and movies for each (the software one doesn't need to be in real time).

Ideally - you're right, ElFenix. However, time is limited, so they could do what you propose maybe once a year, tops. I think running the tests once on default quality and then once on high quality is fair/doable. Or just HQ for the top cards.

The main thing is the premium cards. If I'm buying a $400 card or, god forbid, a $500+ Crossfire/SLI setup, then I want to see some high quality benchmarks because those are the settings I would use!
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
This is why I give more credibility to sites that test all cards in HQ driver settings. Unfortunately, AT is not one of them.
 

quattro1

Member
Jan 13, 2005
111
0
0
Why do people think HQ on both brands are the same? Why do people assume they are?

Is all 91 octane fuel the same?
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: quattro1
Why do people think HQ on both brands are the same? Why do people assume they are?

Is all 91 octane fuel the same?

So what then is your point? Nvidia's default drivers shimmer, look like crap in some (many?) games. ATI's don't. Should NV be on HQ, while ATI simply on Quality? The other way around? HQ vs HQ is the best for both cards; when benchmarking the top cards, most people would like to get the best quality their cards are capable of.

Asking a rhetorical question is nice, but take it one step further and make a statement!
 

Caecus Veritas

Senior member
Mar 20, 2006
547
0
0
Originally posted by: Crusader
Getting this "fair" is impossible to force on both NV and ATI, unless the government is going to step in and force it.
And video cards are just not that big of a deal to worry about Quality settings to warrant that.

Short of government intervention, this will never happen nor will these things be judged "fairly" (whoever feels that they have the right to decide what that is).

What I would suggest instead of this "test NV on HQ" suggestion.. is to use default settings on both cards, as AT does today.. but include more IQ difference information at the end of each article.

No sense in caring if ATI drops their default quality to "Low Quality" and NV moves theirs up to "Uber High Quality"... let them.
Just make the note in the reviews if the default quality is worse on NV or ATI.

Otherwise, you'll play the hardware vendor settings-changing game constantly, only furthering confusion with readers.. beyond only the most intensive spectators that are way into this sort of thing (like yourself).


Hello Crusader.

i think you're missing the big main point.. *blink* *blink* :light: the reason for all these reviews is not to be "fair" to one company over another or to prove one company is holier than another, as you so often try to do. the whole point is to find the products that best suit the consumer - us (in this case high-end consumers). nothing else.

and i'm gonna continue on the assumption that you have no other agendas on your mind. the reason why i read reviews is to find a card that best suits my taste. and what do i want when shopping for a top-tier video card? i want the highest IQ with no compromise at playable speeds. why else would i sink $1,000 into just the video cards? if i buy a card that costs an arm, a leg, and an organ, i fully expect to play at the highest IQ. so when i read the reviews, i couldn't care less how they compare to another card at various IQs. what i as a consumer want to know is - how does the card perform at its own highest settings? and how does the resulting IQ compare to another high-end card?

as the OP pointed out, a lot of these reviews become useless because - if i bought a 7950GX2 (a top-tier card) i expect and will be playing at the highest IQ, and therefore i need to know how these cards will perform at those settings... doing reviews on nvidia cards with 'default' settings may or may not be fair to ati, but in no way is it informative or beneficial to the high-end consumers. the same goes for ati cards as well. these reviews of high-end video cards should include tests at each of their own maximum settings at high resolutions..

cheers.


EDIT: Someone please tell me what FTW and FTL stand for... can't figure it out!! doh! :eek: oh and also YAGT YAC.. variations at off-topic too..


 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
If a review site is going to bother running the tests on some of the best hardware around (e.g. a 6800 Conroe, ASUS P5W DH 975X, 7900GTX/SLI, X1950XTX/X1900XTX/CF, etc.), then they should also bother to explore the best settings for that hardware. The whole reason they bench things at high resolutions is to see how strong the core is and what kind of performance you can expect in the areas where high-end users demand it. Setting the driver to HQ without optimizations is the same principle.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Most people don't use HQ I guess?

But, using HQ would make it a more controlled test, if a less practical one. OTOH it isn't all that impractical, 'cause if my card could push HQ 1280x1024 at x settings with 75 FPS, I'd probably use it.
 

schneiderguy

Lifer
Jun 26, 2006
10,801
91
91
Originally posted by: jiffylube1024
Originally posted by: quattro1
Why do people think HQ on both brands are the same? Why do people assume they are?

Is all 91 octane fuel the same?

So what then is your point? Nvidia's default drivers shimmer, look like crap in some (many?) games. ATI's don't. Should NV be on HQ, while ATI simply on Quality? The other way around? HQ vs HQ is the best for both cards; when benchmarking the top cards, most people would like to get the best quality their cards are capable of.

Asking a rhetorical question is nice, but take it one step further and make a statement!
ATI cards shimmer on default IQ settings
 

quattro1

Member
Jan 13, 2005
111
0
0
Originally posted by: jiffylube1024
Originally posted by: quattro1
Why do people think HQ on both brands are the same? Why do people assume they are?

Is all 91 octane fuel the same?

So what then is your point? Nvidia's default drivers shimmer, look like crap in some (many?) games. ATI's don't. Should NV be on HQ, while ATI simply on Quality? The other way around? HQ vs HQ is the best for both cards; when benchmarking the top cards, most people would like to get the best quality their cards are capable of.

Asking a rhetorical question is nice, but take it one step further and make a statement!


Ok, say you do get HQ tested on both cards, then what? What's to say the perf being measured is apples to apples, as the original post was sort of asking for?

Heck, if you are trying to get the "best" of each card tested, why not crank up the AA to the max that each card can do? I know 4x isn't the "best" each card can do...

I don't know why AT leaves it at the default settings, but testing at HQ won't be the "end all" of apples-to-apples benchmarking that this post is trying to get at. Too many variables in each driver.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Very nice post, my Latin-named friend. By the way, FTW means "For the Win" and is one of the latest expressions people spam everywhere on the net. FTL is the opposite - "For the Loss" -- I don't hear that one much. Usually if someone (let's say Jimmy) posts something another person agrees with, that person will proclaim: "Jimmy FTW!!!"

And before you ask, OTOH (had to look it up myself) stands for "on the other hand" (I usually just type stuff like that out; though I'm a big fan of AFAIK - "as far as I know" ).

Cheers! :beer:
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: quattro1
Originally posted by: jiffylube1024
Originally posted by: quattro1
Why do people think HQ on both brands are the same? Why do people assume they are?

Is all 91 octane fuel the same?

So what then is your point? Nvidia's default drivers shimmer, look like crap in some (many?) games. ATI's don't. Should NV be on HQ, while ATI simply on Quality? The other way around? HQ vs HQ is the best for both cards; when benchmarking the top cards, most people would like to get the best quality their cards are capable of.

Asking a rhetorical question is nice, but take it one step further and make a statement!


Ok, say you do get HQ tested on both cards, then what? What's to say the perf being measured is apples to apples, as the original post was sort of asking for?

Heck, if you are trying to get the "best" of each card tested, why not crank up the AA to the max that each card can do? I know 4x isn't the "best" each card can do...

I don't know why AT leaves it at the default settings, but testing at HQ won't be the "end all" of apples-to-apples benchmarking that this post is trying to get at. Too many variables in each driver.

It is apples to apples in the sense that it is the best rendering quality both cards can do, AF and AA excluded. It can never be truly apples to apples because the cards and drivers are different, but the point is that you are setting the cards to the best quality they are capable of.

Once you get into AA and AF, that's a separate section, which again has its own caveats (such as both companies not being able to do AAA, TrAA, HQ AF, etc.), but the point is to have at least some sort of realistic baseline for performance, and on a high-end card, default isn't it.

-----------

And for what you are talking about, the so-called "best playable" settings that each card is capable of, check HardOCP.