To AnandTech: What is your reasoning behind not using HQ settings on both cards?


josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: quattro1
Ok, say you do get HQ tested on both cards, then what? What's to say the perf being measured is apples to apples, as the original post was sort of asking for?

Heck, if you are trying to get the "best" of each card tested, why not crank up the AA to the max that each card can do? I know 4x isn't the "best" each card can do...

I don't know why AT leaves it at the default settings, but testing at HQ won't be the "end-all" of apples-to-apples benchmarking that this post is trying to get at. Too many variables in each driver.

4xAA is the only eye-friendly level that is shared by both Nvidia and ATI. Nvidia can't do 6xAA, nor can ATI do 8xS AA. However, the optimizations in both drivers are much more similar than the differing AA/AF modes. Keeping the AA at an equal setting (4x), the AF at an equal setting (16x), and the driver level and optimizations at an equal setting (HQ with optimizations off) will provide more accurate results than mismatching one area and displaying the frames per second.

EDIT: The game's video settings should, of course, also be maxed out for that kind of comparison. If one wishes to run a mid-level bench, the game's video settings should be set to a "Medium" or equivalent standard, and the driver settings should then be reduced to "Quality" with the default optimizations for each card. AA should still range from 0x to 4x and AF between 8x and 16x.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: munky
This is why I give more credibility to sites that test all cards in HQ driver settings. Unfortunately, AT is not one of them.


This is getting just silly. Why doesn't Nvidia just make HQ the default?

Personally, I thought the review sucked because they only used resolutions that I never use. I would have liked some stuff at lower resolutions with AA and AF at high levels. Some image quality comparisons would also have been nice. Otherwise it was entertaining and I did learn some stuff. Outside of the resolutions used, it was a pretty good job by AnandTech mob standards. ;)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Wow, look at those Quake 4 scores - 52 FPS X1950 XTX vs 38.2 FPS 7900 GTX.

Now take off another 10%-15% from the 7900 GTX for HQ and the results are even more amazing. :Q
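
As a rough illustration of what that deduction would imply (a sketch only: the 10%-15% HQ penalty is the estimate quoted above, and the figures below are just arithmetic on the posted scores):

# Back-of-the-envelope estimate of the Quake 4 gap if the 7900 GTX were run at HQ.
# The 10%-15% penalty is the estimate quoted above, not a measured value.
x1950_xtx_fps = 52.0
gtx_7900_default_fps = 38.2

for penalty in (0.10, 0.15):
    gtx_hq_fps = gtx_7900_default_fps * (1 - penalty)
    lead_pct = (x1950_xtx_fps - gtx_hq_fps) / gtx_hq_fps * 100
    print(f"{penalty:.0%} penalty: 7900 GTX ~{gtx_hq_fps:.1f} FPS, "
          f"X1950 XTX lead ~{lead_pct:.0f}%")
# Roughly: 34.4 FPS (~51% lead) at a 10% penalty, 32.5 FPS (~60% lead) at 15%.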
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Well I get 11% but 3dcenter gets about 30%. I think 10%-15% is more realistic though.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
4,227
2
0
Originally posted by: BFG10K
Wow, look at those Quake 4 scores - 52 FPS X1950 XTX vs 38.2 FPS 7900 GTX.

Now take off another 10%-15% from the 7900 GTX for HQ and the results are even more amazing. :Q

Hm, I was skeptical, but HardwareZone shows similar results:

@ 1600x1200 4x AA 16x AF

X1950 XTX - 81.6 (OMG?)
X1900 XTX - 62
7950 GX2 - 82
7900 GTX - 59

22 fps lead in an opengl game? Who saw that coming? Seems like Q4 really takes advantage of the extra bandwidth... maybe future titles will as well

Edit - Err.. this is weird.. most if not all of the other sites show a minor lead in Q4... These 2 are the only ones showing such a lead... huh... :confused:
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Seems like Q4 really takes advantage of the extra bandwidth...
It's more likely to be the 6.8 drivers; many reviews are still using the 6.7 drivers so those don't show the large performance gain in OpenGL.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Nvidia's default drivers shimmer, look like crap in some (many?) games. ATI's don't.

At default settings, ATi's drivers shimmer and there is a noticeable degradation in IQ. At the highest settings ATi wins hands down in texture filtering, nV in AA. Which is more important to you depends on your perspective.
 

yacoub

Golden Member
May 24, 2005
1,991
14
81
Eventually you guys will realize you just need to completely ignore anything Crusader says. I don't think he's ever made a post that makes much sense; most were construed as trolling, but it later turned out he's just pretty incompetent.
 

Sonikku

Lifer
Jun 23, 2005
15,883
4,882
136
Originally posted by: ronnn
Originally posted by: munky
This is why I give more credibility to sites that test all cards in HQ driver settings. Unfortunately, AT is not one of them.


This is getting just silly. Why doesn't Nvidia just make HQ the default?

Marketing? Better looking benches?
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Originally posted by: quattro1
Why do people think HQ on both brands is the same? Why do people assume it is?

Is all 91 octane fuel the same?

Yes, right.

IMHO, the driver settings should be set manually, setting by setting, so as to be as close as possible, and then disclosed in full detail before each review. Names like Q and HQ don't mean a damn thing. All optimizations that degrade IQ should be switched off for both companies' cards.
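
As a sketch of the kind of per-card disclosure being asked for here, a review could print a settings table like the one below next to each benchmark (the option names and values are hypothetical placeholders, not the exact control-panel wording):

# Hypothetical per-card driver-settings disclosure for a review.
# Option names and values are illustrative placeholders only.
driver_settings = {
    "NVIDIA 7900 GTX": {
        "texture filtering": "High Quality",
        "trilinear optimization": "Off",
        "anisotropic sample optimization": "Off",
        "antialiasing": "4x",
        "anisotropic filtering": "16x",
    },
    "ATI X1950 XTX": {
        "Catalyst A.I.": "Off",
        "high quality AF": "On",
        "antialiasing": "4x",
        "anisotropic filtering": "16x",
    },
}

# Print the disclosure table that would accompany the benchmark numbers.
for card, settings in driver_settings.items():
    print(card)
    for option, value in settings.items():
        print(f"  {option}: {value}")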
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Originally posted by: jiffylube1024
Originally posted by: quattro1
Originally posted by: jiffylube1024
Originally posted by: quattro1
Why do people think HQ on both brands is the same? Why do people assume it is?

Is all 91 octane fuel the same?

So what then is your point? Nvidia's default drivers shimmer, look like crap in some (many?) games. ATI's don't. Should NV be on HQ, while ATI simply on Quality? The other way around? HQ vs HQ is the best for both cards; when benchmarking the top cards, most people would like to get the best quality their cards are capable of.

Asking a rhetorical question is nice, but take it one step further and make a statement!


Ok, say you do get HQ tested on both cards, then what? What's to say the perf being measured is apples to apples, as the original post was sort of asking for?

Heck, if you are trying to get the "best" of each card tested, why not crank up the AA to the max that each card can do? I know 4x isn't the "best" each card can do...

I don't know why AT leaves it at the default settings, but testing at HQ won't be the "end-all" of apples-to-apples benchmarking that this post is trying to get at. Too many variables in each driver.

It is apples to apples in the sense that it is the best rendering quality both cards can do, AF and AA excluded. It can never be true apples to apples because the cards and drivers are different, but the point is that you are setting the cards to their best capable quality.

Once you get into AA and AF, that's a separate section, which again has its own caveats (such as both companies not being able to do AAA, TrAA, HQ AF, etc.), but the point is to have at least some sort of realistic baseline for performance, and on a high-end card, default isn't it.

-----------

And for what you are talking about, the so-called "best playable" that each card is capable of, check HardOCP.

Best Render Quality is also a good way to do these benchmarks, but they should then be accompanied by a detailed write-up about IQ for each card. If, for example, card A has 10% better performance than card B but card B has better IQ, assuming both cards are at Best Render Quality, I'd want to know about that.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Originally posted by: BenSkywalker
Nvidia's default drivers shimmer, look like crap in some (many?) games. ATI's don't.

At default settings, ATi's drivers shimmer and there is a noticeable degradation in IQ. At the highest settings ATi wins hands down in texture filtering, nV in AA. Which is more important to you depends on your perspective.

That's why these "presets" are not very useful. I think reviewers should either use individual "expert" settings to equalize IQ against the baseline of the card with the worse maximum IQ, or run each card at its best render quality with detailed notes on what the differences in IQ are.
 

imported_Crusader

Senior member
Feb 12, 2006
899
0
0
This place is hilarious.

The IQ between ATI and NV at default settings doesn't result in ATI having superior image quality in all areas. Look at transAA and AA quality, for example, where Nvidia is superior.


Basically you guys are wasting your breath, because:

You suggest pushing both cards to max IQ, then benching. Right?
Then comparing FPS scores and IQ. Right?




How is this any better than leaving the cards on default settings (which 90% of people actually use), then benching?
Then comparing FPS scores and IQ.



Make sense for the thick skulls around here yet?
If ATI has such superior default quality, then review sites would be remarking on it and Nvidia would be working to increase theirs to the level necessary to compete. Instead, NV is superior in some areas (at default settings) and ATI is superior in others (at default settings).
Those are the facts boys.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,393
8,552
126
Originally posted by: Crusader
This place is hilarious.

The IQ between ATI and NV at default settings doesn't result in ATI having superior image quality in all areas. Look at transAA and AA quality, for example, where Nvidia is superior.


Basically you guys are wasting your breath, because:

You suggest pushing both cards to max IQ, then benching. Right?
Then comparing FPS scores and IQ. Right?




How is this any better than leaving the cards on default settings (which 90% of people actually use), then benching?
Then comparing FPS scores and IQ.



Make sense for the thick skulls around here yet?
If ATI has such superior default quality, then review sites would be remarking on it and Nvidia would be working to increase theirs to the level necessary to compete. Instead, NV is superior in some areas (at default settings) and ATI is superior in others (at default settings).
Those are the facts boys.
image quality is far more intensive to determine, imho, which is why most places don't look at it. for a benchmark you can set up a batch test that you can leave running and go do something else. for image quality, you have to compare not just screenshots but motion too, and you have to sit there and watch, and it'd be best to have the two side by side. far more intensive.

what we don't understand is why, if they're going to benchmark $700 graphics setups at 1920x1080+ resolutions and turn up all the in-game IQ settings, they don't then bother to turn up the driver IQ settings as well. seems like getting a ferrari with a fiat engine in it.
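
A minimal sketch of the kind of unattended batch test described above, assuming a game that exposes a timedemo-style benchmark and writes its result to a log (the executable path, flags, demo name, and log format below are hypothetical; substitute whatever hooks the game actually provides):

# Minimal unattended benchmark batch-runner sketch.
import re
import statistics
import subprocess

GAME = r"C:\Games\Quake4\Quake4.exe"          # hypothetical install path
ARGS = ["+timedemo", "demo1.demo", "+quit"]   # hypothetical benchmark flags
LOG = r"C:\Games\Quake4\benchmark.log"        # hypothetical log location
RUNS = 3

results = []
for _ in range(RUNS):
    subprocess.run([GAME, *ARGS], check=True)  # run one timedemo pass and wait for exit
    with open(LOG) as f:
        # Assume the game appends a line like "... 85.3 fps" after each run.
        fps_values = re.findall(r"([\d.]+)\s*fps", f.read(), re.IGNORECASE)
    results.append(float(fps_values[-1]))      # keep the most recent result

print(f"Average over {RUNS} runs: {statistics.mean(results):.1f} FPS")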
 

josh6079

Diamond Member
Mar 17, 2006
3,261
0
0
Originally posted by: Crusader
This place is hilarious.

The IQ between ATI and NV at default settings doesn't result in ATI having superior image quality in all areas. Look at transAA and AA quality, for example, where Nvidia is superior.
I was under the impression that neither company has any form of AA on alpha textures at default.
Basically you guys are wasting your breath, because:

You suggest pushing both cards to max IQ, then benching. Right?
Then comparing FPS scores and IQ. Right?
That is correct.
How is this any better than leaving the cards on default settings (which 90% of people actually use), then benching?
Then comparing FPS scores and IQ.
Because it isn't to the limit of the cards. When benchmarking, we are attempting to find the boundaries of a GPU, where its weaknesses are, how much eye candy you can have with how many frames per second, etc. It would be fine if they did both Quality and High Quality benches, but leaving the most intensive bench results on settings that are not the most intensive for the cards can be misleading if you are paying almost half a grand on video cards. Also, can you prove that 90% of the people with an X1900 CF, 7950GX2, 7900GTX SLI, 7900GT SLI, X1900XTs, etc. are NOT using High Quality?
Make sense for the thick skulls around here yet?
This is coming from an individual who tends to only see things in a green tint. How many times have you disregarded competitive technology just because you owned what it was competing against? I understand that the thought of pushing your card to the limit to see how it compares is something that only thick headed idiots would want to do. :roll:
If ATI has such superior default quality, then review sites would be remarking on it and Nvidia would be working to increase theirs to the level necessary to compete. Instead, NV is superior in some areas (at default settings) and ATI is superior in others (at default settings).
Most review sites only display frames-per-second scores to measure performance against other cards, since there can be several cards with the same feature set but different performance (i.e. 7800GT vs. 7800GTX, X1800XT vs. X1900XT, etc.). After testing multiple games and multiple cards, constantly uninstalling and installing drivers, posting results, etc., an image quality comparison such as FiringSquad's would simply be too much of a review to do. The least we're asking is to bench $800~$1000 systems at their highest settings, so that those of us who believe in moderation will have some better guidelines. Why are you so hesitant to bench things at their utmost capability?
Those are the facts boys.
Says who? You? :laugh:
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
For some reason, I have never had or noticed shimmering with my 6600gt. Also, I've seen screenshots comparing Quality and HQ on Nvidia, and there was no difference.

All those who say there is...I think it's some mental idea you make yourself believe. Of course, if you run AA/AF as well, then you may see a difference.

IMO, games look great on Quality for Nvidia... I doubt the IQ difference, with no AA/AF, between ATI on HQ and Nvidia on Q is very big... if even noticeable.
 

Ulfhednar

Golden Member
Jun 24, 2006
1,031
0
0
We should call NASA and have them analyse ATI and Nvidia HQ image quality with Hubble so that the NSA and Jack Bauer can get around to enforcing it on review sites or whatever. :confused:
 

tvdang7

Platinum Member
Jun 4, 2005
2,242
5
81
I've never done a comparison, but is there an image quality difference between High Quality and Quality?

Can anyone show me some pics?
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Crusader
This place is hilarious.

The IQ between ATI and NV at default settings doesn't result in ATI having superior image quality in all areas. Look at transAA and AA quality, for example, where Nvidia is superior.

How is this any better than leaving the cards on default settings (which 90% of people actually use), then benching?
Then comparing FPS scores and IQ.

You're adding in factors to complicate things. Take a step back and forget about AA, HQ AF and all the other goodies. What we're saying is:

When benchmarking expensive cards, go into the drivers and set filtering quality from "Quality" to "High Quality" for both cards. That's your starting point - benchmark like that first.

Then turn on AA/AF and benchmark again. From that point, you can benchmark specific things, like transAA, HQ AF, etc.

Make sense for the thick skulls around here yet?
If ATI has such superior default quality, then review sites would be remarking on it and Nvidia would be working to increase theirs to the level necessary to compete. Instead, NV is superior in some areas (at default settings) and ATI is superior in others (at default settings).
Those are the facts boys.

No, that's your opinion, and you're using fallacious arguments to support it.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
Originally posted by: hans030390
For some reason, I have never had or noticed shimmering with my 6600gt. Also, I've seen screenshots comparing Quality and HQ on Nvidia, and there was no difference.

All those who say there is...I think it's some mental idea you make yourself believe. Of course, if you run AA/AF as well, then you may see a difference.

IMO, games look great on Quality for Nvidia... I doubt the IQ difference, with no AA/AF, between ATI on HQ and Nvidia on Q is very big... if even noticeable.


But does it have SM3.0? Oops, I mean HDR with AA. :D
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
For some reason, I have never had or noticed shimmering with my 6600gt.
That's a totally different generation; nVidia took a dump on image quality on the 7xxx series.

Also, I've seen screenshots comparing Quality and HQ on Nvidia, and there was no difference.
Screenshots are useless for that sort of thing.
 

Ulfhednar

Golden Member
Jun 24, 2006
1,031
0
0
For some reason, I have never had or noticed shimmering with my 6600gt.
I noticed it heavily in Counter-Strike: Source on my 6600GT; it was a bit worse on my 6800GT, though. I've not used an Nvidia card in my primary gaming machine since.