Originally posted by: quattro1
Ok, say you do get HQ tested on both cards, then what? What's to say the performance being measured is apples to apples, as the original post was sort of asking for?
Heck, if you're trying to get the "best" of each card tested, why not crank the AA up to the max each card can do? I know 4x isn't the "best" either card can do...
I don't know why AT leaves it at the default settings, but testing at HQ won't be the "end all" of apples-to-apples benchmarking that this post is trying to get at. There are too many variables in each driver.
4xAA is the only eye-friendly level shared by both Nvidia and ATI: Nvidia can't do 6xAA, and ATI can't do 8xS AA. However, the optimizations in both drivers are much more comparable than the differing AA/AF modes. Keeping the AA at an equal setting (4x), the AF at an equal setting (16x), and the driver quality level and optimizations at an equal setting (HQ with optimizations off) will provide more accurate results than mismatching one of those areas and just comparing the frames per second.
EDIT: The game's video settings should, of course, also be maxed out for that kind of comparison. If one wishes to run a mid-level benchmark instead, the game's video settings should be set to "Medium" or an equivalent standard, and the driver settings should then be reduced to "Quality" with the default optimizations for each card. AA should still be kept equal on both cards, somewhere between 0x and 4x, and AF between 8x and 16x.
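To lay the two profiles out side by side, here's a rough sketch; the structure and names are just my own illustration for clarity, not any real benchmarking tool's config format:

# Sketch of the two benchmark profiles described above (illustrative only).
BENCH_PROFILES = {
    "high_end": {
        "game_video_settings": "maximum",
        "driver_quality": "High Quality, optimizations off",
        "aa": "4x",        # only eye-friendly level both vendors share
        "af": "16x",
    },
    "mid_level": {
        "game_video_settings": "Medium (or equivalent)",
        "driver_quality": "Quality, default optimizations on",
        "aa": "0x-4x",     # still set equal on both cards
        "af": "8x-16x",
    },
}

if __name__ == "__main__":
    for name, settings in BENCH_PROFILES.items():
        print(name, settings)

The point either way is the same: whatever level you pick, both cards get identical game, driver, AA, and AF settings.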