Anyone know of benchies with equal quality settings for Nvidia and ATI's latest?

Polish3d

Diamond Member
Jul 6, 2005
I've read about this before, but I noticed that the XTX I have does not enable the trilinear and anisotropic filtering optimizations by default, whereas on my GTX they are enabled by default via the "Quality" option in the Nvidia control panel. I remember this article/thread by BFG10k, which noted a really sizeable performance drop (an average of about 15-20%) when going from Q to HQ on Nvidia...

If that's still the case with today's latest cards, that's a pretty serious difference.

Does anyone have any info/benchmarks that show whether or not there is a similar difference with the 7900 series? Thanks
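
Just to put that 15-20% figure in perspective, here's some rough arithmetic (the 120 fps baseline is purely a made-up example, not a measured number):

# Hypothetical example: what a 15-20% Q -> HQ drop does to an arbitrary baseline
q_fps = 120.0  # example "Quality" score, not from any review
for drop in (0.15, 0.20):
    print(f"{drop:.0%} drop: {q_fps * (1 - drop):.0f} fps at HQ")
# prints: 15% drop: 102 fps at HQ
#         20% drop: 96 fps at HQ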
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
http://www.legitreviews.com/article/310/5/

Check this out: it's an XFX 7900 GTX OC'd to 700/900, and since they used HQ settings it still gets beaten by the X1900... So yes, there is still a huge difference, and people buying NV products should know this before doing it... Also don't forget the X1900s are kept at stock speeds, so I wonder how badly a 700/900 X1900 XTX would trounce the 7900 :)

For the lazy, here are all the 1600x1200 4xAA results (average fps; a quick percentage breakdown follows the list)

Quake 4

7900 - 119
X1900 - 118

FEAR

7900 - 43
X1900 - 51

Serious Sam 2

7900 - 46
X1900 - 51

(the gap gets bigger with HDR on, plus they didn't test HDR + AA since the 7900 can't do it)

X3

7900 - 57
X1900 - 67

COD 2

7900 - 37
X1900 - 39

Btw, look at how badly the 7800 GTX 512 gets trounced... the XFX 7900 only keeps up because of its huge OC!
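
And here's the percentage breakdown of those numbers, treating each score as average fps (the values are just copied from the list above):

# Percentage lead of the stock X1900 over the OC'd 7900 GTX at 1600x1200 4xAA, HQ settings
results = {
    "Quake 4":       (119, 118),
    "FEAR":          (43, 51),
    "Serious Sam 2": (46, 51),
    "X3":            (57, 67),
    "COD 2":         (37, 39),
}
for game, (gf7900, x1900) in results.items():
    lead = (x1900 - gf7900) / gf7900 * 100
    print(f"{game}: X1900 is {lead:+.1f}% vs the 7900")
# Quake 4 -0.8%, FEAR +18.6%, Serious Sam 2 +10.9%, X3 +17.5%, COD 2 +5.4%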
 

Polish3d

Diamond Member
Jul 6, 2005
Wow, that is a significant difference, but seemingly in line with the percentage drops BFG10k noticed... Yes, I wonder how mine would fare. It certainly seems to be doing really well with this memory overclock; I think my Oblivion frames went up about 5-7 fps outdoors. I am using 2xAA + Temporal AA, 8x HQ AF + full trilinear, HDR at 1680x1050 + all kinds of visual enhancement tweaks, and the game looks great.

To say the XTX handles high-quality image settings better than my GTX would be an understatement.

At 1680x1050, 4xAA, no AF, all BF2 settings on high, I benchmarked an average of 140 fps in Fraps, which is double what my GTX got at the same settings. When I add 16x HQ AF + full trilinear it only drops to about 115 fps (depending on server lag and all that), whereas my GTX would be hitting about 50 fps.
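
Put another way (rough numbers, and the GTX no-AF baseline of ~70 fps is just my "half of 140" estimate):

# Approximate cost of enabling 16x HQ AF + full trilinear on each card (fps from above)
cards = {
    "X1900 XTX": (140, 115),  # 4xAA no AF -> 4xAA + 16x HQ AF
    "7900 GTX":  (70, 50),    # ~70 assumed (half of 140); ~50 observed
}
for card, (no_af, hq_af) in cards.items():
    cost = (no_af - hq_af) / no_af * 100
    print(f"{card}: roughly {cost:.0f}% drop with HQ AF")
# X1900 XTX: roughly 18% drop; 7900 GTX: roughly 29% drop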


Overall I continue to be exceedingly pleased with it... I may stick with ATI for the near future unless G80 turns out to be a good enough bet to switch back.
 

ShadowOfMyself

Diamond Member
Jun 22, 2006
The worst part is that even at HQ, Nvidia cards won't have the same level of IQ as ATI cards... I hope Nvidia catches up with G80 too, both IQ- and performance-wise; if they don't, I see another FX 5800 vs 9700 Pro slaughter coming soon.
 

Polish3d

Diamond Member
Jul 6, 2005
The angle-independent AF is super nice, particularly for BF2 and Oblivion... but the real biggie for me is that the picture sent to my monitor is much sharper on the XTX, and I can see details in games (such as the texturing on medic packs and other things in BF2) that were too blurry to show up on the GTX, which is a real mark against it. It's not just in Windows, either: in my BIOS I can make out each pixel in the lettering on the XTX, but on the GTX it's just slightly blurred white.
 

BFG10K

Lifer
Aug 14, 2000
These guys also run at high quality, but use caution when viewing the results, as they tend to mix and match drivers across cards.