Getting ready for benches. Need a little advice.


Voo

Golden Member
Feb 27, 2009
1,684
0
76
Yeah, no need for synthetic benchmarks, and if I had to decide between 1680x1050 and benchmarks with OC'd cards I'd always choose the latter. I think it'd be much more interesting to see those results than the differences between 1920 and 1680, and it's also more relevant for most people, because who'd buy a high-end GPU and play at such a low resolution?
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,674
146
106
www.neftastic.com
*goes to hide my computer inside my closet*

:D

I'm not saying there aren't people who do it. I'm just saying that for an accurate comparison, I'd like to see benchmarks on a config that 90% of people run.

Because if I don't overclock the CPU, I'll hear a bunch of "Why didn't you o/c?!?! You know that GPU is being held back by your stock processor!!!!!"

Safer to O/C the CPU to lessen the possibility of a CPU bottleneck in a GPU benchmark suite.

I'm running a Core i7 860 at a pretty rock-solid 3.4GHz, HT and Turbo enabled, which I think is probably adequate. Sure, the higher the better, but I want a stable system and I haven't had the greatest of luck o/c'ing CPUs.

As for the overclocking benchmarks mentioned above, I may do a few, but don't count on anything extensive. As it is, I have over 10 titles to bench on a system configured once for Nvidia and once again for ATI (all else the same save the graphics card and hard drive), at two resolutions.
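Just to give a sense of the grind, the run matrix works out to something like this (the titles and exact resolutions here are placeholders, not the final lineup):

```python
# Rough count of the passes described above: 10+ titles, two builds
# (NVIDIA vs. ATI, identical apart from graphics card and hard drive),
# two resolutions. Titles and resolutions below are placeholders.

titles = ["Title %d" % i for i in range(1, 11)]   # "over 10 titles"
configs = ["NVIDIA build", "ATI build"]
resolutions = ["1920x1200", "1680x1050"]

runs = [(t, c, r) for t in titles for c in configs for r in resolutions]
print(len(runs), "benchmark passes before any overclocked re-runs")  # 40
```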

Oh don't get me wrong, I don't care if you overclock. But please take the time to put in some relevant tests for the masses too. Some of us would love to know where the bottlenecks are on the various setups.

I really meant "Sure, run some OC'd tests... but for the love of god put in some stock tests too."
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
I mean seriously, why do reviewers feel the need to benchmark with the absolute top-end system overclocked to hell and back, and think those benchmarks actually mean something to the rest of the world?
I used an E6850 for a long time and a lot of people thought I was CPU limited. Of course that wasn’t the case, and I demonstrated this by underclocking it to 2 GHz and seeing almost no performance change.

I hope my new i5 750 @ stock isn’t too high-end for you. :)
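For anyone curious, the check is dead simple: run the same GPU benchmark at stock and at the underclocked speed and see how far the average FPS moves. A rough sketch of that comparison, with made-up numbers and an arbitrary threshold:

```python
# Compare average FPS at stock vs. underclocked CPU speeds.
# The numbers and the 5% threshold are arbitrary examples, not real results.

def cpu_limited(fps_stock: float, fps_underclocked: float, tolerance: float = 0.05) -> bool:
    """True if dropping the CPU clock cost more than `tolerance` of the frame rate."""
    drop = (fps_stock - fps_underclocked) / fps_stock
    return drop > tolerance

# E6850 at stock vs. the same chip underclocked to 2 GHz (made-up numbers):
print(cpu_limited(61.8, 60.9))  # False -> FPS barely moved, so the GPU is the limit
print(cpu_limited(61.8, 44.0))  # True  -> big drop, so the CPU was holding things back
```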
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I used an E6850 for a long time and a lot of people thought I was CPU limited. Of course that wasn’t the case, and I demonstrated this by underclocking it to 2 GHz and seeing almost no performance change.

I hope my new i5 750 @ stock isn’t too high-end for you. :)

Somebody linked a TomsHardware article for you a few days ago showing some pretty interesting CPU scaling. Any comments on that? Or did they screw up the whole bench method? If you get a chance, take a look at it.

And just so you are aware, I am getting slightly better numbers with a higher CPU clock in certain situations, and that is the reason I'm o/c'd for these benches. I'd set it to 5GHz if I could. Even if it made no difference (which I wholeheartedly believe it would), it doesn't hurt, and it helps ensure the benchmarks stay GPU-limited.
 

1h4x4s3x

Senior member
Mar 5, 2010
287
0
76
BTW, when you launch the Dirt demo in DirectX 11, GF100 automatically switches to DirectX 9 for whatever reason. Since you're benching with the demo, it might be useful information for your 'review'.
:)
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
BTW, when you launch the Dirt demo in DirectX 11, GF100 automatically switches to DirectX 9 for whatever reason. Since you're benching with the demo, it might be useful information for your 'review'.
:)

I won't post it here since it always seems to blow up into a flame fest, but Charlie's latest article is basically calling out Nvidia over that. You guys know the site if you want to read it. :)
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
BTW, when you launch the Dirt demo in DirectX 11, GF100 automatically switches to DirectX 9 for whatever reason. Since you're benching with the demo, it might be useful information for your 'review'.
:)

What about GTX295? Same thing?
I'm guessing this doesn't happen with the full version?
 

1h4x4s3x

Senior member
Mar 5, 2010
287
0
76
Nope, just the demo. The reason could be that the demo simply doesn't recognize GF100, since the demo came out way before Fermi and hasn't gotten any updates since.
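If that guess is right, the demo's renderer selection is probably just a hard-coded device check that predates Fermi. A purely hypothetical sketch of that kind of logic (the device IDs, names, and fallback are illustrative, not the game's actual code):

```python
# Hypothetical illustration: if the demo picks its renderer from a hard-coded
# list of GPUs it shipped knowing about, anything newer falls through to DX9.
# Device IDs and names below are just placeholders for the example.

KNOWN_DX11_GPUS = {0x6898: "Radeon HD 5870", 0x6899: "Radeon HD 5850"}

def pick_renderer(device_id: int) -> str:
    if device_id in KNOWN_DX11_GPUS:
        return "DX11"
    return "DX9"  # unrecognized card (e.g. a GF100 newer than the demo) gets the safe path

print(pick_renderer(0x6898))  # DX11 - a card the demo knows about
print(pick_renderer(0x06C0))  # DX9  - GF100's ID isn't in the list, so it falls back
```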
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
I won't post it here since it always seems to blow up into a flame fest, but Charlie's latest article is basically calling out Nvidia over that. You guys know the site if you want to read it. :)

Wait, how can that be possible? GF100 is broken, and isn't going to be able to be fixed.

Obviously it switches to DX9 because GF100 is really a rebranded 8800GTS 640.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,697
397
126
Wait, how can that be possible? GF100 is broken, and isn't going to be able to be fixed.

Obviously it switches to DX9 because GF100 is really a rebranded 8800GTS 640.

Did you notice that, according to the latest rumours, they had to drop to 480 shaders and shift their entire allocation of 40 nm production just to get cards out to sell?