Help Design The Next AnandTech GPU Benchmark Suite

Page 4

pcm81

Senior member
Mar 11, 2011
598
16
81
Add the sci-comp OpenCL benches.
I found that 3DMark11 does not tax my system as much as Milkyway@Home does. With 3DM11 my temps would stay in the 90s on air, while with MW@Home they peaked at 109 °C.

In the sci-comp suite, add a known-value problem to check memory stability for OCing. Large matrix addition would work well; of course, you would have to loop it many times.
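Something like this minimal pyopencl sketch captures the idea (it's not any existing benchmark — the array size, pass count, and kernel are arbitrary placeholders): compute the reference sum once on the CPU, repeat the same addition on the GPU, and treat any mismatch against the known result as a sign the overclock isn't stable.

```python
# Known-value stability check: add two large arrays on the GPU repeatedly
# and compare each pass against a reference result computed once on the CPU.
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

n = 32 * 1024 * 1024                        # ~128 MB per float32 buffer
a = np.random.rand(n).astype(np.float32)
b = np.random.rand(n).astype(np.float32)
expected = a + b                            # known reference result (CPU)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
c_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *c)
{
    int gid = get_global_id(0);
    c[gid] = a[gid] + b[gid];
}
""").build()

result = np.empty_like(a)
for i in range(1000):                       # loop it many times, as suggested
    prg.add(queue, a.shape, None, a_buf, b_buf, c_buf)
    cl.enqueue_copy(queue, result, c_buf)
    bad = np.count_nonzero(result != expected)
    if bad:
        print(f"pass {i}: {bad} mismatched elements -- unstable")
        break
else:
    print("all passes matched the known result")
```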
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
What I meant is that object detail and shadow detail can throw a spanner in the works when you're benchmarking a GPU.
There are many games that have settings that can be affected by CPU performance. Furthermore, if other settings shift the primary bottleneck to the GPU (e.g. shader quality in Crysis), the smaller bottlenecks don't matter much because they don't really influence overall performance.

Also unless you’ve done some kind of profiling or CPU scaling tests (e.g. overclocking the same CPU, or dropping in a faster one while keeping everything else the same), you can’t infer that those settings are CPU limited.

I’ve done this quite a few times in Crysis and the difference with the CPU is negligible (e.g. no performance difference between an E6850 and i5 750), yet every GPU upgrade has yielded large performance gains in benchmarks and actual gameplay.
 

GoodRevrnd

Diamond Member
Dec 27, 2001
6,801
581
126
It is obviously useful to automatically turn on all features and eye candy to stress the GPUs the most, but do you think it is actually meaningful to turn on features that any sane person would not use? For example, nobody plays BFBC2 with Bloom enabled because you're just giving up frames for an overdone obnoxious effect that hampers your gameplay. As such, why bench with it enabled?


Don't over-represent any single engine *cough*unreal*cough*. This typically hasn't been a problem in the past; I just want to reinforce the point, especially if a glut of games on one engine all come out at once and you're tempted to use all of them for the sake of newest game = most relevant. I'm sure Batman: AC will tell us all we need to know about Unreal for the next year. ;) But seriously, I would also try to pick the title you feel best represents the engine as a whole and hasn't been tweaked to death to disproportionately favor one card over the other.
 

zebrax2

Senior member
Nov 18, 2007
975
66
91
We get this request a lot. Unfortunately minimum framerates are extremely unreliable - by definition it's an absolute and not an average, so otherwise minor variations between runs make a huge difference. We do want to do minimums where it makes sense, but there are only a few games that have ever proven to offer reliable minimum framerates.

How about reporting the % of time the card spends below a certain fps?
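As a rough sketch of what that could look like, computed from logged per-frame times (the function name and the 30 fps floor are just placeholders, not anything AT uses):

```python
import numpy as np

def pct_time_below(frame_times_ms, fps_floor=30.0):
    """Percent of total run time spent on frames slower than fps_floor."""
    t = np.asarray(frame_times_ms, dtype=float)
    slow = t > 1000.0 / fps_floor        # frames that took longer than the floor allows
    return 100.0 * t[slow].sum() / t.sum()

# e.g. a short frame-time capture in milliseconds
print(pct_time_below([16.7, 16.7, 40.0, 16.7, 120.0], fps_floor=30.0))
```

Counting it as a share of time (rather than a share of frames) keeps one long stall from being drowned out by the many fast frames around it.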
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
How about reporting the % of time the card spends below a certain fps?

That's not a bad idea, but even that won't tell you the whole story. Minimum frames per second or % below a certain FPS still end up being averages, even if they're on a shorter timeframe.

I think one of the bigger problems is sudden drops in performance that can last less than a second and then get averaged away in benchmarks. For instance, you may have 50 frames in the first half second and 1 frame in the second half second, and now you have 51 fps. Sounds fine, but it isn't. Yet if that's your worst one-second increment, that's your minimum fps, and furthermore your % below 50 fps, for instance, would be zero.

I think that's what often causes the "hitching" in games, particularly with dual cards. Others may want to chime in if I'm off base here.
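To put numbers on that example (a toy calculation, not a proposed methodology): averaged over the full second the run reports 51 fps and never appears to dip below 50, but per-frame statistics expose the stall.

```python
import numpy as np

# 50 quick frames in the first half second, then one 500 ms stall.
frame_times_ms = np.array([10.0] * 50 + [500.0])

avg_fps = len(frame_times_ms) / (frame_times_ms.sum() / 1000.0)
worst_frame_fps = 1000.0 / frame_times_ms.max()
share_below_50 = frame_times_ms[frame_times_ms > 20.0].sum() / frame_times_ms.sum()

print(f"one-second average:       {avg_fps:.0f} fps")        # 51 fps -- looks fine
print(f"slowest individual frame: {worst_frame_fps:.0f} fps") # 2 fps -- the hitch
print(f"time spent below 50 fps:  {share_below_50:.0%}")      # 50% when measured per frame
```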
 

Anubis

No Lifer
Aug 31, 2001
78,712
427
126
tbqhwy.com
i would like a way to combine the 2010 and 2011 results, so i can compare any card to any card.

splitting it was a dumb idea
 

Mr. President

Member
Feb 6, 2011
124
2
81
There are many games that have settings that can be affected by CPU performance. Furthermore, if other settings shift the primary bottleneck to the GPU (e.g. shader quality in Crysis), the smaller bottlenecks don't matter much because they don't really influence overall performance.

Also unless you’ve done some kind of profiling or CPU scaling tests (e.g. overclocking the same CPU, or dropping in a faster one while keeping everything else the same), you can’t infer that those settings are CPU limited.

I’ve done this quite a few times in Crysis and the difference with the CPU is negligible (e.g. no performance difference between an E6850 and i5 750), yet every GPU upgrade has yielded large performance gains in benchmarks and actual gameplay.

I hear ya, but that's exactly how I checked for it. The same Q6600 scaled in performance from 2.4 GHz to 3 GHz on all cards (8800GTS 320 & 512, HD4850, HD6850). Resolution obviously had an effect, as did the actual performance of the cards, but on all cards barring the 320, object detail and shadows made the difference between playable and unplayable with everything else maxed out at 1680x1050. The same applied to the Core 2 Duo that the Q6600 replaced. I've also pointed this out to people on other forums, who likewise saw the improvement that I did.

Like I said, I'll grant that the case may be different on an i5/i7-based system (because I haven't checked it), but this is something that I've confirmed many times over the past few years. I even used the in-game overlay (which can tell you the number of draw calls being made) and performance scaled predictably along with it.

This probably isn't a fitting discussion for this thread but I still thought it was worth mentioning. It's something that has bothered me a bit in reviews for some time and is why I brought it up.
 

sgrinavi

Diamond Member
Jul 31, 2007
4,537
0
76
How about some CAD type stuff?... Autodesk REVIT & 3dsMAX both have several bench programs out there that will test your viewport performance.
 

GoodRevrnd

Diamond Member
Dec 27, 2001
6,801
581
126
i would like a way to combine the 2010 and 2011 results, so i can compare any card to any card.

splitting it was a dumb idea

They have to do CPU upgrades to stay relevant though. But yes, this is why I'd like even new reviews to include old cards, even if it's just a single killer card from each generation to establish a baseline for comparison (like 8800GT, GTX460, 4870, etc.).
 

GodisanAtheist

Diamond Member
Nov 16, 2006
8,117
9,372
136
That's a good point. Many people still have a fascination with Crysis' performance. At the same time, I feel that Witcher 2 is the best example of the most demanding DX9 game. If we have Crysis 1 and 2, that's 2 / 7 games devoted to Crysis. If I had to choose an inter-generational game, I'd pick Witcher 2 for 3 reasons:

1) It's DX9, which means that, just like with Crysis 1, you can test cards going all the way back to something like the 9800GT and see exactly how much faster modern cards have become.

2) The role-playing genre's sales are larger than the FPS genre's, so this bench will likely be useful to more gamers than Crysis 1 is (I mean, how many people are going to be replaying Crysis 1 with a GTX680 SLI setup?).

3) On Ultra settings with Super Sampling, this game is more demanding than Crysis 1 is:
http://gamegpu.ru/RPG/Rollevye/The-Witcher-2-Assassins-of-Kings-versii-1.2-tect-GPU.html

Witcher 2 also supports 3D vision, not sure if Crysis 1 does?

-One of the advantages of going with Crysis at this point (not to the exclusion of Witcher 2, but as the "retro" bench), however, is that the game is mature and done. There aren't going to be any more patches for the game, and both AMD & Nvidia have squeezed out every drop of performance that they possibly can from the title with their drivers. The fact that Witcher 2 is DX9 and compatible all the way down to the GeForce 6 series doesn't mean we'll get an accurate picture of performance scaling from one generation to another, since older cards can't expect any driver optimizations for the game.

Additionally, there is already a huge, workable catalog of Crysis benches on older hardware, and so long as the CPU stays the same (the i7 950 is still more than powerful enough), it cuts down on a lot of extra work for AT while giving us a far broader comparison.

However, I think you completely nailed it with your prior post and it would be a mistake for AT to deviate too strongly from your suggestions, especially the non-gaming benches. Brilliant!
 

Anubis

No Lifer
Aug 31, 2001
78,712
427
126
tbqhwy.com
They have to do CPU upgrades to stay relevant though. But yes, this is why I'd like even new reviews to include old cards, even if it's just a single killer card from each generation to establish a baseline for comparison (like 8800GT, GTX460, 4870, etc.).

yes, then there definitely needs to be a few cards that are run through every single test for a baseline, as you said