Because the FarCry 2 canned benchmark doesn't give different results each run, for example?
But let's average your run numbers:
Min 76.5, Avg 96.85, Max 124.5
So we are talking about a margin of error of ±6% (tops).
Seems reasonable.
What I don't understand is how the minimum is 81 in the first run when I see a 72, or how the max is 125 when the highest I see is 119.
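For reference, here's a rough sketch of how I'm working out that run-to-run average and margin of error. It's Python with placeholder run numbers, since I don't have your raw runs in front of me, so treat the figures as purely illustrative:

```python
# Rough sketch: average the min/avg/max across several runs and work out a
# run-to-run spread. The run numbers here are placeholders, not anyone's
# actual results.
runs = [
    {"min": 74.0, "avg": 95.0, "max": 121.0},   # hypothetical run 1
    {"min": 79.0, "avg": 98.5, "max": 127.0},   # hypothetical run 2
    {"min": 76.5, "avg": 97.0, "max": 125.5},   # hypothetical run 3
]

for key in ("min", "avg", "max"):
    values = [run[key] for run in runs]
    mean = sum(values) / len(values)
    worst = max(abs(v - mean) for v in values)
    print(f"{key}: mean {mean:.2f} FPS, spread +/-{100 * worst / mean:.1f}%")
```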
Settings: Demo(Ranch Small), 1280x1024 (85Hz), D3D10, Fixed Time Step(No), Disable Artificial Intelligence(No), Full Screen, Anti-Aliasing(None), VSync(No), Overall Quality(Optimal), Vegetation(Very High), Shading(Very High), Terrain(Very High), Geometry(Very High), Post FX(High), Texture(Very High), Shadow(Very High), Ambient(High), Hdr(Yes), Bloom(Yes), Fire(Very High), Physics(Very High), RealTrees(Very High)
Loop 1
Total Frames: 4085, Total Time: 51.00s
Average Framerate: 80.10
Max. Framerate: 107.51 (Frame:618, 6.43s)
Min. Framerate: 64.13 (Frame:1749, 21.25s)
Loop 2
Total Frames: 4103, Total Time: 51.01s
Average Framerate: 80.44
Max. Framerate: 107.43 (Frame:625, 6.50s)
Min. Framerate: 64.16 (Frame:2700, 34.22s)
Loop 3
Total Frames: 4089, Total Time: 51.00s
Average Framerate: 80.17
Max. Framerate: 107.50 (Frame:614, 6.41s)
Min. Framerate: 65.06 (Frame:1737, 20.90s)
Average Results
Average Framerate: 80.24
Max. Framerate: 107.11
Min. Framerate: 64.93
Well, it seems very consistent to me. What's the margin of error here?
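To answer my own question, here's the same arithmetic applied to the three loop averages from the output above; the spread works out to roughly ±0.25%:

```python
# Run-to-run spread of the three loop averages posted above.
loop_averages = [80.10, 80.44, 80.17]   # average FPS of loops 1-3

mean = sum(loop_averages) / len(loop_averages)          # ~80.24 FPS
worst = max(abs(fps - mean) for fps in loop_averages)   # ~0.20 FPS
margin_pct = 100.0 * worst / mean                       # ~0.25%

print(f"Mean average framerate: {mean:.2f} FPS")
print(f"Run-to-run margin of error: +/-{margin_pct:.2f}%")
```

That's a far tighter spread than the ±6% in the numbers quoted earlier.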
This method of testing is accurate and consistent.
The Fraps way is, for the most part, inconsistent and inaccurate.
Why change a method that's been working for years and is consistent and accurate?
That's simple: to put ATI's cards in a better light, because Nvidia's drivers are letting the GTX 4xx line pull away from ATI in terms of performance and making the higher price seem worth it.