Can someone explain these scores?

Oct 16, 1999
10,490
4
0
Here.

I have a hard time believing the minimum framerate difference here between the Ti4400 and the Ti4600. More specifically, I have a hard time believing the Ti4600's minimum is that high.
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
Simply put, there absolutely has to be an error in the tests.
There is no way the difference could be that dramatic.
Even assuming the GF4 core were 100% clock efficient, the GF4 Ti4600 could never be over twice the speed of the Ti4400.

The only difference between the two boards is core/mem clockspeed.

In an absolute best-case scenario, assuming the GF4 core were 100% efficient and rendering a scene in which it was purely bandwidth limited... the GF4 Ti4400's 550MHz memory would give it roughly 85% of the performance of the Ti4600 and its 650MHz memory.
(Of course, being bandwidth limited and 100% efficient is a contradiction in itself.)

Put another way, the Ti4600 would put up a frame rate roughly 18% higher than that of the Ti4400.


Regardless of how you choose to look at it, the Ti4400 cannot be even remotely close to half the speed of the Ti4600 so long as both cards are built to reference specifications.
Other factors, such as BIOS revision or drivers, could have a greater impact, however.
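
The arithmetic is easy to sanity-check with the reference clocks (275/550MHz for the Ti4400, 300/650MHz for the Ti4600). A quick sketch, assuming performance scales linearly with whichever clock is the bottleneck:

# Upper bound on Ti4600 vs. Ti4400 scaling from reference clocks alone,
# assuming performance scales linearly with the limiting clock.
ti4400_core, ti4400_mem = 275, 550   # MHz, reference GF4 Ti4400
ti4600_core, ti4600_mem = 300, 650   # MHz, reference GF4 Ti4600

core_gain = ti4600_core / ti4400_core - 1   # fill-rate limited case
mem_gain = ti4600_mem / ti4400_mem - 1      # bandwidth limited case

print(f"core-limited gain:      +{core_gain:.1%}")   # +9.1%
print(f"bandwidth-limited gain: +{mem_gain:.1%}")    # +18.2%, nowhere near 2x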
 

mastay

Member
Jul 3, 2002
130
0
0
I think the min framerate measure is stupid. You are measuring the performance of a single point in time, so simple statistical variance can create wacky results. Something like average framerate is a better measure of performance.
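
For example (a toy Python sketch; the frame times and helper names are made up for illustration):

# One 100ms hitch in an otherwise steady 50fps run barely moves the
# average, but cuts the reported "minimum framerate" by a factor of five.
steady = [20.0] * 500          # 500 frames at 20ms each = a flat 50fps
hitchy = steady[:]
hitchy[250] = 100.0            # a single 100ms frame (one 10fps sample)

def avg_fps(times_ms):
    return 1000.0 * len(times_ms) / sum(times_ms)

def min_fps(times_ms):
    return 1000.0 / max(times_ms)

print(avg_fps(steady), min_fps(steady))   # 50.0 fps average, 50.0 fps min
print(avg_fps(hitchy), min_fps(hitchy))   # ~49.6 fps average, 10.0 fps min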
 
Oct 16, 1999
10,490
4
0
Originally posted by: mastay
I think the min framerate measure is stupid. You are measuring the performance of a single point in time, so simple statistical variance can create wacky results. Something like average framerate is a better measure of performance.

Assuming you get accurate results, the minimum framerate is one of the most important numbers a benchmark can give you.
 

Rand

Lifer
Oct 11, 1999
11,071
1
81
Originally posted by: Gonad the Barbarian
Originally posted by: mastay
I think the min framerate measure is stupid. You are measuring the performance of a single point in time, so simple statistical variance can create wacky results. Something like average framerate is a better measure of performance.

Assuming you get accurate results, the minimum framerate is one of the most important numbers a benchmark can give you.

I agree, so long as it's a sustained minimum.
A minimum frame rate of 10FPS that lasts for .001 seconds is effectively unnoticeable IMHO. A minimum frame rate of 10FPS over a full second is noticeable.
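
To put numbers on that, here's a rough sketch; the one-second window and the frame trace are arbitrary choices of mine:

# Worst framerate over any sliding one-second window, versus the
# single-frame minimum that most benchmarks report.
def sustained_min_fps(frame_times_ms, window_ms=1000.0):
    worst, elapsed, frames, start = None, 0.0, 0, 0
    for t in frame_times_ms:
        elapsed += t
        frames += 1
        # shrink to the smallest run of frames still covering the window
        while elapsed - frame_times_ms[start] >= window_ms:
            elapsed -= frame_times_ms[start]
            frames -= 1
            start += 1
        if elapsed >= window_ms:
            fps = 1000.0 * frames / elapsed
            worst = fps if worst is None else min(worst, fps)
    return worst

trace = [20.0] * 500             # steady 50fps...
trace[250] = 100.0               # ...with one 100ms hitch
print(1000.0 / max(trace))       # 10.0 - single-frame minimum
print(sustained_min_fps(trace))  # 46.0 - the worst full second barely notices

A 10FPS spike that lasts one frame drags the single-frame minimum down to 10, while the worst one-second window still averages 46FPS.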
 
Oct 16, 1999
10,490
4
0
Yes, that is why I like the Serious Sam benchmark: it gives the min fps, min sustained fps, average, peak sustained, and absolute peak. I don't understand why it isn't used more, especially since it can be run in both D3D and OGL.
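
You can pull all five of those numbers out of a raw frame-time log yourself. A sketch, assuming a one-second window for the "sustained" figures; the actual benchmark may define them differently:

# Serious Sam-style summary from a list of per-frame times in milliseconds.
# Assumes the run lasts at least window_ms.
def benchmark_summary(frame_times_ms, window_ms=1000.0):
    inst = [1000.0 / t for t in frame_times_ms]   # instantaneous fps per frame
    windowed, elapsed, frames, start = [], 0.0, 0, 0
    for t in frame_times_ms:
        elapsed += t
        frames += 1
        while elapsed - frame_times_ms[start] >= window_ms:
            elapsed -= frame_times_ms[start]
            frames -= 1
            start += 1
        if elapsed >= window_ms:
            windowed.append(1000.0 * frames / elapsed)
    return {
        "min_fps": min(inst),                 # absolute minimum
        "min_sustained_fps": min(windowed),   # worst one-second window
        "avg_fps": 1000.0 * len(frame_times_ms) / sum(frame_times_ms),
        "peak_sustained_fps": max(windowed),  # best one-second window
        "peak_fps": max(inst),                # absolute peak
    }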