Frame rate non-linearity

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
I've been pondering today the idea of frame rate being a non-linear measurement of performance. It sounds kind of odd, and I feel it's something many people get wrong.

This article is one I've referred to a lot over the years - http://www.mvps.org/directx/articles/fps_versus_frame_time.htm

I was wondering how this relates to hardware increases in speed. For example, if we assume a simple (ideal) processor executing fixed instructions, and we double its speed from 1GHz to 2GHz, we'd expect the execution time of something taking 10ms to drop to 5ms, right? Twice the speed, half the time.

So am I right in thinking that the performance increase that gives us, measured in FPS, is also non-linear? That it depends on the original frame rate?

So for example, if you start with a piece of code that takes 10ms and it's reduced to 5ms, that translates into 100fps increasing to 200fps, a net gain of 100fps.

However, if your starting point is 20ms and it's halved to 10ms, that translates into 50fps increasing to 100fps, a net gain of only 50fps.

Does this essentially mean that increasing your frame rate gets easier the higher it is to begin with, and that linear increments on the same piece of hardware (say overclocking 10%, then +10% more, then +10% more) lead to bigger and bigger gains? Or have I got something wrong somewhere?
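
To make the arithmetic above concrete, here's a minimal sketch in Python of the conversion (fps = 1000 / frame_time_ms); the 10ms and 20ms workloads are the hypothetical figures from the post, not measurements:

```python
# A minimal sketch of the fps <-> frame time relationship: fps = 1000 / frame_time_ms.
def fps(frame_time_ms: float) -> float:
    return 1000.0 / frame_time_ms

# Hypothetical workloads from the post above: 10 ms and 20 ms per frame.
for frame_time in (10.0, 20.0):
    halved = frame_time / 2  # doubling the CPU speed halves the frame time
    gain = fps(halved) - fps(frame_time)
    print(f"{frame_time:.0f} ms -> {halved:.0f} ms: "
          f"{fps(frame_time):.0f} fps -> {fps(halved):.0f} fps (+{gain:.0f} fps)")
# 10 ms -> 5 ms gains +100 fps, while 20 ms -> 10 ms gains only +50 fps,
# even though both frame times were halved by the same speed-up.
```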
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
It responds in a linear way: a doubling in the performance of the CPU also results in a doubling of frames per second, so the relationship between the two is clearly linear in the hypothetical scenario. But if each frame is simpler to begin with (i.e. the game runs at 100 fps on 1x CPU performance, not 50 fps), then you get more FPS from the same increase in the CPU. That doesn't make the relationship non-linear; it just means you get twice as many frames out of the same CPU, so when you double the CPU you also gain twice as many frames.

The non-linear relationship is in the benefit you get. Going from 15 to 30 fps is an enormous gain in motion. Going from 30 to 60 fps is noticeable and quite beneficial, but nothing like 15 to 30, which went from a slideshow to something resembling motion. Going from 60 to 120 is marginal; only about 70% of users even notice the difference. So there is a definite reduction in the improvement of the motion as the frame rate increases, yet the performance increase necessary is a doubling each time. Thus we get a non-linear relationship in our perception of motion improvements despite a linear increase in performance.
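
As a rough illustration (my own worked numbers, not from the post), here's what each of those doublings buys in frame time, which is one way to see the diminishing returns:

```python
# Frame time saved by each doubling of the frame rate.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for low, high in ((15, 30), (30, 60), (60, 120)):
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} fps: {frame_time_ms(low):.1f} ms -> "
          f"{frame_time_ms(high):.1f} ms per frame (saves {saved:.1f} ms)")
# 15 -> 30 fps saves ~33.3 ms per frame, 30 -> 60 fps saves ~16.7 ms,
# and 60 -> 120 fps saves only ~8.3 ms.
```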
 

Falafil

Member
Jun 5, 2013
51
0
0
I can't tell if this is a very technical hardware question or a 4th grade math question.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
It responds in a linear way: a doubling in the performance of the CPU also results in a doubling of frames per second, so the relationship between the two is clearly linear in the hypothetical scenario. But if each frame is simpler to begin with (i.e. the game runs at 100 fps on 1x CPU performance, not 50 fps), then you get more FPS from the same increase in the CPU. That doesn't make the relationship non-linear; it just means you get twice as many frames out of the same CPU, so when you double the CPU you also gain twice as many frames.

The non-linear relationship is in the benefit you get. Going from 15 to 30 fps is an enormous gain in motion. Going from 30 to 60 fps is noticeable and quite beneficial, but nothing like 15 to 30, which went from a slideshow to something resembling motion. Going from 60 to 120 is marginal; only about 70% of users even notice the difference. So there is a definite reduction in the improvement of the motion as the frame rate increases, yet the performance increase necessary is a doubling each time. Thus we get a non-linear relationship in our perception of motion improvements despite a linear increase in performance.

It seems like the relative increase is linear (say, double the speed, double the frame rate), but the absolute gain isn't; it depends on your initial frame rate.

So if one piece of code takes, say, 2ms to execute and runs at 500fps, and another takes 10ms and runs at 100fps, then doubling the processor speed and halving the times drops those to 1ms (1000fps) and 5ms (200fps). The absolute gains in those two circumstances are 500fps vs 100fps.

So FPS isn't a linear measurement of performance, in the sense that the same piece of hardware can show a different increase in fps for two separate pieces of code...?
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,386
32
91
So FPS isn't a linear measurement of performance, the same piece of hardware can show a different increase in fps for 2 separate pieces of code...?

Yes, an absolute difference does not itself express a relative one. For a relative difference you need something for it to be in relation to -- you know, the definition of the word.

What's your point? Nobody uses the absolute difference without context as a relative performance metric so why are you referencing it?
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
Yes, an absolute difference does not itself express a relative one. For a relative difference you need something for it to be in relation to -- you know, the definition of the word.

What's your point? Nobody uses the absolute difference without context so why are you referencing it?

I'm just pondering, as I said in my original post.

I've seen this in the past with video card reviews, where two cards are compared by looking at the fps decrease from turning on AA and AF at certain graphics settings; because the initial frame rates weren't the same, the measured decrease in FPS was misleading.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Can someone rephrase what is being asked?

From what I can tell, based on reading the article and the questions/comments here, a change in FPS can be misleading and you should consider the change in the time of one frame instead?

But because the two are interchangeable, I still don't see the bother with using one type of measurement over the other.

Changing your frame render time by one millisecond can be huge or it can be nothing, depending on what the pre-change time was. I mean comparing 1000 milliseconds vs 1001 milliseconds is about the same, who cares. But comparing 1 millisecond vs 2 milliseconds is a huge increase. This falls under the "duh" category.

But when you look at FPS, doesn't the same argument hold? The change is more meaningful when you consider how big the change is to the overall total? Like increasing by 50 FPS might be huge when you are coming from 10 FPS originally, but nothing when you are coming from 2000 FPS originally?

What am I missing?
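
To put rough numbers on that (a sketch using the hypothetical 10 fps and 2000 fps baselines above), here is what the same +50 fps is worth in frame time from each starting point:

```python
# What a +50 fps gain means in frame time, from two different baselines.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for baseline in (10, 2000):
    saved = frame_time_ms(baseline) - frame_time_ms(baseline + 50)
    print(f"{baseline} -> {baseline + 50} fps saves {saved:.3f} ms per frame")
# From 10 fps, +50 fps saves ~83.3 ms per frame;
# from 2000 fps, the same +50 fps saves only ~0.012 ms.
```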
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
If you read the original article I linked, you can see that in simple terms it means that a drop from, say, 60fps to 55fps represents a greater increase in execution time than a drop from 900fps to 450fps.

It's not just a simple case of our perception. Sure enough, it's harder to see changes in frame rate as the frame rate gets higher (smoother), but it's also a case of the scale not being linear in its representation of performance: a 1fps change at 60fps is not objectively the same as a 1fps change at 120fps. Regardless of our perception, the metric itself is not a linear relationship.
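
A quick sketch of that claim in frame-time terms (worked from fps = 1000 / frame_time_ms, using the figures above):

```python
# Compare the frame-time cost of two fps drops.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for before, after in ((60, 55), (900, 450)):
    added = frame_time_ms(after) - frame_time_ms(before)
    print(f"{before} -> {after} fps adds {added:.2f} ms per frame")
# 60 -> 55 fps adds ~1.52 ms per frame, while 900 -> 450 fps adds only ~1.11 ms,
# so the "small" 5 fps drop is actually the bigger slowdown per frame.
```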

I'm simply trying to expand my understanding of how that translates into a relationship with increasing hardware speed, in part to work out whether some benchmark approaches to comparing performance gains/losses are really fair.
 

DominionSeraph

Diamond Member
Jul 22, 2009
8,386
32
91
I've seen this in the past with video card reviews, where two cards are compared by looking at the fps decrease from turning on AA and AF at certain graphics settings; because the initial frame rates weren't the same, the measured decrease in FPS was misleading.

And you knew the initial frame rates weren't the same because they told you those frame rates.
Even morons who can't hold four values in their head can break it down into two groups of two (with a third comparison being between the results of the first two), so what's the problem?

Card 1 gets 100FPS with no AA/AF
Card 2 gets 200FPS with no AA/AF

Card 1 gets 90FPS with 8xAA/8xAF
Card 2 gets 90FPS with 8xAA/8xAF

Which card sees less of a drop from AA+AF?
This is not rocket science.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
And you knew the initial frame rates weren't the same because they told you those frame rates.
Even morons who can't hold four values in their head can break it down into two groups of two (with a third comparison being between the results of the first two), so what's the problem?

Card 1 gets 100FPS with no AA/AF
Card 2 gets 200FPS with no AA/AF

Card 1 gets 90FPS with 8xAA/8xAF
Card 2 gets 90FPS with 8xAA/8xAF

Which card sees less of a drop from AA+AF?
This is not rocket science.

Look, I never said it was rocket science.

In this case it's obvious because you've set two of the measurements equal (the frame rate with 8xAA/8xAF is the same for both cards), so you have a clean point of comparison, but that's rare in benchmarks between two competing cards; usually you have four distinct data points, not three.

I've seen benchmarks in the past that do this sort of comparison and then give percentages calculated simply off the difference in frame rate, despite the starting frame rates being different, which most certainly is misleading.
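
Using DominionSeraph's hypothetical figures above (not real benchmark data), here's a sketch of the same comparison expressed as frame-time cost, which is the framing the linked article argues for:

```python
# Frame-time cost of enabling AA/AF, using the hypothetical figures quoted above.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

cards = {
    "Card 1": (100, 90),  # (fps without AA/AF, fps with 8xAA/8xAF)
    "Card 2": (200, 90),
}
for name, (no_aa, with_aa) in cards.items():
    cost = frame_time_ms(with_aa) - frame_time_ms(no_aa)
    drop_pct = 100 * (no_aa - with_aa) / no_aa
    print(f"{name}: -{drop_pct:.0f}% fps, +{cost:.2f} ms per frame")
# Card 1 loses 10% of its fps (+1.11 ms per frame);
# Card 2 loses 55% (+6.11 ms per frame).
```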
 

Mark Rejhon

Senior member
Dec 13, 2012
273
1
71
Some additional very interesting factors for measuring fluidity.
This is a visually separate subject from input lag (though it is partially linked, since a higher framerate can mean lower input lag, as seen in AnandTech's testing and input lag graph).

(1) Microstutters. Consistent frame times at 80fps look smoother than random frame times at 120fps. Basically, you see fewer stutters/judder/jerkiness if the frames are delivered to your eyes in a more regular fashion, even if that involves partial frames (e.g. slices of frames during VSYNC OFF). See the sketch after this list.

(2) Your computer mouse can be the limiting factor in smoothness. A highly acclaimed 1000Hz mouse can look smoother at the same framerate than a cheap mouse. If you cannot slowly turn left/right with your mouse as fluidly as you can keyboard-strafe left/right, then your mouse is definitely the limiting factor in motion fluidity. Adjusting game sensitivity a bit lower and hardware sensitivity a bit higher can help too.

(3) Displays with less motion blur make it easier to see stutters. For example, on CRT, plasma, and LightBoost displays it is easier to see tearing and stutters (e.g. even one dropped frame at 119fps@120Hz) than on regular LCD displays.

(4) Synchronization with refresh can improve motion fluidity (note: a separate topic from input lag).
E.g. 120fps@120Hz VSYNC ON can still look smoother than 300fps@120Hz VSYNC OFF. Unfortunately, VSYNC ON adds input lag, so a good compromise is Adaptive VSYNC, which is like an input-lag-reduced VSYNC ON. (You simply get tearing instead of sudden slowdowns every time you drop below framerate=Hz.)
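
As a rough illustration of point (1), here's a sketch with made-up frame-time sequences (not measurements) showing why an even 80fps can look smoother than a jittery ~120fps average:

```python
# Compare an even 80 fps stream with a jittery stream that averages ~120 fps.
# Both frame-time lists (in ms) are made-up illustrations, not measurements.
even_80fps  = [12.5] * 8
jittery_120 = [4.0, 4.0, 20.0, 4.0, 4.0, 22.0, 4.0, 4.0]

for name, times in (("even 80 fps", even_80fps), ("jittery ~120 fps", jittery_120)):
    avg = sum(times) / len(times)
    print(f"{name}: avg {avg:.1f} ms ({1000 / avg:.0f} fps), "
          f"worst frame {max(times):.1f} ms")
# The jittery stream averages a higher fps, but its worst frames (20-22 ms)
# are far longer than any frame in the steady 80 fps stream, and those long
# frames are what you perceive as microstutter.
```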

[Attached image: fps vs Hz chart]
 