I've been pondering the idea that frame rate is a non-linear measurement of performance. It sounds kind of odd, and I feel it's something many people get wrong.
This article is one I've referred to a lot over the years - http://www.mvps.org/directx/articles/fps_versus_frame_time.htm
I was wondering how this relates to hardware increases in speed. For example, if we assume a simple (ideal) processor doing fixed instructions, and we double its speed from 1GHz to 2GHz, we'd expect the execution time of something taking 10ms to drop to 5ms, right? Twice the speed, half the time.
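Just to sanity-check my own arithmetic, here's a tiny Python sketch of that ideal scaling (it assumes execution time is purely clock-bound, which obviously isn't true of real hardware):

```python
# For a fixed amount of work on an idealised processor, execution time
# scales inversely with clock speed (ignoring memory, IO, etc.).
def scaled_time(time_ms, old_clock_ghz, new_clock_ghz):
    return time_ms * old_clock_ghz / new_clock_ghz

print(scaled_time(10.0, 1.0, 2.0))  # 5.0 -- twice the clock, half the time
```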
So am I right in thinking that the performance increase that gives us, measured in FPS, is also non-linear - that it depends on the original frame rate?
So, for example, if you start with a piece of code that takes 10ms and it's reduced to 5ms, that translates into 100fps increasing to 200fps - a net gain of 100fps.
However, if your starting point is 20ms and it's halved to 10ms, that translates into 50fps increasing to 100fps - a net gain of only 50fps.
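Again, a quick sketch of the arithmetic I'm doing here, just using fps = 1000 / frame time in ms:

```python
# fps is the reciprocal of frame time, so equal frame-time savings
# give unequal fps gains depending on where you start.
def fps(frame_time_ms):
    return 1000.0 / frame_time_ms

print(fps(10.0), "->", fps(5.0))   # 100.0 -> 200.0  (net +100 fps)
print(fps(20.0), "->", fps(10.0))  # 50.0  -> 100.0  (net +50 fps)
```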
Does this essentially mean that increasing your frame rate is easier the higher it is to begin with, and that linear increments in the speed of the same piece of hardware (say overclocking by 10%, then +10% more, then +10% more) lead to bigger and bigger FPS gains? Or have I got something wrong somewhere?
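For what it's worth, this is how I'm picturing the overclocking case (again assuming perfectly ideal scaling, which I realise real hardware won't give you):

```python
# Each +10% clock step divides the frame time by 1.1, so fps is multiplied
# by 1.1 each time: the absolute fps gain per step grows, even though the
# relative gain stays a constant 10%.
frame_time_ms = 20.0
for step in range(4):
    print(f"step {step}: {1000.0 / frame_time_ms:.2f} fps")
    frame_time_ms /= 1.1  # +10% clock -> frame time / 1.1 (ideal scaling)
```

If I've got that right, the steps go 50.0 -> 55.0 -> 60.5 -> 66.55 fps, so each increment adds a bit more than the last.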