Frames per second into milliseconds

SonicIce

Diamond Member
Apr 12, 2004
4,771
0
76
I'm drawing a blank. If you're running at 60 FPS, how many milliseconds does it take to render a frame? What's the formula?
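(Editor's note: the formula the replies below converge on is simply frame time in ms = 1000 / FPS. A minimal sketch, with an illustrative function name of my choosing:)

```python
def fps_to_ms(fps: float) -> float:
    """Convert frames per second to milliseconds per frame."""
    # 1 second = 1000 ms, split evenly across the frames rendered in it.
    return 1000.0 / fps

print(fps_to_ms(60))  # -> roughly 16.67 ms per frame
```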
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
All he's saying is that just because one change drops you from 900 FPS to 450 FPS, which is technically half the FPS, the next time you do that same operation it won't cut the FPS in half again. He has good examples.
 

SonicIce

Diamond Member
Apr 12, 2004
4,771
0
76
But what does he mean that 60→55 is a greater drop than 900→450? 900→450 means twice the computation time, while 60→55 is only about 9% slower.
 

Soccerman06

Diamond Member
Jul 29, 2004
5,830
5
81
Originally posted by: SonicIce
But what does he mean that 60→55 is a greater drop than 900→450? 900→450 means twice the computation time, while 60→55 is only about 9% slower.

Well, only one screen that I know of right now can produce anything over 120 FPS ((1/120 FPS) × 1000 = 8.33 ms). Most new screens do 12 ms or 16 ms. So even though you drop from 900 to 450 FPS, you still can't see the difference, because the screen is still displaying only ~120 FPS, so there is no visible drop. On the other hand, going from 60 to 55 FPS, you can actually see the ~9% drop in framerate.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: SonicIce
But what does he mean that 60→55 is a greater drop than 900→450? 900→450 means twice the computation time, while 60→55 is only about 9% slower.

I thought he explained it pretty well:

1000 ms/sec / 900 FPS = 1.111... ms per frame
1000 ms/sec / 450 FPS = 2.222... ms per frame
Increase in execution time: 1.111... ms

1000 ms/sec / 60 FPS = 16.666... ms per frame
1000 ms/sec / 56.25 FPS = 17.777... ms per frame
Increase in execution time: 1.111... ms!

Percentage-wise, it was a 'bigger' performance hit going from 900FPS to 450FPS. But in this case, it was the same amount of added time/frame.
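(Editor's note: the arithmetic above checks out mechanically; a quick sketch, using the frame rates from the post and a function name of my own:)

```python
def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

# Halving 900 FPS adds the same absolute time per frame
# as dropping from 60 FPS to 56.25 FPS.
high_end_hit = frame_time_ms(450) - frame_time_ms(900)    # 2.222... - 1.111...
low_end_hit = frame_time_ms(56.25) - frame_time_ms(60)    # 17.777... - 16.666...
print(high_end_hit, low_end_hit)  # both ~1.111 ms
```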

If you do some per-frame operation that takes a fixed amount of (CPU) time (for instance, reading/writing data in system RAM), it will seem to have a 'bigger' impact as the FPS goes up. This is part of why, for instance, framerates in Quake 3 at 640x480 fluctuate wildly as you change memory timings or bump your CPU clock slightly, yet the same changes have almost no impact on modern games running at higher settings: those frames are rendered much more slowly, so a few extra nanoseconds per frame spent on the slower RAM barely register.
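(Editor's note: one way to see this point about fixed per-frame costs is to add the same fixed overhead to a fast frame and to a slow frame and compare the FPS hit. The numbers below are illustrative, not measurements:)

```python
def fps_with_overhead(base_fps, overhead_ms):
    """Resulting FPS after a fixed per-frame cost is added."""
    # Total frame time = base frame time + fixed per-frame overhead.
    return 1000.0 / (1000.0 / base_fps + overhead_ms)

# A fixed 1 ms/frame cost nearly halves 900 FPS...
print(round(fps_with_overhead(900, 1.0), 1))  # ~473.7 FPS
# ...but barely dents 60 FPS.
print(round(fps_with_overhead(60, 1.0), 1))   # ~56.6 FPS
```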

If you assume all the time is being used by rendering, it shouldn't matter -- but if you are trying to analyze the impact of CPU performance hits on a program that is also spending time drawing 3D graphics, it can be important.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: SonicIce
But what does he mean that 60→55 is a greater drop than 900→450? 900→450 means twice the computation time, while 60→55 is only about 9% slower.

Percentage-wise, 900→450 is a greater drop, but in absolute terms it is not as big. At 900 FPS, each frame takes only 1.11 ms to complete; halve the FPS and each frame takes 2.22 ms. If you're at a lower framerate where each frame takes 16 ms, a drop to half means 32 ms per frame. The percentage drop in FPS is the same, and the percentage increase in frame time is the same, but the absolute time increase is larger for drops at lower framerates.