Why is everyone obsessed with framerate?

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
This is something that has mystified me ever since sites like THG started reviewing mainstream 3D cards (back in the TNT days)... the talk was all about how many FPS a card could get. But really, why?

Let me justify my line of questioning here...

Since day 1, video cards have traditionally been thought of as limited by the polygon and pixel counts they can push, and more recently by how fast the CPU can feed data to the card. While these are all valid concerns when pushing raw frame rate, there is a limit here...

... and this limit is neither the video card nor the CPU. It's the monitor. Be it an LCD or a CRT, the monitor is only going to display up to a certain number of frames, based on its response time or refresh rate. Physically, a CRT cannot display any more frames per second than its vertical refresh rate allows. The same goes, in a similar fashion, for an LCD's response time (though this is a little different, since each pixel could in principle be managed independently, even if current panels don't do that).

So what does all this mean?

This means that if your refresh rate is, say, 85Hz, the maximum FPS you will see is 85fps. Sure, now you're going to say "well, turn off V-Sync". We're not talking about V-Sync here... in fact, V-Sync does exactly what I'm recommending. But for all intents and purposes, right now, consider V-Sync as off. Sure, your game might be displaying 200FPS, but your monitor, regardless of what's going on inside the computer, is only going to display up to your refresh rate. Period. Any additional frames being generated are going to get sent to the frame buffer, and then overwritten with the next frame before the framebuffer is sent to the monitor on the next refresh interval. So at ~200FPS on an 85Hz monitor, roughly 1.35 frames get thrown away for every frame that actually reaches the screen. Wasted.
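To put rough numbers on that (these are just the hypothetical 200FPS / 85Hz figures from above, nothing measured), here's a quick back-of-the-envelope in C:

/* Rough arithmetic only -- not driver code. How much rendering work
 * never reaches an 85Hz monitor when the game pushes 200FPS. */
#include <stdio.h>

int main(void)
{
    double render_fps = 200.0;   /* frames the game produces per second    */
    double refresh_hz = 85.0;    /* frames the monitor can show per second */

    double rendered_per_refresh = render_fps / refresh_hz;     /* ~2.35 */
    double wasted_per_refresh   = rendered_per_refresh - 1.0;  /* ~1.35 */

    printf("frames rendered per refresh   : %.2f\n", rendered_per_refresh);
    printf("frames never shown per refresh: %.2f\n", wasted_per_refresh);
    printf("fraction of work wasted       : %.0f%%\n",
           100.0 * wasted_per_refresh / rendered_per_refresh);
    return 0;
}

In other words, well over half of the frames you rendered never make it to the screen.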

Worse yet, with a CRT's "scanning" technology, if you're pushing more frames into the framebuffer than the CRT can display per refresh, there is a very distinct possibility that you're going to display part of frame A during the top portion of a scan, have the framebuffer rewritten before the refresh is over, and end up with part of frame B displayed below it. Oh hey! That's what's called "tearing".

So what do we do now? V-Sync tells the display driver to stall: the finished frame is held back until the current screen refresh is done, and only then is it sent to the framebuffer. All is well. There are a few other things that help with IQ and such, but that's not what I'm discussing.
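Here's a little toy timing loop showing what that stall buys you (purely illustrative, with the same made-up 85Hz / 200FPS numbers; it doesn't reflect any particular driver): the displayed rate gets capped at the refresh rate, and the time spent waiting is time the CPU could spend on something else.

/* Toy simulation, not driver code: with V-Sync on, a finished frame
 * waits for the next vertical retrace before it is flipped, so the
 * whole pipeline is throttled to the refresh rate. */
#include <stdio.h>

int main(void)
{
    const double refresh_hz = 85.0;    /* monitor                  */
    const double render_fps = 200.0;   /* what the game could push */
    const double sim_time   = 1.0;     /* simulate one second      */

    double refresh_period = 1.0 / refresh_hz;
    double render_period  = 1.0 / render_fps;

    int frames_displayed = 0;
    double t = 0.0, next_vblank = refresh_period, stalled = 0.0;

    while (t < sim_time) {
        t += render_period;            /* finish rendering a frame    */
        if (t < next_vblank) {         /* ...then stall until retrace */
            stalled += next_vblank - t;
            t = next_vblank;
        }
        next_vblank += refresh_period;
        frames_displayed++;            /* flip on the retrace         */
    }

    printf("frames displayed in 1s: %d (refresh is %.0fHz)\n",
           frames_displayed, refresh_hz);
    printf("time spent stalled    : %.0f%% (free for physics/AI/sound)\n",
           100.0 * stalled / sim_time);
    return 0;
}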

My point is that 200FPS (or whatever) is all fine and good, just like any other sort of ego-stroking, but it does you no good. If anything, it does more harm than good: you're pushing framerates that high for no reason, and other things suffer for it. If you're not wasting CPU horsepower on additional frames you'll never see, it can go toward things like better sound, better physics, better IQ and, god forbid, better AI.

The only saving grace in the future will be pure digital displays where each pixel on the display is hooked directly to a "pixel" in the framebuffer, independent of any refresh or scan rate. This will be a fundamental change in display technology, allowing for nearly infinite framerates to be displayed without penalty as each pixel can change independently at any time to any color. Of course this technology will not be happening any time soon...

Cliffs:
Since everyone likes analogies, it's like a 1.8L inline 4 in a Sentra versus the 8.0L V-10 in a Viper... Sure, the V-10 can push out a lot more at the high end, but in the end, the speed limit on the highway is still only 65MPH.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
But for all intents and purposes, right now, consider V-Sync as off. Sure, your game might be displaying 200FPS, but your monitor, regardless of what's going on inside the computer, is only going to display up to your refresh rate. Period. Any additional frames being generated are going to get sent to the frame buffer, and then overwritten with the next frame before the framebuffer is sent to the monitor on the next refresh interval.

That isn't how it works: when the back buffer is flipped, the prior frame is dropped and your monitor picks up drawing the new frame in the middle of the old one. As an example: if you are pushing a mid-level refresh rate of, say, 100Hz and you are pushing 300FPS, then your monitor will draw the first third of the screen from the first frame, the second third from the second frame, and the final third from the third frame.
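That behaviour is easy to see in a toy scanout model (everything here is illustrative: the 600-scanline figure is made up, and this isn't how real hardware is programmed):

/* Toy scanout model of the example above: 100Hz refresh, 300FPS,
 * V-Sync off. Prints where on the screen each new frame takes over. */
#include <stdio.h>

int main(void)
{
    const int    lines_per_refresh = 600;   /* illustrative scanline count */
    const double refresh_hz        = 100.0;
    const double render_fps        = 300.0;

    int last_frame = -1;
    for (int line = 0; line < lines_per_refresh; line++) {
        /* Which rendered frame is in the front buffer as this scanline
         * goes out to the monitor? */
        int frame = (int)(line * render_fps / (refresh_hz * lines_per_refresh));
        if (frame != last_frame) {
            printf("scanline %3d: now showing frame %d%s\n",
                   line, frame, frame ? " (tear boundary)" : "");
            last_frame = frame;
        }
    }
    return 0;
}

It prints frame 0 at scanline 0, frame 1 at scanline 200, and frame 2 at scanline 400: three different frames stitched together in a single refresh.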

To the general point: when you are pushing 200FPS, that means you can increase the settings without having to worry about slowdown. Also, average is just that, average. An average of 500FPS means little if your minimum is 10FPS right in the middle of the most intense action.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: BenSkywalker
But for all intents and purposes, right now, consider V-Sync as off. Sure, your game might be displaying 200FPS, but your monitor, regardless of what's going on inside the computer, is only going to display up to your refresh rate. Period. Any additional frames being generated are going to get sent to the frame buffer, and then overwritten with the next frame before the framebuffer is sent to the monitor on the next refresh interval.

That isn't how it works: when the back buffer is flipped, the prior frame is dropped and your monitor picks up drawing the new frame in the middle of the old one. As an example: if you are pushing a mid-level refresh rate of, say, 100Hz and you are pushing 300FPS, then your monitor will draw the first third of the screen from the first frame, the second third from the second frame, and the final third from the third frame.

To the general point: when you are pushing 200FPS, that means you can increase the settings without having to worry about slowdown. Also, average is just that, average. An average of 500FPS means little if your minimum is 10FPS right in the middle of the most intense action.

I dumbed it down a bit, Ben. The above situation occurs on some driver implementations (the driver will lock the framebuffer on a refresh and discard new frames, though most cards/drivers don't follow this practice for framerate reasons these days). If you read down a bit further, I say the same thing:

Worse yet, with a CRT's "scanning" technology, if you're pushing more frames into the framebuffer than the CRT can display per refresh, there is a very distinct possibility that you're going to display part of frame A during the top portion of a scan, have the framebuffer rewritten before the refresh is over, and end up with part of frame B displayed below it. Oh hey! That's what's called "tearing".

Still, to the general point, you are wasting frames and CPU/GPU cycles which could be better spent elsewhere.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
(the driver will lock the framebuffer on a refresh and discard new frames, though most cards/drivers don't follow this practice for framerate reasons these days)

That is VSync.
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: BenSkywalker
(the driver will lock the framebuffer on a refresh and discard new frames, though most cards/drivers don't follow this practice for framerate reasons these days)

That is VSync.

Actually, that's half of V-Sync. Most drivers will stall the rendering path entirely when waiting on V-Sync, and not allow any further rendering at all. At least, that's what was happening back in the GeForce 1/2 driver days when I was working with them. They may have changed it to stall and discard at the SwapBuffers() call these days. It's been a while since I've dealt with that.
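For what it's worth, the "stall and discard at the swap" variant would look something like this in spirit (a purely hypothetical sketch with made-up 85Hz / 200FPS numbers, not the behaviour of any specific driver): the game keeps rendering flat out, and any frame that finishes before the next retrace simply gets thrown away.

/* Hypothetical sketch of "keep rendering, discard extra frames".
 * Illustrative only -- not how any actual driver is written. */
#include <stdio.h>

int main(void)
{
    const double refresh_hz = 85.0, render_fps = 200.0, sim_time = 1.0;
    double refresh_period = 1.0 / refresh_hz;
    double render_period  = 1.0 / render_fps;

    int rendered = 0, displayed = 0, discarded = 0;
    double next_vblank = refresh_period;

    for (double t = render_period; t < sim_time; t += render_period) {
        rendered++;                  /* a frame just finished rendering */
        if (t >= next_vblank) {      /* a retrace has passed: flip it   */
            displayed++;
            next_vblank += refresh_period;
        } else {
            discarded++;             /* overwritten before it was shown */
        }
    }

    printf("rendered=%d  displayed=%d  discarded=%d\n",
           rendered, displayed, discarded);
    return 0;
}

Either way the screen still only updates 85 times a second; the difference is just whether the extra work gets done and thrown away, or never gets done in the first place.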
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: SunnyD
Originally posted by: BenSkywalker
(the driver will lock the framebuffer on a refresh and discard new frames, though most cards/drivers don't follow this practice for framerate reasons these days)

That is VSync.

Actually, that's half of V-Sync. Most drivers will stall the rendering path entirely when waiting on V-Sync, and not allow any further rendering at all. At least, that's what was happening back in the GeForce 1/2 driver days when I was working with them. They may have changed it to stall and discard at the SwapBuffers() call these days. It's been a while since I've dealt with that.

It's not all about the average, as said earlier.
If I spend 50% of the time at 30fps and the other 50% at 150fps, that's a 90fps average, but it's obviously not smooth. It's going to be nasty. Average FPS by itself may not be as essential a metric as it once was, and you can see that reflected on review sites, many of which now publish both average and minimum frame rates.
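The arithmetic behind that (hypothetical numbers, not a real benchmark):

/* Half a second at 30FPS, half a second at 150FPS. The average looks
 * fine; the minimum is what you actually feel. Illustrative only. */
#include <stdio.h>

int main(void)
{
    double slow_fps  = 30.0, fast_fps  = 150.0;
    double slow_time = 0.5,  fast_time = 0.5;   /* seconds in each state */

    double frames = slow_fps * slow_time + fast_fps * fast_time;  /* 90 */
    double total  = slow_time + fast_time;

    printf("average FPS: %.0f\n", frames / total);   /* 90 */
    printf("minimum FPS: %.0f\n", slow_fps);         /* 30 */
    return 0;
}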
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: kurt454
I could live with a minimum of 85 fps in FEAR with max settings. ;)

Or rather, a maximum of 85fps.

But I do agree, this issue is overblown (monitor limiting frames, blah blah). 60fps is perfect if it's pinned @60 the whole time.

It's the fluctuations in FPS and the dips into the 20's and 30's that, IMO, make modern games appear choppy at times.