Some thoughts about Frames-per-Second in games

ubergeek

Junior Member
Jan 15, 2002
2
0
0
A popular benchmark for the processing capability of the 3D engines of popular video cards is to see how many frames a game can render in a second using a particular hardware platform.

Many think that if a game does not run at 120 Frames/Second at 1600 x 1200, then the video card simply is not powerful enough.

Now let's think about this for a minute.

If your monitor can only display at 85 Hz vertical refresh (effectively 85 frames per second), then why render at 120 FPS?
For a monitor that can display no more than 85 FPS to show a game rendering at 120 FPS, it must DROP 35 frames every second! The 3D hardware will, in some cases, be rendering 2 or more frames into the video buffer before the monitor can actually display them. That is at least one frame of animation lost and that much processing power totally wasted.

Also, what happens when you drop video frames in a movie or any other video stream? It gets choppy, that's what! Turning on the vertical sync will slow the card down to what the monitor can handle, with excellent animation results, but you are now running that state-of-the-art, fastest-on-the-planet video card no faster than one that costs less than half as much. Doesn't that suck!

Yes, game FPS performance numbers are a great way to quantitatively measure just how blazingly fast a video card really is, and they are a great driver for competition between the video chipset makers. But realistically, to get the best, smoothest animation from my games with no dropped frames, I suggest picking the video card that will drive your display equipment at its highest level (or just a bit more) and turning on the V-Sync. 120 FPS in Quake III? Big deal! If my monitor is not going to display anything faster than 85 FPS, the extra 35 FPS is wasted and never seen.
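The arithmetic behind the argument above can be sketched as follows (a minimal illustration of the OP's point, not benchmark code; the function name and figures are my own):

```python
def wasted_frames(render_fps, refresh_hz):
    """Frames rendered each second that the monitor can never display."""
    return max(0, render_fps - refresh_hz)

# The example from the post: a card rendering 120 FPS on an 85 Hz monitor.
print(wasted_frames(120, 85))  # 35 frames per second never reach the screen
print(wasted_frames(60, 85))   # 0 -- nothing wasted below the refresh rate
```

Of course, as the replies below point out, this only counts whole displayed frames; it says nothing about input latency or partially drawn frames.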
 

Necrolezbeast

Senior member
Apr 11, 2002
838
0
0
what about better image quality with AA and aniso? Or maybe that if I want to buy a vid card now I won't have to upgrade in 2 months when a new game comes out and runs really slow. People buy the best card which they can afford for a reason, we don't want to buy a new vid card every 2 months when a new game comes out just to barely be able to play it. We want to be able to keep it in there for as long as possible while running current games and future games at a high fps so we can turn on all the visuals.
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Originally posted by: Asuka
Originally posted by: rbV5
Wrong.


Modern video cards have very little problem with rendering without being sync'd to the monitor refresh rate (vsync disabled). The effect of this is a horizontal "tear" line across the display, which can be somewhat annoying, but YMMV.

From what I recall, one problem with vsync being "ON" is that when the card is buffering the next frame, if the frame takes too long that buffered frame is held until the next refresh. This can actually lead to lower FPS than the monitor refresh, even though the video card is capable of rendering more. I recall reading either an SGI or a 3dfx article which explained that in the worst case timing between video card and monitor, you could actually end up with half the possible rendered frames.
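The worst case described above can be illustrated with a back-of-the-envelope sketch (my own simplification, assuming plain double buffering where a finished frame must wait for the next refresh boundary):

```python
import math

def vsync_fps(render_ms, refresh_hz):
    """Effective FPS with vsync on and simple double buffering:
    a frame that misses one refresh waits for the next one."""
    interval_ms = 1000.0 / refresh_hz               # time between refreshes
    refreshes_per_frame = math.ceil(render_ms / interval_ms)
    return refresh_hz / refreshes_per_frame

# At 85 Hz the refresh interval is ~11.76 ms.
print(vsync_fps(10.0, 85))  # 85.0 -- the frame fits inside one interval
print(vsync_fps(12.0, 85))  # 42.5 -- just misses the interval, FPS halves
```

So a card that renders only fractionally slower than the refresh rate gets punished all the way down to half the refresh, which matches the SGI/3dfx worst case mentioned above. Triple buffering is the usual workaround, at the cost of extra latency and video memory.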

So, your claims of "throwing away" FPS only apply when vsync is on. When vsync is off, the effect is that you sometimes get "half frames," where 2 frames are blended together on screen. That's a poor way of describing it, but basically you end up not being limited by synchronizing with the monitor refresh.

When playing multiplayer games, I don't mind vsync "off" at all. If I'm playing a uber cool single player game I'll turn it on because I believe that it improves the visual quality.


Many think that if a game does not run at 120 Frames/Second at 1600 x 1200, then the video card simply is not powerful enough.


Gosh, I hope many people think that way :)
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
if a game does not run at 120 Frames/Second at 1600 x 1200, then the video card simply is not powerful enough.
In some games, yes. If the timedemos are generating 120 FPS average then you will get slowdowns on the larger levels (eg Quake3).

If your monitor can only display at 85 Hz Vertical (realistically frames per second), then why render at 120 FPS?
Because you can still see the effects of partially drawn frames in the form of better mouse sensitivity, response and smoothness.
 

Emultra

Golden Member
Jul 6, 2002
1,166
0
0
Remember, extra power is never obsolete, since you can convert performance into quality.
 

Emultra

Golden Member
Jul 6, 2002
1,166
0
0
Oh, and remember also that it's good to have some spare FPS in massive battles, where the FPS gets lowered.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:eek: Perceptible FPS and Vsync are murky subjects and mostly down to personal preference.

:) Vsync will avoid the cutting/tearing of horizontal lines across the screen, but many people, whether from poor monitors, high resolutions or OS quirks, run at 60Hz refresh rates. With Vsync you are likely to get between 50% and 80% of your refresh rate as FPS; at 60Hz this means 30-48 FPS, which is NOT good news.
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:D I agree that it's easy to get carried away by silly FPS numbers, and for most people Vsync on will be preferable, provided they can run 85Hz+, as many people will notice tearing but find little benefit going over 60 FPS. Benchmarkers need to disable Vsync; otherwise results would be boring and similar right up until the card can no longer match the refresh rate. Anyway, it is generally a good idea to run games between 60 and 100 FPS; an average FPS figure means the actual lowest FPS is significantly lower. There is always a compromise, even before detail settings: do you want 800x600 at 100FPS, 800x600 with AA at 70FPS, or 1024x768 at 70FPS? Very much down to personal preference.

Amo.net - nice link
 

Sachmho

Golden Member
Dec 6, 2001
1,197
0
0
we don't want to buy a new vid card every 2 months when a new game comes out just to barely be able to play it. We want to be able to keep it in there for as long as possible while running current games and future games at a high fps so we can turn on all the visuals.

i've had my gf3 for a year now, and i haven't played any game yet that has any visible lag whatsoever, and i'm an avid gamer that plays cutting edge software... your point seems to be moot, my friend....
 

AnAndAustin

Platinum Member
Apr 15, 2002
2,112
0
0
:) Yeah, but Sachmho, don't forget how much you paid for your GF3 card a year ago. IIRC the point was there's no point in cheaping out and buying a $60 GF2 card when the extra $30 for a GF3 or Rad8500 will last you a whole lot longer. Buying a top card like the Rad9700 or even GF4TI4600 isn't generally a great idea, because they devalue fast and by the time their performance and features truly prove useful there are better and cheaper cards out anyway. For most of us there's no need to buy top of the range, nor to cheap out and buy entry level; you want to be somewhere in between, depending upon your current system, budget and needs (both current and future). That's my take on it anyway. ;)
 

Mingon

Diamond Member
Apr 2, 2000
3,012
0
0
Personally, any card that is displaying less than 50fps minimum is too slow; 25fps is about playable in an RTS or something like Dungeon Siege, but more is always better. To manage this minimum you really need your video card to produce twice that amount for the times when the card slows down. Basically, what you're saying has some truth as long as your vsync refresh rate is equal to your average frame rate.
 

Sachmho

Golden Member
Dec 6, 2001
1,197
0
0
Yeah, but Sachmho, don't forget how much you paid for your GF3 card a year ago. IIRC the point was there's no point in cheaping out and buying a $60 GF2 card when the extra $30 for a GF3 or Rad8500 will last you a whole lot longer.

yes, i do remember... it was $100 at bestbuy :)

i guess that solves that
 

merlocka

Platinum Member
Nov 24, 1999
2,832
0
0
Originally posted by: Sachmho
Yeah, but Sachmho, don't forget how much you paid for your GF3 card a year ago. IIRC the point was there's no point in cheaping out and buying a $60 GF2 card when the extra $30 for a GF3 or Rad8500 will last you a whole lot longer.

yes, i do remember... it was $100 at bestbuy :)

i guess that solves that

LOL
 

rogue1979

Diamond Member
Mar 14, 2001
3,062
0
0
I believe that smooth gameplay at resolutions that you like is more than good enough. If you prefer 1600 x 1200 with lots of stuff turned on, then you need a Geforce 4. If 1280 x 1024 or 1024 x 768 is good enough, then an overclocked Geforce 2 Pro/Ti/Ultra or Geforce 3 is plenty of video power. In some of my games 40-60fps is enough. In first person shooters I like 60-80fps. This again is personal preference; others may feel like they need more speed.
 

swatoa

Junior Member
Apr 9, 2002
21
0
0
i've had my gf3 for a year now, and i haven't played any game yet that has any visible lag whatsoever, and i'm an avid gamer that plays cutting edge software... your point seems to be moot, my friend...

Hell, I could very well say the same thing about this Geforce 2 Pro. All games, up until Battlefield 1942 and America's Army, play at very respectable framerates at max settings. Video tearing is another story. Could my monitor be the culprit?