
Can someone explain the benefit of 100 fps in a game?

AluminumStudios

Senior member
I'm sorry if this has been asked before.

I'm not a gamer, but I'd like to understand why such high FPS is so important in games. If your monitor is set at 75 Hz or 85 Hz, or if you have an LCD, which has slower response times than a CRT, how does 100+ FPS in a game make it better? One would imagine that the highest FPS you'd actually see would be equal to the refresh rate of your monitor, wouldn't it?
 
What everyone wants is for the game to be totally fluid, in other words no hitches or stuttering during any part of the game. I've read in many threads that anything over 60 to 80 fps is not discernible to the human eye. Of course this is also disputed, but the point is smooth gaming with no tearing, which is why vsync is usually turned on. Is HL2 better at 200 fps than at a constant 85? IMO, no.
 
If you have vsync on, you cannot get more frames than your refresh rate.
Measuring fps is mostly used to tell how much eye candy your video card can handle compared to other cards. A high fps has the benefit of probably also having a high minimum fps. Most benchmarks only look at the mean fps (try looking at hardocp.com to see min and max fps); if the mean is 100 fps, this could be because some parts of the benchmark run at 200 fps and others at 25.
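To make that last point concrete, here's a quick sketch with made-up frame-time numbers: one second rendered at 200 fps followed by one second at 25 fps averages out to over 100 fps, even though the slow stretch is a slideshow.

```python
# Hypothetical benchmark: 1 second at 200 fps, then 1 second at 25 fps.
frames_fast = [1 / 200] * 200   # 200 frames, each taking 5 ms
frames_slow = [1 / 25] * 25     # 25 frames, each taking 40 ms
frame_times = frames_fast + frames_slow

total_time = sum(frame_times)             # about 2 seconds of "gameplay"
mean_fps = len(frame_times) / total_time  # what most benchmarks report
min_fps = 1 / max(frame_times)            # what you actually feel

print(f"mean: {mean_fps:.1f} fps, min: {min_fps:.1f} fps")
```

The mean comes out around 112 fps while the minimum is 25 fps, which is exactly why the min/max numbers in a review matter more than the average.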
 
I think the main issue to understand is that the 100+ FPS is a maximum frame rate. Depending on what is happening in the game (explosions, outdoor scenes, running, etc.), it can drop much lower.

So by starting with a very high frame rate, you give yourself headroom so that when these things happen, your framerate won't drop below 60-80 FPS.

That said, I agree with GrumpyMan that there is no reason not to simply lock it at something like 85 FPS. Your system will then run cooler with lower power consumption, and most likely at a constant level instead of constantly speeding up and slowing down.
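A frame limiter is basically just a sleep loop: render the frame, then idle away whatever is left of the frame budget. A rough Python sketch (not how a real engine does it; `render` here is a stand-in callback):

```python
import time

TARGET_FPS = 85
FRAME_TIME = 1.0 / TARGET_FPS  # ~11.8 ms budget per frame

def run_frames(n_frames, render):
    """Cap the frame rate by sleeping away the time left over after
    rendering each frame, instead of rendering useless extra frames."""
    for _ in range(n_frames):
        start = time.perf_counter()
        render()  # do the actual work for this frame
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)  # GPU/CPU idle = cooler, quieter

# 85 "frames" of no work still take about one second:
run_frames(85, lambda: None)
```

The sleep is what saves the heat and power: the card only does 85 frames of work per second no matter how fast it could go.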
 
Yeah, I think the important thing in gaming is the minimum framerate during heavy scenes. If it dips to 25, then it's a slideshow and therefore sucks, but if you can keep it at a constant 60 to 85 at all times, then you're golden. The rest is for benchmarking to see how powerful your system is.
 
What GimpyOne said. Also, for good examples of this check out the video card reviews at HardOCP.com since they focus a lot on tracking average and minimum framerates, not just average or maximum.

In some games, for some cards, the average might be 60 fps but drop to 10 fps during an explosion or other "busy" scene -- at that point your smooth gameplay just got choppy, and you'd wish for the card that did 90 fps average and only dropped to 25 fps.

But I agree, a max framerate limiter makes good sense, like Doom3 normally capping it at 60 fps. Why overheat your card and CPU rendering useless extra frames?
 
Thanks for the responses, those answers make sense -- i.e., if a video card can do 150 FPS, then under a stressful scene it won't slow down as much ...
 
Hmmm. I think movies play at about 24 FPS? So a video card that plays at 100+ FPS should be able to have all of the effects etc. turned on and still be playable.

 
What people don't understand is that having vsync off does NOTHING for anyone. The monitor only displays an image so many times a second, and no matter what the video card does, vsync on or off, you won't see any more fps than the monitor is showing.

Ok, here is an example.

200 fps video --- 85 Hz monitor -- Your game tells you 200 fps, but you can only see 85.
With vsync on it will be the same thing, except the real fps the video card renders is limited to the refresh rate, so you see the same fps, but with no tearing. Having vsync off is absolutely pointless unless you're benchmarking.
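The rule of thumb here is trivial to write down: the fps you actually see is the render rate capped at the refresh rate (a simplification that ignores tearing artifacts, but it captures the point).

```python
def visible_fps(render_fps, refresh_hz):
    """The monitor shows at most one new frame per refresh; any extra
    rendered frames are never fully displayed (they just cause tearing)."""
    return min(render_fps, refresh_hz)

# 200 fps rendered on an 85 Hz monitor: you still only see 85 updates/sec.
assert visible_fps(200, 85) == 85
# A 60 fps render on the same monitor is shown in full.
assert visible_fps(60, 85) == 60
```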
 
Originally posted by: Teuton
Hmmm. I think movies play at about 24 FPS? So a video card that plays at 100+ FPS should be able to have all of the effects etc. turned on and still be playable.

TV/movies have motion blur, so the comparison doesn't hold -- each film frame blurs motion together, which is why 24 fps looks smooth on film but choppy in a game that renders sharp, discrete frames.
 
One disadvantage of having vsync turned on is that in some demanding scenes, your framerate might drop to exactly half or a third of your refresh rate, rather than falling gradually. I've seen this happen in Quake3, and I'm not exactly sure why.
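One common explanation, assuming double buffering (this is a general model, not something verified for Quake3 specifically): with vsync on, a finished frame has to wait for the next refresh, so a frame that takes even slightly longer than one refresh interval ends up occupying two, snapping the framerate to refresh/2, refresh/3, and so on instead of degrading gradually.

```python
import math

def vsync_fps(refresh_hz, render_fps):
    """Model of vsync with double buffering: each frame is held on screen
    for a whole number of refresh intervals, so the effective framerate
    is quantized to refresh, refresh/2, refresh/3, ...
    (Triple buffering largely avoids this effect.)"""
    refresh_interval = 1.0 / refresh_hz
    render_time = 1.0 / render_fps
    # How many whole refresh intervals each frame occupies:
    intervals = math.ceil(render_time / refresh_interval)
    return refresh_hz / intervals

# A card that could manage 80 fps on an 85 Hz monitor misses every other
# refresh and gets locked down to exactly half the refresh rate:
assert vsync_fps(85, 80) == 42.5
```

So a small dip below the refresh rate produces a big visible drop, which matches the "exactly half or a third" behavior described above.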
 