
Refresh rate vs. Frame rate

gman61

Junior Member
😕 My monitor is set to an 85 Hz refresh rate, which I understand to mean that the monitor will draw 85 new screens per second. If that's true, what is the advantage of having a system that can put out more than 85 frames per second? It seems to me that if a system puts out 100 frames per second, 15 of those frames would get lost somewhere along the line between the video card and the monitor. Am I wrong? Please explain.
 
They don't really get lost; you can just be sure you won't be seeing any dropped frames. Since FPS tends to fluctuate a lot while playing a game, having a computer that can pump out 100+ FPS is a very good thing: even though that machine can push 100 FPS, it's bound to hit slowdowns that cost up to 20 FPS at times. Now if you have a system that can only pump out 80 or 85 FPS, that same 20 FPS dip brings your frame rate down to 60-65 FPS, which really takes away from the smoothness of gameplay. Most people have trouble perceiving much beyond roughly 75 FPS, so as long as you keep both your refresh rate and your frame rate at or above that level consistently, your gameplay will be as smooth as pie.
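A minimal sketch of the headroom argument above: the monitor can never show more frames per second than its refresh rate, so extra rendering capacity doesn't appear on screen directly, but it keeps the displayed rate near the cap when the card slows down. The function name and numbers here are just illustrative, not from any real API.

```python
REFRESH_HZ = 85

def displayed_fps(render_fps, refresh_hz=REFRESH_HZ):
    """Effective on-screen rate: the display can't show more frames
    than it refreshes, so the rate is capped at the refresh rate."""
    return min(render_fps, refresh_hz)

# A card averaging 100 FPS that dips by 20 still nearly fills every refresh:
print(displayed_fps(100 - 20))   # 80 -> still close to the 85 Hz cap
# A card averaging 85 FPS suffering the same 20 FPS dip drops visibly:
print(displayed_fps(85 - 20))    # 65 -> noticeably less smooth
```

The point is that headroom above the refresh rate is a buffer against slowdowns, not wasted frames.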
 
As long as v-sync is enabled, you will not lose frames. Most video cards have it enabled by default. Basically, the card knows not to draw frames faster than the monitor can display them. If v-sync is disabled, you may notice strange artifacts (tearing) if your card is doing 100 FPS when your monitor can only show 85.
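A rough simulation of what v-sync does, under an idealized assumption (vertical blanks fall exactly on multiples of the refresh period, and the card blocks at buffer swap until the next blank). With v-sync on, a card capable of 100 FPS simply waits a little each frame, so it ends up rendering about 85 frames per second rather than losing any. The function and its parameters are hypothetical, purely for illustration.

```python
import math

def render_loop_with_vsync(render_time, duration, refresh_hz=85.0):
    """Count frames shown in `duration` seconds when the card blocks
    at each buffer swap until the next vertical blank, so it never
    outpaces the monitor."""
    period = 1.0 / refresh_hz
    t, shown = 0.0, 0
    while t < duration:
        t += render_time                       # render one frame
        t = math.ceil(t / period) * period     # block until next vblank
        shown += 1
    return shown

# Card capable of 100 FPS (10 ms per frame) on an 85 Hz monitor, for 1 second:
print(render_loop_with_vsync(0.010, 1.0))  # ~85 frames, capped by the refresh rate
```

Note that the cap holds no matter how fast the card is: halving the render time to 5 ms still yields roughly 85 frames per second, because the swap always waits for the next blank.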
 