Refresh rate vs. Frame rate

gman61

Junior Member
Nov 29, 2001
:confused: My monitor is set to an 85 Hz refresh rate, which I understand to mean that the monitor will draw 85 new screens per second. If this is true, what is the advantage of having a system that can put out more than 85 frames per second? It seems to me that if a system can put out 100 frames per second, then 15 of those frames would get lost somewhere along the line between the video card and the monitor. Am I wrong? Please explain.
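To put rough numbers on that intuition, here is a minimal sketch (plain Python, with an assumed 85 Hz display and a renderer finishing frames at a steady 100 fps, both taken from the question) that counts how many rendered frames actually get scanned out; the rest are overwritten before the monitor ever draws them:

```python
# Toy model, not any particular driver's behaviour: a renderer producing frames
# at 100 fps against a display refreshing at 85 Hz. Each refresh shows whatever
# frame finished most recently; frames replaced in between are never displayed.

RENDER_FPS = 100      # assumed renderer output (from the question)
REFRESH_HZ = 85       # assumed monitor refresh rate (from the question)

frame_times = [i / RENDER_FPS for i in range(RENDER_FPS)]    # when each frame is ready
refresh_times = [i / REFRESH_HZ for i in range(REFRESH_HZ)]  # when each refresh starts

displayed = set()
for t in refresh_times:
    # index of the latest frame completed at or before this refresh
    ready = [i for i, ft in enumerate(frame_times) if ft <= t]
    if ready:
        displayed.add(ready[-1])

print(f"rendered: {RENDER_FPS}, displayed: {len(displayed)}, "
      f"never shown: {RENDER_FPS - len(displayed)}")
```

Running it shows roughly 85 of the 100 rendered frames ever reaching the screen in that second, which is the "lost frames" the question is asking about.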

BD231

Lifer
Feb 26, 2001
They don't really get lost; you can just be more confident you won't be seeing any dropped frames. Since frame rates tend to fluctuate a lot while playing a game, having a computer that can pump out 100+ fps is a very good thing: even though that machine can push 100 fps, it's bound to hit slowdowns that cost up to 20 fps at times. Now if you have a system that can only pump out 80 or 85 fps, that same 20 fps drop brings your frame rate down to 60-65 fps, which really takes away from the smoothness of gameplay. Most people can't perceive much benefit beyond roughly 75 fps, so as long as you keep your monitor's refresh rate and your frame rate at or above that level consistently, your gameplay will stay smooth.
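The headroom argument is easier to see with the arithmetic written out. A tiny sketch, using the post's 20 fps worst-case dip and its ~75 fps rule of thumb for smoothness (both rough figures from the post, not measured values):

```python
# Back-of-the-envelope version of the headroom argument above.

SMOOTH_THRESHOLD = 75   # the post's rule of thumb for "smooth enough"
WORST_CASE_DIP = 20     # assumed slowdown in heavy scenes (from the post)

for peak_fps in (100, 85):
    low = peak_fps - WORST_CASE_DIP
    verdict = "still smooth" if low >= SMOOTH_THRESHOLD else "noticeably choppier"
    print(f"peak {peak_fps} fps -> dips to about {low} fps ({verdict})")
```

The 100 fps system dips to about 80 fps and stays above the threshold, while the 85 fps system drops to about 65 fps, which is where the choppiness becomes visible.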

RedShirt

Golden Member
Aug 9, 2000
As long as v-sync is enabled, you will not lose frames. Most video cards have this enabled by default. Basically, the card knows not to present frames faster than the monitor can refresh. If v-sync is disabled, then you may notice some strange things happening, like screen tearing, if your card is doing 100 fps while your monitor can only show 85.
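A toy frame loop illustrates the pacing idea; this is only a sketch, not how a real driver implements v-sync (real v-sync ties buffer swaps to the monitor's vertical blanking interval in hardware). The 85 Hz refresh and the ~10 ms per-frame render cost are assumed numbers for illustration:

```python
import time

REFRESH_HZ = 85
REFRESH_INTERVAL = 1.0 / REFRESH_HZ

def render_frame():
    """Stand-in for actual rendering work (assumed ~1/100 s per frame)."""
    time.sleep(1.0 / 100)

def run(seconds=1.0, vsync=True):
    presented = 0
    start = time.perf_counter()
    next_refresh = start
    while time.perf_counter() - start < seconds:
        render_frame()
        if vsync:
            # Wait for the next refresh boundary before presenting, so frames
            # are never presented faster than the monitor can draw them.
            next_refresh += REFRESH_INTERVAL
            delay = next_refresh - time.perf_counter()
            if delay > 0:
                time.sleep(delay)
        # With vsync off the frame is presented immediately; on a real display
        # this is where tearing can appear, since the swap may land mid-refresh.
        presented += 1
    print(f"vsync={vsync}: presented ~{presented} frames in {seconds:.0f} s")

run(vsync=True)   # capped near 85
run(vsync=False)  # runs as fast as render_frame allows (~100 here)
```

With the cap on, the loop presents roughly 85 frames per second no matter how fast the renderer is; with it off, it presents as fast as it can, which is where the mismatch the question describes shows up.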