Frames Per Second

MrDeuce

Junior Member
May 29, 2000
11
0
0
How is FPS different from refresh rate, and why does my FPS drop so much in some games? Also, what are some ways to increase FPS?
 

Muerto

Golden Member
Dec 26, 1999
1,937
0
0
To tell you the truth, I'm not 100% sure about the first question. My guess would be that in Windows the images on the screen usually stay the same, so they don't need to be redrawn as new frames; they just need to be refreshed.

Your frame rate probably drops because there is too much happening for your system to handle quickly (if you post some system specs we could probably identify the exact problem).

If you want to increase your frame rate you can do two things: either turn down the detail levels in your games, or upgrade your computer (and make sure you have the latest drivers for all your components). Hope this helps. :)
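
To put some rough numbers behind "too much happening": every frame has a fixed time budget, and the busier the scene, the more of that budget your CPU and video card burn through. Here's a quick Python sketch; frame_budget_ms is just a made-up helper name for illustration, and the math is nothing more than 1000 divided by the FPS:

```python
# Rough frame-time budget (illustrative only): at a given FPS, the CPU and
# video card together get 1/FPS seconds to finish all the work for one frame.
def frame_budget_ms(target_fps: float) -> float:
    """Milliseconds available per frame at a given frame rate."""
    return 1000.0 / target_fps

for fps in (30, 60, 85, 100):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):5.1f} ms per frame")

# Prints roughly:
#  30 fps ->  33.3 ms per frame
#  60 fps ->  16.7 ms per frame
#  85 fps ->  11.8 ms per frame
# 100 fps ->  10.0 ms per frame
```

So the more detail you turn on, the harder it is to fit inside that budget, and the FPS drops.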
 

ssjgokou1

Banned
Jul 2, 2000
190
0
0
Refresh rate is different from FPS. If the screen is at 85Hz and you're getting half of that (42.5fps), the monitor will display each frame twice before the graphics card draws another one. In some cases the FPS is greater than the refresh rate, at which point the monitor doesn't display every frame, because some of them go by too fast for the monitor to show them. Even if you had a 240Hz refresh rate, your eyes won't really detect anything much greater than 60fps, so comparing 200fps and 80fps is almost the same thing to your eyes; those numbers are mainly for benchmarking.
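
A back-of-the-envelope way to see the frame-doubling and frame-dropping part, assuming perfectly even frame pacing (real games aren't that smooth). The name refreshes_per_frame is just made up for the example, not anything from a real API:

```python
# Illustrative sketch: how many monitor refreshes each rendered frame covers
# when FPS and refresh rate differ (assumes perfectly even frame pacing).
def refreshes_per_frame(refresh_hz: float, fps: float) -> float:
    """Average number of times the monitor shows each rendered frame."""
    return refresh_hz / fps

print(refreshes_per_frame(85, 42.5))  # 2.0 -> each frame is shown twice
print(refreshes_per_frame(85, 170))   # 0.5 -> only every other frame reaches the screen
```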

Do what the other guy said to get more FPS.
 

jmcoreymv

Diamond Member
Oct 9, 1999
4,264
0
0
Refresh rate, I believe, is measured in Hz and is how fast your monitor redraws the picture on the screen; the higher the refresh rate, the lower the flicker. I keep my refresh rate at 85Hz. FPS is basically, I think, how many frames per second the computer can send out to the monitor. I'm not good with words for describing it, but like... ahh, forget it!!! I feel like I'm choking on my tongue. I'll let someone more professional answer it.
 

AMB

Platinum Member
Feb 4, 2000
2,587
0
0
At what point can't the human brain detect an increase in FPS?
 

Muerto

Golden Member
Dec 26, 1999
1,937
0
0
ssjgokou1,

Thanks for clearing that up. I wasn't quite sure on that either. So you're saying that the monitor and the video card refresh images independently of one another? If the monitor is set to a certain refresh rate, it just keeps refreshing whatever the video card sends it?
 

ssjgokou1

Banned
Jul 2, 2000
190
0
0
It depends on whether V-sync is on or not. If it is, then the card will try to "sync" the frames so they correspond to the monitor: for instance, suppose you run at 60Hz, then your graphics card won't do anything greater than 60fps, because it has to sync the frames to the speed at which the monitor displays them. If you turn it off, the video card will output as many FPS as it can while the monitor tries to keep up; if it can't, you get poor quality and lots of visual errors. Again, disabling V-sync is mainly for benchmarking purposes, just to get an estimate of what your video card/processor can do :)
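
If it helps, here's a toy Python sketch of what a frame loop roughly does with V-sync on versus off. It's a big simplification (real drivers wait on the monitor's vertical blank in hardware, not with busy loops), and render_loop, fake_render, and frame_work_s are made-up names for the example:

```python
import time

def fake_render(frame_work_s: float) -> None:
    """Busy-wait to stand in for the time the game spends drawing one frame."""
    end = time.perf_counter() + frame_work_s
    while time.perf_counter() < end:
        pass

def render_loop(vsync: bool, refresh_hz: float, frame_work_s: float, frames: int = 90) -> float:
    """Achieved FPS of a toy render loop; with vsync on, each frame waits for the next refresh."""
    refresh_interval = 1.0 / refresh_hz
    start = time.perf_counter()
    next_refresh = start
    for _ in range(frames):
        fake_render(frame_work_s)
        if vsync:
            next_refresh += refresh_interval
            while time.perf_counter() < next_refresh:  # wait for the next refresh boundary
                pass
            next_refresh = max(next_refresh, time.perf_counter())  # don't pile up if we missed one
    return frames / (time.perf_counter() - start)

# A renderer that needs 2 ms per frame, on a 60 Hz monitor:
print(round(render_loop(vsync=True,  refresh_hz=60, frame_work_s=0.002)))  # ~60, capped at the refresh rate
print(round(render_loop(vsync=False, refresh_hz=60, frame_work_s=0.002)))  # ~500, limited only by the "render" time
```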
 

RSI

Diamond Member
May 22, 2000
7,281
1
0
When I disable V-sync it goes faster and looks the same...

-RSI
 

ssjgokou1

Banned
Jul 2, 2000
190
0
0
It only looks the same because your FPS don't exceed the refresh rate. If they do, and V-sync is off, then you see tears on the screen, kinda like someone is ripping apart the gun in Quake III, but a little better, stuff like that.
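
Here's a toy way to picture that tear, very much simplified (a real monitor scans hundreds of lines, and the exact swap timing depends on the card and driver); scanout and swap_at_line are just illustrative names:

```python
# Toy illustration of tearing: the monitor reads the framebuffer top-to-bottom,
# so if the video card swaps to a new frame partway through that read, the top
# of the displayed image comes from the old frame and the bottom from the new one.
LINES = 10  # pretend the screen is 10 scanlines tall

def scanout(old_frame: str, new_frame: str, swap_at_line: int):
    """Which frame each scanline shows if the buffer swap lands mid-scan."""
    return [old_frame if line < swap_at_line else new_frame
            for line in range(LINES)]

# Swap happens while the monitor is on scanline 4 -> visible tear between lines 3 and 4.
print(scanout("frame N", "frame N+1", swap_at_line=4))
# ['frame N', 'frame N', 'frame N', 'frame N',
#  'frame N+1', 'frame N+1', 'frame N+1', 'frame N+1', 'frame N+1', 'frame N+1']
```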