Originally posted by: Dance123
Hi,
1/ First of all, what do most people consider a "smooth framerate"? Is it true that this is 70fps? What do you think?
2/ Second, is it true that you can get certain problems if your framerate is the same as your refresh rate, or something like that? I believe I once read something about it. Does anybody know more about this?!
3/ I believe a 75Hz refresh rate is good enough, but if you can select something higher, like 100Hz or above, should you? Are there any benefits to going higher than 75Hz, and what are the possible disadvantages of going (too) high? What are the guidelines here? Can a higher refresh rate lower performance or something? Anything else I should know?
I hope you can help me with these questions! Thanks!!
1. A smooth framerate is subjective; it can range from 30 to 80fps or higher. Personally, 50fps is smooth enough for me. I can accept 30fps if necessary, but nothing below that.
2. If you have V-Sync on, then having your fps fall below your refresh rate will decrease your average fps by a lot. You will also notice the framerate jumping from 60fps one second to 30fps the next (if your refresh rate happens to be 60Hz). If you have V-Sync off and your fps goes above your refresh rate, it can increase the occurrence of tearing, which is when part of the screen shows the current frame and part shows the previous one.
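A quick way to see where those 60-to-30 jumps come from (just a sketch, assuming plain double-buffered V-Sync where a late frame has to wait for the next vblank; the function name is mine, not from any API):

```python
import math

def vsync_fps(render_ms, refresh_hz=60):
    """Effective fps when each finished frame must wait for the next vblank."""
    vblank_ms = 1000.0 / refresh_hz
    # The frame is shown at the first vblank after rendering finishes,
    # so its on-screen time rounds UP to a whole number of vblank intervals.
    intervals = math.ceil(render_ms / vblank_ms)
    return refresh_hz / intervals

print(vsync_fps(16.0))  # renders just under one vblank -> 60 fps
print(vsync_fps(17.0))  # barely misses the vblank      -> 30 fps
```

So missing the 16.7ms deadline by even half a millisecond halves your framerate, which is exactly the jumpy behavior described above.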
3. If you have a CRT, then 85Hz is the minimum for flicker-free viewing. Unless you notice flicker at that refresh rate, there is no need to set it higher. However, setting it higher may help smooth out the framerate jumps that occur with V-Sync enabled. Higher refresh rates also shorten the life of a CRT, and if you push the rate too high relative to the resolution, you can get degraded image quality. If you have an LCD screen, there is no benefit to running it out of spec. Even if you have an LCD that accepts 75Hz, the panel won't be fast enough to show more than 60fps without ghosting.
Now to kill these spammers...
Originally posted by: KeepItRed
60+ is what the eye perceives as smooth. Anything over 60 cannot be seen or noticed by the eye. Anything under 60, and you'll notice it's becoming a slideshow.
Ha ha. What a funny guy.
Originally posted by: Griswold
Because it's biological fact. Your eye/brain cannot see those 115fps, but due to the way objects are rendered and presented on screens, you will feel it as smoother - not because your eye/brain can register each of the 115 frames per second. Otherwise you could distinguish between a slow-motion scene at 50fps and one at 100fps, which you can't. It's the fast-moving scenes where it matters.
Just because you don't notice the flicker doesn't mean you can't see the difference. Remember, CRTs have phosphors, which hold light and glow for a certain period of time - that's why we don't notice the flicker. Biological fact? You're right about noticing it in fast motion, and that is precisely why we can see 115fps. If you tune your CRT to 100Hz, I bet you'll notice that the mouse feels much more responsive compared to a lower refresh rate such as 85Hz. Why? Because you see the extra frames. You aren't moving the mouse any faster; the display is just showing it to you more often. This is proof that you see those 115fps.
Originally posted by: destrekor
It's a joke that people think there is a competitive edge to having 100fps over 50fps. You're not rendering the same number of frames at half the speed, or else it would look like a slow-motion video. Games are set to render copies of frames. It's illogical that there would be a gaming edge.
Now run a demo loop (like Quake 4 or Half-Life 2 demo recordings), and there is a set number of frames. So on some systems, some scenes may look like slow motion while others look fast-forwarded, depending on the system playing the demo file as well as the system that recorded it. Games are not played in this manner - again, because either 50fps would look like slow motion next to a 100fps system, or the 100fps system would look like the game is on fast-forward compared to the 50fps rig.
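For what it's worth, the reason live gameplay doesn't turn into slow motion on a slower rig is that games typically scale movement by the elapsed frame time (delta time) instead of stepping a fixed amount per frame. A toy sketch of the idea (illustrative names only, not any real engine's API):

```python
def simulate(fps, seconds=1.0, speed=100.0):
    """Advance a position at `speed` units/sec, stepping once per frame."""
    dt = 1.0 / fps          # elapsed time per frame, in seconds
    pos = 0.0
    for _ in range(int(fps * seconds)):
        pos += speed * dt   # the per-frame step shrinks as fps grows
    return pos

print(simulate(50))   # ~100 units covered after one second
print(simulate(100))  # ~100 units as well; only the smoothness differs
```

Both machines cover the same distance per real second; the faster one just draws more intermediate frames along the way.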
There is a competitive edge. See, I don't know what you're talking about. Do you know what the difference is between 100fps and 50fps? Smoothness to the eye and response time. The response time is where the advantage is. Quicker reflexes, that sort of thing - think about it.
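To put rough numbers on the response-time claim (back-of-envelope only, assuming input is sampled once per frame and each frame takes one full frame-time to render, with no pipelining or display latency counted):

```python
def worst_case_latency_ms(fps):
    """Worst-case input-to-screen delay under the simple one-frame model."""
    frame_ms = 1000.0 / fps
    sample_wait = frame_ms   # input lands just after a sample: waits a full frame
    render_time = frame_ms   # the frame that includes it still has to render
    return sample_wait + render_time

print(worst_case_latency_ms(50))   # 40.0 ms
print(worst_case_latency_ms(100))  # 20.0 ms
```

So under these assumptions, doubling the framerate from 50 to 100 cuts the worst-case input lag roughly in half, even if the extra frames themselves look similar.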
Originally posted by: Crazyfool
The key to framerates is not the average framerate but the minimum framerate. If you never dropped below 60fps, you wouldn't be able to tell the difference between 60fps, 150fps, or 5000fps... the human eye/brain cannot process more than 60 frames per second. Period.
You're so smart. Show me the detailed scientific study that says the human eye/brain cannot process more than 60fps.
Originally posted by: Crazyfool
Show me a study where a pilot is shown a white light for 1/220th of a second, then in the same spot a red light for 1/220th of a second, followed by a yellow light for 1/220th of a sec. He will see only one light and you would have me believe he could not only know that it was, in fact, 3 different colors but also that he could tell you what order they were in. :laugh:
That isn't the point. The point is that he was able to see and identify the lights at all. If you sent the lights one right after the other, that would be a lot of different information to take in that fast. On the other hand, seeing one frame of a game right after another, you wouldn't notice that kind of difference - duh, because if you did, the game would look choppy. But you would notice it as smoother than something with a lower fps, and that's enough to show that we can see those frames.
And remember: you have to have a monitor capable of a refresh rate as high as the frame rate you are trying to see, or else you won't notice a difference.