
Why is Avg 60 FPS better than min > 30 FPS?

well then what do you think of television.....the cameras only shoot at 24FPS but last I checked TV wasn't a slideshow or choppy
Motion blur and no interaction help film look smoother, but you'll still see jerkiness during fast camera panning.

60 fps is the highest that humans can detect
We use 60fps as a minimum guide because 60fps is the highest the human eye can detect.
Nonsense. The simplest rebuttal to this is a flickering screen at 60 Hz. You guys can see flicker at 60 Hz refresh right?
 
OK, let's clear a few things up. 60 FPS is the max a human can distinguish between individual frames, but most can tell the difference between 60 and 100 FPS. At 30 FPS, I can easily see when you flash a separate picture for one frame. However, at 100 FPS, the average human could not tell. Still, you can tell a difference over 60FPS, even if you cannot "see" each individual frame. It just makes it even smoother.

TV works at lower FPS because the images are combined and "blurred together". There are a few games that have experimented with this type of rendering (just render 2-3 frames at a time to a surface, then flip that surface to the screen). This works best in games that will have VERY high FPS (well over 100). Most monitors cannot display 200 Hz, but if you combine 2 frames, the monitor can display 100 Hz... It takes a bit of processing overhead to actually write the surfaces, but it really doesn't matter when you already have over 100 FPS.
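
Here's a minimal sketch of that frame-blending trick; it's my own illustration rather than any real engine's code, and the NumPy representation of frames is an assumption:

```python
# Average N rendered frames into one displayed frame to fake motion blur.
# Frames are assumed to be (H, W, 3) uint8 arrays of identical shape.
import numpy as np

def blend_frames(frames):
    """Average a list of frames into a single motion-blurred frame."""
    # Accumulate in float to avoid uint8 overflow, then convert back.
    acc = np.zeros(frames[0].shape, dtype=np.float64)
    for f in frames:
        acc += f.astype(np.float64)
    return (acc / len(frames)).round().astype(np.uint8)

# A game rendering 200 FPS internally could blend each pair of
# consecutive frames and flip the result to the screen at 100 Hz.
frame_a = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
frame_b = np.random.randint(0, 256, (480, 640, 3), dtype=np.uint8)
displayed = blend_frames([frame_a, frame_b])
```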

Someone did a few tests on gamedev.net about this. They made two separate videos, one at 60 FPS, the other at 30 but with the images blurred (effectively 60). Most (over 90%) could not tell a difference. He also showed the same video at 30 FPS (non-blurred), and almost everyone could tell that 60 was smoother. If I had the motivation I would make a quick app...
 
You are incorrect SrGuapo. There have been tests that have proven the human eye can see way beyond 60fps at a frame by frame level.

One such test was done in the Air Force, where a pilot was shown a 3-frame video running at various FPS. Each frame showed a completely different type of aircraft. The pilot was able to tell the testers what kind of aircraft was on each frame, and in what order, accurately at up to 243FPS.

ANYONE saying 60FPS or 30FPS or whatnot is the limit of anything visual is completely full of it.
 
My lowest playable FPS is 18 in a first person shooter. Any lower than that and it gets annoyingly difficult to control a character.

If I hit 20 frames minimum whilst playing, I'm fine.

Of course, 60 average is much better but I can survive well.

Back in my days of a 56k modem, I could game better than some with cable connections online 😛 I'm used to low frames and poor pings. I remember changing from 56k to cable; I couldn't play properly because I had my ping adjustment down to a tee. It's like learning a new game.
 
I cannot stand playing UT2k4 with anything less than ~45-50 fps.
I have to start leading into every shot ever so slightly, & it pisses me off, since it means I tend to die a lot more.
 
It's not sarcasm, it's the truth. I forget the mechanics of it, but there's a reason 60 fps looks better in a computer game than 30, but it's not as big of a deal with TV.
No offense intended, just can't remember the explanation and too tired to Google it.
Interlacing and camera blur could be the culprits.

60 Hz is not what TVs run at. The electron gun scans at 30 Hz, because each frame is two fields (60 Hz when playing GameCube). That's why TV is 30 fps, and film is 24. 60 fps is the highest that humans can detect, and it presents smoother gameplay and more realistic motion. Still, those damn games can't imitate motion for crap.
Sixty hertz is what a TV runs at because it displays 60 fields per second. It takes 30fps, chops each frame in half via interlacing, and displays only half of the frame at a time - therefore - 60 hertz.
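
To picture what that field-splitting looks like, here's a tiny sketch of my own (treating a frame as a plain array of scanlines is just an assumption for illustration):

```python
# Split one full frame into two fields (even and odd scanlines), which
# a TV displays one after the other: 30 full frames/s -> 60 fields/s.
import numpy as np

def split_into_fields(frame):
    even_field = frame[0::2]  # scanlines 0, 2, 4, ...
    odd_field = frame[1::2]   # scanlines 1, 3, 5, ...
    return even_field, odd_field

frame = np.arange(480 * 640).reshape(480, 640)  # one 480-line frame
even, odd = split_into_fields(frame)            # two 240-line fields
```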

Did you just say 24fps was the highest fps that humans can detect? Because I think my hand would like to meet your face. Don't worry, it's not just you. And I mean no offense; it's just that so many people have this notion, and it's totally incorrect. It can get frustrating.

I'm still curious where people get this information. Is it flawed logic, or is some weirdo just pulling sht out of his ass and people eat it up? I don't know.

We use 60fps as a minimum guide because 60fps is the highest the human eye can detect (it depends on the person, but within 3 fps either way for everybody). So all the extra frames are just padding that you know you won't be able to see, but you know you will never have slowdown. Generally, as long as your fps never drops below 60fps, you'll never know the diff between 65fps and 100fps.
This is wrong, and your computer teacher is a moron. I can definitely notice the difference between 60fps and 85fps. I can notice the difference between 100fps and 120fps and higher. Visually, we may not notice much past 85fps; after that it seems like diminishing returns, visually. But when user response is taken into consideration, you will notice much faster mouse response at 120fps compared to 100fps. I've tested it.

OK, let's clear a few things up. 60 FPS is the max a human can distinguish between individual frames, but most can tell the difference between 60 and 100 FPS. At 30 FPS, I can easily see when you flash a separate picture for one frame. However, at 100 FPS, the average human could not tell. Still, you can tell a difference over 60FPS, even if you cannot "see" each individual frame. It just makes it even smoother.
If 60fps is the max, then people wouldn't get so many headaches by looking at a 60Hz TV. If anything, around 85fps is average with diminishing returns after that.
 
Originally posted by: dguy6789
You are incorrect SrGuapo. There have been tests that have proven the human eye can see way beyond 60fps at a frame by frame level.

One such test was done in the Air Force, where a pilot was shown a 3-frame video running at various FPS. Each frame showed a completely different type of aircraft. The pilot was able to tell the testers what kind of aircraft was on each frame, and in what order, accurately at up to 243FPS.

ANYONE saying 60FPS or 30FPS or whatnot is the limit of anything visual is completely full of it.

Again, it's all about preference. Many here are assuming that every human can perceive the same thing. False; it's preference, what you can grasp compared to something else.
 
Personally, I use a 6800GT and run it at settings that most would find silly in multiplayer gaming...in CS:S I run 1024x768, no AA, 8x AF, settings maxed. If you look at benchmarks, I'm capable of FPS well above a 60 average, but in large firefights there are times when I drop to 60 fps, and I can feel it every time and cannot stand it. In single-player or non-shooter games (like WoW), drops below 60 are not such a big deal, but in twitch-level games having even the slightest interruption in the smoothness of the game is very distracting.

 
Originally posted by: DeathBUA
Sometimes I wonder about these people that say 30 FPS is choppy or a slideshow....well then what do you think of television.....the cameras only shoot at 24FPS but last I checked TV wasn't a slideshow or choppy 😛

This has been discussed literally hundreds of times before. TV appears fluid because of motion blur. Pause a DVD movie on a single frame of fast action and you will not see a clear, crisp image. The image you see is blurred, creating a more seamless transition between frames.
 
24 fps is the bottom line...that's what they use in film. Anything lower than that and you start to get some jumping. Old (like WWI) films were either 16 or 24 FPS. The 16 FPS films are the ones where the people and cars move very quickly. Anyone who has seen a film this old knows what I'm talking about.
 
Originally posted by: SuperTyphoon
60 Hz is not what TVs run at. The electron gun scans at 30 Hz, because each frame is two fields (60 Hz when playing GameCube). That's why TV is 30 fps, and film is 24. 60 fps is the highest that humans can detect, and it presents smoother gameplay and more realistic motion. Still, those damn games can't imitate motion for crap.

There's no measurable limit. The signals sent from the eye to the brain are not digital signals. People can be trained to detect much higher frame rates.

*EDIT* A perfect example was my Speed Reading and Comprehension class in high school. The instructor flashed words on the screen from a projector for so many milliseconds to train our eyes and brain to detect and recognize images in less time. I think he started at about 1/4 of a second, and by the end of the class we were recognizing words flashed for 1/500th of a second, and multiple words one after the other, 1/250th of a second apart. So you could say that my eyes and brain have been trained to detect 250 frames per second.
 
Personally, my best session of CS:S ever was with an average FPS of 19-20. I probably could've done even better with more FPS, which is why I am now running at much lower resolutions. As said before, it's all personal preference.
 
Originally posted by: ssvegeta1010
Personally, my best session of CS:S ever was with an average FPS of 19-20. I probably could've done even better with more FPS, which is why I am now running at much lower resolutions. As said before, it's all personal preference.

When you're used to having games running at a solid 85 FPS to match your monitor's refresh rate, it's quite noticeable if it ever dips into the 40's, 50's, or even 60's. Very disturbing to take a step back. If you've always had value parts... Durons, Celerons, GeForce MX's... and you're used to gaming at 20 fps, it probably doesn't seem too bad to you.
 
24 fps is the bottom line...that's what they use in film. Anything lower than that and you start to get some jumping. Old (like WWI) films were either 16 or 24 FPS. The 16 FPS films are the ones where the people and cars move very quickly. Anyone who has seen a film this old knows what I'm talking about.
I'm pretty sure it was 18fps in those old movies.
 
Originally posted by: BFG10K
well then what do you think of television.....the cameras only shoot at 24FPS but last I checked TV wasn't a slideshow or choppy
Motion blur and no interaction help film look smoother, but you'll still see jerkiness during fast camera panning.

60 fps is the highest that humans can detect
We use 60fps as a minimum guide because 60fps is the highest the human eye can detect.
Nonsense. The simplest rebuttal to this is a flickering screen at 60 Hz. You guys can see flicker at 60 Hz refresh right?

No, 60fps is the highest the human eye can detect. Our little tools are just a bit inaccurate at grading fps in games... With computer games, it's more like 70fps; higher than that is unnoticeable. And the 60 Hz refresh is off too, it's not true 60.
 
So many people still believe this that it pops up every 6 months.

Try this fps monitor and make sure that vsync is turned off and the refresh rate is as high as possible.

Also remember that LCDs are different from CRT monitors, and as such you are less likely to notice the change because the LCD pixels remain lit.
 
Originally posted by: Mingon
So many people still believe this that it pops up every 6 months.

Try this fps monitor and make sure that vsync is turned off and the refresh rate is as high as possible.

Also remember that LCDs are different from CRT monitors, and as such you are less likely to notice the change because the LCD pixels remain lit.

I set one to 60FPS and the other to 90FPS and I could BARELY discern a difference. When it's 60 vs 30 I can notice the difference, but it's not horrible; at 70-100 I couldn't tell anymore.
 
Originally posted by: gorcorps
Originally posted by: BFG10K
well then what do you think of television.....the cameras only shoot at 24FPS but last I checked TV wasn't a slideshow or choppy
Motion blur and no interaction help film look smoother, but you'll still see jerkiness during fast camera panning.

60 fps is the highest that humans can detect
We use 60fps as a minimum guide because 60fps is the highest the human eye can detect.
Nonsense. The simplest rebuttal to this is a flickering screen at 60 Hz. You guys can see flicker at 60 Hz refresh right?

No, 60fps is the highest the human eye can detect. Our little tools are just a bit inaccurate at grading fps in games... With computer games, it's more like 70fps; higher than that is unnoticeable. And the 60 Hz refresh is off too, it's not true 60.

The "full frame" rate of an NTSC TV is 29.97 Hz. The field rate is 59.94 Hz. What YOU see is 59.94 FPS after deinterlacing. Since both the source and the destination are exactly 59.94 Hz there is no discernable difference because everything's the way it was intended. If the TV ran at ~120 Hz and displayed ~60 FPS, you could easily tell. The TV would display double frames in order to display at 120 Hz (to keep vertical synchronization). When watching a TV, FPS never varies either. Add to all that, you don't have a mouse that has to be capped to the maximum frames your video card can output. So the video is capping nothing. The reason we use 60 FPS as a minimum standard? It's a rounded off NTSC standard. Nothing to do with the human eye.

Don't take my word for it. http://64.233.167.104/search?q=cache:15.../02-21-01FPS.html+human+eye+60hz&hl=en
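
As a side note, the exact numbers above come from the NTSC 1000/1001 timing adjustment; here's a quick check of my own:

```python
# The exact NTSC rates mentioned above: 60 and 30 Hz divided by 1.001
# (the 1000/1001 adjustment introduced with color NTSC).
field_rate = 60000 / 1001   # ~59.94 fields per second
frame_rate = 30000 / 1001   # ~29.97 full frames per second
print(f"field rate: {field_rate:.2f} Hz, frame rate: {frame_rate:.2f} Hz")
```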
 
No, 60fps is the highest the human eye can detect. Our little tools are just a bit inaccurate at grading fps in games... With computer games, it's more like 70fps; higher than that is unnoticeable. And the 60 Hz refresh is off too, it's not true 60.
You are wrong, and if you think you are right, then prove it. Or better yet, try this (it only works on a CRT): put on a low resolution that will give you a high refresh rate. Now set it to 85Hz. Play with the mouse. Now set it to 120Hz. Play with the mouse. Feel the difference. The response is faster, meaning that although visually you won't notice a difference, playing with response shows that your eyes do pick up a difference, or else the response wouldn't have changed. There you are, proved wrong. Make sure the CRT is actually at the specified refresh rate by checking the monitor OSD; sometimes software can be weird.

I set one to 60FPS and the other to 90FPS and I could BARELY discern a difference. When it's 60 vs 30 I can notice the difference, but it's not horrible; at 70-100 I couldn't tell anymore.
You should make sure the program actually displays 90 because mine never goes over 60 for some reason.
 
I'd take a 30fps min over a 60 average unless I knew it was a stable average. A 30 min would mean that even in the most intense action it's still somewhat playable, whereas a 60 average could mean you're in the teens, which is pretty much unbearable, and in the hundreds if you're staring at the sky or low-detailed hills or walls.

If you think about TV, it is true that it runs at 30fps; however, in games such as fast-paced shooters you may have characters traveling at high speeds, perhaps in a vehicle. If that vehicle flies across your screen at a mere 30 frames per second, you'll see it almost "jumping" from frame to frame depending upon its speed, instead of flying smoothly, making it harder to target and shoot down.

<30 is definitely too slow for me these days (now that I can afford to stay above it, and I'm thankful for it). I like to make sure I have a 50-70 minimum frame rate in faster-paced games; slower games aren't as important, but I still like to stay above 30.
 
Originally posted by: Beiruty
Can anyone explain why games at 30 FPS are different from a video at 30 FPS? We are only comparing progressive, not interlaced.

Is it because games need to react to user input?

Agree. I think user input has a lot to do with it. Let's do some math here...

With 30FPS each frame takes about 0.033s. Let's say in an FPS game I move my mouse quickly to change my view 180 degrees in 0.2s. With 30FPS you get only 6 frames to render the 180-degree sweeping view. It can be choppy, and the mouse will feel a bit "jumpy". With 60FPS you get 12 frames to render it, which can be a lot smoother. 120FPS gives you 24 frames, which is super smooth. If you are just watching, then it might not look as bad.
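
Here's that arithmetic as a quick script of my own, just restating the numbers above:

```python
# How many frames render a 180-degree turn that takes 0.2 seconds,
# and how far the view jumps between consecutive frames.
TURN_DEGREES = 180
TURN_SECONDS = 0.2

for fps in (30, 60, 120):
    frames = fps * TURN_SECONDS       # frames rendered during the turn
    step = TURN_DEGREES / frames      # degrees the view jumps per frame
    print(f"{fps:>3} FPS: {frames:.0f} frames, {step:.1f} degrees per frame")
```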
 
60 FPS is the max a human can distinguish between individual frames
Wow, so no flicker for you at 61 Hz? I personally need a minimum of 85 Hz to get a flicker-free image.

No, 60fps is the highest the human eye can detect.
False.

Our little tools are just a bit inaccurate at grading fps in games...
No they aren't. Some games do lose accuracy at a very high framerate but that's beside the point.

And the 60 Hz refresh is off too, it's not true 60.
How so? If a monitor's OSD says 60 Hz what reason do you have to doubt it?
 
Originally posted by: SynthDude2001
Originally posted by: dug777
I think I get about 40fps in the d3 timedemo demo1 with my 9800 pro (high, 1024), and that seems great to me, but then fps ho's like BFG10K won't touch anything below 60fps...so it really is personal preference I guess 😉

You say that like there's something wrong with being an FPS ho 🙁

Nah, if you can afford it then that's great I guess, but I really am perfectly happy with what the d3 timedemo demo1 tells me I'm getting, as in I played the game at a variety of settings and found out what I liked best, then benched it and found I was getting about 40fps... I suspect my Far Cry fps would be fairly similar, maybe 10fps higher.

But then for UT2K4 my fps would be sky high, and in Sims 2 my fps is in the dirt, so I guess what kind of game you are playing makes a big difference too... UT really does become a bit easier with a constant very high fps; in Sims 2 it really doesn't matter if your fps is not great, cos it's such a slow game type... just my 2c 😉
 