How many FPS can you notice?

Page 5 - AnandTech forums

How many FPS can you notice a difference up to?

  • 30 FPS

  • 45 FPS

  • 60 FPS

  • 85 FPS

  • 100 FPS

  • 144 FPS

  • It depends on the game and situation. Explain further.



Face2Face

Diamond Member
Jun 6, 2001
4,100
215
106
My monitor does up to 120Hz, and I can definitely tell the difference even between 96Hz and 120Hz. It's most noticeable in FPS games, but it's always noticeable. Going from 60Hz to 96Hz is a more noticeable jump, but there is still a noticeable difference between 96Hz and 120Hz, and I would imagine even between 120Hz and 144Hz.

I have a tough time seeing any difference at all on my monitor. I definitely agree that 60Hz to 96Hz is very noticeable, but I haven't seen it going from 96Hz to 120Hz. I may have to try it again.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
Because 2d images are games. Yeah.

Huh?

Let's ignore the fact that yes, some games are just 2D and have been for most of gaming history, and focus on the fact that it doesn't even matter.

What gets drawn to the screen at the end of the day is a 2D image represented in a plane; the ability to move across that plane at a given speed and then sample that movement at a given rate (60Hz, 120Hz, whatever) is identical in concept to doing it in 3D.

I think more than anything this demonstrates people's desire to dismiss the claim because of preconceived biases, or simply an unwillingness to acknowledge that they're painfully wrong about claims that the eye can't see above X many FPS.

We already know the air force put fighter pilots through some of the most grueling visual testing, and they can discern details in a single frame at up to 350-400fps. The eyes can see and appreciate huge levels of detail that modern refresh rates and screen resolutions are not even close to achieving; we use tricks and approximations, such as motion blur in film, to make up for the lack of raw power to deliver that quality.
 

happy medium

Lifer
Jun 8, 2003
14,387
480
126
". Most of the new digital projectors are capable of projecting at 48 fps, with only the digital servers needing some firmware upgrades. We tested both 48 fps and 60 fps. The difference between those speeds is almost impossible to detect, but the increase in quality over 24 fps is significant. "

. https://www.facebook.com/notes/peter-jackson/48-frames-per-second/10150222861171558

Peter Jackson the maker of the Hobbit said the above.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
". Most of the new digital projectors are capable of projecting at 48 fps, with only the digital servers needing some firmware upgrades. We tested both 48 fps and 60 fps. The difference between those speeds is almost impossible to detect, but the increase in quality over 24 fps is significant. "

. https://www.facebook.com/notes/peter-jackson/48-frames-per-second/10150222861171558

Peter Jackson the maker of the Hobbit said the above.

Keep in mind, he's talking about his film. Movies are always shot in ways that avoid exposing how bad a low FPS can look. Sometimes they break the rules, and it is painfully obvious. On top of that, even at 48Hz, they used motion blur.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
". Most of the new digital projectors are capable of projecting at 48 fps, with only the digital servers needing some firmware upgrades. We tested both 48 fps and 60 fps. The difference between those speeds is almost impossible to detect, but the increase in quality over 24 fps is significant. "

. https://www.facebook.com/notes/peter-jackson/48-frames-per-second/10150222861171558

Peter Jackson the maker of the Hobbit said the above.

Film is not the same. They use analogue cameras with a physical shutter speed to capture the movement of a scene; in essence, each frame contains not just an instant of time but a range of time, from when the shutter opened to when it closed. That creates a blurred image, and it's how motion blur is captured on film. It's what led to film's 24fps standard: that's the minimum number of motion-blurred frames you need to trick the eye into seeing motion rather than a string of still images.

Games are not the same: they render exact snapshots of time every frame, so you can discern a higher number of frames.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
Film is not the same. They use analogue cameras with a physical shutter speed to capture the movement of a scene; in essence, each frame contains not just an instant of time but a range of time, from when the shutter opened to when it closed. That creates a blurred image, and it's how motion blur is captured on film. It's what led to film's 24fps standard: that's the minimum number of motion-blurred frames you need to trick the eye into seeing motion rather than a string of still images.

Games are not the same: they render exact snapshots of time every frame, so you can discern a higher number of frames.

The Hobbit "films" were shot using RED Epic cameras - it was 100% digital. The shutter terminology still persists to this day, but it's all digital now (time-based aperture instead of physical aperture). Anyway, the main point you're making is a correct one - a recording of real life will always be subject to some amount of blurring whereas renderings are a precise thing that exists in only one location for every frame rendered (unless rendered or post-processed with blur).

The specific problem with the Hobbit HFR is that Jackson shot it at 48fps with a 270-degree shutter (a 1/64 s exposure) rather than the standard 180-degree (1/96 s) exposure. He did that intentionally to make it blurrier (to appease film buffs?) and less like a videogame, but it didn't really work out. Unfortunately, his choice to record The Hobbit that way also negatively affected the 24fps version, resulting in a strange mix of blur and judder in fast-moving scenes and pans. I found the movies almost unwatchable compared to the LOTR trilogy.

http://www.red.com/learn/red-101/shutter-angle-tutorial
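For anyone who wants to check the shutter-angle arithmetic, here's a quick illustrative sketch (my own code, not from the RED tutorial):

```python
# Exposure time per frame from frame rate and shutter angle:
# exposure = (angle / 360) / fps. Illustrative numbers only.

def exposure_time(fps: float, shutter_angle_deg: float) -> float:
    """Seconds of exposure per frame."""
    return (shutter_angle_deg / 360.0) / fps

hobbit = exposure_time(48, 270)        # 270 degrees at 48fps -> 1/64 s
standard_hfr = exposure_time(48, 180)  # 180 degrees at 48fps -> 1/96 s
classic_film = exposure_time(24, 180)  # 180 degrees at 24fps -> 1/48 s

print(hobbit, standard_hfr, classic_film)
```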
 

njdevilsfan87

Platinum Member
Apr 19, 2007
2,342
265
126
Brains don't see in frame rates. If your monitor is displaying more frames, your eyes are picking up more visual information and sending it to the brain for processing and response. Combined with lower response time from user to display, you get a faster feedback loop between the game and the user at higher frame rates.

Watching movies and gaming are not comparable in any way, shape, or form. Alongside the reasons others have already given, there is no feedback loop when watching a movie like there is with gaming; the information only flows one way.

I think 240Hz will be about the maximum that almost everyone will be able to feel. Beyond that it will be tough. Going from 120 to 240 will provide the feeling of "instant", imo. That will be even less of an upgrade than going from 60 to 120, but it's still something.
 
Last edited:

yacoub

Golden Member
May 24, 2005
1,991
14
81
I chose "it depends on the game and situation", because we don't see in FPS. We perceive change. If the game I'm playing has very little change in viewing angle, I can play at much lower FPS than in a first-person game where I turn often.

I find ~85 FPS to be the point where I have a hard time telling a difference, given the speed I move around at.
+1.

Anything below 60fps tends to feel sluggish in responsiveness in an FPS game. The higher the fps, the smoother and more responsive the movement will feel in a first-person game.
 

birthdaymonkey

Golden Member
Oct 4, 2010
1,176
3
81
At home, I play Dota on a 100Hz (overclocked) Korean 1440p panel with vsync on. If my framerate drops to 50fps - due (rarely) to crazy action on the screen or (more commonly) because I've alt-tabbed out and back in while spectating a game and am looking at a website with hw acceleration - I notice it immediately. The contrast makes 50fps feel like a slideshow.

At my friend's place, I play the game on a normal 60Hz monitor. The first match always seems choppy, but then I get used to it. Of course, if I go home and play at 100Hz right away, it feels super fluid buttery smooth.

Anyway, the difference between a true 100fps and 50/60fps is obvious. I think that most of the misconception that you can't feel the difference is based on people using 60Hz screens with vsync off - so the fps the game's rendering at (even if it's as high as 200) gets displayed at 60Hz regardless, with some tearing thrown in for good measure.
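A rough way to see why vsync off on a 60Hz panel muddies the comparison (simplified arithmetic on my part, not a real display-pipeline model):

```python
# With vsync off, render rate and refresh rate decouple: the panel still
# scans out only refresh_hz times per second, so extra rendered frames
# show up as partial frames within a refresh (tearing), not as more
# visible refreshes.

refresh_hz = 60
render_fps = 200  # the "even if it's as high as 200" case above

frames_per_refresh = render_fps / refresh_hz  # ~3.3 frames per scan-out
visible_refreshes = refresh_hz                # still only 60 per second

print(f"{frames_per_refresh:.1f} rendered frames per refresh, "
      f"{visible_refreshes} visible refreshes/s")
```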
 

Eric1987

Senior member
Mar 22, 2012
748
22
76
Even if I could notice above 60 FPS I'd rather have my game looking better because 40 FPS seems just fine to me.
 

bfun_x1

Senior member
May 29, 2015
475
155
116
I just gave Titanfall a go at 100Hz/100fps, then switched to 60Hz/60fps. I'm not really sure I can tell the difference. At first I thought 100fps looked better, but then I went to 60fps and didn't think it looked any worse. Honestly, I don't notice any difference between 60Hz and 100Hz on my desktop either.
 

Eric1987

Senior member
Mar 22, 2012
748
22
76
I just gave Titanfall a go at 100Hz/100fps, then switched to 60Hz/60fps. I'm not really sure I can tell the difference. At first I thought 100fps looked better, but then I went to 60fps and didn't think it looked any worse. Honestly, I don't notice any difference between 60Hz and 100Hz on my desktop either.

You're gonna be burned at the stake. Be careful what you say.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
He posted his opinion. He didn't state that the human eye couldn't see past 60 FPS. There is a big difference.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I just gave Titanfall a go at 100Hz/100fps, then switched to 60Hz/60fps. I'm not really sure I can tell the difference. At first I thought 100fps looked better, but then I went to 60fps and didn't think it looked any worse. Honestly, I don't notice any difference between 60Hz and 100Hz on my desktop either.

I've tested myself, trying to guess the framerate without a frame counter, and I was so far off it was extremely hilarious.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
While there are some visual differences, it is the latency that is most obvious to me, and I'm pretty accurate up to about 80 FPS. Not that I'll guess exactly, but as long as it's a mouse-driven game with a fair amount of camera movement, I can tell you if it is near 30 FPS, near 40 FPS, near 60 FPS, or over 75 FPS.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Lots of games have FPS caps built in that can only be disabled by editing config files. Those games won't look any different on a 120Hz monitor because they still run at 60.

Lo and behold, Titanfall is limited to 60 FPS in-game unless you follow the instructions here: http://www.reddit.com/r/titanfall/comments/1xyvmo/tip_how_to_play_in_120hz_mode/. In Titanfall you must enable vsync to go over 60.
 
Last edited:

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
What's often ignored when this debate comes up every so often is the interpolation speed, or resolution, of input from peripherals (mice, controllers, keyboards).

What I notice (and others should take note of) is that the game engine itself makes a huge difference as to whether or not a specific frame rate matters.

For instance, in the original Quake engine and subsequent Quake engines you could really tell the difference between frame rates. Even if your eyes stop seeing an FPS difference after around 85-90 FPS (which I suspect is where most people top out, at least on a good CRT), the feel of the game keeps improving as the frame rate increases.

There's a big difference in "feel" when you set a Quake engine game to run at 125 FPS. Mouse input and movement are smoother and more responsive, as the peripheral interrupts are serviced at lower latency. I believe this carries over to many other twitch-based games.

Other issues that cloud this debate are LCD overdrive artifacts, contrast ratios, the input latency of the monitor itself, G2G speeds, etc. All of this can change the question of "how many FPS can you notice".

Personally, if you play twitch games or anything competitive, you should be aiming for at least 100 or 120 FPS if the peripheral interrupt latency is determined by the frame rate.

My 2 cents..
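As a back-of-envelope illustration of the latency side of that argument (assuming input is sampled once per rendered frame, which is a simplification):

```python
# The frame time bounds the input-sampling latency when input is read
# once per frame. Illustrative arithmetic only.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 85, 125):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
# 125 fps is the classic Quake-engine cap: 8 ms per frame vs ~16.7 ms at 60.
```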
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
^ All that is quite true. There are definitely latency issues that change with higher FPS, and your hardware's weaknesses may hide some of the FPS gains. It definitely goes beyond just what you see, in many cases, to what you "feel".
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
What's often ignored when this debate comes up every so often is the interpolation speed, or resolution, of input from peripherals (mice, controllers, keyboards).

What I notice (and others should take note of) is that the game engine itself makes a huge difference as to whether or not a specific frame rate matters.

For instance, in the original Quake engine and subsequent Quake engines you could really tell the difference between frame rates. Even if your eyes stop seeing an FPS difference after around 85-90 FPS (which I suspect is where most people top out, at least on a good CRT), the feel of the game keeps improving as the frame rate increases.

There's a big difference in "feel" when you set a Quake engine game to run at 125 FPS. Mouse input and movement are smoother and more responsive, as the peripheral interrupts are serviced at lower latency. I believe this carries over to many other twitch-based games.

Other issues that cloud this debate are LCD overdrive artifacts, contrast ratios, the input latency of the monitor itself, G2G speeds, etc. All of this can change the question of "how many FPS can you notice".

Personally, if you play twitch games or anything competitive, you should be aiming for at least 100 or 120 FPS if the peripheral interrupt latency is not determined by the frame rate.

My 2 cents..

And it's why Battlefield 4 has always been crap once you get good enough for it to matter. Their server tick rate used to be 10Hz, which was unplayably bad. After much hand-wringing and denying the issue existed, DICE increased it to 30Hz, which is still too slow and causes frustrating and obvious problems (getting killed when, on your screen, you've already turned the corner). Rumor is they'll finally bump it to 60, but only the "premium" test server has that.

So yeah, the game engine makes a colossal difference.

You can't get Skyrim or Fallout 3 or NV to look smooth at any FPS without special mods that remove the persistent engine stutter.
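To put rough numbers on the tick-rate complaint (a simplified model of mine; real netcode adds network and client-side latency on top of this):

```python
# At a given server tick rate, an event that happens just after a tick
# waits up to one full tick interval before the server even processes it.

def max_tick_delay_ms(tick_hz: float) -> float:
    return 1000.0 / tick_hz

for hz in (10, 30, 60):
    print(f"{hz:>2} Hz tick -> up to {max_tick_delay_ms(hz):.1f} ms "
          "before the server sees your action")
```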
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
Yeah, DICE has always had issues making a tradeoff between massive maps with a huge number of players and limiting the tick rate of the server. I quit playing the Battlefield games out of frustration over this. I'm a bit old school, but I recall it first being an issue with Battlefield 2.
 

HeXen

Diamond Member
Dec 13, 2009
7,837
38
91
Sounds like it sucks to be some of you guys. Glad those things don't usually bother me.
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
The specific problem with the Hobbit HFR is that Jackson shot it at 48fps with a 270-degree (1/64 per second) exposure rather than the standard 1/96 per second. He did that intentionally to make it blurrier (to appease film buffs??) and less like a videogame, but it didn't really work out. Unfortunately, his choice to record The Hobbit that way also negatively affected the 24fps version of The Hobbit as well, resulting in a strange mix of blur and judder in fast moving scenes and pans. I found the movies almost unwatchable compared to the LOTR trilogy.

http://www.red.com/learn/red-101/shutter-angle-tutorial
Pretty sure the standard shutter for 24fps film is 1/48s, so The Hobbit actually had sharper images than usual, which was one of the reasons people didn't like it (shorter motion blur).
If they had shot with a 360-degree shutter, they would have gotten images identical to the standard 180-degree shutter used in most 24fps movies.