Innokentij
No one's ever said 24 was the max; 24 is roughly the minimum point where motion starts to look fluid.
I see! I just read about it on a forum without checking it myself.
My monitor does up to 120 Hz, and I can definitely tell the difference even between 96 Hz and 120 Hz. It's most noticeable in FPS games, but it's always noticeable. Going from 60 Hz to 96 Hz is a more noticeable jump, but there is still a noticeable difference between 96 Hz and 120 Hz, and I would imagine even between 120 Hz and 144 Hz.
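For a sense of scale, here is a rough sketch of the arithmetic behind those jumps (plain Python, nothing engine-specific): each step up in refresh rate shortens the frame interval by less than the previous step, which is why 60 Hz to 96 Hz feels like a bigger change than 96 Hz to 120 Hz.

Code:
def frame_time_ms(hz):
    # Time each frame stays on screen at a given refresh rate, in milliseconds.
    return 1000.0 / hz

for low, high in [(60, 96), (96, 120), (120, 144)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} Hz -> {high} Hz: frames arrive {saved:.2f} ms sooner")

# 60 Hz -> 96 Hz: frames arrive 6.25 ms sooner
# 96 Hz -> 120 Hz: frames arrive 2.08 ms sooner
# 120 Hz -> 144 Hz: frames arrive 1.39 ms sooner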
Because 2D images are games. Yeah.
". Most of the new digital projectors are capable of projecting at 48 fps, with only the digital servers needing some firmware upgrades. We tested both 48 fps and 60 fps. The difference between those speeds is almost impossible to detect, but the increase in quality over 24 fps is significant. "
. https://www.facebook.com/notes/peter-jackson/48-frames-per-second/10150222861171558
Peter Jackson the maker of the Hobbit said the above.
". Most of the new digital projectors are capable of projecting at 48 fps, with only the digital servers needing some firmware upgrades. We tested both 48 fps and 60 fps. The difference between those speeds is almost impossible to detect, but the increase in quality over 24 fps is significant. "
. https://www.facebook.com/notes/peter-jackson/48-frames-per-second/10150222861171558
Peter Jackson the maker of the Hobbit said the above.
Film is not the same. Analogue cameras have a physical shutter, so each frame captures not just an instant of time but a range of time, from when the shutter opened to when it closed. That produces a blurred image, which is how motion blur gets captured on film, and it's what led to film's 24 fps standard: that's roughly the minimum number of motion-blurred frames you need to trick the eye into seeing motion rather than a string of still images.
Games are different: they render an exact snapshot of time every frame, so you can discern a much higher number of frames.
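To make that distinction concrete, here is a minimal sketch, assuming a hypothetical scene_at(t) function that returns the image of the scene at instant t (standing in for either a camera sensor or a game renderer): a game frame samples one instant, while a film frame averages the scene over the whole shutter-open interval, which is where the baked-in motion blur comes from.

Code:
def game_frame(scene_at, t):
    # A game renders one exact snapshot of the world at time t.
    return scene_at(t)

def film_frame(scene_at, t, exposure, samples=32):
    # A film frame integrates light over the shutter-open interval
    # [t, t + exposure]; approximated here by averaging sub-frame snapshots.
    return sum(scene_at(t + exposure * i / samples) for i in range(samples)) / samples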
+1. I chose "it depends on the situation", because we don't see in FPS; we perceive change. If the game I'm playing has very little change in viewing angle, I can play at a much lower FPS than in a first-person game where I turn often.
I find ~85 FPS to be the point where I have a hard time telling a difference due to the speed I move around.
I just gave Titanfall a go at 100 Hz/100 fps, then switched to 60 Hz/60 fps. I'm not really sure I can tell the difference. At first I thought 100 fps looked better, then I went to 60 fps and didn't think it looked any worse. Honestly, I don't notice any difference between 60 Hz and 100 Hz on my desktop either.
What's often ignored when this debate comes up is the sampling rate and resolution of input from peripherals (mice, controllers, keyboards).
What I notice (and others should take note of) is that the game engine itself makes a huge difference as to whether a specific frame rate matters.

For instance, with the original Quake engine and the subsequent Quake engines you could really tell the difference between frame rates. Even if your eyes stop seeing an FPS difference after around 85-90 FPS (which I suspect most people can't see past, at least on a good CRT), the feel of the game keeps improving as the frame rate increases.

There's a big difference in "feel" when you set a Quake engine game to run at 125 FPS. Mouse input and movement are smoother and more responsive because the peripheral interrupts are serviced with lower latency; a rough sketch of the numbers is below. I believe this carries over to many other twitch-based games.
Other issues that cloud this debate are LCD overdrive behaviour, contrast ratios, the input latency of the monitor itself, G2G speeds, and so on. All of that can change the question of "how many FPS can you notice".

Personally, if you play twitch games or anything competitive, you should be aiming for at least 100 FPS, or 120 FPS if the peripheral interrupt latency is not determined by the frame rate.
My 2 cents..
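On the latency point above, here is a rough sketch of the numbers, assuming a simplified model in which the engine reads input once per frame and the result reaches the screen about one frame later, so an event waits roughly 1.5 frames on average (before any monitor or driver latency is added):

Code:
def avg_input_latency_ms(fps):
    # 0.5 frames average wait until the next input read, plus roughly
    # 1 frame to render and scan out. A simplification, but it shows the trend.
    frame = 1000.0 / fps
    return 1.5 * frame

for fps in (60, 85, 100, 125):
    print(f"{fps} fps: ~{avg_input_latency_ms(fps):.1f} ms input-to-display latency")

# 60 fps: ~25.0 ms, 85 fps: ~17.6 ms, 100 fps: ~15.0 ms, 125 fps: ~12.0 ms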
Pretty sure the standard shutter for 24 fps film is 1/48, so The Hobbit had sharper images than usual, which was one of the reasons people didn't like it (shorter motion blur).

The specific problem with The Hobbit HFR is that Jackson shot it at 48 fps with a 270-degree (1/64 of a second) exposure rather than the standard 1/96 of a second. He did that intentionally to make it blurrier (to appease film buffs?) and less like a video game, but it didn't really work out. Unfortunately, his choice to record The Hobbit that way also negatively affected the 24 fps version, resulting in a strange mix of blur and judder in fast-moving scenes and pans. I found the movies almost unwatchable compared to the LOTR trilogy.
http://www.red.com/learn/red-101/shutter-angle-tutorial
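For anyone who wants to check those exposure numbers, the shutter-angle relationship in that RED tutorial boils down to: exposure time per frame = (shutter angle / 360) / frame rate. A quick sketch:

Code:
from fractions import Fraction

def exposure(frame_rate, shutter_angle):
    # Exposure time per frame, as a fraction of a second.
    return Fraction(shutter_angle, 360) / frame_rate

print(exposure(24, 180))  # 1/48 -- the standard 24 fps film look
print(exposure(48, 180))  # 1/96 -- a "standard" 180-degree shutter at 48 fps
print(exposure(48, 270))  # 1/64 -- the 270-degree shutter Jackson actually used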