What is the "frame rate" of a human eye?


serpretetsky

Senior member
Jan 7, 2012
The human eye doesn't work like a camera: it has no shutter and no refresh rate, so trying to put a single fps number on it doesn't work.

It has little analog cones and rods that are constantly feeding information to the nerves; they respond as soon as enough energy hits them.

A lot of what is perceived as smooth or not comes down to how your brain decodes the information.

Motion can be perceived at very low fps. 24fps of motion-blurred film is perceived as reasonably smooth by most people; 24fps of non-motion-blurred footage is not smooth.

60fps and above for non-motion-blurred content is generally considered smooth, although most people can still tell the difference up to 120fps.

A single bright flash can be perceived far beyond any of these fps numbers.
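
To put those rates on a timescale, here is a quick back-of-the-envelope sketch (purely illustrative arithmetic) of how long each frame sits on screen:

```python
# Frame time at the rates discussed above (purely illustrative arithmetic).
for fps in (24, 60, 120):
    frame_time_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_time_ms:5.1f} ms per frame")
# 24 fps -> 41.7 ms, 60 fps -> 16.7 ms, 120 fps -> ~8.3 ms
```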
 
Aug 25, 2012
I did some research on this once myself, and it supported the earlier research showing 72fps to be the rate with "minimal psychovisual noise".

I further tuned that to 72.734 Hz, which is what my monitor is running at now. Very still and quiet, and not noisy.

If you have an Nvidia card on Windows, you can easily set it in the driver's custom resolutions screen.
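
For anyone curious what the driver does with a number like 72.734, the refresh rate is just the pixel clock divided by the total pixel count per frame (visible plus blanking). A rough sketch, with made-up blanking totals rather than a real CVT modeline:

```python
# Refresh rate = pixel clock / (horizontal total * vertical total).
# The totals below are hypothetical; a real custom-resolution tool derives
# them from CVT or GTF timing formulas.
h_total = 2080          # 1920 visible pixels + assumed horizontal blanking
v_total = 1111          # 1080 visible lines + assumed vertical blanking
target_hz = 72.734

pixel_clock = target_hz * h_total * v_total
print(f"Pixel clock needed:   {pixel_clock / 1e6:.2f} MHz")
print(f"Refresh you get back: {pixel_clock / (h_total * v_total):.3f} Hz")
```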

Peace Be With You.
 

CycloWizard

Lifer
Sep 10, 2001
There isn't really a correct answer to this question. The simplest answer usually given is the flicker fusion rate, the maximum frequency at which an individual can still perceive the blinking of a light stimulus. Even this threshold depends on many factors, but the most commonly reported result is 16 Hz.

The more rigorous answer, as always, is that it depends. The number of photons striking your retina is far too large for the visual system to track each one individually, so it has several filters in place to pick out the most important features of the image. The brain uses a kind of adaptive temporal integration of the information it receives to tell you what your eye has seen, which is why you can see events on timescales spanning many orders of magnitude. The integration time step (for lack of a better term) varies with the stimulus. So when the frame rate drops after your brain has been integrating at the higher rate, it notices the missing information and struggles (very briefly) to put together a coherent image. For example, if you run a game at 120 Hz but something happens to slow it down to 70 Hz, you will clearly see a difference.
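
One toy way to picture that kind of adaptive integration is a leaky integrator, where each new sample is blended into a running estimate and the blend factor sets the effective integration window. This is only a sketch of the idea, not a model of the retina or brain:

```python
# Leaky integrator: estimate += alpha * (sample - estimate).
# Larger alpha = shorter integration window (fast but noisy response);
# smaller alpha = longer window (smooth but laggy response).
def integrate(samples, alpha):
    estimate = 0.0
    out = []
    for s in samples:
        estimate += alpha * (s - estimate)
        out.append(round(estimate, 3))
    return out

# A single bright flash in an otherwise dark stream still leaves a clear
# trace in the integrated signal, which is one way to think about why a
# brief flash is noticed even though it lasts far less than one "frame".
flash = [0.0] * 5 + [1.0] + [0.0] * 5
print(integrate(flash, alpha=0.5))
```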
 

reallyscrued

Platinum Member
Jul 28, 2004
This is an awesome discussion.

I've learned so much.

Anyone know at what bit rate we hear audio? I can't really tell the difference past 320 kbps MP3s. I'm pretty sure that's the bitrate that we hear audio in.
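
For a sense of scale, uncompressed CD audio works out to roughly 1411 kbps, which is what a 320 kbps MP3 is squeezed down from. A quick back-of-the-envelope check:

```python
# Uncompressed CD audio bitrate: sample rate * bit depth * channels.
sample_rate = 44_100   # samples per second
bit_depth = 16         # bits per sample
channels = 2           # stereo

cd_kbps = sample_rate * bit_depth * channels / 1000
print(f"CD audio:     {cd_kbps:.0f} kbps")               # ~1411 kbps
print(f"320 kbps MP3: {320 / cd_kbps:.0%} of that rate")  # ~23%
```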
 

who?

Platinum Member
Sep 1, 2012
With a film projector the screen is dark whenever the film is moving; light only passes through when each frame is held still, yet because of persistence of vision we don't see the black screen.
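
Worth adding that the shutter typically opens more than once per frame: projectors usually use a two- or three-bladed shutter, so each 24fps frame is flashed two or three times and the flicker rate ends up at 48 or 72 Hz even though the motion is still 24fps. The arithmetic, as a quick sketch:

```python
# Flicker rate vs. motion rate for a film projector with a multi-blade shutter.
# Each frame is flashed once per blade; motion is still only 24 new frames/s.
film_fps = 24
for blades in (2, 3):
    flicker_hz = film_fps * blades
    print(f"{blades}-blade shutter: {flicker_hz} flashes/s, {film_fps} new frames/s")
```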
 

NostaSeronx

Diamond Member
Sep 18, 2011
Anyone know at what bit rate we hear audio? I can't really tell the difference past 320 kbps MP3s. I'm pretty sure that's the bitrate that we hear audio in.
We hear in a mathematically logarithmic, losslessly compressed format, and we remember tunes in reverse.
--
Human eye-brain communication is much like losslessly compressed x264: it depends on how much data your brain can process per second. If you are drunk you will see less motion and things will look stuttery or blurred, while active minds will see clearly.
 

piasabird

Lifer
Feb 6, 2002
The point is that video is made by rendering frames. Blame it on the media industry; it goes all the way back to those boxes you cranked by hand to watch a film. So it makes a difference whether you are playing games or watching a video. If you turn up the frame rate, you either get missed frames or a monitor/TV that supplies extra frames which are redrawn approximations of what should lie between the real ones, and that in itself can make video look choppy. So it may be downright irrational to own a TV rated higher than 60 Hz. It could be that you are right and TVs should be designed with a slightly faster refresh rate of 72 Hz, and maybe that is the speed at which we should make full motion video.
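
The crudest form of that in-between guessing is just blending two neighbouring frames; real TVs estimate motion vectors instead, but even a naive blend shows why the inserted frames are approximations (a hypothetical sketch, not any TV's actual algorithm):

```python
# Naive "in-between" frame: a straight average of two neighbouring frames.
# Real motion interpolation estimates where objects moved; this crude blend
# only shows why inserted frames can smear instead of reproducing motion.
def midpoint_frame(frame_a, frame_b):
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

frame_1 = [0, 0, 255, 0]   # a bright pixel...
frame_2 = [0, 0, 0, 255]   # ...that moved one position to the right
print(midpoint_frame(frame_1, frame_2))  # both spots half-bright: a smear, not motion
```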

When an industry sets these standards it does so with two things in mind: the first is cost and the second is visual quality. Usually, reality ends up somewhere in the middle. However, maybe 30 or 60 frames per second is simply something an engineer came up with because it was a nice round number to work with. As a programmer I saw this a lot: the user is just told to deal with the end product in a take-it-or-leave-it proposition.

So when people get new HD televisions, what do you think they do? They customize the settings to turn down all the video correction to make the video smoother. This is the reality.
 

Check

Senior member
Nov 6, 2000
The rate at which your mind processes images depends on a lot of things. For example, eye movement will cause image processing to slow down.

http://en.wikipedia.org/wiki/Chronostasis

Just thought I would complicate matters even more.

Let's look at 24fps from a more practical point of view.

1) There is a finite speed at which you can expose film to light and still get a picture that doesn't suck; it's a chemical process, after all.
2) The faster you run the film, the more money it costs.
3) Originally there was a poor bastard with a hand crank operating these cameras. Faster film rates make it harder on the cameraman, who is trying to keep everything steady.

And now for the NTSC 30fps and PAL 25fps from a practical point of view:
the field rate matches the mains frequency of the countries each system was used in.

The Americas settled on 60Hz for our power (NTSC) and the rest of the world (for the most part) is on 50Hz (PAL).
 

Modelworks

Lifer
Feb 22, 2007
So it may be downright irrational to own a TV rated higher than 60 Hz. It could be that you are right and TVs should be designed with a slightly faster refresh rate of 72 Hz, and maybe that is the speed at which we should make full motion video.

The reason for the 60Hz refresh was simple: the power line frequency was 60 Hz. When TV was being developed it was difficult to create a time base, and building a circuit just to generate a specific frequency out of vacuum tubes would have been impractical, so the 60Hz AC frequency was used, drawing one interlaced field (odd or even lines) per cycle. The PAL system uses 25fps for the same reason: their AC is 50Hz, so half of that is 25fps.
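
Spelled out as arithmetic: the field rate was locked to the mains, and a full interlaced frame takes two fields, which is where the familiar 30fps and 25fps figures come from:

```python
# Field rate locked to the AC mains; two interlaced fields make one full frame.
for system, mains_hz in (("NTSC", 60), ("PAL", 50)):
    fields_per_second = mains_hz
    frames_per_second = fields_per_second // 2
    print(f"{system}: {fields_per_second} fields/s -> {frames_per_second} full frames/s")
```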

[Image: aa-raster-1.gif (raster scan diagram)]


Something to think about with non-raster displays like LCDs is that they don't really have a frame rate that relates to Hz easily. An LCD only updates when the image changes, so if you were to make a file that is one solid color at 24fps, the LCD would effectively display that file at 1Hz because the image content isn't changing.

The Hz rating on an LCD should be thought of as the rate it can update at if it has to, not the rate at which it always updates the display.