Mark Rejhon
Senior member
Dec 13, 2012
+1 (especially true for sample-and-hold display technologies like LCD)

You're confusing multiple issues.
First of all, the human eye can see way more than 30fps, 60fps, even 120fps and higher; the benefit of displaying more unique frames to a user is indisputable. There are gradual points of diminishing returns; see Science & References for some useful information about the human vision system.
On a slightly different but closely related topic, it is true that humans can't perceive flicker directly beyond a certain Hz. The threshold varies from person to person: some can't see flicker directly at 60Hz, while others can see it even at 120Hz. Virtually nobody notices flicker when staring directly at a 1000Hz source. However, there are side effects even from 1000Hz flicker, such as the wagon-wheel effect, the phantom array effect, and other stroboscopic effects. In tests, humans were able to detect >500Hz flicker, and in some cases 10,000Hz flicker, through such indirect effects (see these references, scroll to bottom).
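A quick back-of-envelope sketch of why the phantom array effect stays detectable at such high flicker rates (illustrative numbers of my own, not figures from the cited studies): during a fast eye sweep, a flickering source leaves discrete copies on the retina, spaced roughly at sweep speed divided by flicker frequency.

Code:
# Hypothetical illustration of phantom-array spacing (assumed numbers,
# not taken from the cited tests): during a fast eye sweep, a flickering
# source leaves discrete retinal copies spaced (sweep speed / flicker Hz).
sweep_speed_px_per_sec = 8000            # assumed fast eye sweep across a screen
for flicker_hz in (60, 1000, 10000):
    gap_px = sweep_speed_px_per_sec / flicker_hz
    print(f"{flicker_hz:>5} Hz flicker -> phantom copies ~{gap_px:.1f} px apart")

At 10,000Hz the copies land under a pixel apart, which is why only extreme cases remain detectable.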
People can see short strobes. A xenon camera flash lasts less than 1 millisecond. A frame sample on a 60Hz display lasts much longer than that (1/60sec = 16.7ms). Moving a camera around while taking a photograph at a 1/60sec shutter will produce a blurry photograph. Likewise, moving your eyeballs while tracking moving objects on a screen leads to motion blur on a sample-and-hold display.
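To put rough numbers on the camera analogy (a sketch with an assumed motion speed, not a model of human vision): smear width is roughly motion speed multiplied by the sample (exposure) time.

Code:
# Camera-shutter analogy: smear width ~= motion speed x sample (exposure) time.
# The 960 px/s motion speed is an assumed example figure.
speed = 960                           # pixels per second of on-screen motion
smear_60hz_hold = speed * (1 / 60)    # full 16.7 ms sample -> 16 px of smear
smear_1ms_flash = speed * (1 / 1000)  # ~1 ms xenon-flash-style sample -> ~1 px
print(smear_60hz_hold, smear_1ms_flash)   # 16.0 vs 0.96 pixels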
The frame shines continuously for the whole refresh, and your eyes are always moving while tracking moving objects. The static image on the screen (non-strobed, a long sample of 16.7ms) gets smeared across your vision, since your eyeball is in a different position at the beginning of the refresh than at the end. Thus, a sample-and-hold display (LCD) has much more motion blur than an impulse-driven display (CRT). Short samples (flickers/strobes) eliminate the chance for the image to be smeared across your retinas.
In fact, repeated frames on flicker/impulse-driven displays (e.g. CRT at 30fps@60Hz, 60fps@120Hz, 100fps@200Hz, etc.) are tantamount to a double exposure at different positions: your eyes keep moving, so the extra refresh lands in a different position, producing a doubled-up edge on moving objects, with the separation between the doubled edges shrinking the higher the framerate you go. There have been comments about the LightBoost doubled-up-edge effect at 60fps@120Hz. People are usually most familiar with the CRT 30fps@60Hz doubled-up-edge effect, but it remains at far higher refresh rates.
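The double-edge separation is easy to estimate (simplified geometry, with an assumed tracking speed): with your eye tracking at the object's speed, each repeated flash of the same frame lands offset by speed divided by refresh rate.

Code:
# Double-image separation on an impulse-driven display showing repeated frames:
# each extra flash of the same frame lands offset by (speed / refresh rate).
speed = 960                                   # assumed px/s of tracked motion
for fps, hz in ((30, 60), (60, 120), (100, 200)):
    gap = speed / hz                          # px between the doubled edges
    print(f"{fps}fps @ {hz}Hz -> doubled edge ~{gap:.1f} px apart")

That prints 16, 8, and 4.8 pixels respectively, which is why the doubled edge shrinks but never quite vanishes as refresh rate climbs.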
Humans can tell apart photographs taken with a 1/60sec shutter and a 1/1000sec shutter; the photograph taken at 1/1000sec is much sharper. There are points of diminishing returns well above 60fps, but the curve doesn't end until well beyond 1000fps (equivalence, via interpolation and/or strobes). That's why Samsung and Sony advertise "Clear Motion Rate 960" and "Motionflow XR 960" on some high-end HDTVs (costing over $2000). Although somewhat gimmicky, tests have shown that motion on these displays is quite clear, although the input lag of interpolation makes them unsuitable for video games. LightBoost solves this problem by avoiding interpolation and strictly sticking to strobes, providing about 90% less motion blur than a regular 60 Hz LCD. (A regular non-LightBoost 120 Hz LCD has only 50% less motion blur than a 60 Hz LCD.)
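The 90% and 50% figures fall straight out of visible persistence per frame (a simplified sketch; the ~1.7 ms LightBoost strobe length is an assumed typical value, and actual strobe lengths vary by setting):

Code:
# Motion blur scales with visible persistence per frame (simplified model).
baseline = 1000 / 60              # 60 Hz sample-and-hold: ~16.7 ms persistence
cases = {
    "120 Hz sample-and-hold": 1000 / 120,   # ~8.3 ms persistence
    "LightBoost strobe":      1.7,          # assumed ~1.7 ms strobe flash
}
for name, persistence_ms in cases.items():
    reduction = 1 - persistence_ms / baseline
    print(f"{name}: ~{reduction:.0%} less motion blur than 60 Hz LCD")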
Obviously, cameras and eyeballs are not an apples-to-apples comparison, but the analogy helps people understand why 60fps is not the final frontier, and where the extra motion blur caused by your moving eyeballs comes from: it is the same type of blurring as taking a long-exposure photograph with a moving camera. (The sample-and-hold effect forces this second cause of motion blur to happen, in addition to the motion blur caused by pixel persistence; these are two separate causes of display motion blur, as explained in academic/university papers and TV manufacturer research.)
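As a simplified additive model of those two causes (an assumption for illustration only; real perception is more complex and the 4 ms pixel response is an assumed figure):

Code:
# Simplified additive model of the two blur causes (illustrative only):
# 1) eye-tracking blur across the hold time, 2) pixel persistence (GtG) blur.
speed = 960                    # assumed px/s of tracked motion
hold_ms = 1000 / 60            # sample-and-hold time at 60 Hz
gtg_ms = 4.0                   # assumed 4 ms pixel response time
total_blur_px = speed * (hold_ms + gtg_ms) / 1000
print(f"~{total_blur_px:.1f} px of blur")   # hold blur dominates GtG here

Note how the hold time dwarfs the pixel response: that is why faster GtG alone cannot fix sample-and-hold blur, while strobing can.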