Some interesting science, which is pretty well covered (e.g. Microsoft Research, Nokia, Sharp Labs, universities), is that motion blur is dictated by the length of time a refresh is displayed for, rather than by the refresh rate itself.
That's why stroboscopic (flickering) displays such as CRT, plasma and LightBoost have excellent motion quality: the flicker shortens the length of time a refresh is displayed for.
It's also precisely why long-time CRT users who have switched to 120 Hz LCDs (and been disappointed) have noticed that 75fps@75Hz on a CRT produces much clearer motion than 120fps@120Hz on an LCD.
That's because CRT phosphor decays in less than 1/500th of a second, far shorter than the full 1/120th of a second that an LCD refresh remains continuously displayed.
LightBoost strobes are 1/700th to 1/400th of a second (a range of roughly 1.4ms to 2.4ms), depending on the LightBoost OSD setting. A frame flashed for just 1/700th of a second produces less than 1/10th the motion blur of a standard 60 Hz LCD refresh, which shines continuously for the whole refresh; this continuous display is called "sample-and-hold".
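To put rough numbers on that comparison, here's a minimal Python sketch (using only the persistence figures quoted above; the math is just ratios):

```python
# Motion blur scales with how long each refresh stays lit (its persistence).
# Figures below are the ones quoted above.

PERSISTENCE_MS = {
    "60 Hz sample-and-hold LCD": 1000 / 60,    # ~16.7 ms, lit for the whole refresh
    "120 Hz sample-and-hold LCD": 1000 / 120,  # ~8.3 ms
    "CRT phosphor decay": 1000 / 500,          # < 1/500 sec = 2 ms
    "LightBoost shortest strobe": 1000 / 700,  # ~1.4 ms flash per refresh
}

baseline = PERSISTENCE_MS["60 Hz sample-and-hold LCD"]
for name, ms in PERSISTENCE_MS.items():
    print(f"{name}: {ms:.2f} ms lit -> {baseline / ms:.1f}x less blur than 60 Hz LCD")
# The ~1.4 ms LightBoost strobe works out to ~11.7x less blur than a
# 60 Hz LCD -- i.e. "less than 1/10th the motion blur", as stated above.
```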
On newer displays, motion blur is mostly caused by eye tracking, NOT pixel persistence
According to many research papers, including those in Science & References here, motion blur is caused by eye tracking across a frame. Your eyes are always moving to track moving objects. The longer a frame is displayed for, the more eye tracking occurs across the frame (your eyes are in a different position at the beginning of a frame than at its end). That's why even sample-and-hold OLED displays have lots of motion blur (e.g. the PS Vita) even though pixel response is nearly instantaneous. Even on LCDs (TN panels especially), pixel persistence is already only a tiny fraction of a refresh, so most motion blur nowadays is caused by eye tracking instead of pixel persistence. The artificial stepping caused by a finite framerate gives eye tracking the opportunity to create a motion blur limitation that has nothing to do with pixel persistence.
Source: Microsoft Research (and many others)
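A quick way to quantify this eye-tracking blur: while a frame is held on screen, your eyes keep moving, so the static frame smears across the retina by (tracking speed × persistence). A worked Python example, assuming a hypothetical 1000 pixels/second pan:

```python
# Eye-tracking motion blur: blur width = tracking speed * time the frame stays lit.
# The pan speed here is a hypothetical example value.

def eye_tracking_blur_px(speed_px_per_s: float, persistence_ms: float) -> float:
    """Smear width, in pixels, across the retina during one lit period."""
    return speed_px_per_s * (persistence_ms / 1000)

speed = 1000  # pixels/second of tracked on-screen motion
for label, ms in [("60 Hz sample-and-hold", 16.7),
                  ("120 Hz sample-and-hold", 8.3),
                  ("LightBoost 1.4 ms strobe", 1.4)]:
    print(f"{label}: {eye_tracking_blur_px(speed, ms):.1f} px of blur")
# Even with instantaneous (0 ms GtG) pixels, a sample-and-hold display still
# produces this blur -- which is why the OLED example above still blurs.
```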
The only way to reduce motion blur caused by eye tracking is to shorten the length of time that a frame is displayed for. This is accomplished either with extra Hz (e.g. 240Hz, 480Hz, 960Hz...) or with extra black periods between refreshes (e.g. flicker displays such as CRT, plasma, LightBoost, black frame insertion, stroboscopic backlights, etc.) The bigger the black period and the shorter the visible refresh period, the less the motion blur.
A medium-persistence CRT display with 2ms phosphor decay has the equivalent amount of motion blur to a theoretical flicker-free 500fps@500Hz LCD (2ms per continuously-shining refresh), which does not exist without motion interpolation (bad for games due to input lag). Who can buy a 500 Hz LCD and run 500fps on it natively? You can't. As a result, it's cheaper from a GPU perspective to simply shorten the frame samples without raising the framerate, by adding black periods between frames. That means flicker (the stroboscopic effect of a black period between frames) is required to prevent the frames from being motion-blurred as your eyes track across them. You then want a high enough refresh rate that the flicker is not noticeable (e.g. 120 Hz instead of 60 Hz).
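Worked out numerically, that equivalence is simple persistence arithmetic; a small sketch (assuming the 2ms figure above):

```python
# A strobe backlight decouples persistence from refresh rate: blur depends on
# how long each frame is LIT, not on how many frames per second the GPU renders.

refresh_hz = 120
refresh_period_ms = 1000 / refresh_hz       # ~8.33 ms per refresh
strobe_ms = 2.0                             # ~2 ms flash, CRT-phosphor-like

equivalent_hold_hz = 1000 / strobe_ms       # sample-and-hold rate with equal blur
duty_cycle = strobe_ms / refresh_period_ms  # fraction of time the backlight is on

print(f"A {strobe_ms} ms strobe at {refresh_hz} Hz matches the blur of a "
      f"{equivalent_hold_hz:.0f} Hz flicker-free display "
      f"({duty_cycle:.0%} backlight duty cycle)")
# -> 120fps@120Hz of GPU load buys the motion clarity of 500fps@500Hz.
```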
Clearly, a frame being stroboscopically flashed for only 1/400th or 1/700th second (LightBoost 120 Hz) produces vastly clearer motion than frames displayed for a full 1/120th second (non-LightBoost 120 Hz LCD). This is very similar to CRT phosphor decay of 1-2ms, and results in several "It looks like a CRT" testimonials from long-time CRT users.
You need the frame rate to match the refresh rate, to prevent repeated refreshes from contributing to motion blur. When judder is too high-frequency to detect, it blends into motion blur (this is why 60fps@120Hz still looks less clear than 120fps@120Hz). LightBoost is hardware-limited to function only at high refresh rates (100-120Hz), so it enforces a high GPU requirement for best motion clarity. It's also why LightBoost doesn't help you very much if you're only running half the framerate of the refresh rate; for the "wow" LightBoost effect, you really need to run fps at least matching Hz, so that your eye tracking trajectory accurately follows the motion in the refreshes.
Source: Microsoft Research (and many others)
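The fps=Hz requirement falls out of the same arithmetic: repeated refreshes lengthen the effective sample your eye tracks across. A rough sketch (sample-and-hold case; with strobing, repeated refreshes instead show up as double images):

```python
# When fps < Hz, each unique frame is shown for multiple refreshes, so the
# effective sample time (what your tracking eye smears across) grows.

def effective_sample_ms(fps: float, refresh_hz: float) -> float:
    repeats = refresh_hz / fps            # refreshes per unique frame
    return repeats * (1000 / refresh_hz)  # simplifies to 1000 / fps

print(f"{effective_sample_ms(120, 120):.1f} ms")  # ~8.3 ms  -> full 120 Hz clarity
print(f"{effective_sample_ms(60, 120):.1f} ms")   # ~16.7 ms -> same blur as 60fps@60Hz
# This is why 60fps@120Hz looks no clearer than 60 Hz, and why LightBoost's
# benefit mostly vanishes unless fps matches Hz.
```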
Things that weaken a display's ability to eliminate motion blur
- Sample and hold (a la traditional LCD - creates eye-tracking-based motion blur)
- Pixel persistence (finally solvable by turning off the backlight while waiting for pixels to transition)
- Eye tracking (solved by reducing the amount of time a frame is displayed for)
- Judders/stutters/inconsistent frame rendertimes, which put eye tracking out of sync with motion (get a better computer/GPU)
- Repeated refreshes from frame rates lower than the refresh rate, which put eye tracking out of sync with motion (get a better computer/GPU)
You really need fps=Hz for best CRT effect
All weak links need to be eliminated if your goal is eliminating all sources of perceived motion blur. Truly, you NEED fps=Hz for maximum elimination of motion blur; otherwise LightBoost mostly becomes a ho-hum thing. Once ALL weak links are eliminated, that's when strobe displays (e.g. CRT or LightBoost) start to really shine. Judder-free 120fps@120Hz is possible with Source Engine videogames on good nVidia GPUs (e.g. GTX 680), so such games are excellent candidates for the LightBoost mode; unlike newer games such as Crysis, which may actually look better during non-flicker 144Hz due to the inability to run fps=Hz for the "CRT look". People highly familiar with 60fps@60Hz CRT (e.g. Sega Model 3 arcade video games, or Nintendo scrollers) will know how clear a CRT is at scrolling.
For the purposes of testing "perfect motion" (perfectly consistent frame render times, with frame renders synchronized to refresh), it is easier to get smooth motion with VSYNC enabled, but that adds input lag. If VSYNC OFF is your preferred setting, you need to get the smoothest possible VSYNC OFF motion you can (good GPU, good system, consistent frame render times, good game settings, etc.).
The old BENQ AMA-Z flickering backlight of 2006
Yesterday's motion-blur-reducing backlights (e.g. the BENQ AMA-Z in year 2006) reduced motion blur by only 30% (~1.5x less motion blur). They flickered badly at 60 Hz, and never sold very well. Today's vastly superior LightBoost stroboscopic backlights (originally designed for 3D Vision) are capable of eliminating 92% of motion blur (a whopping 12x reduction; a complete order of magnitude!). Manufacturers now need to reconsider marketing this technology and make it easy to turn on/off, e.g. reintroducing it as a dedicated button on the monitor.
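For reference, the percentage-to-multiplier conversion used here is just 1/(1 - reduction); a quick check:

```python
# Converting "% of motion blur eliminated" into a clarity multiplier:
# remaining blur = 1 - reduction, so multiplier = 1 / (1 - reduction).

for name, reduction in [("BENQ AMA-Z (2006)", 0.30),
                        ("LightBoost, shortest strobe", 0.92)]:
    print(f"{name}: {reduction:.0%} eliminated -> "
          f"{1 / (1 - reduction):.1f}x less motion blur")
# 30% -> ~1.4x (the "~1.5x" above); 92% -> 12.5x (the "12x" above).
```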
The miracle of cramming pixel persistence into a vertical blanking interval
Meanwhile, manufacturers needed to invent LCD panels that can refresh quickly enough for active 3D (alternating left/right eyes with minimal pixel persistence leaking between frames, for shutter glasses operation). Finally, pixel persistence was recently successfully compressed into the time period of a vertical blanking interval for nearly all GtG transitions! Once this was accomplished, the pixel persistence barrier was shattered -- pixel persistence ceases to be a limiting factor in motion blur when it's possible to hide pixel persistence (>99%) by turning off the backlight between refreshes. The strobe backlight can flash for a shorter time period than the length of the pixel transitions, as long as virtually all of the pixel persistence is kept in darkness between backlight flashes on complete frames. For example, the ASUS VG278H is a 2ms TN panel with a measured MPRT (Motion Picture Response Time) of a mere 1.4 milliseconds when LightBoost is enabled and configured to the 10% setting (shortest strobe length). It is impressive to see actual examples of LCDs that have less motion blur than their pixel persistence limitations imply, thanks to the ability of stroboscopic backlights to bypass pixel persistence.
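Conceptually, the timing trick is simple; here's a rough sketch of one refresh cycle (timings are illustrative, loosely based on the VG278H figures quoted above):

```python
# One 120 Hz refresh cycle on a strobe backlight, hiding GtG in darkness.
# Timings are illustrative approximations, not measured panel specs.

refresh_period_ms = 1000 / 120  # ~8.33 ms total per refresh
gtg_transition_ms = 2.0         # pixels finish transitioning while unlit
strobe_flash_ms = 1.4           # backlight flashes the fully-settled frame

dark_ms = refresh_period_ms - strobe_flash_ms
assert gtg_transition_ms <= dark_ms  # transitions must fit in the dark period

print(f"dark (pixels transitioning, backlight OFF): {dark_ms:.2f} ms")
print(f"lit  (clean frame flashed, backlight ON):   {strobe_flash_ms:.2f} ms")
# The eye only ever sees fully-settled frames, so perceived MPRT is roughly
# the 1.4 ms flash, not the 2 ms GtG -- persistence is hidden in the dark.
```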
Obviously, to really notice, you need to be sensitive enough to motion blur (and understand that you need to eliminate all the motion blur weak links): the kind of person who clearly sees a massive difference between a 60 Hz LCD and a 120 Hz LCD (~50% improvement in motion clarity), yet also immediately notices that a 120 Hz LCD still does not have motion as clear as a CRT. In that case, you'll more easily notice the further improvement of going from 120 Hz LCD to 120 Hz LightBoost (~75-80% further improvement, for a grand total of 85%-92% improvement in motion clarity) during fast-action games like FPS.
The niche market is bigger than expected...
Motion-blur-eliminating (precisely synchronized) strobe backlights are not for everyone, but they are proving to be a feature in demand by thousands (my LightBoost HOWTO has had 15,000 pageviews in the last 7 days alone, with over a thousand downloads of my .reg and .inf files). LightBoost is more popular on some forums (e.g. HardForum and OCN, with tens of thousands of views) than on others such as this one, but it already appears to be pushing some sales of LightBoost-enabled monitors (people buying because they heard about the lack of motion blur LightBoost provides). Monitor manufacturers and nVidia need to take notice: this niche market is big enough to be worth making LightBoost easier to enable in nVidia drivers (without requiring 3D glasses). Enthusiast gamers (especially those who came from CRT) don't realize they're sitting on something really good unless they know it exists.
Now back to regular fun posting; hopefully I've not bored too many people with the technicalities of why LCDs produce far more motion blur than CRTs (and not because of pixel persistence).