New tweaks for ASUS/BENQ/Samsung 120 Hz! (Zero LCD motion blur; looks like CRT)


BoFox

Senior member
May 10, 2008
689
0
0
But he's what? Doubling his resolution and gaining about 4ms of response time.

He is, however, now having to deal with bezels, worse overall picture quality, and the inherent trouble related to running anything at triple-monitor resolution.
Well, since these are IPS panels, there shouldn't be worse overall picture quality, unlike the TN panels that have color-shifting issues at slight viewing angles.

I'd definitely trade in 4ms for 3x the resolution - especially for racing games where I can turn my head around and see where I'm going! :rolleyes: :whistle: :p
 

BoFox

Senior member
May 10, 2008
689
0
0
Some interesting science, which is pretty well covered (e.g. Microsoft Research, Nokia, Sharp Labs, universities), is that motion blur is actually dictated by the length of time a refresh is displayed for, rather than by the refresh rate itself.

That's why stroboscopic (flickering) displays such as CRT, plasma, and LightBoost have excellent motion quality. The flicker shortens the length of time a refresh is displayed for.

It's also precisely why long-time CRT users who have switched to 120 Hz LCD's (and gotten disappointed) have noticed that CRT 75fps@75Hz produces much clearer motion than LCD 120fps@120Hz. That's because the CRT phosphor decays in less than 1/500th of a second, far shorter than the 1/120th of a second that an LCD refresh is continuously displayed for.

LightBoost strobes are 1/700th to 1/400th of a second (roughly 1.4ms to 2.5ms), depending on the LightBoost OSD setting. A frame flashed for only 1/700th of a second produces less than 1/10th the motion blur of a standard 60 Hz LCD refresh, which shines continuously for the whole refresh -- the so-called "sample and hold".
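To put rough numbers on that (a back-of-envelope sketch only; the eye-tracking speed below is an assumed illustrative figure, not a measurement), the blur smear width scales with how long each frame stays lit:

```python
# Back-of-envelope sketch: perceived blur on a tracked object is roughly
# (eye tracking speed) x (time the frame stays lit). Strobe/persistence
# figures are from the post above; the eye speed is an assumption.

def blur_width_px(frame_visible_s, eye_speed_px_per_s=1000.0):
    """Smear width = how far the eye travels while one frame is visible."""
    return eye_speed_px_per_s * frame_visible_s

sample_and_hold_60hz = 1.0 / 60    # ~16.7 ms: refresh lit the whole time
lightboost_strobe    = 1.0 / 700   # ~1.4 ms flash at the shortest setting

print(blur_width_px(sample_and_hold_60hz))       # ~16.7 px of smear
print(blur_width_px(lightboost_strobe))          # ~1.4 px of smear
print(sample_and_hold_60hz / lightboost_strobe)  # ~11.7x: "less than 1/10th"
```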

On newer displays, motion blur is mostly caused by eye tracking, NOT pixel persistence
According to many research papers, including those in Science & References here, motion blur is caused by eye tracking across a frame. Your eyes are always moving to track moving objects. The longer a frame is displayed for, the more eye tracking across the frame occurs (your eyes are in a different position at the beginning of a frame versus the end of a frame). That's why even sample-and-hold OLED displays have lots of motion blur (e.g. the PS Vita) despite nearly instantaneous pixel response. Even on LCD's (TN's especially), pixel persistence is already only a tiny fraction of a refresh, so most motion blur nowadays is caused by eye tracking rather than pixel persistence. The artificial stepping caused by a finite framerate gives eye tracking the opportunity to create a motion blur limitation that is not caused by pixel persistence.

[Image: gg463407.TempRate16 diagram, from MSDN]

Source: Microsoft Research (and many others)

The only way to reduce motion blur (caused by eye tracking) is to shorten the length of time that a frame is displayed for. This is accomplished either with extra Hz (e.g. 240Hz, 480Hz, 960Hz...) or with extra black periods between refreshes (e.g. flicker displays such as CRT, plasma, LightBoost, black frame insertion, stroboscopic backlights, etc.). The bigger the black period and the smaller the visible refresh period, the less the motion blur.

A medium-persistence CRT display with 2ms phosphor decay would have motion blur equivalent to a theoretical flicker-free 500fps@500Hz LCD (2ms per continuously-shining refresh), which does not exist without motion interpolation (bad for games due to input lag). Who can buy a 500 Hz LCD and run 500fps on it natively? You can't. As a result, it's cheaper from a GPU perspective to simply shorten the frame samples without raising the framerate, by adding black periods between frames. That means flicker (the stroboscopic effect of a black period between frames) is required to prevent the frames from being motion blurred as your eyes track across them. You then want a high enough refresh rate so the flicker is not noticeable (e.g. 120 Hz instead of 60 Hz).
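The persistence-to-refresh-rate equivalence is just a reciprocal (a sketch, using the figures above):

```python
# A display whose image persists for T per refresh has the same tracked-eye
# motion blur as a flicker-free display refreshing every T. Equivalent Hz = 1/T.

def equivalent_flickerfree_hz(persistence_ms):
    return 1000.0 / persistence_ms

print(equivalent_flickerfree_hz(2.0))  # 2 ms phosphor decay ~ 500fps@500Hz
print(equivalent_flickerfree_hz(1.4))  # 1.4 ms strobe ~ 714fps@714Hz
```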

Clearly, a frame being stroboscopically flashed for only 1/400th or 1/700th second (LightBoost 120 Hz) produces vastly clearer motion than frames displayed for a full 1/120th second (non-LightBoost 120 Hz LCD). This is very similar to CRT phosphor decay of 1-2ms, and results in several "It looks like a CRT" testimonials from long-time CRT users.

You need the frame rate to match the refresh rate, to prevent repeated refreshes from contributing to motion blur. When judder is at too high a frequency to detect, it blends into motion blur (this is why 60fps@120Hz still looks less clear than 120fps@120Hz). LightBoost is hardware-limited to high refresh rates (100-120Hz), which enforces a high GPU requirement for best motion clarity. It's also why LightBoost doesn't help you very much if you're running at only half the framerate of the refresh rate; for the "wow" LightBoost effect, you really need fps at least matching Hz, so that your eye-tracking trajectory follows the motion in the refreshes accurately.
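Here's a sketch of why repeated refreshes hurt on a strobed display (the eye speed is again just an assumed figure):

```python
# At 60fps on a 120Hz strobed backlight, each frame gets flashed twice,
# 1/120 s apart. A tracking eye sees the repeat flash displaced, so the
# image doubles/smears by (eye speed) x (flash separation).

eye_speed_px_per_s = 1000.0            # assumed tracking speed
strobe_hz, fps = 120, 60
flashes_per_frame = strobe_hz // fps   # 2 flashes of the same frame
separation_s = 1.0 / strobe_hz         # ~8.33 ms between repeat flashes

ghost_offset_px = eye_speed_px_per_s * separation_s * (flashes_per_frame - 1)
print(ghost_offset_px)  # ~8.3 px double-image at 60fps@120Hz; 0 at 120fps@120Hz
```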

[Image: gg463407.TempRate15 diagram, from MSDN]

Source: Microsoft Research (and many others)

Things that weaken a display's ability to eliminate motion blur
- Sample and hold (a la traditional LCD - creates eye-tracking-based motion blur)
- Pixel persistence (finally solvable by turning off the backlight while waiting for pixels to transition)
- Eye tracking (solved by reducing the amount of time a frame is displayed for)
- Judders/stutters/inconsistent frame rendertimes - eye tracking falls out of sync with motion (get a better computer/GPU)
- Repeated refreshes from frame rates lower than the refresh rate - eye tracking falls out of sync with motion (get a better computer/GPU)

You really need fps=Hz for best CRT effect
All weak links need to be eliminated, if your goal is eliminating all sources of perceived motion blur. Truly, you NEED fps=Hz for maximum elimination of motion blur; otherwise LightBoost mostly becomes a ho-hum thing. Once ALL weak links are eliminated, that's when strobe displays (e.g. CRT or LightBoost) start to really shine. Judder-free 120fps@120Hz is possible with Source Engine videogames on good nVidia GPU's (e.g. GTX 680), so such games are excellent candidates for the LightBoost mode; unlike newer games such as Crysis, which may actually look better during non-flicker 144Hz due to the inability to run fps=Hz for the "CRT look". People highly familiar with 60fps@60Hz CRT (from Sega Model 3 arcade video games to Nintendo scrollers) will know how clear a CRT is at scrolling.

For the purposes of testing "perfect motion" (perfectly consistent frame render times, with frame renders synchronized to refreshes), it is easier to get smooth motion by enabling VSYNC, but that adds input lag. If VSYNC OFF is your preferred setting, you need to get the smoothest possible VSYNC OFF motion that you can (good GPU, good system, consistent frame render times, good game settings, etc.).

The old BENQ AMA-Z flickering backlight of 2006
Yesterday's motion-blur-reducing backlights (e.g. the BENQ AMA-Z of year 2006) reduced motion blur by only 30% (~1.5x less motion blur). Those flickered badly at 60 Hz and never sold very well. Today's vastly superior LightBoost stroboscopic backlights (originally designed for 3D Vision) are capable of eliminating 92% of motion blur (a whopping 12x reduction; a complete order of magnitude!). Manufacturers now need to reconsider marketing this technology and make it easy to turn on/off, reintroducing it as a special button on the monitor.

The miracle of cramming pixel persistence into a vertical blanking interval
Meanwhile, manufacturers needed to invent LCD panels that can refresh quickly enough for active 3D (alternating left/right eyes with minimal pixel persistence leaking between frames, for shutter glasses operation). Finally, pixel persistence was recently successfully compressed into the time period of a vertical blanking interval for nearly all GtG transitions. Once this was accomplished, the pixel persistence barrier was shattered -- pixel persistence ceases to be a limiting factor in motion blur when it's possible to hide pixel persistence (>99%) by turning off the backlight between refreshes. The strobe backlight can flash for a shorter time period than the length of the pixel transitions, as long as virtually all the pixel persistence is kept in darkness between backlight flashes on complete frames. For example, the ASUS VG278H is a 2ms TN panel with a measured MPRT (Motion Picture Response Time) of a mere 1.4 milliseconds when LightBoost is enabled and configured to the 10% setting (shortest strobe length). It is impressive to see actual examples of LCD's that have less motion blur than their pixel persistence limitations imply, thanks to the ability of stroboscopic backlights to bypass pixel persistence.
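A rough timing budget shows how this works at 120 Hz (illustrative numbers only, not actual panel specs; real panels overlap the scan-out and the pixel settling):

```python
# Illustrative strobe-backlight timing budget for one 120 Hz refresh.
refresh_period_ms = 1000.0 / 120   # ~8.33 ms per refresh
scan_out_ms       = 2.0            # assumed accelerated scan of the panel
gtg_settle_ms     = 2.0            # assumed worst-case pixel transition
strobe_ms         = 1.4            # LightBoost=10% flash, per the post

dark_ms = refresh_period_ms - strobe_ms
print(dark_ms)                                 # ~6.9 ms of backlight-off time
print(dark_ms >= scan_out_ms + gtg_settle_ms)  # True: persistence hides in the dark
```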

Obviously, to really notice, you need to be sensitive enough to motion blur (and understand that you need to eliminate all motion blur weak links) -- the kind of person who clearly sees a massive difference between a 60 Hz LCD and a 120 Hz LCD (~50% improvement in motion clarity), but also immediately notices that a 120 Hz LCD does not have motion as clear as a CRT. In that case, you'll more easily notice the further improvement of going from 120 Hz LCD -> 120 Hz LightBoost (~75-80% further improvement, for a grand total of 85%-92% improvement in motion clarity) during fast-action games like FPS.

The niche market is bigger than expected...
Motion-blur-eliminating (precisely synchronized) strobe backlights are not for everyone, but they are proving to be a feature apparently in demand by thousands (my LightBoost HOWTO has had 15,000 pageviews in the last 7 days alone, with over a thousand downloads of my .reg and .inf files). LightBoost is more popular on some forums (e.g. HardForum and OCN, with tens of thousands of views) than on forums such as this one, but it appears to be already pushing some sales of LightBoost-enabled monitors (people buying because they heard about the lack of motion blur provided by LightBoost). Monitor manufacturers and nVidia need to take notice that this niche market is big enough to make it worthwhile for nVidia drivers to enable this more easily (without requiring 3D glasses). Enthusiast gamers (especially those who came from CRT) don't realize that they're sitting on something really good unless they know it exists.

Now back to regular fun posting; hopefully I've not bored too many people about the technicalities of why LCD's produce far more motion blur than CRT's (and not because of pixel persistence).

Wow, just wow! You taught me something neat! Thank you, sir!

Funny how developers keep on adding exaggerated motion blur to console ports, and we're trying to fight motion blur for our precious PC games! Sometimes, when I play games on my CRTs, I let the motion blur option remain enabled, especially if it's just a slower-paced game and if disabling motion blur automatically disables some other special effects (like Resident Evil 5 for example). Then on my LCD, I try to disable motion blur because heck, there's already built-in motion blur in the monitor!

Hey, I just would like to share some more insight as to the CRT phosphor decay time, since you claimed it to be 1/500th of a second. I bolded parts of your text above in red, including where you said "A medium persistence CRT display with 2ms phosphor decay".
Perhaps that would be true for the time it takes for a phosphor fired up to medium brightness (at say, 50% luminosity) to fade to black.
Pretty much all CRTs out there, including old TV's, display far, far, FAR longer fade times if the phosphors are fired up to maximum brightness (at least 80% of the MAX luminosity, as happens when the contrast is turned up past 80%).

I would think that it's on some sort of a logarithmic scale.
I have even seen phosphor decays take as long as 50+ms (more, actually), rather than 2ms, on a tweaked CRT monitor that had its brightness pushed beyond manufacturer specs. It was so bad that the mouse cursor would leave a long ghosting trail against a dark background when moved at about 10 inches per second (just fast enough for each refresh to show a distinct step of movement).

It seems that there are two factors in play: phosphor brightness, and how long the phosphor was fired up for.

Say, a phosphor fired up to 50% brightness would take 2ms to fade to black.
60% brightness - 3ms
70% - 5ms
80% - 8ms
90% - 12ms
100% - 20+ms (depending on the CRT make/model)

Plus, if the phosphor was fired up to 100% for more than 0.1-0.2 seconds, then it'd take at least twice as long for the phosphor to decay to black than if it were fired up to 100% for only one frame at 120Hz (8.33ms).
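Just to illustrate the shape of this hypothesis (a toy model only; every constant below is made up, not measured), a single-exponential phosphor decay already makes the fade-to-black time grow with the logarithm of the starting brightness, though the really long 20-50ms tails I describe would need an extra slow-decay component on top of this:

```python
import math

# Toy model: L(t) = L0 * exp(-t / tau). Time to fall below a fixed
# "visually black" threshold grows with log(L0). All values are assumptions.
tau_ms = 0.6        # assumed fast decay constant
threshold = 0.02    # assumed normalized luminance that reads as "black"

for l0 in (0.5, 0.6, 0.7, 0.8, 0.9, 1.0):
    fade_ms = tau_ms * math.log(l0 / threshold)
    print(f"{l0:.0%} brightness -> ~{fade_ms:.2f} ms to fade to black")
```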

I have not taken actual physical measurements of this. It's only a hypothesis, from what I have seen and studied with my own eyes - far from a precise measurement tool. Yet Stereo3D ghosting is pretty much what showed me that even when a frame is displayed for only 1 out of every 2 refreshes, that's still not quite enough time for the phosphor to sufficiently decay.

That's why, although CRT is great at high "brightness-contingent semi-strobing" refresh rates, it sucks for Stereo3D gaming, where the ghosting is unbearable. Usually I have to turn the brightness up as much as possible on my FW900's to compensate for the dim shutter glasses, but that only makes the ghosting worse. I end up seeing bright double images everywhere and have a hard time ignoring them - sometimes it's severe, actually.

I really look forward to trying out the IPS panels with 120Hz Lightboost (in 3D also), even if the response time is nowhere as good as my 65" DLP HDTV that has 0.016ms (16 microsecond) response time.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
1
71
Well, since these are IPS panels, there shouldn't be worse overall picture quality, unlike the TN panels that have color-shifting issues at slight viewing angles.
Look again. TN panels. Yes, three TN panels in surround; because he said "LightBoost" and these are VG248QE's.

Yes, he owns a Catleap 2B IPS 1440p 27" overclocked to 130 Hz. But he prefers to play on these TN panels because of the zero motion blur effect made possible by LightBoost. You can't cram three CRT's that close together. With LightBoost, you can.

The uniformity problems are minor (to him) compared to the massive motion blur mess that IPS still is (for people who are sensitive to motion blur) in fast-action FPS. Here's an example of how the lack of motion blur can outweigh IPS quality in certain situations (for people who are REALLY sensitive to motion blur):
SkyViper said:
(from HardForum)
So I finally got the VG248QE hooked up last night and was able to play around with it for a couple hours. The other monitor that I have is a HP ZR30W which is a 30" 2560x1600 IPS monitor so I will be comparing the VG248QE to that a lot in this review.

Right off the bat, I noticed the color quality seems to be a lot worse than the ZR30W. Everything looks washed out and dull, not to mention the monitor suffers from poor viewing angles. On the ZR30W, there is next to no color shifting when I move my head around, unlike the VG248QE, but that's a common problem with all TN monitors. I tried calibrating the monitor a little bit using some of the values posted online, but it still doesn't compare to the HP.

Moving on, the first thing I tried was 144 Hz gaming. I loaded up Borderlands 2 just to see how it is, and I can definitely say it felt smoother. There is no screen tearing at all on the ASUS, unlike on the HP if I don't turn on Vsync. Although the game felt smoother at 144 Hz and there was less blurring, I found that having to play at a lower res (1920x1080 vs 2560x1600) with poorer color reproduction made the overall gaming experience WORSE. Granted, this isn't a competitive online FPS game, so I might have benefited more from having a faster refresh rate, but I would have probably stuck with playing this game on the 30" IPS monitor rather than a 24" TN.

At this point I felt like I may have wasted $300 bucks on a monitor that is full of compromises. The next thing I tried of course was using the Lightboost hack. This was the main reason why I bought the monitor in the first place since there are plenty of other 120 Hz monitors that I could have gotten that I'm sure had better color reproduction.

So I downloaded the hacked INF file and followed Mark's instructions. After turning on Lightboost, I noticed the monitor became a little bit brighter so I loaded up PixPerAn just to verify everything is working. The first thing I noticed was that I can actually read "I need more socks" at full speed! This was cool since I've never been able to read it going so fast before on any LCD monitor.

I then proceeded to load up Borderlands 2 again, not having high expectations. The first thing that happened was that the FPS dropped down to around 1-2 fps, but then I remembered to hold down "Ctrl-T" for a few seconds to turn off the 3D effect, which fixed the FPS problem. So I loaded up a game and the first thing that came to my mind was...

SWEET MOTHER OF GOD!

Am I seeing this correctly? The last time I gamed on a CRT monitor was back in 2006 before I got my first LCD and this ASUS monitor is EXACTLY like how I remembered gaming on a CRT monitor. I was absolutely shocked and amazed at how clear everything was when moving around. After seeing Lightboost in action, I would have gladly paid twice the amount for something that can reproduce the feeling I got when playing on a CRT. Now I really can't see myself going back to my 30" 2560x1600 IPS monitor when gaming. Everything looks so much clearer on the ASUS with Lightboost turned on.


If you do any kind of gaming, you should definitely get this monitor. For everything else however, an IPS monitor would probably be better.

Thankfully I am lucky enough to have both :)
Look at how he hates TN quality, until he saw the zero motion blur. There are many testimonials, on several dozen different forum websites, about the lack of motion blur that enabling LightBoost makes possible.

For fast-action games and certain gameplay styles, for people who are actually sensitive to motion blur, sometimes the LightBoost effect outweighs the poor TN colors. Imagine having 12x the perceived resolution during fast motion -- that's what LightBoost gives you for very fast pans: pans as perfectly clear/sharp as stationary images. It's like having high-def fast motion (LightBoost) instead of VHS-quality fast motion (motion-blurred 60 Hz), which can impede competitive FPS gaming where you need to identify enemies quickly during fast motion; it's far easier to track your eyes on fast motion if the motion is perfectly sharp in fast pans (the smooth-as-butter CRT effect).
 

Mark Rejhon

Senior member
Dec 13, 2012
273
1
71
Funny how developers keep on adding exaggerated motion blur to console ports, and we're trying to fight motion blur for our precious PC games! Sometimes, when I play games on my CRTs, I let the motion blur option remain enabled, especially if it's just a slower-paced game and if disabling motion blur automatically disables some other special effects (like Resident Evil 5 for example). Then on my LCD, I try to disable motion blur because heck, there's already built-in motion blur in the monitor!
Yes, sometimes motion blur can be beneficial:
Hides stutters more pleasingly.
...Stutters are often more easily noticeable on CRT than on LCD.
Hides stroboscopic effects more pleasingly (e.g. wagon-wheel artifacts).
...More important at lower refreshes like 60 Hz.
Makes low framerates look more pleasing.
...If your framerate is far lower than refresh, motion blur is far more pleasing.
Makes for sometimes-pretty imagery if you are doing things slowly.
...You aren't doing fast motions that require fast reactions.

Perhaps that would be true for the time it takes for a phosphor fired up to medium brightness (at say, 50% luminosity) to fade to black.
You are correct about the logarithmic scale. That said, most phosphor decay measurements are for average decay, and for fade to 90% black. That is where the phosphor ceases to noticeably contribute to motion blur (in the form of a ghosting effect, usually green trailing). That's where the common 1-2ms values come from.

I have not taken actual physical measurements of this. It's only a hypothesis, from what I have seen and studied with my own eyes
You are correct. See high speed video of CRT. You can clearly see that the phosphor decay lasts all the way until the next frame. What matters, though, is when the phosphor essentially stops contributing to motion blur.

There have been a few reports that a properly configured LightBoost LCD has less motion blur than a Sony GDM-W900 CRT (see the media coverage, especially the TechNGaming article), so you are right that there are instances where LightBoost outperforms CRT (in motion clarity only, NOT color quality), especially if you adjust the LightBoost OSD setting down closer to 10%, which shortens the strobe lengths. The image is dim, though that's fine if you play in a dark room.

I really look forward to trying out the IPS panels with 120Hz Lightboost (in 3D also), even if the response time is nowhere as good as my 65" DLP HDTV that has 0.016ms (16 microsecond) response time.
But true, I can't wait for the first IPS LightBoost displays (if they're working on making it possible!). It will be very tricky to squeeze IPS pixel persistence into a blanking interval, but it could be possible via accelerated refresh electronics (e.g. scanning a 120Hz frame in 1/500th of a second) to make the idle time period between refreshes big enough to fit IPS pixel persistence. In addition, input lag will be affected by this buffering action, and because you need to wait in total darkness longer (for the pixel persistence to finish) before you strobe.

Alas, it may take several years before this happens on computer monitors, without inter-refresh crosstalk artifacts and with acceptable input lag. I've found that 1ms TN panels (with LightBoost enabled) have a better-looking CRT effect in motion test patterns (e.g. PixPerAn) than 2ms TN panels (with LightBoost enabled).

Pixel response time does not dictate motion blur alone. Frame sample length does.
As for DLP, there's still more motion blur on DLP than CRT because the DLP pixel pulses multiple times per refresh, using per-pixel PWM (ultra-high-speed temporal dithering at many kilohertz) to create the full spectrum of colors. That's equivalent to multiple refreshes (at the pixel level) in interfering with zero motion blur. To reduce motion blur, DLP projectors need to keep the pixel completely black for as long as possible between the last refresh and the next refresh. Some DLP's do this form of black frame insertion to reduce motion blur, but the bigger the black period, the dimmer the image and the lower the color resolution (because of lower-resolution temporal dithering).

Even a 0ms pixel response display can have far more motion blur than an 8ms pixel response display, because of longer frame sample lengths despite shorter pixel transition times. That's why a PS Vita OLED has similar motion blur to a fast TN LCD; most of the motion blur is caused by eye tracking on sample-and-hold. The effective response time (to human eyes) of a DLP is roughly the time period from the leading edge of the visible refresh to the trailing edge of the visible refresh. The visible refresh of a DLP is several milliseconds long, because of the temporal dithering that DLP needs to do to generate the full color spectrum from binary pixels, and to maximize utilization of the light from the projector bulb. That's the frame sample length, from the perspective of what creates motion blur when your eyes track across it (as explained in my previous posts with the science/references). Even with black frame insertion features, most DLP's take several milliseconds per refresh to finish the entire pixel PWM sequence (especially for bright colors), so you're going to have more motion blur than on most CRT or LightBoost displays. That said, the good DLP projectors have amazingly accurate color. They also happen to do stereoscopic 3D very well.
 

Pheesh

Member
May 31, 2012
138
0
0
Mark, this is awesome background. Is there any technical reason why AMD's driver teams couldn't implement support for a similar lightboost feature for their cards (including existing cards)? It's a pretty big feature for enthusiast gamers and I know people that are choosing NVIDIA cards over AMD solely because of this feature. I'm partially regretting my recent purchase of an ATI card because I won't be able to utilize this.

It's not clear if AMD is even aware of this yet or working on a similar feature.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
1
71
Mark, this is awesome background. Is there any technical reason why AMD's driver teams couldn't implement support for a similar lightboost feature for their cards (including existing cards)? It's a pretty big feature for enthusiast gamers and I know people that are choosing NVIDIA cards over AMD solely because of this feature. I'm partially regretting my recent purchase of an ATI card because I won't be able to utilize this.

It's not clear if AMD is even aware of this yet or working on a similar feature.
There's no technical reason stopping AMD from doing this. In fact, if you tweak a Radeon computer to match the resolution timings (Vertical Total of 1147 or 1149 including blanking interval), you can hot-plug a DVI cable between an nVidia computer to a Radeon computer, and LightBoost keeps working. You still seem to need an nVidia product to activate the LightBoost feature in a monitor.

However, LightBoost is an nVidia licensed name to monitor manufacturers. This feature was originally designed to make 3D stereoscopic clearer and brighter, by strobing the backlight in sync with the 3D shutter glasses. But also has the great side effect of eliminating motion blur even for 2D without glasses (something more important to some people like me, than wearing 3D glasses).

If you add a strobe backlight to a monitor that works with Radeon, it can't be called "LightBoost" (an nVidia licensed name). Samsung actually has a strobe backlight that works with Radeon, but it has extremely poor input lag compared to LightBoost. (See Samsung Zero Motion Blur HOWTO -- works on Radeon)
 

BoFox

Senior member
May 10, 2008
689
0
0
Yes, sometimes motion blur can be beneficial:
Hides stutters more pleasingly.
...Stutters are often more easily noticeable on CRT than on LCD.
Hides stroboscopic effects more pleasingly (e.g. wagon-wheel artifacts).
...More important at lower refreshes like 60 Hz.
Makes low framerates look more pleasing.
...If your framerate is far lower than refresh, motion blur is far more pleasing.
Makes for sometimes-pretty imagery if you are doing things slowly.
...You aren't doing fast motions that require fast reactions.

You are correct about the logarithmic scale. That said, most phosphor decay measurements are for average decay, and for fade to 90% black. That is where the phosphor ceases to noticeably contribute to motion blur (in the form of a ghosting effect, usually green trailing). That's where the common 1-2ms values come from.

You are correct. See high speed video of CRT. You can clearly see that the phosphor decay lasts all the way until the next frame. What matters, though, is when the phosphor essentially stops contributing to motion blur.

There have been a few reports that a properly configured LightBoost LCD has less motion blur than a Sony GDM-W900 CRT (see the media coverage, especially the TechNGaming article), so you are right that there are instances where LightBoost outperforms CRT (in motion clarity only, NOT color quality), especially if you adjust the LightBoost OSD setting down closer to 10%, which shortens the strobe lengths. The image is dim, though that's fine if you play in a dark room.


But true, I can't wait for the first IPS LightBoost displays (if they're working on making it possible!). It will be very tricky to squeeze IPS pixel persistence into a blanking interval, but it could be possible via accelerated refresh electronics (e.g. scanning a 120Hz frame in 1/500th of a second) to make the idle time period between refreshes big enough to fit IPS pixel persistence. In addition, input lag will be affected by this buffering action, and because you need to wait in total darkness longer (for the pixel persistence to finish) before you strobe.

Alas, it may take several years before this happens on computer monitors, without inter-refresh crosstalk artifacts and with acceptable input lag. I've found that 1ms TN panels (with LightBoost enabled) have a better-looking CRT effect in motion test patterns (e.g. PixPerAn) than 2ms TN panels (with LightBoost enabled).

Pixel response time does not dictate motion blur alone. Frame sample length does.
As for DLP, there's still more motion blur on DLP than CRT because the DLP pixel pulses multiple times per refresh, using per-pixel PWM (ultra-high-speed temporal dithering at many kilohertz) to create the full spectrum of colors. That's equivalent to multiple refreshes (at the pixel level) in interfering with zero motion blur. To reduce motion blur, DLP projectors need to keep the pixel completely black for as long as possible between the last refresh and the next refresh. Some DLP's do this form of black frame insertion to reduce motion blur, but the bigger the black period, the dimmer the image and the lower the color resolution (because of lower-resolution temporal dithering).

Even a 0ms pixel response display can have far more motion blur than an 8ms pixel response display, because of longer frame sample lengths despite shorter pixel transition times. That's why a PS Vita OLED has similar motion blur to a fast TN LCD; most of the motion blur is caused by eye tracking on sample-and-hold. The effective response time (to human eyes) of a DLP is roughly the time period from the leading edge of the visible refresh to the trailing edge of the visible refresh. The visible refresh of a DLP is several milliseconds long, because of the temporal dithering that DLP needs to do to generate the full color spectrum from binary pixels, and to maximize utilization of the light from the projector bulb. That's the frame sample length, from the perspective of what creates motion blur when your eyes track across it (as explained in my previous posts with the science/references). Even with black frame insertion features, most DLP's take several milliseconds per refresh to finish the entire pixel PWM sequence (especially for bright colors), so you're going to have more motion blur than on most CRT or LightBoost displays. That said, the good DLP projectors have amazingly accurate color. They also happen to do stereoscopic 3D very well.

Awesome knowledge, man!

Yeah, 80% black also gives further insight into it!

The stuff you said about motion blur makes sense, but only in those cases! It just quickly became popular with console versions running at lower frame rates, and hence carried over to PC ports, since the majority of computers likely can't run default settings consistently at 30-60+ fps.

Man, you're really making me want a Lightboost-capable 120Hz LCD! 27" is just insanely expensive, at $600! Come on, man, when will it be $300? I know it can be $300, but come on, why the years of waiting?
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
So you don't watch the videos and see the evidence for yourself?

Mark is no joke. He's doing the work and showing his work.

Yes, I watched all the videos. I see no evidence; to ME and lots of others, there's no real-world difference with it. That is the point. If you "see" it, then great for you. But this is not something worth switching out a monitor you already have for, for the majority of people.

I prefer to game on my Dell U2410 over my Asus 120Hz monitor. I regret getting it because of the hype that 120Hz was better for FPS; that just was not the case for me.
 

jackstar7

Lifer
Jun 26, 2009
11,679
1,944
126
Yes, I watched all the videos. I see no evidence; to ME and lots of others, there's no real-world difference with it. That is the point. If you "see" it, then great for you. But this is not something worth switching out a monitor you already have for, for the majority of people.

I prefer to game on my Dell U2410 over my Asus 120Hz monitor. I regret getting it because of the hype that 120Hz was better for FPS; that just was not the case for me.

Wait... so you do all the steps and then use PixPerAn and you don't see the demonstrable difference?
 

Mark Rejhon

Senior member
Dec 13, 2012
273
1
71
It is worth quoting a good post I've made on HardForum:
Rattle said:
thanks guys, I don't notice much difference between 120hz with LB Vs 144hz No LB TBH, both feel great though.
Vega said:
The difference is huge! I can instantly tell when LB is not on.
It's quite normal to have these conflicting reviews of LightBoost.
Common causes are the following:

Human Factors
-- Your ability to track fast-moving objects; and your sensitivity to motion blur.
-- Whether or not you are used to CRT gaming. (LightBoost brings the CRT effect to LCD)
-- Some people growing up today have never played on a CRT before. Such individuals may be less likely to notice quickly.
-- Some people have a habit of eye-tracking only slower-moving objects.
-- Specific play styles. Strafing sideways & turning motions benefit more than walking forward.
-- Your sensitivity to input lag, flicker, etc. (You benefit more if you don't feel any effects from input lag or flicker)

Computer Factors
-- Ability to run fps=Hz. You really need 120fps@120Hz to get maximum LightBoost benefit.
-- Judder/stutter control. Some games/configurations judder so much that it negates LightBoost.
-- Framerate limits. Some games cap at 60fps; this needs to be uncapped (e.g. fps_max)
-- Faster motion benefits more. Not as noticeable during slow motion.
-- Specific games. e.g. Team Fortress 2 benefits far more than World of Warcraft.
-- Some games judder more with VSYNC ON, while others judder more with VSYNC OFF. Test the opposite setting.
-- High quality mouse (preferably a 1000 Hz gaming mouse). Ordinary mice add too much judder.

Example of areas that benefit from eliminating motion blur:
-- Fast 180-degree flick turns in FPS shooting.
-- Shooting while turning, without stopping turning (easier on CRT or LightBoost)
-- Close-up strafing, especially circle strafing
-- Running while looking at the ground (e.g. hunting for tiny objects quickly).
-- Identifying multiple far-away enemies or small targets, while turning fast
-- Playing fast characters such as "Scout" in Team Fortress 2
-- High-speed low passes, such as low helicopter flybys in Battlefield 3; you aim better.

People with these gameplay styles in fast-action video games can gain a massive competitive advantage during fast-motion activities, because they react faster. Without motion blur, enemies are easier to identify while you're still in fast motion, even out of the corner of your eyes, even before you stop moving. Without motion blur, fast panning motion looks as perfectly sharp as being stationary -- LightBoost measured 92% sharper motion than a 60 Hz LCD -- which yields a high-definition-in-motion experience when you play on an impulse-driven display like CRT or LightBoost. As a result, there are several gamers (with certain gameplay styles) who gain a lot more frags when gaming with LightBoost.

Human reaction times are measured in hundreds of milliseconds, so reducing human lag is useful. Even if you react a scant 20 milliseconds faster, that can still out-compensate an enemy who has less input lag than you. While some people say the input lag is too high for them, for other people the input lag is not even felt or noticeable (it's important to note that there are many factors of input lag other than the display, too). For some people, the lack of motion blur (reduced human brain lag) far outweighs the minor (unnoticeable) input lag disadvantage of LightBoost, and their game scores go up dramatically with LightBoost.
However, understandably, not everyone benefits, for the various factors listed above.
To do another test, download PixPerAn and test with LightBoost enabled versus disabled. You're virtually guaranteed to see a difference in PixPerAn, far more easily than in most video games.

Sometimes certain video games don't show the LightBoost effect very well. For example, Crysis will usually NOT show the LightBoost effect, while Team Fortress 2 will. This is because you need a minimum framerate of more than about 80 frames per second to begin seeing the LightBoost effect (complete lack of visible motion blur). Also, temporarily turn on VSYNC during testing, since the differences amplify when frame delivery times are in sync with the backlight strobes. Then, if you prefer VSYNC OFF again, experiment with fps_max at several different values such as 125, 200, and 999 -- sometimes you get much better fluidity. LightBoost does sometimes amplify the visibility of stutters and tearing (as CRT does, because clearer motion amplifies other motion artifacts that interfere with the zero motion blur effect), so you want framerates matching the refresh rate to eliminate motion blur. Some benefits start to show at frame rates slightly more than 50% of Hz (e.g. 80fps), increasing until fps matches Hz.

LightBoost won't benefit:
-- If you can't tell apart a CRT and a newer/modern 60 Hz LCD in motion clarity, you probably won't benefit from LightBoost.
-- If you can't tell apart 60 fps and 120 fps, then you probably won't benefit from LightBoost either.
-- If you can't run more than 60fps, then you probably won't benefit either.
-- If you have a variable framerate (e.g. wild fluctuations 60fps<->120fps), LightBoost won't help as much.

Again, LightBoost is mostly worthless at 60fps @ 120Hz.
You need consistent solid 120fps @ 120Hz for maximum LightBoost benefit.
 

tweakboy

Diamond Member
Jan 3, 2010
9,517
2
81
www.hammiestudios.com
My games are like on my old FW900, but better. Monitor is 1ms; also, I overclocked it to 75Hz and it's so nice - it doesn't bother my eyes, and FPS games have an instant mouse, no laggy mouse. With LED you get 5 to 16ms, probably about 8ms on your monitor; that is mouse lag, not game lag. Kinda wiggly-jiggly. For me the mouse is instant like CRT, no ghosting at all, smooth mouse and games. gl
Also, I know I'm at 75Hz because my monitor tells me so - it shows in its settings. gl
 

Mark Rejhon

Senior member
Dec 13, 2012
273
1
71
Breakthrough on HardForum/OCN!
Some people are getting "nVidia LightBoost" on Radeon's!

This is done by temporarily borrowing an nVidia laptop (or tower) to enable LightBoost on the monitor (following the LightBoost HOWTO), then using ToastyX's Custom Resolution Utility to make the Radeon match LightBoost-compatible timings (Vertical Total of 1149), then hot-plugging the DVI cable back into the Radeon desktop. (This method also works with a cheap $50 GeForce card installed in the same system as your $600 Radeon.) About five or six successful confirmations so far. New ASUS VG248QE's even remember the LightBoost setting through sleep and power-off cycles. Just don't cut power to your monitor (by yanking the power cord), and it will stay enabled.
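For reference, here's the arithmetic the Custom Resolution Utility is doing (the horizontal total below is an assumed round number, not the exact figure from the HOWTO):

```python
# Pixel clock = horizontal total x vertical total x refresh rate.
h_active, v_active = 1920, 1080
h_total = 2080        # assumed horizontal total including blanking
v_total = 1149        # the LightBoost-compatible Vertical Total
refresh_hz = 120

pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
vbi_lines = v_total - v_active
vbi_ms = vbi_lines / (v_total * refresh_hz) * 1000

print(f"{pixel_clock_mhz:.1f} MHz pixel clock")          # ~286.8 MHz
print(f"{vbi_lines} blanking lines = ~{vbi_ms:.2f} ms")  # extra dark time per refresh
```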

I wonder what nVidia has to say about this -- but it is very noteworthy that some people have done the literal equivalent of installing a Porsche engine in a Corvette chassis (or vice-versa).
 

Childs

Lifer
Jul 9, 2000
11,450
7
81
Nvidia shouldn't care about this. You still need an Nvidia product to enable it. So AMD customers buying an Nvidia card is still better than nothing.

Anyways, my VG248QE finally arrives on Tues. Can't wait to give this a shot.
 

hans030390

Diamond Member
Feb 3, 2005
7,326
2
76
I'm having a hard time figuring out if this will work with the Asus VG236HE or not. It looks like it won't as far as I can tell.
 

Annisman*

Golden Member
Aug 20, 2010
1,918
89
91
Nvidia should care about this. You still need an Nvidia problem to enable it. So AMD's customers will buy an Nvidia card as well.

Anyways, my VG248QE finally arrives on Tues. Cant wait to give this a shot.


You're gonna love it!
 

Childs

Lifer
Jul 9, 2000
11,450
7
81
You're gonna love it!

Man, my spelling and grammar suck. lol Anyways, I hope so. I bought the glasses as well. Not that I plan on gaming with them, but I want to see if the 3D is better than what I get on my TV. My old eyes have grown really sensitive to motion blur, and I'm not ready to stop gaming.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
1
71
Man, my spelling and grammar suck. lol Anyways, I hope so. I bought the glasses as well. Not that I plan on gaming with them, but I want to see if the 3D is better than what I get on my TV. My old eyes have grown really sensitive to motion blur, and I'm not ready to stop gaming.
There are two interpretations I make of this:

- You are becoming more sensitive to motion blur, which means you can now much more clearly tell apart CRT vs LCD, or 60Hz vs 120Hz LCD? You probably grew up with CRT's, and as a result, modern flat panels showed obvious shortcomings in motion blur when you upgraded to them a few years ago. LightBoost is very likely to be stunning if you manage frame rates matching refresh rates (120fps@120Hz).

- Your vision is now becoming more motion-blurred due to age, and you have a harder time telling differences in motion blur? In that case even CRT's look worse for motion blur than they used to, and LightBoost might not help, because your vision system has become the limiting factor.

I think you mean the former. I hope :)
 

Childs

Lifer
Jul 9, 2000
11,450
7
81
There are two interpretations I make of this:

- You are becoming more sensitive to motion blur; which means you can now much more clearly tell apart CRT-vs-LCD, or 60-vs-120Hz LCD? (You may also see flicker better than you used to, because you grew up with CRT's and modern flat panels have shown glaring shortcomings). LightBoost is very likely to be stunning if you manage frame-rates matching refresh-rates (120fps@120Hz)

- Your vision is now becoming more motion blurred due to age, and you have a harder time telling differences in motion blur? That even CRT's look worse for motion blur than they used to. In which case, LightBoost might not help.

I think you mean the former. I hope :)

I have always noticed motion blur on LCDs, but now it's physically bothering me - like it's tiring out my eyes. I can also see scan lines on my plasma (ST50). I don't have a CRT anymore, but I never remember having a problem with them.

It's probably a combination of the two. Anyways, I generally don't need an excuse to buy a new toy, but even better since there is a chance this might help my eyes a bit.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
1
71
I have always noticed motion blur on LCDs, but now it's physically bothering me - like it's tiring out my eyes. I can also see scan lines on my plasma (ST50). I don't have a CRT anymore, but I never remember having a problem with them.
Yes, there are a few reports of LightBoost actually reducing videogame eyestrain due to reduced motion blur. The human eye gets tired trying to focus on motion blur during fast motion.

That said, LightBoost can also increase eyestrain due to the strobe effect (and/or in people who have never exercised their eyes with a sports-style habit of fast eye tracking, whose eye movements therefore tire faster); but if your eyes were always comfortable with CRT even at 60 Hz flicker, then you should have no problem with 120 Hz flicker (especially after re-acclimating to it, if you've been away from CRT's for a long time).

For many people (myself included), LightBoost made no difference to eyestrain.

Do report back when you try this out, and how it feels on your eyes. Thanks!
 

Childs

Lifer
Jul 9, 2000
11,450
7
81
Yes, there are a few reports of LightBoost actually reducing videogame eyestrain due to reduced motion blur. The human eye gets tired trying to focus on motion blur during fast motion.

That said, LightBoost can also increase eyestrain due to the strobe effect (and/or in people who have never exercised their eyes with a sports-style habit of fast eye tracking, whose eye movements therefore tire faster); but if your eyes were always comfortable with CRT even at 60 Hz flicker, then you should have no problem with 120 Hz flicker (especially after re-acclimating to it, if you've been away from CRT's for a long time).

For many people (myself included), LightBoost made no difference to eyestrain.

Do report back when you try this out, and how it feels on your eyes. Thanks!

I have an LG W2363D 120Hz monitor now, and in general it's better for gaming. The LG's color is very flat, almost faded, so I was looking for a new monitor anyways. If the strobing gives me problems, I can just turn it off and use the 144Hz mode. To be honest, I probably would have bought it just for 144Hz and sRGB, but if Lightboost helps at all then it's all gravy. Anyways, I'll report back when everything is set up.
 

Larnz

Senior member
Dec 15, 2010
248
1
76
Does enabling the 3D stereoscopic and LightBoost-24/7 etc. settings required for this cause any issues on a second monitor that isn't 120Hz, in Windows extended desktop mode?

I currently have a Dell U2711 (1440p) on a GTX 680, but am considering picking up an ASUS 27" 120Hz to try all this out, with the side benefit that my fps will go up, as one 680 isn't quite enough in some games at 1440p all maxed out.

Also, if you had to pick a 27" monitor for this LightBoost trickery, which model do you think is the best currently?

thanks
Larnz
 

SnakeZ

Junior Member
Feb 1, 2013
16
0
0
Hey man, I'm thinking of getting this monitor for competitive gaming, and VERY competitive gaming on Xbox. Do you think this is the best monitor? In your LightBoost thread you say that you need 120fps to match 120Hz, but what if the fps drops under 120 (say to 70) while you are at 120Hz? Will you still see a difference? I would imagine yes.