30 vs 60 vs 60+ FPS, what is the truth?


Idocare

Junior Member
Feb 4, 2013
Where LightBoost made sense for me was that it helped with the dimming from the shutters and helped reduce crosstalk: one can reduce the 3D contrast and still get a bright and vibrant experience while curbing crosstalk.

Well, I'm not sure about 3D, because the 3D era began (I've never seen the thing) one month after I bought a new TV set (someone said Murphy?), obviously the best money could buy at the time, so I have no direct experience with it yet (I can live with or without it, maybe better without). But LightBoost is kind of, sort of, a dynamic contrast, so it makes sense; DC was present even in my 40PFL, which is only (let's call it so) a poor 60Hz set (200Hz internal processing, but only 60Hz at the input) rated at 1ms, one of the main reasons I bought it. I've taken a 1000fps video of a movie with DC on minimum and it shows no ghosting or blurriness of any kind; not bad for four-year-old stuff. Since then there have surely been improvements, and LightBoost seems to be a fine-tuning of it.
-----------------------------------------------------------------------------------


Besides that, I must say that for gaming, unless certain conditions are met, it's probably better to have v-sync enabled, at least for my taste. I have a 690 that can push nearly everything at 1080p far beyond ultra settings, and at 60Hz, the limit of my monitor, it seems the card can choose more accurately which frame to send to the framebuffer according to my inputs, while leaving it uncapped the results are far more random-driven: the more frames the card can output, the more random the frame that gets displayed, given that the monitor can only output 60 of them per second. For sure this can have its advantages too, since the card can feed the screen with the most up-to-date frame, which can be a trifle nearer to what's happening in the engine than with v-sync enabled. With v-sync, once a frame is in the framebuffer it goes to the screen at the next "refresh complete" signal, while with v-sync disabled the screen is fed the most recent "ready" frame. But this also means that if there is no sync and the display needs to... display something while there isn't a complete frame ready, it must do what it can, and we can spot tearing effects and the like...

So, to sum it all up: in my opinion, for solo play in an open-world RPG it's worth keeping it enabled, but for a LAN party with some good stuff/bucks as a prize... to hell with v-sync, along with image quality...

About the MAIN topic, "my" truth is "there is more than just one truth". It's not all about numbers, but about how well the screen matches the other components of the rig, in what kind of environment it will be used, and for what. Generally speaking, "the faster the better": while for a movie the Hollywood-standard 24fps or 30fps can be enough (Blu-ray movies are usually 24p or 30i), for gaming just let them R.I.P. below the satisfactory 60fps target, and for extremely dynamic gaming (FPS, racing, etc.) we are back to the general rule "the faster the better". Keep in mind, though, that a fast AND fine-tuned system is the thing to aim for; a fast but "plug and pray" one can be even worse than a slower but finely tuned one.

Sorry for the long post; it's half past one AM here and I've had a really bad day.
 

Mark Rejhon

Senior member
Dec 13, 2012
That said, LightBoost is not the reinvention of warm water but a patented name for a variant of the well-known "scanning backlight" technology, this time GPU driven....
There are some important differences between scanning backlights and full-strobe backlights:

- Scanning backlights have the backlight diffusion problem between on-segments of backlight and off-segments of backlight. This interferes with motion blur reduction.
- Most scanning backlights only give a 2x-3x improvement in motion clarity (Even the 'Samsung Clear Motion Ratio 960' models)
- Past attempts (e.g. BENQ 2006 attempt with AMA-Z) gave only about a 30% improvement (less than 1.5x clearer motion).
- It's actually not GPU generated. It's just an nVidia branded technology. Technically, if you had a 120Hz source, you can first turn on nVidia's strobe backlight via nVidia drivers, then you can unplug your DVI cable and plug another 120Hz source into the monitor. It even continues working when you hot-plug between an nVidia computer to an AMD computer (as long as the signal timings are identical, someone on HardForum reported this).

Today's LightBoost strobe backlights provide about a full order of magnitude improvement in motion clarity. (7x to 11x clearer depending on the LightBoost OSD setting). It is night and day motion clarity compared to past attempts at scanning backlights. Hopefully this technological improvement continues, and arrives on IPS.
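A rough back-of-the-envelope sketch of where an "order of magnitude" comes from (illustrative numbers only, ignoring pixel-transition blur): while the eye tracks a moving object, perceived blur is roughly the tracking speed multiplied by how long each frame stays lit.

Code:
def blur_px(speed_px_per_sec, visible_time_ms):
    # Tracking blur ~ eye-tracking speed x time the frame stays visible.
    return speed_px_per_sec * visible_time_ms / 1000.0

speed = 960.0                        # px/s panning speed, illustrative
hold_60  = blur_px(speed, 16.7)      # 60 Hz sample-and-hold: lit for the whole refresh
hold_120 = blur_px(speed, 8.3)       # 120 Hz sample-and-hold: half the blur of 60 Hz
strobed  = blur_px(speed, 2.0)       # ~1.4-2.4 ms strobe, depending on the LightBoost OSD setting

print(hold_60, hold_120, strobed)    # ~16 px, ~8 px, ~2 px of blur
print(hold_60 / strobed)             # ~8x; 1.4-2.4 ms strobes give roughly 7x-11x vs 60 Hz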
 

Mark Rejhon

Senior member
Dec 13, 2012
Turning off motion blur made my eyes bleed, even at 60Hz.
You need to retest on an impulse-driven display (e.g. CRT or LightBoost) because you still have motion blur weak links (traditional LCD's).

I agree with you, when I turn off LightBoost, the "turn off motion blur" doesn't look as good. Especially on a LCD monitor with non-refresh-synchronized PWM.

Sometimes some people like motion blur; other times it interferes (e.g. competitive gaming). I like motion blur when done artistically and for movies, and when used judiciously in games. However, for fast-twitch action games, many people (especially those who are used to CRTs) gain a huge competitive advantage when you eliminate all the motion-blur weak links and get the CRT "perfect motion" effect. Reaction time improves because you see everything clearly even when things are moving, and it's easier to track objects with your eyes. Games like TF2, BF3, Quake Live, Counter-Strike, etc., where you need fast-twitch action. Then again, at other times you might prefer to have the motion blur. Which is true, too.

Regardless, people also need to test that webpage on an impulse-driven monitor (e.g. CRT or LightBoost) to understand the zero-motion-blur effect, because there is still motion blur on that webpage even when you turn off motion blur, due to LCD sample-and-hold.
 

Mark Rejhon

Senior member
Dec 13, 2012
while with v-sync disabled the screen is fed the most recent "ready" frame. But this also means that if there is no sync and the display needs to... display something while there isn't a complete frame ready, it must do what it can, and we can spot tearing effects and the like...
Regarding VSYNC OFF, to clarify: frames are complete on the graphics card, just not complete on the monitor.

A typical LCD monitor is always refreshing pixels in a scanning fashion, from top to bottom (see the high speed video), top to bottom, rinse and repeat. It's the same sort of sequential scan sequence as a CRT, except an LCD monitor is (traditionally) continuously shining rather than flickering.

When you turn off VSYNC, the PC is transmitting the corresponding "row of pixels" (over the video cable) of the most recently completed frame (stored on the GPU) and the display (at its current scan position) is displaying it. When a new frame is finished on the GPU, the output immediately switches (i.e. "cuts") to the new frame as the top-to-bottom scan continues to progress. This creates the tear-line effect, a splicing-style effect where the top part is the old frame and the bottom part is the new frame. (Note: there can be multiple splices in the same refresh, especially when running at 3x-4x the framerate versus the refresh rate.) The advantage is that this reduces latency to the absolute minimum possible, given the limitation of pushing frames from the GPU to the monitor one row of pixels at a time (as seen in the high speed video).

This lowers latency, giving you a competitive advantage, but can degrade motion quality in certain games, unless your framerate is so high (e.g. 240fps+) that some people no longer see the tearing.

So the frames are always complete on the GPU; it's simply the limitation of the top-to-bottom scan sequence that must go over the video cable (e.g. DVI), with the monitor displaying one pixel row at a time from the most recently available complete frame on the GPU. Turning VSYNC on makes sure one refresh always shows the same frame, eliminating any tearing, but forcing any subsequent frame to wait until the next refresh.
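For anyone who wants to see the mechanism, here is a tiny toy model (a sketch only, not real driver or scaler code) of that scanout: the monitor reads rows at a fixed pace from whatever frame is newest, and every frame that completes mid-scan leaves a tear line at the row where the switch happened.

Code:
rows, refresh_hz, fps = 1080, 60.0, 150.0      # illustrative numbers
row_time   = 1.0 / (refresh_hz * rows)         # time to scan out one row of pixels
frame_time = 1.0 / fps                         # interval between finished GPU frames

tear_rows, shown_frame = [], 0
for row in range(rows):
    t = row * row_time                         # moment this row goes over the cable
    newest = int(t / frame_time)               # newest frame the GPU has finished by t
    if newest != shown_frame:                  # frame switched mid-scan...
        tear_rows.append(row)                  # ...so a tear line appears at this row
        shown_frame = newest

print(tear_rows)   # roughly fps/refresh_hz switches per refresh (two tear lines here)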
 

Idocare

Junior Member
Feb 4, 2013
There are some important differences between scanning backlights and full-strobe backlights:

- Scanning backlights have the backlight diffusion problem between on-segments of backlight and off-segments of backlight. This interferes with motion blur reduction.

- It's actually not GPU generated. It's just an nVidia branded technology. Technically, if you had a 120Hz source, you can first turn on nVidia's strobe backlight via nVidia drivers, then you can unplug your DVI cable and plug another 120Hz source into the monitor. It even continues working when you hot-plug between an nVidia computer to an AMD computer (as long as the signal timings are identical, someone on HardForum reported this).

Today's LightBoost strobe backlights provide about a full order of magnitude improvement in motion clarity. (7x to 11x clearer depending on the LightBoost OSD setting). It is night and day motion clarity compared to past attempts at scanning backlights. Hopefully this technological improvement continues, and arrives on IPS.

Well, English not being my first language, maybe I haven't explained myself well enough.

It's quite obvious that it's not GPU "generated"; I never said that. I said GPU "driven", though it could have been better put as GPU "activated". It's equally obvious that a screen that isn't engineered to deliver the backlighting in that fashion simply cannot do it and, provided it is, it always can, as long as the feature is properly "driven", "activated", call it whatever we want...

As for the "scanning backlight" point, lightboost is a variant of the multiple backlighting techniques that are applicable, and PWM dimming is not that far since is a "strobing" system as well, thus it is nothing more than one of the many and not something "completely" new, that said, "strobing backlight" could be a more precise definition if you like.

So much for the correctness of my statement.



About "motion clarity"
Given that, its main feature is obviously that it is in sync with the frame being displayed, strobing along with the frame switching. That gives a more CRT-like kind of view and adds to the benefits of the steadier LCD image quality; this is surely the way to go, provided it can be driven alongside dynamic lighting and dynamic contrast. Otherwise it only helps to reduce or eliminate (on the fastest panels) the blurriness at the expense of contrast ratio and global image quality. In fact, if you watch the video you linked closely, it seems that in the white-numbers-on-black sample the "pulse" of backlight lights up the whole screen; in my dictionary this means there is no selective use of the backlighting system to provide better contrast or image quality. This puts the LightBoost thing in the big-mouth field of its promoters: used alone, it is seemingly an effective feature for reducing blurriness, but cheaper than using fast panels combined with a dynamic backlight that can give both. Nonetheless it could be a great addition to an already kickass screen; but if it should be used alone, on slow panels, to compensate for their slowness while killing "effective" contrast ratio, then I'd rather stick with a native fast panel and dynamic contrast.

In other words, its purpose is to display the frame on the screen only when it is complete, blacking it out while it is being built during the scanning process, which leads to a more CRT-like feeling. That is by itself a great concept, but it seems to be poorly applied, used only to make not-so-great monitors look more like really high-end TV sets. The day it is paired with really fast panels and really effective algorithms and processors that preserve contrast, it will be one of the best features ever introduced in screen driving.

The final step would probably be single per-pixel strobe lighting, which would be closer to plasma technology for the best contrast ratio, plus parallel, simultaneous activation/deactivation of all the lines in a frame instead of scanning techniques, for the least ghosting/blurriness. However, this presupposes decent logic and fast panels and, by the way, it's only a theory that came to mind while writing this post, not something guaranteed to work well in practice.



Regarding VSYNC OFF, to clarify: frames are complete on the graphics card, just not complete on the monitor.

A typical LCD monitor is always refreshing pixels in a scanning fashion, from top to bottom (see the high speed video), top to bottom, rinse and repeat. It's the same sort of sequential scan sequence as a CRT, except an LCD monitor is (traditionally) continuously shining rather than flickering.

When you turn off VSYNC, the PC is transmitting the corresponding "row of pixels" (over the video cable) of the most recently completed frame (stored on the GPU) and the display (at its current scan position) is displaying it. When a new frame is finished on the GPU, the output immediately switches (i.e. "cuts") to the new frame as the top-to-bottom scan continues to progress. This creates the tear-line effect, a splicing-style effect where the top part is the old frame and the bottom part is the new frame. (Note: there can be multiple splices in the same refresh, especially when running at 3x-4x the framerate versus the refresh rate.) The advantage is that this reduces latency to the absolute minimum possible, given the limitation of pushing frames from the GPU to the monitor one row of pixels at a time (as seen in the high speed video).

This lowers latency, giving you a competitive advantage, but can degrade motion quality in certain games, unless your framerate is so high (e.g. 240fps+) that some people no longer see the tearing.

So the frames are always complete on the GPU; it's simply the limitation of the top-to-bottom scan sequence that must go over the video cable (e.g. DVI), with the monitor displaying one pixel row at a time from the most recently available complete frame on the GPU. Turning VSYNC on makes sure one refresh always shows the same frame, eliminating any tearing, but forcing any subsequent frame to wait until the next refresh.


Again, I never said that the frame needs to be "complete" on the monitor, but that a monitor displays "what it receives" as it receives it, and that it must do what it can with the signal it gets. Better explained, and exactly as both of us said in slightly different ways: with v-sync ON, the framebuffer is loaded by the GPU with complete frames, and at the "screen refresh complete" signal the framebuffer feeds the next complete frame in sync with the refresh rate, without having its memory changed by the GPU at a faster rate. Without v-sync, the card still tries to fill the framebuffer with complete frames, but the buffer may already have been partially read out to send the signal to the screen at a given rate, so the screen can display rows that are not part of the same frame, a.k.a. tearing. However, in the few words of my contested earlier example I wasn't considering only rendering FASTER than the refresh rate but also SLOWER, and in that scenario it's easy to see that a frame cannot be ready on the GPU if the refresh rate of the screen is FASTER than the rendering capability of the GPU itself; it's also easy to see that there are multiple options for unpleasant effects, from tearing to stuttering, but there are too many things to keep in account to be very precise here. Let's just look at this example: a card that averages 90fps is not guaranteed to always keep 90+ fps, or even 60+, and what do you think the screen displays if there isn't a new frame ready? It cannot display something that doesn't exist, so it reads the framebuffer. Without v-sync, the screen starts repeating the older frame in the framebuffer, which can be overwritten row by row "on the fly" by the new one from the card, leading to tearing (even below the refresh rate). With v-sync, at a refresh rate faster than the GPU's rendering capability, the old frame can be re-sent to the screen until the new one is ready, or the screen can be kept blacked out; either way, at 60Hz (60 times a second): frame ready -> frame displayed every 1/60th of a second; frame not ready -> wait one more cycle until the next frame is ready, 1/60th + 1/60th, a total of 2/60ths of a second, which leads to an effective 30fps rate. This way, v-sync enabled with a not-performant-enough card eliminates tearing but certainly increases at least the input lag.

This is to clarify that it's not mandatory for the card to have a frame ready when the screen needs to display one, and this is one of the reasons framebuffers exist.
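A tiny numeric sketch of that "one more cycle" arithmetic (illustrative only, assuming plain double buffering with VSYNC ON; triple buffering behaves differently): render times that miss the 16.7ms budget get rounded up to the next refresh, so the displayed rate snaps to 60, 30, 20... fps.

Code:
import math

refresh = 1.0 / 60.0                  # 16.7 ms per 60 Hz refresh

def effective_fps(render_time_s):
    # With VSYNC ON and double buffering, a finished frame waits for the next refresh,
    # so the displayed frame interval is the render time rounded UP to whole refreshes.
    displayed = math.ceil(render_time_s / refresh - 1e-9) * refresh
    return 1.0 / displayed

print(effective_fps(1 / 90.0))   # GPU averaging 90 fps -> still 60 fps displayed
print(effective_fps(1 / 55.0))   # just misses the 16.7 ms budget -> 30 fps displayed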

I'm pretty sure I forgot to mention something, but for now I think it's enough...
 

Mark Rejhon

Senior member
Dec 13, 2012
As for the "scanning backlight" point, lightboost is a variant of the multiple backlighting techniques that are applicable, and PWM dimming is not that far since is a "strobing" system as well, thus it is nothing more than one of the many and not something "completely" new, that said, "strobing backlight" could be a more precise definition if you like.
Yes, PWM is essentially a non-motion-optimized strobe backlight, not optimized to maximize black time between refreshes. There is a tiny amount (about 10%-15%) of incidental motion blur reduction -- not noticeable to most eyes, unlike the 50% from 60Hz->120Hz and the approx 85%-92% for 120Hz (LightBoost).

The problem is a motion-optimized strobe backlight is not a "scanning" type of backlight, because a scanning backlight flashes different parts of the screen in sequence. Different sites have used "backlight flashing", "black frame insertion", and other terminology.

It's only recently become possible to squeeze pixel persistence almost completely into the time period between refreshes. The Blur Busters Blog has gone with "strobe backlight" as the mainstream terminology -- because it's more self-explanatory to laymen (rather than to engineer types), and it resembles a high-speed photographic flash that eliminates motion blur in photographed objects. It actually has a large number of surprising similarities: the shorter the flash, the less motion blur -- applicable both to strobe backlights (for perceived motion blur) and to flash photography (for recorded motion blur).

Given that, its main feature is obviously that it is in sync with the frame being displayed, strobing along with the frame switching. That gives a more CRT-like kind of view and adds to the benefits of the steadier LCD image quality; this is surely the way to go, provided it can be driven alongside dynamic lighting and dynamic contrast
Several existing HDTV technologies (Samsung Clear Motion Ratio 960, Sony Motionflow XR 960) have the ability to combine local dimming and scanning backlight operation simultaneously. Local dimming is the act of dimming/turning off backlight LED's behind dark parts of images, to enhance contrast ratio. This is different from dynamic contrast.

at the expense of contrast ratio and global image quality
On a laws-of-physics basis, strobe backlights have no effect on a panel's contrast ratio (assuming the LCD contrast is stable over time for the entire refresh). However, a necessity arises for a different reason: crosstalk. One of the degradations in image quality appears to be caused by the various adjustments made in LightBoost mode to raise the LCD blacks a bit above the absolute blackest LCD blacks, to prevent crosstalk between refreshes. This improves 3D shutter-glasses operation, but reduces contrast ratio (independently of dynamic contrast ratio via backlight dimming).

used alone, it is seemingly an effective feature for reducing blurriness, but cheaper than using fast panels combined with a dynamic backlight that can give both
Yes, it is technically possible to increase dynamic contrast with strobe backlights/edgelights (albeit not ANSI contrast). Personally, I prefer local-dimming-driven contrast techniques, which produce better real-world same-scene contrast. It behaves as dynamic contrast "with extra perks". But that's not currently being done in any consumer monitors. It's not easy to do local dimming with an edge light, though it is technically possible (dividing into only 2 rows of zones vertically; coarse-grained local dimming).

Nonetheless it could be a great addition to an already kickass screen; but if it should be used alone, on slow panels, to compensate for their slowness while killing "effective" contrast ratio, then I'd rather stick with a native fast panel and dynamic contrast.
Fair enough: if you love dynamic contrast (especially fast-responding LED dynamic contrast that reacts seamlessly to scene changes), then LightBoost might not be for you. However, I personally turn dynamic contrast off, as many implementations have flawed behavior. It's a matter of personal preference...

but it seems to be poorly applied, used only to make not-so-great monitors look more like really high-end TV sets. The day it is paired with really fast panels and really effective algorithms and processors that preserve contrast, it will be one of the best features ever introduced in screen driving.
LightBoost was originally invented for 3D rather than with motion blur elimination as the primary goal, but it has this great secondary effect (more important to some of us, apparently).

The final step would probably be single per-pixel strobe lighting
That's OLED :)
It will be one of the ultimate display technologies, once it gains enough brightness for good impulse-driving at short strobes. (Mind you, PS Vita's OLED is sample-and-hold so it has more perceived motion blur than impulse-driven OLED's. It is noteworthy that OLED's need to be about 10x brighter than the PS VITA's, for CRT-quality impulse driving). Until then, there's plenty of technological improvement left in LCD's long before cheap large-size OLED's become widespread.

(...about VSYNC talk...) This is to clarify that it's not mandatory for the card to have a frame ready when the screen needs to display one, and this is one of the reasons framebuffers exist.
As for the VSYNC talk, I already know that -- there's always a front buffer waiting. When I said the frame "cuts in", I was talking about the front buffer (not the back buffers). I did not talk about the back buffers (which I already know about). What is true is that there's always at least one finished frame ready in a buffer (I forgot to call it a "front buffer", my apologies). So what I said was correct -- We're talking about the same thing essentially, just trying to explain it from different perspectives, of different stages of the "pipeline" between the GPU to the display.

....so shall we go back to talking about 30fps vs 60fps vs 60+fps. :)
 

Idocare

Junior Member
Feb 4, 2013
Several existing HDTV technologies (Samsung Clear Motion Ratio 960, Sony Motionflow XR 960) have the ability to combine local dimming and scanning backlight operation simultaneously. Local dimming is the act of dimming/turning off backlight LED's behind dark parts of images, to enhance contrast ratio. This is different from dynamic contrast.

I was talking about pulsing a strobe of light through the complete frame, not at an even level like LightBoost does, but in a sectorized fashion, as dynamic-contrast and dynamic-lighting techniques allow.
BTW, I didn't get precisely what you meant in the last two sentences, but turning off/dimming the backlight source per zone is an essential part of dynamic contrast and is exactly what a full-screen strobing lighting system like LightBoost doesn't do. I've also pointed out in many ways that it could be great to see BOTH things implemented in, to be clear, a per-sector, locally micro-dimmed, pulsed-strobe fashion...

On a law-of-physics basis, strobe backlights have no effect on a panel's contrast ratio (assuming LCD contrast was stable on a time-basis for the entire refresh). However, a necessity arises for a different reason: crosstalk. One of the degradation in image quality appears to be caused by the various adjustments made differently in LightBoost mode to raise the LCD blacks a bit from the absolute blackest LCD blacks, to prevent crosstalk between refreshes. This improves 3D shutter glasses operation, but reduces contrast ratio (independently of dynamic contrast ratio, via backlight dimming).

Unfortunately, from a panel-manufacturing point of view it's not as you say: there is little chance that a ray of light will be stopped at the alignment leaks between the crystal shutters, and whether you strobe or not, some light will always leak through them. Physically, the more dimmed the light behind an intentionally dark panel zone, the less the leakage and the higher the contrast... and here we're back to the above point: "local strobing" could be the way to enhance both, while "full strobing", from a contrast point of view, is in the best scenario as good as any other "full panel" backlighting technique, CCFL included. Otherwise it can only be worse: if it's more effective at impressing images onto the retina in the purposely "let through" areas of the image, it's equally more effective in its leaked form coming from areas that are supposed to be dark.

About edge backlight:

I wasn't talking about that, which is mostly an engineering chimera, but about regular backlighting. For example, my set has 224 stand-alone backlighting sectors, but newer ones have up to 1000 segments with added micro-dimming and screen refresh speeds up to 1200Hz; for reference I'm talking about the Philips 9000 series TVs. Those things surely don't need strobing stuff, but nonetheless, if implemented the right way, it could give a boost even to high-tech sets like those. For me the essence of LightBoost remains keeping production costs cheap, leaving additional room for margins while delivering advertisable performance on a "pretty normal" monitor; that's why I was warning you and others about it. It seemed to me you were brandishing the sword too much in defense of this "enlarging-things-marketing-advertisement" instead of looking at the technical side of the question, but tastes are tastes... however, summarizing:

Is it good to see only complete frames quickly backlit thanks to "LightBoost"? For me, YES.
Is it good to have an evenly backlit screen even in zones that are supposed to be completely dark, thanks to the more effective leakage induced by "LightBoost"? For me, NO. :hmm:

However, I repeat that it could be amazing to see it ADDED to other techniques that are more effective at handling brightness and contrast; otherwise it's a "give and take"... I "give" big money for a feature that improves motion clarity while "taking" back light leakage through the crystal shutters that kills my blackest blacks... no thanks, you're right, I'm sure it's not for me... :biggrin:

That's OLED :) It will be one of the ultimate display technologies, once it gains enough brightness for good impulse-driving at short strobes. (Mind you, PS Vita's OLED is sample-and-hold so it has more perceived motion blur than impulse-driven OLED's. It is noteworthy that OLED's need to be about 10x brighter than the PS VITA's, for CRT-quality impulse driving). Until then, there's plenty of technological improvement left in LCD's long before cheap large-size OLED's become widespread.

I beg your pardon, but I think you're missing something here. :confused:
I'm still at the point where OLEDs are faster, brighter and deliver a wider field of view than anything else on earth; they emit their own light, so they don't need any kind of backlighting; they can deliver the best brightness along with the best contrast and the widest colour spectrum at amazing speed, with the only MINOR :ninja: drawbacks of being really expensive and not having a very long life expectancy. Has something changed?

As for the VSYNC talk, I already know that -- there's always a front buffer waiting. When I said the frame "cuts in", I was talking about the front buffer (not the back buffers). I did not talk about the back buffers (which I already know about). What is true is that there's always at least one finished frame ready in a buffer (I forgot to call it a "front buffer", my apologies). So what I said was correct -- We're talking about the same thing essentially, just trying to explain it from different perspectives, of different stages of the "pipeline" between the GPU to the display. ....so shall we go back to talking about 30fps vs 60fps vs 60+fps. :)

Yep, this is more or less what I was saying, but maybe it's better if we don't add too many stages, as the more we add, the higher the latency will be. Usually an output framebuffer consists of just one memory block that can be read and written in sync, or read and overwritten on the fly without sync. A different matter entirely is the post-processing done by some monitors' logic, which have the headroom to do it, panel allowing, or the post-processing of the GPU itself, which happens anyway before a frame can reach the output framebuffer; neither of these helps with latency...

However, I believe we were not off the main topic, because to understand in depth what's going on on a screen it's mandatory to understand what's going on behind the scenes. Or do we have to summarize it as "a crappy monitor is trash @30Hz, @60Hz and even @120Hz, and a badass system can be crippled by crappy settings @30fps, @60fps and even @120fps"? ;)
 

Mark Rejhon

Senior member
Dec 13, 2012
Interesting talk about dynamic contrast ratio and how to combine motion-optimized backlights (synchronized strobes) with contrast-optimized backlight technologies (local dimming, etc). Only a couple comments:

screen refresh speeds up to 1200Hz
It would make motion clearer, but it would be interesting to know how much of that comes from motion interpolation (not good for games due to input lag), how much from impulse-driving (strobing/flicker, like a CRT -- e.g. 1/1200sec flashes once every 1/60sec), and how much is simply tantamount to repeated refreshes (e.g. like simple temporal dithering in plasma subfields on cheaper plasmas -- essentially flickering pixels at 600Hz). It would take really good motion interpolation and/or full-backlight strobes (not scans, due to the backlight diffusion issue) to get 20x the motion clarity of a 60Hz LCD. Most of the "960Hz" simulations only yield about 2-3x the motion clarity of a 60Hz LCD due to many factors (weak links), and not as clear motion as a good plasma. That said, LCD motion enhancement technologies are gradually becoming more efficient (with LightBoost a highly efficient motion-blur-eliminating method, albeit with corresponding disadvantages too). The only way to do it in a computer/game-compatible manner is via extra refreshes (e.g. 120Hz or 144Hz, or even beyond in the future) and via impulse driving (e.g. a motion-optimized strobe backlight).

Also, even Panasonic 2500 Hz FFD plasma (e.g. Panasonic VT50) only has about ~4x the motion clarity of a 60 Hz LCD display, due to the unfortunate red/green phosphor decay of 5 milliseconds, which is currently a major limiting factor in further motion blur reduction on plasma displays (plasma motion is not fully as clear as CRT motion).

I beg your pardon, but I think you're missing something here. :confused:
I'm still at the point where OLEDs are faster, brighter and deliver a wider field of view than anything else on earth; they emit their own light, so they don't need any kind of backlighting
Sorry, I need to make myself clearer. What you said was the following:

The final step would probably be single per-pixel strobe lighting
Per-pixel strobe backlighting is impossible with LCD due to backlight diffusion, but possible with OLED (strobing the pixels directly). Instead of per-pixel strobe backlighting, it's directly strobe-driven pixels (direct pixel strobing of OLED). That is what Sony's expensive "Crystal LED" prototype already does today (impulse-driven). However, the PS Vita does not do it (sample-and-hold).

The reason is that single per-pixel strobing requires a LED backlight with individual LED's for every single pixel. But if you have a LED for every single pixel (and in all 3 primary colors R,G,B), then why do you need the LCD? Get rid of the LCD layer if you've successfully made a full array of LED's (with R,G,B) at the single-pixel level -- now it's a display itself, no longer just a simple backlight.

That's what I am trying to say. A single-pixel backlight is not worthwhile for LCDs -- skip that, and just create a 100% (Crystal) LED or OLED display. If you gain the technology to do per-pixel local dimming (millions of zones, using red/green/blue pixels), you've got the technology to just go ahead and use the LEDs as the display itself, and get rid of the LCD layer...

Even with OLED monitors or Crystal LED monitors, the electronics have to impulse-drive (strobe) the pixels to eliminate motion blur on these displays. You still need to impulse-drive the pixels for game/computer-friendly, low-input-lag motion blur elimination. (The only way to do motion blur elimination while also avoiding interpolation or strobing (a la CRT) is to increase the refresh rate to GPU-insane levels, such as native 500fps on a non-strobed 500Hz display, or native 1000fps on a non-strobed 1000Hz display -- and that is not going to happen. Strobing is easier and eliminates motion blur at lower refresh rates, which is easier for GPUs! Same reason why it has been observed that CRT 60fps@60Hz is clearer motion than traditional LCD 120fps@120Hz.)
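A sketch of the arithmetic behind that (illustrative numbers, treating motion blur as proportional to how long each frame stays visible): the same 1 ms of persistence can come either from an absurd native refresh rate or from a short strobe at a GPU-friendly one.

Code:
def visible_ms(refresh_hz, strobe_ms=None):
    # Sample-and-hold: each frame stays lit for the whole refresh.
    # Strobed: each frame is lit only for the length of the backlight flash.
    return strobe_ms if strobe_ms is not None else 1000.0 / refresh_hz

print(visible_ms(1000))                  # 1.0 ms -> but needs native 1000 fps content
print(visible_ms(120, strobe_ms=1.0))    # 1.0 ms -> same persistence at only 120 fps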
 

Idocare

Junior Member
Feb 4, 2013
20
0
0
It would make motion clearer, but it would be interesting to know how much of that comes from motion interpolation (not good for games due to input lag), how much from impulse-driving, and how much is simply tantamount to repeated refreshes (e.g. like simple temporal dithering in plasma subfields). It would take really good motion interpolation and/or full-backlight strobes (not scans, due to the backlight diffusion issue) to get 20x the motion clarity of a 60Hz LCD. Most of the "960Hz" simulations only yield about 2-3x the motion clarity of 60Hz due to many factors (weak links), and not as clear motion as a good plasma. That said, LCD motion enhancement technologies are gradually becoming more efficient (with LightBoost a highly efficient motion-blur-eliminating method, albeit with corresponding disadvantages too). The only way to do it in a computer/game-compatible manner is via extra refreshes (e.g. 120Hz or 144Hz, or even beyond in the future) and via impulse driving (e.g. a motion-optimized strobe backlight).

Obviously everything above 60Hz is headroom for interpolation, post-processing of any kind, and 3D handling, but this doesn't make the panel slower than 1/20th of 1/60th of a second, and 0.83ms, backed by its strobing backlight technique, is more than enough to deliver a blur-free image in motion, even if not in sync with the complete frame.
To be clear, every manufacturer has its own strobing technique; it's only in PC monitors that it seems something completely new, since nVidia has patented one of the many, and to be even more clear, even Sharp's "scanning backlight" is a strobing one, only not in sync with the frame, but we've already touched on this point.


Per-pixel strobe backlighting is impossible with LCD due to backlight diffusion, but possible with OLED (strobing the pixels directly). Instead of per-pixel strobe backlighting, it's directly strobe-driven pixels (direct pixel strobing of OLED). That is what Sony's expensive "Crystal LED" prototype already does today (impulse-driven). However, the PS Vita does not do it (sample-and-hold). The reason is that single per-pixel strobing requires a LED backlight with individual LED's for every single pixel. But if you have a LED for every single pixel (and preferably 3 colors), then why do you need the LCD? Get rid of the LCD layer if you've successfully made a full color backlight at the single-pixel level -- now it's a display itself, no longer just a simple backlight. That's what I am trying to say. Single pixel backlight is not worthwhile for LCD's -- skip that, and just create a 100% (Cystal) LED or OLED display. If you gain the technology to do per-pixel local dimming (millions of zones, and use red/green/blue pixels), you've got the technology to just go ahead with just using the LED's as the display instead; and get rid of the LCD layer...

Nope, it's not impossible if you consider that they can align masks to cancel the shutters' leakage; it's not so difficult, and please don't forget there are even RGB-backlit panels around.

I wasn't talking about OLEDs directly, but since they can be white as well, or, better said, they ARE white without "doping" them with ruthenium, platinum, iridium and so on (that's one of the expensive parts of their production costs, after all), aligning a shutter mask with a very "fast" LCD panel, using a "less expensive" white overdriven OLED panel as a per-pixel PWM-dimmed backlight source, could give ~0 leakage, ~∞ contrast, good brightness, low to no motion blur and a wide gamut, while keeping the typical field of view of the front panel used and costs at a reasonable level, and giving the chance to handle gigabit colour gamuts better at high speed. Conventional OLEDs actually still seem a little too pricey for mass production; nonetheless they will surely be the future as far as panel technology goes.

However, yes, here we are a little off the main topic, so let me get back on track:

Even with OLED monitors or Crystal LED monitors, the electronics have to impulse-drive (strobe) the pixels to eliminate motion blur on these displays. You still need to impulse-drive the pixels for game/computer-friendly, low-input-lag motion blur elimination. (The only way to do motion blur elimination while also avoiding interpolation or strobing (a la CRT) is to increase the refresh rate to GPU-insane levels, such as native 500fps on a non-strobed 500Hz display, or native 1000fps on a non-strobed 1000Hz display -- and that is not going to happen. Strobing is easier and eliminates motion blur at lower refresh rates, which is easier for GPUs! Same reason why it has been observed that CRT 60fps@60Hz is clearer motion than traditional LCD 120fps@120Hz.)

Mmm... it's not quite like that, and you seem way too pessimistic in this field. There are many ways to achieve good results, and even the worst interpolation may ruin image quality if improperly applied, but it certainly cannot introduce more input lag than v-sync, as already said. In recent sets it's easy to find strobing techniques that, even if not implemented in sync with the refresh, are surely better than a monitor pushing 500 or 1000 fps into your eyes. Remember that in real life the frequency of domestic lighting is 50/60Hz, but that's not why you see a motion-blur effect if you move your head quickly, nor in sunlight, where everything is illuminated with light of wavelengths from 100nm to 1mm. Probably really high frequencies won't help in reducing motion blur; instead they could more easily lead to oversampling effects, ergo blur, ghosting, etc.


To get completely back on topic, however, it's worth mentioning technologies that implement v-sync in a virtual way, for example Lucid Virtu MVP, which allows nearly lag-less performance while eliminating the obnoxious tearing. Personally I think it's a must-have, because it removes the trade-off between frame rate and image quality.
 

Mark Rejhon

Senior member
Dec 13, 2012
As for per-pixel backlight control, I'll leave it to the manufacturers to figure it out. If they can pull it off, so be it -- but I think turning LED's into a display is easier than doing what was described.

Now onto motion interpolation:

Mmm... it's not quite like that, and you seem way too pessimistic in this field. There are many ways to achieve good results, and even the worst interpolation may ruin image quality if improperly applied, but it certainly cannot introduce more input lag than v-sync
Good interpolation requires looking ahead an extra frame at the absolute minimum, which requires adding another framebuffer layer, which adds an extra frame of input lag -- even with VSYNC. Good interpolation algorithms unfortunately require the PREVIOUS frame, the CURRENT frame, and the NEXT frame, to do far more accurate interpolation.

Linear-vector motion interpolation between just the previous frame and the current frame (with no knowledge of other frames, such as the subsequent frame) does not yield the best possible motion interpolation, and causes unacceptable artifacts during video game motion.

Consider special cases of non-straight vectors, including things like curved motion and non-straight motion: a head turning, people riding a bicycle, scene changes, objects suddenly becoming obscured, multiple motion vector directions (e.g. fast zooms, explosions, super-fast first-person perspective motion, etc.), or motion going too fast (e.g. motion steps of more than 50 pixels between frames). All of this is extremely challenging for motion interpolators. You don't want the motion interpolator to insert ugly artifacts whenever you do things like a fast 360-degree flick, etc.

Yes, it's possible to do *some kind* of basic interpolator with just the knowledge of the current frame and a previous frame, but it won't be good, since that does not provide enough information for non-linear motion (e.g. accelerating motion, sudden direction changes, curved motion, etc). Even better interpolation is possible if you have more than 3 frames of knowledge centering around the current frame (previous, current, next), but I'm not sure which HDTV's use more than just the next frame (yes, some of them need that "future frame", and that is a major cause of interpolation input lag).

Poor interpolators often suddenly stop interpolating when motion gets complex -- that's why sometimes motion is smooth, but starts stuttering when it becomes complex (e.g. a first-person view of running through a forest, bumping into and pushing leaves out of the way). That's pretty common in simple interpolators: the framerate suddenly drops back to the original framerate (e.g. 24fps).

Also, object occlusion is not a 100% solvable problem for motion interpolators. For example, scenery behind a picket fence with tiny gaps: strafing in front of the fence, you reveal some slits of the scenery behind, then different slits of the scene. It may never reveal the complete scene behind the picket fence, because the gaps between slats only cover about 25% or less of the scenery. Between two frames, only 50% of the scenery behind the fence is revealed (the low framerate's granularity having "skipped over" never-seen scenery between the two frames). How do you motion-interpolate scenery that is never revealed between the frames (e.g. intermediate strafe positions in front of the picket fence, showing never-revealed-before scenery)? It's impossible. Object occlusion is not a 100% solvable science for motion interpolation. Better algorithms and logic improve this, but intrinsically it's an unsolvable problem for total coverage of all situations. Even simple situations, such as dragging a window on top of a fancy wallpaper or a background window (possibly animated), are very tricky to interpolate in a "black-box" interpolator.

Regardless.... Unfortunately, no dice. It's mathematically impossible to create a *good* interpolator (with no visible artifacts) without knowledge of at least one future frame. Therefore, additional buffering occurs, and this adds at least 1 additional frame of input lag (at the absolute minimum), even with VSYNC ON.
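To put a number on that, here is a toy timeline (a sketch with illustrative timings, not any TV's actual pipeline): whatever moment the interpolator wants to show, it must first wait until the source frame that comes after that moment has actually arrived.

Code:
import math

source_interval = 1 / 60.0                    # 60 fps source, illustrative

def earliest_display_time(t):
    # A lookahead interpolator cannot build the image for moment t until the
    # NEXT source frame (the one after t) has been rendered and received.
    return math.ceil((t + 1e-9) / source_interval) * source_interval

for t in (0.020, 0.030):
    wait_ms = (earliest_display_time(t) - t) * 1000
    print(round(wait_ms, 1), "ms of unavoidable waiting")    # 13.3 ms, then 3.3 ms
# The worst case approaches a full source frame (~16.7 ms) of added lag, before any
# processing time is even counted -- the extra buffering described above.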

One *could* go to a higher intermediate framerate (e.g. 144fps) to reduce the lag of a frame, then the input lag of interpolation could be tolerable for computer use, but still poor for competitive gaming use (compared to an impulse-driven display).

Another problem: motion interpolation works best when the framerate is consistent and predictable. Computer games don't always run at consistent framerates. Interpolators (especially those that do not look ahead at least several frames) won't always be able to predict when a stutter happens and successfully "smooth the stutter" out of existence.

Yet another problem: motion interpolators often have lots of difficulty with low-contrast and blurry objects. How does an interpolator tell apart random "noise" (e.g. macroblocks randomly moving around in video) from intentional world movement in a dark, low-contrast environment where the contrast is lower than the contrast between the artifacts in poorly compressed material? Interpolator fine-tuning leads to a lot of false/mistaken motion (showing up as motion artifacts caused by the interpolator -- I've seen them and they look ugly).

*Even* better interpolators require several frames of lookbehind *and* lookahead buffers, to better predict direction changes, stutters, motion behavior (angular/accelerating motion, etc) and improved accuracy (being sure that something is actually moving and not just accidental 'noise').

If you do motion interpolation anyway -- then computer graphics must be interpolated far more flawlessly than video, because it's far easier to see defects of motion interpolation in computer graphics. Computer graphics are often 'razor sharp', often contain many straight lines, and are full of unexpected interactive motion (sudden mouse pointer direction changes, drag direction changes, unexpected stutters, accelerated motion in games, rapid occlusion effects, etc.). Flaws in an interpolator wreak havoc with that, with far-more-easily-noticed artifacts. If you use good lookahead/lookbehind interpolators (which add more input lag), the interpolation looks much better, but various artifacts remain.

Motion interpolation definitely has its place, yes...
Maybe even with specially designed games that co-operate with a motion interpolator to maximize quality and minimize lag. But discrete "black box" motion interpolators built into the display (operating with no advance knowledge of the source material) aren't the solution for lag-free computer/gaming use.

To make even just a majority of the population happy, using a computer or playing video games with motion interpolation will likely not be the answer.

As an associated member of Society for Information Display (sid.org), and subscribing to papers written by universities and institutions (I pay $150/year to read these papers), there's definitely only two lag-free methods of shortening lengths of frames in order to reduce perceived motion blur on displays: More native Hz/frames and/or shorter impulses.
....As I've already explained before, that's extra frames in extra refreshes (in sample-and-hold displays, ala LCD), and/or shorter impulses with more black period between frames (in impulse-driven displays, ala CRT/plasma/backlight control).

Providing more Hz/frames in a high quality manner via interpolation is not a lag free method.
 

Idocare

Junior Member
Feb 4, 2013
.../snip/...

Regardless.... Unfortunately, no dice. It's mathematically impossible to create a *good* interpolator (with no visible artifacts) without knowledge of at least one future frame. Therefore, additional buffering occurs, and this adds at least 1 additional frame of input lag (at the absolute minimum), even with VSYNC ON. One *could* go to a higher intermediate framerate (e.g. 144fps) to reduce the lag of a frame, then the input lag of interpolation could be tolerable for computer use, but still poor for competitive gaming use (compared to an impulse-driven display). Another problem: Motion interpolation works best on consistent framerates. Motion interpolation works best when the framerate is consistent and predictable. Computer games don't always run at consistent framerates. Interpolators (especially those that do not lookahead at least several frames) won't always be able to predict when a stutter happens and be able to successfully "smooth the stutter" out of existence. Yet another problem: Motion interpolators often have lots of difficulties with low-contrast and blurry objects. How does an interpolator tell apart random "noise" (e.g. macroblocks randomly moving aroudn in video), versus intentional world movement in a dark, low-contrast environment where contrast is less than the contrast between the artifacts in poorly-compressed material? Interpolator fine-tuning leads to a lot of motion falses/mistaken motion (showing up as motion artifacts caused by an interpolator -- I've seen them and they look ugly). *Even* better interpolators require several frames of lookbehind *and* lookahead buffers, to better predict direction changes, stutters, motion behavior (angular/accelerating motion, etc) and improved accuracy (being sure that something is actually moving and not just accidental 'noise'). If you do motion interpolation anyway -- then computer graphics must be interpolated far more flawlessly than video, because it's far easier to see defects in motion interpolation in computer graphics. Computer graphics are often 'razor sharp', often contains many straight lines, and full of unexpected interactive motion (sudden mouse pointer direction changes, drag direction changes, unexpected stutters, accelerated motions in games, rapid occulusion effects, etc). Flaws in an interpolator wreaks havoc with that, with far-more-easily noticed artifacts. If you are doing good lookahead/lookbehind interpolators (adds more input lag), the interpolation looks so much better, but various artifacts remain. Motion interpolation definitely has its place, yes... Maybe even with specially designed games that co-operate with a motion interpolator to maximize quality and minimize lag. But discrete "black box" independent motion interpolators, built into display (operating with no advance knowledge of the source material), isn't the solution for lag-free computers/gaming use. To make even just a majority of population happy, using a computer/videogaming with motion interpolation, will likely not be the answer. As an associated member of Society for Information Display (sid.org), and subscribing to papers written by universities and institutions (I pay $150/year to read these papers), there's definitely only two lag-free methods of shortening lengths of frames in order to reduce perceived motion blur on displays: More native Hz/frames and/or shorter impulses. 
....As I've already explained before, that's extra frames in extra refreshes (in sample-and-hold displays, ala LCD), and/or shorter impulses with more black period between frames (in impulse-driven displays, ala CRT/plasma/backlight control). Providing more Hz/frames in a high quality manner via interpolation is not a lag free method.

Well, I must say your points are interesting, but consider that, speaking of "motion interpolation", there are several ways in which it could help, provided a sufficiently fast panel is used. For example, if you have a panel capable of displaying 20 refreshes within one 1/60th-second frame (a 1200Hz panel "could" do it, for example, on a pure-math basis), you could simply split the single frame into 20 sub-frames; but since this cannot be done in real time and needs some processing, let's say just 12 sub-frames, displayed from top to bottom, each one with the rest of the screen blackened. For example, sub-frame 1 displays the topmost 1/12th, the second leaves the first top segment black and displays the 2/12th on the second segment, the third shows the 3/12th on the third segment leaving the rest black, and so on. This could be the best implementation of a motion-interpolation feature; furthermore it wouldn't need the backlighting to be driven in any specific way, so regular local dimming could be kept as the source, giving a CRT-ish feel to the image while keeping the image quality and contrast levels typical of LCDs.

An interpolation done this way wouldn't impact response time, because the whole process happens in the in-between sub-frame processing, and this means a 60Hz link could display all 60fps in the very same way as on a panel only able to display 60fps "without" interpolation of any kind.

I was speaking of this kind of interpolation. Obviously any image-enhancing kind, like the ones you were referring to, is likely to induce lag: "image processing" is a time-consuming feature, while image splitting can be done "nearly" in real time. I'm not talking about improving image definition, reducing noise or anything else; just about using interpolation in its fastest and most efficient way, i.e. displaying fewer lines at a time, which leads to a more overall-black background that gives, as we both acknowledge, better results in terms of motion clarity.

As you know, one way to overcome the sample-and-hold drawbacks would be building panels able to split the image into many segments, each one able to display a portion of the frame; that would require split driving circuitry as well, otherwise a panel has to drive all the pixels in series. Well, in the absence of that peculiarity, the only way left is to split a frame into several sub-frames with the addition of "scanning" black segments; nonetheless, parallel driving of the pixels could add to this as well.

This is independent of the signal coming from the card, unlike when the monitor's framebuffer is used for image correction in post-processing, so it introduces no additional lag; it only needs to split the incoming frame into a larger framebuffer, leaving room for the additional black segments of the sub-frames. I'm sure you get what I mean.
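To make the splitting step concrete, here is a rough sketch (purely an illustration with made-up sizes, not a claim about any real monitor's logic): each incoming frame becomes N sub-frames, each showing one horizontal band with the rest forced to black.

Code:
def split_into_subframes(frame_rows, n_segments):
    # frame_rows: the pixel rows of one incoming frame, as lists of values.
    rows, width = len(frame_rows), len(frame_rows[0])
    band = rows // n_segments
    black_row = [0] * width
    subframes = []
    for i in range(n_segments):
        sub = [black_row] * rows                              # start fully black
        sub[i * band:(i + 1) * band] = frame_rows[i * band:(i + 1) * band]
        subframes.append(sub)                                 # one lit rolling band
    return subframes

# A 60 Hz frame split into 12 sub-frames needs a panel stepping at 60 x 12 = 720 Hz
# internally, but no future frame is required, so no extra buffering latency is added.
subs = split_into_subframes([[255] * 8 for _ in range(12)], 12)
print(len(subs), sum(1 for row in subs[3] if any(row)))       # 12 sub-frames, 1 lit band each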

I think this could be the way to handle interpolation correctly, improving motion clarity without adding any lag and leaving image quality in the capable hands of the "local-dimming techniques" already in use, which would only need small adjustments to fit this "scanning technique".

Obviously this would not correct tearing, as the signal would be the same; only the corresponding "unaligned" frame would go into a sub-frame instead of a complete frame. But this would also preserve all the benefits of 60+fps on a 60Hz link without using v-sync, or all the benefits of v-sync if it is in use.

If you get what I mean, don't you think it could be a good way to go?

P.S. (f-u-k the Universal Virtual MVP, it seems it doesn't work on SLI/X-fire nor with a single dual gpu card...s-h-t)
 

nextJin

Golden Member
Apr 16, 2009
Did you try the links? Are you saying you didn't see a difference between 30 FPS and 60 FPS with the cube and the soccer ball?

You'd have to be blind not to see it.

Now, you could make a case against 60 FPS and 120 FPS not showing a difference, but I'll bet if you had to control those objects at some level of precision, you'd see a difference there, as is the case in many games where input is tighter and more fluid @ 120 FPS compared to 60 FPS.

I wouldn't worry about him responding, he literally got fact checked right out the door.
 

Pia

Golden Member
Feb 28, 2008
This thread makes me want to invest in a 120Hz Lightboost display, but I have been using a Dell 2405FPW for about 6 years now. Paying a decent amount of money for something that is a downgrade in res, color and viewing angles after all that time feels silly. Plus I'd actually want to upgrade res for working, and I'm not willing to constantly stay on cutting edge GPUs.

I guess the best compromise I could hope for in the near future would be a 1440p / 60Hz IPS monitor with a low response time and a strobe backlight?
 

Mark Rejhon

Senior member
Dec 13, 2012
This thread makes me want to invest in a 120Hz Lightboost display, but I have been using a Dell 2405FPW for about 6 years now. Paying a decent amount of money for something that is a downgrade in res, color and viewing angles after all that time feels silly. Plus I'd actually want to upgrade res for working, and I'm not willing to constantly stay on cutting edge GPUs.

I guess the best compromise I could hope for in the near future would be a 1440p / 60Hz IPS monitor with a low response time and a strobe backlight?
IPS monitors will not likely have a strobe backlight for a while, because the pixel persistence needs to fit into the time period of a blanking interval (like in the high speed video). However, it's not impossible, and it could still happen eventually.

Longer pixel persistence starts to become incompatible with a full-strobe backlight, and requires a synchronized scanning backlight that flashes rows of LEDs vertically in sequence with the LCD refresh, as a rough emulation of CRT scanning. However, this puts some rather severe limits on the maximum possible amount of motion blur elimination, due to backlight diffusion between on-segments and off-segments. For this reason, full-panel strobe backlights provide much "purer" motion blur elimination that more truly breaks the pixel persistence barrier, with no upper ceiling on the amount of motion blur that can be eliminated. Whereas a scanning backlight can give you about a 2x-3x real-world reduction in motion blur, strobe backlights are capable of an order-of-magnitude (10x) real-world reduction.
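
As a back-of-the-envelope illustration (my own rough model with assumed numbers, not measurements of any specific monitor): perceived motion blur on a tracked object scales roughly with how long each frame stays lit, so the diffusion-limited duty cycle of a scanning backlight caps its benefit, while a short global strobe does not.

Code:
# Rough model, illustrative numbers only (real duty cycles and strobe lengths vary).
frame_ms = 1000 / 60              # 60Hz sample-and-hold: pixel lit the whole frame

scanning_on_ms = frame_ms / 3     # assumed: diffusion keeps each row lit ~1/3 of the frame
strobe_on_ms = 1.4                # assumed: ~1.4ms full-panel strobe flash

print(frame_ms / scanning_on_ms)  # ~3x blur reduction ceiling for a scanning backlight
print(frame_ms / strobe_on_ms)    # ~12x for a full-panel strobe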

You could have both -- two monitors, optimized for different purposes. Some people do this and have made that compromise (e.g. Vega sometimes seems to prefer playing on his ASUS VG248QE LightBoost over his Catleap 2B overclocked 2560x1440 130 Hz IPS monitor) -- the high-definition motion experience at 1080p can look sharper in motion than the blurry motion experience at 1440p. If you do lots of graphic design, programming, or writing, you will almost certainly prefer upgrading to a 1440p IPS display. However, if you played lots of CRT, have been hating LCD motion blur for years, and do lots of gaming, it is a different story: former die-hard CRT players often find LightBoost to be literally a savior from the heavens. Some people's FW900s have actually been chucked into closets thanks to LightBoost. Tough decision, depending on what type of person you are and what you use your monitor for.
 
Last edited:

Shephard

Senior member
Nov 3, 2012
765
0
0
I used to think 60 fps mattered, but it really doesn't. You won't see any smoothness difference from 30fps to 60fps. As long as you don't drop below 30 it's all good.

Now if you have a 120hz monitor that's a totally different story.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
1
71
I used to think 60 fps mattered, but it really doesn't. You won't see any smoothness difference from 30fps to 60fps. As long as you don't drop below 30 it's all good.
Not universally true.
It can even be visible on old, terrible LCD monitors.

See this webpage animation #1:
15fps versus 30fps versus 60fps


See this webpage animation #2:
Test Motion With Multiple Custom Framerates


Also....
  1. There can be differences in judder. Applicable to all displays. 30fps@60Hz (less erratic judder) looks different from 47fps@60Hz (more erratic judder). Doing 31fps@60Hz or 59fps@60Hz can produce a strange judder with a "beat frequency" effect (see the quick calculation after this list).
  2. Double image effects on CRT. For strobe-driven displays (CRT/Plasma), there's a double-image effect from 30fps@60Hz. This is commented on by many long-time players who do lots of fast-motion strafing and panning in console videogames that are framerate-locked at 30fps. It can be tested via this webpage (set the bottom ball to 30fps, top ball to 60fps, disable software-based motion blur for both). You can see for yourself that the double image effect occurs on CRT displays.
  3. Additional perceived motion blur can occur. Applicable to all displays. Especially for sample-and-hold displays (LCD), there's more motion blur. It can be tested via this webpage (set the bottom ball to 30fps, top ball to 60fps, disable software-based motion blur for both). You can see for yourself that there is much more motion blur coming from the display (in addition to stutters). This is because of the longer sample-and-hold effect at lower framerates.
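
For #1, here's a quick way to see where the "beat frequency" comes from (a rough simulation of my own, not one of the test webpages above): at 47fps on a 60Hz display, some frames are held for one refresh and some for two, and that uneven cadence recurs about 60 - 47 = 13 times per second.

Code:
from collections import Counter

def hold_pattern(fps, hz=60):
    """Simulate one second: which source frame each refresh shows, and how long frames are held."""
    shown = [refresh * fps // hz for refresh in range(hz)]   # frame index visible at each refresh
    holds = Counter(shown).values()                          # refreshes each frame stayed on screen
    return Counter(holds)

print(hold_pattern(30))   # every frame held 2 refreshes: even, regular cadence
print(hold_pattern(47))   # 34 frames held 1 refresh, 13 held 2: erratic, "beating" cadence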

So go see for yourself, and come back weeping that you can see 30fps versus 60fps after all, no matter what your display. ;)

Now if you have a 120hz monitor that's a totally different story.
At least, we agree on that :)

In fact, I can even tell apart 60fps@120Hz versus 120fps@120Hz;
It actually also reveals itself in Google Chrome via this webpage (but not in IE or FireFox, due to framerate limiting)
 
Last edited:

Shephard

Senior member
Nov 3, 2012
765
0
0
Not universally true.
It can even be visible on old, terrible LCD monitors.

See this webpage animation #1:
15fps versus 30fps versus 60fps


See this webpage animation #2:
Test Motion With Multiple Custom Framerates


Also....
  1. There can be differences in judder. Applicable to all displays. 30fps@60Hz (less erratic judder) looks different from 47fps@60Hz (more erratic judder). Doing 31fps@60Hz or 59fps@60Hz can produce a strange judder with a "beat frequency" effect.
  2. Double image effects on CRT. For strobe-driven displays (CRT/Plasma), there's a double-image effect from 30fps@60Hz. This is commented on by many long-time players who do lots of fast-motion strafing and panning in console videogames that are framerate-locked at 30fps. It can be tested via this webpage (set the bottom ball to 30fps, top ball to 60fps, disable software-based motion blur for both). You can see for yourself that the double image effect occurs on CRT displays.
  3. Additional perceived motion blur can occur. Applicable to all displays. Especially for sample-and-hold displays (LCD), there's more motion blur. It can be tested via this webpage (set the bottom ball to 30fps, top ball to 60fps, disable software-based motion blur for both). You can see for yourself that there is much more motion blur coming from the display (in addition to stutters). This is because of the longer sample-and-hold effect at lower framerates.

So go see for yourself, and come back weeping that you can see 30fps versus 60fps after all, no matter what your display. ;)

At least, we agree on that :)

In fact, I can even tell apart 60fps@120Hz versus 120fps@120Hz;
It actually also reveals itself in Google Chrome via this webpage (but not in IE or FireFox, due to framerate limiting)
Anything under 30 fps is bad and you can tell right away.

I do see it as slightly smoother at 60 fps.

In games, though, I really don't see a difference between, say, 48 fps and 65 fps.

Now put 2 monitors side by side - one @ 60Hz and one at 120Hz - playing Counter-Strike... it's day and night. 120Hz smoothness is so nice and makes gaming much more enjoyable.

I used to shoot for 60 fps and above when gaming. Now, though, if you want ultra settings you won't get 60 fps unless you have the highest-end card.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
1
71
Now put 2 monitors side by side - one @ 60Hz and one at 120Hz - playing Counter-Strike... it's day and night. 120Hz smoothness is so nice and makes gaming much more enjoyable.

I used to shoot for 60 fps and above when gaming. Now, though, if you want ultra settings you won't get 60 fps unless you have the highest-end card.
With a single GTX 680 on an i7-3770K, I am able to max out my settings even with AA enabled on all my Source Engine games (Half Life 2 series, Black Mesa, Team Fortress 2, Counter Strike, etc), and still play a solid 120fps@120Hz.

But you're right -- I can't even pull off 120fps in Crysis.

BTW, do you happen to have a Samsung 120 Hz monitor? If so, test out the Samsung Zero Motion Blur HOWTO. It's a LightBoost-like strobe backlight that also works on AMD Radeon cards, although it has more input lag than the ASUS/BENQ LightBoost monitors. Going from 60Hz->120Hz only gives a 2x motion blur reduction, but going from 120Hz->120Hz (strobed) gives a further ~4-5x reduction, ending up with roughly 7x to 11x less motion blur than 60Hz, according to motion tests.
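
Quick sanity check on those numbers, using assumed strobe lengths (purely illustrative; real flash lengths vary per monitor and brightness setting):

Code:
# Perceived blur scales roughly with how long each frame stays lit on screen.
hold_60hz = 1000 / 60            # ~16.7ms lit per frame at 60Hz sample-and-hold
hold_120hz = 1000 / 120          # ~8.3ms at plain 120Hz -> 2x less blur than 60Hz
for strobe_ms in (1.7, 2.1):     # assumed strobe flash lengths
    print(round(hold_120hz / strobe_ms, 1), round(hold_60hz / strobe_ms, 1))
# prints roughly "4.9 9.8" and "4.0 7.9": a further ~4-5x from strobing, ~8-10x total vs 60Hz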
 
Last edited:

Shephard

Senior member
Nov 3, 2012
765
0
0
With a single GTX 680 on an i7-3770K, I am able to max out my settings even with AA enabled on all my Source Engine games (Half Life 2 series, Black Mesa, Team Fortress 2, Counter Strike, etc), and still play a solid 120fps@120Hz.

But you're right -- I can't even pull off 120fps in Crysis.

BTW, do you happen to have a Samsung 120 Hz monitor? If so, test out the Samsung Zero Motion Blur HOWTO. It's a LightBoost-like strobe backlight that also works on AMD Radeon cards, although it has more input lag than the ASUS/BENQ LightBoost monitors. Going from 60Hz->120Hz only gives a 2x motion blur reduction, but going from 120Hz->120Hz (strobed) gives a further ~4-5x reduction, ending up with roughly 7x to 11x less motion blur than 60Hz, according to motion tests.
No, I used to have one of the best 120Hz CRT monitors. I've only ever had LCDs that could do 75Hz. Better than 60, but nothing like 120.

Source games aren't demanding; I can get 200+.

CS is just a good example for a 120Hz smoothness test.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
1
71
No, I used to have one of the best 120Hz CRT monitors. I've only ever had LCDs that could do 75Hz. Better than 60, but nothing like 120.
Aha, CRTs are a different animal.
The best motion I've ever seen, back in the day, and only recently finally equalled by LightBoost LCD monitors.

Motion blur difference of CRT 60Hz versus CRT 120Hz (more subtle blur difference; more of a flicker difference) is very different than motion blur difference of LCD 60 Hz versus LCD 120 Hz (easy to tell apart), and also a different comparison than CRT 60 Hz versus LCD 60 Hz (easy to tell apart). This is due to the sample-and-hold effect of LCD contributing to perceived motion blur.

Just to be clear, when I said 120Hz having 50% less motion blur than 60Hz, I was referring to standard LCD displays.
 
Last edited:

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
I overclocked my monitor to 75Hz (from 60) and I can see the difference between 60 fps and 120 (75) on my monitor using the soccer ball test :(

Now I have to get a 120Hz screen...