
CONFIRMED: G-SYNC includes LightBoost sequel. (nVidia sanctioned, no hack)

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
When AndyBNV of nVidia was asked whether LightBoost could be combined with G-SYNC, he confirmed on NeoGAF:

AndyBNV said:
“We have a superior, low-persistence mode that should outperform that unofficial [LightBoost] implementation, and importantly, it will be available on every G-SYNC monitor. Details will be available at a later date.”
This effectively confirms strobing is used, based on how human vision perceives displayed motion: there is no other known way to achieve a LightBoost-matching low-persistence (~1ms) mode without ultrahigh refresh rates (e.g. 1000fps@1000Hz) or frame interpolation (e.g. 200fps->1000fps). Since both are unlikely with nVidia G-SYNC, this confirms backlight strobing is used to keep visible frame display times short (aka persistence). In addition, John Carmack confirmed on Twitter that a better backlight strobe driver is included:

John Carmack (@ID_AA_Carmack) on Twitter said:
“@GuerillaDawg they didn’t talk about it, but this includes an improved lightboost driver, but it is currently a choice — gsync or flashed.”
Both statements, by Andy and by John, confirm that official backlight strobing (a LightBoost successor) is part of G-SYNC: 2D motion blur elimination, finally officially sanctioned by nVidia. The question becomes: can both be combined into adaptive-rate backlight strobing?
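As a rough back-of-the-envelope sketch of the persistence argument above (my own illustrative numbers, not anything from nVidia):

```python
# Sample-and-hold persistence = how long each frame stays lit = 1/refresh rate.
# A strobe backlight decouples persistence from refresh rate:
# only the backlight flash is visible, so persistence = flash length.

def sample_and_hold_persistence_ms(refresh_hz: float) -> float:
    """Each frame remains visible for the whole refresh interval."""
    return 1000.0 / refresh_hz

def strobed_persistence_ms(flash_ms: float) -> float:
    """Only the strobe flash is visible, regardless of refresh rate."""
    return flash_ms

print(sample_and_hold_persistence_ms(120))   # ~8.33 ms at 120 Hz
print(sample_and_hold_persistence_ms(1000))  # 1.0 ms -- hence "1000fps@1000Hz"
print(strobed_persistence_ms(1.0))           # 1.0 ms at any refresh rate
```

Matching a 1ms strobe with sample-and-hold alone would need a 1000Hz display, which is why strobing is the only practical route today.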

UPDATE: Your existing ASUS VG248QE monitor is already upgradeable to G-SYNC!

______________

For those not aware: strobe backlights eliminate motion blur on LCDs. The backlight is turned off while waiting for pixel transitions (unseen by human eyes), and is strobed only on fully-refreshed LCD frames (seen by human eyes). The strobes can be shorter than the pixel transitions, breaking the pixel transition speed barrier! This allows an LCD to have motion as clear as a CRT.
Since GtG (pixel transition time) is now shorter than persistence (how long each static frame stays visible), most motion blur today is caused by persistence, as demonstrated by www.testufo.com/eyetracking
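The eyetracking demo boils down to a rough first-order model (my own simplification): perceived blur width is motion speed multiplied by persistence.

```python
def motion_blur_px(speed_px_per_sec: float, persistence_ms: float) -> float:
    """First-order model: while the eye tracks a moving object, each frame
    is smeared across the retina for as long as it stays visible."""
    return speed_px_per_sec * persistence_ms / 1000.0

# Illustrative numbers for a fast 960 px/s pan:
print(motion_blur_px(960, 8.33))  # ~8 px of blur (120 Hz sample-and-hold)
print(motion_blur_px(960, 1.0))   # ~1 px of blur (1 ms strobe)
```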
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Backlight strobing may be uncomfortable at low refresh rates, similar to CRTs. I guess we'll have to wait and see if they combine them. If you can maintain over 75 FPS, it probably would be nice, but not so great at lower FPS.
 
Last edited:

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
Backlight strobing may be uncomfortable at low refresh rates, similar to CRTs. I guess we'll have to wait and see if they combine them. If you can maintain over 75 FPS, it probably would be nice, but not so great at lower FPS.
Actually, according to John Carmack's tweet, it's an either-or option.
Some people will obviously prefer one, and others will prefer the other.

-- G-SYNC mode: better for variable framerates (fewer stutters, but more motion blur)
-- Strobe mode: better for consistent max framerates, e.g. 120fps @ 120Hz (zero motion blur)

However, I believe I've invented a variable-rate strobing algorithm that gradually becomes PWM-free below 60Hz. Creative strobe curve shaping. Eliminates flicker (for most). I'm hoping nVidia adopts this.
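As a hypothetical illustration of the idea only (made-up thresholds, not the actual algorithm):

```python
def strobe_duty(refresh_hz: float, strobe_duty_cycle: float = 0.1,
                fade_start_hz: float = 60.0, fade_end_hz: float = 40.0) -> float:
    """Backlight duty cycle: short strobes at high refresh rates,
    crossfading toward always-on (duty 1.0 = PWM-free) at low refresh rates."""
    if refresh_hz >= fade_start_hz:
        return strobe_duty_cycle          # full strobing, lowest persistence
    if refresh_hz <= fade_end_hz:
        return 1.0                        # fully PWM-free, no flicker
    # Linear crossfade between strobed and PWM-free
    t = (fade_start_hz - refresh_hz) / (fade_start_hz - fade_end_hz)
    return strobe_duty_cycle + t * (1.0 - strobe_duty_cycle)

print(strobe_duty(120))  # 0.1  -> aggressive strobing
print(strobe_duty(50))   # ~0.55 -> halfway through the crossfade
print(strobe_duty(30))   # 1.0  -> PWM-free
```

The tradeoff: as duty cycle rises toward 1.0, flicker disappears but motion blur gradually returns, which is why the crossfade only kicks in at low refresh rates where flicker would be objectionable anyway.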
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Actually, according to John Carmack's tweet, it's an either-or option.
Some people will obviously prefer one, and others will prefer the other.

-- G-SYNC mode: better for variable framerates (fewer stutters, but more motion blur)
-- Strobe mode: better for consistent max framerates, e.g. 120fps @ 120Hz (zero motion blur)

However, I believe I've invented a variable-rate strobing algorithm that gradually becomes PWM-free below 60Hz. Creative strobe curve shaping. Eliminates flicker (for most). I'm hoping nVidia adopts this.
I was responding to your question:
The question becomes: Can both be combined into adaptive-rate backlight strobing?
I was pointing out a possible drawback to that.

Edit: And after reading the link about it, I see you are aware of the possible problem of that idea.
 
Last edited:

akahoovy

Golden Member
May 1, 2011
1,336
0
0
Exciting times. I've been this || close to buying an Overlord Tempest 270OC, but now I wonder if I should wait. I would love to jump to a 1440p IPS panel, but G-Sync sounds fantastic.
 

omeds

Senior member
Dec 14, 2011
646
13
81
Hopefully this means the display won't have crappy IQ from 3D mode like when using the 2D lightboost hack.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
Hopefully this means the display won't have crappy IQ from 3D mode like when using the 2D lightboost hack.
Yes, there are many obvious opportunities to improve the quality of strobe backlight modes. I'm not sure which qualities got improved, but there are many ways they could have:

* No need to compensate for 3D glasses tint
* Better color calibration
* No gamma bleaching
* Wider strobe brightness adjustment range (brighter settings and dimmer settings)
* 12-bit or 14-bit LUT's for banding-free strobe-optimized overdrive algorithms.
* Easy to enable in OSD or nVidia settings.
* Adjustable strobe flash length.
* Shorter minimum strobe flash length (motion blur is directly proportional to persistence).

The colors might be slightly different in strobed mode, but the delta will probably be smaller than the color quality difference between 120Hz and 144Hz. It will probably still be dimmer, especially at shorter strobe flash lengths.
 
Last edited:

Elfear

Diamond Member
May 30, 2004
7,034
554
126
Looks like we AMD folks will still have to use Strobelight. I wonder if G-Sync will be hackable...
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
On the pcper.com livestream today it was mentioned that there are 4 modes in the G-SYNC module:
- the usual game mode they have been talking about
- 3D Vision
- a low-persistence mode with a very high fixed refresh rate
- a LightBoost-like mode, but much better. They have invested quite heavily in this.

It's not quite the same as LightBoost, but they have taken the idea and run with it, and it's something the module directly supports.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
It's not quite the same as LightBoost, but they have taken the idea and run with it, and it's something the module directly supports.
Good stuff. They're finally really promoting a strobe backlight mode now!
I've been trying to reach nVidia, since Blur Busters is the reason for LightBoost 2D's popularity:

[SHAMELESS-PLUG]
If any nVidia employees are reading this, can they refer me to nVidia's PR? I'd like to put Blur Busters on the blogger reporter invite list to future nVidia launch events. Being located in Toronto, Chief Blur Buster is within driving distance of Montreal. Blur Busters would love to attend future launches similar to G-SYNC.
[/SHAMELESS-PLUG]
 
Last edited:

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Really disappointed about how evasive they were with licensing and other graphics card vendors. nVidia's attitude to the market is really beginning to irritate me. They make great tech and then doom it to obscurity with this "it's nVidia-only" differentiation that just slows down its adoption.

I don't mind needing to get a new monitor; it's a given that a new protocol needs a new monitor. Fine. Heck, I can even live with a new GPU. What I can't live with is needing an nVidia GPU specifically. That isn't necessary, and it's anti-competitive. I would like all the monitor manufacturers doing it, and all the GPU manufacturers doing it by their own choice and free will. I would like IPS monitors trying to do it as well. It's that free market which gives us the variety and finds the best solutions.

I really hope this doesn't end up as an obscure hack, only on particular nVidia cards with these special monitors, that we see for half a year before everyone abandons it because it fails to get adopted due to nVidia's annoying stance on their competitors. LightBoost is like that today, an obscure hack, and it ought to be more mainstream.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
nVidia said G-SYNC is being tested for future 1440p and 4K monitors too, it's not limited to TN. I'm glad they're aiming to make the "LightBoost" thing more mainstream; hopefully that's going in 1440p and 4K monitors, too.

I think variable refresh rate technologies are an impressive idea (as is, potentially, future random access of pixels on the LCD panel, allowing a faster refresh rate where the eye is pointing, etc.)

I don't think they'll keep this to themselves forever; they'll probably have an exclusivity for a little while, and then begin licensing this technology. They said they didn't preclude licensing.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
The same way they license PhysX, sure.

Meanwhile, AMD is actually helping to improve the 4K standard. http://semiaccurate.com/2013/10/11/amd-solves-4k-display-problems-vesa-display-id-v1-3/
PhysX is very different from this; also read this: http://mygaming.co.za/news/hardware/59483-nvidia-gsync-tech-will-boost-gaming-monitor-performance.html
A boon to the adoption of GSync is that the translation is all done at the driver level. GSync is merely an extension of the Displayport protocol and does not need approval from a standards body, nor is the method patented, leaving AMD and Intel to work on their own solutions or help the industry to shift to this new display method.
 
Nov 26, 2005
14,765
114
106
Strobe mode has no effect on input lag. G-sync should be the same as no v-sync is now in terms of input lag.
So Strobed Mode (Lightboost) @ 10% does not have an effect on input lag VS 144Hz with no Lightboost?

EDIT: I do have a VG248QE, and a VG278H
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
So Strobed Mode (Lightboost) @ 10% does not have an effect on input lag VS 144Hz with no Lightboost?

EDIT: I do have a VG248QE, and a VG278H
The only thing strobing does is cause the backlighting to dim/blacken when the refresh happens. After it finishes the refresh, it goes to full brightness and you don't see the transition.

I guess, if you want to get completely technical, you could consider that dim/blackened light phase as a delay, but the actual pixel change is not altered.
 
Nov 26, 2005
14,765
114
106
The only thing strobing does is cause the backlighting to dim/blacken when the refresh happens. After it finishes the refresh, it goes to full brightness and you don't see the transition.

I guess, if you want to get completely technical, you could consider that dim/blackened light phase as a delay, but the actual pixel change is not altered.
I do notice a difference between 144Hz vs 120Hz LightBoost strobed @ 10%, and the latter is quicker.

EDIT: I guess it's safe to say LightBoost takes away some of the inconsistencies.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I do notice a difference between 144Hz vs 120Hz LightBoost strobed @ 10%, and the latter is quicker.

EDIT: I guess it's safe to say LightBoost takes away some of the inconsistencies.
Well, the way it works, the actual latency should not be affected. However, it may be perceived differently.
 

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
Strobe mode has no effect on input lag. G-sync should be the same as no v-sync is now in terms of input lag.
Strobe mode adds an average of half a frame of input lag, as seen when looking closely at the high-speed video: http://www.youtube.com/watch?v=hD5gjAs1A2s

With a strobe mode, the LCD is refreshed in total darkness, so the flash only happens at the end of the refresh. That means the top of the screen has a full frame of input lag, the center of screen has half a frame of input lag, and the bottom edge of screen has unchanged input lag. Confirmed with my prototype Blur Busters Input Lag Tester.

But strobe mode actually improves human reaction time (for eye-tracking use cases, such as circle strafing, high-speed helicopter flybys, and other fast panning that requires fast reaction times) because the lack of motion blur greatly outweighs the ~4ms of added average input lag, allowing people to score better. You can think of it as a fair tradeoff: in exchange for increasing input lag by half a frame, it decreases human brain lag by more than that.
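The per-screen-position lag works out like this (a simple model of the behavior described above; the 120Hz/1080-line numbers are illustrative):

```python
def strobe_added_lag_ms(scanline: int, total_lines: int, refresh_hz: float) -> float:
    """Added lag from strobing: the panel refreshes top-to-bottom in darkness,
    then the whole frame is flashed at once at the end of the scanout,
    so earlier scanlines wait longer before becoming visible."""
    frame_ms = 1000.0 / refresh_hz
    return frame_ms * (1.0 - scanline / total_lines)

# At 120 Hz (8.33 ms per refresh) on a 1080-line panel:
print(strobe_added_lag_ms(0, 1080, 120))     # top: ~8.33 ms (one full frame)
print(strobe_added_lag_ms(540, 1080, 120))   # center: ~4.17 ms (half a frame)
print(strobe_added_lag_ms(1080, 1080, 120))  # bottom: ~0 ms (unchanged)
```

Averaged over the whole screen, that is half a frame, or roughly 4ms at 120Hz.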
 
Last edited:

Mark Rejhon

Senior member
Dec 13, 2012
273
0
71
I do notice a difference between 144Hz VS 120Hz Lightboost strobbed @ 10% and the latter is quicker.

EDIT: I guess it's safe to say Lightboost takes away some of the inconsistencies.
High speed video of LightBoost on YouTube:
http://www.youtube.com/watch?v=hD5gjAs1A2s

It's quite self-explanatory how LightBoost hides the pixel transitions.
Note: the video was taken of TestUFO's Flicker pattern running in full-screen mode (select "Flicker Test" at the top of www.testufo.com, then select Height: Full Screen).
 