G-sync... something?


darkfalz

Member
Jul 29, 2007
181
0
76
G-sync is a hardware implementation that doesn't care whether v-sync is on or off or what refresh rate you are at (if v-sync is off).

I'd imagine with compatible hardware you could do the same via software, but I think the latency / performance hit would be greater. I'll try to explain to the best of my understanding:

In G-sync the onboard memory receives the frame from the GPU. It then displays this frame instantly in a refresh. It can do this while receiving the next frame into the onboard memory. This is the advantage of having that buffer memory on the module - think of it sort of like hardware triple buffering. G-sync isn't really "variable refresh" - it just waits until it gets the next frame before it refreshes on demand.

In a variable refresh rate mode, the GPU would need to tell the monitor a frame was finished, adjust the refresh interval for that frame, and display it before sending the next one. Because the monitor has nowhere to "store" frames, it has to wait until it has displayed the current frame and then tell the GPU to send the next one (much like V-sync, except the interval would not be fixed). Sure - they could implement a third buffer in hardware, but this would introduce some latency, extra memory use and so on.
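Here's a rough toy timeline of the two schemes I'm describing, with made-up numbers (an assumed 7 ms scan-out and arbitrary frame times) - just my mental model sketched in Python, not anything from NVIDIA documentation:

```python
SCAN_OUT_MS = 7  # assumed time the panel needs to scan out one refresh

def buffered_refresh_on_demand(frame_times_ms):
    """Frames land in the module's buffer and scan-out starts immediately;
    the next frame can arrive while the current one is being displayed."""
    t = 0
    display_starts = []
    for ft in frame_times_ms:
        t += ft                   # GPU finishes the frame
        display_starts.append(t)  # refresh starts right away from the buffer
    return display_starts

def handshake_without_buffer(frame_times_ms):
    """The monitor has nowhere to store a frame, so a finished frame can't
    be sent/shown until the previous scan-out is done."""
    t = 0
    panel_free_at = 0
    display_starts = []
    for ft in frame_times_ms:
        t += ft                        # GPU finishes the frame
        start = max(t, panel_free_at)  # may have to wait for the panel
        display_starts.append(start)
        panel_free_at = start + SCAN_OUT_MS
    return display_starts

frames = [10, 16, 12, 5, 20]  # ms per frame, arbitrary
print(buffered_refresh_on_demand(frames))  # [10, 26, 38, 43, 63]
print(handshake_without_buffer(frames))    # [10, 26, 38, 45, 63]
```

The only difference in this toy model shows up when a frame finishes faster than the panel can scan out - the buffered case shows it immediately, the handshake case has to wait.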
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Well, PCPer got G-Sync working on a laptop with these drivers and no G-Sync hardware.

Ryan's Note: I think it is important here to point out that we didn't just look at demos and benchmarks for this evaluation but actually looked at real-world gameplay situations. Playing through Metro: Last Light showed very smooth pans and rotation, Assassin's Creed played smoothly as well and flying through Unigine Heaven manually was a great experience. Crysis 3, Battlefield 4, etc. This was NOT just a couple of demos that we ran through - the variable refresh portion of this mobile G-Sync enabled panel was working and working very well.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
That's a good thing, as it should mean lower-cost G-Sync monitors - assuming a lot of the added cost of current G-Sync monitors is due to the module rather than licensing, and that Nvidia allows module-less G-Sync desktop monitors. It also means FreeSync should be just as good as G-Sync.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
G-sync is a hardware implementation that doesn't care whether v-sync is on or off or what refresh rate you are at (if v-sync is off).

I'd imagine with compatible hardware you could do the same via software, but I think the latency / performance hit would be greater. I'll try to explain to the best of my understanding:

In G-sync the onboard memory receives the frame from the GPU. It then displays this frame instantly in a refresh. It can do this while receiving the next frame into the onboard memory. This is the advantage of having that buffer memory on the module - think of it sort of like hardware triple buffering. G-sync isn't really "variable refresh" - it just waits until it gets the next frame before it refreshes on demand.

In a variable refresh rate mode, the GPU would need to tell the monitor a frame was finished, adjust the refresh interval for that frame, and display it before sending the next one. Because the monitor has nowhere to "store" frames, it has to wait until it has displayed the current frame and then tell the GPU to send the next one (much like V-sync, except the interval would not be fixed). Sure - they could implement a third buffer in hardware, but this would introduce some latency, extra memory use and so on.

That's not how it works. G-Sync is variable refresh. It refreshes while receiving the frame. The memory is there to compare the incoming frame to the previous one, so it can do response-time compensation / overdrive.

All high refresh rate screens have memory.
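Rough idea of what the overdrive part does with that stored previous frame - just a simplified linear formula I made up for illustration; real panels use tuned lookup tables per transition:

```python
OVERDRIVE_GAIN = 0.4  # hypothetical strength, real values come from per-panel tuning

def overdriven_level(previous, target):
    """Drive value for one sub-pixel (0-255): overshoot past the target in
    proportion to how big the change from the previous frame is, so the
    liquid crystal settles faster."""
    boosted = target + OVERDRIVE_GAIN * (target - previous)
    return max(0, min(255, round(boosted)))

print(overdriven_level(previous=40, target=200))  # dark-to-bright: ~264, clamped to 255
print(overdriven_level(previous=220, target=60))  # bright-to-dark: ~-4, clamped to 0
```

That's why the module keeps the previous frame in memory: without it there's nothing to compare the incoming pixel against.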
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
That's not how it works. G-Sync is variable refresh. It refreshes while receiving the frame. The memory is there to compare the incoming frame to the previous one, so it can do response-time compensation / overdrive.

All high refresh rate screens have memory.

I'd like to know exactly what it does. It obviously has nothing to do with dynamically varying the refresh rate, because G-Sync works without it. That's at least the case with Maxwell. Maybe Kepler needed it?
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
I'd like to get exact details on how the module works too, as my understanding is all based on assumptions. In use I do notice that G-Sync helps SLI (with G-Sync I do not see any microstutter; with G-Sync off there is very noticeable microstutter). G-Sync is primarily meant to smooth out varying framerates with a single card, but I didn't think it would help with the varying frametimes (i.e. microstutter) seen with SLI. G-Sync obviously does help SLI microstutter based on my experience, so I'm very curious whether that's because of the adaptive refresh implementation or because the module-based implementation helps.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I'd like to get exact details on how the module works too, as my understanding is all based on assumptions. In use I do notice that G-Sync helps SLI (with G-Sync I do not see any microstutter; with G-Sync off there is very noticeable microstutter). G-Sync is primarily meant to smooth out varying framerates with a single card, but I didn't think it would help with the varying frametimes (i.e. microstutter) seen with SLI. G-Sync obviously does help SLI microstutter based on my experience, so I'm very curious whether that's because of the adaptive refresh implementation or because the module-based implementation helps.

It definitely can help with SLI microstutter. What causes the microstutter is bad timing of frames, in part due to the disparity between when a frame was created and meant to be shown, and when it actually gets shown.

With variable refresh rates, the frames that are created are shown at the exact time they are meant to be. So even if the frame times jump back and forth between 10 ms and 16 ms, the frames were created to fit that time sequence, so the result looks the way it should.
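Quick sketch of that with made-up numbers (Python, purely illustrative):

```python
# Alternating 10 ms / 16 ms frame times, the kind of pacing you can get with AFR SLI.
frame_times_ms = [10, 16, 10, 16, 10, 16]

# Time at which each frame is finished - which is also the point in game time
# it was simulated to represent.
ready_times = []
t = 0
for ft in frame_times_ms:
    t += ft
    ready_times.append(t)

# With variable refresh the monitor starts a refresh the moment the frame is
# ready, so presentation time == ready time and the uneven spacing still
# matches the spacing the game rendered for.
vrr_display_times = ready_times
print(vrr_display_times)  # [10, 26, 36, 52, 62, 78]
```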
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
Thanks, that is a really good explanation. For those who need visual help like myself, from Blur Busters: http://www.blurbusters.com/gsync/how-does-gsync-fix-stutters. It's just hard for me to believe, even though that's what I'm seeing! I'll blame it on how our eyes lie to us :) Also, I was probably making an incorrect assumption that with V-Sync OFF vs G-Sync, the only difference between the two would be tearing. I see microstutter with V-Sync off.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Thanks, that is a really good explanation. For those who need visual help like myself, from Blur Busters: http://www.blurbusters.com/gsync/how-does-gsync-fix-stutters. It's just hard for me to believe, even though that's what I'm seeing! I'll blame it on how our eyes lie to us :) Also, I was probably making an incorrect assumption that with V-Sync OFF vs G-Sync, the only difference between the two would be tearing. I see microstutter with V-Sync off.

While the frame times are the same, the refresh rate is fixed without G-Sync. So even though a new frame has finished and been sent to the monitor, a lot of the time it has to wait for the next refresh to be shown, or only part of the frame gets updated on that refresh and the rest waits for another one.
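Using the same made-up numbers as my sketch above, here's roughly what a fixed 60 Hz refresh does to those frames (assuming sync, so no tearing):

```python
import math

REFRESH_MS = 1000 / 60  # ~16.7 ms per refresh at a fixed 60 Hz

ready_times = [10, 26, 36, 52, 62, 78]  # same toy frame-ready times as before

# With a fixed refresh, a finished frame is held until the next refresh boundary.
fixed_display_times = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in ready_times]

for ready, shown in zip(ready_times, fixed_display_times):
    print(f"ready at {ready:3d} ms -> shown at {shown:5.1f} ms (waited {shown - ready:4.1f} ms)")

# Note frames 4 and 5 both land on the 66.7 ms refresh, so one of them would
# effectively be dropped or overwritten - another source of judder.
```

The waits jump around between roughly 5 ms and 15 ms, which is the visible stutter, even though the frames themselves were produced on a consistent pattern.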

G-Sync and FreeSync help correct the visual problems of microstutter. It isn't a complete fix, but it helps quite a bit.

Good link. I'll have to save that for future explanations.
 

dacostafilipe

Senior member
Oct 10, 2013
805
309
136
I'd like to know exactly what it does. It obviously has nothing to do with dynamically varying the refresh rate, because G-Sync works without it. That's at least the case with Maxwell. Maybe Kepler needed it?

The module hardware is "just" what a monitor controller needs. The module was needed for nVidia to release G-Sync monitors faster, because controller manufacturers would never agree to implement a non-free tech like G-Sync. In nVidia's documentation they even say that the module replaces (!!!) the "normal" screen controller.

That's everything, really. It does nothing else, because there's nothing else that needs to be done.

With FreeSync working as expected and this leak, what more proof do you guys need?
 

kasakka

Senior member
Mar 16, 2013
334
1
81
The reason why the leak works with laptop eDP displays is that in those devices the GPU hardware is handling scaler duties and controlling the display directly. Otherwise you need hardware for G-Sync to work, just like adaptive sync needed a hardware change on displays.

I'm interested in what the module does though, considering ASUS says they had to add a bigger heatsink and overclock the component to get it working right in their ROG Swift display. What kind of processing causes that kind of heat output?
 

dacostafilipe

Senior member
Oct 10, 2013
805
309
136
The reason why the leak works with laptop eDP displays is that in those devices the GPU hardware is handling scaler duties and controlling the display directly.

It's easier to do on laptops, because almost all of them use eDP, and eDP has supported Adaptive-Sync since 2009.

eDP uses a normal DP port for communication with the GPU (where LVDS required a special, dedicated one). If the GPU supports Adaptive-Sync over eDP, it also supports Adaptive-Sync over normal DP. So, Maxwell supports Adaptive-Sync but nVidia chooses not to enable it.
 

kasakka

Senior member
Mar 16, 2013
334
1
81
It's easier to do on laptops, because almost all of them use eDP, and eDP has supported Adaptive-Sync since 2009.

eDP uses a normal DP port for communication with the GPU (where LVDS required a special, dedicated one). If the GPU supports Adaptive-Sync over eDP, it also supports Adaptive-Sync over normal DP. So, Maxwell supports Adaptive-Sync but nVidia chooses not to enable it.

Why does eDP support Adaptive-Sync out of the box? Do you know why it was originally included? Is it just something from a newer revision?
 

dacostafilipe

Senior member
Oct 10, 2013
805
309
136
Why does eDP support Adaptive-Sync out of the box? Do you know why it was originally included? Is it just something from a newer revision?

They just gave it a new name.

Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP™) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.
 
Feb 19, 2009
10,457
10
76
You know it's messed up when even FUD thinks G-Sync monitors are a rip-off.

http://fudzilla.com/news/graphics/36903-benq-wants-580-99-for-g-sync-24-inch-1080p-monitor

NV will keep on milking their massive premium for a module which isn't needed, for as long as they can get away with it. They could also easily support Adaptive-Sync since it's an open industry standard and it's not linked to AMD; they can give it a different name, like NSync.. but no, they don't want to make it cheaper for gamers, they want to milk it.

As gamers, it's your responsibility not to reward such behavior, and to expect better.
 

dacostafilipe

Senior member
Oct 10, 2013
805
309
136
NV will keep on milking their massive premium for a module which isn't needed, ...

... in the future. Because at the moment it's still needed. That's the important point here.

But they could have just told the truth: that the module was created because there was no other alternative at the time.

So much bad press about nVidia lately; most of it could have been avoided by just telling the truth.