[TechPowerUp article] FreeSync explained in more detail


velis

Senior member
Jul 28, 2005
600
14
81
Petty bickering aside, reading about this topic just makes me think about something both AMD and NV already have full control over, but choose to convey through new marketing buzzwords:
Activating VSYNC causes the GFX card to WAIT before sending a finished frame to the monitor.
GSYNC makes this issue go away because it's now the monitor doing the waiting.
Either or both could simply do away with this by having the GFX card buffer the finished frame until VSYNC comes along, and generate the next one while waiting.
Which is where triple buffering (long since implemented) comes in.

Just what exactly is the difference in LAG and FLUIDITY between G-SYNC and triple buffering?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Petty bickering aside, reading about this topic just makes me think about something both AMD and NV already have full control over, but choose to convey through new marketing buzzwords:
Activating VSYNC causes the GFX card to WAIT before sending a finished frame to the monitor.
GSYNC makes this issue go away because it's now the monitor doing the waiting.
Either or both could simply do away with this by having the GFX card buffer the finished frame until VSYNC comes along, and generate the next one while waiting.
Which is where triple buffering (long since implemented) comes in.

Just what exactly is the difference in LAG and FLUIDITY between G-SYNC and triple buffering?
Clearly you need to read a bit about the downsides of v-sync even with triple buffering.

A 60Hz monitor, with FPS ranging between 30 and 60, can only display frames with times of 16.7ms or 33.3ms. Nothing in between.

That means that if you have 45 FPS, your frame times will be:
16.7ms, 33.3ms, 16.7ms, 33.3ms, etc.

Triple buffering allows for variable FPS between 30 and 60, but it still does not deliver the frames smoothly to the display.

With G-sync, those uneven frame times become:
25ms, 25ms, 25ms, 25ms, etc.

Lag is another issue, but the above is already plenty reason for G-sync and Freesync. It does sound like Freesync may have extra latency, but we are still uncertain, as the tech is still in development.
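
If it helps to see the quantization, here's a quick Python sketch of the idea (my own toy model, assuming a constant ~22.2ms render time and ignoring GPU back-pressure; real frame times vary):

Code:
REFRESH = 1000 / 60      # 16.7 ms between refreshes on a fixed 60 Hz panel
RENDER = 1000 / 45       # ~22.2 ms per rendered frame (45 FPS)

def vsync_intervals(render_ms, frames=9):
    # Each finished frame waits for the next fixed refresh tick
    # (simplified: GPU stalls and dropped frames are ignored).
    intervals, ready, last_shown = [], 0.0, 0.0
    for _ in range(frames):
        ready += render_ms                         # frame finishes rendering
        shown = (ready // REFRESH + 1) * REFRESH   # next 16.7 ms boundary
        intervals.append(round(shown - last_shown, 1))
        last_shown = shown
    return intervals

def variable_refresh_intervals(render_ms, frames=9):
    # The panel refreshes whenever a frame arrives, so intervals track render time.
    return [round(render_ms, 1)] * frames

print(vsync_intervals(RENDER))             # only 16.7s and 33.3s, never 22.2
print(variable_refresh_intervals(RENDER))  # steady 22.2 ms per frame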
 

SoulWager

Member
Jan 23, 2013
155
0
71
Petty bickering aside, reading about this topic just makes me think about something both AMD and NV already have full control over, but choose to convey through new marketing buzzwords:
Activating VSYNC causes the GFX card to WAIT before sending a finished frame to the monitor.
GSYNC makes this issue go away because it's now the monitor doing the waiting.
Either or both could simply do away with this by having the GFX card buffer the finished frame until VSYNC comes along, and generate the next one while waiting.
Which is where triple buffering (long since implemented) comes in.

Just what exactly is the difference in LAG and FLUIDITY between G-SYNC and triple buffering?

Triple buffering at 40 FPS (each frame takes 25ms to render) causes half your frames to display for 16.7ms and the other half for 33.3ms (this causes stuttering/juddering). G-Sync at 40 FPS displays every frame for 25ms.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Lag is another issue, but the above is already plenty reason for G-sync and Freesync. It does sound like Freesync may have extra latency, but we are still uncertain, as the tech is still in development.

FreeSync only works with buffers because AMD needs to adjust the refresh cycle before the frame gets sent, and it still needs to be synchronized with the monitor (V-Sync).

It's not even close to G-Sync. It's more of an improved V-Sync-with-TB method than a real solution to the problems of tearing, stuttering, lag, and half frames.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106

I don't think it uses eDP internally. Also, you can't connect the display via eDP from your desktop, for that matter.

Also, it would need to be eDP 1.3 and contain a display framebuffer. Read: a G-Sync-style solution.

PSR mode allows the GPU to enter a power-saving state in between frame updates by including framebuffer memory in the display panel controller.
 

velis

Senior member
Jul 28, 2005
600
14
81
I'm not sure this is really the case: it has been said many times that the NV controller sits between the GFX card and the monitor logic. That would mean the monitor continues working at its fastest refresh (144Hz in the provided examples), which is fast enough that the moment a new frame is displayed isn't visually perceptible.

But given a 60Hz monitor, G-SYNC would be nothing more than triple buffering. That's as far as my reading carries me, but perhaps G-SYNC actually does display the frame as it comes in, which would then actually make a difference vs triple buffering. But then why is this tech not shown with any 60Hz monitors?

BUT, with a 144Hz monitor and with the assumption of no LAG improvements, the actual G-SYNC or FreeSync benefit is perceptibly nothing more than triple buffering, even if a frame can be displayed at any time.
 

RaulF

Senior member
Jan 18, 2008
844
1
81
So as I understand it, the graphics card is sort of always waiting on the monitor. I see it as the GPU just sitting there waiting for the monitor to do its thing, and that causes lag.
Does the G-Sync module force the monitor to take whatever the GPU has ready and display it no matter what, even if the monitor is not ready?

So the GPU is always pumping out frames without having to pause or wait; it just keeps pumping them out, with the G-Sync module pushing them to the monitor as soon as the GPU has them!


Feel free to criticize or guide me in the right direction.
 

velis

Senior member
Jul 28, 2005
600
14
81
This is exactly what I think G-SYNC does not do. I think the only actual thing it does is buffer the frame until the monitor is ready to display it. Since they only showcase the tech using 144Hz monitors, that looks great, but on 60Hz monitors it wouldn't look half as great.
Edit: FreeSync is even worse than that: the framerate is supposed to be determined in advance for it.
 

Jaydip

Diamond Member
Mar 29, 2010
3,691
21
81
Clearly you need to read a bit about the downsides of v-sync even with triple buffering.

A 60Hz monitor, with FPS ranging between 30 and 60, can only display frames with times of 16.7ms or 33.3ms. Nothing in between.

That means that if you have 45 FPS, your frame times will be:
16.7ms, 33.3ms, 16.7ms, 33.3ms, etc.

Triple buffering allows for variable FPS between 30 and 60, but it still does not deliver the frames smoothly to the display.

With G-sync, those uneven frame times become:
25ms, 25ms, 25ms, 25ms, etc.

Lag is another issue, but the above is already plenty reason for G-sync and Freesync. It does sound like Freesync may have extra latency, but we are still uncertain, as the tech is still in development.

Add "average" before 45 please :D
 

jj109

Senior member
Dec 17, 2013
391
59
91
But given a 60Hz monitor, G-SYNC would be nothing more than triple buffering. That's as far as my reading carries me, but perhaps G-SYNC actually does display the frame as it comes in, which would then actually make a difference vs triple buffering. But then why is this tech not shown with any 60Hz monitors?

Whenever someone goes with the tired and wrong triple buffering argument against G-sync, I just link this image.

[Image: NgCwFP6.png]


In G-sync, the monitor holds onto its current framebuffer unless the GPU pushes it a new one. The only downside is the GPU must make sure that the monitor's framebuffer isn't being scanned.
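
As a rough illustration of that constraint, here's a toy Python model (my own sketch, not anything from NVIDIA; the ~6.9ms scan-out time is just an assumption for a 144Hz-class panel):

Code:
SCANOUT_MS = 1000 / 144   # ~6.9 ms to scan out one frame on a 144 Hz-class panel

def push_times(render_times_ms):
    # When each frame reaches the screen, assuming the only wait
    # is for the previous scan-out to finish.
    now, busy_until, pushed = 0.0, 0.0, []
    for rt in render_times_ms:
        now += rt                         # frame finishes rendering
        start = max(now, busy_until)      # wait only if the panel is mid-scan
        pushed.append(round(start, 1))
        busy_until = start + SCANOUT_MS   # panel is busy while scanning out
    return pushed

print(push_times([22.0, 18.0, 30.0, 25.0]))  # irregular times pass straight through
print(push_times([22.0, 3.0, 30.0]))         # second frame waits for the scan-out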

Add "average" before 45 please :D

Eh, triple buffered V-sync looks awful if it can't hit the refresh rate.
 

omeds

Senior member
Dec 14, 2011
646
13
81
I'm not sure this is really the case: it has been said many times that the NV controller sits between the GFX card and the monitor logic. That would mean the monitor continues working at its fastest refresh (144Hz in the provided examples), which is fast enough that the moment a new frame is displayed isn't visually perceptible.

But given a 60Hz monitor, G-SYNC would be nothing more than triple buffering. That's as far as my reading carries me, but perhaps G-SYNC actually does display the frame as it comes in, which would then actually make a difference vs triple buffering. But then why is this tech not shown with any 60Hz monitors?

BUT, with a 144Hz monitor and with the assumption of no LAG improvements, the actual G-SYNC or FreeSync benefit is perceptibly nothing more than triple buffering, even if a frame can be displayed at any time.

It's nothing like TB. G-Sync synchronizes the monitor's refresh rate to match the frame rate 1:1, even when FPS drops below the max refresh rate.
 

velis

Senior member
Jul 28, 2005
600
14
81
Give me a video showing VSYNC, triple-buffered VSYNC, and G-SYNC, with precise timing information embedded into the picture.

With the info currently known, I'm betting the latter two will look nearly identical.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
But then why is this tech not shown with any 60Hz monitors?

There is no "Hz" anymore with G-Sync. You can limit your current frames to 60fps on a 144Hz display and there is no difference to a 60Hz G-Sync monitor.

G-Sync is responsible to refresh your display. And that's the reason why G-Sync works without buffer.

Give me a video showing VSYNC, triple-buffered VSYNC, and G-SYNC, with precise timing information embedded into the picture.

With the info currently known, I'm betting the latter two will look nearly identical.

Watch the promo video from nVidia on their site, or look on YouTube for the Montreal videos. nVidia always used V-Sync with TB.
 

omeds

Senior member
Dec 14, 2011
646
13
81
Give me a video showing VSYNC, triple-buffered VSYNC, and G-SYNC, with precise timing information embedded into the picture.

With the info currently known, I'm betting the latter two will look nearly identical.


I don't mean to be condescending, but I don't understand which part you're having trouble with.

G-Sync alters the actual refresh rate of the display. If you have 45fps, the refresh rate will be 45Hz. TB and V-Sync do not do this; if you have 45fps, the monitor still runs at 60Hz, causing judder.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
No, G-Sync is not altering the refresh rate. It triggers a refresh cycle based on the time between frames. The refresh rate is still the maximum rate of the monitor.
 

omeds

Senior member
Dec 14, 2011
646
13
81
And what do you think changing the refresh cycle is doing? That is exactly what's happening. When you have 45fps, the monitor is running at 45Hz. The end result is that the refresh rate and frame rate are at 1:1 parity.
 

velis

Senior member
Jul 28, 2005
600
14
81
Like I said: if this is true, then G-SYNC is something more, but even then I claim it is irrelevant in comparison to triple buffering when used with a 144Hz display.
If used with a 60Hz display, though, it's a completely different game.
But since the tech is showcased with 144Hz monitors only, I highly doubt G-SYNC actually creates a 1:1 parity. It just looks like that because 144Hz is really fast. Not to mention that the comparison video was shot with a (comparatively) crappy camera, not nearly professional-level equipment (for such a scenario).
 

dacostafilipe

Senior member
Oct 10, 2013
805
309
136
But given a 60Hz monitor, G-SYNC would be nothing more than triple buffering. That's as far as my reading carries me, but perhaps G-SYNC actually does display the frame as it comes in, which would then actually make a difference vs triple buffering. But then why is this tech not shown with any 60Hz monitors?

Exactly. All the information I've seen indicates this.

BUT, with a 144Hz monitor and with the assumption of no LAG improvements, the actual G-SYNC or FreeSync benefit is perceptibly nothing more than triple buffering, even if a frame can be displayed at any time.

The tech that removes tearing is exactly the same as the one behind triple buffering: you just keep a copy of the last completely rendered frame in memory (GPU or controller).

To remove the input lag of those solutions, you try to match your LCD's redraw frequency to your actual framerate.

You don't change the frequency of the LCD directly. You run it at max Hz and, by "pausing" it, you can "emulate" a reduced frequency. Ex: if at 120Hz I add an 8.3ms delay (via VBLANK), I get a new frame every 16.6ms, which is exactly what you have at 60Hz.

This last part is what's "new" in both tech.
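
A quick back-of-the-envelope check of that in Python, using the 120Hz numbers above (nothing vendor-specific, just the arithmetic):

Code:
base_hz = 120
scan_interval = 1000 / base_hz        # ~8.3 ms between scan-outs at 120 Hz
extra_vblank = 8.3                    # hold VBLANK an extra ~8.3 ms

effective_interval = scan_interval + extra_vblank
print(round(effective_interval, 1), "ms ->",
      round(1000 / effective_interval, 1), "Hz")   # 16.6 ms -> ~60 Hz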
 

omeds

Senior member
Dec 14, 2011
646
13
81
Like I said: if this is true, then G-SYNC is something more, but even then I claim it is irrelevant in comparison to triple buffering when used with a 144Hz display.
If used with a 60Hz display, though, it's a completely different game.
But since the tech is showcased with 144Hz monitors only, I highly doubt G-SYNC actually creates a 1:1 parity. It just looks like that because 144Hz is really fast. Not to mention that the comparison video was shot with a (comparatively) crappy camera, not nearly professional-level equipment (for such a scenario).


I have a 144Hz display, and it's still very noticeable when the frame rate does not match the refresh rate with TB. It presents the same judder as 60Hz displays, just at shorter intervals.

Watch the G-Sync demo; the refresh rate becomes variable to match the frame rate 1:1.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
And what do you think changing the refresh cycle is doing? That is exactly what's happening. When you have 45fps, the monitor is running at 45Hz. The end result is that the refresh rate and frame rate are at 1:1 parity.

The G-Sync module.
And it isn't running at 45Hz. The display gets refreshed every 22.2ms by the G-Sync module. The current "refresh rate" of the display is still the maximum rate, or whichever one was selected.

And yes, G-Sync and V-Sync at the refresh rate behave nearly the same. The difference is only in lag, because G-Sync doesn't need an additional buffer like V-Sync does.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
So laptops are what AMD was talking about? Ehhh....

Any word on "FreeSync" working on displays that are not connected the way a laptop's is?
 

omeds

Senior member
Dec 14, 2011
646
13
81
It's still at the maximum internally, from which it polls, but that's not what is presented by the display - which is what it's all about.

Yes, if the display is refreshed every 22.2ms, that means it's 45Hz. Refreshing every 16.6ms means it's 60Hz. Every 8.3ms means 120Hz.
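
That's just the interval-to-rate arithmetic, for anyone who wants to check it (plain Python, nothing vendor-specific):

Code:
for interval_ms in (22.2, 16.6, 8.3):
    print(interval_ms, "ms ->", round(1000 / interval_ms, 1), "Hz")
# 22.2 ms -> 45.0 Hz, 16.6 ms -> 60.2 Hz, 8.3 ms -> 120.5 Hz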
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
There is a difference:
The refresh rate is not 45Hz, it's still 144Hz. But the G-Sync module triggers a refresh every 22.2ms because that's how often frames are coming in.

That's the reason why G-Sync works without a buffer. There is a refresh whenever a new frame comes in.
 

dacostafilipe

Senior member
Oct 10, 2013
805
309
136
That's the reason why G-Sync works without a buffer. There is a refresh whenever a new frame comes in.

You still need a buffer, because if the next frame takes too long to render and your VBLANK expires, you have to reuse the last frame. That's why there is RAM on the G-Sync controller.
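
Here's a rough Python sketch of that timeout behaviour (my own approximation; the maximum time a panel can hold VBLANK isn't public, so the 33.3ms limit below is just a guess):

Code:
MAX_VBLANK_MS = 33.3   # assumed panel limit: it must be refreshed at least every ~33 ms

def refresh_events(frame_arrivals_ms, duration_ms):
    # Refresh when a new frame arrives; if none arrives before the VBLANK
    # limit expires, redraw the buffered copy of the previous frame.
    events, last_refresh = [], 0.0
    pending = sorted(frame_arrivals_ms)
    while last_refresh < duration_ms:                 # simulate roughly duration_ms
        deadline = round(last_refresh + MAX_VBLANK_MS, 1)
        if pending and pending[0] <= deadline:
            t, label = pending.pop(0), "new frame"
        else:
            t, label = deadline, "repeat last frame"  # this is what the RAM is for
        events.append((t, label))
        last_refresh = t
    return events

# Frames arriving at 20, 45 and 120 ms: the long gap forces the module to
# redraw the buffered frame twice before the next new one shows up.
print(refresh_events([20, 45, 120], duration_ms=150))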