[TechPowerUp article] FreeSync explained in more detail

omeds

Senior member
Dec 14, 2011
There is a difference:
The refresh rate is not at 45; it is still 144 Hz. But the G-Sync module triggers a refresh every 22.2 ms, because that is how often frames are coming in.

That's the reason why G-Sync works without a buffer: a refresh happens whenever a new frame comes in.

Yes, it's at 144 Hz internally, with everything that goes with that, like less input lag; that is the rate it polls at. But the mere fact that you said it triggers a refresh every 22.2 ms means the display is outputting 45 Hz. The triggered refreshes are the final output, and if they come every 22.2 ms, that IS 45 Hz.
 

sontin

Diamond Member
Sep 12, 2011
No, because the next frame could come faster - after 16.6 ms.
There is no refresh rate at all.
 

BrightCandle

Diamond Member
Mar 15, 2007
The Tech Report captured high-speed video as part of their review. The evidence there suggests that the refresh of the monitor is controlled by the graphics card, and that the display refreshes at the frequency of the GPU's updates. There is no repeat refresh of the same frame when you are below the monitor's peak refresh rate.

It's also precisely in line with what Nvidia has been telling us: that the monitor's next frame is determined by availability on the GPU, and that a frame can be displayed for a minimum of 1000/144 ms and a maximum of 1000/30 ms.

The evidence we have points to it working precisely as Nvidia has described (which they have done in quite a lot of detail).
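
To put rough numbers on that timing window, here is a quick Python sketch (purely illustrative; the constant names and the forced re-scan loop are just my reading of the 1000/144 ms minimum and 1000/30 ms maximum, not NVIDIA's actual logic):

MIN_HOLD_MS = 1000 / 144   # shortest time a frame can be on screen (144 Hz panel)
MAX_HOLD_MS = 1000 / 30    # longest a frame may stay before the panel re-scans it

def refresh_times(frame_ready_ms):
    """Given GPU frame-completion timestamps (ms), return panel refresh timestamps."""
    refreshes = []
    last = 0.0
    for t in frame_ready_ms:
        # If the GPU is slower than 30 fps, the old frame is re-scanned.
        while t - last > MAX_HOLD_MS:
            last += MAX_HOLD_MS
            refreshes.append(last)
        # A new frame cannot be scanned out sooner than the minimum hold time.
        last = max(t, last + MIN_HOLD_MS)
        refreshes.append(last)
    return refreshes

# Frames arriving every 22.2 ms (45 fps) simply produce a refresh every 22.2 ms.
print(refresh_times([22.2, 44.4, 66.6, 88.8]))

In other words, refreshes track frame delivery one-to-one as long as the GPU stays between 30 and 144 fps.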
 

sontin

Diamond Member
Sep 12, 2011
Your refresh rate depends on the time between frames. Because with G-Sync every frame stays up only until the next frame comes in, every new frame means a new refresh cycle.

If your frames come in 50% at 16.6 ms and 50% at 25 ms, you get an average refresh rate of roughly 48 Hz. But actually there is no static refresh rate at all, because the frame time is variable and never constant over any 1000 ms window.
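
A throwaway Python check of that average (assuming exactly half the frames at each frame time):

# Effective rate = frames / total time, not the average of 60 Hz and 40 Hz.
frame_times_ms = [16.6, 25.0] * 50                      # 100 frames, a 50/50 mix
effective_hz = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
print(round(effective_hz, 1))                           # ~48.1 Hz on average, and never static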
 

Paul98

Diamond Member
Jan 31, 2010
No, I'm not confusing anything. Triple buffering doesn't have to be set up to render ahead, though it allows it. My point about the comment that Freesync needs triple buffering to do what G-sync does is that G-sync does not need triple buffering, and the only reason you would need it with variable refreshes is that you would need to know ahead of time how long to display an image, which makes it a look-ahead system.

Why would they need triple buffering with a variable refresh rate if they didn't need to render ahead?

As a few people have described before, Freesync appears to let them dynamically change the refresh rate. If all they are doing is changing the refresh rate, they have to know how long an image will take to render in order to set the refresh interval so that the image is ready to be displayed when the next vertical blanking period comes around. To do that, you need to render one frame ahead so that you know how long the frames are taking.

G-sync takes a different approach. It does not set refresh rates; instead, it calls a refresh, then holds in vertical blanking until the GPU tells it to start a new refresh. With this method, you do not need triple buffering.

This is based on the info we have been given, which is still lacking, but what we have been told suggests this will be a difference. Freesync, the way it has been described, sounds like it will add about 17-33 ms of extra latency (the current frame time in ms). Of course, with DirectX and triple buffering (most games), the same thing happens once your FPS reaches your refresh rate, so it isn't necessarily that terrible, just not as good.

NOTE: In DirectX, with triple buffering, every frame does have to be displayed. OpenGL is different.

Three buffers is not the same as "triple buffering". "Triple buffering" uses three buffers, but so can a render-ahead queue. The problem is that most people simply use the phrase "triple buffering" for anything with three buffers, which makes it hard to know what they are talking about. One approach sends the most recently rendered frame to the screen and can drop the old frame; the other can't drop an old frame and must show every frame rendered.
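
As a rough illustration of that distinction (a sketch only; the class names are invented for the example, and neither is a real Direct3D or OpenGL API):

from collections import deque

class DropOldestBackBuffer:
    """'True' triple buffering: scan-out always takes the most recent completed
    frame, so a stale back buffer can be overwritten without ever being shown."""
    def __init__(self):
        self.latest = None
    def submit(self, frame):
        self.latest = frame            # overwrite; the old frame may be dropped
    def scan_out(self):
        return self.latest

class RenderAheadQueue:
    """Render-ahead queue (e.g. a three-deep swap chain): frames are shown
    strictly in order, so every rendered frame is displayed and old frames add latency."""
    def __init__(self, depth=3):
        self.frames = deque()
        self.depth = depth
    def submit(self, frame):
        if len(self.frames) < self.depth:
            self.frames.append(frame)  # otherwise the renderer would have to wait
            return True
        return False
    def scan_out(self):
        return self.frames.popleft() if self.frames else None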
 

Paul98

Diamond Member
Jan 31, 2010
There is a refresh rate, it just isn't static. GSync uses VBLANK and adjusts the refresh rate to the GPU. It does have a max and minimum refresh rate though.
 

omeds

Senior member
Dec 14, 2011
The Tech Report captured high-speed video as part of their review. The evidence there suggests that the refresh of the monitor is controlled by the graphics card, and that the display refreshes at the frequency of the GPU's updates. There is no repeat refresh of the same frame when you are below the monitor's peak refresh rate.

It's also precisely in line with what Nvidia has been telling us: that the monitor's next frame is determined by availability on the GPU, and that a frame can be displayed for a minimum of 1000/144 ms and a maximum of 1000/30 ms.

The evidence we have points to it working precisely as Nvidia has described (which they have done in quite a lot of detail).

Correct :thumbsup:

There is a refresh rate, it just isn't static. GSync uses VBLANK and adjusts the refresh rate to the GPU. It does have a max and minimum refresh rate though.

Correct :thumbsup:

This makes no sense.

Exactly. :thumbsup:

No, because the next frame could come faster - after 16.6 ms.

Of course it can; the refresh rate is variable to match the frame rate. That is G-Sync at work: if the next frame takes 16.6 ms, so does the next refresh; if the next frame takes 25 ms, so does the next refresh.


Sontin, don't confuse the polling rate and internal refresh rate with what is actually put out by the display. A 1:1 ratio to the user is the goal, and that is what G-Sync achieves.
 

bystander36

Diamond Member
Apr 1, 2013
Give me a video showing V-Sync, triple-buffered V-Sync and G-Sync, with precise timing information embedded into the picture.

With info currently known, I'm betting the latter two will look nearly identical.
Sadly, this is impossible to do without a G-sync monitor. Since our monitors' refresh rates cannot vary, we cannot see what a G-sync monitor displays.
 

bystander36

Diamond Member
Apr 1, 2013
Three buffers is not the same as "triple buffering". "Triple buffering" uses three buffers, but so can a render-ahead queue. The problem is that most people simply use the phrase "triple buffering" for anything with three buffers, which makes it hard to know what they are talking about. One approach sends the most recently rendered frame to the screen and can drop the old frame; the other can't drop an old frame and must show every frame rendered.
If that is the case, DirectX does not ever use triple buffering, it uses render ahead queue, except it doesn't always render ahead.

DirectX does force every frame rendered to be displayed.

http://en.wikipedia.org/wiki/Multiple_buffering
Another method of triple buffering involves synchronizing with the monitor frame rate. Drawing is not done if both back buffers contain finished images that have not been displayed yet. This avoids wasting CPU drawing undisplayed images and also results in a more constant frame rate (smoother movement of moving objects), but with increased latency.[1] This is the case when using triple buffering in DirectX, where a chain of 3 buffers are rendered and always displayed.
The interesting thing is, they do call this triple buffering.

G-sync does not use triple buffering to do its thing. AMD said they thought they could mimic G-sync with the use of triple buffering and VBlank. Why would AMD's method require the use of triple buffering, and not G-sync?

Could it be that G-sync can send an image to refresh, and hold in vertical blanking mode, then dynamically call a new refresh, but Freesync has to tell the monitor the refresh rate every refresh, meaning they have to either predict the frame rendering time, or simply render ahead?

They could have been more specific as to why they needed triple buffering, but I cannot think of another reason why they would need it with a variable refresh system. It suggests there will be some latency with it at the very least.
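
To make that speculation concrete (and it is only speculation, since the details weren't public), a minimal Python sketch of the two models: one where the refresh interval has to be declared before the next vertical blank, so a frame can only be shown roughly one interval after it finishes, and one where scan-out is simply triggered when the frame is ready:

def scanout_declared_interval(render_ms):
    """Refresh interval must be programmed ahead of time, so each frame is shown
    roughly one frame time after it finishes rendering (the render-ahead case)."""
    t, shown = 0.0, []
    for dt in render_ms:
        t += dt                 # frame finishes rendering here...
        shown.append(t + dt)    # ...but is displayed one declared interval later
    return shown

def scanout_triggered(render_ms):
    """G-Sync-style: scan-out is triggered as soon as the frame is ready."""
    t, shown = 0.0, []
    for dt in render_ms:
        t += dt
        shown.append(t)
    return shown

# With ~17-33 ms frame times, the declared-interval model shows each frame about
# one frame time later, which is the extra latency discussed earlier in the thread.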
 

Paul98

Diamond Member
Jan 31, 2010
If that is the case, DirectX does not ever use triple buffering, it uses render ahead queue, except it doesn't always render ahead.

DirectX does force every frame rendered to be displayed.

http://en.wikipedia.org/wiki/Multiple_buffering
The interesting thing is, they do call this triple buffering.

G-sync does not use triple buffering to do its thing. AMD said they thought they could mimic G-sync with the use of triple buffering and VBlank. Why would AMD's method require the use of triple buffering, and not G-sync?

Could it be that G-sync can send an image to refresh, and hold in vertical blanking mode, then dynamically call a new refresh, but Freesync has to tell the monitor the refresh rate every refresh, meaning they have to either predict the frame rendering time, or simply render ahead?

They could have been more specific as to why they needed triple buffering, but I cannot think of another reason why they would need it with a variable refresh system. It suggests there will be some latency with it at the very least.

Yes, as I said, most still use the term "triple buffering" when in fact it is a render-ahead queue. In DirectX they call it a swap chain, and yes, it isn't "triple buffering", it is a render-ahead queue.

GSync still has a maximum refresh rate. Does the GSync frame buffer act as the front buffer, so the graphics card can continue to work without waiting for the monitor to refresh?
 

bystander36

Diamond Member
Apr 1, 2013
Yes, as I said, most still use the term "triple buffering" when in fact it is a render-ahead queue. In DirectX they call it a swap chain, and yes, it isn't "triple buffering", it is a render-ahead queue.

GSync still has a maximum refresh rate. Does the GSync frame buffer act as the front buffer, so the graphics card can continue to work without waiting for the monitor to refresh?
Triple buffering is not required, and in fact it's preferable not to use it once you reach your max refresh rate (with V-sync).

If you reach the max refresh rate, the GPU has no reason not to wait; there is no advantage in rendering ahead. All that does is create a render-ahead situation that adds latency, the same thing that happens now with V-sync, triple buffering and DirectX (the reason many people use 59 FPS limiters).

If your monitor can only display 144 Hz, why would you want to create more frames?
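
For reference, the 59 FPS limiter mentioned above boils down to something like this (an illustrative Python sketch for a 60 Hz panel; the same idea applies just below any refresh cap):

import time

TARGET_FRAME_S = 1.0 / 59.0      # just under the 60 Hz refresh

def limited_frame(render_one_frame):
    """Cap the frame rate slightly below the refresh rate so the render-ahead
    queue never fills and V-sync doesn't add queued-frame latency."""
    start = time.perf_counter()
    render_one_frame()
    elapsed = time.perf_counter() - start
    if elapsed < TARGET_FRAME_S:
        time.sleep(TARGET_FRAME_S - elapsed)   # hold back before starting the next frame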
 

Paul98

Diamond Member
Jan 31, 2010
Triple buffering is not required, and in fact it's preferable not to use it once you reach your max refresh rate (with V-sync).

If you reach the max refresh rate, the GPU has no reason not to wait; there is no advantage in rendering ahead. All that does is create a render-ahead situation that adds latency, the same thing that happens now with V-sync, triple buffering and DirectX (the reason many people use 59 FPS limiters).

If your monitor can only display 144 Hz, why would you want to create more frames?

Yep, that way you have very little input lag.
 

bystander36

Diamond Member
Apr 1, 2013
Yep, that way you have very little input lag.
If your answer was in response to this:
If your monitor can only display 144 Hz, why would you want to create more frames?
You do realize that DirectX requires every frame rendered to be displayed, so how would it reduce latency? OpenGL could, but not DirectX. It'll increase latency, as you end up rendering ahead, but still displaying those old frames.

And why would it be required to mimic G-sync, when G-sync doesn't do that?

In DirectX, triple buffering causes extra latency when your FPS reaches your refresh rate due to this.
 

Paul98

Diamond Member
Jan 31, 2010
If your answer was in response to this:

You do realize that DirectX requires every frame rendered to be displayed, so how would it reduce latency? OpenGL could, but not DirectX. It'll increase latency, as you end up rendering ahead, but still displaying those old frames.

And why would it be required to mimic G-sync, when G-sync doesn't do that?

In DirectX, triple buffering causes extra latency when your FPS reaches your refresh rate due to this.

No shit, they use a render-ahead queue. I don't know what is so hard to understand. This is not the "triple buffering" that only shows the most recently rendered frame.
 

bystander36

Diamond Member
Apr 1, 2013
No shit, they use a render-ahead queue. I don't know what is so hard to understand. This is not the "triple buffering" that only shows the most recently rendered frame.
Triple buffering, in the way you are calling it, is not possible with DirectX. If they truly mean what you are talking about, then Freesync will only work with OpenGL and Mantle games.

But as I linked and quoted before, they do call your "render ahead" buffer system "triple buffering". Just because you don't call it "triple buffering" does not mean AMD doesn't.

And what is so hard to understand here? Either they are using "triple buffering" as a render ahead queue as I stated, or they are severely limiting when Freesync can be used. Which do you think is more likely? G-sync doesn't use triple buffering, why would AMD say they'd need to use triple buffering to mimic G-sync if the use of triple buffering was only needed to reduce latency when you exceed your hz cap?

I do realize we are in the dark still. They do need to answer these questions for us to know. They may not be answering them yet, because they are still working on the tech.
 

Gloomy

Golden Member
Oct 12, 2010
I don't think it uses eDP internally. Also, you can't connect the display via eDP from your desktop, for that matter.

Also, it would need to be eDP 1.3 and contain a display framebuffer. Read: a G-Sync-style solution.

It does use eDP, I'm 100% certain of it. And eDP is compatible with DP, you just need to Frankenstein a cable together, and you can connect the panel directly to your desktop. ;)

My monitor doesn't support Freesync, but it's proof eDP is a little more widespread than we thought.
 

Keysplayr

Elite Member
Jan 16, 2003
It does use eDP, I'm 100% certain of it. And eDP is compatible with DP, you just need to Frankenstein a cable together, and you can connect the panel directly to your desktop. ;)

My monitor doesn't support Freesync, but it's proof eDP is a little more widespread than we thought.

What's the make and model of your desktop monitor? And the version number, if available.
 

OCGuy

Lifer
Jul 12, 2000
"Freesync" isn't even a real thing....it was based on a laptop demonstration which means nothing to desktops.
 

Gloomy

Golden Member
Oct 12, 2010
What's the make and model of your desktop monitor? And the version number, if available.

It's a Crossover 27Q. You can see teardowns on overclock.net, if you don't believe me... but why would I lie? ' 3'
 