[TechPowerUp article] FreeSync explained in more detail


ShintaiDK

Lifer
Apr 22, 2012
The thing is, a lot of laptops already have the feature that's needed to get it working.

So in a lot of laptops you'll be able to get FreeSync, in 2014.

Also, DisplayPort 1.3 should be like 90 days away or so, and hopefully at some point desktop monitors that support it will start selling.

But you're right, on the desktop 2014 probably won't have many monitors that can do it.

That's not true.

I doubt a lot of laptops actually do. It's not a laptop standard as such either, since you need more than eDP for FreeSync to actually work.

DP 1.3 90 days away? Q2 is a very optimistic prediction.

And then don't expect any monitor to support it for another 6-12 months. HDMI 2.0 was just released, putting more pressure on the hardly used DP. It's not even certain yet that DP 1.3 will use the same connectors as DP 1.2.

Also, to use DP 1.3 you need a new GPU, don't you? No current GPU supports DP 1.3, as far as I know. ;)
 

ashetos

Senior member
Jul 23, 2013
I'm trying to understand the relevant technologies so don't be too harsh.

It seems that FreeSync introduces a one-frame delay in the pipeline, which G-Sync probably doesn't (?), which would give G-Sync the edge of one frame less input lag. It doesn't seem that big of a deal though, and otherwise they seem to be pretty similar techniques to achieve similar results.

However, it seems to me that the same effect can be achieved with true triple buffering (not just three buffers). Triple buffering should provide the same smoothness as FreeSync or G-Sync without the need for any extra hardware whatsoever.

There are, however, two drawbacks to triple buffering:
1. It increases GPU utilization to 100% (GPU power increase)
2. It does not take advantage of variable refresh rate in monitors (monitor power increase)

For problem #1, a frame-rate limiter in the game's render engine should be an adequate solution.

For problem #2 I don't see a solution.

So finally, my point really is: why don't we just use triple buffering in the first place? It really is ancient technology.
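To show what I mean by "true" triple buffering, here is a rough Python-style sketch; all the names are mine (it's not any real API) and I've left out locking for clarity. The fps_cap bit is the frame-rate limiter I suggested for problem #1:

import time

front = 0            # buffer currently being scanned out to the monitor
ready = None         # newest completed frame, not yet shown
spare = [1, 2]       # free render targets (3 buffers total)

def render_loop(render_frame, fps_cap=None):
    # "True" triple buffering: rendering never blocks on the display,
    # because a spare buffer is always available to render into.
    global ready
    min_frame_time = 1.0 / fps_cap if fps_cap else 0.0
    while True:
        start = time.perf_counter()
        target = spare.pop()
        render_frame(target)
        if ready is not None:
            spare.append(ready)    # discard the stale, never-shown frame
        ready = target
        # optional frame-rate limiter, addressing drawback #1 above
        wait = min_frame_time - (time.perf_counter() - start)
        if wait > 0:
            time.sleep(wait)

def on_vblank():
    # Called at the monitor's fixed refresh: flip to the newest frame.
    global front, ready
    if ready is not None:
        spare.append(front)
        front, ready = ready, None
    # else the panel re-shows the old frame -- that repeat is the stutter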
 

dacostafilipe

Senior member
Oct 10, 2013
I'm trying to understand the relevant technologies so don't be too harsh.

Most people here (including myself) are doing the same. We are trying to understand.

It seems that FreeSync introduces a one-frame delay in the pipeline, which G-Sync probably doesn't (?), which would give G-Sync the edge of one frame less input lag. It doesn't seem that big of a deal though, and otherwise they seem to be pretty similar techniques to achieve similar results.

Can you please elaborate on this? I don't see how it would add a one-frame delay. In both cases the GPU sends the frame as soon as it can (depending on the actual screen state). The only delay you see is if the last refresh "failed" and had to reuse an old image, so the new one has to wait.

However, it seems to me that the same effect can be achieved with true triple buffering (not just three buffers). Triple buffering should provide the same smoothness as FreeSync or G-Sync without the need for any extra hardware whatsoever.

There are, however, two drawbacks to triple buffering:
1. It increases GPU utilization to 100% (GPU power increase)
2. It does not take advantage of variable refresh rate in monitors (monitor power increase)

Triple buffering (flip the image to a second buffer and not straight to video memory) is really fast if supported by the GPU, and it supports variable refresh times.

So finally, my point really is: why don't we just use triple buffering in the first place? It really is ancient technology.

Maybe because you need a third buffer to use it, and that requires more VRAM. We should ask a 3D engine dev.
 

ashetos

Senior member
Jul 23, 2013
Can you please elaborate on this? I don't see how it would add a one-frame delay. In both cases the GPU sends the frame as soon as it can (depending on the actual screen state). The only delay you see is if the last refresh "failed" and had to reuse an old image, so the new one has to wait.

The way I understood it is that in the case of G-Sync, the monitor itself will delay VBLANK until it has received the current frame, whereas in the case of FreeSync you either have to predict the VBLANK duration and instruct the monitor accordingly, or always stay one frame behind and avoid predictions altogether. Maybe I got things wrong though, I'm not confident.

Triple buffering (flip the image to a second buffer and not straight to video memory) is really fast if supported by the GPU, and it supports variable refresh times.

I was viewing variable refresh rate as a different technique from triple buffering; it hadn't occurred to me that they could be combined, which sounds pretty cool. But then again, if we have variable refresh rates we probably don't need triple buffering at all.
 

Keysplayr

Elite Member
Jan 16, 2003
His first sentence is not correct.



It will with a compatible screen. The tech can make it happen, as it's more or less the same tech used by nVidia.

Now, we don't have such screens for desktop at the moment (I did not find any on Google) and we don't know when they will be available. FreeSync is, for now, just a tech, not a real product like G-Sync is.

You simply added a qualifier, dude. To make a compatible screen, one has to........ (you finish this sentence).
 

blackened23

Diamond Member
Jul 26, 2011
FreeSync is, for now, just a tech, not a real product like G-Sync is.

Now that just isn't a true statement in the case of G-Sync. That's what AMD's marketing would have people believe. They would have AMD fans everywhere think that a free G-Sync alternative was coming, and soon. Alas, that is not the case. FreeSync probably won't happen in 2014, seeing as it requires new controller boards, DP 1.3 (not finalized yet) and new monitor firmware. Do you see the problem here? AMD marketing led AMD fans to believe that it was free, that it was coming soon, and that it was a viable alternative. I said it before: what AMD says and what AMD does are two different things. This has happened so many times in the past 5 years.

This is why you never believe a word that AMD marketing tells you. Treat everything they say with caution. Everything you see on forums, take it with a grain of salt. Some people bought it though. Like I said, I'm glad AMD's head GPU engineer came clean with the truth. And that truth is, FreeSync isn't free, nor is it coming anytime soon. So mad respect to their head GPU engineer who revealed the truth about FreeSync. Had he not revealed the truth, this nonsense from AMD marketing would be perpetuated across forums for months on end. "Just wait guys, FreeSync is coming soon." Which isn't happening.

Lastly, G-Sync monitors are available now in limited quantities. They aren't cheap, but they're available. So I'm not sure what's up with your last statement.
 

dacostafilipe

Senior member
Oct 10, 2013
The way I understood it is that in the case of G-Sync, the monitor itself will delay VBLANK until it has received the current frame, whereas in the case of FreeSync you either have to predict the VBLANK duration and instruct the monitor accordingly, or always stay one frame behind and avoid predictions altogether. Maybe I got things wrong though, I'm not confident.

As far as I know, you can't "stop/shorten" a VBLANK, so in both systems you have to predict the next blanking interval.

I was viewing variable refresh rate as a different technique from triple buffering; it hadn't occurred to me that they could be combined, which sounds pretty cool. But then again, if we have variable refresh rates we probably don't need triple buffering at all.

Yes you do, because you need a copy of the last completed frame in case the rendering time is longer than the scan-out time.
 

Erenhardt

Diamond Member
Dec 1, 2012
Because the display doesn't know when the next frame will come and yet needs to refresh itself.

Funny thing is, it doesn't really. It does refresh, but it could very well not do so...

Are there any other screen types aside from CRT that need to refresh constantly?
 

blackened23

Diamond Member
Jul 26, 2011
Funny thing is, it doesn't really. It does refresh, but it could very well not do so...

Are there any other screen types aside from CRT that need to refresh constantly?

I have literally no idea what you're talking about, as it is not based in truth. A typical LCD monitor will refresh at 60 Hz regardless of what the GPU is spitting at it. So if a GPU is spitting out 45 frames while the refresh is 60 Hz, the monitor will continue to run at 60 Hz. Net result? Stuttering, since the GPU is at 45 frames while the panel is at 60 Hz. That's the entire problem: GPUs and panels are not synced.

Maybe you should reconsider your statement there. The entire purpose of G-Sync is to address the fact that monitors currently do not sync with the GPU. All monitors refresh independently of the GPU, so when the framerate doesn't align you get tearing and stuttering.
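To put numbers on that (my own quick arithmetic, not from any review): at 45 FPS on a 60 Hz panel, the panel draws 60 refreshes per second but only receives 45 new frames, so 15 refreshes have to re-show an old frame. Roughly every third frame stays on screen twice as long:

fps, hz = 45, 60
refresh_ms = 1000 / hz    # 16.7 ms per panel refresh
stale = hz - fps          # 15 refreshes/sec must repeat an old frame
print(refresh_ms)         # 16.67
print(stale / fps)        # 0.33 -> every 3rd frame is held for two
                          # refreshes: 16.7, 16.7, 33.3 ms, repeating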
 

ashetos

Senior member
Jul 23, 2013
Yes you do, because you need a copy of the last completed frame in case the rendering time is longer than the scan-out time.

Indeed, but double buffering should suffice in this case, right?

Anyway, I will repeat my point. Until, some years from now (hopefully), we have a new standard for monitors everywhere where the monitor syncs with the GPU instead of the opposite, why don't we push for the use of triple buffering in software to eliminate micro-stuttering? Why do we need to buy custom, incompatible hardware that only works with certain operating systems or full-screen scenarios, etc.? Software already solves the problem.
 

ShintaiDK

Lifer
Apr 22, 2012
Just a reminder (again), no GPU currently supports DP 1.3.

So a new GPU + a new monitor are needed. And that's assuming you find something that actually fully supports it.
 

blackened23

Diamond Member
Jul 26, 2011
Software already solves the problem.

This is what AMD's marketing stated, and it has already been proven wrong. AMD's FreeSync requires new monitor controller boards. So it does require new hardware - it isn't entirely software based.

From PCPer: http://www.pcper.com/reviews/Graphi...h-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync

To be clear, just because a monitor would run with DisplayPort 1.3 doesn't guarantee this feature would work. It also requires the controller on the display to understand and be compatible with the variable refresh portions of the spec, which with eDP 1.0 at least, isn't required. AMD is hoping that with the awareness they are building with stories like this display designers will actually increase the speed of DP 1.3 adoption and include support for variable refresh rate with them.

FreeSync requires a new controller module embedded into desktop panels, which no panels yet have. These controller boards are NOT part of the DP 1.3 standard, so AMD has to convince monitor manufacturers to add a special controller module. That sure does sound an awful lot like what G-Sync is doing, doesn't it? G-Sync is a hardware solution which requires an additional controller board embedded in the panel. FreeSync also requires a hardware solution - it also requires a controller board embedded in the panel. NEITHER of these is part of the DP 1.3 specification. That doesn't sound free to me. That doesn't sound like a software solution to me.
 

ashetos

Senior member
Jul 23, 2013
This is what AMD's marketing stated, and it has already been proven wrong. AMD's FreeSync requires new monitor controller boards. So it does require new hardware - it isn't entirely software based.

Add this to the long list of AMD marketing lies.

I'm not talking about FreeSync, I'm talking about triple buffering. We don't need new monitor technology to eliminate micro-stuttering. Triple buffering solves the problem (I think?).
 

bystander36

Diamond Member
Apr 1, 2013
That's incorrect, it's the other way around:

After the GPU renders a frame with G-Sync enabled, it will start polling the display to see if it’s in a VBLANK period or not to ensure that the GPU won’t scan in the middle of a scan out.
So, why does the GPU need this polling in the first place? If the frame just needs to be pushed to the controller to be displayed, why the polling?

But the interesting question is: What does the G-Sync controller do that can't be done on the GPU?

We need to wait for more information about FreeSync, but as of now, both seem similar to me. I'm not saying that FreeSync is better than G-Sync, or that G-Sync is a bad implementation; no, not at all.
There are two possibilities:
1) I believe your quote is incorrect. That can't allow for varying refresh rates. The whole point of G-Sync is that the display reacts to the GPU, and not the other way around. If the GPU is only waiting on VBLANK, and nothing else, that is no different from V-sync.

2) The only other alternative is that the monitor is put into a state where it attempts to refresh as fast as possible, but once in VBLANK it stops refreshing, then waits for an update message from the GPU before starting a new refresh.

Someone could simply have made a typo or a misunderstanding when writing the quote you posted. These things do happen; or it is option 2.

Number 2 does sound like it should work well, but it is not allowed by current tech, which is why they had to add their chip into the monitor. Number 1 doesn't work with current tech either.
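For what it's worth, here is how I picture option 2 working; this is pure speculation on my part, a toy model with invented names, not NVIDIA's actual protocol:

import threading, time

frame_pushed = threading.Event()
MAX_HOLD = 1 / 30          # panel refreshes itself if held in VBLANK too long

def panel_loop(scan_out_time=1 / 144):
    while True:
        time.sleep(scan_out_time)            # scan the current frame out
        # park in VBLANK until the GPU pushes a frame, or time out
        got_frame = frame_pushed.wait(timeout=MAX_HOLD)
        frame_pushed.clear()
        # if got_frame is False, the panel re-shows the old frame (the
        # below-30-FPS fallback); either way a new scan starts right away

def gpu_present():
    # Called whenever the GPU finishes rendering a frame: the display
    # reacts to the GPU instead of the GPU waiting on the display.
    frame_pushed.set()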
 

blackened23

Diamond Member
Jul 26, 2011
I'm not talking about FreeSync, I'm talking about triple buffering. We don't need new monitor technology to eliminate micro-stuttering. Triple buffering solves the problem (I think?).



Triple buffering most certainly does not tackle the issue of a variable framerate causing stutter. It also doesn't address input lag. If the monitor and GPU aren't synced, and the framerate is variable, you get stutter whether you have triple buffering or not.
 

ashetos

Senior member
Jul 23, 2013
Triple buffering most certainly does not tackle the issue of a variable framerate causing stutter. It also doesn't address input lag. If the monitor and GPU aren't synced, and the framerate is variable, you get stutter whether you have triple buffering or not.

You may be right; I have not actually seen and compared the solutions. G-Sync, though, is optimal for framerates between 30 and 60 on a 60 Hz monitor, and this framerate range is also where triple buffering excels versus double buffering.

I would love to see a comparison among G-Sync, V-sync with double buffering, and V-sync with triple buffering. I would expect triple buffering to be less problematic than double buffering in the 30-60 FPS range.
 

biostud

Lifer
Feb 27, 2003
Wouldn't G-Sync/FreeSync make a lot of sense on IPS screens, since they don't do 120 Hz? (Not counting the hacked Korean models.)
 

bystander36

Diamond Member
Apr 1, 2013
You may be right; I have not actually seen and compared the solutions. G-Sync, though, is optimal for framerates between 30 and 60 on a 60 Hz monitor, and this framerate range is also where triple buffering excels versus double buffering.

I would love to see a comparison among G-Sync, V-sync with double buffering, and V-sync with triple buffering. I would expect triple buffering to be less problematic than double buffering in the 30-60 FPS range.
I'll try to explain this so you understand the difference. G-Sync excels all the way to 144 Hz, btw. Maybe higher (I've seen things suggesting 168).

Triple buffering allows the GPU to create frames at a variable rate when your FPS doesn't match your refresh rate. This allows FPS between 30 and 60. However, the monitor's refresh rate is still fixed. What that means is that the display times of those frames will be something like this: 16.7 ms, 16.7 ms, 33.3 ms, 16.7 ms, 33.3 ms, 16.7 ms, 16.7 ms.

G-Sync and FreeSync allow variable display times, so 50 FPS may look like this: 20 ms, 21 ms, 19 ms, 20 ms, 20 ms, 19 ms, 21.5 ms...
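If you want to see both cadences fall out of the math, here is a small simulation I threw together (simplified, my own numbers; it assumes a frame flips at the first vblank after it finishes rendering):

import random
random.seed(0)

REFRESH = 1000 / 60        # 16.7 ms per cycle at 60 Hz
render = [random.uniform(19, 21.5) for _ in range(8)]   # ~50 FPS GPU

# Variable refresh (G-Sync/FreeSync model): display time == render time.
variable = [round(t, 1) for t in render]

# Fixed 60 Hz panel: a new frame can only appear on a vblank boundary.
flips, done, vblank = [], 0.0, 0.0
for t in render:
    done += t                  # moment this frame finishes rendering
    while vblank < done:       # first vblank after completion
        vblank += REFRESH
    flips.append(vblank)
    vblank += REFRESH          # the frame occupies at least this cycle

fixed = [round(b - a, 1) for a, b in zip(flips, flips[1:])]
print("fixed   :", fixed)      # a mix of 16.7s and 33.3s, as above
print("variable:", variable)   # ~19-21.5 ms each, as above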
 

ashetos

Senior member
Jul 23, 2013
I'll try to explain this so you understand the difference. G-Sync excels all the way to 144 Hz, btw. Maybe higher (I've seen things suggesting 168).

Triple buffering allows the GPU to create frames at a variable rate when your FPS doesn't match your refresh rate. This allows FPS between 30 and 60. However, the monitor's refresh rate is still fixed. What that means is that the display times of those frames will be something like this: 16.7 ms, 16.7 ms, 33.3 ms, 16.7 ms, 33.3 ms, 16.7 ms, 16.7 ms.

G-Sync and FreeSync allow variable display times, so 50 FPS may look like this: 20 ms, 21 ms, 19 ms, 20 ms, 20 ms, 19 ms, 21.5 ms...

Thanks for the analysis. I understand that a variable refresh rate is fundamentally a better solution. Would you expect the frames of double buffering to appear more "jumpy" than the frames of triple buffering?
 

bystander36

Diamond Member
Apr 1, 2013
Thanks for the analysis. I understand that a variable refresh rate is fundamentally a better solution. Would you expect the frames of double buffering to appear more "jumpy" than the frames of triple buffering?
Expect? I think the question is: do they appear more jumpy? They do. When I play a game and I'm no longer at 60 FPS on a 60 Hz monitor, I know immediately, as everything gets a bit stuttery. As soon as I hit the refresh rate again, it isn't.
 

KompuKare

Golden Member
Jul 28, 2009
Funny thing is, it doesn't really. It does refresh, but it could very well not do so...

Are there any other screen types aside from CRT that need to refresh constantly?

I have literally no idea what you're talking about, as it is not based in truth.

Well, I got what Erenhardt meant: the old CRTs actually needed to be refreshed (phosphor dots fade after so many milliseconds, etc.), which is why things like refresh rates, VBLANK, sync, and even scanlines still exist for historical reasons.

Now, while newer* technologies like LCDs do not have an infinite refresh rate (far from it, hence the GtG specs on LCD monitors etc.), they also have no actual need to be refreshed the way CRTs did, where the refresh starts in one corner and travels down one line at a time. But for historical reasons the signals sent down monitor cables are still similar to how they were in the CRT days (in the case of VGA they are identical).

But while a CRT had a gun which could only fire one electron beam at a time (I guess multiple guns would have been theoretically possible, but that would have meant a true monster of a tube), LCDs do not, and there is not really any reason why the pixels could not be refreshed in other ways: a region at a time, from both ends at once, whatever. The point being, there is no physical electron gun.

Now, whether that helps with the sync problem? Well, it might, but it would mean a totally new way of addressing the screen; probably with a buffer like G-Sync's, but perhaps sitting 'behind' each pixel. Not sure how feasible that is. But another thought: aren't the sync flickers noticeable precisely because monitors still get refreshed with what are effectively scanlines?

Oh, and even while reading the now-locked thread, it was obvious to me that FreeSync was never likely to be something that could be retrofitted. But I also think that variable VBLANK support will eventually find its way to desktop monitors for the same energy-saving reasons it is already in some laptops. Whether that helps FPS gamers? Don't know, but then I don't play FPS games. *shrug*

*I know: technically, the LCD effect was first discovered in the 19th century, before CRTs.
 

blackened23

Diamond Member
Jul 26, 2011
Thanks for the analysis. I understand that a variable refresh rate is fundamentally a better solution. Would you expect the frames of double buffering to appear more "jumpy" than the frames of triple buffering?

No offense, but do you actually play PC games? Not being facetious or anything. Maybe you just code, or do productivity-related tasks; then the questions you're asking might make sense. Anyone who has played a game with a variable framerate on a 60 Hz panel will notice stuttering. Not everyone games, of course; maybe you just do productivity stuff. I can't see any actual PC gamer asking this type of question.
 

ashetos

Senior member
Jul 23, 2013
No offense, but do you actually play PC games? Not being facetious or anything. Maybe you just code, or do productivity-related tasks; then the questions you're asking might make sense. Anyone who has played a game with a variable framerate on a 60 Hz panel will notice stuttering. Period.

Variable framerate results in stuttering. Triple buffering doesn't solve stuttering if your framerate swings between 30 and 60 constantly. This is blatantly obvious to anyone who actively games on the PC and isn't visually impaired. No disrespect intended or anything. But your questions are curious; not everyone games, of course, maybe you just do productivity stuff. I can't see any actual PC gamer asking this type of question. Anyone who games knows.

I play DOTA 2 on a 60 Hz monitor and I don't have any problems, because my GPU is strong enough to hit 60 FPS steadily. I also know that DirectX games do not actually use triple buffering.

This is why it is not easy for me to compare results for double buffering versus triple buffering; I don't know how you people do it. Also, I haven't seen G-Sync in action, but I reckon it was compared to double buffering?

I have not noticed any reference to triple buffering in any G-Sync review, so I have unanswered questions.
 

wand3r3r

Diamond Member
May 16, 2008
No offense, but do you actually play PC games? Not being facetious or anything. Maybe you just code, or do productivity-related tasks; then the questions you're asking might make sense. Anyone who has played a game with a variable framerate on a 60 Hz panel will notice stuttering. Not everyone games, of course; maybe you just do productivity stuff. I can't see any actual PC gamer asking this type of question.

:rolleyes: That post seems totally condescending.
 

bystander36

Diamond Member
Apr 1, 2013
I play DOTA 2 on a 60 Hz monitor and I don't have any problems, because my GPU is strong enough to hit 60 FPS steadily. I also know that DirectX games do not actually use triple buffering.

This is why it is not easy for me to compare results for double buffering versus triple buffering; I don't know how you people do it. Also, I haven't seen G-Sync in action, but I reckon it was compared to double buffering?

I have not noticed any reference to triple buffering in any G-Sync review, so I have unanswered questions.
Actually, almost all DX games use triple buffering, with some exceptions. The easiest way to tell is that if you are getting variable FPS between 40 and 60 (with V-sync on), without sudden drops to 30 FPS, triple buffering is being used. They just don't show you the option in most games (Far Cry 3 did).
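A quick way to convince yourself of that (a simplified model I wrote; it ignores the display side and just counts rendered frames, which is what an FPS counter sees): with V-sync and double buffering, a ~20 ms frame has to wait for the next vblank before rendering can continue, so you snap to 30 FPS; a third buffer lets rendering carry on, so the counter sits near the true rate:

import math

REFRESH = 1000 / 60    # 16.7 ms on a 60 Hz panel
RENDER = 20.0          # ms per frame: a GPU capable of ~50 FPS

def avg_fps(frames, buffers):
    t = 0.0
    for _ in range(frames):
        t += RENDER                              # render one frame
        if buffers == 2:                         # double buffer + V-sync:
            t = math.ceil(t / REFRESH) * REFRESH # block until a vblank
    return round(1000 * frames / t)

print(avg_fps(600, 2))   # 30 -- snapped down to a divisor of 60
print(avg_fps(600, 3))   # 50 -- free to sit anywhere between 30 and 60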