[TechPowerUp article] FreeSync explained in more detail

Page 9

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I thought they said they had no such plans, at least publicly.
I saw they said they had no public go-to-market plan. Of course, they are still developing the tech.

Their go-to-market plan may end up being simply having the driver ready to use when display companies adopt DP 1.3 and the CVT standards. That really is the way AMD has done things in the past, at least with HD3D.

But that may be all that is needed for it to launch.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
AMD specifically told PCPer that their intent with the demo was to spur the display manufacturers to ratify DP 1.3 and implement this optional feature so they could utilise it; they talked about it in one of their podcasts.

I don't think AMD's intention is to go much further with this; there isn't, after all, a lot more they can do.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It may be all that is needed, but there is more they can do to get manufacturers involved in getting monitors out there. They can go directly to them and help get their monitors ready, as Nvidia has done with G-Sync.
 

Shivansps

Diamond Member
Sep 11, 2013
3,917
1,570
136
That's the thing: if AMD does that, it's likely they'd have to use an FPGA to add the missing features right now; after all, that's what the G-Sync module does.
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
That's the thing: if AMD does that, it's likely they'd have to use an FPGA to add the missing features right now; after all, that's what the G-Sync module does.

No, the G-Sync module is quite different... AMD just needs display makers to support a standardized feature, while Nvidia does a much more elaborate and custom job.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
No, the G-Sync module is quite different... AMD just needs display makers to support a standardized feature, while Nvidia does a much more elaborate and custom job.
Just because it's a standard doesn't make it less elaborate, although it does sound like G-Sync is more specialized for this use.
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
Just because it's a standard doesn't make it less elaborate, although it does sound like G-Sync is more specialized for this use.

Well, since all AMD needs is a display controller that supports variable frame rates and VBLANK, as opposed to a 768 MB RAM framebuffer and an FPGA, I would say FreeSync is significantly less elaborate, albeit presumably working at lower quality than G-Sync (but that's impossible to say, since there's no actual demo for it).
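As a quick aside on the mechanism being described here: with CVT-style timings, the refresh interval is just total pixels per frame divided by the pixel clock, so holding the vertical blanking interval open for extra lines stretches the frame until the GPU has something new to show. A minimal Python sketch, using illustrative 1080p-class numbers (not taken from any real monitor's EDID):

```python
def refresh_interval_ms(htotal, vtotal, pixel_clock_hz):
    """Scan time for one frame: (total pixels per frame) / (pixel clock)."""
    return htotal * vtotal / pixel_clock_hz * 1000.0

# Illustrative CVT-style 1080p timing: 2200 x 1125 total at 148.5 MHz.
base = refresh_interval_ms(2200, 1125, 148_500_000)  # ~16.7 ms -> ~60 Hz

# Holding VBLANK open for an extra 1125 lines doubles the frame time,
# i.e. the panel effectively idles at ~30 Hz until the next frame arrives.
stretched = refresh_interval_ms(2200, 1125 + 1125, 148_500_000)  # ~33.3 ms
```

The point is only that varying VBLANK length is ordinary timing arithmetic for a display controller, which is why no G-Sync-style module is needed on the monitor side.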
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Well, since all AMD needs is a display controller that supports variable frame rates and VBLANK, as opposed to a 768 MB RAM framebuffer and an FPGA, I would say FreeSync is significantly less elaborate, albeit presumably working at lower quality than G-Sync (but that's impossible to say, since there's no actual demo for it).
I do agree, that is likely the case. I was just saying that being standardized doesn't mean less elaborate, or vice versa.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Well, since all AMD needs is a display controller that supports variable frame rates and VBLANK, as opposed to a 768 MB RAM framebuffer and an FPGA, I would say FreeSync is significantly less elaborate, albeit presumably working at lower quality than G-Sync (but that's impossible to say, since there's no actual demo for it).

The display controller they would need contains a framebuffer plus PSR, and it still wouldn't deliver the G-Sync experience. And it's certainly not "free". AMD should have called it what it is, R-sync/A-sync or whatever, instead of the misleading name.

G-Sync with an ASIC would drive the price down to peanuts and be a better solution, since we obviously won't get an industry standard that actually fixes it. Both FreeSync and G-Sync are nothing but hotfixes to the actual problem; G-Sync is just a much better way of doing it.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
The display controller they would need contains a framebuffer plus PSR, and it still wouldn't deliver the G-Sync experience. And it's certainly not "free". AMD should have called it what it is, R-sync/A-sync or whatever, instead of the misleading name.

G-Sync with an ASIC would drive the price down to peanuts and be a better solution, since we obviously won't get an industry standard that actually fixes it. Both FreeSync and G-Sync are nothing but hotfixes to the actual problem; G-Sync is just a much better way of doing it.

How can you comment on the G-Sync and FreeSync experience? I'm going to take a stab in the dark here and say you have not seen either implementation in use. Compressed YouTube videos of PR press conferences do not qualify anyone to call either one better than the other.

I'm willing to chalk up the reports from people who were at CES and saw both, saying G-Sync seemed slightly better, to the effect just not being as good on an off-the-shelf laptop with a trash TN panel as it is on the faster desktop panels the G-Sync module goes in.

I mean, the Toshiba laptops that I have seen have ridiculously bad displays in them: washed out, with extremely slow pixel response.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
@VulgarDisplay. I would assume the same, that the Toshiba laptops were 60 Hz displays, and if so, the benefits would not be apparent.

Here's how I think FreeSync would work compared to VSync (casual graphics fan here, so I may be way off in how I calculate the times).
Simple scenario where frame rendering takes 22 ms, 33 ms, 22 ms (frames finish at 22, 55, 77 on the time axis):
Columns: [screen refresh time], frame timestamp (or think of 1 ms = 1 ft, starting at 0), delta between screen refreshes
VSync on (60 Hz)
[33.3 ms], 0 ms, d(33.3 ms)
[66.7], 22, d(33.3)
[83.3], 55, d(16.6)

FreeSync with a 60 Hz monitor // Almost no difference from VSync; you would probably need a much more variable scenario (like 22 ms, 4 ms, 45 ms frame rendering) to see one. So it really just does the same as VSync on (i.e. no tearing).
[33.3 ms], 0 ms, d(33.3 ms)
[66.7], 22, d(33.3)
[83.3], 55, d(16.6)

FreeSync with a 144 Hz monitor (here the screen refresh deltas match the GPU rendering deltas; the refresh deltas also match the real-time deltas, so hopefully the mind will do the rest in believing in smooth frame rates).
[22 ms], 0, d(22 ms)
[56], 22, d(34)
[84], 55, d(28)
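Eymar's 60 Hz numbers can be sanity-checked with a few lines of Python. This is an idealized model (frames snap to fixed 60 Hz ticks under VSync, and scan out the instant they finish under ideal variable refresh); real pipelines are messier, and the function names here are made up for illustration.

```python
import math

REFRESH_MS = 1000.0 / 60.0  # one 60 Hz refresh interval, ~16.7 ms

def vsync_display_times(ready_ms, refresh_ms=REFRESH_MS):
    """With VSync, a finished frame waits for the next fixed refresh tick."""
    return [math.ceil(t / refresh_ms) * refresh_ms for t in ready_ms]

def vrr_display_times(ready_ms):
    """Idealized variable refresh: scan out as soon as each frame is done."""
    return list(ready_ms)

# Frames take 22, 33, 22 ms to render, finishing at t = 22, 55, 77 ms.
ready = [22.0, 55.0, 77.0]
print(vsync_display_times(ready))  # ~[33.3, 66.7, 83.3]
print(vrr_display_times(ready))    # [22.0, 55.0, 77.0]
```

The VSync output reproduces the [33.3], [66.7], [83.3] refreshes in the table above, while the idealized variable-refresh case simply tracks the render completion times.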
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
How can you comment on the G-Sync and FreeSync experience? I'm going to take a stab in the dark here and say you have not seen either implementation in use. Compressed YouTube videos of PR press conferences do not qualify anyone to call either one better than the other.

I'm willing to chalk up the reports from people who were at CES and saw both, saying G-Sync seemed slightly better, to the effect just not being as good on an off-the-shelf laptop with a trash TN panel as it is on the faster desktop panels the G-Sync module goes in.

I mean, the Toshiba laptops that I have seen have ridiculously bad displays in them: washed out, with extremely slow pixel response.
The fact that FreeSync requires them to use triple buffering, and G-Sync doesn't, would suggest there is going to be a latency difference. Also, the CVT VBLANK protocol they are using appears to be different from what G-Sync uses.
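A toy model of the latency gap being suggested here: if a finished frame has to sit in a buffer until the next fixed 60 Hz tick, the added delay averages about half a refresh interval, whereas a scheme that starts scanout immediately waits roughly zero. This is purely illustrative arithmetic, not a claim about either shipping implementation.

```python
import math

REFRESH_MS = 1000.0 / 60.0  # fixed 60 Hz refresh interval

def wait_for_next_tick(done_ms, refresh_ms=REFRESH_MS):
    """Delay from 'frame finished' to the fixed refresh tick that shows it."""
    return math.ceil(done_ms / refresh_ms) * refresh_ms - done_ms

# Sample frame-completion times spread evenly across one refresh interval.
samples = [i * REFRESH_MS / 100.0 for i in range(1, 100)]
avg_wait = sum(wait_for_next_tick(t) for t in samples) / len(samples)
# avg_wait lands near REFRESH_MS / 2, i.e. ~8.3 ms of queued latency
# that an immediate-scanout scheme would (ideally) not pay.
```

Whether the real-world difference is anywhere near that large is exactly the open question in this thread.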
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
The fact that FreeSync requires them to use triple buffering, and G-Sync doesn't, would suggest there is going to be a latency difference. Also, the CVT VBLANK protocol they are using appears to be different from what G-Sync uses.

It would suggest that, maybe, but no one knows for sure. I'm not knocking G-Sync, just so everyone is clear. I am, however, optimistic about FreeSync.

If it came from anywhere but an AMD press conference, people would be calling it the best thing in history. It's an alternative to G-Sync using an unratified standard from VESA. They could just as easily put a change into the standard that basically makes it operate more like G-Sync. Since it is likely going to require new internals in the monitor anyway, I don't see why not. If the demand for G-Sync is high enough, and I think it will be, they have even more reason to do it.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I agree, we don't have all the information needed to know yet. I think that is in part because AMD just tossed this together for a quick little demo to show what they are working on, and they are still ironing out the details. We'll have to wait to know if/what the differences are.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
Not getting my hopes up for either. GSync won't work with 4k displays, and FreeSync has a death sentence because AMD is the one working on it.
 

Zanovar

Diamond Member
Jan 21, 2011
3,446
232
106
Not getting my hopes up for either. GSync won't work with 4k displays, and FreeSync has a death sentence because AMD is the one working on it.

"GSync won't work with 4k displays" erm what?,ive only skimmed both threads
what what?
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
"GSync won't work with 4k displays" erm what?,ive only skimmed both threads
what what?

GSync doesn't work with multiple displays. Current 4K displays tile two signals side by side using DisplayPort daisy chaining (MST).

This also means no surround GSync setups, at the moment at least.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Woah, color me impressed :thumbsup:

If FreeSync isn't sorted out by the time I upgrade my monitor, I'm dropping AMD like a bad habit.

I don't think resolution really matters for this type of technology. 8K, 12K, whatever; it shouldn't make a difference.

That is a really nice monitor you have there, though. Korean 1440p, right?
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
And I'm not finding specs anywhere showing your Crossover 27Q has eDP. Help a guy out, please.

This is the controller board.

[Image: photo of the monitor's controller board]


See how the LVDS pinouts have no port? They're not used. Above it is the eDP port, labeled DP OUT.

I don't think resolution really matters for this type of technology. 8K, 12K, whatever; it shouldn't make a difference.

That is a really nice monitor you have there, though. Korean 1440p, right?

It's okay. I had to open it up and tape over some spots where the backlight was bleeding out inside the chassis. It's pretty ghetto; I'll upgrade next year (or this year, on Black Friday). It doesn't overclock; as I understand it, it's limited by the eDP connection. The ones that overclock use dual-LVDS and a different board.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
GSync doesn't work with multiple displays. Current 4K displays tile two signals side by side using DisplayPort daisy chaining (MST).

This also means no surround GSync setups, at the moment at least.
Yeah, currently G-Sync doesn't support surround, but it is quite possible that 4K displays will not always be two displays in one.

Anyways, I'm waiting for news that 3D Vision will work on one of these 1440p 120 Hz monitors they are starting to sell with G-Sync.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
Yeah, currently G-Sync doesn't support surround, but it is quite possible that 4K displays will not always be two displays in one.


Nope, that link Keys posted clearly showed two GSync modules connected to a 4k display. So they probably have surround GSync fixed :thumbsup: