[TechPowerUp article] FreeSync explained in more detail


Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
So if FreeSync needs new controllers, that means if AMD wanted to force OEM support, they would also have to add an FPGA to the monitors.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
@Shivansps

I think it'll just happen naturally.
The people that sell monitors will start including support for DP 1.3.

The new DP 1.3 will have the needed features for FreeSync in it.

So yes, at some point new models of monitors will all come with this, and AMD's GPUs will have the needed hardware inside them to make it work.

You'll most likely need a new monitor to get it working, but you won't need to buy an "extra" piece of hardware like with G-Sync.
 

Teizo

Golden Member
Oct 28, 2010
1,271
31
91
You'll most likely need a new monitor to get it working, but you won't need to buy an "extra" piece of hardware like with G-Sync.

The good thing about this, if it is any good, is that it will drive the price of G-Sync monitors down much faster. It won't do away with G-Sync, because G-Sync is proven and effective; it's just that right now monitor manufacturers are charging a luxury tax for it.

So, please, AMD...don't drop the ball on this. Make it happen, and do it right.
 

Cookie Monster

Diamond Member
May 7, 2005
5,161
32
86
So if FreeSync needs new controllers, that means if AMD wanted to force OEM support, they would also have to add an FPGA to the monitors.

No. The current G-Sync controller (a scaler ASIC with variable refresh capability, which from my understanding is what AMD is also referring to here) is basically a very expensive proof of concept; once they design their own ASIC for it, it'll cost peanuts.

Current scalers just don't do any of this stuff, and nVIDIA with G-Sync has finally got the monitor vendors to implement such technology, as the benefits are clearly there... it's a technology, or at least an idea, that has been around forever.

So any desktop monitors in the near future will need an upgraded scaler to support this, meaning "extra" hardware is necessary, but it will probably be much cheaper once they get it down to a mass-manufactured ASIC instead of the expensive FPGA being used now.

The problem here is that AMD needs to wait for DP 1.3 (which means 2015 at the earliest before we see AMD's solution), which no GPUs support yet and which hasn't even been finalised, whereas nVIDIA isn't limited by this. G-Sync monitors will most likely be cheaper by that time, with many people, or gamers in general, having adopted them. The $200 premium would most likely be down to $5.

My other thought was... they shouldn't have gone with the name "Free"Sync, tbh. It could really backfire, especially since this won't necessarily work with any AMD GPUs out now, nor with any current monitors.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
So yes, at some point new models of monitors will all come with this, and AMD's GPUs will have the needed hardware inside them to make it work.

Just FYI, the control module that AMD requires (in monitors) for FreeSync is not part of the DP 1.3 specification. It is entirely optional. Monitors obviously don't REQUIRE variable refresh, so the only way this will happen is if AMD does the legwork, is proactive, and gets monitor manufacturers on board with a control-module design that works with their GPUs.

Because FreeSync requires a control module, it isn't free. It also requires AMD to open discussions with monitor manufacturers, as this is an OPTIONAL control module which certainly is not required by any monitor.

So the real question is, will AMD do the legwork? Will they open discussions with monitor manufacturers? I doubt it, based on how AMD is run, to be quite honest. They're full of fluff with no substance. Equally funny is the fact that they called it "FreeSync" as a marketing shot at Nvidia, when FreeSync also adds to monitor cost, just like G-Sync. Nvidia did the legwork to get panel makers on board with G-Sync, so hopefully AMD does the work to do the same. Like I said, though, I doubt they will. All promises, all marketing, with minimal action.

One variable I'm unsure of is whether DP 1.3 is fully required. AMD's solution requires DP 1.3 monitors along with a special control module that supports variable refresh, but GPUs don't support DP 1.3 yet. So you have to wonder how that's going to work.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I don't know if AMD will have to do much convincing if G-Sync is the hit we all hope it will be. Monitor companies will want to jump on it if it sells monitors.

My biggest concern now is whether future Nvidia cards will support the tech. The FreeSync part seems to be aimed at laptops.

I think G-Sync is the perfect name, not as GeForce-sync but as GPU-sync. This new tech syncs the monitor to the GPU, so GPU-sync is perfect, or G-Sync for short. Of course, since Nvidia has G-Sync for their implementation, AMD will have to come up with another name.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
AMD claims it can offer the benefits of Nvidia’s G-Sync with a free driver update, Nvidia rebuts – fight!

http://www.extremetech.com/gaming/1...with-a-free-driver-update-nvidia-rebuts-fight
but if gamers want to see G-Sync-like technology, AMD believes it can offer an equivalent. AMD also told Tech Report that it believes triple buffering can offer a solution to many of the same problems G-Sync addresses. AMD’s theory as to why Nvidia built an expensive hardware solution for this problem is that Nvidia wasn’t capable of supporting G-Sync in any other fashion.

That part has me worried about FreeSync. If it requires triple buffering to do the same thing as G-Sync, that suggests it will introduce about 17 ms of latency, because they have to render one frame ahead before displaying the previous one.
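
To put a number on it, here's a quick back-of-the-envelope sketch (assuming a fixed 60 Hz panel; the numbers are illustrative):

```python
# Back-of-the-envelope: extra latency from rendering one frame ahead.
# Assumes a fixed 60 Hz refresh; illustrative, not measured.
refresh_hz = 60
frame_time_ms = 1000 / refresh_hz  # ~16.7 ms per refresh interval

# If frame N can only be shown after frame N+1 has been rendered,
# every frame is held back by (at least) one frame time.
added_latency_ms = frame_time_ms
print(f"Added latency at {refresh_hz} Hz: ~{added_latency_ms:.1f} ms")
# -> ~16.7 ms, i.e. the "17 ms" figure above
```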

As for what displays can support now, PCPer's article went more in depth there. A lot of laptop displays and all-in-one systems have the parts needed, but standalone monitors do not.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
All GPUs render one frame ahead by default. Flip queue, no? One frame of lag is what we have now.
 

BallaTheFeared

Diamond Member
Nov 15, 2010
8,115
0
71
All GPUs render one frame ahead by default. Flip queue, no? One frame of lag is what we have now.


I think his point was that TB induces more input lag than V-Sync alone. We have more than one frame of lag now; if you use V-Sync you're adding more input lag, and V-Sync + TB = even more input lag.
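
As a rough sketch of how those stages could stack up, assuming each buffering stage can hold a frame for one refresh interval at 60 Hz (illustrative worst-case numbers, not measurements):

```python
# Toy model of input-lag stacking; each stage is assumed to hold a
# frame for one 60 Hz refresh interval (~16.7 ms). Illustrative only.
frame_ms = 1000 / 60

configs = {
    "flip queue only":          1 * frame_ms,  # the one frame of lag we have now
    "+ V-Sync":                 2 * frame_ms,  # wait for the next refresh
    "+ V-Sync + triple buffer": 3 * frame_ms,  # one more buffered frame
}
for name, lag in configs.items():
    print(f"{name:26s} ~{lag:.0f} ms worst case")
```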


The more info that comes out about FreeSync, the more obvious it becomes why AMD sat on this for eight years.
 

jj109

Senior member
Dec 17, 2013
391
59
91
That part has me worried about FreeSync. If it requires triple buffering to do the same thing as G-Sync, that suggests it will introduce about 17 ms of latency, because they have to render one frame ahead before displaying the previous one.

As for what displays can support now, PCPer's article went more in depth there. A lot of laptop displays and all-in-one systems have the parts needed, but standalone monitors do not.

I don't think AMD is serious about FreeSync if they claim triple buffering is a sufficient solution to the synchronization problem. There's just no way to fit 35-55 fps into a 60 Hz cycle without introducing awful judder.
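
A quick sketch of why, assuming frames can only be shown on 60 Hz refresh boundaries (a hypothetical 45 fps case, illustrative numbers):

```python
# Why 35-55 fps judders on a fixed 60 Hz display: each frame waits for
# the next ~16.7 ms refresh slot, so on-screen frame times become a mix
# of one and two refresh intervals. Illustrative only.
import math

refresh_ms = 1000 / 60
render_ms = 1000 / 45  # ~22.2 ms per frame at 45 fps

shown_at = []
t = 0.0
for _ in range(8):
    t += render_ms
    # the frame is displayed at the next refresh boundary after it finishes
    shown_at.append(math.ceil(t / refresh_ms) * refresh_ms)

intervals = [b - a for a, b in zip(shown_at, shown_at[1:])]
print([f"{i:.1f}" for i in intervals])
# -> a mix of ~16.7 ms and ~33.3 ms presentation times = visible judder
```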
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
All GPUs render one frame ahead by default. Flip queue, no? One frame of lag is what we have now.
I meant to say that it will add an additional frame's worth of latency: rather than rendering a frame and then displaying it, it has to render two frames and display the older of the two, and as new frames are created, it continues to display the older of the two finished frames.

This may be because, as others have suspected, the CVT vertical blanking technique requires you to know how long the refresh will be before displaying the frame. In order to know how long to wait for the next frame, it simply renders the next frame before displaying the current one.

Of course, this isn't fully explained; all we know is that they think they need triple buffering, and others have pointed out that VBLANK under the CVT rules requires you to set the refresh time ahead of the frame. I'm basing the idea above on those bits of info.
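
Here's a toy sketch of that look-ahead idea. To be clear, this is speculation about how it could work, not a known design; the timings and function names are made up:

```python
# Hypothetical look-ahead scheduling: if the display must be told the
# next refresh interval in advance, the driver can render frame N+1
# first, time it, and use that time as the interval for showing frame N.
# Pure speculation about FreeSync; not based on any published spec.
import random
import time

def render_frame(n):
    """Stand-in for GPU work; returns the frame and its render time."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.016, 0.030))  # pretend 16-30 ms of work
    return f"frame {n}", time.perf_counter() - start

pending, _ = render_frame(0)
for n in range(1, 5):
    next_frame, next_time = render_frame(n)    # render one frame ahead...
    print(f"program a {next_time * 1000:.1f} ms refresh, "
          f"then display {pending}")           # ...to schedule the older one
    pending = next_frame
```

The extra frame of latency falls straight out of this: what's on screen is always one frame behind what was just rendered.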
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Their marketing really is insufferable.

"Doing the work for everyone"

That got some chuckles here.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
"Doing the work for everyone" sounds really awful and contrived when they've been maneuvering to make GCN an industry standard as much as they can. Immediately after they win both consoles they try to push a proprietary API on PC, TrueAudio in both the PS4 and PC...

When I think about all of that, that blog post starts to read as if it's describing an alternate reality. I'm seriously stunned. :thumbsup:
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
You guys are confusing the render-ahead queue and triple buffering.

Triple buffering doesn't add input lag, as it doesn't need to show every frame on screen. If it has two frames ready, it will send the newest frame to the screen, whereas a render-ahead queue will send the old frame and then the new frame.
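
A minimal sketch of the difference (this models the OpenGL-style behavior; the scan-out helpers are hypothetical, just for illustration):

```python
# Render-ahead queue vs. triple buffering at scan-out time. A FIFO queue
# shows every frame, so older frames add latency; triple buffering
# discards stale frames and shows only the newest completed one.
from collections import deque

def scan_out_fifo(frames):
    """Render-ahead queue: display the oldest frame, keep the rest."""
    return frames.popleft()

def scan_out_triple(frames):
    """Triple buffering: drop stale frames, display the newest."""
    newest = frames.pop()
    frames.clear()
    return newest

ready = deque(["frame 1", "frame 2"])  # two completed frames waiting
print(scan_out_fifo(deque(ready)))     # -> frame 1 (older frame, more lag)
print(scan_out_triple(deque(ready)))   # -> frame 2 (newest, less lag)
```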
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
You guys are confusing the render-ahead queue and triple buffering.

Triple buffering doesn't add input lag, as it doesn't need to show every frame on screen. If it has two frames ready, it will send the newest frame to the screen, whereas a render-ahead queue will send the old frame and then the new frame.

No, I'm not confusing anything. Triple buffering doesn't have to be set up to render ahead, though it allows it. The point I made about FreeSync needing triple buffering to do what G-Sync does is that G-Sync does not need triple buffering, and the only reason you would need it with variable refresh is that they'd need to know how long to display an image ahead of time, which makes it a look-ahead system.

Why would they need triple buffering with a variable refresh rate if they didn't need to render ahead?

As described by a few before, FreeSync appears to let them dynamically change the refresh rate. If all they are doing is changing the refresh rate, they have to know how long an image will take to render in order to set the refresh time so the frame is ready to be displayed when the next vertical blanking period comes around. To do this, you need to render ahead one frame in order to know how long the frames are taking.

G-Sync takes a different approach. It does not set refresh rates; instead, it starts a refresh and then holds in the vertical blanking period until the GPU tells it to start a new one. With this method, you do not need triple buffering.
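
A toy model of that G-Sync behavior as described in this thread (a simplification, not an official spec):

```python
# G-Sync as described above: the panel idles in vblank and refreshes
# whenever the GPU signals a finished frame, so no look-ahead is needed.
# Simplified illustration, not an official description.
import queue

def gsync_panel(frame_ready):
    """Hold in vblank; refresh whenever a finished frame arrives."""
    while True:
        frame = frame_ready.get()  # block in vblank until the GPU signals
        if frame is None:          # sentinel to end the demo
            break
        print(f"refresh now, scanning out {frame}")

q = queue.Queue()
for f in ["frame 1", "frame 2", None]:
    q.put(f)
gsync_panel(q)
```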

This is based on the info we've been given, which is still lacking, but what we've been told suggests this will be a difference. FreeSync, the way it has been described, sounds like it will cause about 17-33 ms of additional latency (the current frame time in ms). Of course, with DirectX and triple buffering (most games), this happens as well once you reach your refresh rate in FPS, so it isn't necessarily that terrible, just not as good.

NOTE: In DirectX, with triple buffering, every frame does have to be displayed. OpenGL is different.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
"Doing the work for everyone" sounds really awful and contrived when they've been maneuvering to make GCN an industry standard as much as they can. Immediately after they win both consoles they try to push a proprietary API on PC, TrueAudio in both the PS4 and PC...

When I think about all of that, that blog post starts to read as if it's describing an alternate reality. I'm seriously stunned. :thumbsup:

It's no worse sophistry than how NV used to justify not allowing NV cards to act as the PhysX card when the rendering card is not NV. NV kept saying it was because of quality assurance and that they didn't want to be blamed if anything went wrong. Yeah, right. They were more honest this time around with G-Sync, with Petersen saying that NV put in the hard work to get G-Sync functional, so they didn't want to license it to competitors.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
All these solutions require monitors and GPUs to behave differently than they do now. They require tech that allows new techniques to be used. These new technologies must also be compatible with the old ones. Perhaps someday in the future they can get rid of the backward compatibility, but we aren't there yet.

Well, the off-the-shelf laptops supported it with current AMD tech. They just had to activate it in the drivers, which already support it. While it may take DP 1.3, it doesn't appear to take any additional monitor hardware; at least nothing that isn't already in the Toshiba laptop they used.

I think the real reason we haven't seen it is that nVidia was the first company to look at it and realize the advantage of variable V-Sync via VBLANK on desktop PCs. Pretty short-sighted and an extreme lack of vision on AMD's part.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
By the way, you just made a completely false statement. Those laptops used eDP. Desktop monitors DO NOT use eDP. There's a story on the front page of PCPer; they interviewed AMD's Koduri to get the straight scoop. And that straight scoop is that FreeSync requires DP 1.3 and a variable-refresh-aware control board. No current desktop panels have a variable refresh rate control board.

eDP is a power-saving interface for laptops. Desktop monitors do not use it, for obvious reasons; there is no desktop monitor that uses eDP. Therefore a control board is required for both FreeSync and G-Sync. Or we could just pretend that nvidia's engineers are idiots and could have used plain DP the entire time without a control board :rolleyes:. They added the control board because desktop panels do not use eDP like laptops do, so they require a special control board that is variable-refresh aware.

This is all outlined at PCPer. They got the straight scoop from AMD. Maybe you should read up on that; pretty sure you've seen it, since it was posted earlier in this thread, if I'm not mistaken.

Why do you think AMD demoed it on laptops? Because it can't be done on desktop panels. Desktop panels do not use eDP.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
My monitor actually uses eDP internally. It's one of those Korean jobs. There are probably a lot of monitors out already that use eDP.