[TechPowerUp article] FreeSync explained in more detail


Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
my monitor has eDP internally and no scaler

am I good2go for cheapsync?
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
I think the real question now is: are you willing to drop the money on a garbage TN panel in the short term to get G-Sync and lock yourself to Nvidia hardware based on that monitor?

Or do you wait until DisplayPort 1.3 monitors come out and render G-Sync obsolete? I would imagine Nvidia GPUs are going to support vblank in DP 1.3 once it is ratified. Considering how long I tend to keep a monitor vs. a GPU, I'm willing to wait a while and see what happens.

If DP 1.3 is ratified in 60-90 days (from the PCPer article), I would imagine panel makers will adopt the spec quite fast because of the onslaught of 4K and its need for a better display standard.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I think the real question now is: are you willing to drop the money on a garbage TN panel in the short term to get G-Sync and lock yourself to Nvidia hardware based on that monitor?

Or do you wait until DisplayPort 1.3 monitors come out and render G-Sync obsolete? I would imagine Nvidia GPUs are going to support vblank in DP 1.3 once it is ratified. Considering how long I tend to keep a monitor vs. a GPU, I'm willing to wait a while and see what happens.

If DP 1.3 is ratified in 60-90 days (from the PCPer article), I would imagine panel makers will adopt the spec quite fast because of the onslaught of 4K and its need for a better display standard.
From the info this article gave, it does not render G-Sync obsolete. It sounds like it will introduce 16.7 ms of latency on 60 Hz monitors (one full refresh period: 1000/60 ≈ 16.7 ms), and it may also slightly disrupt the timing sequence (though I'm not sure on that either).

At the moment we don't know quite enough, but AT did not consider it as good, as of yet. It may also cost money, as it has to include a chip not included in monitors today. Laptops may already have what is needed, however.

I guess the point is, we have to wait to see the truth.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I think the real question now is: are you willing to drop the money on a garbage TN panel in the short term to get G-Sync and lock yourself to Nvidia hardware based on that monitor?

Or do you wait until DisplayPort 1.3 monitors come out and render G-Sync obsolete? I would imagine Nvidia GPUs are going to support vblank in DP 1.3 once it is ratified. Considering how long I tend to keep a monitor vs. a GPU, I'm willing to wait a while and see what happens.

If DP 1.3 is ratified in 60-90 days (from the PCPer article), I would imagine panel makers will adopt the spec quite fast because of the onslaught of 4K and its need for a better display standard.

You still need more than DP 1.3. A new GPU too, and a monitor that also supports the other requirements. 4K screens don't need more than DP 1.2 until you go 3D and 120 Hz.

And the DP 1.3 spec is to be finalized in Q2, maybe later. Then add another 6-12 months before you even see the first monitor. And the first GPU that supports it is most likely 9-12 months away.

Plus it's still not delivering the G-Sync experience, far from it.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I think the real question now is: are you willing to drop the money on a garbage TN panel in the short term to get G-Sync and lock yourself to Nvidia hardware based on that monitor?

Or do you wait until DisplayPort 1.3 monitors come out and render G-Sync obsolete? I would imagine Nvidia GPUs are going to support vblank in DP 1.3 once it is ratified. Considering how long I tend to keep a monitor vs. a GPU, I'm willing to wait a while and see what happens.

If DP 1.3 is ratified in 60-90 days (from the PCPer article), I would imagine panel makers will adopt the spec quite fast because of the onslaught of 4K and its need for a better display standard.


I wonder when Hawaii-based boards will support DisplayPort 1.3. 60-90 days? I don't think you understand how standards committees work. Standards don't get ratified that quickly. Will the 290 and 290X even work with these DP 1.3 monitors?

AMD told you free-sync wouldn't require hardware. Seems that free-sync requires a monitor logic board and that is not part of the DP 1.3 standard. So you tell me. Is this logic board free? Will panel makers include it? Keep in mind that it isn't part of the DP 1.3 standard thus far. Do you trust AMD to do the legwork and get this logic board into monitors?

Seeing as they already lied to you about being a 100% software solution. Guess what. That isn't the case. They're using a controller board just like nvidia. Like I said. What AMD tells you and what they do are different things. AMD's marketing is not to be trusted. This is a prime example of that.

PCPer has a story on this BTW:

http://www.pcper.com/reviews/Graphi...h-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync

You should give that a good read, it sheds some light and truth on free-sync not being "Free".
 
Last edited by a moderator:

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
So AMD's is literally the same thing as Nvidia's. :|

Then what was with the "software overhead" BS that guy wrote in his article?
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Well, I got what Erenhardt meant
snip
Yes. Thank you for providing a better clarification than I could ever do.
All this G-Sync/FreeSync business looks like a pig with a bow tie and not a proper solution. And while it is hard (at least for me) to bash free features, NV is a big offender here, asking $100 for a beta sub-solution.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Yes. Thank you for providing a better clarification than I could ever do.
All this G-Sync/FreeSync business looks like a pig with a bow tie and not a proper solution. And while it is hard (at least for me) to bash free features, NV is a big offender here, asking $100 for a beta sub-solution.

FreeSync will not be free. And it will not deliver the same experience.

I can't really see any (good) solution that doesn't require a large buffer, either on the GPU or the monitor.

Even the 290X will not support FreeSync, even if we imagine you got the monitor.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
So AMD's is literally the same thing as Nvidia's. :|

Then what was with the "software overhead" BS that guy wrote in his article?
The original article made it sound like it may be free on some laptops, but made no mention of monitors. I imagine things are different in how monitors interface with the GPU. I honestly don't know.
 

sushiwarrior

Senior member
Mar 17, 2010
738
0
71
I wonder when Hawaii-based boards will support DisplayPort 1.3. 60-90 days? I don't think you understand how standards committees work. Standards don't get ratified that quickly. Will the 290 and 290X even work with these DP 1.3 monitors?

Standards don't take years to ratify :| and many are compatible with simple firmware updates (see: HDMI, Blu-Ray, etc.) It might be a very simple matter to add compatibility. It may be very difficult. It's hard to say, considering nobody knows what the standard is yet.

AMD told you free-sync wouldn't require hardware. Seems that free-sync requires a monitor logic board and that is not part of the DP 1.3 standard. So you tell me. Is this logic board free? Will panel makers include it? Keep in mind that it isn't part of the DP 1.3 standard thus far. Do you trust AMD to do the legwork and get this logic board into monitors?

Seeing as they already lied to you about being a 100% software solution. Guess what. That isn't the case. They're using a controller board just like nvidia. Like I said. What AMD tells you and what they do are different things. AMD's marketing is not to be trusted. This is a prime example of that.

Oh dude, you're grasping at straws here. All it needs is a logic board that supports VBLANK control, which is part of eDP - all of which are standards. AMD is not requiring expensive additional hardware like FPGAs or frame buffers. They are simply using existing standards, which is nice, because then everyone can have access to it. I'm not a fan of proprietary tech, or of $300 boards (more than I spent on my monitor...).

This is a simple proof of concept; it's just a demonstration that it should be possible at some point in the future. Nobody is saying it's coming out in 2014, or even coming out at any point in time. AMD specifically said they have no plans to turn it into a product. That was pretty clear. They're just trying to push the standards to support it, not sell anything to anyone.

I don't know why you insist on "trusting" AMD... what does it matter? It will either get into monitors eventually, or it won't. AMD didn't promise to make it happen. Are you just saying not to let this factor into buying anything now?

PCPer has a story on this BTW:

http://www.pcper.com/reviews/Graphi...h-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync

You should give that a good read, it sheds some light and truth on free-sync not being "Free".

All that is needed for this to work, as AMD explained it, was an eDP connection between the discrete GPU and the display, a controller for the screen that understands the variable refresh rate methods of eDP 1.0 specifications and an updated AMD driver to properly send it the signals. The panel can communicate that it supports this variable refresh technology to the graphics card through the EDID as resolutions and timings are communicated today and then the graphics driver would know to send the varying vblank signals to adjust panel refresh times on the fly.

Read your article first, because it specifically says that the monitor controller (not an extra add-in FPGA with a frame buffer and lots of other complicated stuff) can do this process.
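
To make that PCPer paragraph concrete, here's a rough sketch of the timing math involved. Purely illustrative: the numbers (a 60 Hz base mode with 1125 total scanlines) are my own assumptions, not anything AMD or PCPer published.

```python
# Rough illustration only: stretching the vblank interval moves the refresh point
# to land when the frame is actually ready. Numbers (1125 total lines, 60 Hz base)
# are assumed for the example.

BASE_HZ = 60
TOTAL_LINES = 1125                      # active lines + blanking lines per frame
LINE_TIME_MS = 1000 / (BASE_HZ * TOTAL_LINES)
BASE_PERIOD_MS = 1000 / BASE_HZ         # ~16.7 ms per refresh at 60 Hz

def extra_vblank_lines(frame_time_ms):
    """Extra blanking lines the driver would request so the panel refreshes
    when the frame is ready instead of on the fixed 60 Hz tick."""
    stretch_ms = max(frame_time_ms, BASE_PERIOD_MS) - BASE_PERIOD_MS
    return round(stretch_ms / LINE_TIME_MS)

for frame_ms in (16.7, 20.0, 25.0, 33.3):   # 60, 50, 40, 30 fps frame times
    print(f"{frame_ms:5.1f} ms frame -> +{extra_vblank_lines(frame_ms)} vblank lines")
```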

FreeSync will not be free. And it will not deliver the same experience.

You're saying it WILL NOT deliver the same experience, yet you haven't even tried the tech (G-Sync OR FreeSync, I bet)? That's a bold statement.

I can't really see any (good) solution that doesn't require a large buffer, either on the GPU or the monitor.

Even the 290X will not support FreeSync, even if we imagine you got the monitor.

There's a simple solution that doesn't require a buffer, and AMD is doing it - they predict how long the frame will take. And 290X will not support freesync? Why not? AMD says it's a firmware update for the extra display connectivity tech, and the VBLANK control is already built into the driver. Please, share why you're CERTAIN that it cannot work.
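
For what it's worth, the "predict how long the frame will take" part doesn't need anything exotic. Here's a toy sketch; the moving-average smoothing and the numbers are my own guesses, not AMD's actual method.

```python
# Toy sketch of predicting the next frame time instead of buffering the frame.
# The smoothing method and numbers are assumptions, not AMD's described approach.

from collections import deque

class FrameTimePredictor:
    def __init__(self, window: int = 5):
        self.history = deque(maxlen=window)   # last few frame times in ms

    def record(self, frame_time_ms: float) -> None:
        self.history.append(frame_time_ms)

    def predict(self) -> float:
        if not self.history:
            return 1000 / 60                  # assume 60 fps until we have data
        return sum(self.history) / len(self.history)   # simple moving average

predictor = FrameTimePredictor()
for measured in (18.0, 21.5, 19.0, 24.0):
    predictor.record(measured)
    # The driver would schedule the next vblank from this estimate, so the panel
    # refreshes roughly when the next frame should be finished.
    print(f"next refresh scheduled ~{predictor.predict():.1f} ms out")
```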



Stop the personal attacks please. Let me remind you of this post I made yesterday.


-Rvenger
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
By the time I need to upgrade all three monitors (years from now), I hope G-Sync/FreeSync come standard with all monitors.

I have three new-ish monitors (Eyefinity) and am not paying the G-Sync tax three times to upgrade.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
There's a simple solution that doesn't require a buffer, and AMD is doing it - they predict how long the frame will take. And 290X will not support freesync? Why not? AMD says it's a firmware update for the extra display connectivity tech, and the VBLANK control is already built into the driver. Please, share why you're CERTAIN that it cannot work.

No offense, but you're the AMD employee (you told us this in a post a couple of weeks ago), so you tell us why it is certain to work when:

1) PCPer stated - per Koduri - that new monitor logic boards are required.
2) DisplayPort 1.3 is not supported by Hawaii

Put simply, your statements do not reconcile with Koduri's (#1). And he is the head GPU engineer at AMD, from what I understand.

I just feel confused here. You told us that you're affiliated with or employed by AMD. With that being the case, why don't you contribute and actually tell us something of worth?

I don't know if you work with Warsam - he posts in the CPU forum frequently and is an AMD employee, yet he discloses that fact in his sig. His agreement with moderators is that he cannot serve as a PR/advertisement mouthpiece, and must get approval prior to posting such threads. I don't know if the same applies to you, but I'm not sure everyone is aware that you're affiliated with or employed by AMD. Warsam's posting conditions were dictated by Idontcare and jvroig; I'm not sure whether you underwent the same vetting process, but it sounds like you didn't. You're also not disclosing your employment in a sig, which I feel would be great - since you're the AMD guy, you could be a valuable resource for those who have questions. What isn't valuable, with all due respect, is a PR/advertising mouthpiece. Nobody is interested in that. Jvroig said as much to Warsam when he was being vetted as an AMD rep.

As far as I see it, you're the AMD guy, so why are you putting the burden of proof on US? Do you understand how ridiculous that is? I don't mean any disrespect, but I really feel that you should be coming to us with information in this respect, and if you don't have anything to share, why bother with rhetorical questions? I mean, you're from the source. You're working with AMD.


Take this to Moderator discussions instead of calling a member out.


-Rvenger
 
Last edited by a moderator:

chimaxi83

Diamond Member
May 18, 2003
5,457
63
101
No offense, but you're the AMD employee, so you tell us why it is certain to work when:

1) PCPer stated - per Koduri - that new monitor logic boards are required.
2) DisplayPort 1.3 is not supported by Hawaii

Put simply, your statements do not reconcile with Koduri's (#1). And he is the head GPU engineer at AMD, from what I understand.

I just feel confused here. You told us that you're affiliated with or employed by AMD. With that being the case, why don't you contribute and actually tell us something of worth?

I don't know if you work with Warsam - he posts in the CPU forum frequently and is an AMD employee, yet he discloses that fact in his sig. His agreement with moderators is that he cannot serve as a PR/advertisement mouthpiece, and must get approval prior to posting such threads. I don't know if the same applies to you, but I'm not sure everyone is aware that you're affiliated with or employed by AMD. Warsam's posting conditions were dictated by Idontcare and jvroig; I'm not sure whether you underwent the same vetting process, but it sounds like you didn't. You're also not disclosing your employment in a sig, which I feel would be great - since you're the AMD guy, you could be a valuable resource for those who have questions. What isn't valuable, with all due respect, is a PR/advertising mouthpiece. Nobody is interested in that. Jvroig said as much to Warsam when he was being vetted as an AMD rep.

As far as I see it, you're the AMD guy, so why are you putting the burden of proof on US? Do you understand how ridiculous that is? I don't mean any disrespect, but I really feel that you should be coming to us with information in this respect, and if you don't have anything to share, why bother with rhetorical questions? I mean, you're from the source. You're working with AMD.

Holy hell dude, take it to mod discussions and/or PM already. Or just keep with the walls of text, that's cool too.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Nvidia responds to free sync demo:

When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That's why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.

http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
If you aren't familiar with eDP, don't feel bad. It's a connection type used in tablets and notebooks and isn't used at all in desktop configurations (some all-in-one designs do use eDP). But here is where it might get interesting: the upcoming DisplayPort 1.3 standard actually includes the same variable refresh rate specification. That means that upcoming DP 1.3 panels COULD support variable refresh technology in an identical way to what we saw demoed with the Toshiba laptops today. DP 1.3 is on schedule to be ratified as a standard in the next 60-90 days and from there we'll have some unknown wait time before we begin to see monitors using DP 1.3 technology in them.

This is why I said 60-90 days for DP 1.3 to be ratified. I didn't make it up. Three months is not a long time in terms of monitor use. I use monitors for YEARS, not months.

If you want to chance tying yourself to a proprietary standard that's likely to die, when the alternative will assuredly be supported by both Nvidia and AMD (unless Nvidia truly wants to rob people blind), then go ahead. I myself am willing to wait until this plays out, because if the 60-90 days blurb from that article is true, we could be seeing DP 1.3 products by the end of the year. I think 4K is really going to be what pushes this variable refresh stuff, because the displays are so much harder to drive.

It may not be necessary to update to DP 1.3 for the bandwidth to drive 4K, but it is necessary to allow smoother animation with the GPU power we currently have.
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
Why not just have the LCD monitor hold the image until the GPU sends a bit telling the monitor it is time to refresh? This would only work up to a certain max refresh rate before you run into problems (though you could use VSync at the max refresh rate), but below that it would work just fine. Seems like that wouldn't be a problem to implement.
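
Something like this, roughly: the panel holds the last image until the GPU signals a refresh, and simply refuses to refresh faster than its maximum rate. All numbers and names below are made up for illustration.

```python
# Sketch of the scheme above: the panel holds the last image until the GPU signals
# a refresh, but never refreshes faster than its maximum rate (effectively v-sync
# at the top of the range). Numbers and names are invented for the example.

MAX_REFRESH_HZ = 144
MIN_PERIOD_S = 1.0 / MAX_REFRESH_HZ      # panel can't refresh faster than this

def run_panel(frame_ready_times_s):
    last_refresh = 0.0
    for ready_at in frame_ready_times_s:      # moments the GPU flags a finished frame
        # If the GPU finished "too soon", wait until the panel may refresh again.
        refresh_at = max(ready_at, last_refresh + MIN_PERIOD_S)
        print(f"frame ready {ready_at*1000:6.1f} ms -> refreshed at {refresh_at*1000:6.1f} ms")
        last_refresh = refresh_at

# GPU finishing frames at an uneven pace (times in seconds)
run_panel([0.010, 0.015, 0.040, 0.041, 0.080])
```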
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
How do I find out if my monitor supports this? I know it uses an eDP connection internally.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
That is what the G-Sync module is doing.

For that you need new logic in the monitors.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Why not just have the LCD monitor hold the image until the GPU sends a bit telling the monitor it is time to refresh? This would only work up to a certain max refresh rate before you run into problems (though you could use VSync at the max refresh rate), but below that it would work just fine. Seems like that wouldn't be a problem to implement.
All these solutions require monitors and GPU to behave differently than they do now. They require tech to allow new techniques to be used. These new technologies must also be compatible with the old technologies. Perhaps someday in the future they can get rid of the backward compatibilities, but we aren't there yet.
 

MisterMac

Senior member
Sep 16, 2011
777
0
0
I just wait for the day when one piece of hardware doesn't need a second hardware manufacturer to fix problems for us consumers.

How on earth direct control of everything isn't ONLY GPU/CPU-controlled yet is beyond me.

A monitor should just display directly, always - and all data exposed to it should be dictated by the piece of hardware that handles it.

Seems like, with 4K and 120 Hz oncoming, this will be a niche gamer benefit for both sides, with Nvidia perhaps doing a little profiteering before monitor manufacturers get off their asses and deliver a decent solution.
 

Paul98

Diamond Member
Jan 31, 2010
3,732
199
106
All these solutions require monitors and GPU to behave differently than they do now. They require tech to allow new techniques to be used. These new technologies must also be compatible with the old technologies. Perhaps someday in the future they can get rid of the backward compatibilities, but we aren't there yet.

Right now, all that's done is the monitor simply draws what is in the GPU frame buffer every refresh. The only change for the GPU would be to send an extra bit of information when the frame buffer has changed. The monitor controller would be a larger change, but with what they can already do with variable refresh rates, it shouldn't be a problem.

Both could be backward compatible with no problem.
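
On the GPU side, that could be as small as this kind of check on every buffer swap. A sketch with invented names, just to show the backward-compatible split.

```python
# Sketch of the GPU-side change described above: flag the monitor on each buffer
# swap if it supports GPU-driven refresh, otherwise leave the fixed refresh alone.
# All names here are invented for illustration.

from typing import Callable, Dict

def on_buffer_swap(monitor_caps: Dict[str, bool],
                   send_refresh_token: Callable[[int], None],
                   frame_id: int) -> None:
    if monitor_caps.get("gpu_driven_refresh", False):
        send_refresh_token(frame_id)   # the "extra bit" sent alongside the frame data
    # Otherwise do nothing special: the monitor keeps its own fixed refresh
    # schedule exactly as today, which is what keeps it backward compatible.

on_buffer_swap({"gpu_driven_refresh": True},  lambda f: print(f"refresh now (frame {f})"), 42)
on_buffer_swap({"gpu_driven_refresh": False}, lambda f: print("never called"), 43)
```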
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Right now, all that's done is the monitor simply draws what is in the GPU frame buffer every refresh. The only change for the GPU would be to send an extra bit of information when the frame buffer has changed. The monitor controller would be a larger change, but with what they can already do with variable refresh rates, it shouldn't be a problem.

Both could be backward compatible with no problem.
This is what they are doing, but with a few complications.

They also have to account for refresh rates of less than 30 Hz, because colors change if the panel isn't refreshed at all. They also have to be compatible with normal constant refreshes, which most applications expect.

The problem is that existing hardware is not set up for this now. That is why G-Sync was made, and why AMD is going to require DP 1.3 and a chip to support this.
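
The sub-30 Hz case basically means forcing a repeat refresh whenever no new frame shows up in time. A tiny sketch of that floor; the 30 Hz figure is an assumption, and real panels will differ.

```python
# Sketch of the low-FPS complication: if no new frame arrives in time, the last
# frame has to be resent so the LCD's colors don't drift. The 30 Hz floor is an
# assumed number for illustration only.

from typing import Optional, Tuple

MAX_HOLD_MS = 1000 / 30            # never leave the panel unrefreshed longer than ~33 ms

def next_refresh(last_refresh_ms: float, new_frame_ms: Optional[float]) -> Tuple[float, str]:
    deadline = last_refresh_ms + MAX_HOLD_MS
    if new_frame_ms is not None and new_frame_ms <= deadline:
        return new_frame_ms, "new frame"
    return deadline, "repeat last frame"   # forced refresh to keep the image stable

print(next_refresh(0.0, 20.0))   # new frame at 20 ms, within the window
print(next_refresh(0.0, 50.0))   # frame is late, panel refreshes itself at ~33 ms
print(next_refresh(0.0, None))   # no frame at all, same forced refresh
```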
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
I'm hoping FreeSync is just as good as G-Sync, mainly because it should mean a wider selection of monitors to choose from. The main benefit I see from GPU-controlled refresh is close to lag-free FPS with no tearing. If that doesn't come with FreeSync, then I'm going with G-Sync. The really nice thing about both implementations is that they don't require any developer support (I think), so even if it's a niche feature, you don't have to worry about paying a premium for something that may become worthless or unsupported, like 3D.