AMD's FreeSync and VESA A-Sync discussion


96Firebird

Diamond Member
Nov 8, 2010
5,709
316
126
Honestly I just saw it as a comment that meant "it had to be bought from someone", and it wasn't worded by a lawyer, so the guy mentioned Nvidia. Going off on a tangent about whether Nvidia is the sole supplier is pointless.

Even so, they are claiming the vendor must buy the Gsync modules. How do they know that is the case? Maybe Nvidia is building and supplying them the hardware to get Gsync adoption, so they can sell more cards.

In reality, nobody here, nor at AMD, knows.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
This is more a discussion for the thread dedicated to the monitor, but...

It's in the review. For BF4, Vsync off ran at 84 FPS (1000/84 = 11.9 ms/frame) and Gsync ran at 82 FPS (1000/82 = 12.2 ms/frame). For Crysis 3, Vsync off ran at 47 FPS (1000/47 = 21.3 ms/frame) and Gsync ran at 45 FPS (1000/45 = 22.2 ms/frame).

What do you think that indicates? I believe I read that AnandTech documented this behavior. Is it a problem with the recording tools, a drop in performance due to slight driver overhead, or a true frame of latency?
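As an aside, here is a minimal sketch of the frame-time arithmetic quoted above (plain Python added purely for illustration; the FPS figures are the ones cited, nothing here comes from the review's own tooling):

```python
# Convert average FPS to average frame time and compare Vsync-off vs G-Sync.
# The FPS figures are the ones quoted above; the script is only illustrative.

def frame_time_ms(fps: float) -> float:
    """Average milliseconds per frame at a given average FPS."""
    return 1000.0 / fps

results = [("BF4", 84, 82), ("Crysis 3", 47, 45)]  # (game, Vsync off FPS, G-Sync FPS)

for game, fps_off, fps_gsync in results:
    delta_ms = frame_time_ms(fps_gsync) - frame_time_ms(fps_off)
    drop_pct = 100.0 * (fps_off - fps_gsync) / fps_off
    print(f"{game}: {frame_time_ms(fps_off):.1f} ms -> {frame_time_ms(fps_gsync):.1f} ms "
          f"(+{delta_ms:.2f} ms/frame, {drop_pct:.1f}% FPS drop)")
```

Worked out, the per-frame deltas come to a few tenths of a millisecond to just under a millisecond, which is much smaller than a full frame at these framerates (12-22 ms), so the overhead-vs-latency distinction matters.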
 

Mand

Senior member
Jan 13, 2014
664
0
0
Even so, they are claiming the vendor must buy the Gsync modules. How do they know that is the case? Maybe Nvidia is building and supplying them the hardware to get Gsync adoption, so they can sell more cards.

In reality, nobody here, nor at AMD, knows.

But it doesn't stop AMD from trying to position itself better in the eyes of consumers by pretending to know.

It's not something they should get a pass on doing. They made a direct, specific claim that their competitor's approach has a real, tangible downside that theirs does not - I expect evidence for it.
 

96Firebird

Diamond Member
Nov 8, 2010
5,709
316
126
What do you think that indicates? I believe I read that AnandTech documented this behavior. Is it a problem with the recording tools, a drop in performance due to slight driver overhead, or a true frame of latency?

You mean why Gsync runs at a lower FPS than Vsync off? My guess is driver overhead, since there was no difference between the two in CS:GO @ 120 FPS, assuming they both ran at 120 FPS (the review doesn't specify, but I'll assume so).

Edit - Hmm, they didn't run Vsync off with an FPS cap @ 120. I was reading the FPS cap @ 300 results.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Even so, they are claiming the vendor must buy the Gsync modules. How do they know that is the case? Maybe Nvidia is building and supplying them the hardware to get Gsync adoption, so they can sell more cards.

In reality, nobody here, nor at AMD, knows.

So you reckon Nvidia is just giving away $200 modules and allowing monitor manufacturers to charge whatever they want for them, in the hope that they sell more GeForces?
 

Mand

Senior member
Jan 13, 2014
664
0
0
What do you think that indicates? I believe I read that AnandTech documented this behavior. Is it a problem with the recording tools, a drop in performance due to slight driver overhead, or a true frame of latency?

These are separate issues.

There is a small framerate drop when G-Sync is active, which has been attributed to the extra work the GPU has to do to poll the monitor. It has been measured at about a 2% drop in framerate.

This is completely different from G-Sync's impact on latency and input lag, which in most test cases has been measured to be on par with vsync off. There are a couple of anomalies, but they involve games running at extremely high framerates with framerate limiters, which could be doing some funky things to the input lag.
 

Mand

Senior member
Jan 13, 2014
664
0
0
So you reckon Nvidia is just giving away $200 modules and allowing monitor manufacturers to charge whatever they want for them, in the hope that they sell more GeForces?

Pretty much, yeah. Note that the retail price of $200 for the DIY kit is not necessarily Nvidia's cost for the module. I mean, there's no way to be certain, but it's at least plausible.
 
Last edited:

96Firebird

Diamond Member
Nov 8, 2010
5,709
316
126
So you reckon Nvidia is just giving away $200 modules and allowing monitor manufacturers to charge whatever they want for them, in the hope that they sell more GeForces?

How do you know they haven't moved on to ASICs from FPGAs?
 

96Firebird

Diamond Member
Nov 8, 2010
5,709
316
126
Because they're having to hand-tune the modules to each panel type a display OEM wants to use. ASICs are not suitable for that.

For R&D on each panel, they can use an FPGA. For production, they can move to an ASIC.
 

NomanA

Member
May 15, 2014
128
31
101
Here are the questions I would ask, if I were able (I made an account, but not sure if it'll get through their approval process in time for the Q&A):

Most of your questions were already answered months ago, in an interview like the one below:

http://www.sweclockers.com/artikel/...ngsfrekvenser-med-project-freesync/2#pagehead

Here's a brief summary for you, taken from the answers in that article.

Project FreeSync will utilize DisplayPort Adaptive-Sync protocols to enable dynamic refresh rates for video playback, gaming and power-saving scenarios. All AMD Radeon graphics cards in the AMD Radeon HD 7000, HD 8000, R7 or R9 Series will support Project FreeSync for video playback and power-saving purposes. The AMD Radeon R9 295X2, 290X, R9 290, R7 260X and R7 260 additionally feature updated display controllers that will support dynamic refresh rates during gaming. AMD APUs codenamed Kaveri, Kabini, Temash, Beema and Mullins also feature the necessary hardware capabilities to enable dynamic refresh rates for video playback, gaming and power-saving purposes. All products must be connected to a display that supports DisplayPort Adaptive-Sync.


There are three key advantages Project FreeSync holds over G-Sync: no licensing fees for adoption, no expensive or proprietary hardware modules, and no communication overhead.


VESA DisplayPort Adaptive-Sync is a new component of the DisplayPort 1.2a specification that allows a graphics card to control the refresh rate of a display over a DisplayPort link. As it seems there is some confusion, I want to emphasize that DisplayPort Adaptive-Sync is not FreeSync. By itself, DisplayPort Adaptive-Sync is a building block that provides a standard framework a source device, e.g. a graphics card, can depend on to execute dynamic refresh rates. [...] Make no mistake, providing dynamic refresh rates to users still takes a lot of 'secret sauce' from the hardware and software ends of our products, including the correct display controllers in the hardware and the right algorithms in AMD Catalyst.

You were confused last time as well, because you thought the AMD rep was talking about displays when he mentioned the complexity of FreeSync. You thought "display controllers" meant that the controllers are in the display, which you should have been able to figure out from the interview context alone, even with no prior knowledge of these terms.

Here's a very basic summary of those quoted comments:

1) Monitors need to support DisplayPort Adaptive-Sync, which is an optional part of the 1.2a spec. The changes needed on these monitors won't be expensive and they won't be proprietary; the changes are laid out in the VESA spec, after all.

2) FreeSync is implemented in AMD hardware (the display controllers of Hawaii and Bonaire are already capable) and drivers. Older Tahiti and Pitcairn GPUs will adjust sync rates only during video playback and not games. Newer APUs also support full FreeSync capabilities.

3) The only requirement on the monitor is that it support VESA Adaptive-Sync. So any question of partnerships should come from a business perspective and not a technology one, since the monitor makers don't need any proprietary tech from AMD to support this feature.

And that's it. None of this is super-complex or confusing.
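If it helps, here is the same support picture expressed as a tiny lookup table (an illustrative Python sketch based only on the AMD statement quoted above; the product names and the capability split come from that quote, nothing else):

```python
# Which Project FreeSync capabilities each GPU gets, per the AMD statement quoted
# above. This is just an illustrative summary table, not an official AMD API.

FULL = {"video playback", "power saving", "gaming"}      # updated display controllers
PARTIAL = {"video playback", "power saving"}              # older display controllers

FREESYNC_SUPPORT = {
    # Dynamic refresh during gaming as well:
    "R9 295X2": FULL, "R9 290X": FULL, "R9 290": FULL,
    "R7 260X": FULL, "R7 260": FULL,
    "Kaveri APU": FULL, "Kabini APU": FULL, "Temash APU": FULL,
    "Beema APU": FULL, "Mullins APU": FULL,
    # Video playback and power saving only:
    "HD 7000 series": PARTIAL, "HD 8000 series": PARTIAL,
}

def supports_freesync_gaming(gpu: str) -> bool:
    """True if the GPU's display controller supports dynamic refresh in games."""
    return "gaming" in FREESYNC_SUPPORT.get(gpu, set())

print(supports_freesync_gaming("R9 290"))          # True
print(supports_freesync_gaming("HD 7000 series"))  # False
```

In every case the display itself still has to support DisplayPort Adaptive-Sync, per the quoted statement.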
 

96Firebird

Diamond Member
Nov 8, 2010
5,709
316
126
Didn't see that you quoted me as a source. I myself used this forum as a source for my analysis; this was posted here by a member:

http://www.google.com/patents/US20080055318

Thanks for the actual patent, though it's too much for me to read through.

However, the references at the end are more interesting to me. A lot of patents by Intel and Nvidia about dynamic refresh rates.
 
Last edited:

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Pretty much, yeah. Note that the retail price of $200 for the DIY kit is not necessarily Nvidia's cost for the module. I mean, there's no way to be certain, but it's at least plausible.

Good point, and there could be even more uncertainty introduced by the monitor manufacturers and even retailers like Newegg, who all love to raise prices on their own whenever there is a new feature that raises demand. That would all be independent of Nvidia's agreement regarding how the module is provided to the monitor manufacturer, but you could see people getting all bent out of shape and trying to blame Nvidia for the higher final price, even if it's just demand-driven (kind of like how everyone blamed AMD for price gouging during the litecoin mining boom, when really it may have been anything, including Newegg and Amazon price gouging).
 

Mand

Senior member
Jan 13, 2014
664
0
0
You were confused last time as well, because you thought the AMD rep was talking about displays when he mentioned the complexity of FreeSync. You thought "display controllers" meant that the controllers are in the display, which you should have been able to figure out from the interview context alone, even with no prior knowledge of these terms.

Actually no, I was not confused at all, and I thought none of that.

Perhaps if you want to start criticizing me, you should at least be clear on what I do and do not think.

The FAQ that AMD put up does not answer my questions. One-by-one:

1) Monitors need to support DisplayPort Adaptive-Sync, which is an optional part of the 1.2a spec. The changes needed on these monitors won't be expensive and they won't be proprietary; the changes are laid out in the VESA spec, after all.
Please provide evidence on why these changes won't be expensive. "Because AMD said so" is insufficient. Hence why I asked why they think it won't be expensive, a question which hasn't been answered.

2) FreeSync is implemented in AMD hardware (the display controllers of Hawaii and Bonaire are already capable) and drivers. Older Tahiti and Pitcairn GPUs will adjust sync rates only during video playback and not games. Newer APUs also support full FreeSync capabilities.
Then why are modifications to the display hardware necessary? AMD said at CES that its goal for FreeSync was to encourage hardware manufacturers to do the necessary development, and even today an AMD representative said that the burden of that development would rest on the display manufacturers, not AMD. So my question is simple: what modifications are required, what is added to a display that makes it FreeSync-capable compared to one that isn't, and what does the AMD GPU have to do in order to make the whole thing come together? None of these questions have been answered.

3) The only requirement on the monitor is that it support VESA Adaptive-Sync. So any question of partnerships should come from a business perspective and not a technology one, since the monitor makers don't need any proprietary tech from AMD to support this feature.
So? Why does needing proprietary tech or not mean that AMD can't comment on what would be required in a display for its feature, FreeSync, to function? AMD is the only one pushing this, yet they are simultaneously washing their hands of any responsibility for actually making it happen. They say they have hardware partners, but won't say who they are. They say they're working with partners, but then say the partners are responsible for the development and AMD isn't. So what work are they doing, exactly? These are unanswered questions.

The only answer I've been able to extract from people so far basically sums up as "because open standards are magic, and everyone loves everyone with open standards." An open standard does not guarantee adoption. An open standard does not guarantee development. An open standard does not mean things show up on your doorstep for free.
 
Last edited:

Abwx

Lifer
Apr 2, 2011
10,847
3,297
136
Thanks for the actual patent, though it's too much for me to read through.

However, the references at the end are more interesting to me. A lot of patents by Intel and Nvidia about dynamic refresh rates.

Those patents at the end are the ones citing ATI's patent as prior work, hence their presence on this page with their filing dates.

A newly filed patent can show that it does the same thing using a different approach and thus is not infringing the cited patent, hence my analysis that a G-Sync module inside the panel was a means to take a different approach and could be used as a proprietary solution.
 
Last edited:

Abwx

Lifer
Apr 2, 2011
10,847
3,297
136
So? Why does needing proprietary tech or not mean that AMD can't comment on what would be required in a display for its feature, FreeSync, to function?

The condition is for the panel to have the same features as eDP. I think that's been clear from the start, and current panels are inherently very easy to upgrade using their existing circuitry, albeit with different firmware.
 

Mand

Senior member
Jan 13, 2014
664
0
0
The condition is for the panel to have the same features as eDP. I think that's been clear from the start, and current panels are inherently very easy to upgrade since they use their existing circuitry, albeit with different firmware.

They have also not shown that eDP is capable of frame-by-frame variable refresh. They have shown that you can use it to change from one static refresh rate to another, but they haven't shown frame-by-frame refresh working, ever. Why not, if it's so easy?
 

96Firebird

Diamond Member
Nov 8, 2010
5,709
316
126
Those patents at the end are the ones citing ATI's patent as prior work, hence their presence on this page with their filing dates.

A newly filed patent can show that it does the same thing using a different approach and thus is not infringing the cited patent, hence my analysis that a G-Sync module inside the panel was a means to take a different approach and could be used as a proprietary solution.

Yeah I know, I was just saying that it looks like this technology has been in the works for a very long time.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I'm fully aware of the nomenclature involved, and I asked my questions very specifically.

And no, I can't see how it can possibly be mandatory. To do that, they would have had to get every display manufacturer and every TCON manufacturer on board with it. There's been precisely zero indication that they've done that, and a few rather strong pieces of evidence that the hardware vendors are taking a "wait and see" approach rather than a "we're all on board immediately" approach. It can't be mandatory in that case.

They only have to get VESA on board. Saying they have to get every display manufacturer on board is FUD.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Even so, they are claiming the vendor must buy the Gsync modules. How do they know that is the case? Maybe Nvidia is building and supplying them the hardware to get Gsync adoption, so they can sell more cards.

In reality, nobody here, nor at AMD, knows.

Yes, that makes exactly 0 sense.

NV is giving those expensive boards to the display manufacturers for free to spread G-Sync adoption. Meanwhile the pesky little b-s-tards sell G-Sync-equipped displays at a huge premium. All while NV itself makes sure adoption is skyrocketing by asking $200+ for DIY kits.

The truth is, NV makes sure they make a whole bunch of $ on each G-Sync module sold. They'll suck every last drop of the sweet fanboy milk they can before FreeSync hits.

If anything, people should be grateful AMD is spreading the word that a free G-Sync alternative is coming. They are saving people money, and if someone still buys G-Sync, they're at least helping them make an informed decision.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I just want to clarify what vendor lock-in is:
http://en.wikipedia.org/wiki/Vendor_lock-in



I think many of the lock-in arguments depend on whether you consider the lock-in as coming from the vendor, vs. as coming from the market.

For example, the argument quoted above relies on the current situation of not being able to obtain the feature from anyone else except AMD. However, I'm not sure the current situation satisfies the definition of "making" the consumer dependent (i.e., forcing them).

But will that remain forever? Is there something here that is preventing other vendors from using the tech - can AMD somehow block Intel or NVidia from using it?

My feeling is that lock-in turns on whether others are blocked from using something. But I can see Ocre's point that Intel and Nvidia don't currently support this open standard, so you have a "de facto" lock-in.

It would require speculation to say that an open standard would be used by Intel or Nvidia, such that the consumer would have a choice.

But maybe I'm missing something - is there something blocking them from using it, or is it just the current conditions where they aren't using it and nothing is stopping them?

Also, I'd like to add that being first to market is in no way vendor lock-in. The simple fact that no one else makes it yet doesn't fit the definition. People trying to say that (not you) are again spreading FUD.
 

Abwx

Lifer
Apr 2, 2011
10,847
3,297
136
They have also not shown that eDP is capable of frame-by-frame variable refresh. They have shown that you can use it to change from one static refresh rate to another, but they haven't shown frame-by-frame refresh working, ever. Why not, if it's so easy?

Because they do not manufacture monitors and have to wait for the monitor manufacturers to update their firmware. Adaptive sync is not a major release and not everyone needs it, so they'll do the updates within their normal product release cycles, which are often 6-12 months between two product refreshes. AMD didn't give this timeline randomly; they know how the industry works.

Technically speaking it depends on the display ASIC, but to summarize roughly: this chip is currently a master device, while under the standard called Adaptive-Sync it will be either a master or a slave device, depending on the source device's capabilities.

For the feature that interests us, it's just a question of the communication protocol between this chip and the GPU, with the latter taking control of parameters that are otherwise set to default values by the display chip (each manufacturer implementing what it thinks are the best defaults). That's not the case with eDP, where the GPU already has control of all those parameters, and implementation is in that case just a driver issue.
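To make that concrete, here is a rough conceptual sketch (my own illustration, not driver code or anything AMD has published) of what the GPU acting as master means for variable refresh: the source decides when the next scanout starts, bounded by the panel's advertised minimum and maximum refresh rates:

```python
# Conceptual model of source-driven variable refresh: the GPU picks the interval
# before the next scanout, within the refresh window the panel advertises over
# DisplayPort. Illustration only; real implementations live in the driver and
# the display controller.

PANEL_MIN_HZ = 40     # hypothetical panel limits
PANEL_MAX_HZ = 144

MAX_INTERVAL_MS = 1000.0 / PANEL_MIN_HZ  # longest the panel can wait for a frame
MIN_INTERVAL_MS = 1000.0 / PANEL_MAX_HZ  # shortest interval the panel accepts

def scanout_interval(render_time_ms: float) -> float:
    """Interval the source chooses before starting the next scanout."""
    if render_time_ms < MIN_INTERVAL_MS:
        return MIN_INTERVAL_MS    # frame ready too soon: wait until the panel can refresh
    if render_time_ms > MAX_INTERVAL_MS:
        return MAX_INTERVAL_MS    # frame late: the previous frame gets scanned out again
    return render_time_ms         # otherwise scan out as soon as the frame is ready

for rt in (5.0, 12.5, 22.0, 40.0):   # simulated per-frame render times (ms)
    print(f"render {rt:5.1f} ms -> next scanout after {scanout_interval(rt):5.1f} ms")
```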
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I gotta say I still find it baffling that back in January they stated no new hardware and no new monitors were needed; it was as simple as a firmware update. Wonder what's up with that. Looks like you need new hardware (only the 260X and 290/X support it) and a new monitor. Interesting.

You need to reread what they said. They said that there should be some monitors in the market that could be made to run it with a firmware update. Never did they say, "no new hardware and no new monitors". More FUD.
 

Mand

Senior member
Jan 13, 2014
664
0
0
You need to reread what they said. They said that there should be some monitors in the market that could be made to run it with a firmware update. Never did they say, "no new hardware and no new monitors". More FUD.

"no new hardware" is exactly what multiple tech reporters walked away from their demo thinking that they said.