AMD's FreeSync and VESA A-Sync discussion

Also I'd like to add that being first to market is in no way a vendor lock-in. The fact that no one else makes it yet simply doesn't fit the definition. People trying to say that (not you) are again spreading FUD.

Being first and last to market is certainly a lock-in.
 
Actually no, I was not confused at all, and I thought none of that.

Perhaps if you want to start criticizing me, you should at least be clear on what I do and do not think.

The FAQ that AMD put up does not answer my questions. One-by-one:

Please provide evidence for why these changes won't be expensive. "Because AMD said so" is insufficient. Hence my question about why they think it won't be expensive, which hasn't been answered.

Then why are modifications to the display hardware necessary? AMD said at CES that its goal for FreeSync was to encourage hardware manufacturers to do the necessary development, and even today an AMD representative said that the burden of that development would rest on the display manufacturers, not AMD. So, my question is simple: what modifications are required, what is added to a display that makes it FreeSync capable as compared to one that isn't, and what does the AMD GPU have to do in order to make the whole thing come together? None of these questions have been answered.

So? Why does needing proprietary tech or not mean that AMD can't comment on what would be required in a display for its feature, FreeSync, to function? AMD is the only one pushing this, yet simultaneously washing their hands of any responsibility for actually having it happen. They say they have hardware partners, but won't say who they are. They say they're working with partners, but then say the partners are responsible for the development and AMD isn't. So what work are they doing, exactly? These are unanswered questions.

The only answer I've been able to extract from people so far basically sums up as "because open standards are magic, and everyone loves everyone with open standards." An open standard does not guarantee adoption. An open standard does not guarantee development. An open standard does not mean things show up on your doorstep for free.

Most of your questions have been answered. If you want to know what needs to change in the display hardware you should have read/understood the white paper I posted from TI several months ago.

It's obvious you've never worked on product development projects in any capacity. AMD can't comment on products that aren't theirs until they are finalized. The product marketing teams for the various display manufacturers are the ones that release the information for their company's products. Hint: that is the last customer facing step in the whole process.
 
Technology wise the hardware has not been shown to exist, no. They have not demonstrated frame-by-frame variable refresh using eDP or eDP-like configurations. Ever. Nobody has done that. So, if you think the hardware exists, please point me toward evidence of that.

Educated guess. Nvidia is incredibly unlikely to pass up G-Sync to support A-Sync, and if A-Sync can't work with your Geforce cards, then so much for not being vendor locked.

And Despoiler, if AMD can't comment on tech until it's finalized, why have they spent the last eight months commenting on the tech? They've been talking up their end hugely, while smearing their competition's product with false claims. That's not something I can get behind; why is it acceptable to you?
 
Very good points and very good questions. Well done.
 
Well, at least we know from that Q&A why there is so little technical information: AMD really isn't involved in it much at all. They all but confirmed it's all with the monitor manufacturers and they aren't privy to the details of what, when, or how. That is interesting, but it also means we might never get good details on it; instead only the reviewer testing will be done, without the details of how it differs.
 
Technology wise the hardware has not been shown to exist, no. They have not demonstrated frame-by-frame variable refresh using eDP or eDP-like configurations. Ever. Nobody has done that. So, if you think the hardware exists, please point me toward evidence of that.

Educated guess. Nvidia is incredibly unlikely to pass up G-Sync to support A-Sync, and if A-Sync can't work with your Geforce cards, then so much for not being vendor locked.

And Despoiler, if AMD can't comment on tech until it's finalized, why have they spent the last eight months commenting on the tech? They've been talking up their end hugely, while smearing their competition's product with false claims. That's not something I can get behind; why is it acceptable to you?

You make it sound like varying refreshes for the sake of it when you put it like that. We saw much reduced stutter and no tearing, the benefits of variable refresh. Now you like to think they v-synced at some arbitrary number of hertz, and I like to think vblank was manipulated somehow. Let's wait and see, shall we?
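To make the vblank idea concrete, here's a toy calculation (all numbers are illustrative assumptions, not from AMD's demo): with a fixed pixel clock and line timing, the refresh interval is just the total line count per frame times the time per line, so a scaler can slow the refresh down by padding extra blanking lines.

```python
# Toy model of vblank manipulation (illustrative numbers, not AMD's demo).
# With a fixed pixel clock, the refresh interval is the total number of
# lines per frame (active + blanking) times the time per line, so padding
# extra blanking lines stretches the time between refreshes.

ACTIVE_LINES = 1080        # visible lines in a 1080p frame
LINE_TIME_US = 14.8        # microseconds per scanline (assumed value)

def refresh_hz(vblank_lines):
    """Effective refresh rate for a given vertical blanking length."""
    frame_time_us = (ACTIVE_LINES + vblank_lines) * LINE_TIME_US
    return 1_000_000 / frame_time_us

for vblank in (45, 300, 600):
    print(f"{vblank:4d} blanking lines -> {refresh_hz(vblank):5.1f} Hz")
```

With these assumed timings, 45 blanking lines lands near a conventional 60 Hz, and growing the blanking interval pushes the effective rate down toward 40 Hz, which is the mechanism the vblank theory implies.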

I'll paraphrase a bit here: if Adaptive-Sync is as good as or better than our G-Sync, we'll revisit the situation, but for now we're committed to G-Sync. Tom Petersen's words there.
AMD have been commenting on FreeSync; the implementation of VESA Adaptive-Sync into monitor tech is really not something they're at liberty to talk about even if they wanted to. When the monitor partners are ready to talk about their products, they will.
 
To get more accurate information, one would need access to the VESA specification documents, which are available only to its members. In the meantime we are left guessing from the engineers' public statements, which are necessarily vague to anyone who doesn't have access to the information they do.
 
Well then, does anyone else think it might be a good idea to postpone any further FreeSync/A-Sync discussions until AMD finally launches?
 
If someone has valuable information or good insight, it would be worth listening to their arguments. Actually, the debate should concentrate on the features themselves, since to this day no one has published detailed specs available to the general public, and that includes all contenders in this sync mania. So there's no need to postpone anything, but it would certainly be nice not to end up going in circles with perpetually the same questions and, inevitably, the same answers.
 
You make it sound like varying refreshes for the sake of it when you put it like that. We saw much reduced stutter and no tearing, the benefits of variable refresh. Now you like to think they v-synced at some arbitrary number of hertz, and I like to think vblank was manipulated somehow. Let's wait and see, shall we?

I'll paraphrase a bit here: if Adaptive-Sync is as good as or better than our G-Sync, we'll revisit the situation, but for now we're committed to G-Sync. Tom Petersen's words there.
AMD have been commenting on FreeSync; the implementation of VESA Adaptive-Sync into monitor tech is really not something they're at liberty to talk about even if they wanted to. When the monitor partners are ready to talk about their products, they will.

AMD showed the benefits of v-sync without missed frames, they didn't show the benefits variable refresh has over v-sync, and they haven't let anyone peek behind the curtain to see how the demos were setting refresh rate.
 
Then why are modifications to the display hardware necessary?

Because monitors have to support the Adaptive-Sync part of the DisplayPort 1.2a spec. Why is this still a question?

The remaining couple of your questions are irrelevant. What's the BOM cost of the changes needed in displays to support Adaptive-Sync? Who knows, but at the most basic level, all it means is that the monitor has an external timing reference for the blanking interval in addition to an internal (and configurable) source, and that the minimum and maximum of these update times are negotiated. It truly doesn't sound too complex. The internal reference is always there. You are just opening a path to an external reference, plus some software updates for the information handshake.
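The negotiation described above can be sketched as a simple clamp (all numbers and names below are hypothetical, not from the spec): the display advertises the shortest and longest frame times it can hold, and each frame's requested timing is pinned into that window.

```python
# Minimal sketch of the min/max handshake described above. All numbers
# and class/method names are hypothetical, not taken from the VESA spec.
# The display advertises the range of frame times it can hold; the source
# asks for whatever the render time was, and the request is clamped into
# the supported window.

class Display:
    def __init__(self, min_hz, max_hz):
        # Advertised capabilities, e.g. read once at link setup.
        self.min_frame_ms = 1000.0 / max_hz   # shortest allowed frame
        self.max_frame_ms = 1000.0 / min_hz   # longest allowed frame

    def present(self, requested_ms):
        """Return the frame time the display will actually use."""
        return min(max(requested_ms, self.min_frame_ms), self.max_frame_ms)

panel = Display(min_hz=40, max_hz=144)        # hypothetical 40-144 Hz panel
for render_ms in (5.0, 16.7, 30.0):           # ~200, ~60, ~33 fps frames
    print(f"render {render_ms:5.1f} ms -> shown for {panel.present(render_ms):5.2f} ms")
```

A 5 ms frame gets held to the panel's fastest allowed frame time, a 30 ms frame gets refreshed again at the panel's slowest, and anything in between is shown for exactly as long as it took to render.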

And then you ask about how precisely AMD is controlling the blank interval and how often. More details will surely come around the time of feature availability. Why would any company put out a functional spec level of detail for a feature in development? They have given a general sense of how this feature will work. If you don't believe it, then you can wait till you know more, instead of branding everything as lies.

And regarding display-controllers you did misinterpret that very same article I linked. I remember it because it was one of the most astonishingly naive comments I have ever read in a tech forum, and even more unfortunate was that you used this lack of understanding to go about claiming that AMD are lying.
 
You make it sound like varying refreshes for the sake of it when you put it like that. We saw much reduced stutter and no tearing, the benefits of variable refresh. Now you like to think they v-synced at some arbitrary number of hertz, and I like to think vblank was manipulated somehow. Let's wait and see, shall we?

We saw that if you change from one fixed refresh rate to a different fixed refresh rate that matches the FPS of your output, you see reduced stutter and no tearing, yes.

That's a far cry from being able to do that on every single individual frame, when you don't know how long it will be until the GPU is finished rendering it. That is variable refresh, that is what G-Sync does, and that is what FreeSync must be able to do.

It's not that I "like to think they v-synced some arbitrary # hertz" - that's what their demo showed them doing. Yes, they were doing it by manipulating vblank - but they still changed from one static refresh rate to another static refresh rate.
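One way to see the distinction is a small simulation (all numbers made up): with v-sync at a fixed rate, a finished frame sits waiting for the next scheduled refresh tick, while per-frame variable refresh lets the display fire as soon as the frame is ready, so the wait essentially disappears.

```python
# Toy comparison of fixed refresh vs per-frame variable refresh. All
# render times and panel limits are hypothetical. For each frame we
# measure how long the finished frame waits before it can be shown.

import math

render_ms = [14.0, 22.0, 17.5, 25.0, 16.0, 19.5]   # varying render times

def wait_fixed(r, refresh=16.7):
    # V-sync at a fixed 60 Hz: the frame waits for the next refresh tick.
    return math.ceil(r / refresh) * refresh - r

def wait_variable(r, min_frame=6.9):
    # Per-frame variable refresh: the display fires as soon as the frame
    # is ready, subject only to the panel's fastest allowed frame time.
    return max(0.0, min_frame - r)

avg = lambda xs: sum(xs) / len(xs)
print(f"fixed 60 Hz : avg wait {avg([wait_fixed(r) for r in render_ms]):.2f} ms")
print(f"variable    : avg wait {avg([wait_variable(r) for r in render_ms]):.2f} ms")
```

In this sketch the fixed-rate path accumulates several milliseconds of wait per frame whenever render time doesn't divide evenly into the refresh interval (that wait is the stutter), while the variable path shows every frame immediately because each render time is above the panel's minimum frame time.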
 
When topics such as this go round and round with no concrete matter to speak of, I'd agree with you. There are far more worthy topics to discuss than this constant round robin. It all comes back to zilch.
 
Well then, does anyone else think it might be a good idea to postpone any further FreeSync/A-Sync discussions until AMD finally launches?

They don't really need to launch; they just need to provide something more than promises that they have something comparable that can compete, and a relatively dull, pretty much hands-off demo as "proof".

It was last October when we really got to see and hear about what G-Sync was, and we knew more about that product back then than we know about Freesync now, even after all these months.

So yeah, an actual launch of the product wouldn't be necessary. A press event similar to the one nVidia held, where AMD can show off some of the technology behind FreeSync, have working units that people can actually get their hands on and play around with themselves, and even get notable developers to see it, mess with it, and speak about it (getting someone like Carmack to speak so highly of G-Sync was pretty big for nVidia), would be enough to get excited about AMD's efforts and give us something worthwhile to talk about.
 
Good thing tomorrow isn't that far away then. I'd guess we'll hear more about FreeSync/A-Sync along with a bunch of other very interesting info.
 
Well, at least we know from that Q&A why there is so little technical information: AMD really isn't involved in it much at all. They all but confirmed it's all with the monitor manufacturers and they aren't privy to the details of what, when, or how. That is interesting, but it also means we might never get good details on it; instead only the reviewer testing will be done, without the details of how it differs.

The problem might be that you need to be a VESA member to have access to the standard. They might not be free to just release all of the info.

When topics such as this go round and round with no concrete matter to speak of, I'd agree with you. There are far more worthy topics to discuss than this constant round robin. It all comes back to zilch.

People are not forced to comment.
 
Indeed, as that would require it to have been intended that the competition not be allowed to offer the technology, which is not the case.

Exactly. If nobody chooses to pick it up or if nobody is capable of picking it up it's not even remotely the same thing as a vendor lockout. A vendor lockout is what is done when nVidia detects an AMD GPU as the primary render device and actively shuts down their hardware so it can't do PhysX calculations.
 
Well, that goes even further than what would be considered a traditional vendor lockout, as it would not even be running on the AMD GPU. Unlike BAA, where the in-game AA does run on AMD GPUs but NV locked AMD GPUs out of using it, even though it was enabled for AMD GPUs in the beta demo and was eventually enabled for AMD GPUs in the GOTY version.
 