AMD's FreeSync and VESA A-Sync discussion


96Firebird

Diamond Member
Nov 8, 2010
5,711
316
126
But maybe I'm missing something - is there something actually blocking them from using it, or is it just that they currently aren't using it even though nothing is stopping them?

because using the display engine to control a variable rate has already been patented years ago by ATI and parts of this patent are already used in eDP...

Source

How true that is, I have no idea. Does that patent prohibit Nvidia from taking advantage of Adaptive-sync technology? If so, I would call that a vendor lock-out.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Regardless of whether AMD is imposing a lock, Nvidia has already said very clearly that they think G-Sync is a superior product and don't plan to support A-Sync.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76

I think that is just AMD throwing more mud than anything else. AMD is basically saying they have power-saving technology (PSR and transparent refresh rate adjustment) in their GPUs and Nvidia doesn't appear to, so Nvidia has to implement it differently. Intel also has PSR and dynamic refresh rate adjustment in its GPUs, although whether it's capable of adaptive sync I don't know, and they haven't said it is.

So if Nvidia doesn't have hardware support for these two technologies, then it's more dependent on the monitor taking over some of the responsibility. AMD hasn't actually said how Freesync works, but I think this is basically how the two technically differ:

Gsync - When you are below 30 Hz, the monitor keeps a copy of the last image it displayed in its buffer memory; it detects that the image has been on screen too long and redraws the image stored in the buffer (basically PSR). When the GPU is ready to send a frame, it queries the monitor to ask whether it can accept the next one; if the monitor is ready it says yes, otherwise no. Thus at all times the monitor is in charge of what it renders and when. This is why the Gsync module is necessary: it implements the monitor side of this equation, including PSR and the query mechanism that desktop monitors don't have.

Freesync - AMD's cards ask the monitor for its minimum and maximum refresh rates on connection. If the frame rate drops below the minimum, then once a frame has been on screen for its maximum allowed time the GPU resends the same image to the monitor; PSR is thus implemented in the GPU. The monitor just renders whatever it's sent and runs at the refresh rate defined by the GPU. Assuming the monitor can cope with variable-length vblank intervals, this allows variable refresh rate updates; the minimum frequency is specified in the initial handshake, and the GPU ensures it never sends frames slower than that. The monitor itself does nothing to ensure that the image is not decaying.

I think this is how they effectively differ, based on how I have pieced together the tidbits of information we have on Freesync (for Gsync I am very clear that this is how it works). Presumably maintaining that image and redrawing a previous buffer would require a hardware change on Nvidia's side, which is why they did it the way they did. As I understand it, PSR on laptops usually works more like the Nvidia solution than the AMD one, although I never saw anything about querying whether the display can accept a new frame. Some confirmation from AMD on how this works would be nice.
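
To make the contrast concrete, here is a rough, purely illustrative Python sketch of the two control loops as I've described them above. The class, method names, and refresh range are all made up; neither vendor has published this level of detail, so treat it as a reading of the rumours rather than how either product actually works:

```python
import time

class FakePanel:
    """Stand-in for the display side; the refresh range and method names are invented."""
    def __init__(self, min_hz=30, max_hz=144):
        self.min_hz, self.max_hz = min_hz, max_hz
        self.last_frame = None

    def ready_for_frame(self):
        # A G-Sync-style module would answer "no" while it is still redrawing
        # the stored image from its own buffer (PSR in the monitor).
        return True

    def show(self, frame):
        self.last_frame = frame
        print("panel displays:", frame)

def gsync_style(panel, frames):
    """Monitor-driven model: the GPU queries the module before every frame."""
    for frame in frames:
        while not panel.ready_for_frame():   # the monitor decides when it can accept a frame
            time.sleep(0.0005)
        panel.show(frame)

def freesync_style(panel, render_times):
    """GPU-driven model: the GPU reads the panel's refresh range once, then
    resends the previous frame itself whenever rendering misses the minimum rate."""
    max_hold = 1.0 / panel.min_hz            # longest a single frame may stay on screen
    for t in render_times:                   # seconds each frame took to render
        repeats, remainder = divmod(t, max_hold)
        for _ in range(int(repeats)):
            panel.show(panel.last_frame)     # PSR implemented on the GPU side
        panel.show(f"new frame after {remainder * 1000:.1f} ms")

panel = FakePanel()
gsync_style(panel, ["frame A", "frame B"])
freesync_style(panel, [1 / 60, 0.050])       # a 60 fps frame, then a slow 50 ms frame
```

The only real difference between the two functions is who repeats a stale frame: the monitor module from its own buffer in the first, the GPU in the second.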

I don't know what advantages, if any, Nvidia gains from their solution in comparison.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I gotta say, I still find it baffling that back in January they stated no new hardware and no new monitors were needed; it was as simple as a firmware update. Wonder what's up with that. Looks like you need new hardware (only the 260X and 290/X support it) and a new monitor. Interesting.
 

96Firebird

Diamond Member
Nov 8, 2010
5,711
316
126
I don't think AMD said it directly, but all the news articles about AMD's demo at CES 2014 say they believe it can all be done with no new hardware. You can blame the news sites for hyping up something that now looks to be false.
 

Mand

Senior member
Jan 13, 2014
664
0
0
I'd love to see you pull up an actual quote for this...

http://www.pcper.com/reviews/Graphi...h-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync

Finally, as a last minute stirring of the pot, I received an email from AMD's Koduri that indicated that there might be some monitors already on the market that could support variable refresh rate TODAY with just a firmware update.
http://www.techpowerup.com/196557/amd-responds-to-nvidia-g-sync-with-freesync.html

For display makers, supporting the technology won't require buying licenses, or integrating specialized hardware into the displays.

According to AMD's Raja Koduri, the display controllers inside NVIDIA GPUs don't support dynamic refresh rates the way AMD's do, and hence NVIDIA had to deploy external hardware.

That was shortly after the CES demo in January. AMD was quite clear about "no new hardware."
 

Hitman928

Diamond Member
Apr 15, 2012
5,262
7,890
136
I don't think AMD said it directly, but all the news articles about AMD's demo at CES 2014 say they believe it can all be done with no new hardware. You can blame the news sites for hyping up something that now looks to be false.

Pretty much.


So where are the quotes supporting what blackened said? As far as the hardware goes, read your own link again:

"won't require buying licenses, or integrating specialized hardware into the displays."

Pretty clear to me what they meant, and as far as we know and have seen, this is true.
 

Despoiler

Golden Member
Nov 10, 2007
1,966
770
136

Might being the operative word. Blackened's statement is an absolute; not the same thing. AMD proved it was so in their second demo, though. They used off-the-shelf Korean monitors. There was a forum user, either here or on OC, who pulled apart their Korean monitor to show it was using an eDP connector to the panel. I also proved that as long as the TCON was programmable it would only need a firmware update, because the physical input and output specs were not changed when the DP 1.2a revision was ratified.
 

Mand

Senior member
Jan 13, 2014
664
0
0
So, more "we can't answer that" than anything else, disappointing. One thing I did note however, AMD claims that display manufacturers must buy G-Sync modules from Nvidia.

I'd like to see AMD's evidence for that claim. How do they know what the monitor manufacturers are and are not paying for? I can't possibly imagine that either Nvidia or a monitor manufacturer would give AMD information on contracts that don't involve AMD. That's just not reasonable business practice.

From reports from monitor manufacturers, Nvidia has been giving them a lot of help in order to get G-Sync going, in the form of Nvidia engineers working on their individual monitors. Why should we assume there's a licensing fee involved? What is the evidence behind AMD's claim?

In short, where's the beef?
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
AMD claims in their FreeSync FAQ that G-Sync adds latency, which isn't the case; it states that the GPU polls or waits on the monitor. Pretty funny that they've got an entire smear campaign going for a product that isn't remotely close to being on store shelves (2015), and they have an outright false claim in their FAQ.

Thracks was asked in that Q&A if he thought Tom Petersen was lying about this, and he was strangely quiet. No answer. But he (Thracks) authored the FAQ. Stay classy, AMD, I guess.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
One thing I did note, however: AMD claims that display manufacturers must buy G-Sync modules from Nvidia.

I'd like to see AMD's evidence for that claim. How do they know what the monitor manufacturers are and are not paying for?

That does seem to be an assumption, but maybe it's the most reasonable one absent more facts. Maybe Nvidia pays monitor manufacturers to use the modules, or offers them for free, sort of like the razor-blade sales model where you give away the initial shaving kit and make your profit long term on repeat sales of the blades.
 

Mand

Senior member
Jan 13, 2014
664
0
0
That does seem to be an assumption, but maybe it's the most reasonable one absent more facts. Maybe Nvidia pays monitor manufacturers to use the modules, or offers them for free, sort of like the razor-blade sales model where you give away the initial shaving kit and make your profit long term on repeat sales of the blades.

It's entirely plausible that Nvidia has another structure set up. Nvidia is not in the display scaler replacement business, they're in the GPU business. It's entirely plausible to me that they would spend their own money getting G-Sync in as many displays as they possibly can in order to drive Geforce sales.

It makes no sense from their perspective to restrict the proliferation of G-Sync, and that's exactly what a license fee would do.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Holy cow that chat on the other website was not very enlightening. Pretty much what I learned:

1) Much of the key facts are not mine to share....

2) Other things are protected under NDA. I'm legally obligated to protect the information.

I feel it was a waste of time.
 

Atreidin

Senior member
Mar 31, 2011
464
27
86
So, more "we can't answer that" than anything else, disappointing. One thing I did note however, AMD claims that display manufacturers must buy G-Sync modules from Nvidia.

I'd like to see AMD's evidence for that claim. How do they know what the monitor manufacturers are and are not paying for? I can't possibly imagine that either Nvidia or a monitor manufacturer would give AMD information on contracts that don't involve AMD. That's just not reasonable business practice.

From reports from monitor manufacturers, Nvidia has been giving them a lot of help in order to get G-Sync going, in the form of Nvidia engineers working on their individual monitors. Why should we assume there's a licensing fee involved? What is the evidence behind AMD's claim?

In short, where's the beef?


We don't charge any licensing fees or material costs to the monitor manufacturer. In these categories, there are no costs to pass on to the consumer. There is no hardware the vendor must buy from AMD, as vendors must do with the NVIDIA Gsync module. It remains to be seen how monitor vendors will price the extra validation/QA work that's required of any dynamic refresh display. Likewise, dynamic refresh-aware panels are more costly than your everyday 60Hz panel--I wrote about this in our FAQ.

Nowhere in that is there any claim or assumption that Nvidia is charging license fees. I don't know where you are getting the "assumption of a license fee" from. They are talking about their own costs and fees.

Also, WHO CARES if the monitor manufacturers are buying modules from Nvidia directly or not? That's the most nitpicky thing you could possibly snag onto. In every single thread about Freesync you interpret everything in the worst way with selective reading and purposely manipulate the conversation to put AMD in as negative a light as possible.
 

96Firebird

Diamond Member
Nov 8, 2010
5,711
316
126
I think a valid question is: how long is a frame of latency with varying refresh rates?

This is more a discussion for the thread dedicated to the monitor, but...

It's in the review. For BF4, Vsync off ran at 84 FPS (1000/84 = 11.9 ms/frame) and Gsync ran at 82 FPS (1000/82 = 12.2 ms/frame). For Crysis 3, Vsync off ran at 47 FPS (1000/47 = 21.3 ms/frame) and Gsync ran at 45 FPS (1000/45 = 22.2 ms/frame).
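
If anyone wants to reproduce those numbers, it's just the standard conversion from average frame rate to frame time (the FPS figures below are the ones quoted from the review):

```python
# Convert an average frame rate (frames per second) into frame time in milliseconds.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for label, fps in [("BF4, Vsync off", 84), ("BF4, Gsync", 82),
                   ("Crysis 3, Vsync off", 47), ("Crysis 3, Gsync", 45)]:
    print(f"{label}: {frame_time_ms(fps):.1f} ms/frame")
```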
 

96Firebird

Diamond Member
Nov 8, 2010
5,711
316
126
Also WHO CARES if the monitor manufacturers are buying modules from Nvidia directly or not?

Ask the guys at AMD who claimed the monitor manufactures do, they brought it up.

Do you think they should back up their claims with facts and sources?
 

96Firebird

Diamond Member
Nov 8, 2010
5,711
316
126
Ok, so they don't need to buy hardware from AMD for Freesync. Instead, they have to buy it from somewhere else.

See where this is going?
 

Atreidin

Senior member
Mar 31, 2011
464
27
86
Ask the guys at AMD who claimed the monitor manufactures do, they brought it up.

Do you think they should back up their claims with facts and sources?

Honestly I just saw it as a comment that meant "it had to be bought from someone", and it wasn't worded by a lawyer, so they guy mentioned Nvidia. Going off on a tangent on whether Nvidia is the sole supplier is pointless.