VESA Adopts Adaptive-Sync


Midwayman

Diamond Member
Jan 28, 2000
5,723
325
126
We don't know why they're only supporting it with GCN 1.1. It could be hardware, or it could be that they aren't interested in investing in backwards compatibility. It's a new standard, after all; it would be nice if older hardware supported it, but you can't really expect it. G-Sync is a different matter. That's a paid exclusive feature, so you'd expect more support.

They don't really seem to be in a hurry to get GCN 1.0 optimizations into Mantle either. I guess it's a business decision where they decided to say 'screw 'em' to all the legacy hardware.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Actually, there is. It's G-Sync. It uses the VBLANK for "frequency modulation".

And it doesn't use eDP, which is what I was specifically referring to. The demo of FreeSync at CES did not show what people apparently think it showed, and the goal was to rectify that misunderstanding.

I know full well that G-Sync uses VBLANK manipulation, which makes it all the more bizarre that people seem to think that FreeSync/Adaptive Sync will be this miracle solution that won't require the sort of hardware G-Sync uses, and will be free and full of rainbows and happiness and cupcakes.
 

Shivansps

Diamond Member
Sep 11, 2013
3,916
1,570
136
Again, Adaptive-Sync is just VESA spec support, for monitor hardware and cables, for changing VBLANK; FreeSync is an implementation of that, nothing more. I don't see why it's a good thing that AMD didn't launch an FPGA-based FreeSync; actually, that would have been better, because there would have been a price fight.

So right now we are waiting for an ASIC that has Adaptive-Sync support, and no idea who's going to make it. Believing that's going to kill G-Sync, and offer a "free" alternative? Reality check, people. Also, G-Sync is not going to die; it's going to adapt to use Adaptive-Sync.
 
Last edited:

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
They don't really seem to be in a hurry to get GCN 1.0 optimizations into Mantle either. I guess it's a business decision where they decided to say 'screw 'em' to all the legacy hardware.

Moving goalposts. This has nothing to do with Mantle, but the last I heard, we have people on these boards with GCN 1.0 cards seeing the same kind of performance benefits as GCN 1.1 owners.

nVidia came up with a DisplayPort function they decided to exploit to make a buck. If people want to buy into it, they can. Just realize a lot of people aren't that gullible.
 

dacostafilipe

Senior member
Oct 10, 2013
797
297
136
Also, G-Sync is not going to die; it's going to adapt to use Adaptive-Sync.

+1 on this one!

If (we don't know yet) A-Sync provides the same benefits as the current G-Sync implementation, then it makes sense for nVidia to use A-Sync, but branded as G-Sync.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I have started asking around to see if I can get my hands on the VESA spec. Along with that ask, I have found out the demo in January from AMD was based on PSR with "waits" for vblank. The implication being it wasn't using adaptive refresh rate, or Adaptive-Sync as VESA is now calling it. Let's hope VESA understands the need not only to say they added something to the spec, but to say what it was they added!
 

dacostafilipe

Senior member
Oct 10, 2013
797
297
136
... was based on PSR with "waits" for vblank. The implication being it wasn't using adaptive refresh rate, or Adaptive-Sync as VESA is now calling it ...

Well, as far as I know, A-Sync is based on VBLANK. If that's the case, then yes, the FreeSync demo was "using" VESA's Adaptive-Sync.

btw : nVidia's G-Sync also uses VBLANK.

In both cases, the "redraw" frequency of the logic board does not change. They "just" add a delay to lower the "refresh rate". Example:

If you have a 120Hz screen, the controller will grab what's available in the buffer every ~8.3ms. If you add a delay of ~8.3ms to all frames, the controller will redraw the LCD every ~16.6ms. That's exactly the same refresh rate as a 60Hz screen. If you add a ~1.7ms delay, you will have a 100Hz refresh rate.

Visually, there is no difference between a real 100Hz screen and a 120Hz screen with a ~1.7ms delay.

This method is a lot cheaper to implement and should also be faster.
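To make the arithmetic concrete, here is a minimal sketch in Python (the `effective_refresh` helper is just mine for illustration, not any real API):

```python
def effective_refresh(base_hz: float, added_delay_ms: float) -> float:
    """Effective refresh rate when a fixed delay is added to every
    frame of a fixed-rate panel, as in the scheme described above."""
    base_period_ms = 1000.0 / base_hz  # ~8.33 ms at 120Hz
    return 1000.0 / (base_period_ms + added_delay_ms)

print(effective_refresh(120, 8.33))  # ~60 Hz  (8.33ms + 8.33ms = ~16.6ms)
print(effective_refresh(120, 1.67))  # ~100 Hz (8.33ms + 1.67ms = ~10ms)
```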

PS: On eDP, you can interrupt the actual VBLANK and force a redraw.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Well, as far as I know, A-Sync is based on VBLANK. If that's the case, then yes, the FreeSync demo was "using" VESA's Adaptive-Sync.

btw : nVidia's G-Sync also uses VBLANK.

In both cases, the "redraw" frequency of the logic board does not change. They "just" add a delay to lower the "refresh rate". Example:

If you have a 120Hz screen, the controller will grab what's available in the buffer every ~8.3ms. If you add a delay of ~8.3ms to all frames, the controller will redraw the LCD every ~16.6ms. That's exactly the same refresh rate as a 60Hz screen. If you add a ~1.7ms delay, you will have a 100Hz refresh rate.

Visually, there is no difference between a real 100Hz screen and a 120Hz screen with a ~1.7ms delay.

This method is a lot cheaper to implement and should also be faster.

PS: On eDP, you can interrupt the actual VBLANK and force a redraw.

That would result in additional lag.
 

Mand

Senior member
Jan 13, 2014
664
0
0
I have started asking around to see if I can get my hands on the VESA spec. Along with that ask, I have found out the demo in January from AMD was based on PSR with "waits" for vblank. The implication being it wasn't using adaptive refresh rate, or Adaptive-Sync as VESA is now calling it. Let's hope VESA understands the need not only to say they added something to the spec, but to say what it was they added!

It wasn't using variable refresh at all, because it was two laptops each showing a different --STATIC-- refresh rate: one at 30 FPS with 60Hz vsync, one at 50 FPS with 50Hz vsync. This can be proven by looking carefully at the on-screen display in the video and checking its settings.

The CES demo did NOT demonstrate variable refresh of any kind, Adaptive Sync or otherwise. There is a huge, huge leap between using PSR to extend vblank to change the refresh rate to another static value, and changing vblank frame-by-frame so that the display waits for when the GPU is ready.

At best, what was demonstrated at CES would be able to "guess" how long the GPU will take to render the next frame, and then tell the display how long to extend vblank. Only the whole point is that we don't know how long the GPU will take to render the next frame, which means the guess could be off, which will cause the same sort of stutter you get when using normal vsync. You could mitigate the effects of guessing wrong by buffering, but then you have input lag.

But there's also no guarantee that even the guessing would be able to work fast enough to change refresh rate on a frame-by-frame basis. Intel said at one point that they could get this style of PSR-based vblank-extension framerate change to update once a second. One second is an eternity in this context, and until they demonstrate something faster on actual hardware I will remain deeply skeptical that this technique will work.
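To illustrate why guessing is fragile, here is a toy simulation in Python (all names and numbers are mine, purely hypothetical): it predicts each frame will take as long as the previous one, extends vblank by that much, and counts how often the GPU misses the window, which is exactly the repeated-frame stutter described above.

```python
import random

def simulate_predictive_vblank(frame_times_ms):
    """Toy model of predictive vblank extension: the display holds
    vblank for a guessed duration (the last frame's render time).
    If the real frame isn't ready when the hold expires, the panel
    repeats the old frame -- i.e., stutter."""
    misses = 0
    predicted = frame_times_ms[0]
    for actual in frame_times_ms[1:]:
        if actual > predicted:   # GPU was slower than the guess
            misses += 1          # panel refreshed with the stale frame
        predicted = actual       # next guess = last observed render time
    return misses

# Frame times jittering around 20 ms (~50 FPS), like a real game.
random.seed(0)
times = [20 + random.uniform(-5, 5) for _ in range(1000)]
print(simulate_predictive_vblank(times), "of 999 frames missed the guess")
```

Even with modest jitter, roughly half the guesses undershoot, so a purely predictive scheme would repeat frames constantly.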

G-Sync doesn't do any of this guesswork, and that's why it has a huge chunk of memory in the G-Sync module. It has to do a lot more in order to get to the optimal solution.
 

Mand

Senior member
Jan 13, 2014
664
0
0
There's no guesswork, the GPU tells the screen when to update. It's the same on G-Sync and A-Sync.

No, it isn't the same. PSR isn't reactive, it's predictive. You tell it how long to hold vblank, and then you have a new, static refresh rate. Theoretically, and this is a big 'theoretically', you could use this to do frame-by-frame variable refresh and hope that your prediction lines up with when the GPU is ready to spit out a new frame, but that has not been demonstrated yet. Not by the CES demo, not anywhere.

Nvidia said specifically that they chose not to go with this technique because it didn't give them the performance they were looking for.

There are differences. Yes, they both use vblank extension. No, that doesn't mean they're the same.
 

dacostafilipe

Senior member
Oct 10, 2013
797
297
136
PSR isn't reactive, it's predictive.

No. PSR has the ability to be interrupted by the GPU. It's in the docs:

When GPU detects new image data (for example from a keystroke or mouse movement), GPU wakes up TCON eDP input and starts sending the new display image data.

Edit: btw, AMD's product manager confirmed on beyond3d that A-Sync does not "guess in advance how long it will take to render a frame". Source: http://beyond3d.com/showpost.php?p=1846697&postcount=140
 
Last edited:

Mand

Senior member
Jan 13, 2014
664
0
0
Considering A-Sync hasn't been demonstrated, I'd like to see his justification for making the claim. Or anything, really, anything at all about how it's supposed to work.

The guessing about guessing is because existing information on PSR indicated that that's how it would have to do it. If AMD has something else up its sleeve, that's great, because guessing is a terrible way to do it. Yes, you can interrupt vblank, but the problem is if you undershoot on the initial determination of vblank. Extending it on the fly is not the same as interrupting it.

I'd like to see what they have in mind, though. Because right now all we have is vague promises and a deceptive demo that didn't show variable refresh, but that people love to point to saying it shows variable refresh.

In other words,

AMD, where's the beef?
 
Last edited:

dacostafilipe

Senior member
Oct 10, 2013
797
297
136
Yes, you can interrupt vblank, but the problem is if you undershoot on the initial determination of vblank. Extending it on the fly is not the same as interrupting it.

I don't understand what you want to say here, can you please elaborate?
 

Mand

Senior member
Jan 13, 2014
664
0
0
Vblank is just an interval. The display receives the vblank signal, which contains a time that it sits doing nothing, and after that time it will start the refresh. If the GPU is done with the frame before the vblank interval has completed, you can interrupt this by sending another signal, and the panel will refresh with the newly available frame. If, instead, the vblank interval ends before the GPU is done sending a frame, then the panel will refresh with the existing frame and you get both stutter and lag. There may be a way of re-sending a vblank signal, but from my (admittedly imperfect) knowledge that's not how the protocols work.

The limit on how long vblank can be extended is why G-Sync has a floor of 30 FPS. It can't hold vblank longer than 33.33 ms. AMD has stated that A-Sync will work down to the 9 Hz range, and I'd love to see how they plan on doing it.
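As a sketch of that reactive scheme, including the ~33.3ms cap (the function and numbers are mine, for illustration only): the panel waits for the GPU's "frame ready" signal, but if the maximum hold expires first it refreshes with the old frame.

```python
MAX_VBLANK_MS = 1000.0 / 30  # ~33.33 ms cap -> the 30 FPS floor

def panel_refresh(render_time_ms: float) -> tuple[float, bool]:
    """Reactive model: refresh as soon as the GPU signals a new frame,
    unless the maximum vblank hold expires first.
    Returns (time of the refresh in ms, whether the frame was fresh)."""
    if render_time_ms <= MAX_VBLANK_MS:
        return render_time_ms, True   # vblank interrupted, new frame shown
    return MAX_VBLANK_MS, False       # timeout: the old frame is repeated

print(panel_refresh(25.0))  # (25.0, True): 40 FPS, the panel follows the GPU
print(panel_refresh(50.0))  # (33.33..., False): GPU below 30 FPS, frame repeats
```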

Actually, I'd love to see any technical information at all. So far they've made a lot of claims, but have provided absolutely nothing for us to use to evaluate those claims. That alone makes me deeply skeptical, especially considering how outright deceptive they were at CES. "It won't require expensive new hardware!" then a day later "no no, the point of our demo is to encourage hardware manufacturers to perform the necessary development!"
 
Last edited:

dacostafilipe

Senior member
Oct 10, 2013
797
297
136
Vblank is just an interval. The display receives the vblank signal, which contains a time that it sits doing nothing, and after that time it will start the refresh. If the GPU is done with the frame before the vblank interval has completed, you can interrupt this by sending another signal, and the panel will refresh with the newly available frame. If, instead, the vblank interval ends before the GPU is done sending a frame, then the panel will refresh with the existing frame and you get both stutter and lag. There may be a way of re-sending a vblank signal, but from my (admittedly imperfect) knowledge that's not how the protocols work.

After the VBLANK, the controller reuses the image from the buffer if no new image is available.

This can produce a small delay in some rare cases, when the controller is drawing from the buffer and a new image arrives.

It's not possible to avoid this problem if you don't want tearing. Even G-Sync has this problem.

The limit on how long vblank can be extended is why G-Sync has a floor of 30 FPS. It can't hold vblank longer than 33.33 ms. AMD has stated that A-Sync will work down to the 9 Hz range, and I'd love to see how they plan on doing it.

I don't think the problem is the VBLANK length itself, but the need to refresh the LCD cells to avoid losing colors.

Actually, I'd love to see any technical information at all. So far they've made a lot of claims, but have provided absolutely nothing for us to use to evaluate those claims. That alone makes me deeply skeptical, especially considering how outright deceptive they were at CES. "It won't require expensive new hardware!" then a day later "no no, the point of our demo is to encourage hardware manufacturers to perform the necessary development!"

Yes, we need more information. And we need tests to know if they can even match G-Sync.

But I don't share your extreme skepticism. The little information we have does sound right; now let them prove it ;)
 

Mand

Senior member
Jan 13, 2014
664
0
0
My extreme skepticism is because they've had to walk back just about all of the little information we have after follow-up reporting.
 

TrulyUncouth

Senior member
Jul 16, 2013
213
0
76
My extreme skepticism is because they've had to walk back just about all of the little information we have after follow-up reporting.

Link to the original claims and walkback? I am truly confused by all the claims in this thread, and an actual link to some claims and backpedaling would help me separate the wheat from the chaff. It seems like half of the people are prematurely calling this the death of G-Sync and a few are saying this won't hold a candle to G-Sync. Thanks in advance for links to anything more definitive than wildly varying claims from dudes on a forum...
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
No, because you can interrupt a running VBLANK.

You can only calculate the time difference with additional buffers.
To reduce lag you need to bring this into the monitor, and with G-Sync this happens.

/edit: With G-Sync, a fixed refresh interval doesn't exist anymore. This is not possible with eDP!
 

tolis626

Senior member
Aug 25, 2013
399
0
76
In my opinion, G-Sync is more a stunt from NVidia's side than anything else. Sure, it's the cream of the crop for the market right now (or whenever it comes to market), but it will be dead instantly when something that does the same thing comes out, whether it's called A-Sync, Free-Sync or Heat-Sync (Heatsink. Get it? No? Ok...). Variable refresh rate is the way to go, sure, but not in the form of a proprietary kit from NVidia, however well it works. NVidia isn't the whole industry, and people need to realize that. I don't care whether an alternative comes from AMD, Intel or anyone else, as long as there is one.

And that's one of the reasons I despise NVidia, however good their products are. They always try to capitalize on stuff they normally shouldn't. One example: when Tegra 2 was the worst high-end SoC on the smartphone market (and by a large margin), they continued to market it as the best thing available and got game developers to make advanced graphics possible only on NVidia hardware. Some simple hacks showed they were blatantly lying. And that's just one example.

Not that AMD are a bunch of saints, but just sayin'...

Warning issued for thread crapping.
-- stahlhart
 
Last edited by a moderator:

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
We have no hard facts on either technology, and yet this thread has declared winners and losers. Stick to the tech and leave outcries of death for another forum.
@Mand, how are you so sure of anything to warrant such extreme bias? Where are your sources?
 

dacostafilipe

Senior member
Oct 10, 2013
797
297
136
You can only calculate the time difference with additional buffers.
To reduce lag you need to bring this into the monitor, and with G-Sync this happens.

/edit: With G-Sync, a fixed refresh interval doesn't exist anymore. This is not possible with eDP!

I don't know what you want to say with this ... :hmm:
 

Mand

Senior member
Jan 13, 2014
664
0
0
I don't understand why the "SOURCE!!!!" clamor is so strong; all of this is easily available via a quick Google search, and has been posted repeatedly in every discussion of FreeSync ever. But I'll do it again.

On why eDP isn't a magic bullet:

http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo
However, Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.

That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.

When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That's why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.
On hardware requirements:

First, AMD's original presentation:

http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014

AMD doesn’t want to charge for this technology since it’s already a part of a spec that it has implemented (and shouldn’t require a hardware change to those panels that support the spec), hence the current working name “FreeSync”.
They were rather emphatic on the "no new expensive hardware" bit in the presentation.

Then, a day later:

http://www.pcper.com/reviews/Graphi...h-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync

All that is needed for this to work, as AMD explained it, was an eDP connection between the discrete GPU and the display, a controller for the screen that understands the variable refresh rate methods of eDP 1.0 specifications and an updated AMD driver to properly send it the signals.

To be clear, just because a monitor would run with DisplayPort 1.3 doesn't guarantee this feature would work. It also requires the controller on the display to understand and be compatible with the variable refresh portions of the spec, which with eDP 1.0 at least, isn't required. AMD is hoping that with the awareness they are building with stories like this display designers will actually increase the speed of DP 1.3 adoption and include support for variable refresh rate with them.
Bold added, and replace 1.3 with 1.2a and you have the same situation. AMD first said it didn't require new hardware, and then later said it requires a new controller and that their intent was to encourage display manufacturers to develop the necessary hardware. And who is going to pay for that development? The consumer.

Beyond that, the demo presented at CES claimed to show FreeSync in action, which it didn't. If you look at the video itself, available at just about any of the links but also directly:

http://www.youtube.com/watch?v=pIp6mbabQeM

The CES presentation did not show variable refresh. It showed two laptops with two STATIC refresh rates. One is running 30 FPS at 60Hz vsync, the other is running 50 FPS at 50Hz vsync. Plain old, everyday, been-there-for-decades vsync. This can be proven from the video itself, by zooming in on the on-screen display on each of the laptops and looking at the settings and frame rates they are showing. The difference is that the refresh rate on the 50Hz laptop has been modified to 50Hz by use of a vblank extension.

That you can change the refresh rate from one fixed value to another fixed value using vblank extension is a far cry from saying that you can use vblank extension to change the frame interval on the fly, frame-by-frame, perfectly in line with the GPU. What the demo did show was how much better things look when you match the refresh rate to the frame rate. But we know that already, and it's nothing new - vsync has been doing it for decades.
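For reference, this is what plain vsync does with the demo's numbers: each frame's on-screen time is rounded up to the next refresh boundary, which is why a frame rate that evenly divides the refresh rate looks smooth and anything else stutters. A toy calculation (my own, for illustration):

```python
import math

def vsync_display_time_ms(render_ms: float, refresh_hz: float) -> float:
    """With vsync, a frame stays on screen until the next refresh
    boundary, so its display time is a multiple of the refresh period."""
    period = 1000.0 / refresh_hz
    return math.ceil(render_ms / period) * period

print(vsync_display_time_ms(33.3, 60))  # ~33.3 ms: 30 FPS divides 60Hz evenly
print(vsync_display_time_ms(20.0, 50))  # 20.0 ms: 50 FPS at 50Hz, a perfect match
print(vsync_display_time_ms(22.0, 60))  # ~33.3 ms: ~45 FPS stutters down to 30
```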

What AMD demonstrated at CES was not variable refresh. Period. They had no problems telling people it was a competitor to G-Sync, though, and letting people walk away believing it. That's at best misleading, and at worst intentional deception. They also let people believe it took no hardware updates, and that it would be free, hence the name. Only later did they say that it would require hardware updates, and that they made the demo to "encourage" other companies to do the work that Nvidia decided to do itself:

http://www.pcper.com/reviews/Graphi...h-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync

Koduri told me that AMD wasn't bringing this demo out to rain on NVIDIA's G-Sync parade but instead to get media interested in learning about this feature of eDP 1.0 and DP 1.3, urging the hardware companies responsible to more quickly produce the necessary controllers and integrate them with upcoming panels in 2014. While I don't doubt that it is the case for AMD, I'm sure the timing of the demo and NVIDIA's G-Sync releases this week were not an accident.
The pcper article included a good deal of follow-up reporting, post-demo, that most other sources didn't bother to do. Just about everybody else bought the original spin-fest of a presentation hook, line, and sinker.

We do have hard facts on G-Sync. We have no information whatsoever on FreeSync or Adaptive Sync or whatever AMD feels like calling it this week. And they won't give us any. What it would take for me to be satisfied is the following (hell, I'd love even one of them):

1) A working, true variable refresh demo in hardware. Unlikely, since they're asking other people to develop the hardware first. The recent FAQ in the wake of the VESA update says that they're "working closely with" hardware partners. It'd be nice if they bothered to say who those partners were.

2) A description or explanation of the technical processes behind their take on variable refresh. Given that their "Big Deal" differentiator between their approach and Nvidia's is the whole open-standard, available-to-everyone-with-no-royalties-or-licensing status, I don't see why they have to be coy about releasing technical details. Yet they've released precisely nothing of substance. The only thing is "it uses vblank! isn't that great!" They don't explain how it uses vblank, how it allows the GPU to actually sync with the display, and most importantly how they plan on pulling it off without the rather extensive hardware support that G-Sync uses.

3) A list (or even one) of the "partners" they claim to be working closely with. The G-Sync display partners were giddy with excitement and announcing loudly how much they were interested in selling us G-Sync displays. I can't find a single one of them announcing even plans for an Adaptive Sync display.


So no, it's not bias. There's a distinct, real, bias-free difference between the two approaches (I was going to say between the two technologies, but one isn't a technology yet) and between how the two companies have presented themselves and their efforts. Nvidia actually has tangible results, AMD has spin and hope.

I'm an AMD owner, not Nvidia, so don't even start with anti-AMD bias on my part. I would love it if FreeSync/Adaptive Sync worked, because all I want is variable refresh and I don't care who gives it to me. But to hear people get suckered in by deceptive presentations and then have that misunderstanding take on a life of its own in the echo chamber of tech forums until it just becomes part of What Everybody Knows really bothers me. Especially since the truth is not hard to find.
 
Last edited: