
[Techreport] Intel plans to support VESA Adaptive-Sync displays

cen1

Member
http://techreport.com/news/28865/intel-plans-to-support-vesa-adaptive-sync-displays

IDF — In a Q&A session this afternoon, I asked Intel Fellow and Chief Graphics Software Architect David Blythe about Intel's position on supporting the VESA Adaptive-Sync standard for variable refresh displays. (This is the standard perhaps better known as AMD's FreeSync.)
Blythe indicated that Intel is positively inclined toward standards-based solutions like Adaptive-Sync, and he said Intel does indeed plan to support this optional extension to the DisplayPort spec. However, Blythe wasn't yet willing to commit to a timetable for Intel enabling Adaptive-Sync support in its products.
The question of a timetable is complicated by whether Intel's GPU hardware will require an update in order to enable Adaptive-Sync capability. A source familiar with the matter has indicated to us that this feature is not present in current hardware, so in all likelihood, Adaptive-Sync support will have to wait until at least after the Skylake generation of products.
Supporting Adaptive-Sync would be a natural next step for Intel, whose integrated graphics processors stand to benefit tremendously from displays with a more forgiving and flexible refresh cycle. Intel's backing would also be a big boost for the Adaptive-Sync standard, since the firm ships by far the largest proportion of PC graphics solutions.
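To make the benefit concrete, here is a toy model (purely illustrative, not tied to any real driver or display API) of when frames actually appear on screen: with a fixed-refresh display and vsync, a frame that misses the scanout deadline waits for the next one, while an adaptive-sync panel refreshes as soon as the frame is ready, clamped to its supported refresh-interval range.

```python
# Toy model: on-screen presentation time (ms) for each rendered frame,
# comparing a fixed 60 Hz display with vsync against an adaptive-sync
# display. Purely illustrative -- no real display or driver API involved.
import math

def display_times(frame_times_ms, mode, refresh_ms=1000 / 60,
                  min_interval_ms=1000 / 144, max_interval_ms=1000 / 40):
    """Return presentation times for a list of per-frame render times."""
    shown = []
    t_render = 0.0       # when the GPU finishes each frame
    t_last_scan = 0.0    # when the panel last refreshed (adaptive mode)
    for ft in frame_times_ms:
        t_render += ft
        if mode == "fixed":
            # vsync: wait for the next fixed scanout boundary
            shown.append(math.ceil(t_render / refresh_ms) * refresh_ms)
        else:
            # adaptive: refresh when the frame is ready, clamped to the
            # panel's supported refresh-interval range
            t_scan = max(t_render, t_last_scan + min_interval_ms)
            t_scan = min(t_scan, t_last_scan + max_interval_ms)
            t_last_scan = t_scan
            shown.append(t_scan)
    return shown

frames = [20.0, 20.0, 20.0]  # a steady 50 fps render rate (20 ms/frame)
fixed = display_times(frames, "fixed")
adaptive = display_times(frames, "adaptive")
```

On the fixed 60 Hz panel the steady 50 fps stream lands unevenly (a 33.3 ms gap, then two 16.7 ms gaps), while the adaptive panel presents every frame at an even 20 ms cadence, which is exactly the case a slower IGP hits constantly.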
 
This was expected, and this is precisely why G-Sync will lose to VESA Adaptive-Sync based technologies like FreeSync and whatever Intel calls their implementation. The range of monitors supporting Adaptive-Sync will increase rapidly once Intel implements it on their SoCs. I think Kaby Lake might support Adaptive-Sync. :biggrin:
 
This was expected, and this is precisely why G-Sync will lose to VESA Adaptive-Sync based technologies like FreeSync and whatever Intel calls their implementation. The range of monitors supporting Adaptive-Sync will increase rapidly once Intel implements it on their SoCs. I think Kaby Lake might support Adaptive-Sync. :biggrin:

Fingers crossed for iN-Sync. Bye Bye Bye. :$
 
I hope Nvidia does adopt Adaptive Sync soon, but it'll probably be a while, as it would kill any future G-Sync monitor sales. Even if Nvidia kept G-Sync as some kind of premium solution, I think most people would buy an Adaptive-Sync monitor once Nvidia decides to support the standard.
 
Easy solution. Nvidia will support both. G-Sync monitors will become their "premium" value add while also supporting A-Sync. Yes, it'll cost more; but gamers who want that "premium" feature will pay. They're already paying more.
 
Easy solution. Nvidia will support both. G-Sync monitors will become their "premium" value add while also supporting A-Sync. Yes, it'll cost more; but gamers who want that "premium" feature will pay. They're already paying more.

I'd say they might differentiate their product stack by offering "premium" G-Sync displays with the G-Sync hardware module, plus G-Sync Lite versions that use Adaptive-Sync. If they decide to ditch the module for Adaptive-Sync, they'll probably just lock it down via software and still charge a validation or licensing premium, just a smaller one. NVIDIA likes making money, and an open standard won't really change that. With them owning 80% of the dGPU market, Intel supporting A-Sync won't change anything either. Display manufacturers will keep building G-Sync monitors as long as there's demand for NVIDIA products.
 
Easy solution. Nvidia will support both. G-Sync monitors will become their "premium" value add while also supporting A-Sync. Yes, it'll cost more; but gamers who want that "premium" feature will pay. They're already paying more.

There might be room for nVidia to add more features through the G-Sync module. As things stand now, though, only people with an nVidia preference will buy it, and then only a certain percentage of them.
 
Easy solution. Nvidia will support both. G-Sync monitors will become their "premium" value add while also supporting A-Sync. Yes, it'll cost more; but gamers who want that "premium" feature will pay. They're already paying more.

Nah ... they will just start to use A-Sync and you will still have to pay that premium. Why would they not?

That said, it could still have value for the users as nVidia requires better hardware for their G-Sync branding.
 
I'd say they might differentiate their product stack by offering "premium" G-Sync displays with the G-Sync hardware module, plus G-Sync Lite versions that use Adaptive-Sync. If they decide to ditch the module for Adaptive-Sync, they'll probably just lock it down via software and still charge a validation or licensing premium, just a smaller one. NVIDIA likes making money, and an open standard won't really change that. With them owning 80% of the dGPU market, Intel supporting A-Sync won't change anything either. Display manufacturers will keep building G-Sync monitors as long as there's demand for NVIDIA products.

Bingo.

NV will move to support A-Sync soon enough, and they will find a way to make a buck from it, no doubt about it.
 
It's a cheap PR statement, so don't read too much into it yet.

However, Blythe wasn't yet willing to commit to a timetable for Intel enabling Adaptive-Sync support in its products.

And we have seen this countless times before without result. It's a wait-and-see game.

It's not going to happen with Skylake/Kaby Lake.

It would be nice to finally settle on one standard for everyone, but until they commit with actual products, this isn't it.
 
Well, it's PR for now, until their drivers are ready.

I don't think they need a new uarch, since notebooks already have a form of a-sync to save on power. It would be a surprise if Intel hardware wasn't capable already.
 
Not just drivers, but also hardware capabilities, because they don't support DP 1.2a or DP 1.3 yet.

Rule number 1 is always follow the money, and Intel isn't willing to spend any yet. If technologies could live on generous words alone, a million now-dead ones would still be with us.

If I had to guess, Intel is watching Adaptive-Sync monitor volume and SKUs. It needs to hit some kind of critical mass before they'll commit money. It looks promising if the growth continues, but we're far from that now. I'd say we're talking about a three-year timeline from today at best.

A source familiar with the matter has indicated to us that this feature is not present in current hardware, so in all likelihood, Adaptive-Sync support will have to wait until at least after the Skylake generation of products.

And that means we are at Cannonlake or later.
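For concreteness, the DP 1.2a capability being discussed is advertised by the sink through a single DPCD bit: bit 6 of receiver-capability byte 0x0003 (MAX_DOWNSPREAD), called MSA_TIMING_PAR_IGNORED, meaning the display can ignore the fixed MSA timing parameters, which is the prerequisite for variable refresh. A minimal sketch of that check over a raw DPCD dump (the constant names follow the Linux DRM headers; the example dumps are fabricated for illustration, and in practice the bytes would be read over the DisplayPort AUX channel):

```python
# Check the DP 1.2a Adaptive-Sync capability bit in a raw DPCD
# receiver-capability dump. DPCD byte 0x0003 (MAX_DOWNSPREAD) bit 6 is
# MSA_TIMING_PAR_IGNORED: the sink can ignore MSA timing parameters.
DP_MAX_DOWNSPREAD = 0x003
DP_MSA_TIMING_PAR_IGNORED = 1 << 6

def sink_supports_adaptive_sync(dpcd: bytes) -> bool:
    """dpcd: raw receiver-capability bytes starting at DPCD address 0."""
    if len(dpcd) <= DP_MAX_DOWNSPREAD:
        return False  # dump too short to contain the capability byte
    return bool(dpcd[DP_MAX_DOWNSPREAD] & DP_MSA_TIMING_PAR_IGNORED)

# Fabricated example dumps (first four DPCD bytes only):
legacy_sink = bytes([0x12, 0x0a, 0x84, 0x00])  # DP 1.2 sink, bit 6 clear
vrr_sink    = bytes([0x12, 0x0a, 0x84, 0x40])  # bit 6 set: A-Sync capable
```

So "supporting DP 1.2a" on the source side largely means the GPU's display engine honoring this bit and driving a variable vertical blanking interval, which is what current Intel hardware reportedly can't do.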
 
Not just drivers, but also hardware capabilities, because they don't support DP 1.2a or DP 1.3 yet.

Rule number 1 is always follow the money, and Intel isn't willing to spend any yet. If technologies could live on generous words alone, a million now-dead ones would still be with us.

If I had to guess, Intel is watching Adaptive-Sync monitor volume and SKUs. It needs to hit some kind of critical mass before they'll commit money. It looks promising if the growth continues, but we're far from that now. I'd say we're talking about a three-year timeline from today at best.



And that means we are at Cannonlake or later.
Or they don't want to affect sales of current products by announcing future product timetables. And don't forget rule number 2: make the money come to you.
 
Almost all the major display manufacturers have shipped (or announced) an A-Sync product. There's no need to wait three years to see where the trend is going.

The important news here is that Intel is going to support it; there's no reason to downplay it.
 
Intel could do this because of Apple, and their upcoming Thunderbolt Display.

So it might be much faster than anyone thinks 😉

The hardware still doesn't support it. The LSPCon used with Thunderbolt, for that matter, is plain DP 1.2 as well.

So not only do they need a new CPU that actually supports DP 1.2a/DP 1.3, they also need a new Thunderbolt LSPCon.
 
The hardware still doesn't support it. The LSPCon used with Thunderbolt, for that matter, is plain DP 1.2 as well.

So not only do they need a new CPU that actually supports DP 1.2a/DP 1.3, they also need a new Thunderbolt LSPCon.

Isn't the LSPCon just a converter chip (DP to HDMI)? It should have no impact on the DP version.
 
The hardware still doesn't support it. The LSPCon used with Thunderbolt, for that matter, is plain DP 1.2 as well.

So not only do they need a new CPU that actually supports DP 1.2a/DP 1.3, they also need a new Thunderbolt LSPCon.

Don't worry about that. Apple always has its own custom versions of hardware, like the TCON in the Retina iMac.
 
Isn't the LSPCon just a converter chip (DP to HDMI)? It should have no impact on the DP version.

It's much more than that. Both the CPU and the LSPCon need to support DP 1.2a or DP 1.3 to work with A-Sync, and neither of them does.

Don't worry about that. Apple always has its own custom versions of hardware, like the TCON in the Retina iMac.

So Apple got some secretly developed Skylake variant with DP 1.2a, plus its own LSPCon?

It's a great announcement for the long run, but let's stay realistic.
 
It's much more than that. Both the CPU and the LSPCon need to support DP 1.2a or DP 1.3 to work with A-Sync, and neither of them does.

I don't think you are right on this one.

Here's what Intel says:

For next generation 2015 platform this will require 3rd party LSPCON discrete chip on motherboard for external video interfaces
- HDMI* 2.0 Only
Source: Accelerating All-in-One PC Growth in Consumer - RPCS003

And ...

A MegaChips MCDP2800 is also present. According to MegaChips, it's a "high-speed level-shifter and active-protocol converter (LSPCON) in a single chip", which most likely is responsible for the HDMI on the back panel IO.
Source: http://www.tweaktown.com/reviews/72...-g1-intel-z170-motherboard-review/index4.html


Edit: But you are right that the CPU needs to support DP 1.2a/1.3, of course.
 
The LSPCon contains an HDMI 2.0 converter. That's right.

But remember, the DP stream gets packed into the Thunderbolt interface.

[Images: Intel Thunderbolt (Alpine Ridge, 40 Gbps) block diagrams]
 
But remember, the DP stream gets packed into the Thunderbolt interface.

Yes, the TB controller is fed PCIe and DP directly from the CPU. But that doesn't change the fact that if you connect directly to the DP port on your motherboard, you don't go through the LSPCon, since it is only used for HDMI.

Anyway, this has nothing to do with the topic.

We need DP 1.2a/1.3 support on the CPUs. We know it's coming, but we don't know when exactly.
 