[Techreport] Intel plans to support VESA Adaptive-Sync displays


therealnickdanger

Senior member
Oct 26, 2005
987
2
0
NVIDIA already supports Adaptive Sync. It's called mobile G-Sync and it's on eDP 1.3 displays. No module required, all processing is done by the GPU.

Why can't people get their heads around the fact that NVIDIA can do this at ANY TIME! Stop buying G-Sync module monitors and force their hand!
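
For anyone wondering what "all processing is done by the GPU" means in practice: with Adaptive-Sync the GPU simply starts each refresh the moment a frame is ready, within the panel's supported range. A rough Python-style sketch of the idea (every name here is made up; this is nobody's real driver code):

Code:
import time

# A rough sketch of GPU-side adaptive refresh pacing (hypothetical and
# simplified, not NVIDIA's actual driver logic). The point: the GPU
# decides when each refresh starts, so no monitor-side module is needed.

PANEL_MAX_HZ = 144
MIN_INTERVAL = 1.0 / PANEL_MAX_HZ  # shortest allowed gap between refreshes

def present_loop(render_frame, scan_out):
    # Start a refresh the moment a frame is ready, but never faster
    # than the panel's maximum refresh rate.
    last_scan = time.monotonic()
    while True:
        frame = render_frame()                  # takes as long as it takes
        elapsed = time.monotonic() - last_scan
        if elapsed < MIN_INTERVAL:
            time.sleep(MIN_INTERVAL - elapsed)  # respect the panel's max rate
        scan_out(frame)                         # refresh begins right now
        last_scan = time.monotonic()

The too-slow case, where frames arrive below the panel's minimum rate, is where frame repetition comes in; that comes up again later in the thread.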
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
The hardware still doesn't support it. The LSPCon for Thunderbolt, for that matter, is plain DP1.2 as well.

So not only do they need a new CPU that actually supports DP1.2a/DP1.3, they also need a new Thunderbolt LSPCon.

Skylake will support 1.3. But it may only be present on devices with TB3.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
therealnickdanger said:
NVIDIA already supports Adaptive Sync. It's called mobile G-Sync and it's on eDP 1.3 displays. No module required, all processing is done by the GPU.

Why can't people get their heads around the fact that NVIDIA can do this at ANY TIME! Stop buying G-Sync module monitors and force their hand!

Not going to happen. That's the very definition of a loyal/biased/blind consumer: someone who can never view their preferred brand in a negative light, since that would mean admitting they made the wrong decision, and who treats any attack on the brand as an attack on themselves. Therefore, any attack on G-Sync is an attack on NV and thus on themselves -- must defend G-Sync!! :)

With Intel adopting FreeSync, a lot more FreeSync monitor offerings should come to market, since it opens a huge market for monitor makers to tap into: a lot of gamers use low-end GPUs for light/casual gaming. These budget gamers would have little to no interest in paying a $100-150 premium for G-Sync. Once they do acquire a FreeSync monitor, should they decide to later get a discrete GPU, if NV still offers only G-Sync, they will need to get a new monitor or end up with another Intel/AMD APU/GPU upgrade. Because of this, NV is going to be under a lot of pressure in the future (as Intel IGP performance improves) to support FreeSync.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
RussianSensation said:
...With Intel adopting FreeSync...

Intel isn't adopting FreeSync - just supporting Adaptive Sync. They'll probably call it something lame like iSyncronous or something... haha

I get what you're saying. I'm not opposed to G-Sync, just the proprietary ASIC. Mobile G-Sync is the right direction.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
If I had to guess, NV will support Async (Adaptive-Sync) with Pascal. They will continue to promote G-Sync as the better product, but giving its buyers the option will make them more appealing.

I'll use myself as an example: I want a new monitor and decided to wait until I picked a GPU, since I'd be tied to that ecosystem. I picked GeForce but am still hesitant to spend the extra for a G-Sync model monitor. The cost difference, while frankly peanuts to me, is more a matter of principle: why pay more for something marginally different (I won't say better or worse, since that is subjective)?

I'll wait and see if Nvidia decides to adopt Async. Otherwise, come next upgrade, unless Pascal completely blows Fury X2 out of the water, my monitor preference will influence me much more than it did this round.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
ShintaiDK said:
It already exist in another form for laptops as PSR.

PSR means the GPU goes to sleep, not that the panel doesn't refresh. As per the name, it refreshes, it just SELF refreshes.

http://www.anandtech.com/show/7208/understanding-panel-self-refresh
When displaying static content however (E.g. staring at the home screen, reading a page of an eBook), the display pipeline and associated DRAM are consuming power sending display updates when it doesn't need to. Panel Self Refresh (PSR) is designed to address the latter case.

To be clear, PSR is an optimization to reduce SoC power, not to reduce display power. In the event that display content is static, the contents of the frame buffer (carved out of system RAM in the case of a smartphone) are copied to a small amount of memory tied to the display. In the case of LG's G2 we're likely looking at something around 8MB (1080p @ 32bpp). The refreshes then come from the panel's memory, allowing the display pipeline (and SoC) to drive down to an even lower power state.
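
Put as a rough sketch, the handoff the article describes goes something like this (my own Python-style illustration; the panel methods and the wait helper are invented, not a real driver API):

Code:
# A sketch of the PSR handoff described above (hypothetical names only).

def display_update_loop(get_framebuffer, panel, wait_for_change):
    prev = None
    while True:
        fb = get_framebuffer()
        if fb == prev:
            # Static content: copy the frame into the panel's small local
            # memory once, let the panel refresh itself, and power down the
            # SoC display pipeline until something actually changes.
            panel.store_frame(fb)
            panel.enter_self_refresh()
            wait_for_change()        # display pipeline sleeps here
            panel.exit_self_refresh()
        else:
            panel.scan_out(fb)       # normal path: SoC sends every refresh
        prev = fb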

Also worth noting that apparently Intel used to have a feature where, when you went from mains power to battery, its GPUs would drop the display's refresh rate from 60Hz to 40Hz. Not adaptive, but a one-time step, like changing it in Windows settings. So clearly they do care about changing refresh rates.

They also said in 2011 that PSR would be on all laptops "in 2 years". It's 4 years later. Or 3 years and 11 months and 6 days if you want to be specific.
http://www.theregister.co.uk/2011/09/14/intel_demos_panel_self_refresh_display_tech/
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
Adaptive Sync will benefit Intel more than anyone else, I think, given the very wide fluctuations between min/max frame rates on Intel IGPs. I just hope they are smart enough to implement a methodology similar to mobile G-Sync and repeat frames at or below the bottom of the panel's refresh range.
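
That frame-repeat trick is simple enough to sketch in Python-style pseudocode (hypothetical names, not any vendor's implementation): when the gap between new frames would exceed what the panel can hold, the GPU just scans the previous frame out again.

Code:
# A sketch of the frame-repeat idea (all names made up). If no new frame
# arrives within the panel's maximum hold time, the previous frame is
# simply scanned out again.

PANEL_MIN_HZ = 30
MAX_HOLD = 1.0 / PANEL_MIN_HZ   # longest the panel may hold one image

def scan_out_with_repeat(wait_for_new_frame, scan_out):
    frame = wait_for_new_frame(timeout=None)   # block for the first frame
    while True:
        scan_out(frame)
        fresh = wait_for_new_frame(timeout=MAX_HOLD)
        if fresh is not None:
            frame = fresh          # a new frame arrived in time
        # else: timeout expired, so the loop re-sends the same frame and
        # the panel never waits longer than MAX_HOLD between refreshes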
 

brandonmatic

Member
Jul 13, 2013
199
21
81
ExtremeTech states a 2017 time frame for implementation.

The only fly in the ointment is the timing. According to Tech Report, no current Intel GPU hardware supports Adaptive-Sync, which means we’re looking at a post-Skylake timeframe for support. Intel might be able to squeeze the technology into Kaby Lake, with its expected 2016 debut date, but if it can’t we’ll be waiting for Cannonlake and a 2017 timeframe.

http://www.extremetech.com/gaming/212642-intel-will-support-freesync-standard-with-future-gpus
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Please supply documentation for that. Because all Intel's material only contains 1.2.

The article I had read was wrong. You are correct. It supports two full 1.2 connections via a Thunderbolt 3 port.