VESA Adopts Adaptive-Sync


Mand

Senior member
Jan 13, 2014
664
0
0
We might as well discuss how warp drive might work. We have no information to form the basis of a discussion about how A-Sync might or might not work.

It's not "calling" anything to point that out, either.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Right now, based on what we know, we don't think it works, or at least I don't. AMD needs to show it has done more than just port the eDP features into the DP 1.2a spec as an option; it has to show the feature has any use to gamers at all, which so far it hasn't done. Until AMD actually gives a technical briefing on how it works, it's vapourware.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
It wasn't using variable refresh at all, because it was two laptops, each showing a different --STATIC-- refresh rate: one at 30 FPS with 60 Hz vsync (each frame held for exactly two refreshes), one at 50 FPS with 50 Hz vsync. This can be proven by looking carefully at the on-screen display in the video and checking its settings.

The CES demo did NOT demonstrate variable refresh of any kind, Adaptive Sync or otherwise. There is a huge, huge leap between using PSR to extend vblank and change the refresh rate to another static value, and changing vblank frame-by-frame so that the display waits until the GPU is ready. At best, what was demonstrated at CES would be able to "guess" how long the GPU will take to render the next frame, and then tell the display how long to extend vblank. But the whole point is that we don't know how long the GPU will take to render the next frame, which means the guess can be off, which causes the same sort of stutter you get with normal vsync. You could mitigate the effects of guessing wrong by buffering, but then you have input lag.

There's also no guarantee that even the guessing could change the refresh rate fast enough on a frame-by-frame basis. Intel said at one point that they could get this style of PSR-based, vblank-extension refresh change to update once per second. One second is an eternity in this context, and until they demonstrate something faster on actual hardware I will remain deeply skeptical that this technique will work.
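To make the guessing problem concrete, here is a minimal sketch of what such a prediction scheme would have to do. The class name, the history-average heuristic, and the 30-144 Hz limits are all my own hypothetical choices for illustration, not anything AMD or Intel has described:

```python
# Minimal sketch of the "guess-based" vblank extension described above.
# All names and numbers are hypothetical, not anyone's actual driver code.
from collections import deque

class VblankGuesser:
    def __init__(self, history=5):
        self.times = deque(maxlen=history)  # recent frame render times (ms)
        self.min_ms = 1000 / 144            # panel's fastest refresh interval
        self.max_ms = 1000 / 30             # panel's slowest refresh interval

    def predict_interval(self):
        """Guess the next refresh interval from the recent average."""
        guess = sum(self.times) / len(self.times) if self.times else 1000 / 60
        return min(max(guess, self.min_ms), self.max_ms)

    def frame_done(self, render_ms):
        """Record a finished frame; report whether the guess caused a miss."""
        predicted = self.predict_interval()
        self.times.append(render_ms)
        # A slow frame overshoots the pre-programmed vblank and must wait a
        # full extra refresh -- the same judder as ordinary vsync.
        return render_ms > predicted
```

The point the sketch makes: the refresh interval has to be programmed before the frame finishes, so any frame slower than the guess stutters, exactly as described above.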

G-Sync doesn't do any of this guesswork, and that's why it has a huge chunk of memory in the G-Sync module. It has to do a lot more in order to get to the optimal solution.

Wait, are you trying to disprove what AMD said it was doing by watching a 30fps YOUTUBE video?!

So far nothing you have said has proven anything except for the fact that you apparently hate AMD and will do anything for nVidia.

You do understand that the "controller" that understands the variable refresh method is not some extra piece of hardware? It's the controller that is already in the screen. Yes, it needs to have that functionality added, and that's why it's now a VESA standard. Now ANY GPU maker will be able to make use of it, and a display maker can integrate it into every single display, since the majority of displays from a given maker use the same controller.

What I do not understand is why you are trying to prove that G-Sync is somehow way better than A-Sync. Especially since all you know is what nVidia has on its slides, and what AMD/VESA have on theirs. You don't know anything about how the software works under the hood for either, which means it's impossible to compare the two and say which is better at this point in time.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Wait, are you trying to disprove what AMD said it was doing by watching a 30fps YOUTUBE video?!

Youtube's framerate doesn't change what the laptop's on-screen display is showing.

In your haste to mock me, you forgot to actually look at what the video was showing.

So far nothing you have said has proven anything except for the fact that you apparently hate AMD and will do anything for nVidia.

Yawn. If you are so convinced I hate AMD and will do anything for Nvidia, why do I have AMD products and not Nvidia products, then? Hmmm? What you're doing is called projecting: you assume that the other guy must be as biased as you are, and in the same ways.

You do understand that the "controller" that understands the variable refresh method is not some extra piece of hardware? It's the controller that is already in the screen. Yes, it needs to have that functionality added, and that's why it's now a VESA standard. Now ANY GPU maker will be able to make use of it, and a display maker can integrate it into every single display, since the majority of displays from a given maker use the same controller.

It is not an extra piece of hardware, but it is a piece of hardware that does something the current ones don't. Hence, you need new hardware to do A-Sync. You say "Yes, it needs to have that functionality added" - so you do understand this, but you still think I'm wrong? How does that work? Who is going to add that functionality? How much is it going to cost? Who is going to pay for it? Answers: not AMD, a lot, and us consumers. So "FreeSync" is not free.

What I do not understand is why you are trying to prove that G-Sync is somehow way better than A-Sync. Especially since all you know is what nVidia has on its slides, and what AMD/VESA have on theirs. You don't know anything about how the software works under the hood for either, which means it's impossible to compare the two and say which is better at this point in time.

You don't understand why I'm trying to prove that a product that exists is somehow way better than a product that doesn't exist? Seriously?

We have way more information than Nvidia slides, and to claim that's all we have is just laughable. There are G-Sync displays that real people are using. We know a lot about how they perform, people have reviewed them, tested them, evaluated them.

It is very possible to compare the two. And to then say "Hmm, this one exists, and that one doesn't."
 

Gunbuster

Diamond Member
Oct 9, 1999
6,852
23
81
I think it's funny how many people think this optional feature set is going to appear in monitors quickly. Have you noticed there are still no HDMI 2.0-capable chips on the market for TVs? That is a MAJOR market that every TV OEM is clamoring for. Now think about how far down the popularity and spec sheet A-Sync is in comparison.

If you can't get the controller chip makers to spin the silicon for HDMI 2.0 4K 60 Hz TVs in a timely manner, think how long you are going to be waiting for A-Sync to show up in a VESA-compatible monitor controller. Not to mention it may have missed the boat for inclusion in the very delayed HDMI 2.0-compatible TCONs I just mentioned, which I assume are also used for monitors to some extent.

Actually, I'm a little surprised Nvidia didn't just lump HDMI 2.0 functionality into their FPGA and sell that to all the TV companies alongside G-Sync monitors.
 

Atreidin

Senior member
Mar 31, 2011
464
27
86
What is the point of being so full of angst over the possibility of a legitimate alternative to G-sync? It sure feels like some people are putting an unhealthy amount of personal energy into deriding it.

OK, so it isn't out yet. So what? Nobody said it was. Nobody said it would be free either. Yes, duh, you would have to buy a new monitor, and possibly a new graphics card. Nobody who is interested in something like this, follows these developments, comments on these stories, and also owns a brain would think otherwise. It is no different with Gsync. However, to people who have some clue about what it takes to make a monitor, it looks like a monitor with the updated standard would be cheaper than the same monitor with the old standard plus a Gsync board. Saying crap like "who pays for this? consumers" sounds like you are trying to manipulate people with poor critical thinking skills, like some shill on Fox News. Consumers always pay for everything; it's the obligation of the company to squeeze every penny out of them it possibly can. There is no reason that one option can't be cheaper than another, similar option.

I mean, I don't see a reason for more than a reaction like "hey guys, we don't know everything yet, so don't get too excited, but either way it sure is great that a technology like this is moving forward!"

All else being equal, anybody sane and without profit motives should want an open standard, for reasons I won't go into because they should be obvious to even the dumbest humans. If Gsync is superior, but not in any significant way that the market cares about, it will probably still lose in the long run.

I also applaud you for the "I own AMD so I can't possibly have bias" claim, but it doesn't quite work that way.

At this point Nvidia should look into opening up their Gsync standard to competing video cards, but try to corner the market for the modules sold to monitor vendors. Then they can at least profit from the hardware sales and have the steady Gsync income that their valiant defenders keep moaning and groaning Nvidia "deserves for their hard work" :rolleyes: It is also very possible to make a profit and gain marketshare without the toxic "It's my ball and if you don't like it I'm going to take my ball and go home" attitude that Nvidia constantly displays.

If you want Gsync to succeed so badly, then instead of futilely battling the effects of the market, energy would be better spent going after Nvidia to open up Gsync to work with other vendors' video cards. And if for some insane reason you think a company deserves anything ever, then take solace in the fact that they get money from each Gsync module sold, plus the goodwill of working with other companies, and the market presence and brand recognition that come from having an Nvidia logo on each monitor. The more places they can put their weird line-eyeball-thing, the more people will go "hey, I recognize that logo" when shopping around.
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
If participants in this discussion do not stop using personal attacks in an effort to discredit the statements of others, the thread will be locked, and infractions will be handed out to the offenders.

The discussion is about VESA's adoption of Adaptive-Sync to the DisplayPort 1.2a specification. Get back on topic now, or get out of this thread.

-- stahlhart
 

Hitman928

Diamond Member
Apr 15, 2012
6,642
12,245
136
Mand, I honestly don't see why you are so worked up about this. I feel like you're looking for a reason to be upset about it.

I don't understand why the "SOURCE!!!!" clamor is so strong; all of this is easily available via a quick Google search and has been posted repeatedly in every discussion of FreeSync ever, but I'll do it again.

On why eDP isn't a magic bullet:

Who ever called eDP a magic bullet? The only reason it was mentioned is because AMD said that eDP is where the technology started and that displays meeting all of the eDP 1.0 standards could work with freesync. That's it. They didn't say eDP is here so nothing needs to be done. Obviously eDP isn't a "magic bullet", no one claimed it to be.

http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo

On hardware requirements:

First, AMD's original presentation:

http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014

They were rather emphatic on the "no new expensive hardware" bit in the presentation.

What AMD said:

AMD doesn’t want to charge for this technology since it’s already a part of a spec that it has implemented (and shouldn’t require a hardware change to those panels that support the spec), hence the current working name “FreeSync”.

How you seem to have interpreted it:

AMD said that freesync will work without any cost to anyone and will work on any monitor without any changes

AMD only said they won't charge for the technology (i.e. licensing fees, anyone in the industry understands this) because the needed tech is already in a spec that they use and that those monitors that support this spec (full eDP 1.0) won't need a hardware change.

Then, a day later:

http://www.pcper.com/reviews/Graphi...h-FreeSync-Could-Be-Alternative-NVIDIA-G-Sync

Bold added, and replace 1.3 with 1.2a and you have the same situation. AMD first said it didn't require new hardware, and then later said it requires a new controller and that their intent was to encourage display manufacturers to develop the necessary hardware. And who is going to pay for that development? The consumer.

What you quoted is exactly what they said in the previous quote. Those displays that already fully (i.e. including the optional variable refresh) support eDP 1.0 or the upcoming DP1.2a do not need additional hardware. Nothing changed between the two quotes you seem to think contradict each other. Obviously changes need to be made to monitors that don't support the spec, which includes pretty much every desktop monitor. That's why the point of this thread is that this technology is now a part of the regular DP spec (still optional).

Any new spec change will require hardware adjustments in almost every case. This will cost money to implement; everyone with any familiarity with the industry knows this. The point is that there won't be licensing fees associated with the technology, and that it won't require a large "discrete" (read: expensive) hardware addition to the monitor but can instead be incorporated into the existing controllers to keep cost down. Sure, monitors that support Async will probably be more expensive than those that don't, but the premium shouldn't be anywhere near what a Gsync-enabled monitor costs, and the feature may even become widespread enough for it to be negligible.

Beyond that, the demo presented at CES claims to show FreeSync in action, which it doesn't. If you look at the video itself

While the video doesn't show much, here is what Anandtech, who was actually there, had to say about the demo:

the system on the right is able to vary its frame rate and synchronize presenting each frame to the display's refresh rate

Again, it's obvious that Nvidia has a large lead time on AMD here, and that the CES demo was probably put together last minute to try and counteract the Gsync hype, and hence isn't much. How well does freesync really work? No one knows, and on this you have a point. I am also a skeptic about how well it will actually perform, but there's a way to hold reasonable skepticism without jumping to numerous conclusions based on that skepticism. The same goes for those announcing the death of Gsync without seeing a single working product with freesync enabled.

I'm an AMD owner, not Nvidia, so don't even start with anti-AMD bias on my part.

Completely irrelevant to the discussion.

As I said, I'm skeptical about freesync as well, but I think you need to take a bit of a step back and just wait and see. We all know how you feel about it; no need to keep beating the same drum. In the end you'll be right or wrong, and time will tell. If freesync works even close to the level of Gsync, I think Nvidia will have a hard time. There are plenty of examples in this industry where the market chooses the inferior product because of pricing, compatibility, etc. (e.g. USB). VESA putting this into the standard helps, and AMD has given a timeline of 6-12 months for monitors supporting the new standard to come out. If that happens, I'll look forward to the reviews; if not, the longer it goes the more dead it will become.

I will say this: Gsync looks like a great piece of technology, but Nvidia needs to get the cost of Gsync down even if freesync fails. The price they are charging is what most people will pay for their video cards, and more than a lot of people's monitors cost. Sure, they'll sell some either way, but if they want Gsync to actually be profitable for them, or even a good value-add for their cards, they're going to need to get the price to more accessible levels. That's just business.
 

Mand

Senior member
Jan 13, 2014
664
0
0
I'm not upset, and if all people can do is say I shouldn't be believed because I'm being a meanypants to AMD, I'm not sure why I should bother responding.
 

SoulWager

Member
Jan 23, 2013
155
0
71
One, this isn't a discussion about AMD. AMD gave this baby up for adoption. The burden will have to be carried by whichever display manufacturer first decides the development costs of a-sync compatible scalers are justified, despite the option to sell smaller volumes of high margin displays with g-sync.

This announcement is just paperwork that defines the interface and basic requirements, it doesn't tell us which, if any, display manufacturers are working on supporting it, when they started working on it, or how well it's going to work.

While the video doesn't show much, here is what Anandtech, who was actually there, had to say about the demo: *snip*

Did Anand come to that conclusion from playing with the demo, or based on what AMD said about a-sync? The slow motion video shows "smooth" and "not smooth" in a way that doesn't require a-sync or anything like it.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
One, this isn't a discussion about AMD. AMD gave this baby up for adoption. The burden will have to be carried by whichever display manufacturer first decides the development costs of a-sync compatible scalers are justified, despite the option to sell smaller volumes of high margin displays with g-sync.

This announcement is just paperwork that defines the interface and basic requirements, it doesn't tell us which, if any, display manufacturers are working on supporting it, when they started working on it, or how well it's going to work.



Did Anand come to that conclusion from playing with the demo, or based on what AMD said about a-sync? The slow motion video shows "smooth" and "not smooth" in a way that doesn't require a-sync or anything like it.

adaptive-sync compatible scalers already exist. They use them on laptops.
 

dacostafilipe

Senior member
Oct 10, 2013
797
297
136
adaptive-sync compatible scalers already exist. They use them on laptops.

The problem is that in laptops almost everybody uses eDP, and with eDP, VESA told them exactly how to build the controller.

On the desktop side there is no such blueprint; you "just" have to pass certain tests and you're done. This is an important point that argues against a fast implementation of A-Sync.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
adaptive-sync compatible scalers already exist. They use them on laptops.

Technically this isn't entirely true. A-sync is a DP feature, and the scalers you are talking about are for eDP. That said, I imagine they would be quite similar to what you'd use for DP, and I don't think it will be all that problematic for manufacturers to produce DP-compliant scalers with a-sync.
 

Mand

Senior member
Jan 13, 2014
664
0
0
adaptive-sync compatible scalers already exist. They use them on laptops.

False. No adaptive sync scalers have been demonstrated.

What has been demonstrated is that you can use eDP connections in laptops to manipulate vblank to change the refresh rate from one static value to another static value. It has --NOT-- been demonstrated that eDP is capable of doing this fast enough for true, frame-by-frame, GPU-synced variable refresh.

In addition, the reason the eDP and laptop discussion even exists is because eDP systems don't have the scaler that desktop displays have. So even if adaptive sync did work on laptops, which has not been demonstrated, it would not be because of an adaptive sync compatible scaler.
 

dacostafilipe

Senior member
Oct 10, 2013
797
297
136
False. No adaptive sync scalers have been demonstrated.

What has been demonstrated is that you can use eDP connections in laptops to manipulate vblank to change the refresh rate from one static value to another static value. It has --NOT-- been demonstrated that eDP is capable of doing this fast enough for true, frame-by-frame, GPU-synced variable refresh.

In addition, the reason the eDP and laptop discussion even exists is because eDP systems don't have the scaler that desktop displays have. So even if adaptive sync did work on laptops, which has not been demonstrated, it would not be because of an adaptive sync compatible scaler.

I don't know what the scaler has to do with A-Sync ... but hey :awe:

Edit: Let's just stick with "controller" or with how it's called in eDP, TCON. Ok ?
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
I don't know what the scaler has to do with A-Sync ... but hey :awe:

Edit: Let's just stick with "controller" or with how it's called in eDP, TCON. Ok ?

I believe all the talk about the scaler is due to an Nvidia guy mentioning that certain capabilities necessary for a-sync are not possible with current DP scalers (which is why Nvidia went out and made the g-sync module, which replaces the scaler). However, I also think I read somewhere that this is actually only a problem for Nvidia, since the necessary capabilities can be handled by the GPU in AMD's case.

I'll admit that I'm pretty foggy on the whole thing though.

Edit: here's the link to the Nvidia guy talking about the scaler: http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo
and here's the link to the AMD guy mentioning how they can do dynamic refresh rates on the GPU, unlike Nvidia who therefore has to rely on the scaler: http://www.techpowerup.com/196557/amd-responds-to-nvidia-g-sync-with-freesync.html?cp=3
 

Mand

Senior member
Jan 13, 2014
664
0
0
However, I also think I read somewhere that this is actually only a problem for Nvidia, since the necessary capabilities can be handled by the GPU in AMD's case.

and here's the link to the AMD guy mentioning how they can do dynamic refresh rates on the GPU, unlike Nvidia who therefore has to rely on the scaler: http://www.techpowerup.com/196557/amd-responds-to-nvidia-g-sync-with-freesync.html?cp=3

Considering that you still need the GPU to be able to send out variable refresh signals even with the G-Sync module, I'm not sure I believe AMD's assessment of Nvidia's capabilities.

Both AMD and Nvidia acknowledge that you need a controller capable of variable refresh. No such controller has been demonstrated yet, except for the G-Sync module.

Doing dynamic refresh rates on the GPU is one thing. Having those dynamic refresh rates actually make it to the display is another. If it were as easy as AMD has claimed, why haven't they shown it yet?
 

SoulWager

Member
Jan 23, 2013
155
0
71
I believe all the talk about the scaler is due to an Nvidia guy mentioning that certain capabilities necessary for a-sync are not possible with current DP scalers (which is why Nvidia went out and made the g-sync module, which replaces the scaler). However, I also think I read somewhere that this is actually only a problem for Nvidia, since the necessary capabilities can be handled by the GPU in AMD's case.

I'll admit that I'm pretty foggy on the whole thing though.

Edit: here's the link to the Nvidia guy talking about the scaler: http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo
and here's the link to the AMD guy mentioning how they can do dynamic refresh rates on the GPU, unlike Nvidia who therefore has to rely on the scaler: http://www.techpowerup.com/196557/amd-responds-to-nvidia-g-sync-with-freesync.html?cp=3
Variable refresh isn't something that's implemented in either the GPU or the display alone; it has to be implemented everywhere. About the only thing that could be moved from the g-sync module into the GPU is the RAM that holds the frame for panel self-refresh, re-sending frames if the next one isn't done by the time the LCD hits its maximum refresh interval.
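As a rough illustration of that re-send-on-timeout behaviour, here is a sketch of the scan-out logic. The queue-based structure and the 30-144 Hz limits are my own assumptions for illustration, not Nvidia's actual firmware design:

```python
# Rough sketch of re-sending a buffered frame on timeout, as described above.
# Hypothetical structure, not Nvidia's actual G-Sync implementation.
import queue
import time

MAX_INTERVAL = 1 / 30    # panel must be refreshed at least every ~33 ms
MIN_INTERVAL = 1 / 144   # and can't be refreshed faster than 144 Hz

def scanout_loop(frames: queue.Queue, scan_out):
    last_frame = None    # the frame held in the module's RAM
    last_scan = time.monotonic()
    while True:
        # Wait for a new frame from the GPU, but no longer than the panel's
        # maximum refresh interval allows.
        remaining = MAX_INTERVAL - (time.monotonic() - last_scan)
        try:
            last_frame = frames.get(timeout=max(remaining, 0))
        except queue.Empty:
            pass  # timed out: keep last_frame and re-send it below
        # Never drive the panel faster than its minimum refresh interval.
        elapsed = time.monotonic() - last_scan
        if elapsed < MIN_INTERVAL:
            time.sleep(MIN_INTERVAL - elapsed)
        scan_out(last_frame)  # on timeout this repeats the buffered frame
        last_scan = time.monotonic()
```

The local frame buffer is what makes the timeout path possible: without a copy of the last frame held near the panel (or in the GPU), there would be nothing to re-send when the GPU runs long.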

All the talk about developing new display controllers is due to the fact that AMD hasn't yet demonstrated a monitor's refresh rate adapting to changing framerate, despite what they've said about eDP.
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
Considering that you still need the GPU to be able to send out variable refresh signals even with the G-Sync module, I'm not sure I believe AMD's assessment of Nvidia's capabilities.

In a previous argument you tout Raja Koduri's knowledge as impeccable, but here in this statement you aren't so sure. Which is it? You can't have it both ways.

Both AMD and Nvidia acknowledge that you need a controller capable of variable refresh. No such controller has been demonstrated yet, except for the G-Sync module.

You either need a controller or you don't, i.e. you direct-drive the monitor from the GPU à la eDP. There are some Korean monitors out there right now that are set up with direct drive.

Doing dynamic refresh rates on the GPU is one thing. Having those dynamic refresh rates actually make it to the display is another. If it were as easy as AMD has claimed, why haven't they shown it yet?

Ahh, they did, with their demo. That, and the fact that they went with the standards approach. Proprietary usually means you have partners and product to show when you announce, because you've already been working on the project for some time. With a standard, you have to wait for it to be approved before you start the work of productizing it. It's pretty clear that many users on this forum don't work on projects in large corporations. Their assumption, like yours, is that you can just whip stuff up and show it off. There are some cases, like 802.11n, where companies released "pre-draft" products, but it's rare, because you could guess incorrectly where the standard ends up and be left with obsolete product.

Either way, when the project is started there is a lot of work and, frankly, corporate bureaucracy to get through. A project plan needs to get made, budgets need to get done, capital allocated, departmental resources (aka people) assigned, marketing done, any manufacturing requirements completed, etc. The list is quite long no matter what industry you are in. That's why AMD is estimating 6-12 months for productization. They have clearly started the work on their side, but now it's hurry up and wait for the TCON, scaler, and monitor manufacturers to take the spec and implement it on their side.
 

Mand

Senior member
Jan 13, 2014
664
0
0
In a previous argument you tout Raja Koduri's knowledge as impeccable, but here in this statement you aren't so sure. Which is it? You can't have it both ways.

I treat him as a source of great accuracy with regard to AMD's capabilities, but not so much as far as Nvidia's capabilities go. Why is this unreasonable, exactly?

And, again, no, they did NOT show variable refresh in their demo. What they showed, and you can look for yourself since they posted the video, is two laptops with FIXED, STATIC refresh rates that weren't varying. You keep repeating a claim that is directly contradicted by video evidence (the OSD of the laptops shows exactly what they're doing: one is running 30 FPS at 60 Hz vsync, the other 50 FPS at 50 Hz vsync).

You're right that we're waiting on the TCON, scaler, and monitor manufacturers to take it up. That's my whole, entire point. Adaptive sync does not yet exist. It has not been demonstrated that the eDP technique actually results in variable refresh. These are facts. I don't pretend to know what AMD is doing in its top secret internal research, but I do know they've never actually demonstrated variable refresh yet. The CES demo did not do it.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Ahh, they did, with their demo. That, and the fact that they went with the standards approach. Proprietary usually means you have partners and product to show when you announce, because you've already been working on the project for some time. With a standard, you have to wait for it to be approved before you start the work of productizing it.

There wasn't a standard before Adaptive-Sync, and AMD hasn't released a driver for Freesync on a system with an eDP display since CES...

DP 1.2a uses AMD's own technology. No other company is able to support it, and even AMD provides Freesync on only a few cards.

AMD sent their documents to VESA last November. They haven't been able to produce a demo system since then.

It's clear that up to now Freesync is vaporware.

/edit: Philips and AOC expect to bring Adaptive-Sync displays to market in Q1 next year (German): http://www.computerbase.de/2014-05/aoc-philips-g-sync-ist-adaptive-sync-ueberlegen/
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
adaptive-sync compatible scalers already exist. They use them on laptops.

first a-sync laptop maybe...
http://h10032.www1.hp.com/ctg/Manual/c04188028.pdf

Product Name: HP ENVY m6 Notebook PC
Processors:
● AMD™ A10-5750M 2.50-GHz processor (turbo up to 3.50 GHz; 1600-MHz FSB, 4.0-MB L2 cache, 1600-MHz DDR3, quad core, 35 W)
● AMD A10-7300 2.0-GHz processor (turbo up to 3.20 GHz; soldered on chip (SOC), 4.0-MB L2 cache, quad core, 19 W)
Chipset: AMD A76M fusion controller hub
Graphics: Internal graphics:
● AMD Radeon™ HD 8650G graphics on computer models equipped with an A10-5750M processor
● AMD graphics on computer models equipped with an A10-7300 processor
Support for HD Decode, DX11, HDMI, and PX7.0
Panel: Support for the following display assemblies:
● 15.6-in, full high-definition (FHD), white light-emitting diode (WLED), AntiGlare (1920×1080), slim (3.2-mm), SVA, color gamut 60%, TN, typical brightness 300 nits, 16:9 aspect ratio
● 15.6-in, high-definition (HD), white light-emitting diode (WLED), AntiGlare (1366×768), flat (3.8-mm), SVA, color gamut 45%, typical brightness 200 nits, 16:9 aspect ratio
Support for low-voltage differential signalling (LVDS), co-layout with eDP 1.3 + PSR
Touchscreen and MultiTouch enabled
Airgap bonding
 