[Sweclockers] AMD opens up about Freesync


Gunbuster

Diamond Member
Oct 9, 1999
6,852
23
81
all of the companies that go with the latest DP 1.2a will update their TCONs and scalers to support it. Their TCONs and scalers are either programmable, so their vendor(s) need to develop the update to flash to existing hardware, or they are fixed function and they need new ASICs.

I wouldn't hold my breath. Just look at how well HDMI 2.0 is coming along. All the TV OEMs made it sound like their first-gen 4K sets would get a magical update to the TCON/scaler, and that's a major standard, not an optional superset of a point revision of the DisplayPort standard.

We will get the same line from the monitor makers: "Working on it," "Coming soon," "Maybe in a firmware update."
 

SoulWager

Member
Jan 23, 2013
155
0
71
You didn't even read the FAQs or the interviews. F-Sync = eDP implementation. A-Sync = standards based, so all of the companies that go with the latest DP 1.2a will update their TCONs and scalers to support it. Their TCONs and scalers are either programmable, so their vendor(s) need to develop the update to flash to existing hardware, or they are fixed function and they need new ASICs. NONE of them are using FPGAs because they are way too expensive. AMD isn't designing the hardware like Nvidia did, hence the standards-based approach.

1.2a is just the most recent DisplayPort standard; adaptive sync is not required in order to use that version of DisplayPort. There is additional verification in order to use the adaptive sync branding.

Saying all 1.2a displays will support adaptive sync because the standard supports adaptive sync is equivalent to saying all 1.2a displays will be 4k because 1.2a supports 4k. That's simply not how it works.


The important thing is that we still haven't been shown that existing scalers/TCONs can do what AMD claims, which casts significant doubt on their estimated timeframe.
 

Squeetard

Senior member
Nov 13, 2004
815
7
76
Ima go a bit off topic here and say that game engines also need to be smoothed out. I run a 120 Hz LightBoost monitor at 120 fps, beautiful blur-free gaming. But when I go up or down stairs or steep slopes in a lot of games, the screen judders horribly. The camera movement needs tweaking for sure.
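The usual engine-side fix is to stop snapping the camera to the stepped player position and ease it toward the target each frame. A minimal sketch of that idea, with an invented smoothing constant and a fake stair-step path:

```python
# Sketch: exponential smoothing of camera height so stair-stepped player
# movement doesn't become visible judder. All numbers are invented.
import math

def smooth_camera_y(player_y, camera_y, dt, time_constant=0.05):
    """Move the camera a framerate-independent fraction of the way to the player."""
    alpha = 1.0 - math.exp(-dt / time_constant)
    return camera_y + (player_y - camera_y) * alpha

camera_y = 0.0
dt = 1.0 / 120.0  # frame time at 120 fps
for frame in range(12):
    player_y = (frame // 3) * 0.25  # player "pops" up a stair step every 3 frames
    camera_y = smooth_camera_y(player_y, camera_y, dt)
    print(f"frame {frame:2d}: player {player_y:.2f} camera {camera_y:.3f}")
```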
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
This is not true. A-Sync support is an OPTION. Not "all" companies will update their hardware to support it. Considering there hasn't been an announcement of even one company saying they will, there is no reason to believe this.

Again, standards have two things: requirements, and options. A-Sync is not a requirement, it is an option. Being 1.2a compliant does not tell you whether or not it is A-Sync compliant. And no, it is not just a firmware flash to make them compliant.

"Standards based approach" doesn't mean you magically don't have to do R&D. Someone has to do it. Until a couple weeks ago, AMD specifically stated that they weren't doing it, that they were pushing FreeSync in order to encourage display OEMs to develop the necessary hardware. Now, they say that they're partnering with hardware vendors, but neither AMD nor the vendors will say who they are.

I can't understand how people can continue repeating these clearly, demonstrably, provably false statements. Taking what they said in the FAQ at face value is only as good as the FAQ's honesty, and given that their claims are easily disproven by a simple Google search, you should not take anything they say regarding FreeSync at face value. They have been deceptive and underhanded since day one with their CES demo, and have not demonstrated any improvement whatsoever.

1.2a is just the most recent DisplayPort standard; adaptive sync is not required in order to use that version of DisplayPort. There is additional verification in order to use the adaptive sync branding.

Saying all 1.2a displays will support adaptive sync because the standard supports adaptive sync is equivalent to saying all 1.2a displays will be 4k because 1.2a supports 4k. That's simply not how it works.


The important thing is that we still haven't been shown that existing scalers/TCONs can do what AMD claims, which casts significant doubt on their estimated timeframe.

You both need better reading comprehension. DP 1.2a was already the latest DP spec before A-Sync was approved. So when a company is "going with the latest DP 1.2a spec," i.e. implementing A-Sync, the rest of what I said follows.

You both refuse to acknowledge that programmable TCONs and scalers exist, and you also don't get that none of the physical input/output specs are changing. It's just a logic change that needs to be written into the firmware. Fixed-function ASICs are the only type of TCON/scaler that needs a brand new hardware design.
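For what it's worth, the "logic change" being claimed would look roughly like this. This is an illustrative sketch only, with the hardware stubbed out so the control flow runs and the panel limits invented; whether existing scalers can actually run anything like it is exactly what's in dispute below.

```python
# Illustrative pseudologic only -- not real TCON/scaler firmware. The
# "hardware" is stubbed with random frametimes so the loop can execute.
import random

PANEL_MIN_MS, PANEL_MAX_MS = 6.9, 33.3  # hypothetical 144 Hz panel with a 30 Hz floor

def wait_for_frame(timeout_ms):
    """Stub: the next frame arrives after a random render time, or we time out."""
    render_ms = random.uniform(5.0, 45.0)
    return min(render_ms, timeout_ms), render_ms <= timeout_ms

def adaptive_refresh_loop(frames=8):
    """A fixed-refresh scaler scans out on a timer; the claimed 'logic change'
    is to hold vblank open until a frame arrives, within the panel's limits."""
    for _ in range(frames):
        waited_ms, got_new = wait_for_frame(timeout_ms=PANEL_MAX_MS)
        scanout_ms = max(waited_ms, PANEL_MIN_MS)  # can't scan out faster than the panel
        source = "new frame" if got_new else "repeat last frame (hit the panel floor)"
        print(f"scan-out after {scanout_ms:5.1f} ms: {source}")

adaptive_refresh_loop()
```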

BTW Mand, three companies have already committed to A-Sync.
 

Mand

Senior member
Jan 13, 2014
664
0
0
You both need better reading comprehension. DP 1.2a was already the latest DP spec before A-Sync was approved. So when a company is "going with the latest DP 1.2a spec," i.e. implementing A-Sync, the rest of what I said follows.

Which is wrong. Perhaps rather than insulting us, you should address what we are actually saying.

DP 1.2a was already the latest DP spec. A-Sync requires new hardware to function. A-Sync therefore cannot be required, and in fact it isn't required: if there are already DP 1.2a devices out there that don't support A-Sync, because up until a few weeks ago the spec didn't exist, then it can't be required.

Do you understand the difference between a requirement and a supported option?

I am not refusing to acknowledge that programmable TCONs and scalers exist. But programmable TCONs and scalers capable of variable refresh don't exist, according to statements by both AMD and Nvidia. It's not just a logic change that needs to be written into the firmware, according to statements by both AMD and Nvidia. Why you think otherwise, when two competing companies have told you otherwise, I would like to know. Are you familiar with TCON and scaler programming? Could you take an existing DP 1.2a monitor and change its TCON and scaler to make the change? Because the best in the business said it would require new hardware, and if you have any evidence to prove them wrong I'd love to see it.

You say that fixed function ASICs are the only type of TCON/scaler that needs a brand new hardware design. Maybe you're right, but [citation needed].
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
This seems to be ignoring the fact that I spoke to all the monitor manufacturers and not a single one said their monitors could be updated; many in fact offered the exact opposite opinion and told me it would take new hardware. The facts do not agree with Despoiler in this case; it's simply untrue that existing hardware in any form supports A-Sync. It might be theoretically true, but I cannot find a single product for which it is true out there in the real world.

The A-Sync adjustment to the spec is optional; VESA themselves said that. It won't be in all DP 1.2a devices, and how many it will be in we have no idea, because we have no product announcements, not a single one.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Despoiler claims 3 companies have already committed to A-Sync: which ones are they? I haven't seen any announcements.
 

SoulWager

Member
Jan 23, 2013
155
0
71
You both need better reading comprehension. DP 1.2a was already the latest DP spec before A-Sync was approved. So when a company is "going with the latest DP 1.2a spec," i.e. implementing A-Sync, the rest of what I said follows.
A-sync implies 1.2a. 1.2a does not imply a-sync. If you want to know if a monitor has a-sync, you look for the a-sync branding, not the DisplayPort version number (see the sketch at the end of this post).
You both refuse to acknowledge that programmable TCONs and scalers exist, and you also don't get that none of the physical input/output specs are changing. It's just a logic change that needs to be written into the firmware. Fixed-function ASICs are the only type of TCON/scaler that needs a brand new hardware design.
Even "programmable" hardware has some fixed function logic. A-sync wasn't part of the design requirements for any existing scalers or TCONs, so you shouldn't assume it will work the way you want it to until you see a demonstration of these scalers adapting refresh rate to an unpredictably changing framerate.

If that has been demonstrated, show me.
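To make the branding-versus-version point concrete, here's a rough sketch of how a source would actually probe a sink for support. As I understand the spec, Adaptive-Sync sinks advertise that they can ignore the MSA timing parameters via a DPCD capability bit; the read_dpcd helper and its returned value are hypothetical stand-ins for a real AUX-channel read.

```python
# Sketch: support is a capability bit, not a version number.
DP_DOWN_STREAM_PORT_COUNT = 0x0007  # DPCD register carrying the capability bit
MSA_TIMING_PAR_IGNORED = 1 << 6     # sink can ignore MSA timing parameters

def read_dpcd(address):
    """Hypothetical stand-in for a real AUX-channel DPCD read."""
    return 0x41  # example value with bit 6 set

def sink_supports_adaptive_sync():
    # The DP revision (DPCD 0x0000) says nothing here; check the capability bit.
    return bool(read_dpcd(DP_DOWN_STREAM_PORT_COUNT) & MSA_TIMING_PAR_IGNORED)

print("Adaptive-Sync capable sink:", sink_supports_adaptive_sync())
```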
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I know this has been seen by most here. I don't know if it's been posted. It answers many of the queries.



http://www.techpowerup.com/200741/g...ds-adaptive-sync-to-displayport-standard.html
The Video Electronics Standards Association (VESA) today announced the addition of 'Adaptive-Sync' to its popular DisplayPort 1.2a video interface standard. This technology delivers several important capabilities to computer users: Adaptive-Sync provides smoother, tear-free images for gaming and judder-free video playback. It also significantly reduces power consumption for static desktop content and low frame rate video.

Computer monitors normally refresh their displays at a fixed frame rate. In gaming applications, a computer's CPU or GPU output frame rate will vary according to the rendering complexity of the image. If a display's refresh rate and a computer's render rate are not synchronized, visual artifacts (tearing or stuttering) can be seen by the user. (1) DisplayPort Adaptive-Sync enables the display to dynamically match a GPU's rendering rate, on a frame-by-frame basis, to produce a smoother, low latency, gaming experience. In applications where the display content is static, such as surfing the web, reading email, or viewing a slide presentation, DisplayPort Adaptive-Sync allows the display refresh rate to be reduced seamlessly, lowering system power and extending battery life.

During the playback of lower frame rate video content, Adaptive-Sync allows the source to optimize transport of the video format leveraging OS and DisplayPort interfaces. In addition to providing smoother video playback, the lower frame rate enabled by Adaptive-Sync also reduces power demand, extending battery life.

"DisplayPort Adaptive-Sync enables a new approach in display refresh technology," said Syed Athar Hussain, Display Domain Architect, AMD and VESA Board Vice Chairman. "Instead of updating a monitor at a constant rate, Adaptive-Sync enables technologies that match the display update rate to the user's content, enabling power efficient transport over the display link and a fluid, low-latency visual experience."

(2) Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA's embedded DisplayPort (eDP) specification since its initial rollout in 2009. As a result, Adaptive-Sync technology is already incorporated into many of the building block components for displays that rely on eDP for internal video signaling. Newly introduced to the DisplayPort 1.2a specification for external displays, this technology is now formally known as DisplayPort Adaptive-Sync.

"VESA is constantly evaluating new methods and technologies that add value to both the end user and our OEM member companies. Adaptive-Sync delivers clearly visible advantages to the user for gaming and live video, and contributes to the development of sleeker mobile system designs by reducing battery power requirements," said Bill Lempesis, VESA Executive Director. "VESA has developed a test specification to certify Adaptive-Sync compliance. Systems that pass Adaptive-Sync compliance testing will be allowed to feature the official Adaptive-Sync logo on their packaging, informing consumers which DisplayPort-certified displays and video sources offer Adaptive-Sync."

Implementation of DisplayPort Adaptive-Sync is offered to VESA members without any license fee.

Point (1): It updates the refresh rate on a frame-by-frame basis. So no, it's not a static reduced frame rate like some have said.

Point (2): Adaptive-Sync has been around since 2009 as part of the eDP standard. The components for the tech already exist.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
No, this "tech" hasn't been around since 2009.

PSR works on guessing: the GPU needs to know what comes next, so it is only usable for static frame rates, like videos.

PSR doesn't work for games or variable refresh rates because all refresh rates need to be disclosed to the GPU in advance, and the PSR command is sent within the vblank interval before the next frame.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Adaptive-Sync is not only "frame-to-frame". It also brings "PSR" to the desktop market. And this has been around since 2009.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I know this has been seen by most here. I don't know if it's been posted. It answers many of the queries.



http://www.techpowerup.com/200741/g...ds-adaptive-sync-to-displayport-standard.html


Point (1): It updates the refresh rate on a frame-by-frame basis. So no, it's not a static reduced frame rate like some have said.

Point (2): Adaptive-Sync has been around since 2009 as part of the eDP standard. The components for the tech already exist.

Part of the problem there is that you have mixed up what techpowerup said and what VESA said. Point (1) is actually not a quote from VESA; it's techpowerup that said that, otherwise it would be in quotation marks like the actual statements VESA made. The actual quote about what the technology can do is:

Instead of updating a monitor at a constant rate, Adaptive-Sync enables technologies that match the display update rate to the user's content, enabling power efficient transport over the display link and a fluid, low-latency visual experience.

And quite rightly, point (2) says that this is a technology that has been present since 2009 with the eDP spec. But there is a big difference between the capabilities claimed in point (1) by techpowerup and the actual quote that VESA gave (above), which they are also quoted on in point (2).

The actual quotation is a bit of an issue because it is nowhere near as strong a claim about capabilities as the one techpowerup made; in fact, it just sounds like PSR. What is techpowerup's claim based on? Most likely a misunderstanding or a hopeful prediction, but it is definitely not something VESA said.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Computer monitors normally refresh their displays at a fixed frame rate. In gaming applications, a computer’s CPU or GPU output frame rate will vary according to the rendering complexity of the image. If a display’s refresh rate and a computer’s render rate are not synchronized, visual artifacts—tearing or stuttering—can be seen by the user. DisplayPort Adaptive-Sync enables the display to dynamically match a GPU’s rendering rate, on a frame-by-frame basis, to produce a smoother, low latency, gaming experience.

In applications where the display content is static—such as surfing the web, reading email, or viewing a slide presentation—DisplayPort Adaptive-Sync allows the display refresh rate to be reduced seamlessly, lowering system power and extending battery life.

Quoted from vesa.org
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Well, THIS is on VESA's website, which is where TPU, as well as others, got it all from. Where are you guys getting your info from?

Thanks, Leadbox, you beat me to it. ;)
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Well, THIS is on VESA's website, which is where TPU, as well as others, got it all from. Where are you guys getting your info from?

Thanks, Leadbox, you beat me to it. ;)

That's how I roll :p
I think some here are so busy looking for inconsistencies in what's being said that they end up glossing over the facts.
 

SoulWager

Member
Jan 23, 2013
155
0
71
I know this has been seen by most here. I don't know if it's been posted. It answers many of the queries.



http://www.techpowerup.com/200741/g...ds-adaptive-sync-to-displayport-standard.html


Point (1): It updates the refresh rate on a frame-by-frame basis. So no, it's not a static reduced frame rate like some have said.

Point (2): Adaptive-Sync has been around since 2009 as part of the eDP standard. The components for the tech already exist.
The adaptive sync standard in 1.2a requires changing the refresh rate on a frame-by-frame basis, but that's not what was demonstrated by AMD at CES, nor has it been demonstrated that any existing eDP controller can do this. Adaptive sync requires that the display change framerate as frames arrive, without being told in advance. eDP does not require this; it's perfectly okay for a power-saving feature if the GPU sends an SDP telling the display a frame or so in advance when it wants to switch to a lower/higher refresh rate, because latency isn't a significant concern in that use case.
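A toy comparison with made-up frametimes shows why advance notice is fine for predictable content but falls apart for a game:

```python
# Toy comparison, with made-up frametimes (ms). eDP-style power saving picks
# the next refresh rate ahead of time, so it's only right when frametimes are
# predictable; a reactive (adaptive sync) display just follows each frame.
frametimes = [20.0, 20.0, 31.0, 17.0, 26.0]  # a game: unpredictable

announced = 20.0  # eDP-style guess, carried over from the previous frame
for ft in frametimes:
    miss = ft - announced
    verdict = "ok" if miss == 0 else f"{miss:+.0f} ms wrong -> stutter/tear risk"
    print(f"announced {announced:4.0f} ms, actual {ft:4.0f} ms: {verdict}")
    announced = ft  # best available guess for the next frame
```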
Adaptive-Sync is not only "frame-to-frame". It also brings "PSR" to the desktop market. And this has been around since 2009.
If I interpret one of AMD's comments about triple buffering correctly, adaptive sync might not require PSR. G-Sync uses PSR as a design choice, presumably because it removes some GPU and link overhead, or maybe because it means simpler drivers. Basically, when the G-Sync display hits its minimum framerate, it uses the framebuffer inside the display to refresh the panel. As the maximum amount of time between refreshes depends on the specific panel, this simplifies things for the GPU, because it no longer has to worry about re-sending frames or figuring out how long it should wait before re-sending them. If you want to make the display cheaper, you can use GPU memory to hold that third framebuffer and put the maximum interval in the EDID, so that the GPU will only re-send frames from that third buffer if it's going to hit that maximum refresh interval.

There are tradeoffs here. I think the PSR method can be slightly superior, but there's much room for improvement on Nvidia's current design. The biggest issue is the PSR delaying the next new frame when the frametime is between the maximum refresh interval and the maximum+minimum refresh interval (25 to 30 fps on the VG248QE). If the display is more proactive about when it decides to update the panel from the internal buffer, that could mostly be avoided. For example, if the previous frametime was 30 ms, self-refresh the panel at 15 ms, so that if the frametime drifts up to 35 ms, you don't have to wait until 40 ms for the PSR to finish before re-updating the panel with the fresh frame.

A triple buffering approach could do slightly better at choosing when to pull the extra refresh forward, because the GPU would have more information on, and control over, frame pacing than the display, but it would come at the cost of extra GPU memory and some (likely trivial) overhead. The reason I think the PSR approach is slightly better is that the extra memory is likely cheaper if it comes with the display than if it comes on the GPU (it can be sized more efficiently based on the resolution of the display, and can be plain DDR3 instead of GDDR5).
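The VG248QE corner case above works out as follows; a sketch of the described scenario with approximated intervals (144 Hz panel, ~30 Hz floor), not measured data:

```python
# Worked numbers for the PSR corner case, approximating the VG248QE:
# 144 Hz panel (min ~6.9 ms between scan-outs), ~33.3 ms maximum interval.
MIN_MS, MAX_MS = 6.9, 33.3

def frame_displayed_at(frame_arrives_ms, self_refresh_at_ms):
    """When the new frame hits the panel, given when the display last
    self-refreshed from its internal buffer (panel is busy for MIN_MS)."""
    panel_free_at = self_refresh_at_ms + MIN_MS
    return max(frame_arrives_ms, panel_free_at)

# Naive PSR: self-refresh only when forced to, at the 33.3 ms deadline.
print(f"{frame_displayed_at(35.0, MAX_MS):.1f} ms")  # ~40.2 ms: frame waits on PSR
# Proactive: previous frametime was 30 ms, so self-refresh early, at 15 ms.
print(f"{frame_displayed_at(35.0, 15.0):.1f} ms")    # 35.0 ms: panel already free
```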
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The adaptive sync standard in 1.2a requires changing the refresh rate on a frame-by-frame basis, but that's not what was demonstrated by AMD at CES, nor has it been demonstrated that any existing eDP controller can do this. Adaptive sync requires that the display change framerate as frames arrive, without being told in advance. eDP does not require this; it's perfectly okay for a power-saving feature if the GPU sends an SDP telling the display a frame or so in advance when it wants to switch to a lower/higher refresh rate, because latency isn't a significant concern in that use case.

If I interpret one of AMD's comments about triple buffering correctly, adaptive sync might not require PSR. G-Sync uses PSR as a design choice, presumably because it removes some GPU and link overhead, or maybe because it means simpler drivers. Basically, when the G-Sync display hits its minimum framerate, it uses the framebuffer inside the display to refresh the panel. As the maximum amount of time between refreshes depends on the specific panel, this simplifies things for the GPU, because it no longer has to worry about re-sending frames or figuring out how long it should wait before re-sending them. If you want to make the display cheaper, you can use GPU memory to hold that third framebuffer and put the maximum interval in the EDID, so that the GPU will only re-send frames from that third buffer if it's going to hit that maximum refresh interval. There are tradeoffs here. I think the PSR method can be slightly superior, but there's much room for improvement on Nvidia's current design. The biggest issue is the PSR delaying the next new frame when the frametime is between the maximum refresh interval and the maximum+minimum refresh interval (25 to 30 fps on the VG248QE). If the display is more proactive about when it decides to update the panel from the internal buffer, that could mostly be avoided. For example, if the previous frametime was 30 ms, self-refresh the panel at 15 ms, so that if the frametime drifts up to 35 ms, you don't have to wait until 40 ms for the PSR to finish before re-updating the panel with the fresh frame. A triple buffering approach could do slightly better at choosing when to pull the extra refresh forward, because the GPU would have more information on, and control over, frame pacing than the display, but it would come at the cost of extra GPU memory and some (likely trivial) overhead. The reason I think the PSR approach is slightly better is that the extra memory is likely cheaper if it comes with the display than if it comes on the GPU (it can be sized more efficiently based on the resolution of the display, and can be plain DDR3 instead of GDDR5).

Did you not read the VESA release? We've quoted it multiple times.

"DisplayPort Adaptive-Sync enables the display to dynamically match a GPU’s rendering rate, on a frame-by-frame basis, to produce a smoother, low latency, gaming experience."

This is what it states in VESA's press release about Adaptive-Sync. We haven't seen a review yet, but I hope they would know.

As for AMD's demo, it wasn't a scene whose rendering complexity changed. That could explain why the FPS was static. Instead we have people going: Look! Look! The frame rate isn't changing! It doesn't work! AMD is lying!!! They're good-for-nothing deceitful liars! Look! Look! They're lying! They're lying! Everybody, they're lying! Blah-blah-blah...

There are companies who are shooting for a Q1 release next year. I'm sure there will be demos before then.
 

SoulWager

Member
Jan 23, 2013
155
0
71
Did you not read the VESA release? We've quoted it multiple times.

"DisplayPort Adaptive-Sync enables the display to dynamically match a GPU’s rendering rate, on a frame-by-frame basis, to produce a smoother, low latency, gaming experience."

This is what it states in VESA's press release about Adaptive-Sync. We haven't seen a review yet, but I hope they would know.

As for AMD's demo, it wasn't a scene whose rendering complexity changed. That could explain why the FPS was static. Instead we have people going: Look! Look! The frame rate isn't changing! It doesn't work! AMD is lying!!! They're good-for-nothing deceitful liars! Look! Look! They're lying! They're lying! Everybody, they're lying! Blah-blah-blah...
Yes, I've read the VESA press release. You misunderstand the point of the VESA standard: it's not a proof of concept or a hardware implementation. It's an interface specification and a set of requirements that need to be met in order to use the adaptive sync branding. It's paperwork. So far, no implementation has been shown to meet those requirements.

I'm not sure what you think AMD's demo proves. To me, the "smooth" one looks identical to what you'd get if you ran a demo with v-sync on a display with its refresh rate set to a fixed 50 Hz, which basically any monitor can do, regardless of display technology or interface. Also, there's no way the framerate was limited by rendering complexity; it's 30k triangles. It's either framecapped in software or it's limited by v-sync. It's much easier to just change the refresh rate to 50 Hz than to actually code variable-refresh firmware and drivers, and if you were going to spend all the man-hours to do THAT, you'd certainly come up with a tech demo that shows something that can't be done on a monitor from last century. I'm not saying AMD lied about what was being demonstrated, just that they failed to correct people when they assumed the CES demo was a proof of concept instead of a simulation of what the difference in smoothness would look like. AMD needed to show SOMETHING at CES; what would you have done in their position?
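For what it's worth, the fixed-50 Hz equivalence is easy to check with toy numbers: with a source held at a perfectly steady 50 fps, both models present every frame at the same instant, so the demo can't distinguish them.

```python
# Toy check: with a steady 50 fps source, v-sync on a fixed 50 Hz display
# and a variable-refresh display present frames at identical times.
import math

REFRESH_MS = 20.0  # 50 Hz refresh interval
frame_ready = [i * 20.0 for i in range(5)]  # steady 50 fps source, in ms

vsync_50hz = [math.ceil(t / REFRESH_MS) * REFRESH_MS for t in frame_ready]
adaptive = frame_ready[:]  # refresh fires the moment the frame is ready

for t, v, a in zip(frame_ready, vsync_50hz, adaptive):
    print(f"ready {t:5.1f} ms -> v-sync {v:5.1f} ms, adaptive {a:5.1f} ms")
```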
There are companies who are shooting for a Q1 release next year. I'm sure there will be demos before then.
AMD isn't the party developing these displays. Keep that in mind when you decide how much weight to give their time estimates. If it's a first-party source (the display manufacturer that's actually implementing a-sync), I'd definitely be interested in it.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Did you not read the VESA release? We've quoted it multiple times.

"DisplayPort Adaptive-Sync enables the display to dynamically match a GPU’s rendering rate, on a frame-by-frame basis, to produce a smoother, low latency, gaming experience."

This is what it states in VESA's press release about Adaptive-Sync. We haven't seen a review yet, but I hope they would know.

As for AMD's demo, it wasn't a scene whose rendering complexity changed. That could explain why the FPS was static. Instead we have people going: Look! Look! The frame rate isn't changing! It doesn't work! AMD is lying!!! They're good-for-nothing deceitful liars! Look! Look! They're lying! They're lying! Everybody, they're lying! Blah-blah-blah...

There are companies who are shooting for a Q1 release next year. I'm sure there will be demos before then.

We've read the VESA release, but you are misconstruing what a VESA release does.

A VESA release is an update to a Word document. It does nothing to indicate that anyone has actually implemented the technology it describes. It merely says how you would have to do it if you were going to.

You and others have latched on to "A-Sync enables" as the key phrase. But the spec does nothing by itself; it takes actual hardware that implements the protocols in the spec. Is it a good thing that they made the spec? Yes. Did it make it more likely that FreeSync will become a reality? Yes, but only because it was sure to never happen without it. A VESA update is a necessary but not sufficient condition.

Your whole premise is based on this question: why would VESA implement something if they didn't think it would work? And it's a good, reasonable question. But they implement things that don't pan out all the time. That's the nature of optional updates. VESA allows for it, but does not mandate it, so there's no guarantee at all any of it will work, just that if it works it would have to look a certain way.

A VESA spec update IS NOT an indicator that they have an actual working implementation. It is not evidence that they have done frame-by-frame variable refresh. It is the paperwork that is the first step and that guides the future development. It is not a finished product. A spec update is a change to a Word document; it does not invent hardware, and it is not proof that hardware has been invented.

And as for AMD's demo, they DID lie about it. Your excuse that it was a fixed framerate because there wasn't variable rendering load is ridiculous. They could have done any kind of rendering sequence they wanted if the technology was working, from established benchmarks to something custom that goes from tame parts to melt-your-GPU parts. Except that's not what they showed. What they showed was two laptops, one going at 30 Hz VSYNC and one going at 50 Hz VSYNC. You can see this by looking at the damn video and reading the laptops' on-screen displays, as numerous other people have done. AMD's demo did not show frame-by-frame variable refresh. Nothing has, yet, from AMD's side.

And again, a VESA spec update is not proof that it works, only that if it worked it would have to follow certain protocols, and meeting the spec is no guarantee of A-Sync support. A good example is a 4K display. DP 1.2 provides support for 4K SST. That does not mean that because DP 1.2 happened, a 4K display capable of SST magically appeared on your desk; someone else had to develop it. The other part of the 4K example is that it shows what it means to be an optional part of the spec: DP 1.2 supports 4K SST displays, but that does not mean all DP 1.2 monitors are 4K SST. A 1080p TN 60 Hz piece of crap can still have DP 1.2, and that doesn't mean it can do 4K.

And no, the components for the tech DO NOT, I REPEAT, DO NOT already exist. AMD itself said it requires new hardware, and that the whole point of their FreeSync push and getting the VESA update was to encourage hardware manufacturers to develop it. This is straight from AMD's VP of Visual Computing. How you can ignore that and pretend it's just a firmware update is beyond me.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91

I don't see anything in the post you quoted to suggest that he thinks a simple firmware update will do the trick.
You might want to rethink your whole DP 1.2 and 4K analogy; it's all kinds of messed up and doesn't convey well what you're trying to say.
Lumping in anything with everything else doesn't help either ;)
 

Mand

Senior member
Jan 13, 2014
664
0
0
You might want to rethink your whole DP 1.2 and 4K analogy; it's all kinds of messed up and doesn't convey well what you're trying to say.

Saying "it's all kinds of messed up" without explaining it is not an argument. Especially since I'm not convinced you actually understand what I'm trying to say, trying to tell me what I'm trying to say seems a bit of an overreach.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
This seems to be ignoring the fact that I spoke to all the monitor manufacturers and not a single one said their monitors could be updated; many in fact offered the exact opposite opinion and told me it would take new hardware. The facts do not agree with Despoiler in this case; it's simply untrue that existing hardware in any form supports A-Sync. It might be theoretically true, but I cannot find a single product for which it is true out there in the real world.

The A-Sync adjustment to the spec is optional; VESA themselves said that. It won't be in all DP 1.2a devices, and how many it will be in we have no idea, because we have no product announcements, not a single one.

Wow, you have offered up more valuable information than every single poster combined (my opinion).
You have actually put work into this and have provided information from multiple sources. In fact, you have collected enough information to write a blog or article. Too bad it is all fragmented and buried in a sea of personal attacks and uselessness.
I mean, you have multiple relevant sources that are invaluable to a real discussion about the subject. But it appears not many want to really discuss it at all, as there was very little attention given to what I would call a very respectable effort at a meaningful debate with substance.

I would like to thank you for those efforts, and to tell those who are really looking at this openly to go back and read your prior posts in the thread. In my opinion they are of the highest caliber. With very little effort, you have collected more information than any of the "reporters" and websites.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Saying "it's all kinds of messed up" without explaining it is not an argument. Especially since I'm not convinced you actually understand what I'm trying to say, trying to tell me what I'm trying to say seems a bit of an overreach.

I've never said anything you are attributing to me. All I'm doing is showing an actual credible source that supports the two points I highlighted. It does sync the monitor to the card on a frame-by-frame basis, and eDP, which is all Adaptive-Sync is, has been around since 2009. Everything needed for it already exists: no new exotic hardware needed, and it does sync the monitor to the GPU on a frame-by-frame basis.

People saying otherwise here are not referencing sources. I've even asked where this info is coming from.
 