VESA Adopts Adaptive-Sync


dacostafilipe

Senior member
Oct 10, 2013
797
297
136
Open standards don't mean things become free, however much people seem to believe otherwise. Especially not when the display manufacturers can market it as an upsell.

You could, if you were a member of VESA, implement your own A-Sync and you would not have to pay them a cent. That's the "free" part.

Now, you and I don't manufacture LCD screens, so the only choice we have is to buy them from somebody. And because the integration (not the usage!) may cost more, you will also have to pay more.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Then we had all better buy monitors now, because the DP 1.3 standard is about to be ratified. What do you think those displays will cost, maybe $200 more than current displays? All those companies are going to have to design new hardware after all.

Oh wait, no they won't, because they don't make their own display controllers.

You clearly don't understand the difference between support for a feature and a requirement. Standards include both, and Adaptive Sync is not a requirement.
 

dn7309

Senior member
Dec 5, 2012
469
0
76
I like the idea of adaptive refresh rate hardware in monitors. I hate Vsync because it cripples my video card's performance, but at the same time I need it to keep my screen from tearing.

As for being "free", there's no way this is going to be free. It's going to be a new standard where manufacturers charge more just for having a new version of DisplayPort, just like when HDMI first came out and TV manufacturers price-gouged for three or four years while being stingy on the number of ports.

But I do like the fact that it will be free in the sense of being an open standard that 99% of future monitors will have, so you don't have to buy a specific G-Sync monitor at additional cost that locks you into the Nvidia ecosystem.

I honestly think that if Nvidia wanted, they could easily have made "free sync" in the first place, but they chose to spend their R&D making it proprietary and locked down because it makes business sense.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
You clearly don't understand the difference between support for a feature and a requirement. Standards include both, and Adaptive Sync is not a requirement.

Yes, I understand. I work in a standards based industry.

I guess we'll just have to wait a year and see. But I don't see how implementing part of a standard is a billion dollars cheaper than implementing the full standard.
 

Mand

Senior member
Jan 13, 2014
664
0
0
You could, if you were a member of VESA, implement your own A-Sync and you would not have to pay them a cent. That's the "free" part.

Now, you and I don't manufacture LCD screens, so the only choice we have is to buy them from somebody. And because the integration (not the usage!) may cost more, you will also have to pay more.

Buy them from whom, exactly? Who is "somebody" and why did they bother designing it in the first place, when they could have just used the standard like you (and me)?
 

MrPickins

Diamond Member
May 24, 2003
9,124
787
126
So who is going to design it?

The same companies that design the controllers that are currently in use.

So far, you guys seem to be trying to convince me that having it in a spec means compatible hardware will just show up that nobody had to pay much for. How, exactly?

You're exaggerating the cost.

Sure, early adopters will pay a hefty surcharge, but that's normal for any new tech. By the time the second generation of AS monitors comes around, there is no way in hell the upcharge will be $150.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
Mand, G-Sync will be made irrelevant by A-Sync; open standards are the way to go in this industry.

Think about the economics. Would a monitor manufacturer make more money selling panels with G-Sync or A-Sync?

That's all you really need to ask yourself.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
I really hope the final version of Oculus Rift supports this. Anything to make the image smoother will reduce motion sickness.

^ Absolutely, this is a great point. Regarding both an Oculus implementation and any other, I'm waiting to see it in action, and at what price. If it's significantly cheaper than G-Sync, it will likely be worse in practice; if it's as high-quality as G-Sync, it will likely command a similar (probably slightly smaller) premium. If it ends up working as well and being cheap, I'll be happy and flabbergasted simultaneously.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Mand, G-Sync will be made irrelevant by A-Sync; open standards are the way to go in this industry.

Think about the economics. Would a monitor manufacturer make more money selling panels with G-Sync or A-Sync?

That's all you really need to ask yourself.

They'd still sell both, even if they both were proprietary, in much the same way that motherboard manufacturers tend to support both AMD and Intel CPUs. Not on the same board, no, but as part of two product lines from the same company.
 

gorobei

Diamond Member
Jan 7, 2007
3,957
1,443
136
They'd still sell both, even if they both were proprietary, in much the same way that motherboard manufacturers tend to support both AMD and Intel CPUs. Not on the same board, no, but as part of two product lines from the same company.

unlikely.

right now gsync's problem isn't technology but economics.

gsync exists as a value-add attempt by nvidia that certainly answers a problem in the gaming market. however, because there was no standard for variable refresh timings/protocols, and because of some limitation on the gpu side, it needs a custom panel controller. they are currently using an fpga on a custom daughterboard shoehorned into a few 120hz tn models. this would normally be done with a fixed-function asic, ordinarily an off-the-shelf kind of component.

but since lcd controller chip making is a niche industry, there are probably only a handful of makers, who don't update their designs until there is a major change in standards or must-have features. so nvidia has to use the more expensive and somewhat wasteful fpga. this is the bulk of the gsync premium, though the insurance coverage for the housecall/mail-in service to install the daughterboard probably adds to it.

if there were an asic gsync controller, the cost would probably be only a few dollars over the normal BOM. but designing a chip with such narrow use, and contracting a fab to make it, is very expensive and usually amortized over years and mass volume. for nvidia to commit the millions necessary to make a gsync asic, they would need massive contracts with the bulk of lcd oems. given that pure gaming monitors make up a small portion of all lcd models, that is a difficult sell.

it is a chicken-or-egg conundrum:
until there is clear demand for variable refresh monitors, not enough oems will commit to a long-term, high-volume contract for gsync asic controllers. if nvidia can't get enough contracts for the controllers, they can't justify spending the millions necessary to make an asic. without sample chips, no oem can put out a test model to check the waters. this is why nvidia went the expensive fpga route.
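The amortization argument can be sketched with deliberately made-up numbers (none of these figures are real nvidia or oem costs; they just show why low volume keeps an asic off the table while an off-the-shelf fpga stays viable):

```python
# Rough per-unit controller cost: one-time ASIC NRE spread over shipped volume,
# versus a fixed off-the-shelf FPGA price. All figures are hypothetical.

def per_unit_cost(nre, unit_cost, volume):
    """Total cost per shipped controller: amortized NRE plus marginal cost."""
    return nre / volume + unit_cost

ASIC_NRE = 10_000_000   # assumed design + mask + fab setup cost
ASIC_UNIT = 3           # assumed marginal cost per ASIC
FPGA_UNIT = 100         # assumed FPGA + daughterboard cost, no NRE to amortize

for volume in (50_000, 500_000, 5_000_000):
    asic = per_unit_cost(ASIC_NRE, ASIC_UNIT, volume)
    print(f"{volume:>9} units: ASIC ${asic:.2f}/unit vs FPGA ${FPGA_UNIT}/unit")
```

At 50k units the hypothetical asic costs twice the fpga per unit; at 5M units it is twenty times cheaper, which is the oem-contract chicken-and-egg in one line of arithmetic.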

if nvidia had the market to itself, this would be a fine and admirable effort to push technology, but amd looked back at its ip portfolio and found some of their early efforts in variable refresh. the fact that they already had a version of variable timing built into their gpu design for the laptop embedded displayport interface was probably serendipity.

because the freesync protocol is so closely tied to displayport, it is an easier proposition to amend the vesa standard and have the niche asic controller makers enable the dp functionality in all their chips for the next iteration. since it is open, intel can use the a-sync function as well.

monitor oems are more likely to drag their feet and wait for an industry standard and protocol built into an off-the-shelf component like the controller chip than to commit all their product lines to a proprietary gsync chip that in the near term adds a $200 premium.

gsync is a polished and finished product, but that isn't enough to make it a certain thing. No amount of gamer enthusiasm will change the economics for monitor oems.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
And what is the difference from Adaptive Vsync?
It requires a new display controller in the GPU. It requires new display scalers. And right now nothing supports it except Hawaii and Bonaire.
It is even unclear whether nVidia and Intel support the additional requirements of AS in their current hardware.
 

Shivansps

Diamond Member
Sep 11, 2013
3,916
1,570
136
Actually, the G-Sync FPGA was used because there was no other way. It's likely that if someone builds new ASICs supporting A-Sync, Nvidia will change G-Sync to use those instead of an FPGA; in the meantime, the FPGA will still be used because there is no other way.
 

dacostafilipe

Senior member
Oct 10, 2013
797
297
136
Buy them from whom, exactly? Who is "somebody" and why did they bother designing it in the first place, when they could have just used the standard like you (and me)?

For example Parade Technologies, or Chrontel (the first two hits on Google).

They bother implementing this in their controllers because they want to sell them. With more features on your controller, you have an advantage over controllers that don't have them.

It's actually the same as with DP versions (1.0 -> 1.2): some controllers support a newer version, and those are more likely to find their way into high-end screens that need the connectivity.
 

dacostafilipe

Senior member
Oct 10, 2013
797
297
136
It requires a new displaycontroller in the GPU.

Actually, I don't think it does.

Because DisplayPort sends the data as packets, it should be possible to use this on every DP 1.2 compatible GPU. nVidia uses these same DP packets to communicate with the G-Sync module, so there's no technical barrier to using Adaptive-Sync.
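This lines up with how Adaptive-Sync capability is advertised in practice: the sink sets a bit in its DPCD that the source reads over the existing AUX channel, with no new GPU-side link hardware implied. A minimal sketch (the register address and bit name follow the Linux DRM DisplayPort headers; `aux_read` and the fake sink are hypothetical stand-ins for a real AUX transaction):

```python
# Probe a DisplayPort sink for Adaptive-Sync support over the AUX channel.
# DPCD 0x0007 is DOWN_STREAM_PORT_COUNT; bit 6 (MSA_TIMING_PAR_IGNORED) means
# the sink can ignore the fixed vertical total in the Main Stream Attributes,
# i.e. it tolerates a variable vblank. Names follow Linux's drm_dp_helper.h.

DPCD_DOWN_STREAM_PORT_COUNT = 0x0007
MSA_TIMING_PAR_IGNORED = 1 << 6

def supports_adaptive_sync(aux_read):
    """aux_read(addr) -> int: one DPCD byte; a stand-in for real AUX I/O."""
    return bool(aux_read(DPCD_DOWN_STREAM_PORT_COUNT) & MSA_TIMING_PAR_IGNORED)

# Fake sink whose DPCD advertises the capability:
fake_dpcd = {DPCD_DOWN_STREAM_PORT_COUNT: MSA_TIMING_PAR_IGNORED}
print(supports_adaptive_sync(lambda addr: fake_dpcd.get(addr, 0)))  # True
```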

In nVidia's place, I would just use Adaptive-Sync and keep the G-Sync brand for the software bundle. I mean, they were the ones who came up with the idea in the first place; they could use this.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
G-sync dead or not is yet to be seen, but...

I have to note some inconsistency. When MS announced DX12, to be released somewhere in the future with proper multithreading and low-level access, some called Mantle dead. Yet now that VESA has adopted A-Sync, it apparently means nothing for G-Sync's superiority. :hmm:

Someone has a nice sig to sum this up:
"Ultimately market decides - rightly or wrongly"
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Actually, I don't think it does.
Because DisplayPort sends the data as packets, it should be possible to use this on every DP 1.2 compatible GPU. nVidia uses these same DP packets to communicate with the G-Sync module, so there's no technical barrier to using Adaptive-Sync.

It's obvious that Adaptive Vsync needs additional hardware features which are not present in most current GPUs. Why would AMD only support Hawaii and Bonaire GPUs if every GPU were ready for it?
 

Galatian

Senior member
Dec 7, 2012
372
0
71
I thought this technology is actually built on features already found in notebooks to save energy. If so, I'd think it will be fairly trivial for display manufacturers to adopt.
 

dacostafilipe

Senior member
Oct 10, 2013
797
297
136
It's obvious that Adaptive Vsync needs additional hardware features which are not present in most of the current GPUs. Why would AMD only support Hawaii and Bonaire GPUs when every GPU would be ready for it?

Why is it obvious?

nVidia can control the G-Sync module over DP without additional hardware on the GPU, but somehow for Adaptive-Sync you need extra hardware? How does that make any sense to you?
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Why is it obvious?

nVidia can control the G-Sync module over DP without additional hardware on the GPU, but somehow for Adaptive-Sync you need extra hardware? How does that make any sense to you?

G-Sync is only supported by Kepler+ GPUs. nVidia implemented something in their display controller which was necessary for G-Sync.
Fact is that FreeSync runs only on GCN 1.1 hardware.
 

biostud

Lifer
Feb 27, 2003
19,729
6,808
136
Here is Mark Rejhon's take on it: http://forums.blurbusters.com/viewtopic.php?f=5&t=836

Thought it could be of interest.

I'll just put the comment here:
While a positive development, the VESA standard won't be full 120Hz+ for a long time. The Adaptive Sync standard will usually vary only up to 60Hz due to limitations in existing chips. It would take two to three years for this to become competitive with GSYNC. Take the word of the display engineers. The VESA standard was originally for battery savings but has been co-opted for dynamic synchronization of refreshes to frames, for elimination of stutter/tearing. But the existing chips only go up to 60Hz, and there is currently no driver support yet.

Even NVIDIA had to use an FPGA in order to pull this off at full HD resolution and 120Hz.

Give it time, but GSYNC currently has a huge head start despite the shipping delays.

While this might be a problem for gaming monitors, it won't matter for IPS screens, which also currently top out at 60Hz. Personally I would rather have a 60Hz IPS monitor with Adaptive-Sync than a 120Hz TN panel with GSYNC.
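The refresh-window limitation the Blur Busters comment describes can be sketched as a simple clamp: the panel only stretches vblank within its supported range, and frame rates outside that window fall back to fixed-rate behaviour (the 40-60Hz window below is illustrative, not from any real panel's EDID):

```python
# Variable refresh only tracks the frame rate inside the panel's supported
# window; outside it, refresh is pinned at the window edge and the old
# tearing/stutter trade-off returns. Window values are hypothetical.

def effective_refresh(frame_hz, min_hz=40, max_hz=60):
    """Refresh rate the panel actually runs at for a given frame rate."""
    return max(min_hz, min(frame_hz, max_hz))

print(effective_refresh(55))   # 55 - inside the window, refresh tracks frames
print(effective_refresh(75))   # 60 - capped at the panel's maximum
print(effective_refresh(24))   # 40 - below the window's floor
```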

But running a 2560x1440 monitor, I don't really see any point in upgrading for the next couple of years unless it dies. By then, hopefully 4K + Adaptive-Sync will be standard and therefore not so expensive.
 

dacostafilipe

Senior member
Oct 10, 2013
797
297
136
nVidia implemented something into their displaycontroller which was necessary for G-Sync.

No, they did not, if you believe Anand's review. Edit: Maybe you mean full DP 1.2 support :awe: ?

Fact is that FreeSync runs only on GCN 1.1 hardware.

Hardware that was in development long before G-Sync/FreeSync was announced. That does kinda prove that you don't need "additional hardware".
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
G-Sync is only supported by Kepler+ GPUs. nVidia implemented something in their display controller which was necessary for G-Sync.
Fact is that FreeSync runs only on GCN 1.1 hardware.
Or they didn't want to waste time and resources debugging older hardware, so they just limited it to Kepler+, and AMD limited it to GCN 1.1 for the same reason :sneaky:
 