VESA Adopts Adaptive-Sync


sontin

Diamond Member
Sep 12, 2011
No, they did not, if you believe Anand's review. Edit: Maybe you mean full DP 1.2 support :awe:?

GCN 1.0 supports DP 1.2, too.

With the 6000 series AMD upgraded their DisplayPort capabilities from DP 1.1 to DP 1.2. With Southern Islands AMD will be upgrading their HDMI capabilities.
http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/8

Hardware that was in development long before G-Sync/FreeSync was announced. This does kinda prove that you don't need "additional hardware".
Äh, what? AMD announced that only GCN 1.1 hardware will support FreeSync. Hawaii was released in October, at the same time as G-Sync, which is supported all the way down to the GTX 680...

GK104 had its tape-out nearly two years before Hawaii...

Or they didn't want to waste time and resources debugging older hardware, so they just limited it to Kepler and newer, and AMD limited it to GCN 1.1 for the same reason :sneaky:

They don't need to debug anything. Either GCN 1.0 supports the DP features or it doesn't.
 

dacostafilipe

Senior member
Oct 10, 2013
Äh, what? AMD announced that only GCN 1.1 hardware will support FreeSync. Hawaii was released in October, at the same time as G-Sync, which is supported all the way down to the GTX 680...

The 260X is Bonaire, and that chip was first released in March 2013 (as the 7790).
 

BrightCandle

Diamond Member
Mar 15, 2007
I still haven't seen a technical explanation of what it is that was adopted into the standard. When I looked at the difference between eDP and the two features used to achieve power saving and G-Sync, there were quite a lot of technical differences. One of the questions I asked at the time was how FreeSync was telling the monitor how long to hold the next frame; there was a suggestion it needed to predict the length of future frames and hence added additional latency.

I still haven't seen anything that answers my query there. G-Sync, I know, has the monitor polling the GPU for updates so it can self-determine refresh if need be (which is why it has expensive hardware: the monitor enforces its minimum Hz and refresh periods, and it has to hold an image buffer to be able to do that refresh), but I don't know how FreeSync is meant to do it.

I assume, based on where it came from, it's using panel self refresh and variable vblank, but the problem with variable vblank in eDP was that it's not designed for individual frames: you could set the refresh rate at the end of the prior frame and it would apply from the next frame on. Panel self refresh combined with that allows a similar capability to G-Sync, but it still leaves the question of whether the variable vblank needs that prediction or not, or whether they also intend for the panel to poll the GPU, making the panel much more complicated so it can refresh itself if need be. There are a lot of technical questions to answer to determine if AMD's solution is really a parallel to G-Sync or not. It certainly won't support the low-persistence mode or the 3D mode that also come with the G-Sync module, but whether it's a true equivalent with the same characteristics remains to be seen. Anyone got a link to the actual technical specs?
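
For what it's worth, here's a rough sketch of the monitor-side behaviour described above for G-Sync: scan out each new frame as it arrives, and if nothing arrives before the maximum hold time expires, re-scan the buffered frame (the ~30 fps floor). Python, with invented names and assumed numbers, purely to illustrate the question being asked:

    MAX_HOLD_S = 1.0 / 30  # assumed ~30 Hz floor before a forced self-refresh

    def panel_loop(wait_for_frame, scan_out):
        """wait_for_frame(timeout) returns a new frame or None; scan_out draws one."""
        last_frame = None
        while True:
            frame = wait_for_frame(timeout=MAX_HOLD_S)
            if frame is not None:
                scan_out(frame)       # new frame arrived: display it immediately
                last_frame = frame    # keep a copy; this is why the module needs a buffer
            elif last_frame is not None:
                scan_out(last_frame)  # timeout: repeat the old frame from the buffer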


Edit - I went looking for the specs myself, and on the VESA site you have to pay to see anything past DisplayPort 1.1. Interesting: for a spec that is meant to be open, that isn't very sporting, and it means I can't be bothered to check (I have no idea what it costs, but for me it's curiosity and understanding driving this, and I am not going to pay them to see the details; it will all come out in the end).
 

dacostafilipe

Senior member
Oct 10, 2013
I still haven't seen anything that answers my query there. G-Sync, I know, has the monitor polling the GPU for updates so it can self-determine refresh if need be (which is why it has expensive hardware: the monitor enforces its minimum Hz and refresh periods, and it has to hold an image buffer to be able to do that refresh), but I don't know how FreeSync is meant to do it.

There is no need for "special hardware" to poll the GPU. DP is bidirectional; the standard already includes an auxiliary (AUX) channel for back-channel traffic like this.
 

Mand

Senior member
Jan 13, 2014
There are a lot of technical questions to answer to determine if AMD's solution is really a parallel to G-Sync or not.

Nvidia's response to the CES demo was basically "...yeah, there's a reason we didn't do that." AMD does have a lot of technical questions to answer, and its official position has been to hope that other people answer them for them.

Let's remind everyone that a spec update is not a prototype. There is no guarantee that this will actually pan out into something that works on par with G-Sync, or even close to it.
 

96Firebird

Diamond Member
Nov 8, 2010
Let's remind everyone that a spec update is not a prototype. There is no guarantee that this will actually pan out into something that works on par with G-Sync, or even close to it.

Just another "wait and see" tactic...
 

dn7309

Senior member
Dec 5, 2012
Nvidia's response to the CES demo was basically "...yeah, there's a reason we didn't do that." AMD does have a lot of technical questions to answer, and its official position has been to hope that other people answer them for them.

Let's remind everyone that a spec update is not a prototype. There is no guarantee that this will actually pan out into something that works on par with G-Sync, or even close to it.


What did you expect Nvidia to say? 'Yeah, it works just as well as G-Sync, and it's open source'?
 

parvadomus

Senior member
Dec 11, 2012
What did you expect Nvidia to say? 'Yeah, it works just as well as G-Sync, and it's open source'?

:D
Looks like Nvidia is failing hard recently. Titan Z, G-Sync, Tegra, Tesla: all their products are being humiliated by the competition. :|
 

Mand

Senior member
Jan 13, 2014
:D
Looks like Nvidia is failing hard recently. Titan Z, G-Sync, Tegra, Tesla: all their products are being humiliated by the competition. :|

I'm not sure how you can say that Nvidia is failing on G-Sync when there is no FreeSync hardware yet, and won't be for at least a year, and even then there's no guarantee it will be as good or anywhere close to as good.

But sure, proclaim doom already.
 

parvadomus

Senior member
Dec 11, 2012
I'm not sure how you can say that Nvidia is failing on G-Sync when there is no FreeSync hardware yet, and won't be for at least a year, and even then there's no guarantee it will be as good or anywhere close to as good.

But sure, proclaim doom already.

I wouldn't buy something that will be replaced in a few months by open stuff.
 

dn7309

Senior member
Dec 5, 2012
I'm not sure how you can say that Nvidia is failing on G-Sync when there is no FreeSync hardware yet, and won't be for at least a year, and even then there's no guarantee it will be as good or anywhere close to as good.

But sure, proclaim doom already.

G-Sync is dead. This is like Bluetooth vs. Lightning-connector speakers. One is open source and one is proprietary. When the first Bluetooth speaker was announced, there were naysayers who said the music quality would never match a direct digital connection. Look at who is winning that battle now. :)
 

Leadbox

Senior member
Oct 25, 2010
Nvidia's response to the CES demo was basically "...yeah, there's a reason we didn't do that." AMD does have a lot of technical questions to answer, and its official position has been to hope that other people answer them for them.
That could easily be a hardware limitation on their part.
I'll be very interested to know why they didn't, should AMD manage to pull this off.
 

Gryz

Golden Member
Aug 28, 2010
When people don't even know the difference between "open source" and "open standard", maybe they shouldn't participate in standards discussions?

Also, I think "Adaptive-Sync" is a really badly chosen name. Nvidia has a proprietary feature called "Adaptive VSync", so there is going to be a lot of confusion between the two features. And when people start calling Adaptive-Sync "async", there will be even more confusion: "async" has been used as an abbreviation for "asynchronous" for decades, and Adaptive-Sync has nothing to do with being asynchronous.
It's not going to be fun when people google for your new feature called "async" and only get pages and pages of search results about other stuff.

Will Adaptive-Sync also work with framerates well above 60 fps, and monitors capable of 120+ Hz?
 

Mand

Senior member
Jan 13, 2014
That could easily be a hardware limitation on their part.
I'll be very interested to know why they didn't, should AMD manage to pull this off.

http://techreport.com/news/25878/nvidia-responds-to-amd-free-sync-demo

Petersen quickly pointed out an important detail about AMD's "free sync" demo: it was conducted on laptop systems. Laptops, he explained, have a different display architecture than desktops, with a more direct interface between the GPU and the LCD panel, generally based on standards like LVDS or eDP (embedded DisplayPort). Desktop monitors use other interfaces, like HDMI and DisplayPort, and typically have a scaler chip situated in the path between the GPU and the panel. As a result, a feature like variable refresh is nearly impossible to implement on a desktop monitor as things now stand.


That, Petersen explained, is why Nvidia decided to create its G-Sync module, which replaces the scaler ASIC with logic of Nvidia's own creation. To his knowledge, no scaler ASIC with variable refresh capability exists—and if it did, he said, "we would know." Nvidia's intent in building the G-Sync module was to enable this capability and thus to nudge the industry in the right direction.


When asked about a potential VESA standard to enable dynamic refresh rates, Petersen had something very interesting to say: he doesn't think it's necessary, because DisplayPort already supports "everything required" for dynamic refresh rates via the extension of the vblank interval. That's why, he noted, G-Sync works with existing cables without the need for any new standards. Nvidia sees no need and has no plans to approach VESA about a new standard for G-Sync-style functionality—because it already exists.


Bold added.
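
To put rough numbers on the vblank-extension idea from that last paragraph: the refresh interval is the total pixels scanned per frame (active plus blanking) divided by the pixel clock, so padding the vertical blanking with extra lines delays the next scan-out. A back-of-the-envelope sketch in Python, using the standard 1080p60 timing purely as an illustration:

    PIXEL_CLOCK_HZ = 148_500_000  # 148.5 MHz, the standard 1080p60 pixel clock
    H_TOTAL = 2200                # active + horizontal blanking pixels per line
    V_ACTIVE = 1080               # visible lines
    V_BLANK = 45                  # nominal vertical blanking lines (v_total = 1125)

    def refresh_hz(extra_vblank_lines=0):
        v_total = V_ACTIVE + V_BLANK + extra_vblank_lines
        return PIXEL_CLOCK_HZ / (H_TOTAL * v_total)

    print(refresh_hz(0))     # 60.0 Hz with nominal blanking
    print(refresh_hz(1125))  # 30.0 Hz when the blanking interval is stretched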
 

dacostafilipe

Senior member
Oct 10, 2013
Will Adaptive-Sync also work with framerates well above 60 fps, and monitors capable of 120+ Hz?

It should.

Q: What is the supported range of refresh rates with FreeSync and DisplayPort™ Adaptive-Sync?
A: AMD Radeon™ graphics cards will support a wide variety of dynamic refresh ranges with Project FreeSync. Using DisplayPort™ Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display. Potential ranges include 36-240Hz, 21-144Hz, 17-120Hz and 9-60Hz.
Source: http://www.brightsideofnews.com/2014/05/12/vesa-adds-adaptive-sync-displayport-1-2-standard/
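
Those ranges translate directly into bounds on how long the display can hold a single frame (1/max Hz to 1/min Hz); a frame that takes longer than the upper bound has to be repeated. A quick, hypothetical calculation, not from any AMD or VESA material:

    def hold_time_bounds_ms(min_hz, max_hz):
        # Shortest and longest time the display may hold one frame, in ms.
        return 1000.0 / max_hz, 1000.0 / min_hz

    for lo_hz, hi_hz in [(36, 240), (21, 144), (17, 120), (9, 60)]:
        shortest, longest = hold_time_bounds_ms(lo_hz, hi_hz)
        print(f"{lo_hz}-{hi_hz} Hz: hold one frame {shortest:.1f} to {longest:.1f} ms")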
 

Despoiler

Golden Member
Nov 10, 2007

Quoting doesn't equal knowledge. Nvidia had to go the route they did, i.e. design the entire board, because they don't have transistors in their chips dedicated to scaling like AMD has had since the 5k series. AMD's cards can control the scaling, thus negating the need for a scaler board.

The reason AMD is only committing to support on their recent GPUs is that they need to support the feature in drivers. Even though they could support back to the 6k and 7k series (both have DP 1.2), by the time the monitors are released we will have at least one more generation of cards. There is no reason to commit to legacy hardware.
 

sontin

Diamond Member
Sep 12, 2011
AMD doesn't support older architectures because they are not compatible. They all use the same DisplayPort protocol and the same cable. Either it works or it doesn't.
 

BrightCandle

Diamond Member
Mar 15, 2007
Nvidia's cards are perfectly capable of scaling; it's right there in the drivers, where you can choose whether the monitor or the GPU does it. I don't think it's remotely anything to do with that.

No one from AMD has provided enough information on how this works to determine if it's an equivalent technology. When they showed it off in January, I looked into the eDP spec and determined that what they were doing was based on two features of that spec: panel self refresh and adaptive refresh rate. The problem I could see, based on the technical briefs I had hold of, was that the adaptive refresh rate was not directly driven by the vblank; it seemed that in order to make it work you had to set the new monitor display parameters, like refresh rate and resolution, at the end of the frame via the auxiliary channel. It was something that happened after vblank and told the monitor how long subsequent frames would be. This also impacted PSR, where the panel would refresh at that frequency. You kind of needed both technologies in order to get adaptive refresh rates, because at very low refresh rates the panel needs to redisplay the same image (which is why G-Sync has its minimum of 30 fps, at which point it starts redisplaying the image). But these two technologies combined are not the same as G-Sync; they differ in a very critical way.

The problem as I see it is that adaptive refresh needs prediction: it needs to know how long the next frame will be so it can set the refresh rate, and that is on the assumption that the monitor's controller is capable of transparently adjusting refresh. AMD said at the time that this was a technology that already existed in eDP, which is why I looked there to begin with. But I didn't find G-Sync-like capabilities there; I found features designed to reduce power by reducing monitor refresh rates to known quantities. The whole point of G-Sync is that you don't know when the render will finish, and then it goes out immediately, so if AMD requires a frame of prediction for their PSR + adaptive refresh, it is not the same thing. They need to explain how it works on a low level.
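
To make that distinction concrete, here is a small illustration with invented numbers (hypothetical functions, nothing from either spec): if the refresh period has to be chosen in advance, a frame that misses its slot waits for the next scheduled refresh, whereas a render-driven scheme scans out as soon as the frame is done:

    import math

    def predictive_wait_s(predicted_period_s, render_s):
        # Refreshes land on a fixed grid of predicted_period_s; the frame goes
        # out on the first refresh at or after the render completes.
        ticks = math.ceil(render_s / predicted_period_s)
        return ticks * predicted_period_s - render_s

    def on_demand_wait_s(render_s):
        # G-Sync-style: the scan-out itself is triggered by the render finishing.
        return 0.0

    print(predictive_wait_s(1 / 60, 0.020))  # ~0.013 s extra latency for a 20 ms frame
    print(on_demand_wait_s(0.020))           # 0.0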

I am all for a cheaper alternative that is a proper part of the DisplayPort standard and that many monitors can implement. I don't want to pay a $200 premium for G-Sync; I think that is just too expensive. But at the same time, I can't get enough information on what FreeSync is to determine whether it's even going to be competitive, or just PSR plus adaptive refresh from the eDP standard and hence not what gamers need at all. We have a working product you can buy versus the promise of an equivalent technology, with no details and no hardware for at least a year. It's vapourware right now, and we can't even be sure it will work well enough for us to care about it.
 

Gunbuster

Diamond Member
Oct 9, 1999
If this is so easy-breezy to do with a "trivial" VESA-spec TCON firmware update and AMD's existing video cards, why don't they already have a partner lined up with a monitor to show this off? Or a demo?

I believe that, just like how all the first-gen 4K TVs have had magic updates to work at 4K 60 Hz... Oh wait, they haven't.
 

Despoiler

Golden Member
Nov 10, 2007
If this is so easy-breezy to do with a "trivial" VESA-spec TCON firmware update and AMD's existing video cards, why don't they already have a partner lined up with a monitor to show this off? Or a demo?

I believe that, just like how all the first-gen 4K TVs have had magic updates to work at 4K 60 Hz... Oh wait, they haven't.

Ahhh, because the spec was just approved...
 
Last edited:

Kenmitch

Diamond Member
Oct 10, 1999
There is no reason to even argue about this subject matter.

Companies do spend money on R&D when making new products. I doubt Adaptive-Sync will be any different when it comes to the dollars spent.

If you prefer G-Sync, then hope Adaptive-Sync takes off, as it'll drive down the Nvidia tax/tariff.

Backwards compatibility with older hardware... Who cares, as most of those arguing jump to the latest and greatest GPUs anyway.

It seems like too many members judge companies by past performance only. There is a thing called change... Sometimes for the better and sometimes for the worse. Only the future holds the answer.

I'm pretty sure that when AMD said "free", they meant there won't be any AMD tax applied to the final purchase price of compatible monitors.
 

Mand

Senior member
Jan 13, 2014
Quoting doesn't equal knowledge. Nvidia had to go the route they did, i.e. design the entire board, because they don't have transistors in their chips dedicated to scaling like AMD has had since the 5k series.

I'd like to know how you think you know more than Nvidia about why Nvidia is doing what they're doing. Quoting doesn't equal knowledge, but the person being quoted sure does.
 

Despoiler

Golden Member
Nov 10, 2007
There is no reason to even argue about this subject matter.

Companies do spend money on R&D when making new products. I doubt Adaptive-Sync will be any different when it comes to the dollars spent.

If you prefer G-Sync, then hope Adaptive-Sync takes off, as it'll drive down the Nvidia tax/tariff.

Backwards compatibility with older hardware... Who cares, as most of those arguing jump to the latest and greatest GPUs anyway.

It seems like too many members judge companies by past performance only. There is a thing called change... Sometimes for the better and sometimes for the worse. Only the future holds the answer.

I'm pretty sure that when AMD said "free", they meant there won't be any AMD tax applied to the final purchase price of compatible monitors.

The real problem with Nvidia's design is that it uses an FPGA. FPGAs are not cheap for any company to use; they account for $90-100 of the cost of the G-Sync unit. Nvidia themselves said they "hope" to get the unit price down to $130. No solution whose minimum price is $130 over the monitor is going to beat a standard in terms of cost efficiency.
 