AMD's FreeSync and VESA A-Sync discussion

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
I don't understand the need for variable refresh if a 480Hz vsync is barely 2ms and if people don't mind frame rates going down by fractions.

Because a fractional framerate is horrible.

Plus, we need to have a 480Hz-capable monitor first; heck, we don't even have a true 240Hz monitor, just one from Eizo that claims 240Hz because of 120Hz + strobing (and another coming from LG).
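
Rough numbers behind the "barely 2ms" figure, in case anyone wants to sanity-check it (just frame-period arithmetic, nothing vendor-specific assumed):

```python
# Frame period at a few refresh rates, plus the worst-case extra wait under
# vsync (a frame that just misses a refresh waits almost one full period
# before it can be shown).
for hz in (60, 120, 144, 240, 480):
    period_ms = 1000.0 / hz
    print(f"{hz:>3} Hz: frame period {period_ms:6.2f} ms, "
          f"worst-case vsync wait ~{period_ms:6.2f} ms")
```

At 480Hz the penalty for just missing a refresh is only ~2.08ms, but at 60Hz it is ~16.7ms, which is why a fractional framerate drop is so much more noticeable on the monitors we actually have.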
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Now I'm a patient man; at least I like to think so, but if their next freesync demo is going to be the windmills again, I'll be burning effigies.
Anyways, it looks like there is some reddish text in the top right-hand corner of the screen in their latest showing at AMD30. I can just make out FPS but can't see if it's varying or not. Hopefully some of the audience can YouTube it or shed some more light on this. Video here, from around the 35-minute mark if you missed the live stream.
 

96Firebird

Diamond Member
Nov 8, 2010
5,711
316
126
Yes, that makes exactly 0 sense.

NV is giving those expensive boards to the display manufacturers for free to spread g-sync adoption. Meanwhile, pesky little b-s-tards sell g-sync-equipped displays at a huge premium. All while NV itself makes sure the adoption is sky-rocketing by asking $200+ for DIY kits.

Truth is, nv makes sure they make a whole bunch of $ on each g-sync module sold. Suck every last drop of the sweet fanboy milk they can before free-sync hits.

If anything, people should be grateful AMD is spreading the word that a free g-sync alternative is coming. They are saving people money, and if someone still buys g-sync, they help them make an informed decision.

It's almost as if you didn't even read any of my posts.

Read them and try again.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Now I'm a patient man; at least I like to think so, but if their next freesync demo is going to be the windmills again, I'll be burning effigies.
Anyways, it looks like there is some reddish text in the top right-hand corner of the screen in their latest showing at AMD30. I can just make out FPS but can't see if it's varying or not. Hopefully some of the audience can YouTube it or shed some more light on this. Video here, from around the 35-minute mark if you missed the live stream.

When I saw the video initially I thought it was a static 45. Watching it again, however, it looked like it might be varying between 45 and 60 to me, but it's really hard to tell.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
They still say that it will be free to the end user, and that monitors will be coming shortly.

So much for the argument that it's going to cost; even if there is a cost, it will have to be small or they wouldn't talk about it that way.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Interesting:

From AMD's FS FAQ:


Will every monitor eventually support Project FreeSync?
AMD has undertaken efforts to encourage broad adoption for Project "FreeSync", including:
Royalty-free licensing for monitor vendors;
Open and standardized monitor requirements (e.g. no non-standard display controllers or ASICs);
Industry-standard implementation via the DisplayPort Adaptive-Sync amendment to the DisplayPort 1.2a specification; and
interoperability with existing monitor technologies.
AMD is presently advocating these benefits to display vendors and working with their respective design teams to expand the capabilities of high-performance/gaming-oriented monitor lineups to include Project FreeSync. While AMD cannot possibly guarantee that "every monitor" will adopt Project FreeSync in time, we do believe that this approach is best to achieve wide industry support.
Additionally, it must be established that all dynamic refresh rate technologies require robust, high-performance LCD panels capable of utilizing a wide range of refresh rates without demonstrating visual artifacts. Such LCD panels naturally cost more to manufacture and validate than less capable panels, which may render dynamic refresh rate technologies economically unviable for especially cost-conscious monitors. Economies of scale and the maturation of dynamic refresh rate technologies could help alleviate this concern and further promote adoption in the future.

You know what's REALLY hilarious? In that interview a while back, it was stated that FS would not cost more if you had a GPU that supported it (260, 260X, 290, 290X) and a monitor that supports FS.

Oh hey, if you already paid for the monitor and GPU, then freesync is free. I love the marketing word games AMD uses these days; how low can they go? I guess g-sync is free if you already own the g-sync monitor and a 660 or higher GPU. Interesting. Good grief, man. Who are they trying to kid here? The scaler electronics and control board will need to be more robust than what a typical 60Hz panel needs, so the cost will be controlled by volume and market forces. It certainly won't cost zero.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
AMD isn't in control of the price. The monitor manufacturers and scaler providers are responsible for the SKUs and what they cost. AMD said just yesterday that it's not their call. It's why they aren't announcing products or partners: they don't control the schedule, and they simply aren't working that closely. It might be free, it might cost a lot more; it's not something AMD controls.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
He did more than say he wouldn't support the DP 1.2a extension; he expressed grave concern about this specification change becoming mandatory in DP 1.3 and forcing all monitors to change. Nvidia would be against that. Seeing as how they are on the committee, I think that is an expression that they would block its inclusion.

Like I said before, standards are part of a process where all interested parties come together to define what the solution is. If one party tries to force its version through, a version the competitor can't itself implement, then it's not possible to make a standard out of it. It won't become mandatory in DP 1.3 by the sounds of things; it might only be optional again, or maybe not in there at all.

Why against? What is the harm if nv doesn't support it and runs in legacy mode without Adaptive sync? I understand they don't push the industry forward, but why hold the industry back? Why not support a-sync and g-sync at the same time? Why not make a shift in the industry to variable refresh rates?

Because $$$ - that is why! They want to milk every single person remotely interested in variable refresh rate, and their dog (LOL).

That is greed and anti-consumer practice. I'm against it!

Yeah, AMD is forcing their solution as a standard, which would harm everybody who can't support it. Heck, it would even harm their own customers who bought a 7970 instead of a GTX 680.

Harm what, who, and how?
I can't even express how stupid this sounds.
So there should never be a new standard in anything, ever, so that nothing designed earlier ever becomes a legacy part, which by your twisted logic would be unacceptable. We should all roll back to DX 1.0 so that every GPU supports the rendering API standard.
DX12? Give me a break, trash it! My 8800 GTS doesn't support it, so it shouldn't be released! I guess I will have to sue MS for releasing DX11 since it doesn't support that either! :eek:
Go home! You're drunk!
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Why against? What is the harm if nv doesn't support it and runs in legacy mode without Adaptive sync? I understand they don't push the industry forward, but why hold the industry back? Why not support a-sync and g-sync at the same time? Why not make a shift in the industry to variable refresh rates?

Because $$$ - that is why! They want to milk every single person remotely interested in variable refresh rate, and their dog (LOL).

That is greed and anti-consumer practice. I'm against it!

Gsync and Freesync, for example, still don't work in the OS GUI or in windowed mode.

Both are only targeted at the more extreme gamer niche, and there is a huge mass not willing to pay for it, whatever the premium may be.

If they get proper support all around, then it may happen.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
Harm what, who, and how?
I can't even express how stupid this sounds.
So there should never be a new standard in anything, ever, so that nothing designed earlier ever becomes a legacy part, which by your twisted logic would be unacceptable.

Well when you put it that way, that other argument does sound unreasonable. :D
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Gsync and Freesync, for example, still don't work in the OS GUI or in windowed mode.

Both are only targeted at the more extreme gamer niche, and there is a huge mass not willing to pay for it, whatever the premium may be.

If they get proper support all around, then it may happen.

So?
I have a FullHD TV in my dining room. 8 of the 276 channels that I receive are broadcast in FullHD. Does Samsung somehow harm me by selling FullHD TVs?

My Motorola modem is capable of 100Mb/s data transfer, but the cable connection from my internet provider can't go higher than 6Mb/s. Does that mean Motorola harms me in some way?

I'm not sure, but from the sounds of it, adaptive sync doesn't come with a premium, and if it does, I don't think it will be anywhere close to what nv asks. If it becomes a standard, I don't expect it to come with a premium larger than what DVI has compared to D-sub.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
So?
I have a FullHD TV in my dining room. 8 of the 276 channels that I receive are broadcast in FullHD. Does Samsung somehow harm me by selling FullHD TVs?

My Motorola modem is capable of 100Mb/s data transfer, but the cable connection from my internet provider can't go higher than 6Mb/s. Does that mean Motorola harms me in some way?

I'm not sure, but from the sounds of it, adaptive sync doesn't come with a premium, and if it does, I don't think it will be anywhere close to what nv asks. If it becomes a standard, I don't expect it to come with a premium larger than what DVI has compared to D-sub.

Adaptive sync does cost extra. You make it sound like it comes free out of thin air and that everyone who charges extra for it is a greedy bastard.

AMD said:
Additionally, it must be established that all dynamic refresh rate technologies require robust, high-performance LCD panels capable of utilizing a wide range of refresh rates without demonstrating visual artifacts. Such LCD panels naturally cost more to manufacture and validate than less capable panels, which may render dynamic refresh rate technologies economically unviable for especially cost-conscious monitors. Economies of scale and the maturation of dynamic refresh rate technologies could help alleviate this concern and further promote adoption in the future

And in terms of a standard: no, it's not. It's an optional feature, and only a select few AMD GPUs support it. It's already too late for Broadwell, Maxwell, and so on to even implement it, assuming the best-case scenario for freesync. An example is HDMI 2.0. The spec was finished in September 2013, and there is still zero GPU support; Broadwell, for example, won't support it either.
 

Mand

Senior member
Jan 13, 2014
664
0
0
AMD said:
While AMD cannot possibly guarantee that "every monitor" will adopt Project FreeSync in time, we do believe that this approach is best to achieve wide industry support.

Additionally, it must be established that all dynamic refresh rate technologies require robust, high-performance LCD panels capable of utilizing a wide range of refresh rates without demonstrating visual artifacts. Such LCD panels naturally cost more to manufacture and validate than less capable panels, which may render dynamic refresh rate technologies economically unviable for especially cost-conscious monitors. Economies of scale and the maturation of dynamic refresh rate technologies could help alleviate this concern and further promote adoption in the future.
Well, this makes it pretty clear it won't be mandatory in DP 1.3.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
480Hz would be a major advancement; it's 4x any consumer product today, and 8x a "normal" monitor. That order-of-magnitude improvement would make things a lot smoother, even if the variance was quite high. I suspect you are right that it would eliminate the problem, but it would also take 8x as much horsepower to render. If we can get a similar quality of output at 60-120Hz with adaptive sync technology instead, then it's considerably cheaper to do it that way than to try to push monitors much further than they are capable of. Another way to put it is that each frame on the GPU would have 1/8th the processing power available compared to the same game at 60Hz, so it will look a lot worse.
The games don't have to run at 480fps just because the digital signal is 480Hz.
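
To put some toy numbers on that trade-off (purely illustrative: a fixed 60Hz panel with double-buffered vsync versus a variable-refresh panel, and frames that hypothetically take 20ms each to render):

```python
import math

REFRESH_MS = 1000.0 / 60   # fixed 60 Hz panel: a vblank every ~16.7 ms
RENDER_MS = 20.0           # hypothetical render time per frame

# Double-buffered vsync on a fixed-refresh panel: the GPU can only start the
# next frame after the previous one flips at a vblank boundary.
flip = 0.0
fixed_flips = []
for _ in range(5):
    done = flip + RENDER_MS                           # frame finishes rendering
    flip = math.ceil(done / REFRESH_MS) * REFRESH_MS  # wait for the next vblank
    fixed_flips.append(flip)

# Variable refresh: the panel starts a scan-out as soon as the frame is ready
# (assuming the rate stays inside the panel's supported range).
vrr_flips = [RENDER_MS * (i + 1) for i in range(5)]

print("frame   fixed 60 Hz + vsync (ms)   variable refresh (ms)")
for i, (f, v) in enumerate(zip(fixed_flips, vrr_flips), 1):
    print(f"{i:>5}   {f:>24.1f}   {v:>21.1f}")
# Fixed refresh shows a new frame every 33.3 ms (effectively 30 fps);
# variable refresh shows one every 20 ms (50 fps) from the same GPU work.
```

Same GPU and the same 20ms frames; the only difference is whether the panel waits for a fixed vblank or refreshes when the frame is ready.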

Anyway, I hate to sound like an aristocrat, but people who act like ~4K x ~2K resolution gaming is so cool have been making me angry, since that means more graphics memory, more graphics bandwidth, and more ROP processing power, so you're less likely to use SGSSAA, less likely to use higher-bit-depth render targets, color buffers, and depth buffers, and/or more likely to get lower scene complexity. 1920x1200 RGB10 is excellent for a 22" computer monitor with a 288Hz signal because that is well within the bandwidth capability of DisplayPort 1.2, and because it is easier in manufacturing to keep color and brightness uniformity and contrast ratio high with a smaller screen that doesn't have a ridiculously high resolution; that also makes it more economically efficient to use RGB LED arrays across the screen. 4K x 2K only makes much sense when you go past traditional computer monitor sizes, and even then it is nowhere near worth it in my opinion, no matter how much more detailed my mom's Apple iPad 2's screen looks compared to my monitor (the latter of which I reduced the native res on to get a 75Hz signal rate within pixel clock limits, thanks to ToastyX :)).

And thank you for your kind reply:)
 

MTDEW

Diamond Member
Oct 31, 1999
4,284
37
91
Now wait a minute.
The stuff that gets posted here is always a bunch of bull.

I'm not pro-Nvidia or pro-AMD, I own both.

But let's not mistake the FACT... that it's Nvidia who locks AMD users out of using G-sync.
And it's a FACT... that it's Nvidia (NOT AMD) who locks Nvidia owners out of using A-sync (free sync) or whatever it will be called.

Free-sync (a-sync) isn't proprietary in the slightest... no matter what your definition of proprietary is.
Just because Nvidia refuses to support free-sync for its customers in addition to G-sync doesn't make it proprietary... that just makes it Nvidia's decision not to support a tech other than its own for its customers.
Unlike AMD, who have zero chance of offering G-sync support to their customers!

But I do agree, AMD needs to start showing some proof of concept and start promoting its own features, and stop slamming the competition.
It's just a more respectable way of doing things, IMO.

Kudos to the first manufacturer that offers its customers a monitor that supports both G-sync and Free-sync!
Because it's becoming increasingly clear that this is the only option Nvidia will allow that will truly benefit all of "US", the "PC gamers" as a whole, no matter what GPU you own now or in the future. :thumbsup: :thumbsup:
 

Spanners

Senior member
Mar 16, 2014
325
1
0
Interesting:

From AMD's FS FAQ:




You know what's REALLY hilarious? In that interview a while back, it was stated that FS would not cost more if you had a GPU that supported it (260, 260X, 290, 290X) and a monitor that supports FS.

Oh hey, if you already paid for the monitor and GPU, then freesync is free. I love the marketing word games AMD uses these days; how low can they go? I guess g-sync is free if you already own the g-sync monitor and a 660 or higher GPU. Interesting. Good grief, man. Who are they trying to kid here? The scaler electronics and control board will need to be more robust than what a typical 60Hz panel needs, so the cost will be controlled by volume and market forces. It certainly won't cost zero.

Where is that interview? I can recall people misinterpreting what was said, thinking it implied no cost, but I never read anything from AMD stating that.

I did some searching; was it the Robert Hallock interview you were referring to? If it was, the only references to price are "costs less" (than G-Sync, I'm fairly safe to assume) and "no expensive or proprietary hardware modules". None of that seems to agree with "not cost more if you had a GPU that supported it". If it's another interview, then please link away.
 

MathMan

Member
Jul 7, 2011
93
0
0
Maybe a significant amount of engineering work is needed to make G-Sync also run FreeSync. You can't blame Nvidia for first bringing out a product that was designed before there was an open spec.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Maybe this will not really be a big deal.

I'm thinking about comparing this to motherboards. Sometimes your motherboard supports only NVidia's SLI, sometimes only AMD's Crossfire, and sometimes both.

Just as a motherboard manufacturer can support either format, so too could a monitor/display manufacturer support both. So you can get a monitor according to your planned GPU purchases, just as you get a motherboard to support your planned GPU purchases. If you like both companies, buy the monitor that supports both.

So it's no big deal if a video card doesn't support a particular display feature, because it's like how you don't expect a Crossfire motherboard to support SLI.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Now wait a minute.
The stuff that gets posted here is always a bunch of bull.

I'm not pro-Nvidia or pro-AMD, I own both.

But let's not mistake the FACT... that it's Nvidia who locks AMD users out of using G-sync.
And it's a FACT... that it's Nvidia (NOT AMD) who locks Nvidia owners out of using A-sync (free sync) or whatever it will be called.

Free-sync (a-sync) isn't proprietary in the slightest... no matter what your definition of proprietary is.
Just because Nvidia refuses to support free-sync for its customers in addition to G-sync doesn't make it proprietary... that just makes it Nvidia's decision not to support a tech other than its own for its customers.
Unlike AMD, who have zero chance of offering G-sync support to their customers!

But I do agree, AMD needs to start showing some proof of concept and start promoting its own features, and stop slamming the competition.
It's just a more respectable way of doing things, IMO.

Kudos to the first manufacturer that offers its customers a monitor that supports both G-sync and Free-sync!
Because it's becoming increasingly clear that this is the only option Nvidia will allow that will truly benefit all of "US", the "PC gamers" as a whole, no matter what GPU you own now or in the future. :thumbsup: :thumbsup:

+1 nicely stated.

/fud
 

jackstar7

Lifer
Jun 26, 2009
11,679
1,944
126
I wonder if the costs of using the Adaptive-Sync spec could in any way be offset by removing HDMI and its costs from a given monitor. I wonder if we'll see DisplayPort-only monitors at a more reasonable markup, akin to current 120Hz/144Hz monitors.

Gotta wait and see though.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Kudos to the first manufacturer that offers its customers a monitor that supports both G-sync and Free-sync!

This isn't really possible. G-Sync replaces the scaler; you can't have two scalers in the panel. There's just too much replaced in the display pipeline.

It's not just a matter of monitor manufacturers deciding to do it, there are technical incompatibilities.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
This isn't really possible. G-Sync replaces the scaler; you can't have two scalers in the panel. There's just too much replaced in the display pipeline.

It's not just a matter of monitor manufacturers deciding to do it, there are technical incompatibilities.

I'm no monitor expert, but why would a scaler affect timings? I'm assuming all the grunt work for Adaptive-Sync is done in the TCON.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
This isn't really possible. G-Sync replaces the scaler; you can't have two scalers in the panel. There's just too much replaced in the display pipeline.

It's not just a matter of monitor manufacturers deciding to do it, there are technical incompatibilities.

I won't be surprised to see a monitor that supports both once they've both been released. I think they will once the huge G-Sync price tag comes down. They could have two separate inputs, etc.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I won't be surprised to see a monitor that supports both once they've both been released. I think they will once the huge G-Sync price tag comes down. They could have two separate inputs, etc.

If they both end up doing exactly the same thing (practically speaking)?
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
This isn't really possible. G-Sync replaces the scaler; you can't have two scalers in the panel. There's just too much replaced in the display pipeline.

It's not just a matter of monitor manufacturers deciding to do it, there are technical incompatibilities.

Couldn't you say the same thing about DisplayPort and VGA? Yet we see monitors that support both - you have separate inputs to receive the signal. Yet we also see monitors that have only a single DVI input, to avoid lag.

I'd imagine the monitor people could figure something out; I'm not sure it's so limited. Perhaps you could have a monitor with two DisplayPort inputs and a toggle switch or something to select whether one input (for G-Sync) or the other (for A-Sync) has exclusive control over the display? Even a new take on the old familiar KVM boxes, where you switch between different systems using a dumb connect-disconnect mechanism.

It would be interesting, though, if the agreements for some components are bound up with a limitation preventing their use in monitors that use competing standards, to ensure exclusivity of premium features.