AMD Demonstrates Prototype FreeSync Monitor


Shivansps

Diamond Member
Sep 11, 2013
Then what was the point of adopting the standard if it doesn't work? A little common sense goes a long way.

It's going to come and it's going to work. No amount of nay saying is going to change that. It's going to be a feature that everyone has access to. No hardware lock ins/outs. It also adds efficiency which is a nice added bonus for gamers.

Let's put it like this: Command Lists have been part of the DX11 spec practically forever. Did Intel and AMD even care to adopt it?
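For reference, whether a driver actually implements DX11 Command Lists natively (rather than leaving the runtime to emulate them) can be queried at run time. A minimal C++ sketch, assuming the standard D3D11 headers and linking against d3d11.lib:

Code:
#include <d3d11.h>
#include <cstdio>

int main() {
    // Create a hardware device with default settings.
    ID3D11Device* device = nullptr;
    HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr,
                                   0, nullptr, 0, D3D11_SDK_VERSION,
                                   &device, nullptr, nullptr);
    if (FAILED(hr)) return 1;

    // D3D11_FEATURE_THREADING reports whether the driver itself supports
    // concurrent resource creation and command lists; when it doesn't,
    // the D3D11 runtime emulates deferred contexts in software.
    D3D11_FEATURE_DATA_THREADING threading = {};
    device->CheckFeatureSupport(D3D11_FEATURE_THREADING,
                                &threading, sizeof(threading));
    std::printf("Native driver command lists: %s\n",
                threading.DriverCommandLists ? "yes" : "no (runtime-emulated)");

    device->Release();
    return 0;
}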

It will be implemented by everyone, eventually, but not now. What are you going to do with the millions of scaler ASICs that don't support it? Also, FreeSync is not a "no-go" item for any monitor either.

So you can expect that, at first, FreeSync monitors will be considered a premium item, then mainstream, and finally a value item. Until then, there will still be cheaper monitors without it. That's how things have been done in this industry since forever.
 

SoulWager

Member
Jan 23, 2013
Then what was the point of adopting the standard if it doesn't work? A little common sense goes a long way.

It's going to come and it's going to work. No amount of nay saying is going to change that. It's going to be a feature that everyone has access to. No hardware lock ins/outs. It also adds efficiency which is a nice added bonus for gamers.


What's the point of adopting the standard? So that when display manufacturers get around to supporting a-sync, they all meet some basic set of requirements, and they all work with a-sync compatible video cards, instead of having to re-do the interface work every time someone wants to develop a new monitor.


If it turns out the standard isn't practical to implement, it will be ignored or revised in order to make the products work. In any case, a standard's existence isn't evidence of a working implementation, and if a working demonstration of FreeSync exists, it might not be compatible with the a-sync requirements.
 

3DVagabond

Lifer
Aug 10, 2009
Let's put it like this: Command Lists have been part of the DX11 spec practically forever. Did Intel and AMD even care to adopt it?
Command Lists is not a standard that the participants got together and adopted. It's not even comparable.

It will be implemented by everyone, eventually, but not now. What are you going to do with the millions of scaler ASICs that don't support it? Also, FreeSync is not a "no-go" item for any monitor either.
People add features to improve their products. Not all companies see adding features as a way to raise their ASP. I'm not sure what you are saying about Free-Sync. Free-Sync is not a monitor feature. Adaptive-Sync is. Free-Sync is AMD's way of implementing the variable refresh capability of Adaptive-Sync.

So you can expect that, at first, FreeSync monitors will be considered a premium item, then mainstream, and finally a value item. Until then, there will still be cheaper monitors without it.
That may be true, or it may not. You don't know, even though you want it to be accepted as a foregone conclusion for some reason. At most it will be no more of a premium than DisplayPort is in the first place. AMD added the function to an off-the-rack Korean 27" panel. These ASICs aren't super-high-tech, limited-availability items.
 

3DVagabond

Lifer
Aug 10, 2009
What's the point of adopting the standard? So that when display manufacturers get around to supporting a-sync, they all meet some basic set of requirements, and they all work with a-sync compatible video cards, instead of having to re-do the interface work every time someone wants to develop a new monitor.


If it turns out the standard isn't practical to implement, it will be ignored or revised in order to make the products work. In any case, a standard's existence isn't evidence of a working implementation, and if a working demonstration of FreeSync exists, it might not be compatible with the a-sync requirements.

Right. The tech has been in commercial use since 2009, but the VESA members are dumb and think it can do something that it can't?
 

Mand

Senior member
Jan 13, 2014
That may be true, or it may not. You don't know, even though you want it to be accepted as a foregone conclusion for some reason. At most it will be no more of a premium than DisplayPort is in the first place. AMD added the function to an off-the-rack Korean 27" panel. These ASICs aren't super-high-tech, limited-availability items.

But they're still a marketing premium. Even if there are no license fees (which, I'll point out, there's no evidence that Nvidia is charging in the first place), it's still a capability that the monitor has that other similar ones don't. How could they not charge for it?
 

3DVagabond

Lifer
Aug 10, 2009
But they're still a marketing premium. Even if there are no license fees (which, I'll point out, there's no evidence that Nvidia is charging in the first place), it's still a capability that the monitor has that other similar ones don't. How could they not charge for it?

Competition. A benefit you don't get with G-Sync since it's proprietary. Besides, it's part of the standard. A standard that can be used for free. Who isn't going to use it?
 

Mand

Senior member
Jan 13, 2014
Competition. A benefit you don't get with G-Sync since it's proprietary. Besides, it's part of the standard. A standard that can be used for free. Who isn't going to use it?

People who don't want to do the development required. Just because AMD made a prototype does not mean everyone will.

Just because the standard can be used for free does not mean the R&D required to make it is also free.

And we still get competition even if G-Sync is proprietary. Proprietary is not inherently bad, and it doesn't mean they never win. Want proof? I bet you have a Blu-Ray player - guess what? That's a proprietary standard. Its competitor, HD-DVD, was open. The proprietary one won the game of market share, and the open one died out. How you think that's an impossibility in this case is something I don't understand.

Open is not magic. It does not make challenges vanish. It does not guarantee victory.
 

NostaSeronx

Diamond Member
Sep 18, 2011
Who isn't going to use it?
Nvidia.

Intel is going to use it, and so is AMD. A lot of people ignore that Intel has had its eye on adaptive frame-rate control for some time. A lot of their demos for it used a proprietary interface; now they don't need to use it anymore. I'm pretty sure Samsung and Qualcomm plan to use something similar to, but not the same as, FreeSync for Android video games.

AMD isn't the only one who wanted this.
 

3DVagabond

Lifer
Aug 10, 2009
People who don't want to do the development required. Just because AMD made a prototype does not mean everyone will.

Just because the standard can be used for free does not mean the R&D required to make it is also free.

And we still get competition even if G-Sync is proprietary. Proprietary is not inherently bad, and it doesn't mean they never win. Want proof? I bet you have a Blu-Ray player - guess what? That's a proprietary standard. Its competitor, HD-DVD, was open. The proprietary one won the game of market share, and the open one died out. How you think that's an impossibility in this case is something I don't understand.

Open is not magic. It does not make challenges vanish. It does not guarantee victory.

I'm not using ridiculous terms like you are attributing to me. Companies do R&D all of the time. It's a part of business. I never said proprietary is inherently bad. Also, you might want to hold up Blu-ray as a parallel to G-Sync, but it's not. Blu-ray is more similar to DX in the overall landscape, considering that at the time almost every major movie was produced by Sony's studios. They had things locked up about as well as Microsoft did when they released DX. I also never claimed open is magic. Try and have a real conversation.
 

Mand

Senior member
Jan 13, 2014
Snidely insulting my word choices is also not having a real conversation, by the way.
 

sontin

Diamond Member
Sep 12, 2011
This Adaptive-Sync technology is pretty old.

https://www.google.com/patents/WO2008026070A2

Going by this, it makes sense that nVidia went with the G-Sync module: instead of relying too much on software, they let the module handle all the work.

Especially the part about "determine an image frame rate" explains why AMD uses their own 3D application instead of games. Games are too unstable and unpredictable...
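As a concrete illustration of what "determining an image frame rate" could mean in practice, here is a toy C++ sketch that estimates the rate from a moving window of recent frame times; the method is invented for illustration, not taken from the patent:

Code:
#include <deque>
#include <numeric>
#include <cstdio>

// Estimate the current frame rate as the inverse of the average of the
// most recent frame times (in seconds).
double estimated_hz(const std::deque<double>& frame_times) {
    double avg = std::accumulate(frame_times.begin(), frame_times.end(), 0.0)
               / frame_times.size();
    return 1.0 / avg;
}

int main() {
    std::deque<double> recent = {0.016, 0.017, 0.015, 0.018};
    std::printf("estimated frame rate: %.1f Hz\n", estimated_hz(recent));
    return 0;
}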
 

3DVagabond

Lifer
Aug 10, 2009
Going by this, it makes sense that nVidia went with the G-Sync module: instead of relying too much on software, they let the module handle all the work.

Especially the part about "determine an image frame rate" explains why AMD uses their own 3D application instead of games. Games are too unstable and unpredictable...

Considering it says it updates on a frame-by-frame basis, how does the "stability" of the frame rate matter?
 

sontin

Diamond Member
Sep 12, 2011
Frame-by-frame was mentioned in connection with the speed of the link between source and display:
If the video stream clock recovery logic in the DisplayPort receiver can react fast enough to changes in the video stream clock rate (as indicated by signals on the link), then the video stream rate could be dynamically adjusted on a frame-to-frame (or perhaps finer) basis.

And it matters because the GPU is responsible for the changes in the vBlank interval.
There isn't a single reason why you would need to "determine an image frame" when you only need to send the image to the display controller.
 

3DVagabond

Lifer
Aug 10, 2009
From what I read, and I understand that I'm no expert, as soon as the frame is ready the command is given during the VBLANK interval to refresh the screen and display the new image. The time to process this command is measured in nanoseconds. Given how slow the refresh rate is relative to that, I don't see how the info can't be updated fast enough regardless of frame-by-frame fluctuations.
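To put rough numbers on that, here is a toy C++ model of variable-refresh pacing; the 40-144 Hz window and all values are illustrative assumptions, not figures from the Adaptive-Sync spec:

Code:
#include <algorithm>
#include <cstdio>

int main() {
    const double kMinInterval = 1.0 / 144.0; // panel's fastest allowed refresh
    const double kMaxInterval = 1.0 / 40.0;  // panel's slowest allowed refresh
    const double frame_times[] = {0.013, 0.021, 0.006, 0.030}; // render times (s)

    for (double t : frame_times) {
        // Scan out when the frame is ready, clamped to the panel's limits;
        // outside the window the source would have to wait or repeat a frame.
        double interval = std::clamp(t, kMinInterval, kMaxInterval);
        std::printf("frame took %5.1f ms -> refresh after %5.1f ms (%.0f Hz)\n",
                    t * 1000.0, interval * 1000.0, 1.0 / interval);
    }
    return 0;
}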
 

sontin

Diamond Member
Sep 12, 2011
And that's the reason why nVidia invented the G-Sync module: moving this part to the display end of the connection allows a much easier and simpler software stack. G-Sync just works.

With FreeSync, the GPU needs to modify the options within the vBlank interval. That's the reason why it "determines an image frame rate". It works just fine with nearly static frame rates (47 FPS on a 60 Hz display, say), but it gets complicated when frame times vary. And it relies on a huge software stack with a frame limiter.
 

Leadbox

Senior member
Oct 25, 2010
And that's the reason why nVidia invented the G-Sync module: moving this part to the display end of the connection allows a much easier and simpler software stack. G-Sync just works.

With FreeSync, the GPU needs to modify the options within the vBlank interval. That's the reason why it "determines an image frame rate". It works just fine with nearly static frame rates (47 FPS on a 60 Hz display, say), but it gets complicated when frame times vary. And it relies on a huge software stack with a frame limiter.
Dave Baumann says it works differently, so I guess one of you is fibbing.
 

sontin

Diamond Member
Sep 12, 2011
I don't need to advertise my products.

Seriously, he works for AMD. What do you expect from him? Telling the truth? :lol:

Fact is, AMD hasn't shown that FreeSync is able to work in games with unpredictable frame times. As long as they don't, there's no reason to believe that it's possible.
 

3DVagabond

Lifer
Aug 10, 2009
I don't need to advertise my products.

Seriously, he works for AMD. What do you expect from him? Telling the truth? :lol:

Fact is, AMD hasn't shown that FreeSync is able to work in games with unpredictable frame times. As long as they don't, there's no reason to believe that it's possible.

This is a really tired argument. You can't discredit what the man says, so you simply slander the person. Same as during the Mantle build-up, when everyone working with Mantle had something to gain and so could assuredly be counted on to be lying through their teeth every time they spoke.

While I freely admit I am no expert and will accept an explanation from someone who is as to why Adaptive-Sync can't be made to work as advertised, all the real experts disagree with you and say it does work: the press who have seen it in action, VESA themselves, and, yes, AMD staff. For you, though, it's all some giant conspiracy to deprive nVidia of their fame and fortune selling G-Sync. Sorry, I just don't think the fact that you have nothing to sell gives you a really credible position.
 

Leadbox

Senior member
Oct 25, 2010
I don't need to advertise my products.

Seriously, he works for AMD. What do you expect from him? Telling the truth? :lol:

Fact is, AMD hasn't shown that FreeSync is able to work in games with unpredictable frame times. As long as they don't, there's no reason to believe that it's possible.
So we should believe what you say? OK :colbert:
They have all of six months to a year to show their wares. If it were in a state to show what you want to see, you might as well be able to buy it in ~3 weeks.
 

dacostafilipe

Senior member
Oct 10, 2013
There isn't a single reason why you would need to "determine an image frame" when you only need to send the image to the display controller.

Of course there is!

The driver has to tell the screen that it has to redraw with the incoming frame and end the VBLANK if that's the case.

Now you'll be saying: but why doesn't the screen simply redraw with every frame?

Because A-Sync is an optional extension, and as with all extensions, you don't want to mess with the rest of the protocol. The best approach is to leave the frame handling as it is and just react to information inside the SDP (which is sent with every single frame anyway).

Then you'll be like: Won't this add a delay?

No, it won't. Even without FreeSync, the driver has to build a command packet when sending a new image to the LCD controller. Here it just adds meta-information to the packet.


The idea is that the GPU should drive the LCD and that the controllers should be removed from the LCD, not that even more specialised hardware should be added.
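A rough C++ sketch of that idea, with a hypothetical packet layout (the names and fields are invented for illustration and do not match the real DisplayPort Secondary Data Packet format):

Code:
#include <cstdint>
#include <cstdio>

// Hypothetical per-frame packet: the driver already sends something like
// this with every image, so variable-refresh metadata just rides along.
struct FramePacket {
    uint32_t frame_id;
    bool     end_vblank;  // ask the panel to terminate VBLANK and scan out now
    uint16_t vtotal;      // effective vertical total chosen for this frame
};

FramePacket build_packet(uint32_t id, bool frame_ready, uint16_t vtotal) {
    // No extra round trip: the metadata is filled in while building the
    // packet the driver was going to send anyway.
    return FramePacket{id, frame_ready, vtotal};
}

int main() {
    FramePacket p = build_packet(42, true, 1125);
    std::printf("frame %u: end_vblank=%d vtotal=%u\n",
                p.frame_id, p.end_vblank, p.vtotal);
    return 0;
}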
 

sontin

Diamond Member
Sep 12, 2011
This is a really tired argument. You can't discredit what the man says, so you simply slander the person.

He has no proof for his words.

While I freely admit I am no expert and will accept an explanation from someone who is as to why Adaptive-Sync can't be made to work as advertised, all the real experts disagree with you and say it does work: the press who have seen it in action, VESA themselves, and, yes, AMD staff. For you, though, it's all some giant conspiracy to deprive nVidia of their fame and fortune selling G-Sync. Sorry, I just don't think the fact that you have nothing to sell gives you a really credible position.

"All the real experts"?
G-Sync is the only working implementation of variable refresh rate in games on any consumer product. There isn't one "real expert" who has seen any other implementation providing the same function.

All these "experts" are basing their facts on nVidia's implementation. Nobody has ever used or seen FreeSync in real games.

I don't even know whether there are one or two notebooks out there that are capable of PSR (Panel Self Refresh)...
 