AMD Demonstrates Prototype FreeSync Monitor

Status
Not open for further replies.

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
The semantics behind the upgrade statements seem to be causing confusion. When they say "updated firmware," they most likely aren't talking about the monitors you have at home...
 

Mand

Senior member
Jan 13, 2014
664
0
0
Where has AMD outright claimed that their demo has fluctuating frame rates? To be lying, they would have to say "this demo is running at frame rates ranging from 30-60 fps, and the refresh rate is matching perfectly." Since they haven't, and are just showing as a proof of concept that they can run existing monitors at whatever refresh rate the GPU tells them to use, they are not lying.

Journalists misreporting what they are seeing is the number one issue here.

At CES, they said it was showing a competitor to G-Sync performing variable refresh, but they instead had two laptops running at two static refresh rates.

This is very, very easy to prove. Go watch the CES video on YouTube and watch the laptops' OSDs. Go look up the reports on the CES demo; they have quotes from AMD, they had people there listening to AMD. If you're trying to tell me the entire tech press corps somehow misunderstood, every single one of them, and it isn't AMD's fault, that's a pretty ludicrous claim. Even if they did all misunderstand, that's still on AMD for not being clear, and for not bothering to correct them as the stories were repeated for the next five months.

So either AMD was deceptive in their presentation, or they deliberately let misreporting stand. Which would you prefer, as the arbiter of what is or is not trolling?
 

Bergen

Junior Member
Jun 5, 2014
15
0
66
Hi!

This was posted by Dave Baumann on Beyond3d today.

"The demo was a full "FreeSync" demo, i.e. controlled variable refresh rate. DisplayPort ActiveSync is not, however, FreeSync, it is purely part of the ecosystem specification that enables FreeSync to work. FreeSync uses the specification and GPU hardware and software to sync the refresh rates.

During display initialisation the monitor EDID will send back the timing ranges available on the monitor to the GPU and the GPU drivers will store these for operation and control over when to send the VBLANK signal. During a game the GPU will send a VBLANK signal when a frame is rendered and ready to be displayed; if a frame rendering is taking longer than the lowest refresh then the prior frame will be resent only to be updated with the new frame as soon as it is finished within the timing range."

http://forum.beyond3d.com/showthread.php?p=1852034#post1852034

Should clear up some misconceptions...
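The mechanism Baumann describes can be sketched in a few lines: the GPU reads the panel's refresh range from EDID, sends a VBLANK when each frame finishes, and resends the prior frame if rendering runs past the panel's longest allowed interval. The function name, the 40-60 Hz bounds, and the frame times below are illustrative assumptions, not AMD's actual driver logic.

```python
# Hypothetical sketch of the VBLANK scheduling described above: the GPU
# knows the panel's refresh range (from EDID) and times VBLANKs itself.
# All names and numbers are illustrative, not AMD's implementation.

def schedule_vblanks(frame_render_times, min_hz=40, max_hz=60):
    """Return (timestamp_s, frame_id) pairs for each VBLANK sent."""
    max_interval = 1.0 / min_hz  # longest the panel may go unrefreshed (25 ms)
    min_interval = 1.0 / max_hz  # shortest allowed interval (~16.7 ms)
    eps = 1e-9                   # float tolerance for interval comparisons
    now, last_vblank, events = 0.0, 0.0, []
    for frame_id, render_s in enumerate(frame_render_times):
        now += render_s
        # Frame is late: resend the prior frame to keep the panel refreshed.
        while frame_id > 0 and now - last_vblank > max_interval + eps:
            last_vblank += max_interval
            events.append((round(last_vblank, 4), frame_id - 1))
        # New frame ready: respect the panel's minimum refresh interval.
        last_vblank = max(now, last_vblank + min_interval)
        events.append((round(last_vblank, 4), frame_id))
    return events
```

Two 20 ms frames yield VBLANKs at 20 ms and 40 ms (i.e. 50 Hz, matching the frame rate), while a 50 ms frame forces one resend of the prior frame at the 40 Hz floor before the new frame is displayed.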
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
They are only able to know the time between frames when they use additional frame buffers.
 

SoulWager

Member
Jan 23, 2013
155
0
71
While I agree it would be great to get actual frame-time and frame-rate numbers, and to know whether the refresh is actually variable, people are being overly harsh on the actual efforts and timeframe AMD has going. And going easy on Nvidia for some reason.

As someone who will be buying a G-Sync monitor: this is the oldest article I found (quickly) unveiling the tech (showing frame numbers too), dated October 18. Here we are, and July, nine months after that unveil, is the earliest we can possibly buy an off-the-shelf solution; that's seven months after CES, where actual monitor manufacturers unveiled actual products. Too fucking long of a wait. I don't consider modding an option, due to voiding warranties.

AMD's efforts are broader in scope and ultimately up to monitor manufacturers to implement. If it was ready for testing, then yes, I'd expect it to be a purchasable product within 1-3 months.

My overall issue: this damn tech is taking too long to get here. I miss my CRTs...

The problems with G-Sync are already well known, and nobody was trying to pretend they don't exist (the biggest being price and compatibility). What else is there to discuss? People talk about FreeSync because AMD makes big unsubstantiated claims, and people fill in the gaps with their own expectations, speculation, and the odd dash of sobriety. If AMD had simply said "We're working on a less expensive alternative, and expect products to reach market with equivalent performance within 2 years," I would have given them the benefit of the doubt. The fact that they showed those useless demos and are pushing the "FreeSync" branding gives me the impression that they aren't taking the problem seriously, and are just trying to piss on Nvidia's parade.

Nvidia recognized the problem with fixed refresh rates, and actually found a way to get display manufacturers on board with fixing it. For that I'm grateful. AMD has a responsibility as the party playing catch up: to drive prices down. For that you need a cheaper product with the same performance. If they can do that with their current method, I'll be happy, but extremely surprised. I certainly don't think they can do that within 12 months.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Hi!

This was posted by Dave Baumann on Beyond3d today.

"The demo was a full "FreeSync" demo, i.e. controlled variable refresh rate. DisplayPort ActiveSync is not, however, FreeSync, it is purely part of the ecosystem specification that enables FreeSync to work. FreeSync uses the specification and GPU hardware and software to sync the refresh rates.

During display initialisation the monitor EDID will send back the timing ranges available on the monitor to the GPU and the GPU drivers will store these for operation and control over when to send the VBLANK signal. During a game the GPU will send a VBLANK signal when a frame is rendered and ready to be displayed; if a frame rendering is taking longer than the lowest refresh then the prior frame will be resent only to be updated with the new frame as soon as it is finished within the timing range."

http://forum.beyond3d.com/showthread.php?p=1852034#post1852034

Should clear up some misconceptions...

They are only able to know the time between frames when they use additional frame buffers.
The bolded part does not read like they need to know the time between frames.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,595
136
Good grief, NV invented G-Sync for such a little, tiny piece of technology. Send ranges, send VBLANK. Pathetic. We could have a gazillion proprietary standards if a small bit like that were so special.

That's not to dismiss the effect. It's about time it happened. But it's not exactly rocket science.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Hi!

This was posted by Dave Baumann on Beyond3d today.

"The demo was a full "FreeSync" demo, i.e. controlled variable refresh rate. DisplayPort ActiveSync is not, however, FreeSync, it is purely part of the ecosystem specification that enables FreeSync to work. FreeSync uses the specification and GPU hardware and software to sync the refresh rates.

During display initialisation the monitor EDID will send back the timing ranges available on the monitor to the GPU and the GPU drivers will store these for operation and control over when to send the VBLANK signal. During a game the GPU will send a VBLANK signal when a frame is rendered and ready to be displayed; if a frame rendering is taking longer than the lowest refresh then the prior frame will be resent only to be updated with the new frame as soon as it is finished within the timing range."

http://forum.beyond3d.com/showthread.php?p=1852034#post1852034

Should clear up some misconceptions...

It'd be nice if they had more than an obscure forum post as their communication strategy. Like, for example, saying this at the demo, and making sure everyone who saw it was clear about what they were seeing. Because at least one person walked away from it thinking it wasn't showing controlled variable refresh rate, hence all of the confusion.

Hit the interview circuit, put the demo through its paces, let people who know what they're looking for get their hands on it. That's what would make me happy.
 

SoulWager

Member
Jan 23, 2013
155
0
71
Hi!

This was posted by Dave Baumann on Beyond3d today.

"The demo was a full "FreeSync" demo, i.e. controlled variable refresh rate. DisplayPort ActiveSync is not, however, FreeSync, it is purely part of the ecosystem specification that enables FreeSync to work. FreeSync uses the specification and GPU hardware and software to sync the refresh rates.

During display initialisation the monitor EDID will send back the timing ranges available on the monitor to the GPU and the GPU drivers will store these for operation and control over when to send the VBLANK signal. During a game the GPU will send a VBLANK signal when a frame is rendered and ready to be displayed; if a frame rendering is taking longer than the lowest refresh then the prior frame will be resent only to be updated with the new frame as soon as it is finished within the timing range."

http://forum.beyond3d.com/showthread.php?p=1852034#post1852034

Should clear up some misconceptions...

Basically, FreeSync uses a third framebuffer on the GPU instead of panel self-refresh. I'm not sure if that's the best-performing option or just the fastest for AMD to implement. Long term it may mean using more expensive memory to hold the same data, and adding a little GPU overhead.
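The VRAM cost of that extra buffer is easy to estimate. A back-of-the-envelope calculation (the resolutions and 32-bit RGBA color depth are illustrative assumptions, not measured figures from either vendor):

```python
# Rough VRAM cost of one extra framebuffer held on the GPU for frame
# resends (instead of panel self-refresh). Resolutions and 32-bit RGBA
# are illustrative assumptions, not measured figures.

def extra_buffer_mib(width, height, bytes_per_pixel=4):
    """MiB of VRAM a single additional framebuffer occupies."""
    return width * height * bytes_per_pixel / (1024 * 1024)

for name, (w, h) in [("1080p", (1920, 1080)),
                     ("1440p", (2560, 1440)),
                     ("4K", (3840, 2160))]:
    print(f"{name}: {extra_buffer_mib(w, h):.1f} MiB")  # 7.9 / 14.1 / 31.6
```

A few tens of MiB is small next to a gaming card's VRAM, which suggests the overhead concern is more about memory bandwidth for the resend traffic than raw capacity.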
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
It'd be nice if they had more than an obscure forum post as their communication strategy. Like, for example, saying this at the demo, and making sure everyone who saw it was clear about what they were seeing. Because at least one person walked away from it thinking it wasn't showing controlled variable refresh rate, hence all of the confusion.

Hit the interview circuit, put the demo through its paces, let people who know what they're looking for get their hands on it. That's what would make me happy.

An easy way would have been to show the frame rate counter like they did in the first demo, which is what clearly proved it wasn't variable. Only this time, despite showing what looks like exactly the same demo, the frame rate counter is hidden.

Anyone who is even slightly cynical of dodgy marketing demos is going to put 2 and 2 together and decide the most likely reason they didn't show it is that the frame rate is not variable.
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Well, it's true... the perf overlay is glaringly absent this time around.

As to how far AMD would go with "marketing", we can only guess, but every G-Sync monitor sold means one gaming enthusiast locked up with NV for the foreseeable future.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Next thing we know, they'll tell us the display had wooden screws in it... ;)
I don't remember FRAPS and FCAT graphs from NV when they showed G-Sync, nor do I remember any outcry going on in the forums about that...
 

96Firebird

Diamond Member
Nov 8, 2010
5,738
334
126
I don't remember FRAPS and FCAT graphs from NV when they showed G-Sync, nor do I remember any outcry going on in the forums about that...

I, for one, don't care about FCAT graphs for this technology. They have nothing to do with each other...

Nvidia's demo had the FRAPS on-screen display. Probably the reason no one complained about it. Because it was, you know... there. ;)

https://www.youtube.com/watch?v=4BtyYnxjcN8
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
VESA claims DP 1.2 supports 4K displays, and the monitor on my desk has a 1.2 plug but isn't 4K.

You are misinterpreting what the VESA spec actually says. It is what A-Sync will look like when implemented, not that all implementations of it at any stage of prototype development are actually capable of it.

Until we actually have proof of frame-by-frame variable refresh (which I *THOUGHT* this was showing but now am beginning to doubt, based on the reports from press), I will remain deeply skeptical of AMD's claims based on their history of deception and outright lying from the very start.

And no, this is not just AMD-bashing. Read my posts earlier in the thread, I was ready to accept this as legit when I first saw it. Now, I'm beginning to suspect that this is yet another round of half-truths and half-baked presentations designed to wow people who don't look too closely but won't stand up to scrutiny.

Once again you tell me I don't understand something (misinterpret). There is nothing to misinterpret. VESA was very clear.

You are allowed to be skeptical. I don't have a problem with that. Claiming they are lying, etc..., etc..., etc... is beyond anything reasonable though.
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
Want to continue to be able to post in this forum? Stay on topic, lose the hostile attitude, and stop the personal attacks.
-- stahlhart
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Is there some technical reason why Freesync is unlikely to work? I mean, are they promising the equivalent of a perpetual motion machine, or otherwise describing something that we know cannot be done?

Or is it just that they are not showing their hand completely, because the tech is still being actively developed/perfected/decided?
 

Mand

Senior member
Jan 13, 2014
664
0
0
Is there some technical reason why Freesync is unlikely to work? I mean, are they promising the equivalent of a perpetual motion machine, or otherwise describing something that we know cannot be done?

Or is it just that they are not showing their hand completely, because the tech is still being actively developed/perfected/decided?

No, there isn't a technical reason why it's unlikely to work. Which is what makes the complete failure of AMD to clearly deliver on the most basic promises about its functionality all the more baffling.

Their entire approach from the start has basically been "Nooo, don't buy G-Sync, we have something just as good, and FREE!" Only they haven't been able to show that it even exists, let alone that it's just as good, and there are a lot of reasons to believe it won't be free. The CES demo was trying to pull the rug out from under G-Sync's in-the-wild product launch, and it largely succeeded, judging from the defenders AMD has managed to rack up. Only they haven't done anything to justify the support they've gotten. Maybe, maybe this one is finally providing that justification - but with it still unclear whether they were actually running a variable refresh rate in the demo, the history of the last five months of AMD's lies makes it hard to take them at their word.
 

SoulWager

Member
Jan 23, 2013
155
0
71
Is there some technical reason why Freesync is unlikely to work? I mean, are they promising the equivalent of a perpetual motion machine, or otherwise describing something that we know cannot be done?

Or is it just that they are not showing their hand completely, because the tech is still being actively developed/perfected/decided?
Basically, the claim is that they can make purpose-built hardware do something that wasn't even considered a potential use case when that hardware was designed.

It would be like an auto manufacturer saying, "Hey, this car is rainproof, therefore I can turn it into a boat, instead of wasting time and money building a boat from scratch." It can probably be done, but it wouldn't make a very good boat.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Is there some technical reason why Freesync is unlikely to work? I mean, are they promising the equivalent of a perpetual motion machine, or otherwise describing something that we know cannot be done?

Or is it just that they are not showing their hand completely, because the tech is still being actively developed/perfected/decided?

It's an early prototype setup. It does seem to be lacking in refresh rate range (40-60), but that will be up to the monitor/screen on how wide of a range it can handle.

As far as the capabilities read what VESA says, since people want to attack AMD as an unreliable source.
http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Again, a spec is not the same thing as implemented technology. Just because a spec says it, does not mean the piece of hardware on the desk can do it.

Then what was the point of adopting the standard if it doesn't work? A little common sense goes a long way.

It's going to come, and it's going to work. No amount of naysaying is going to change that. It's going to be a feature that everyone has access to, with no hardware lock-ins or lock-outs. It also adds efficiency, which is a nice added bonus for gamers.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Then what was the point of adopting the standard if it doesn't work? A little common sense goes a long way.

It's going to come, and it's going to work. No amount of naysaying is going to change that. It's going to be a feature that everyone has access to, with no hardware lock-ins or lock-outs. It also adds efficiency, which is a nice added bonus for gamers.

The standard is prescriptive. It says what the requirements are in order to sell it. It doesn't mean that a prototype heavily in development is at the endpoint.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
The standard is prescriptive. It says what the requirements are in order to sell it. It doesn't mean that a prototype heavily in development is at the endpoint.

No kidding. That still doesn't answer why anyone would adopt something that isn't going to work. I'm specifically talking about VESA here; VESA is the one making these claims for Adaptive-Sync. Do you really think all the hardware vendors got together and adopted a standard that they can't make work? Do you think the monitor vendors joined in ratifying it but aren't going to make monitors to support it? The standard is only a few weeks old; this stuff doesn't happen overnight.
 