monstercameron
Diamond Member
The semantics behind the upgrade claims seem to be causing confusion. When they say updated firmware, they most likely aren't talking about the monitors you have at home...
Where has AMD outright claimed that their demo has fluctuating frame rates? To be lying, they would have to say the demo is running at frame rates ranging from 30-60 fps with the refresh rate matching perfectly. Since they haven't said that, and are just showing as a proof of concept that they can run existing monitors at whatever refresh rate the GPU tells them to, they are not lying.
Journalists misreporting what they are seeing is the number one issue here.
While I agree it would be great to get actual frame-time and frame-rate numbers confirming the refresh is variable, people are being overly harsh on AMD's actual efforts and timeframe. And easy on Nvidia for some reason.
As someone who will be buying a G-Sync monitor: the oldest article I found (quickly) unveiling the tech (showing frame numbers too) is from October 18. Here we are nine months later, in July, when we can possibly buy an off-the-shelf solution, and seven months after CES, where actual monitor manufacturers unveiled actual products. Too fucking long of a wait. I don't consider modding an option due to voiding warranties.
AMD's efforts are broader in scope, and implementation is ultimately up to monitor manufacturers. If it were ready for testing, then yes, I'd expect it to be a purchasable product within 1-3 months.
My issue overall, this damn tech is taking too long to get here. I miss my CRTs...
Hi!
This was posted by Dave Baumann on Beyond3d today.
"The demo was a full "FreeSync" demo, i.e. controlled variable refresh rate. DisplayPort Adaptive-Sync is not, however, FreeSync; it is purely part of the ecosystem specification that enables FreeSync to work. FreeSync uses the specification and GPU hardware and software to sync the refresh rates.
During display initialisation the monitor EDID will send back the timing ranges available on the monitor to the GPU and the GPU drivers will store these for operation and control over when to send the VBLANK signal. During a game the GPU will send a VBLANK signal when a frame is rendered and ready to be displayed; if a frame rendering is taking longer than the lowest refresh then the prior frame will be resent only to be updated with the new frame as soon as it is finished within the timing range."
http://forum.beyond3d.com/showthread.php?p=1852034#post1852034
Should clear up some misconceptions...
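The driver-side logic Baumann describes can be sketched as a simple decision per frame: read the monitor's refresh range from EDID at init, then either send the new frame with VBLANK, hold it until the panel can accept it, or resend the prior frame if rendering overruns the lowest refresh. A hypothetical illustration, not AMD's actual driver code:

```python
# Hypothetical sketch of the FreeSync pacing logic described in the
# quote above. Function and variable names are illustrative only.

def next_vblank_action(render_time_ms, min_hz, max_hz):
    """Decide what the GPU sends at the end of a frame.

    render_time_ms: how long the current frame took to render
    min_hz/max_hz:  refresh range the monitor reported via EDID
    """
    shortest_interval = 1000.0 / max_hz  # panel can't refresh faster
    longest_interval = 1000.0 / min_hz   # panel must refresh before this

    if render_time_ms < shortest_interval:
        # Frame finished early: hold it until the panel can accept it.
        return ("send_new_frame", shortest_interval)
    if render_time_ms <= longest_interval:
        # Within the timing range: VBLANK fires when the frame is ready.
        return ("send_new_frame", render_time_ms)
    # Frame is late: resend the prior frame to keep the panel refreshed,
    # then the new frame replaces it as soon as it finishes.
    return ("resend_prior_frame", longest_interval)

# A 40 ms frame on a 40-144 Hz panel overruns the 25 ms maximum
# interval, so the prior frame gets resent.
print(next_vblank_action(40.0, 40, 144))
```

The interesting part is the last branch: the resend is what lets the scheme work without the display needing to know frame times in advance.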
The quoted part does not read like they need to know the time between frames. They are only able to know the time between frames when they use additional frame buffers.
It'd be nice if they had more than an obscure forum post as their communication strategy. Like, for example, saying this at the demo, and making sure everyone who saw it was clear about what they were seeing. Because at least one person walked away from it thinking it wasn't showing controlled variable refresh rate, hence all of the confusion.
Hit the interview circuit, put the demo through its paces, let people who know what they're looking for get their hands on it. That's what would make me happy.
I don't remember FRAPS and FCAT graphs from NV when they showed G-Sync, nor do I remember any outcry about that on forums...
I, for one, don't care about FCAT graphs for this technology. They have nothing to do with each other...
VESA claims DP 1.2 supports 4k displays, and the monitor on my desk has a 1.2 plug but isn't 4k.
You are misinterpreting what the VESA spec actually says. It describes what Adaptive-Sync will look like when implemented, not what every implementation at any stage of prototype development is actually capable of.
Until we actually have proof of frame-by-frame variable refresh (which I *THOUGHT* this was showing but now am beginning to doubt, based on the reports from press), I will remain deeply skeptical of AMD's claims based on their history of deception and outright lying from the very start.
And no, this is not just AMD-bashing. Read my posts earlier in the thread, I was ready to accept this as legit when I first saw it. Now, I'm beginning to suspect that this is yet another round of half-truths and half-baked presentations designed to wow people who don't look too closely but won't stand up to scrutiny.
Is there some technical reason why Freesync is unlikely to work? I mean, are they promising the equivalent of a perpetual motion machine, or otherwise describing something that we know cannot be done?
Or is it just that they are not showing their hand completely, because the tech is still being actively developed/perfected/decided?
Basically, the claim is that they can make purpose-built hardware do something that wasn't even considered as a potential use case when that hardware was designed.
As far as the capabilities go, read what VESA says, since people want to attack AMD as an unreliable source.
http://www.vesa.org/news/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/
Again, a spec is not the same thing as implemented technology. Just because a spec says it, does not mean the piece of hardware on the desk can do it.
Then what was the point of adopting the standard if it doesn't work? A little common sense goes a long way.
It's going to come and it's going to work. No amount of nay saying is going to change that. It's going to be a feature that everyone has access to. No hardware lock ins/outs. It also adds efficiency which is a nice added bonus for gamers.
The standard is prescriptive: it specifies the requirements a product must meet in order to ship with the feature. It doesn't mean that a prototype still heavily in development has reached that endpoint.