I once again ask you to think rationally. Just because YOU aren't interested in adaptive vsync to save power on the desktop doesn't mean that the millions of people buying CPUs with onboard GPUs from Intel and AMD don't care about that extra power usage. Stop thinking in terms of enthusiasts. If 10,000 monitors and workstations in a massive office building are all suddenly using less power, that is something companies will upgrade for, buying new Intel chips in the process. This is why I believe others are going to follow AMD in supporting the standard.
Intel and ARM are all about lowering power usage on all fronts. The writing is on the wall; you just need to take your Nvidia shrine off said wall to see it. G-Sync is a great idea. I said one of the companies needed to make that exact technology in a thread a few months before we had heard anything about G-Sync. The problem is that Nvidia's exclusionary tactics are a losing proposition in this instance. My opinions have nothing to do with my feelings toward either Nvidia or AMD.
I don't understand how you can come to these conclusions. The AnandTech article is right there. Unless my reading comprehension is completely off, basically every one of those so-called lies is outright contradicted by the AnandTech article from Computex 2014.
You talk about a great way to start a conversation. Let's talk about a great way to ruin a thread. If you aren't interested in constructively talking about Freesync and Adaptive sync go post in a Gsync thread. No one is stopping you. You'll notice I'm absent from Nvidia threads on these forums because I don't have anything to say, negative or positive.
If you don't have anything but negative things to say about AMD, why are you wasting all this time and energy on your FUD?
Let's talk about the spec, the ability to save power. That's exactly what you're not understanding. You and many, many others are confusing two very different things.

The spec change brings an eDP feature over to the desktop. It's already being used to save power on mobile devices, and this ability is a far cry from what G-Sync is. Moving this ability to desktop monitors isn't FreeSync. Even if 100,000,000,000 monitors use it, it's not FreeSync. For years now, the same capability that is in the new DP spec has existed in the eDP spec. It is in laptops. Bringing the power-saving feature to the desktop does not give anyone a G-Sync experience, any more than it ever did on our laptops.

The spec is not FreeSync.

Once you understand that, you will start to see how moving the spec over to the desktop cannot kill G-Sync by itself. AMD has to make use of this spec in a special way, one that no one has used before, and it has yet to be proven.

I am not against the effort at all. I am not one of the guys screaming "AMD lies." If AMD can pull off a G-Sync-like experience using this spec, that is awesome. I am sure they can come up with something, and it may be as good as, better than, or worse than the G-Sync approach. But AMD figuring out a way to mimic G-Sync doesn't make the spec equal FreeSync. It doesn't mean that everyone will all of a sudden be using AMD's FreeSync. Even if everyone adopts the power-saving extension on the desktop and everyone uses the spec, that doesn't mean everyone has a G-Sync alternative.

The eDP capability has existed and is already being used, and it gives nothing like a G-Sync experience. So again: having this same spec ability on the desktop does not equal FreeSync. It will not give anyone a G-Sync experience.

AMD is trying to make clever use of this spec through their GPUs and drivers. It's obviously not simple, or we would already have seen FreeSync on their laptops. So using the spec in a new way to create a G-Sync-like experience is drastically different from having this power-saving spec change in desktop monitors.

You cannot lump them together like that. FreeSync is a creation from AMD that works through their software, using their GPUs and a monitor with the new DP spec. It's a far cry from what people are trying to make it out to be.
How it compares to and affects G-Sync remains to be seen. The good thing is that since it will be part of the signal spec (Adaptive-Sync), once it's supported in monitors it could potentially be utilized by all GPUs (with some effort). If it's good and cheap or free, it'll drive the ridiculous G-Sync costs down, since G-Sync wouldn't be worth a premium.

It's clearly beneficial to consumers; now we just have to wait until the implementation details and reviews start to trickle out.
Anandtech repeating AMD's lies doesn't make them true.
And it's not my FUD. It's AMD's FUD that I'm trying to cut through. Yet people like you are so resistant that you reject any information that doesn't fit your preconceptions.
Examine the facts. Look at the evidence. Look at the information. It's all there, and it's overwhelmingly conclusive.
I would suggest you wait for the first products to hit the market before making accusations of lies or drawing any conclusions.
But I also think that Nvidia won't add Adaptive-Sync to their drivers. Why would they?
I can't wait to see it in 4K monitors. Hopefully by then there will be price drops on 4K screens as well, as they get more mature. I'm guessing FreeSync will add little or no additional cost.
This thread is so full of FUD it's a train wreck to read. The same thread-crapping is getting old. Not every thread is made for venting about the same stuff. The faux rage is absurd.
Yeah, that's my hope with regard to 4K as well. GPU-controlled refresh should have been here after LCDs took hold, but with 4K I guess the feature comes at the right time. As for the FUD: GPU-controlled refresh looks set to end the refresh-rate madness (vsync, triple buffering, stuttering), and arguing about marketing is a waste. Well, to each their own. In my opinion, the more widespread the usage the better; I'd hate to see GPU-controlled refresh become a niche market.
so free sync in 16 months

No, it is not missing. G-Sync has been supported by "every" Kepler card since March 2012, while for the whole Adaptive-Sync spec you need one of the latest AMD cards. That is the reason why it is unlikely that any vendor outside of AMD can support the whole Adaptive-Sync spec at this moment.
Triple buffering is still necessary for AMD's minimum-refresh-rate fallback case; otherwise they lose GPU performance whenever a frame finishes during scanout. You want to switch to double buffering at some point, to avoid excessive latency or dropped frames in the max-refresh fallback case.
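To make the two fallback cases concrete, here's a toy sketch in Python (the 30-144 Hz window and every number in it are made up, and this is obviously not AMD's actual driver logic), just to show what each case forces the driver to do:

MIN_HZ, MAX_HZ = 30.0, 144.0        # hypothetical panel VRR window
MAX_INTERVAL = 1.0 / MIN_HZ         # longest the panel can hold one frame
MIN_INTERVAL = 1.0 / MAX_HZ         # shortest time between two refreshes

def present(frame_time, buffers):
    # Decide what happens when a frame takes frame_time seconds to render.
    if frame_time > MAX_INTERVAL:
        # Min-refresh fallback: the panel has to rescan the old frame. With
        # only two buffers the GPU stalls (the front buffer is being scanned
        # out and the back buffer holds the unfinished frame); a third buffer
        # lets rendering continue during the repeat scanout.
        return "repeat last frame" if buffers >= 3 else "GPU stalls during scanout"
    if frame_time < MIN_INTERVAL:
        # Max-refresh fallback: frames arrive faster than the panel can show
        # them, so either wait (vsync-style latency) or drop frames.
        return "wait for next refresh slot (or drop)"
    return "refresh the panel now"  # inside the VRR window: the ideal case

for ft in (1.0 / 200, 1.0 / 60, 1.0 / 20):
    print("%5.1f ms frame -> %s" % (ft * 1000, present(ft, buffers=3)))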
That said, FreeSync, once available, will do exactly the same thing as G-Sync. So, be happy, AMD owners!
Still trying to understand the low-end edge case (not arguing against it; I just don't completely understand the solution myself). I definitely see the need for buffering when max FPS is above the monitor's refresh rate, otherwise you get tearing. I'm very curious, technically, how AMD will tackle the high edge case without lag (Crossfire, surround, etc.) for FreeSync. Hopefully it's not a compromised solution or an ignored problem, like Crossfire stuttering before the 290X. I really meant that if FreeSync/G-Sync takes off and becomes commonplace, I really see a console-like experience coming to the PC (just install and play; nothing to set on the GPU, desktop, or in-game).
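For what it's worth, one generic way to handle the high edge case without added lag (just an illustration, not AMD's confirmed solution) is mailbox-style presentation: at each refresh, show the newest completed frame and drop anything older, where a FIFO queue would instead show every frame at the cost of growing latency. A toy model in Python, with made-up timings:

FRAME_INTERVAL = 4    # ms between finished frames (250 fps, made up)
REFRESH_INTERVAL = 7  # ms between refreshes once the panel's max rate is hit

finish_times = [i * FRAME_INTERVAL for i in range(10)]

def newest_ready(t):
    # Mailbox-style presentation: show the newest frame finished by time t.
    ready = [i for i, ft in enumerate(finish_times) if ft <= t]
    return ready[-1] if ready else None

for slot in range(5):
    t = slot * REFRESH_INTERVAL
    print("refresh at %2d ms -> frame %s" % (t, newest_ready(t)))

# Frames that never print (2, 4, 6) were dropped: low latency and no tearing,
# at the cost of skipped frames. A FIFO queue would show every frame instead,
# but each one later than the last.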
I don't want to think about how complicated this all gets when you use multiple GPUs.
Remember the frame pacing issues AMD had with Crossfire? Throw in extra framebuffers, displays with different timing constraints, a framerate prediction algorithm that throws in extra refreshes, extra overhead, and the fact that your refresh rate is not predictable anymore, and it's not that simple.

It's not that much more complicated, is it?
I mean, whatever multi-GPU is doing to generate a frame, it's still got to get to a point where the GPU in control of the display says "Okay, frame is ready!" - at which point it's the same process as it would be for a single GPU. The display doesn't care what the GPUs are doing before the frame is ready, only when the frame is done.
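A minimal sketch of that point in Python (hypothetical names, assuming alternate-frame rendering; this is not any vendor's actual API): the display path only ever reacts to a single "frame ready" event, no matter which GPU produced the frame.

import itertools

def render(gpu_id, frame):
    # Stand-in for the actual rendering work done on one of the GPUs.
    return "frame %d rendered on GPU %d" % (frame, gpu_id)

def display_present(frame):
    # The single event the variable-refresh display logic reacts to.
    print("frame %d ready -> refresh the panel" % frame)

gpus = itertools.cycle([0, 1])  # AFR: the two GPUs take turns
for frame in range(4):
    render(next(gpus), frame)   # which GPU rendered it doesn't matter...
    display_present(frame)      # ...only the moment it's done does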
LOL, just watched one of the TechReport guys over on a Newegg podcast talking about the sync wars.

One of the hosts pointed out something:

Now that VESA Adaptive-Sync is part of the 1.2a / 1.3 DisplayPort spec, any future G-Sync monitor that tries to be DP compliant will effectively be FS compatible.

So FS users can use any monitor with AS or G-Sync w/ DP, but GS users can only buy GS monitors.
Is Adaptive-Sync something that was already a mandatory feature in the upcoming 1.3 spec and was brought over to DP 1.2a as an optional feature, or is it optional for 1.3 as well? And if it is mandatory, is it mandatory only for GPUs, or for monitors too?
Meaning it will only be used by those that are willing to spend the extra money to attract buyers that want it. In other words, the vast majority of monitors will most likely not adopt it, because they can save money by not doing so.
Variable refresh isn't a mandatory part of eDP. The standard allows for it, but you don't have to implement it to use some other parts of the eDP standard.

"Adaptive-Sync is a standard component of VESA's embedded DisplayPort (eDP) specification. eDP is a companion standard to the DisplayPort interface."
So it's a "mandatory" part of eDP, and eDP is itself a companion standard to (not a mandatory part of) DP.
Monitors that are eDP compliant, not necessarily those that are DP compliant.
Some of the power-saving aspects may be attractive for non-gaming desktop monitors. If the added cost is minimal, it may see widespread adoption outside of mobile; it's hard to predict at this point.
