[TR] FreeSync monitors will sample next month, start selling next year


Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I can see why NV could be butthurt about FS ruining their Grand Milking Festival. But why would the forum posters be as well?
More free options? Sign me up!
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
I once again ask you to think rationally. Just because YOU aren't interested in adaptive vsync to save power on the desktop doesn't mean that the millions of people buying CPUs with onboard GPUs from Intel and AMD do not care about that extra power usage. Stop thinking in terms of enthusiasts. If 10,000 monitors and workstations in a massive office building are all suddenly using less power, that is something companies will upgrade for, thus buying new Intel chips. This is why I believe others are going to follow AMD in supporting the standard.

Intel and ARM are all about lowering power usage on all fronts. The writing is on the wall; you just need to take your Nvidia shrine off said wall to see it. G-Sync is a great idea. I said one of the companies needed to make that exact technology in a thread a few months before we had heard anything about G-Sync. The problem is that Nvidia's exclusionary tactics are a losing proposition in this instance. My opinions have nothing to do with my feelings toward either Nvidia or AMD.

Let's talk about the spec, the ability to save power. That's exactly what you're not understanding. You and many, many others are confusing two very different things.
The spec change brings an eDP feature/function over to the desktop. It's already being used to save power on mobile devices, and this ability is a far cry from what G-Sync is. Moving this ability to desktop monitors isn't FreeSync. No matter if 100,000,000,000 monitors use it, it's not FreeSync. For years now, the same capability that is in the new DP spec has existed in the eDP spec. It is in laptops. Bringing the power-saving feature to the desktop does not give anyone a G-Sync experience, any more than it ever did on our laptops.

The spec is not FreeSync.

Once you understand that, you will start to see how moving the spec over to the desktop cannot kill G-Sync by itself. AMD has to make use of this spec in a special way, one that no one has used before, and it has yet to be proven.

I am not against the effort at all. I am not one of the guys screaming "AMD lies". If AMD can pull off a G-Sync-like experience using this spec, that is awesome. I am sure they can come up with something, and it may be as good as, better than, or worse than the G-Sync approach. But AMD figuring out a way to mimic G-Sync doesn't make the spec equal FreeSync. It doesn't mean that everyone will all of a sudden be using AMD's FreeSync. Even if everyone adopts the power-saving extension on the desktop and everyone uses the spec, that doesn't mean everyone has a G-Sync alternative.

The eDP capability has existed and is being used already, and it gives nothing like a G-Sync experience. So again: having this same spec ability on the desktop does not equal FreeSync. It will not give anyone a G-Sync experience.

AMD is trying to make clever use of this spec, using their GPUs and drivers. It's obviously not simple, or we would already have seen FreeSync on their laptops. So using the spec in a new way to create a G-Sync-like experience is drastically different from having this power-saving spec change in desktop monitors.

You cannot put them together like that. FreeSync is a creation from AMD that works through their software, using their GPU and a monitor with the new DP spec. It's a far cry from what people are trying to make it out to be.
 

Mand

Senior member
Jan 13, 2014
664
0
0
I don't understand how you can come to these conclusions. The AnandTech article is right there. Unless my reading comprehension is completely off, basically every one of those so-called lies is outright contradicted by the AnandTech article from Computex 2014.

You talk about a great way to start a conversation. Let's talk about a great way to ruin a thread. If you aren't interested in constructively talking about FreeSync and Adaptive-Sync, go post in a G-Sync thread. No one is stopping you. You'll notice I'm absent from Nvidia threads on these forums because I don't have anything to say, negative or positive.

If you don't have anything but negative things to say about AMD, why are you wasting all this time and energy on your FUD?

Anandtech repeating AMD's lies doesn't make them true.

And it's not my FUD. It's AMD's FUD, that I'm trying to cut through. Yet, people like you are so resistant that you reject any information that doesn't meet with your preconception.

Examine the facts. Look at the evidence. Look at the information. It's all there, and it's overwhelmingly conclusive.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
I can't wait to see it in 4K monitors. Hopefully by then there will be price drops on 4K screens as well, as they get more mature. I'm guessing FreeSync will add little or no additional cost.

This thread is so full of FUD it's a train wreck to read. The same thread-crapping is getting old. Not every thread is made for venting about the same stuff. The faux rage is absurd.
 

wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Let's talk about the spec, the ability to save power. That's exactly what you're not understanding. You and many, many others are confusing two very different things.
The spec change brings an eDP feature/function over to the desktop. It's already being used to save power on mobile devices, and this ability is a far cry from what G-Sync is. Moving this ability to desktop monitors isn't FreeSync. No matter if 100,000,000,000 monitors use it, it's not FreeSync. For years now, the same capability that is in the new DP spec has existed in the eDP spec. It is in laptops. Bringing the power-saving feature to the desktop does not give anyone a G-Sync experience, any more than it ever did on our laptops.

The spec is not FreeSync.

Once you understand that, you will start to see how moving the spec over to the desktop cannot kill G-Sync by itself. AMD has to make use of this spec in a special way, one that no one has used before, and it has yet to be proven.

I am not against the effort at all. I am not one of the guys screaming "AMD lies". If AMD can pull off a G-Sync-like experience using this spec, that is awesome. I am sure they can come up with something, and it may be as good as, better than, or worse than the G-Sync approach. But AMD figuring out a way to mimic G-Sync doesn't make the spec equal FreeSync. It doesn't mean that everyone will all of a sudden be using AMD's FreeSync. Even if everyone adopts the power-saving extension on the desktop and everyone uses the spec, that doesn't mean everyone has a G-Sync alternative.

The eDP capability has existed and is being used already, and it gives nothing like a G-Sync experience. So again: having this same spec ability on the desktop does not equal FreeSync. It will not give anyone a G-Sync experience.

AMD is trying to make clever use of this spec, using their GPUs and drivers. It's obviously not simple, or we would already have seen FreeSync on their laptops. So using the spec in a new way to create a G-Sync-like experience is drastically different from having this power-saving spec change in desktop monitors.

You cannot put them together like that. FreeSync is a creation from AMD that works through their software, using their GPU and a monitor with the new DP spec. It's a far cry from what people are trying to make it out to be.

How it compares to and affects G-Sync remains to be seen. The good thing is that since it will be part of the signal spec (Adaptive-Sync), once it's supported in monitors it could potentially be utilized by all GPUs (with some effort). If it's good and cheap or free, it'll drive the ridiculous G-Sync costs down, since G-Sync wouldn't be worth a premium.

It's clearly beneficial to consumers; now we just have to wait until the implementation details and reviews start to trickle out.
 

Fastx

Senior member
Dec 18, 2008
780
0
0
How it compares to and affects G-Sync remains to be seen. The good thing is that since it will be part of the signal spec (Adaptive-Sync), once it's supported in monitors it could potentially be utilized by all GPUs (with some effort). If it's good and cheap or free, it'll drive the ridiculous G-Sync costs down, since G-Sync wouldn't be worth a premium.

It's clearly beneficial to consumers; now we just have to wait until the implementation details and reviews start to trickle out.

You know, you might be right on this; I didn't think about that possibility. :) I was thinking/wondering about a possible G-Sync/FS monitor down the road if FS turns out to be decent/real.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Anandtech repeating AMD's lies doesn't make them true.

And it's not my FUD. It's AMD's FUD, that I'm trying to cut through. Yet, people like you are so resistant that you reject any information that doesn't meet with your preconception.

Examine the facts. Look at the evidence. Look at the information. It's all there, and it's overwhelmingly conclusive.

I would suggest you wait for the first products to hit the market before making accusations of lies and drawing any conclusions.
 

Mand

Senior member
Jan 13, 2014
664
0
0
I would suggest you wait for the first products to hit the market before making accusations of lies and drawing any conclusions.

No, I won't wait for products to hit the market, because their lies can be proven false now.

Why let them get away with a year of lying? What reason could there possibly be for saying "Well, shucks, I know they lied about not one but two of their demos by claiming they showed things they didn't actually show, but let's just give them the benefit of the doubt until they're done."

No. If they're making statements, the burden is on them to not lie to us. If they do lie to us, they deserve to get called on it, always.
 

MathMan

Member
Jul 7, 2011
93
0
0
I think it's undeniable that AMD has been spreading FUD and misinformation all over the place, the one-frame delay claim by Huddy about G-Sync probably being the most egregious. It has since been contradicted even by AMD marketing guys in the TechReport article.

I think it's fair to call AMD out on that. Nobody should be able to get away with a smearing campaign because they don't have an answer ready.

That said, FreeSync, once available, will do exactly the same thing as GSync. So, be happy, AMD owners!

But I also think that Nvidia won't add Adaptive-Sync in their drivers. Why would they?
 

SoulWager

Member
Jan 23, 2013
155
0
71
But I also think that Nvidia won't add Adaptive-Sync in their drivers. Why would they?

They will if and when there is a big enough installed base of adaptive sync displays to justify losing monitor sales in exchange for video card sales. I think they'd support it on mobile parts first, because they wouldn't be harming their other product sales by supporting it there. Might even give it the g-sync branding.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
I can't wait to see it in 4K monitors. Hopefully by then there will be price drops on 4K screens as well, as they get more mature. I'm guessing FreeSync will add little or no additional cost.

This thread is so full of FUD it's a train wreck to read. The same thread-crapping is getting old. Not every thread is made for venting about the same stuff. The faux rage is absurd.

Yeah, that's my hope with regard to 4K as well. GPU-controlled refresh should have been here after LCDs took hold, but with 4K I guess the feature comes at the right time. In regards to the FUD, GPU-controlled refresh looks to end the refresh-rate madness (vsync, triple buffering, stuttering), and discussing marketing is a waste. Well, to each their own. In my opinion, the more widespread the usage the better; I'd hate to see GPU-controlled refresh be a niche market.
 

SoulWager

Member
Jan 23, 2013
155
0
71
Yeah, that's my hope with regard to 4K as well. GPU-controlled refresh should have been here after LCDs took hold, but with 4K I guess the feature comes at the right time. In regards to the FUD, GPU-controlled refresh looks to end the refresh-rate madness (vsync, triple buffering, stuttering), and discussing marketing is a waste. Well, to each their own. In my opinion, the more widespread the usage the better; I'd hate to see GPU-controlled refresh be a niche market.

Triple buffering is still necessary for AMD's minimum-refresh-rate fallback case, or they lose GPU performance whenever a frame finishes during scanout. You want to switch to double buffering at some point, to avoid excessive latency or dropped frames in the max-refresh fallback case.
 

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
No, it is not missing. G-Sync is supported by "every" Kepler card going back to March 2012, while for the whole Adaptive-Sync spec you need one of the latest AMD cards. And that is the reason why it is unlikely that every vendor outside of AMD can support the whole A-Sync spec at this moment.
So FreeSync in 16 months,
G-Sync 2.5 years.
Got it.
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
Triple buffering is still necessary for AMD's minimum-refresh-rate fallback case, or they lose GPU performance whenever a frame finishes during scanout. You want to switch to double buffering at some point, to avoid excessive latency or dropped frames in the max-refresh fallback case.


Still trying to understand the low-end edge case (not arguing against it, I just don't understand the solution completely). I definitely see the need for buffering when max FPS is above the monitor refresh rate, otherwise you get tearing. I'm very curious, technically, how AMD will tackle the high edge case without lag (CrossFire, Surround, etc.) for FreeSync. Hopefully not a compromised solution or an ignored problem, like CrossFire stuttering before the 290X. I really meant that if FreeSync/G-Sync takes off and is commonplace, I really see a console-like experience coming to the PC (just install and play; nothing to set on the GPU, desktop, or in game).
 

Mand

Senior member
Jan 13, 2014
664
0
0
That said, FreeSync, once available, will do exactly the same thing as GSync. So, be happy, AMD owners!

We hope.

So far, AMD has done a rather awful job of justifying that belief.

And they could have avoided it. They could have said "We're working on it, we'll let you know." Instead, they lied, smeared, and faked their way to fooling people into thinking FreeSync is a legitimate competitor.
 

SoulWager

Member
Jan 23, 2013
155
0
71
Still trying to understand the low-end edge case (not arguing against it, I just don't understand the solution completely). I definitely see the need for buffering when max FPS is above the monitor refresh rate, otherwise you get tearing. I'm very curious, technically, how AMD will tackle the high edge case without lag (CrossFire, Surround, etc.) for FreeSync. Hopefully not a compromised solution or an ignored problem, like CrossFire stuttering before the 290X. I really meant that if FreeSync/G-Sync takes off and is commonplace, I really see a console-like experience coming to the PC (just install and play; nothing to set on the GPU, desktop, or in game).

Double buffering means you have one framebuffer you're rendering to, and one that's presented to the display. If you flip framebuffers during the scan, you get tearing. If you wait until the scan is done, you lose performance. This loss of performance is why double-buffered v-sync drops from 60fps directly to 30fps, skipping all the numbers in between. If your FreeSync monitor does 30-60Hz, double-buffered FreeSync would mean you drop from 30fps directly to 20fps. If the monitor did 30-120Hz, you would drop from 30fps to 24fps.
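
To put rough numbers on that, here's a quick back-of-the-envelope sketch (Python; the function name and the rule that a late frame has to wait out one full repeated scanout are just my reading of the fallback behaviour described above, not any vendor's documented algorithm):

Code:
# Worst-case frame-to-frame time with double buffering when the game drops
# just below the panel's minimum refresh rate. Assumption (mine, for
# illustration): the panel forces a repeat refresh after max_interval, and a
# frame that finishes just after that repeat starts must wait for the whole
# repeated scanout before it can be flipped.

def double_buffered_fallback_fps(min_refresh_hz, max_refresh_hz):
    max_interval = 1000.0 / min_refresh_hz  # longest the panel can hold a frame (ms)
    scanout = 1000.0 / max_refresh_hz       # time one refresh takes at max rate (ms)
    return 1000.0 / (max_interval + scanout)

print(double_buffered_fallback_fps(30, 60))   # ~20 fps on a 30-60Hz panel
print(double_buffered_fallback_fps(30, 120))  # ~24 fps on a 30-120Hz panel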

Triple buffering adds an extra framebuffer, so if you finish a frame during scanout, you have one frame scanning, one frame done, and the GPU can start working on the next frame in the third framebuffer. As soon as you're done scanning, you can start sending the next frame. This is important for the minimum-refresh-rate case because you don't want to idle your GPU for several milliseconds when you're already at a low framerate. In the max-refresh case, however, triple buffering either adds a frame of latency or drops frames. It's okay to drop frames you've already sent to the monitor, but to avoid excess latency or judder, you need to tell the GPU to wait before rendering the next frame instead of using the third framebuffer. Basically, you have to dedicate the VRAM to triple buffering, but you can't let the GPU use the extra framebuffer unless you're repeating a frame.


The max-refresh-rate fallback case is simple enough, but we're just getting started with repeating frames. Sometimes you want to repeat a frame before you're forced to, because it lets you use that 24-30fps range not just for rendering the next frame, but also for displaying it at the exact moment it finishes.

Say your panel can manage 33ms before needing to be refreshed, a refresh takes 8ms, the last frame took 30ms to render, and this frame takes 35ms to render. Frames normally take close to the same amount of time to render as the previous frame, so you can reduce the risk of the frame completing inside the 8ms lockout by repeating the previous frame before you're forced to. If you repeat the frame at, say, 12ms, you can move your "display exactly when finished" window from 8-33ms to 20-53ms.
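
Worked through in the same style (again just a sketch; the rule that the panel can hold for another 33ms once the repeated scanout completes is my reading of the example, not documented FreeSync or G-Sync behaviour):

Code:
MAX_HOLD = 33.0  # ms the panel can hold a frame before it must be refreshed
SCANOUT = 8.0    # ms one refresh/scanout takes

def display_when_finished_window(repeat_at=None):
    """(earliest, latest) completion time, measured from the start of the
    current scanout, at which a newly finished frame can go straight to the
    screen."""
    if repeat_at is None:
        # No proactive repeat: the window opens when the current scanout ends
        # and closes when the panel forces a refresh on its own.
        return SCANOUT, MAX_HOLD
    # Proactive repeat: its scanout ends at repeat_at + SCANOUT, after which
    # the panel can hold for another MAX_HOLD.
    return repeat_at + SCANOUT, repeat_at + SCANOUT + MAX_HOLD

print(display_when_finished_window())      # (8.0, 33.0)
print(display_when_finished_window(12.0))  # (20.0, 53.0) -> a ~35ms frame now fits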


No clue if the new G-Sync monitors are doing that last bit, but the Swift does appear to let the GPU run flat out when running between 20 and 30fps, which is an improvement over the early mod kits.

I don't want to think about how complicated this all gets when you use multiple GPUs.
 

Mand

Senior member
Jan 13, 2014
664
0
0
I don't want to think about how complicated this all gets when you use multiple GPUs.

It's not that much more complicated, is it?

I mean, whatever multi-GPU is doing to generate a frame, it's still got to get to a point where the GPU in control of the display says "Okay, frame is ready!" - at which point it's the same process as it would be for a single GPU. The display doesn't care what the GPUs are doing before the frame is ready, only when the frame is done.
 

SoulWager

Member
Jan 23, 2013
155
0
71
It's not that much more complicated, is it?

I mean, whatever multi-GPU is doing to generate a frame, it's still got to get to a point where the GPU in control of the display says "Okay, frame is ready!" - at which point it's the same process as it would be for a single GPU. The display doesn't care what the GPUs are doing before the frame is ready, only when the frame is done.
Remember the frame-pacing issues AMD had with CrossFire? Throw in extra framebuffers, displays with different timing constraints, a framerate-prediction algorithm that adds extra refreshes, extra overhead, and the fact that your refresh rate isn't predictable anymore, and it's not as simple.

Life would probably be easier if they got split frame rendering to work well. What was the problem with it anyway?
 

MathMan

Member
Jul 7, 2011
93
0
0
The problem with split frame rendering is that it's terrible at dealing with intra-frame dependencies.

Say you first render to a texture, then this texture is used during the final render.

You have two options:
- Render that texture on both GPU A and GPU B, then split the render of the final frame. This makes you do the work twice and reduces SLI efficiency accordingly.
- Render the texture on GPU A and then copy it over to GPU B to be used during the split render. This has GPU B potentially sitting idle while GPU A is busy.

With AFR, you only have inter-frame dependency issues, but those are much more rare.

Split rendering in the old days was easy because render-to-texture didn't exist. Now it's everywhere.

Same issue exists with compute shaders and their results.

Split rendering is only solvable these days with an extremely high-bandwidth channel between the two GPUs, and even then it's going to be hard.
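
A toy timing model of those trade-offs (all numbers and function names are made up, purely to show the shape of the problem, not measurements of any real SLI/CrossFire setup):

Code:
# Per-frame GPU time under the two SFR options above, versus AFR throughput.
# texture = time for the render-to-texture pass, final = time for the final
# pass, copy = time to move the texture between GPUs. All values hypothetical.

def sfr_duplicate(texture, final):
    # Option 1: both GPUs render the texture, then each does half the final pass.
    return texture + final / 2

def sfr_copy(texture, final, copy):
    # Option 2: GPU A renders the texture, GPU B waits for the copy, then
    # both split the final pass.
    return texture + copy + final / 2

def afr_throughput(texture, final):
    # AFR: each GPU renders whole alternating frames, so steady-state
    # frame-to-frame time is roughly half a full frame (ignoring the rarer
    # inter-frame dependencies).
    return (texture + final) / 2

t, f, c = 4.0, 8.0, 1.0  # ms, illustrative
print(sfr_duplicate(t, f))   # 8.0 ms per frame
print(sfr_copy(t, f, c))     # 9.0 ms per frame
print(afr_throughput(t, f))  # 6.0 ms per frame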
 

gorobei

Diamond Member
Jan 7, 2007
4,049
1,541
136
LOL, just watched one of the TechReport guys over on some Newegg podcast talking about the sync wars.

One of the hosts pointed out something:
now that VESA Adaptive-Sync is part of the DisplayPort 1.2a/1.3 spec, any future G-Sync monitor that tries to be DP compliant will effectively be FS compatible.

So FS users can use any monitor with Adaptive-Sync or G-Sync w/ DP, but G-Sync users can only buy G-Sync monitors.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Is Adaptive-Sync something that was already a mandatory feature in the upcoming 1.3 that they brought over to DP 1.2a as an optional feature, or is it also optional for 1.3? And if it is mandatory, is it mandatory only for GPUs, or also for monitors?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
LOL, just watched one of the TechReport guys over on some Newegg podcast talking about the sync wars.

One of the hosts pointed out something:
now that VESA Adaptive-Sync is part of the DisplayPort 1.2a/1.3 spec, any future G-Sync monitor that tries to be DP compliant will effectively be FS compatible.

So FS users can use any monitor with Adaptive-Sync or G-Sync w/ DP, but G-Sync users can only buy G-Sync monitors.

It's an optional feature.
http://www.anandtech.com/show/8008/...andard-variable-refresh-monitors-move-forward

Meaning it will only be used by those that are willing to spend the extra money to attract buyers who want it. In other words, the vast majority of monitors will most likely not adopt it, because they can save money by not doing so.

So no, you can have a G-Sync monitor that is not FreeSync compatible.
 

Spanners

Senior member
Mar 16, 2014
325
1
0
Is Adaptive-Sync something that was already a mandatory feature in the upcoming 1.3 that they brought over to DP 1.2a as an optional feature, or is it also optional for 1.3? And if it is mandatory, is it mandatory only for GPUs, or also for monitors?


"Adaptive-Sync is a standard component of VESA’s embedded DisplayPort (eDP™) specification. eDP is a companion standard to the DisplayPort interface."

So it's a "mandatory" part of eDP, which is a companion standard to (not a mandatory part of) DP.

LOL, just watched one of the TechReport guys over on some Newegg podcast talking about the sync wars.

One of the hosts pointed out something:
now that VESA Adaptive-Sync is part of the DisplayPort 1.2a/1.3 spec, any future G-Sync monitor that tries to be DP compliant will effectively be FS compatible.

So FS users can use any monitor with Adaptive-Sync or G-Sync w/ DP, but G-Sync users can only buy G-Sync monitors.

Monitors that are eDP compliant, not necessarily those that are DP compliant.

Meaning it will only be used by those that are willing to spend the extra money to attract buyers who want it. In other words, the vast majority of monitors will most likely not adopt it, because they can save money by not doing so.

Some of the power-saving aspects may be attractive for non-gaming desktop monitors. If the added cost is minimal, it may see widespread adoption outside of mobile; hard to predict at this point.
 

SoulWager

Member
Jan 23, 2013
155
0
71
"Adaptive-Sync is a standard component of VESA’s embedded DisplayPort (eDP™) specification. eDP is a companion standard to the DisplayPort interface."

So it's a "mandatory" part of eDP, which is a companion standard to (not a mandatory part of) DP.



Monitors that are eDP compliant, not necessarily those that are DP compliant.



Some of the power-saving aspects may be attractive for non-gaming desktop monitors. If the added cost is minimal, it may see widespread adoption outside of mobile; hard to predict at this point.
Variable refresh isn't a mandatory part of eDP. The standard allows for it, but you don't have to implement it to use some other parts of the eDP standard.

For example, someone building a kiosk with a touchscreen might use embedded DisplayPort, but they aren't going to care about the power-saving features as much as someone using it in a battery-powered device.
 