[TR] FreeSync monitors will sample next month, start selling next year


SoulWager

Member
Jan 23, 2013
155
0
71
The demos were to show the synchronization of FPS with the monitor refresh rate (Hz) using existing technology, and yes, they proved that.
Yes, they didn't show variable fps synchronized to variable monitor refresh rates, but that doesn't mean they lied.
Why don't we all wait for the next demos, or the actual products, before making accusations?
They proved they can do v-sync at 50hz, which any monitor in the last 20 years can do, by delaying frames until the next refresh cycle. They got called out for the CES demo, then they pulled the exact same shit at computex. Either they're too incompetent to show what they have, or they're outright lying about their capabilities. Neither bodes well for a working product.
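To put numbers on it, here's the kind of thing any fixed-refresh monitor already does under plain v-sync (a rough sketch of my own, with made-up frame times, not anything from AMD's demo): a finished frame just sits and waits for the next refresh boundary.

Code:
import math

# Plain v-sync at a fixed 50Hz refresh: a finished frame is held until
# the next refresh boundary. No variable refresh involved anywhere.
REFRESH_HZ = 50
PERIOD_MS = 1000.0 / REFRESH_HZ  # 20 ms per refresh cycle

def vsync_display_time(frame_done_ms):
    """When the frame actually reaches the screen under v-sync."""
    return math.ceil(frame_done_ms / PERIOD_MS) * PERIOD_MS

for done in (13.0, 21.0, 39.5):  # made-up frame completion times
    shown = vsync_display_time(done)
    print(f"ready at {done:5.1f} ms -> shown at {shown:5.1f} ms "
          f"(+{shown - done:.1f} ms waiting)")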
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
They proved they can do v-sync at 50hz, which any monitor in the last 20 years can do, by delaying frames until the next refresh cycle. They got called out for the CES demo, then they pulled the exact same shit at computex. Either they're too incompetent to show what they have, or they're outright lying about their capabilities. Neither bodes well for a working product.

They used vblank to set the refresh rate, which proved their concept.
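Roughly, the vblank arithmetic looks like this (a sketch with illustrative 1080p-class timings I picked myself, not AMD's actual numbers): a panel's refresh rate is the pixel clock divided by the total pixels per frame, so padding the vertical blanking interval with extra lines stretches the frame period and lowers the effective refresh rate.

Code:
# Vblank-based refresh control, with illustrative timings.
# refresh = pixel_clock / (h_total * v_total); extra blanking
# lines make v_total bigger and the refresh rate lower.
PIXEL_CLOCK_HZ = 148_500_000  # assumed 1080p-class pixel clock
H_TOTAL = 2200                # active + horizontal blanking pixels
V_ACTIVE = 1080               # visible lines

def refresh_hz(v_blank_lines):
    return PIXEL_CLOCK_HZ / (H_TOTAL * (V_ACTIVE + v_blank_lines))

print(f"45 blank lines  -> {refresh_hz(45):.1f} Hz")   # ~60 Hz
print(f"270 blank lines -> {refresh_hz(270):.1f} Hz")  # ~50 Hz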

At least that was my understanding. Either way, the reactions in this thread are why NDAs exist, and why companies usually keep new products close to the chest.

It just goes to show how incompetent AMD's marketing team is.
 

SoulWager

Member
Jan 23, 2013
155
0
71
They used vblank to set the refresh rate, which proved their concept.

At least that was my understanding. Either way, the reactions in this thread are why NDAs exist, and why companies usually keep new products close to the chest.

It just goes to show how incompetent AMD's marketing team is.
If they did that, there's no evidence of it. If you need an oscilloscope to prove you've done something, either bring the oscilloscope to the demo, or don't show a demo in the first place. "freesync" hasn't been demonstrated until variable refresh has been demonstrated.

As for NDAs, the reactions in this thread are caused by the difference between AMD's claims and AMD's demonstrations. If they matched, there would be nothing to argue over. No, NDAs exist to keep your competition from learning your product strategy and, to a lesser extent, to control PR (you want product hype to peak at release, not before). That's not relevant in this case, because AMD isn't trying to keep a new product under wraps until release; they're trying to slow down adoption until they can catch up to an existing product.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
The Wright brothers designed their new prototype in a way that lets it reach a speed of 40 mph, which, after attaching wings, can create enough lift to take off with an adult man aboard! We gathered today to see if the machine can reach the required speed!
*The prototype reaches 40 mph, then stops. The Wright brothers celebrate!*
- From the crowd: "But it didn't lift an inch! Liars!"
- It can't lift off because we haven't attached the wings yet.
- Another lie! The prototype didn't accelerate to 40 mph. They blew air on it, pulled it with an invisible rope, and measured the speed of cars going down the nearby road! Liars!

Some people...
It was a proof of concept. It worked. They didn't need to change the refresh rate in display settings; it adjusted by itself. Presentation over, job done. What is so hard to get here?
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
They haven't demoed variable refresh rates yet because it's not ready. They did however hint that we will be seeing this in September.

The whole point of their demo with the laptop screens was to use vblank to change the refresh rate of their monitor to match the FPS being displayed and PROVE THEIR CONCEPT.

We just have to wait for the final implementation to appear. That demo took place before VESA even finalized the official spec for adaptive-sync. I don't understand the outrage over what AMD showed.

Now come on!

See, when you say things like this it starts to look like you're purposely trying to mislead. I mean, we just went over this, and here you are saying some rather strange things again. Things you yourself know aren't true.

You know for a fact that the spec change has absolutely no bearing on their laptop demo. You yourself know that the eDP capabilities have been added to the desktop spec, and that this is nothing brand new. The way AMD says they can use it to mimic gsync, that's a new approach. No one has done it, and it can't be done without new technology, radically different on the software side alone. Freesync will be developed by AMD; it will be an AMD-only feature.

A few posts ago in this thread, someone claimed nvidia chips don't have the capability to use the eDP spec the way AMD can. This came from AMD, who claimed nvidia just didn't have that ability, and that this must be why nvidia developed gsync. This claim, if it is really true, pretty much settles the debate. If nvidia, the company that serves two-thirds of the dGPU market, doesn't have this special capability that AMD claims to have had since 2009, then it completely trashes the idea that freesync can and will destroy gsync. No one has stepped up to claim they can do freesync, no one but AMD. And they say it is because they have an ability that even nvidia doesn't have. So much for all the claims that Intel will follow. Heck, Intel buys their graphics IP from nvidia.

It's just really far-fetched to say gsync is dead because AMD will try to copy the feature. It looks much more likely that it will be a case of AMD's freesync vs nvidia's gsync. No one else. There is no indication otherwise except wild imagination.

There is a long road from having the spec capabilities to having them work well enough to give a gsync-like experience. AMD claimed very boldly that nvidia doesn't even have the ability to use the eDP spec like they do. No one else is saying they have this capability either. Just AMD. So the more you dig into it, the clearer it becomes: the idea that the deck is stacked against gsync just doesn't hold up.

Freesync will be an AMD product and gsync is nvidia's. There is no indication whatsoever that it will be gsync against the world.

So I just want to make it very clear: your prior claims of gsync being dead are complete fabrication.

I am very hopeful that freesync is good, and I have nothing against AMD coming up with their own version of the feature. I do have issues with crazy, over-the-top, unfounded claims, and with the far-fetched being passed off as inevitable.
 

SoulWager

Member
Jan 23, 2013
155
0
71
The Wright brothers designed their new prototype in a way that lets it reach a speed of 40 mph, which, after attaching wings, can create enough lift to take off with an adult man aboard! We gathered today to see if the machine can reach the required speed!
*The prototype reaches 40 mph, then stops. The Wright brothers celebrate!*
- From the crowd: "But it didn't lift an inch! Liars!"
- It can't lift off because we haven't attached the wings yet.
- Another lie! The prototype didn't accelerate to 40 mph. They blew air on it, pulled it with an invisible rope, and measured the speed of cars going down the nearby road! Liars!

Some people...
It was a proof of concept. It worked. They didn't need to change the refresh rate in display settings; it adjusted by itself. Presentation over, job done. What is so hard to get here?
A more suitable comparison would be driving a car at 40 mph with a propeller taped to the front. You realize basically any game can change resolution and refresh rate from inside the game, right? They just bound a key to it. Switching between two fixed refresh rates isn't a demonstration unless they can do it in microseconds, and for every single frame.
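For the record, what a real demo would have to show is per-frame behavior, something like this loop (an entirely hypothetical sketch with timings I made up, not AMD's implementation): the monitor holds vblank until the next frame is ready, within the panel's limits, so scanout starts the moment a frame finishes.

Code:
# Hypothetical per-frame variable refresh: scanout begins when the
# frame is ready, with vblank stretched within panel limits, instead
# of switching between fixed modes. All timings are made up.
MIN_PERIOD_MS = 1000 / 144   # panel's fastest allowed refresh
MAX_PERIOD_MS = 1000 / 30    # slowest before the panel must self-refresh

def next_scanout(last_scanout_ms, frame_ready_ms):
    """Start scanout when the frame is ready, clamped to panel limits."""
    earliest = last_scanout_ms + MIN_PERIOD_MS
    latest = last_scanout_ms + MAX_PERIOD_MS
    return min(max(frame_ready_ms, earliest), latest)

t = 0.0
for ready in (14.2, 31.0, 55.7, 60.1):   # varying frame-completion times
    t = next_scanout(t, ready)
    print(f"frame ready {ready:5.1f} ms -> scanout at {t:5.1f} ms")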
 

Eymar

Golden Member
Aug 30, 2001
1,646
14
91
A more suitable comparison would be driving a car at 40 mph with a propeller taped to the front. You realize basically any game can change resolution and refresh rate from inside the game, right? They just bound a key to it. Switching between two fixed refresh rates isn't a demonstration unless they can do it in microseconds, and for every single frame.

You can't switch refresh rates on most monitors I've ever had without the screen blanking out for a few seconds. I have to agree that without more details there is no proof both monitors were set at a 60Hz desktop refresh rate and that AMD was able to make the FreeSync-enabled monitor display 50fps at 50Hz (not 50fps at a 60Hz refresh rate) by extending VBLANK so every frame takes 20ms. However, I would like to believe so, as it would make FreeSync a viable GPU-controlled refresh rate solution. I myself am most likely to buy a GSYNC monitor (a 4K ROG Swift would be very nice), but I'd like to have more choices for sure.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
They used vblank to set the refresh rate, which proved their concept.

At least that was my understanding. Either way, the reactions in this thread are why NDAs exist, and why companies usually keep new products close to the chest.

It just goes to show how incompetent AMD's marketing team is.

So they proved their concept with a non-variable refresh rate?

Come on, man.

Now I'm pretty sure the final product won't be like that, even if it's not going to be here for a while, but jeez. I don't think anyone would expect a variable refresh demo to actually be a static refresh rate. Common sense dictates that the killer app for variable refresh is gaming: eliminating tearing and input lag at low framerates, with the monitor constantly adjusting its refresh on the fly to the varying FPS output of the GPU.

Either AMD is stupid to create a demo without a varying framerate, or they just weren't ready. It's that simple; don't make it something that it isn't.

Like I said, though, I'm sure AMD wouldn't allow a final spec to be anything like that. But you'd at least think a tech demo would demonstrate, you know, a variable refresh instead of a fixed one. In real-world gaming a framerate is generally not fixed at anything; it varies up and down constantly. That is exactly what variable refresh is meant to fix: tearing and input lag while the framerate varies.

We'll see soon enough how it fares, but I'm most interested in whether AMD has an answer to ULMB or LightBoost. I don't think they can simply copy those, since they were created by NV, and they're a huge benefit for competitive and high-framerate gaming. While g-sync and variable refresh serve lower-framerate gaming, together the two cover the entire gamut.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Why do people feel the need to constantly respond to things I haven't said and then act like they won some type of victory?

The concept that they proved was that they could use the vblank signal, in controllers that were already available, to change the refresh rate of the monitor.

From the press releases, I never read anything that stated "here's a monitor with variable frame rates and the refresh rates match exactly." Whether that is what was reported by tech sites looking for page hits is another matter entirely.
 

SoulWager

Member
Jan 23, 2013
155
0
71
Why do people feel the need to constantly respond to things I haven't said and then act like they won some type of victory?

The concept that they proved was that they could use the vblank signal, in controllers that were already available, to change the refresh rate of the monitor.

From the press releases, I never read anything that stated "here's a monitor with variable frame rates and the refresh rates match exactly." Whether that is what was reported by tech sites looking for page hits is another matter entirely.
You're writing fan fiction to fill AMD's plot holes. The demos were represented to the press as an implementation of variable refresh. AMD said variable refresh could be done via ignorable MSA timings, but there is no evidence to show that it's being done in their demos.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Why do people feel the need to constantly respond to things I haven't said and then act like they won some type of victory?

The concept that they proved was that they could use the vblank signal, in controllers that were already available, to change the refresh rate of the monitor.

From the press releases, I never read anything that stated "here's a monitor with variable frame rates and the refresh rates match exactly." Whether that is what was reported by tech sites looking for page hits is another matter entirely.

Using vblank to change from one static refresh rate to another static refresh rate is very, very different from changing the interval between frame scans on a dynamic, frame-by-frame basis. The latter is what AMD claimed, the former is what AMD demonstrated. "Here's a monitor with variable frame rates and the refresh rates match exactly" is PRECISELY what AMD has claimed, every time. That is, literally, FreeSync.
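Put hypothetical numbers on the two behaviors and the difference is obvious (made-up frame times, not measurements of any demo):

Code:
# Made-up frame times from a varying game workload, in ms.
frame_times_ms = [18.3, 22.1, 19.7, 25.4, 16.9]

# What was demonstrated: a static 50 Hz mode, so every scan takes
# 20 ms regardless of when frames actually finish.
static_scan_intervals = [20.0 for _ in frame_times_ms]

# What was claimed: the scan interval tracks each frame individually.
dynamic_scan_intervals = list(frame_times_ms)

print("static :", static_scan_intervals)
print("dynamic:", dynamic_scan_intervals)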
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
You can't switch refresh rates on most monitors I've ever had without the screen blanking out for a few seconds. I have to agree that without more details there is no proof both monitors were set at a 60Hz desktop refresh rate and that AMD was able to make the FreeSync-enabled monitor display 50fps at 50Hz (not 50fps at a 60Hz refresh rate) by extending VBLANK so every frame takes 20ms. However, I would like to believe so, as it would make FreeSync a viable GPU-controlled refresh rate solution. I myself am most likely to buy a GSYNC monitor (a 4K ROG Swift would be very nice), but I'd like to have more choices for sure.

You disable Vsync when using FreeSync. It's not at all the same as setting Vsync at different frequencies. Vsync is off; the graphics card simply tells the monitor to refresh when it has a frame ready.
 

Pantalaimon

Senior member
Feb 6, 2006
341
40
91
You're writing fan fiction to fill AMD's plot holes. The demos were represented to the press as an implementation of variable refresh. AMD said variable refresh could be done via ignorable MSA timings, but there is no evidence to show that it's being done in their demos.

Does it matter at this point whether they actually proved it in that demo? If FreeSync will sample next month and start selling next year, then it sounds like the feature works. How well it works will be determined when the products are on the market and people can test them.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Does it matter at this point whether they actually proved it in that demo? If FreeSync will sample next month and start selling next year, then it sounds like the feature works. How well it works will be determined when the products are on the market and people can test them.

It matters if you are trying to discredit Freesync and get people to buy a Gsync monitor. Other than that, I can't think of any other reason to be so adamantly against it.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It matters if you are trying to discredit Freesync and get people to buy a Gsync monitor. Other than that, I can't think of any other reason to be so adamantly against it.

Even if Freesync is not mainstream yet, it doesn't really matter much, because frankly G-Sync itself has not proven to be a good enough alternative, for the following reasons:

1) It's available only in TN panels, and out of that already small selection of monitors, there is ONLY one that is even good -- the $800 Asus PG278Q 27". Are you kidding me? The price is too high and the screen is way too small for many of us who have been PC gaming on larger LCDs/30" monitors and plasma screens for 5-6 years. When 27" Korean IPS 2560x1440 monitors go for $300-400, $800 for a TN GSync monitor is just appalling, even from a prestigious brand like Asus. :whistle:

2) We all know that 4K is the future, so buying a 2560x1440 $800+ panel only to replace it in 2-3 years with a 4K monitor seems like a stop-gap solution. There isn't a single good 4K monitor with G-Sync yet. And even when one comes out, if a 27" TN 2560x1440 costs $800, how much will a 4K G-Sync IPS one cost?

3) G-Sync => you spend $$$ and lock yourself to NV for the useful life of a monitor, which for most of us is 6-8 years (!!!). Not sure if serious. Maybe NV will produce the best GPUs for the next 8 years, maybe not, but buying a G-Sync monitor more or less means you will be paying more for every single GPU upgrade for the next 6-8 years, since you don't have the choice of buying AMD. For example, right now I can buy two used after-market 290s for $600-700 vs. $1,200 for GTX 780 Ti SLI that can even be much slower, which is unacceptable. Can NV guarantee that for the next 6-8 years they'll offer very similar price/performance to AMD every generation, or be faster in every single game if I'm paying a premium? Nope, they can't.

#1 and #2 should be addressed over time, but not #3. I really have a problem with being locked into one vendor, especially one that overcharges for its products, and the price increases have only gotten worse in the last 5 years. I want an industry open standard so that I can pick and choose what GPU I want to buy.
 

SoulWager

Member
Jan 23, 2013
155
0
71
Does it matter at this point whether they actually proved it in that demo? If FreeSync will sample next month and start selling next year, then it sounds like the feature works. How well it works will be determined when the products are on the market and people can test them.
Like it or not, the status of the tech 6 months ago has implications for the price, performance, and release date. I'm just saying you shouldn't get your hopes up or make plans around a freesync monitor until we see some actual testing and reviews. It sounds like AMD will be getting, next month, to the point where a lot of people thought they were 6 months ago.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
Like it or not, the status of the tech 6 months ago has implications for the price, performance, and release date. I'm just saying you shouldn't get your hopes up or make plans around a freesync monitor until we see some actual testing and reviews. It sounds like AMD will be getting, next month, to the point where a lot of people thought they were 6 months ago.

6 months ago was February; the CES laptop demo was a month before that. They announced their intention to do this in November 2013. Everything pointed to mid-to-late 2015, or even early 2016, for any actual product. To be sampling monitors in September 2014 is pretty good going IMO, especially if the samples can show the promised functionality. Nvidia themselves said it took them two years to get to the point where they unveiled their g-sync tech.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Like it or not, the status of the tech 6 months ago has implications for the price, performance, and release date. I'm just saying you shouldn't get your hopes up or make plans around a freesync monitor until we see some actual testing and reviews. It sounds like AMD will be getting, next month, to the point where a lot of people thought they were 6 months ago.


Who? Six months ago we didn't even have a DP1.2a standard.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
VESA said "this technology" has existed since way before 1.2a.
So where can I see it in action?
 


wand3r3r

Diamond Member
May 16, 2008
3,180
0
0
Even if Freesync is not mainstream yet, it doesn't really matter much, because frankly G-Sync itself has not proven to be a good enough alternative, for the following reasons:

1) It's available only in TN panels, and out of that already small selection of monitors, there is ONLY one that is even good -- the $800 Asus PG278Q 27". Are you kidding me? The price is too high and the screen is way too small for many of us who have been PC gaming on larger LCDs/30" monitors and plasma screens for 5-6 years. When 27" Korean IPS 2560x1440 monitors go for $300-400, $800 for a TN GSync monitor is just appalling, even from a prestigious brand like Asus. :whistle:

2) We all know that 4K is the future, so buying a 2560x1440 $800+ panel only to replace it in 2-3 years with a 4K monitor seems like a stop-gap solution. There isn't a single good 4K monitor with G-Sync yet. And even when one comes out, if a 27" TN 2560x1440 costs $800, how much will a 4K G-Sync IPS one cost?

3) G-Sync => you spend $$$ and lock yourself to NV for the useful life of a monitor, which for most of us is 6-8 years (!!!). Not sure if serious. Maybe NV will produce the best GPUs for the next 8 years, maybe not, but buying a G-Sync monitor more or less means you will be paying more for every single GPU upgrade for the next 6-8 years, since you don't have the choice of buying AMD. For example, right now I can buy two used after-market 290s for $600-700 vs. $1,200 for GTX 780 Ti SLI that can even be much slower, which is unacceptable. Can NV guarantee that for the next 6-8 years they'll offer very similar price/performance to AMD every generation, or be faster in every single game if I'm paying a premium? Nope, they can't.

#1 and #2 should be addressed over time, but not #3. I really have a problem with being locked into one vendor, especially one that overcharges for its products, and the price increases have only gotten worse in the last 5 years. I want an industry open standard so that I can pick and choose what GPU I want to buy.

Nice points.

As far as the vendor lock-in goes, we don't know if freesync, or rather adaptive-sync, will be implemented by NV, so it may be AMD-only, at least initially. Since I am expecting freesync/adaptive-sync monitors to not carry a price premium, I would say this is very acceptable (presuming you are not paying much, if anything, for it). Paying $200 for gsync with the vendor lock-in is a joke imo.

I agree, it's a joke to get a low-resolution 1440p screen for more than 4K screens cost. My monitor upgrade will certainly be 4K, and the prices are dropping rapidly! A year ago people said we wouldn't see $1k prices on 4K screens this year, and we've already seen almost half of that ($600).
 

davie jambo

Senior member
Feb 13, 2014
380
1
0
It's the same idea as the Nvidia one: just get the GPU to tell the monitor when to display a frame. Not sure why you are all getting your knickers in a twist.

I was going to buy a monitor and try it out for myself when it arrived, but I just read that my graphics card(s) do not support it.

Quite annoyed.
 