FreeSync monitors to start releasing in November


Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
Backlight pulsing is extremely cheap to implement, and any monitor manufacturer can implement it. You have to have fast pixel transition times so you don't have two images on the panel at the moment you pulse the backlight. There are some tricks that can make it look better, but it's not really a difficult technology to implement.

It would be much more expensive to make it work at the same time as variable refresh, not so much in per-unit cost but in research and engineering costs. You can't use backlight pulsing at low framerates because you'd get very noticeable flicker, and you'd have to modify the pulse width on a per-frame basis in order to get constant apparent brightness when pulsing the backlight at a changing refresh rate.
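As a rough illustration of that last point, here is a toy sketch (the duty cycle, minimum pulse and helper name are made up for the example, not taken from any actual monitor's firmware) of how the strobe's on-time would have to track each frame's duration to keep apparent brightness constant:

# Toy sketch (illustrative numbers only, not any vendor's firmware): to keep
# apparent brightness constant while the refresh interval changes, the strobe's
# on-time has to stay a fixed fraction of each frame's duration.

def strobe_pulse_width_ms(frame_time_ms, duty_cycle=0.10, min_pulse_ms=0.5):
    """How long the backlight stays lit for a frame of the given length.

    duty_cycle:   fraction of the frame the backlight is on (sets brightness)
    min_pulse_ms: floor so very short frames still produce a visible flash
    """
    return max(frame_time_ms * duty_cycle, min_pulse_ms)

# With a fixed 120 Hz refresh the pulse is the same every frame...
print(strobe_pulse_width_ms(1000 / 120))  # ~0.83 ms

# ...but with variable refresh it must be recomputed per frame, and at low
# framerates the strobe frequency itself drops into visible-flicker territory,
# which is why strobing and variable refresh are hard to combine.
for fps in (144, 90, 60, 40):
    frame_ms = 1000 / fps
    print(f"{fps:>3} fps -> frame {frame_ms:5.1f} ms, pulse {strobe_pulse_width_ms(frame_ms):.2f} ms")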


Interesting, thanks. I remember when Mark of BlurBusters was first developing his scanning backlight hack, he concluded that it could be done with an inexpensive LED off eBay. However, I thought I also read that it becomes more expensive once you use backlights that strobe together with the panel's refresh (top to bottom?), because that's a more nuanced strobe than just strobing the entire screen at once. When you refer to 'pulsing' the backlight, I assume you are referring to strobing the entire screen at once?
 

beginner99

Diamond Member
Jun 2, 2009
5,314
1,756
136
Backlight pulsing can be done on any monitor with fast pixel transition times, and it's pretty cheap to implement. I expect to see it on the vast majority of 120Hz+ displays, regardless of whether they implement variable refresh.

Yeah, such models already exist, and here is the advantage of FreeSync monitors with pulsing: the pulsing should work with NV GPUs too. So if you switch from AMD to NV you only lose FreeSync.

No, they would be making AMD more competitive by supporting AMD's standard ("open" or not) rather than their own. This makes no business sense for the company - especially considering their first-to-market status and the unknown/unclear feature set associated with FreeSync.

Exactly. Because if NV supported FreeSync you could freely switch between AMD and NV GPUs and not lose it or the pulsing mode. If you buy a G-Sync and ULMB display, you're locked into NV, whereas with a FreeSync display you only lose FreeSync when going from AMD to NV.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Most G-Sync monitors (those that can do 120Hz) also include the option for ULMB to further improve motion clarity (cannot be used in conjunction with variable [G]sync), something AMD has never addressed AFAIK, but that technology could be implemented by the monitor manufacturers on an individual basis.

Is this badly worded, or is it my English? You first make it sound like you can boost the image quality to 11 thanks to G-Sync and ULMB, while the disclaimer doesn't say in a straightforward way that you can't have both at the same time.

Reads like it's copy/pasta from NV marketing.


Anyway, the most important difference is that G-Sync is a vendor-locked, expensive implementation, while FreeSync is an open standard for every GPU manufacturer to use.
If NV decides their expensive approach is a no-go, or it has a major flaw, and support is dropped in the next-gen GPUs, then you have a plain TN display that cost as much as a 4K screen.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Yeah, such models already exist, and here is the advantage of FreeSync monitors with pulsing: the pulsing should work with NV GPUs too. So if you switch from AMD to NV you only lose FreeSync.



Exactly. Because if NV supported FreeSync you could freely switch between AMD and NV GPUs and not lose it or the pulsing mode. If you buy a G-Sync and ULMB display, you're locked into NV, whereas with a FreeSync display you only lose FreeSync when going from AMD to NV.

Which makes one think a little more about the next purchase.

What actually happened? Not only does NV lock you in with their own implementation, they also locked their consumers out of using FreeSync, the AMD-developed standard (Adaptive-Sync).

Even if AMD didn't want that, FreeSync (Adaptive-Sync) is only supported by AMD, which makes it artificially vendor locked for now, thanks to NV.
Why would anyone support a company that disables features and blocks consumers from using what they paid for? Who in their right mind would support such anti-consumer practices?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Which makes one think a little more about the next purchase.

What actually happened? Not only does NV lock you in with their own implementation, they also locked their consumers out of using FreeSync, the AMD-developed standard (Adaptive-Sync).

Even if AMD didn't want that, FreeSync (Adaptive-Sync) is only supported by AMD, which makes it artificially vendor locked for now, thanks to NV.
Why would anyone support a company that disables features and blocks consumers from using what they paid for? Who in their right mind would support such anti-consumer practices?

And you forgot Intel is not supporting any of those.

And who disables features and blocks consumers again? Only some AMD cards even support DP1.2a. There is no DP1.2a support in Kepler, Maxwell, Haswell, Broadwell, Skylake, etc. So just drop the FUD. It's not artificial.

And has FreeSync even been reviewed yet? It seems FreeSync was more an attempt to destroy the market than to solve the issue. A status quo, if you like. I think people will be badly surprised by how it actually turns out if they think they can get FreeSync-capable monitors without big premiums just to avoid the G-Sync cost.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
One has to be delusional to think a value-oriented brand such as AMD is going to command high premiums on its features. It's also naive to think that a cheaper-to-implement feature will cost as much as a much more expensive one.

The lack of the latest connectivity in new and yet-to-be-released products from the competition is another reason not to buy them. Thank you for pointing that out, but that is not the topic here.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
One has to be delusional to think a value-oriented brand such as AMD is going to command high premiums on its features. It's also naive to think that a cheaper-to-implement feature will cost as much as a much more expensive one.

It's not AMD who is going to take the premiums. Just as with G-Sync, the final price has nothing to do with nVidia as such.

Not to mention, FreeSync and G-Sync are not the same. And we still have to see FreeSync reviewed.

The lack of the latest connectivity in new and yet-to-be-released products from the competition is another reason not to buy them. Thank you for pointing that out, but that is not the topic here.

True, but AMD may some day in the future support HDMI 2.0. But that's no reason to exclude them, is it?
 

beginner99

Diamond Member
Jun 2, 2009
5,314
1,756
136
One has to be delusional to think a value-oriented brand such as AMD is going to command high premiums on its features. It's also naive to think that a cheaper-to-implement feature will cost as much as a much more expensive one.

But consider that the display manufacturer must implement support for the feature (not mandatory) and any feature must be tested. Since it will be for a limited market (gaming), fewer units will be sold over which the cost can be spread. So they will clearly cost more than normal monitors. And all they need to do is be cheaper than the ROG Swift. So it would not surprise me if FreeSync, pulsed "2K" monitors end up around $50-$80 less than the ROG Swift and still way more expensive than a normal display. It has nothing to do with AMD, but with the display manufacturers.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
It's not AMD who is going to take the premiums. Just as with G-Sync, the final price has nothing to do with nVidia as such.

Not to mention, FreeSync and G-Sync are not the same. And we still have to see FreeSync reviewed.
Really? The $200 price tag for the DIY kit says quite the opposite.

True, but AMD may some day in the future support HDMI 2.0. But that's no reason to exclude them, is it?

It's quite different to not support tech that was released after the card than to not support available tech in products that are still in development.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Really? The $200 price tag for the DIY kit says quite the opposite.

The G-Sync kit was a low-volume, temporary product for older monitors.

It's all about volume. Without large volume it will never be cheap. And if the buying crowd is willing to pay for G-Sync and FreeSync, then the monitor manufacturers will take a large premium for it. That's simple capitalism.

It's quite different to not support tech that was released after the card than to not support available tech in products that are still in development.

HDMI 2.0 was released a year ago. And Maxwell2 is the only GPU supporting it. No support for HDMI 2.0 from AMD or Intel before a good 6-9 months from now. And you are for example going to see the exact same case with DP 1.3.

And how long ago was it that the non-mandatory DP1.2a spec was released?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
It is our current understanding that the software architecture of select games may not be compatible with dynamic refresh rate technology like Project FreeSync. In these instances, users will be able to toggle the activation of FreeSync in the AMD Catalyst™ driver.
Any bets on how many of the FPS-capped and locked console ports will suffer from this? Since they have a tendency to tie game progress to FPS.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
The G-Sync kit was a low-volume, temporary product for older monitors.

It's all about volume. Without large volume it will never be cheap. And if the buying crowd is willing to pay for G-Sync and FreeSync, then the monitor manufacturers will take a large premium for it. That's simple capitalism.
Isn't the G-Sync DIY kit the same thing they put into G-Sync displays at the factory?

Any bets on how many of the FPS-capped and locked console ports will suffer from this? Since they have a tendency to tie game progress to FPS.
It will reduce the refresh rate to match the fps. If it's locked at 30, the refresh will most likely be 60Hz.
If you're dropping below 30 fps, the synchronization should still keep the refresh rate matched to the framerate.
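Roughly the behaviour being described, as a purely illustrative sketch: the 40-144Hz panel limits and the helper pick_refresh_hz are invented for the example, since neither AMD nor NVIDIA has published their exact low-framerate handling.

# Purely illustrative sketch of the idea above (not AMD's or NVIDIA's actual
# driver logic): keep the panel inside its supported variable-refresh window
# by repeating each frame when the game's framerate falls below the minimum.

def pick_refresh_hz(fps, panel_min_hz=40, panel_max_hz=144):
    """Choose a refresh rate for the current frame rate.

    Within the panel's range, refresh simply tracks fps. Below the minimum,
    show each frame multiple times so the effective refresh stays in range
    (e.g. a 30 fps cap would be displayed at 60 Hz, two scans per frame).
    """
    if fps >= panel_max_hz:
        return panel_max_hz, 1
    multiplier = 1
    while fps * multiplier < panel_min_hz:
        multiplier += 1
    return min(fps * multiplier, panel_max_hz), multiplier

for fps in (120, 60, 30, 24):
    hz, mult = pick_refresh_hz(fps)
    print(f"{fps:>3} fps -> {hz} Hz ({mult} scan(s) per frame)")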
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
It's quite different to not support tech that was released after the card than to not support available tech in products that are still in development.

So, why doesn't AMD support HDMI 2.0 with Tonga? :hmm:

It should be obvious that Adaptive-Sync for gaming is an AMD feature and only compatible with AMD's GPUs for the next few years.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
It seems FreeSync was more an attempt to destroy the market than to solve the issue. A status quo, if you like. I think people will be badly surprised by how it actually turns out if they think they can get FreeSync-capable monitors without big premiums just to avoid the G-Sync cost.
This is absolutely ridiculous. So now Freesync is nothing but an attempt to destroy the market? Really?

AMD saw a way to improve gaming, just as Nvidia did. But instead of coming out with a vendor locked design that would cost consumers an additional $200 on top of the cost of a new monitor, AMD found a way to cheaply replicate the results with an updated scaler and GPU. It was such a good idea that VESA included it as an optional spec for DP1.2a which makes it free for anyone who wishes to use it, Nvidia included.

AMD set out a timeline for delivery of adaptive-sync capable monitors of (I believe) Q1 2015 and now we are getting reports that the first monitors may actually come out next month.

So tell us, which design is better for consumers worldwide? Nvidia with their royalty-generating, vendor-locked +$200 G-Sync module, or AMD with their free, industry-standard DP1.2a scaler?

The answer is pretty obvious.
 

DigDog

Lifer
Jun 3, 2011
14,368
2,830
126
As an NVidia enthusiast, I can easily say that:

1) after the reviews
2) if the price is right
3) if I can get performance similar to G-Sync

I'll buy a FreeSync monitor and AMD card (if I need it).

I don't care who makes it, as long as I get high-refresh-rate, lag-free FPS gaming.

I always liked NV products, but if AMD delivers at a lower price range, why not?

And if AMD delivers *nearly* as good at a really convenient price, I'll still go the cheaper route. Compared to this crappy Hannsspree monitor I have now, it will be a joy anyway.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
So tell us, which design is better for consumers worldwide? Nvidia with their royalty-generating, vendor-locked +$200 G-Sync module, or AMD with their free, industry-standard DP1.2a scaler?

The answer is pretty obvious.

Right. G-Sync.

Vaporware is never better than an existing and buyable technology.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
This is absolutely ridiculous. So now Freesync is nothing but an attempt to destroy the market? Really?

AMD saw a way to improve gaming, just as Nvidia did. But instead of coming out with a vendor locked design that would cost consumers an additional $200 on top of the cost of a new monitor, AMD found a way to cheaply replicate the results with an updated scaler and GPU. It was such a good idea that VESA included it as an optional spec for DP1.2a which makes it free for anyone who wishes to use it, Nvidia included.

AMD set out a timeline for delivery of adaptive-sync capable monitors of (I believe) Q1 2015 and now we are getting reports that the first monitors may actually come out next month.

So tell us, which design is better for consumers worldwide? Nvidia with their royalty-generating, vendor-locked +$200 G-Sync module, or AMD with their free, industry-standard DP1.2a scaler?

The answer is pretty obvious.

The $200 G-Sync cost is based on a DIY FPGA design, not an ASIC. With an ASIC, the cost is estimated to go down to around $40.

Also, FreeSync is not equal to G-Sync. So don't try to compare them as two equals, because they simply aren't. And we haven't even gotten a review of FreeSync yet, kinda odd, isn't it? But again, its PR value was mainly to take some air out of G-Sync.

Remember this?
http://www.anandtech.com/show/7641/amd-demonstrates-freesync-free-gsync-alternative-at-ces-2014

6 months later:
http://www.techspot.com/news/57023-amd-demos-freesync-on-hacked-monitor-at-computex.html

Why no reviews or testing? AMD saw G-Sync and panicked while trying to scramble something of their own together. That's what happened.

And with the optional DP1.2a and DP1.3 feature that nobody but AMD will support until 2016 or later (if ever), it is the same as a vendor lock.
 

Creig

Diamond Member
Oct 9, 1999
5,170
13
81
Right. G-Sync.

Vaporware is never better than an existing and buyable technology.
You obviously are confused as to what the term 'vaporware' means:

"In the computer industry, vaporware (or vapourware, see spelling differences) is a product, typically computer hardware or software, that is announced to the general public but is never actually manufactured nor officially cancelled."

http://en.wikipedia.org/wiki/Vaporware

FreeSync was pre-announced to possibly be available by Q1 2015. And so far, they appear to be on schedule. Therefore, it is not vaporware.
 

Pariah

Elite Member
Apr 16, 2000
7,357
20
81
This is absolutely ridiculous. So now Freesync is nothing but an attempt to destroy the market? Really?

AMD saw a way to improve gaming, just as Nvidia did. But instead of coming out with a vendor locked design that would cost consumers an additional $200 on top of the cost of a new monitor, AMD found a way to cheaply replicate the results with an updated scaler and GPU. It was such a good idea that VESA included it as an optional spec for DP1.2a which makes it free for anyone who wishes to use it, Nvidia included.

AMD set out a timeline for delivery of adaptive-sync capable monitors of (I believe) Q1 2015 and now we are getting reports that the first monitors may actually come out next month.

So tell us, which design is better for consumers worldwide? Nvidia with their royalty-generating, vendor-locked +$200 G-Sync module, or AMD with their free, industry-standard DP1.2a scaler?

The answer is pretty obvious.

AMD has pulled this crap at least once before with a major technology. Back when Intel developed the IA-64 architecture in an attempt to advance the industry in one giant step while ditching 20 years of band-aid-repaired legacy garbage, AMD decided to develop the 64-bit extensions for x86, adding even more band-aids to a dead corpse. This decision was horrendous for the industry, as it killed the opportunity to truly advance it.

Just because it is free doesn't make it better. AMD has no choice but to make it free, since they don't have the clout to release a version that costs money, especially not after Nvidia had already reached the market with their standard.

The answer is pretty obvious? How can you make that claim without having seen a working example? AMD doesn't exactly have the best track record for stable/predictable software. Some of us are willing to pay for superior technology compared to inferior free options. I don't know which version will be better, but I will wait until they are both on the market before I declare a winner, like you already have.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
AMD has pulled this crap at least once before with a major technology. Back when Intel developed the IA-64 architecture in an attempt to advance the industry in one giant step while ditching 20 years of band-aid-repaired legacy garbage, AMD decided to develop the 64-bit extensions for x86, adding even more band-aids to a dead corpse. This decision was horrendous for the industry, as it killed the opportunity to truly advance it.
You can't be serious. :| Also it was originally developed by Hewlett-Packard.
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
Take the off-topic discussion of CPUs over to the CPUs forum, and get this thread back on topic.
-- stahlhart
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
You obviously are confused as to what the term 'vaporware' means:

"In the computer industry, vaporware (or vapourware, see spelling differences) is a product, typically computer hardware or software, that is announced to the general public but is never actually manufactured nor officially cancelled."

http://en.wikipedia.org/wiki/Vaporware

FreeSync was pre-announced to possibly be available by Q1 2015. And so far, they appear to be on schedule. Therefore, it is not vaporware.

FreeSync has never been live demoed. There are no hands-on previews 10 months after CES. There are PowerPoint slides and videos of displays showing a static frametime.

You don't know if FreeSync works as advertised. So how could it be better than G-Sync?
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
FreeSync has never been live demoed. There are no hands-on previews 10 months after CES. There are PowerPoint slides and videos of displays showing a static frametime.

You don't know if FreeSync works as advertised. So how could it be better than G-Sync?

Call me cynical, but that's still my problem with FreeSync - it's still PowerPoint marketing. No game has ever been demoed with it. Now, AMD has a history of disruptive PowerPoint marketing (Bullet physics, AMD 3D or whatever that was called) that comes to nothing. Nvidia releases something, AMD says yes we've got that too and ours is open(tm), they show some slides, and a year later it's quietly forgotten about.

If it's really being released this soon, surely AMD will have proper demos (with games, not fixed-fps videos of windmills) - I mean, both of the previous windmill videos claim the hardware is there to do it, so why are there no real demos of games running at variable fps?