[TR] FreeSync monitors will sample next month, start selling next year

Page 5 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
Status
Not open for further replies.

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
The obvious one: $$$

Let's see how strong adoption is among folks before leaping to conclusions about its place in the market.
But every one of the pro-G-Sync peeps in these threads has spent the $250-2k extra to be running NV high-end cards, so buying an $800 TN G-Sync monitor shouldn't be an issue for any anti-AS poster here.
Maybe they should just update their sigs.
 

MathMan

Member
Jul 7, 2011
93
0
0
You are all very confused about my post based upon your apparent hatred of all things amd.
No hatred. Just realism.

Nvidia will eventually be supporting a VESA standard (adaptive sync) not an AMD driver (freesync) that you are all apparently confused about.
Why would Nvidia be supporting that? What makes you so sure? It's optional: if they're connected to an AS monitor, they can ignore it completely if they want to.

My comment about the quality and resolution still stands. Why buy a 1440p gsync monitor when 4k is coming?
Because gaming at 4K requires far heavier GPUs than most people have? Because 4K 144 Hz panels don't exist? Because there isn't even an interface standard that can carry 4K at 144 Hz over a single DP cable? Because 2560x1440 is the sweet spot right now?

Are those enough reasons?

There will be 4k versions of gsync and adaptive sync. Buying 1440p now would be stupid when most users keep monitors for much longer than their other hardware.
What you're saying is: those sold-out Swift monitors in Europe were all bought by idiots who don't know any better. But maybe they had the smarts, lacking in others, to see that 4K is just too heavy, with refresh rates too low, for current GPUs? Could that be it?

Now, finally, on to Intel and ARM. Why would they enter into a cross-licensing agreement with Nvidia, setting themselves up for potential lawsuits later on if the deal goes south, when they could just use the open VESA standard? Adaptive Sync's potential to reduce power usage is why I believe Intel and ARM chip makers will support it.
Nvidia themselves have said in the past that they're not interested in licensing. So I don't get why you're bringing that up.

As for lower power: who cares for desktop usage, where the power is in the backlight? That's not going to change with any kind of sync.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
It's pretty obvious that chip builders don't care about desktop at all, let alone its power usage.

VESA has an optional standard that could let an ultrabook or tablet gain extra battery life, and you don't think Intel and ARM manufacturers are interested?

I'm really not sure why I'm trying to use rational thought in VC&G when FUD, slander, marketing, and accusations of wrongdoing are all half the people in here talk about.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
It's pretty obvious that chip builders don't care about desktop at all, let alone its power usage.

VESA has an optional standard that could let an ultrabook or tablet gain extra battery life, and you don't think Intel and ARM manufacturers are interested?

I'm really not sure why I'm trying to use rational thought in VC&G when FUD, slander, marketing, and accusations of wrongdoing are all half the people in here talk about.

The ignore list is super useful; half the threads are just blanked out for me. What I don't get is the vitriol against FreeSync and towards AMD.

Just how much do you guys think AMD has control over? AMD saying FreeSync is free doesn't translate to the display manufacturers also saying it's free. Think about it... that's exactly why they don't want their brands shown in demos.
 
Last edited:

Mand

Senior member
Jan 13, 2014
664
0
0
The ignore list is super useful; half the threads are just blanked out for me. What I don't get is the vitriol against FreeSync and towards AMD.

Just how much do you guys think AMD has control over? AMD saying FreeSync is free doesn't translate to the display manufacturers also saying it's free. Think about it... that's exactly why they don't want their brands shown in demos.


The reason for the vitriol is that AMD has been proven to have been lying, repeatedly, from the very start, about just about everything relating to FreeSync.

Why shouldn't we react negatively to that? Why aren't you reacting negatively to that? What I don't get is the unflinching loyalty to AMD even after they lie to your face.
 

Mand

Senior member
Jan 13, 2014
664
0
0
It's pretty obvious that chip builders don't care about desktop at all, let alone its power usage.

VESA has an optional standard that could let an ultrabook or tablet gain extra battery life, and you don't think Intel and ARM manufacturers are interested?

I'm really not sure why I'm trying to use rational thought in VC&G when FUD, slander, marketing, and accusations of wrongdoing are all half the people in here talk about.

eDP adjustment of refresh rate is already being used to reduce power consumption. Switching a display to a lower refresh rate lowers power consumption, and people aren't going to care much when browsing websites at 20 Hz instead of 30 Hz. But, that's switching from one static refresh rate to another static refresh rate.

What's very different is variable refresh, where you're changing the frame interval dynamically based on how long it takes to render a frame - effectively not having a refresh rate at all. So far, that hasn't been implemented by anyone but Nvidia. It's a fundamentally superior way of displaying video, so I don't doubt that it will become universal eventually, but there hasn't been any indication that Intel or ARM has been moving toward it with any rapidity.
 

MathMan

Member
Jul 7, 2011
93
0
0
It's pretty obvious that chip builders don't care about desktop at all, let alone its power usage.
Would you care to explain? I have no idea what you're trying to say.
Which chip builders? The monitor chip makers? The ones who sell millions of chips per year for desktop monitors? Why would they not care about the desktop? It's a big part of their bread and butter.

As for the power, you ignored my statement: the largest power component in a monitor is the backlight (~80%). That part is refresh-independent; no amount of variable refresh rate is going to reduce it. The rest is mostly in the refreshing of the LCD cells, and you can't go much lower than 30 fps there, so the benefits are quite limited too. The benefit may be high for a laptop, where each watt counts, but a watt more or less in a desktop won't move the needle.
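As a rough back-of-the-envelope check of that argument (the 30 W total, the ~80% backlight share, and the linear scaling are illustrative assumptions, not measurements):

```python
# Rough estimate of how little variable refresh can save on a desktop
# monitor if the backlight dominates the power budget. All figures are
# illustrative assumptions, not measured numbers.

def refresh_power_savings(total_w=30.0, backlight_share=0.80,
                          base_hz=60, idle_hz=30):
    """Estimate watts saved by halving the refresh rate.

    Assumes the backlight draws a fixed share of total power and the
    refresh-dependent portion scales linearly with refresh rate.
    """
    refresh_w = total_w * (1.0 - backlight_share)    # ~6 W refresh-dependent
    saved_w = refresh_w * (1.0 - idle_hz / base_hz)  # only the halved part
    return saved_w

print(refresh_power_savings())  # ~3 W saved on a 30 W monitor
```

Under these assumptions, dropping from 60 Hz to 30 Hz saves about 3 W per monitor, which is why the per-watt argument matters far more for battery-powered devices than for desktops.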

So what exactly are you suggesting?

VESA has an optional standard that could let an ultrabook or tablet gain extra battery life, and you don't think Intel and ARM manufacturers are interested?
How did Ultrabooks suddenly become a topic in a Freesync discussion?
 

zebrax2

Senior member
Nov 18, 2007
977
70
91
TBH I don't see any reason why Intel would not develop their own implementation of adaptive sync. AMD is already doing the marketing and talking to the monitor manufacturers. Intel doesn't have any dGPU, and people aren't exactly in a hurry to ditch Intel processors for an AMD equivalent. At worst it would be another checkbox on the feature list.
 

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Have you ever heard of Bulldozer?

Vaporware is a term in the computer industry that describes a product, typically computer hardware or software, that is announced to the general public but is never actually released nor officially cancelled.

So no, Bulldozer was not vaporware.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
The reason for the vitriol is that AMD has been proven to have been lying, repeatedly, from the very start, about just about everything relating to FreeSync.

Why shouldn't we react negatively to that? Why aren't you reacting negatively to that? What I don't get is the unflinching loyalty to AMD even after they lie to your face.
Please post proof of them lying about freesync.
 

Abwx

Lifer
Apr 2, 2011
11,912
4,890
136
What's very different is variable refresh, where you're changing the frame interval dynamically based on how long it takes to render a frame - effectively not having a refresh rate at all. So far, that hasn't been implemented by anyone but Nvidia. It's a fundamentally superior way of displaying video, so I don't doubt that it will become universal eventually, but there hasn't been any indication that Intel or ARM has been moving toward it with any rapidity.

Using DisplayPort Adaptive-Sync, the graphics card can detect and set an appropriate maximum and minimum refresh rate based on the capabilities reported by the display.

the minimum and maximum times between the display of new frames (the vblank period) is exposed to the GPU via DisplayPort Adaptive-Sync. Because the minimum/maximum vblank period is known to the graphics card, successive frames will intelligently be sent within those boundaries. Predictive or speculative timing is not required under this model, and the GPU will adjust the display's refresh rate to match the current frame rate.
If an upcoming frame is delivered outside of the monitor's supported vblank period, that frame will be immediately presented on-screen when available to ensure the fastest possible screen update.


You have it all here, where you can see that you're making wrong statements about FreeSync, since it's obvious to me that you don't really understand how either proposition works:

http://support.amd.com/en-us/kb-articles/Pages/freesync-faq.aspx
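The behavior described in that FAQ excerpt, clamping the frame interval to the panel's reported vblank range, can be sketched roughly as follows (a simplified model for illustration, not AMD's actual driver logic; the 144 Hz / 40 Hz panel figures are assumed examples):

```python
# Simplified model of DisplayPort Adaptive-Sync frame pacing: the GPU
# knows the panel's min/max vblank interval and schedules each new frame
# within those bounds. Illustrative only, not AMD's actual driver logic.

def present_time(render_ms, min_interval_ms, max_interval_ms):
    """Return how long after the previous refresh the next frame is shown."""
    if render_ms < min_interval_ms:
        # Frame finished too fast: hold it until the panel can refresh again.
        return min_interval_ms
    if render_ms > max_interval_ms:
        # Frame missed the supported window: per the FAQ, show it on-screen
        # as soon as it is available for the fastest possible update.
        return render_ms
    # Within the supported range: refresh exactly when the frame is done.
    return render_ms

# Hypothetical 144 Hz panel with a 40 Hz floor: interval range ~6.9-25 ms.
assert present_time(5.0, 6.9, 25.0) == 6.9    # capped at max refresh rate
assert present_time(16.0, 6.9, 25.0) == 16.0  # refresh matched to frame time
assert present_time(30.0, 6.9, 25.0) == 30.0  # late frame shown immediately
```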
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Use a forum search. I've been in just about all the FreeSync discussions since the beginning, I've seen all the arguments, and I did not make these up. These are things that people are convinced are true, but aren't.

No. They are mostly exaggerations. You might have had an individual make incorrect claims; that sort of thing happens. You are definitely inventing claims simply so you can say they aren't true. AMD never said anything like what you are claiming.

A-Sync will be in every DP1.2a/DP1.3 compliant monitor
A-Sync will not add to monitor production cost, as it will be part of the normal upgrades that display manufacturers acquire from ASIC manufacturers
A-Sync will not cause monitor manufacturers to add a price premium
A-Sync will work with every GPU


For example, someone saying A-Sync is part of the DP1.2a standard, and asking why a company would not include it, doesn't translate into "it will be in every DP1.2a/DP1.3 monitor." Etc...
 

SoulWager

Member
Jan 23, 2013
155
0
71
Nobody knows how AMD's proposition works because they haven't actually shown it demonstrating the important parts of the tech. The easiest thing to screw up would be the fallback cases for min and max refresh rate. At min refresh rate, you want to use triple buffering so you don't lose performance whenever a frame finishes during a (repeated) scanout. At max refresh, you want to use double buffering, or maybe even disable v-sync in order to maintain the low latency you get in cpu limited circumstances. Nvidia sidesteps this problem with panel self refresh.
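The fallback cases described in that post can be sketched as a simple decision rule. This is a hypothetical outline of what a driver might do at each boundary, since, as the post notes, those paths had not been demonstrated; the function name and the 40/144 Hz range are assumptions for illustration:

```python
# Hypothetical sketch of the fallback buffering choices described above:
# which strategy a driver might pick depending on where the frame rate
# sits relative to the panel's variable-refresh range.

def buffering_strategy(fps, min_hz, max_hz, low_latency=False):
    if fps < min_hz:
        # Below the panel's floor: scanouts must repeat, so triple
        # buffering avoids stalling renders that finish mid-scanout.
        return "triple buffering"
    if fps > max_hz:
        # Above the panel's ceiling: double buffer, or drop v-sync
        # entirely to keep latency low in CPU-limited cases.
        return "no v-sync" if low_latency else "double buffering"
    # Inside the range: variable refresh tracks the frame rate directly.
    return "variable refresh"

assert buffering_strategy(25, 40, 144) == "triple buffering"
assert buffering_strategy(200, 40, 144) == "double buffering"
assert buffering_strategy(200, 40, 144, low_latency=True) == "no v-sync"
assert buffering_strategy(90, 40, 144) == "variable refresh"
```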
 

Atreidin

Senior member
Mar 31, 2011
464
27
86
Please post proof of them lying about freesync.

He's not going to prove anything, just dance around it. All he has to do is say they are lying over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over again, and then his goal of burning the phrase "AMD is lying" into everybody's heads will be complete.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Please post proof of them lying about freesync.

Use the search function. Lots of other threads on FreeSync.

He's not going to prove anything, just dance around it. All he has to do is say they are lying over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over again, and then his goal of burning the phrase "AMD is lying" into everybody's heads will be complete.

And you can use the search function too. The point is not invalidated just because I don't feel the need to repost the same thing every time a newcomer shows up and doesn't understand. It's all there in the threads, go read it yourself.
 

Mand

Senior member
Jan 13, 2014
664
0
0
No. They are mostly exaggerations. You might have had an individual make incorrect claims; that sort of thing happens. You are definitely inventing claims simply so you can say they aren't true. AMD never said anything like what you are claiming.

A-Sync will be in every DP1.2a/DP1.3 compliant monitor
A-Sync will not add to monitor production cost, as it will be part of the normal upgrades that display manufacturers acquire from ASIC manufacturers
A-Sync will not cause monitor manufacturers to add a price premium
A-Sync will work with every GPU


For example, someone saying A-Sync is part of the DP1.2a standard, and asking why a company would not include it, doesn't translate into "it will be in every DP1.2a/DP1.3 monitor." Etc...

I didn't say AMD claimed all of those things (they have claimed some), I said that the forum defenders of AMD claim them. And debunking the forum-generated myths is just as important.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
Nobody knows how AMD's proposition works because they haven't actually shown it demonstrating the important parts of the tech. The easiest thing to screw up would be the fallback cases for min and max refresh rate. At min refresh rate, you want to use triple buffering so you don't lose performance whenever a frame finishes during a (repeated) scanout. At max refresh, you want to use double buffering, or maybe even disable v-sync in order to maintain the low latency you get in cpu limited circumstances. Nvidia sidesteps this problem with panel self refresh.

Yet people are seriously claiming that the writing is on the wall for G-Sync.

This:

Upon connecting a FreeSync-enabled monitor to a compatible AMD Radeon™ graphics card, the minimum and maximum duration between the display of new frames (the vblank period) is exposed to the GPU via DisplayPort Adaptive-Sync. Because the minimum/maximum vblank period is known to the graphics card, successive frames will intelligently be sent within those boundaries.​
is totally and completely not the gsync approach.

There are people claiming Intel and ARM are all going to go FreeSync? Seriously? What exactly are they going to use? The new spec? Do people even know what the new spec is? It's an extension of the eDP spec for use on desktop monitors. Mobile devices already had the capability! It is absolutely meaningless in the context of G-Sync. They are not making any sense whatsoever.

The market for G-Sync is not affected by the eDP capability to lower refresh rates. This is a silly attempt to muddy up and confuse everything.

Nvidia has a specific market for G-Sync. It is the desktop PC gamer. This market is dominated by Nvidia currently. FreeSync is nowhere near making G-Sync irrelevant, and the new change in the desktop DP spec will do absolutely nothing to G-Sync by itself. The eDP spec has been there all this time and no one was using it to get a G-Sync-like experience. No one.

Why the heck would moving the eDP spec over to desktop DP all of a sudden have this massive impact? The new spec simply gives desktop the ability to do something that eDP was able to do all along. And no one used it to make a G-Sync experience, and no one is planning to try to use it on the desktop like that except AMD.

The standard change is not FreeSync. As a matter of fact, a lot more work has to be done to turn that standard into a G-Sync-like experience. The eDP standard has been out all along and AMD has not released FreeSync on their mobile hardware; this is because there is a lot more to it. The "new" ability of this spec existed in eDP for years and AMD has yet to turn it into "FreeSync". When they finally do, no one knows how good it may or may not be. It's incredible how much twisting and manipulating is being done.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
It is the desktop PC gamer. This market is dominated by Nvidia currently.

I've seen you make the same mistake before. NV only has the larger dGPU market share; that doesn't mean it dominates desktop PC gaming.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I didn't say AMD claimed all of those things (they have claimed some), I said that the forum defenders of AMD claim them. And debunking the forum-generated myths is just as important.

And I never said that you said AMD claimed all of those things. Again, you exaggerate claims to make them untrue so you can deny them. This is your MO.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Use the search function. Lots of other threads on FreeSync.



And you can use the search function too. The point is not invalidated just because I don't feel the need to repost the same thing every time a newcomer shows up and doesn't understand. It's all there in the threads, go read it yourself.

Using the forum search function to get countless results of anti-AMD trolls posting their own opinions is not proof.

Yet people are seriously claiming that the writing is on the wall for G-Sync.

This:

is totally and completely not the gsync approach.

There are people claiming Intel and ARM are all going to go FreeSync? Seriously? What exactly are they going to use? The new spec? Do people even know what the new spec is? It's an extension of the eDP spec for use on desktop monitors. Mobile devices already had the capability! It is absolutely meaningless in the context of G-Sync. They are not making any sense whatsoever.

The market for G-Sync is not affected by the eDP capability to lower refresh rates. This is a silly attempt to muddy up and confuse everything.

Nvidia has a specific market for G-Sync. It is the desktop PC gamer. This market is dominated by Nvidia currently. FreeSync is nowhere near making G-Sync irrelevant, and the new change in the desktop DP spec will do absolutely nothing to G-Sync by itself. The eDP spec has been there all this time and no one was using it to get a G-Sync-like experience. No one.

Why the heck would moving the eDP spec over to desktop DP all of a sudden have this massive impact? The new spec simply gives desktop the ability to do something that eDP was able to do all along. And no one used it to make a G-Sync experience, and no one is planning to try to use it on the desktop like that except AMD.

The standard change is not FreeSync. As a matter of fact, a lot more work has to be done to turn that standard into a G-Sync-like experience. The eDP standard has been out all along and AMD has not released FreeSync on their mobile hardware; this is because there is a lot more to it. The "new" ability of this spec existed in eDP for years and AMD has yet to turn it into "FreeSync". When they finally do, no one knows how good it may or may not be. It's incredible how much twisting and manipulating is being done.

I once again ask you to think rationally. Just because YOU aren't interested in adaptive sync to save power on the desktop doesn't mean that the millions of people buying CPUs with onboard GPUs from Intel and AMD don't care about that extra power usage. Stop thinking in terms of enthusiasts. If 10,000 monitors and workstations in a massive office building are all suddenly using less power, that is something companies will upgrade for, thus buying new Intel chips. This is why I believe others are going to follow AMD in supporting the standard.

Intel and ARM are all about lowering power usage on all fronts. The writing is on the wall; you just need to take your Nvidia shrine off said wall to see it. G-Sync is a great idea. I said one of the companies needed to make that exact technology in a thread a few months before we had heard anything about G-Sync. The problem is that Nvidia's exclusionary tactics are a losing proposition in this instance. My opinions have nothing to do with my feelings toward either Nvidia or AMD.
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
He's not going to prove anything, just dance around it. All he has to do is say they are lying over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over and over again, and then his goal of burning the phrase "AMD is lying" into everybody's heads will be complete.

There's merit to it. Huddy, intentionally or unintentionally, has put many mistruths out there, and that is a fact. AMD stated from the start that FS would require no additional hardware. Yet here I am reading that FS requires specific GPUs and monitors, with additional hardware added in: specifically, a new controller board in the monitor that supports the feature.

Huddy stated that G-Sync adds a frame of latency. Come on, NV isn't stupid: they wouldn't do that to a gaming panel, because it would kill the very idea of a "gaming" monitor. A blatant lie. It doesn't add a frame of latency; it uses a look-aside buffer.

Anyway, FS as an alternative is good. The problem is that, right now, G-Sync has ULMB and 144 Hz, and G-Sync can be used on panels of any resolution. If I'm not mistaken, there is a 4K G-Sync panel in the works by Acer.

I do not believe AMD can do anything to emulate ULMB or LightBoost. LightBoost is patented by NV, and while it was hacked to work in 2D mode (even for Radeon users), the technology was created by NV and it works very, very well. It's included with the G-Sync ASIC (or module) built in. I have not heard of AMD having anything comparable to low-motion-blur modes, and this is a staple expected of gaming panels. So while AMD might have the low-framerate thing covered (and we have NO IDEA if it will be as good as G-Sync, and won't know until 2015), it doesn't have any comparable equivalent to LightBoost or ULMB, and probably won't, since both of these were created by NV.

I find it somewhat laughable that some are claiming FS will be better. Could it be? Maybe. But let's be real: that's questionable unless you count "working on AMD" as making it better, and there's the whole LightBoost/ULMB thing too. And then there are the claims out of nowhere that Intel will be using FS. Uh-huh. Okay. Prove it? The fact of the matter is, the low-framerate case is the only thing covered, and we have no idea how well it will handle it. AMD does not, and probably will not, have anything comparable to LightBoost or ULMB, and as mentioned, these are techs patented by NV.
 
Last edited:

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
There's merit to it. Huddy, intentionally or unintentionally, has put many mistruths out there, and that is a fact. AMD stated from the start that FS would require no additional hardware. Yet here I am reading that FS requires specific GPUs and monitors, with additional hardware added in: specifically, a new controller board in the monitor that supports the feature.

Huddy stated that G-Sync adds a frame of latency. Come on, NV isn't stupid: they wouldn't do that to a gaming panel, because it would kill the very idea of a "gaming" monitor. A blatant lie. It doesn't add a frame of latency; it uses a look-aside buffer.

Anyway, FS as an alternative is good. The problem is that, right now, G-Sync has ULMB and 144 Hz, and G-Sync can be used on panels of any resolution. If I'm not mistaken, there is a 4K G-Sync panel in the works by Acer.

I do not believe AMD can do anything to emulate ULMB or LightBoost. LightBoost is patented by NV, and while it was hacked to work in 2D mode (even for Radeon users), the technology was created by NV and it works very, very well. It's included with the G-Sync ASIC (or module) built in. I have not heard of AMD having anything comparable to low-motion-blur modes, and this is a staple expected of gaming panels. So while AMD might have the low-framerate thing covered (and we have NO IDEA if it will be as good as G-Sync, and won't know until 2015), it doesn't have any comparable equivalent to LightBoost or ULMB, and probably won't, since both of these were created by NV.

I find it somewhat laughable that some are claiming FS will be better. Could it be? Maybe. But let's be real: that's questionable unless you count "working on AMD" as making it better, and there's the whole LightBoost/ULMB thing too. And then there are the claims out of nowhere that Intel will be using FS. Uh-huh. Okay. Prove it? The fact of the matter is, the low-framerate case is the only thing covered, and we have no idea how well it will handle it. AMD does not, and probably will not, have anything comparable to LightBoost or ULMB, and as mentioned, these are techs patented by NV.

ULMB is just a strobing backlight, used in many TVs and also the Eizo Foris. It's not something AMD would concern themselves with; it's something for the monitor manufacturers.

Huddy spreading false info on G-Sync is disgusting, though.
 

NTMBK

Lifer
Nov 14, 2011
10,486
5,905
136
Wow. I post a thread sharing the good news that some cool tech will be more widely available soon, and it turns into a five-page rant fest full of accusations of lying and deceit. For the love of god, I don't know why I even bother with this place anymore.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
There's some history to it. I think it all dates back to the original announcement in January 2014, when assertions were made about how FS would work on everything and anything: any GPU, any existing monitor with a firmware update (that sure didn't happen), available "very soon" (2015 != soon, as of Jan 2014). But whatever. Some of the assertions were in fact unbelievable, and AMD did make many assertions that turned out not to be true. But perhaps they misspoke or did it unintentionally. Enough arguing about that.

Looking past that, I do agree that a competing alternative is good, even if I find AMD marketing this a year to a year and a half in advance pretty ridiculous. That's their thing, though. Whatever. I am looking forward to seeing how FS monitors fare; I just hope that reviewers have full, unfettered access to give us the real scoop.
 
Last edited: