[Sweclockers] AMD opens up about Freesync

Status
Not open for further replies.

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Since these companies are willing to tell me that their monitors don't support FreeSync now or ever, it seems fine. Go ask them yourself if you don't believe me; they seem willing to answer the question. They can't talk about future products, but they will talk about existing ones and what those are capable of. Like I say, don't believe me: send out 10 emails to 10 different companies asking which existing products will support Adaptive-Sync/FreeSync in the future, and you won't get a "we can't talk to you about future products", you'll get a "none of our current hardware can do it, the spec only just got released and it needs new hardware". If they answer it, then they either consider it already public knowledge or it's so blatantly obvious they don't have an issue answering it.

"Not needing new hardware" is a lie; there isn't a monitor out there today that supports FreeSync, period. But you don't even need to take my word for it; should you choose, you can verify it yourself.
 

NomanA

Member
May 15, 2014
134
46
101
No, I hope folks don't realize that, because that would be completely wrong. Even according to AMD's statements. The display controller in the display, you know, because it's a display controller, is what is doing the work. AMD, itself, said that FreeSync would require a compatible controller in the display. In the display.

Display controller is in the display, because it's a display controller? Ummm. Ok, I understand why you are confused, but you need to realize what they are talking about here.

First of all that's an idiotic and completely misplaced premise to start with.

Second, the quote once again was,

Make no mistake, providing dynamic refresh rates to users still takes a lot of ‘secret sauce’ from the hardware and software ends of our products, including the correct display controllers in the hardware and the right algorithms in AMD Catalyst.

He's talking about "our products", not monitors in this particular statement. Later on in the same interview, he said,

All AMD Radeon graphics cards in the AMD Radeon HD 7000, HD 8000, R7 or R9 Series will support Project FreeSync for video playback and power-saving purposes. The AMD Radeon R9 295X2, 290X, R9 290, R7 260X and R7 260 additionally feature updated display controllers that will support dynamic refresh rates during gaming.

What he meant was that supporting truly dynamic refresh rates in games over the adaptive refresh protocol, without other system side-effects, requires some special logic in hardware that they recently added in Hawaii and Bonaire. Earlier GPU series that are missing these updates would support FreeSync only in a limited capacity.

He did mention the changes needed on the monitor side, which we have all known from the beginning: the displays need to support the DisplayPort Adaptive-Sync protocol. This would not be a complex module (the AMD rep's words are 'not expensive or proprietary'). According to him, the complexity is on the GPU side, to have games work with variable refresh.

So do blast AMD for all I care, and continue to cast doubt on their execution or intent, but at least don't misunderstand what was clearly answered by the interviewee and then trigger a pointless discussion about something that was never said.
 

Atreidin

Senior member
Mar 31, 2011
464
27
86
No, I'm not. I'm saying that there isn't any reason whatsoever to believe that FreeSync is better, since we know precisely nothing about it beyond lies, deception, and spin.

Holy crap wow, that is some serious spin you have there yourself. Try being more negative, I dare you. Seriously, reading your posts is like listening to political commentators on Fox News.
 

sirroman

Junior Member
Aug 23, 2013
17
0
0
I never claimed it will be cheaper. What I said was that AMD's claim that their version will be cheaper can't be substantiated.

You can't be mad when I don't provide proof for something I didn't claim. Well, you could, but then we can all make up our minds about you "by ourselves."

Stop setting fire to the strawmen, please.

You accused AMD of lying. And worse, you did that not with claims but with premeditated insinuations, for the sole purpose of casting doubt where there is no reason for any.

For god's sake, there are eDP controllers/scalers right now that have this functionality. A-Sync doesn't need any RAM on the monitor, it's now an industry standard (based on ANOTHER industry standard), it can be sourced from any willing company (not just Nvidia, and you should know how competition drives prices down, as well as volume), etc.

If you are going to argue against what is just common sense, you have to at least offer something other than baseless arguments and insinuations. Especially when you knowingly misrepresent one of AMD's quotes, as seen here: http://forums.anandtech.com/showpost.php?p=36381408&postcount=16

Which, by the way, was the only reason you could muster: your misunderstanding of an AMD representative's quote. The point being: you misunderstood it.

Buddy, I'm not mad; read my first post again, here:

Slow down, man. :confused:

You were right when you said "Just because the hardware won't be proprietary, that doesn't mean it won't be expensive". It doesn't mean that it will be expensive either.

Provide one source or proof that Gsync is (or will be) cheaper than A-Sync. And by the way: "it needs R&D" isn't one.

You are correct about time-to-market, announced industry support etc, but going beyond that discredits you.

I gave you the benefit of the doubt over that really confused post, and even acknowledged where you were right.

That accusation is just embarrassing and I'm pointing that out so you don't bring that up again.

Just spoke to Iiyama on the phone about FreeSync/Adaptive-Sync support. They said a few things that were quite interesting. The first was that none of their lineup can possibly support it, because the scalers in them don't support DP 1.2a; a hardware change in the scaler is necessary to support the functionality. Iiyama has spoken to Novatech (one of the scaler manufacturers) about support for the technology, and Novatech said that it's theoretically possible to implement, but right now they aren't considering doing so because the market is too small. They shift 50 million scalers a year, and the gamer market is more like 10k units a year as is, so it isn't worth the investment.

This is damning news. Admittedly it's second-hand from an Iiyama presales rep, but he doesn't expect to see monitors this year, and Novatech won't even consider support until the next-generation scaler, which is due at the end of the year, likely meaning monitors are well over 18 months away!

He also talked about Gsync not really being worth the cost. His concern is that the technology itself is only really valuable at low FPS, and that the money is better spent on faster graphics cards and a 144Hz monitor.

I have to say I wasn't expecting as much information as I got, but dang, that was a really interesting 15-minute discussion on monitor technology.

Gsync has clear time-to-market advantages, but if Novatech and other scaler manufacturers add it to their next-generation scalers (which was expected, by the way: follow the standard in the next update), Adaptive-Sync can become widespread with negligible added cost to the monitor (besides board changes). That's a win for us in the gaming crowd.
 

Mand

Senior member
Jan 13, 2014
664
0
0
Display controller is in the display, because it's a display controller? Ummm. Ok, I understand why you are confused, but you need to realize what they are talking about here.

First of all that's an idiotic and completely misplaced premise to start with.

Second, the quote once again was,

Make no mistake, providing dynamic refresh rates to users still takes a lot of ‘secret sauce’ from the hardware and software ends of our products, including the correct display controllers in the hardware and the right algorithms in AMD Catalyst.

He's talking about "our products", not monitors in this particular statement. Later on in the same interview, he said,

All AMD Radeon graphics cards in the AMD Radeon HD 7000, HD 8000, R7 or R9 Series will support Project FreeSync for video playback and power-saving purposes. The AMD Radeon R9 295X2, 290X, R9 290, R7 260X and R7 260 additionally feature updated display controllers that will support dynamic refresh rates during gaming.

What he meant was that supporting truly dynamic refresh rates in games over the adaptive refresh protocol, without other system side-effects, requires some special logic in hardware that they recently added in Hawaii and Bonaire. Earlier GPU series that are missing these updates would support FreeSync only in a limited capacity.

He did mention the changes needed on the monitor side, which we have all known from the beginning: the displays need to support the DisplayPort Adaptive-Sync protocol. This would not be a complex module (the AMD rep's words are 'not expensive or proprietary'). According to him, the complexity is on the GPU side, to have games work with variable refresh.

So do blast AMD for all I care, and continue to cast doubt on their execution or intent, but at least don't misunderstand what was clearly answered by the interviewee and then trigger a pointless discussion about something that was never said.

And I've already described, in detail, how the "not a complex module" claim by AMD is false, and that they themselves admitted otherwise previously. The claim has been completely and thoroughly debunked, with sources to back it up.

If we were to take AMD's word for it, then that's the end of the story, yes. But AMD's behavior over the last five months on this subject indicates that we absolutely should not just take their word for it.
 

Mand

Senior member
Jan 13, 2014
664
0
0
You accused AMD of lying. And worse you did that not with claims, but by premeditated insinuations, with the sole reason of putting in doubt what doesn't have any reason to be.

There are tons of reasons to have doubts. Mostly, because AMD lied about the CES presentation. They claimed it was showing variable refresh: it wasn't. They claimed it wouldn't require expensive hardware: it does.

Everything they have said on FreeSync from the beginning has been deception.

For god's sake, there are eDP controllers/scalers right now that have this functionality. A-Sync doesn't need any RAM on the monitor, it's now an industry standard (based on ANOTHER industry standard), it can be sourced from any willing company (not just Nvidia, and you should know how competition drives prices down, as well as volume), etc.

No, there aren't. There are controllers with eDP that can adjust vblank, but not frame by frame. That's an important distinction. No eDP system in existence can do full variable refresh. They can change it from one static refresh rate to another static refresh rate, but not update it dynamically, frame by frame.

Open is not magic. Open doesn't invent hardware. Open doesn't pay the R&D bills.

I misunderstood nothing. They lied. Yes, my comment contradicts their statement. Because they lied. The proof has already been posted, please go read it.


The rest of your post is just personal attacks, so I'm not going to respond to those.
 

NomanA

Member
May 15, 2014
134
46
101
And I've already described, in detail, how the "not a complex module" claim by AMD is false, and that they themselves admitted otherwise previously. The claim has been completely and thoroughly debunked, with sources to back it up.

If we were to take AMD's word for it, then that's the end of the story, yes. But AMD's behavior over the last five months on this subject indicates that we absolutely should not just take their word for it.

Just admit it: you misunderstood the AMD rep's answers, and your lack of understanding of display controllers resulted in a long rant about what you perceived as lies and in the resulting pointless back-and-forth discussion. I posted the quotes from the interview in my last post, and they are crystal clear.

You are welcome to keep your own opinions about how and why AMD is pulling a fast one, but none of that is based on the interview this thread is about.
 

NomanA

Member
May 15, 2014
134
46
101
No, there aren't. There are controllers with eDP that can adjust vblank, but not frame by frame. That's an important distinction. No eDP system in existence can do full variable refresh. They can change it from one static refresh rate to another static refresh rate, but not update it dynamically, frame-by-frame.

The whole point of VESA accepting AMD's change request is to have frame-by-frame control.

The change request was summarized at Hardware.fr back in January.
http://www.hardware.fr/news/13545/amd-freesync-proposition-dp-1-2a.html

And these changes were approved recently.

Summary
Extend the "MSA TIMING PARAMETER IGNORE" option to DisplayPort to enable source based control of the frame rate similar to embedded DisplayPort.

Intellectual property rights
N/A

Benefits as a result of changes
This enables the ability for external DisplayPort to take advantage of the option to ignore MSA timing parameter and have the sink slave to source timing to realize per frame dynamic refresh rate.
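
To make "per frame dynamic refresh rate" concrete, here is a minimal toy sketch of the idea; it is not the actual DisplayPort link-layer machinery, and the refresh window values are assumptions for illustration. The source simply holds the panel in vertical blanking until the next frame is ready, clamped between the panel's fastest and slowest supported refresh intervals, and that decision is made again for every frame rather than by switching between a few fixed refresh rates.

```python
# Toy model of per-frame dynamic refresh (the "MSA timing ignore" idea), not the
# real DisplayPort protocol: the source stretches vblank until the next frame is
# ready, clamped to the panel's limits. The refresh window values are assumptions.

MIN_FRAME_MS = 1000 / 144   # assumed fastest refresh interval the panel supports (~6.9 ms)
MAX_FRAME_MS = 1000 / 30    # assumed longest the panel can wait before it must be refreshed

def next_scanout_delay(render_time_ms: float) -> float:
    """How long the source waits before starting the next scanout, decided per frame."""
    if render_time_ms <= MIN_FRAME_MS:
        return MIN_FRAME_MS   # frame finished early: wait for the panel's minimum interval
    if render_time_ms >= MAX_FRAME_MS:
        return MAX_FRAME_MS   # frame is late: refresh anyway at the panel's maximum interval
    return render_time_ms     # otherwise scan out as soon as the frame is ready

for t in (5.0, 12.5, 20.0, 40.0):
    print(f"render {t:5.1f} ms -> scanout every {next_scanout_delay(t):5.1f} ms")
```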
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
The next individual in this thread who accuses another member of lying, or otherwise impugns their character in an attempt to discredit their position, is going to say hello to the ban hammer.

If you can't deal with what is being discussed here without attacking people personally, then stay out of the thread.

-- stahlhart
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
today; because it's basically at the prototype stage. It'll come down to pennies.

early release 100GB SSD = $500
late release 100GB SSD = $50

That is actually a bit different. The early SSDs were using the same type of NAND as the later ones, but because each new process node roughly doubles the number of transistors (in this case memory cells) you get at the same price point, the cost of the same amount of storage decreases over time. Early SSDs were on 45nm; now we have SSDs coming out on 16nm, a huge, huge difference in cost per GB, but it's due to the increasing density of the NAND itself.
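
As a back-of-the-envelope illustration of that density effect (idealized planar scaling with made-up pricing, not real market data):

```python
# Idealized illustration of why NAND cost per GB fell so quickly between nodes.
# Real pricing depends on yields, die size, cell design, 3D stacking and more,
# so treat this as a rough sketch with assumed numbers, not actual market data.

def density_gain(old_node_nm: float, new_node_nm: float) -> float:
    """Cells per unit area scale roughly with the inverse square of the feature size."""
    return (old_node_nm / new_node_nm) ** 2

gain = density_gain(45, 16)      # ~7.9x more cells in the same silicon area
cost_per_gb_45nm = 5.00          # hypothetical $/GB on the older node
print(f"Density gain 45nm -> 16nm: {gain:.1f}x")
print(f"Idealized cost per GB on 16nm: ${cost_per_gb_45nm / gain:.2f}")
```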

The big cost with Gsync is that it's done on a "flip chip" (FPCGA or equivalent presumably). These are reprogrammable hardware that contain a lot of logic gates you can configure how you want to and reprogram them many times with different hardware layouts. They are great for developing hardware and fixing the bugs because you can keep reprogramming it until you get it right. They are, however, kind of expensive, and the ones I have used in the past were well over $1500.

What Nvidia appears to be doing, at least for the initial modules, is programming those prototype modules and using them in monitors. It works and it's a nice quick way to get to market and to fix problems rapidly, but it's also a kind of expensive way to roll out hardware. I don't know if the monitors we have coming in the next month are based on those or not, but based on cost it looks like they probably are. Doing it this way lets us, the customers, get to it earlier, and when they go for a genuine silicon build of the module it will be considerably cheaper; its cost won't stay that high.

The 144Hz Nvision 2 monitors have a premium, but it's not as much as the Gsync modules, and based on the functionality of the module I suspect we'll see a similar price premium. I view it as Nvision 3: it's just Nvision 2 refined, with Gsync as the major new functionality, but it also contains all the previous features, some of which (low-persistence mode) have been refined. As far as I know there are no plans from AMD to compete with low persistence or the 3D gaming that these monitors also provide. It's not unreasonable for that to cost more.
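
For a rough sense of why a prototype FPGA keeps the per-unit cost high while a hardened ASIC only pays off at volume, here is a simple break-even sketch; all of the dollar figures are made-up assumptions for illustration, not Nvidia's or any scaler vendor's real numbers.

```python
# Break-even sketch: FPGA vs. ASIC module cost. All figures are assumptions
# chosen purely for illustration.

FPGA_UNIT_COST = 100.0     # assumed per-unit cost of the programmable part
ASIC_UNIT_COST = 10.0      # assumed per-unit cost once the design is hardened
ASIC_NRE = 2_000_000.0     # assumed one-time design/mask (NRE) cost for the ASIC

def total_cost(units: int, unit_cost: float, nre: float = 0.0) -> float:
    """Total cost of shipping `units` modules at a given unit cost plus any NRE."""
    return nre + units * unit_cost

for units in (10_000, 50_000, 100_000):
    fpga = total_cost(units, FPGA_UNIT_COST)
    asic = total_cost(units, ASIC_UNIT_COST, ASIC_NRE)
    print(f"{units:>7} units: FPGA ${fpga:,.0f} vs ASIC ${asic:,.0f}")

# Break-even volume = NRE / (FPGA unit cost - ASIC unit cost) ~= 22,000 units here,
# which is why low-volume prototypes ship on FPGAs and high-volume parts go to ASIC.
```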
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
today; because it's basically at the prototype stage. It'll come down to pennies.

early release 100GB SSD = $500
late release 100GB SSD = $50

Does Gsync benefit from better process nodes? Because that is exactly what allowed SSDs to reduce their costs so greatly.


PS: Don't try to answer it, we all know it doesn't.
 

DigDog

Lifer
Jun 3, 2011
14,448
2,873
126
Does Gsync benefit from better process nodes? Because that is exactly what allowed SSDs to reduce their costs so greatly.


PS: Don't try to answer it, we all know it doesn't.

Wow, this is embarrassing;
you really think the cost of a product is only in the materials? The memory chips in the SSD were what was keeping the cost high? R&D costs have nothing to do with it? Nor retooling machines, marketing. Packaging. Shipping contracts. Licenses.

The rule is that an industrially-produced product which stays on the market goes down in price, because you find ways to make it cheaper, deal in higher volumes, broker deals, and shift public interest.

Or, if you find these words confusing, "G-sync will become cheaper".
 

Gunbuster

Diamond Member
Oct 9, 1999
6,852
23
81
I wonder if AMD will come up with a demo running on an FPGA. Oh what sweet ironing that would be...
 

Atreidin

Senior member
Mar 31, 2011
464
27
86
The big cost with Gsync is that it's done on a "flip chip" (FPCGA or equivalent presumably). These are reprogrammable hardware that contain a lot of logic gates you can configure how you want to and reprogram them many times with different hardware layouts.

It's "FPGA" and it means "Field-Programmable Gate Array", there's no "flip chip" anywhere in there. They can also be pretty good with cost efficiency vs. ASIC with lower production volumes, depending on the design of course.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
Wow, this is embarrassing;
you really think the cost of a product is only in the materials? The memory chips in the SSD were what was keeping the cost high? R&D costs have nothing to do with it? Nor retooling machines, marketing. Packaging. Shipping contracts. Licenses.

The rule is that an industrially-produced product which stays on the market goes down in price, because you find ways to make it cheaper, deal in higher volumes, broker deals, and shift public interest.

Or, if you find these words confusing, "G-sync will become cheaper".

Then don't use the SSD analogy, because it is fundamentally wrong to compare the Gsync ASIC to them. SSDs became so cheap in such little time because their price was largely determined by the process node the modules were made on. Citing the SSD case is a lame attempt to mislead people into thinking Gsync modules will also become 10x cheaper in a short time (and we know they won't, ever).
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
Wow, this is embarrassing;
you really think the cost of a product is only in the materials? The memory chips in the SSD were what was keeping the cost high? R&D costs have nothing to do with it? Nor retooling machines, marketing. Packaging. Shipping contracts. Licenses.

The rule is that an industrially-produced product which stays on the market goes down in price, because you find ways to make it cheaper, deal in higher volumes, broker deals, and shift public interest.

Or, if you find these words confusing, "G-sync will become cheaper".

Some people are WAY behind the discussion. The FPGA chips alone are $90-$100 of the cost of the G-Sync boards. Those prices will not come down. That is what they cost in volume for any company. That is why Nvidia said they "hoped" to get the board price down to $130 for a consumer. That is their floor for their product. The only way G-Sync can compete is to go ASIC.

Why are we rehashing what we already know about G-Sync on the A-Sync thread?
 

SoulWager

Member
Jan 23, 2013
155
0
71
Some people are WAY behind the discussion. The FPGA chips alone are $90-$100 of the cost of the G-Sync boards. Those prices will not come down. That is what they cost in volume for any company. That is why Nvidia said they "hoped" to get the board price down to $130 for a consumer. That is their floor for their product. The only way G-Sync can compete is to go ASIC.

I think that's backwards. The only way freesync can compete is to use an ASIC implementation. Despite AMD's claims at CES, it hasn't yet been demonstrated that an existing scaler can be used with freesync, and certainly not at the high refresh rates and resolutions g-sync displays can hit.
Why are we rehashing what we already know about G-Sync on the A-Sync thread?
A lot of people still have misconceptions about a-sync, particularly the idea that a-sync will be significantly cheaper at release than g-sync, while still matching performance.

If AMD uses a FPGA, it won't be significantly cheaper than g-sync, and risks getting undercut on both price and performance by an ASIC implementation of g-sync. I think this would be an unlikely and stupidly risky business decision unless all the freesync monitors using FPGAs are g-sync designs updated to comply with the new vesa standard. g-sync monitors would likely be the easiest hardware to get working with freesync from a technical perspective, but whether it would happen depends on how restrictive Nvidia's agreements with the display manufacturers are.

If AMD uses preexisting ASICs, it will likely be cheaper than g-sync, but will be limited to specs typical of mobile (likely 60 Hz, no 4K). If AMD's time estimate is accurate, this is the most likely option.

If new ASICs are required, I don't think AMD's time estimate is realistic, though it could result in products competitive in both price and performance.


Long term(5-10 years from now), I expect all g-sync monitors to support freesync, and if monitors exist that only support freesync, they'd be $20-30 cheaper than an equivalent monitor with g-sync (similar to motherboards with crossfire vs sli). It's also possible the standards get integrated, or one disappears.
 

SoulWager

Member
Jan 23, 2013
155
0
71
It's "FPGA" and it means "Field-Programmable Gate Array", there's no "flip chip" anywhere in there. They can also be pretty good with cost efficiency vs. ASIC with lower production volumes, depending on the design of course.

Flip chip is just a packaging term, you can have flip chip FPGAs and flip chip ASICs. It just means the side of the die with circuitry on it is facing the PCB.
 

DigDog

Lifer
Jun 3, 2011
14,448
2,873
126
you need to consider two factors;

first, *today* G-sync uses FPGA chips, but in the future it might not;
AMD's claims about Freesync first and foremost point to a cheaper solution, which NVidia is free to rip off should it be an option. And they've proven that they can get the industrial aspect done quicker than AMD.

And also, that's what R&D is for. Monitor makers might want to take on a portion of the costs if this opens them up to a new market;

and the higher demand for chips - even if they remain in their current form, which I am led to understand is not the case, since these are

The big cost with Gsync is that it's done on a "flip chip" (FPCGA or equivalent presumably). These are reprogrammable hardware that contain a lot of logic gates you can configure how you want to and reprogram them many times with different hardware layouts. They are great for developing hardware and fixing the bugs because you can keep reprogramming it until you get it right.

**SNIP**

It works and it's a nice quick way to get to market and to fix problems rapidly, but it's also a kind of expensive way to roll out hardware.

essentially something which fits now, but might be re-engineered to be purpose-built for the application; and with higher demand there will possibly be lower costs.

something like bringing that ASUS monitor down from $800 to $550.
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
I don't see too many monitor manufacturers integrating vendor-specific/limited tech into their ASICs without said ASIC costing a fair bit more.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I like AMD, but NV does more than just 'blow smoke' about these new techs. Same with SLI, CUDA and G-sync: Nvidia brought the tech to market, albeit in a proprietary way, a few years prior to AMD. Sure, the tech eventually goes to an open-source model, but NV has shown time and time again that they produce products that people are willing to pay for in order to have them NOW.

Whether G-sync or A-sync is better is immaterial. We don't have enough information on that; we don't even know which will be more cost-effective in the short term. NV may move to an ASIC, and we don't really know what is required for AMD to truly update scalers to compete with G-sync at high refresh rates and resolutions.

All we know is that G-sync works well, very well actually, according to pretty much all reviews. If the price is worth it to you, you can buy a display with it right now. A-sync doesn't exist and has not even been demoed yet...
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
I think that's backwards. The only way freesync can compete is to use an ASIC implementation. Despite AMD's claims at CES, it hasn't yet been demonstrated that an existing scaler can be used with freesync, and certainly not at the high refresh rates and resolutions g-sync displays can hit. A lot of people still have misconceptions about a-sync, particularly the idea that a-sync will be significantly cheaper at release than g-sync, while still matching performance.

If AMD uses a FPGA, it won't be significantly cheaper than g-sync, and risks getting undercut on both price and performance by an ASIC implementation of g-sync. I think this would be an unlikely and stupidly risky business decision unless all the freesync monitors using FPGAs are g-sync designs updated to comply with the new vesa standard. g-sync monitors would likely be the easiest hardware to get working with freesync from a technical perspective, but whether it would happen depends on how restrictive Nvidia's agreements with the display manufacturers are.

If AMD uses preexisting ASICs, it will likely be cheaper than g-sync, but will be limited to specs typical of mobile (likely 60 Hz, no 4K). If AMD's time estimate is accurate, this is the most likely option.

If new ASICs are required, I don't think AMD's time estimate is realistic, though it could result in products competitive in both price and performance.


Long term(5-10 years from now), I expect all g-sync monitors to support freesync, and if monitors exist that only support freesync, they'd be $20-30 cheaper than an equivalent monitor with g-sync (similar to motherboards with crossfire vs sli). It's also possible the standards get integrated, or one disappears.

You didn't even read the FAQs or the interviews. F-Sync = eDP implementation. A-Sync = standards-based, so all of the companies that go with the latest DP 1.2a will update their TCONs and scalers to support it. Their TCONs and scalers are either programmable, in which case their vendor(s) need to develop an update to flash to existing hardware, or they are fixed-function and need new ASICs. NONE of them are using FPGAs because they are way too expensive. AMD isn't designing the hardware like Nvidia did, hence the standards-based approach.
 

Mand

Senior member
Jan 13, 2014
664
0
0
You didn't even read the FAQs or the interviews. F-Sync = eDP implementation. A-Sync = standards-based, so all of the companies that go with the latest DP 1.2a will update their TCONs and scalers to support it.

This is not true. A-Sync support is an OPTION. Not "all" companies will update their hardware to support it. Considering there hasn't been an announcement of even one company saying they will, there is no reason to believe this.

Again, standards have two things: requirements, and options. A-Sync is not a requirement, it is an option. Being 1.2a compliant does not tell you whether or not it is A-Sync compliant. And no, it is not just a firmware flash to make them compliant.

"Standards based approach" doesn't mean you magically don't have to do R&D. Someone has to do it. Until a couple weeks ago, AMD specifically stated that they weren't doing it, that they were pushing FreeSync in order to encourage display OEMs to develop the necessary hardware. Now, they say that they're partnering with hardware vendors, but neither AMD nor the vendors will say who they are.

I can't understand how people can continue repeating these clearly, demonstrably, provably false statements. Taking what they said in the FAQ at face value is only as good as whether or not the FAQ is full of lies, misstatements, and half-truths. And given that their claims are easily disproven by a simple Google search, you should not take them at face value in any way regarding FreeSync. They have been deceptive and underhanded since day one with their CES demo, and have not demonstrated any improvement whatsoever.
 