AMD Migrates 20nm Chips In Development to FinFET – Pays $33 Million Charge


Fjodor2001

Diamond Member
Feb 6, 2010
If AMD can incorporate HBM into their APUs, I think they for sure could make a gaming type apu with much better performance than Intel. I have given up on igpu improvements from Intel TBH, except for just throwing more EUs at the problem. First it was wait for Gen 8, now it is Gen 9, and on and on. The problem for AMD is exactly where that theoretical APU will fit and what the cost to manufacture and sell it will be.

I still am not convinced an APU is going to be anything more than a low/mid range solution for desktops. They have to be able to get good performance into a mobile package for it to make sense.

Another problem (for both AMD and Intel) is that 14nm dgpus should be out by then with HBM as well, giving much better performance (and performance/watt) across the board.

Yup, but I think the interesting question is whether an APU with "4 core Zen + HBM + latest AMD GPU cores + 14 nm" would be good enough to cross into territory where it's actually usable as a gaming computer?

So far, I don't think iGPUs have met that criterion, neither AMD's nor Intel's. But I think such an APU might actually do it. In that case you could get a very nice gaming computer covering the needs of a lot of gamers. If so, it could be a very nice product that sells well.
 

ShintaiDK

Lifer
Apr 22, 2012
Why would that be the case? If the cost per transistor for 14 nm is higher when buying from the foundries, it means their investments in 14 nm process tech have also been higher. Intel has had to absorb that process tech cost for 14 nm too.

It's been explained multiple times. It's about design time and design cost to deal with the multipatterning issues. Something you need in the area of $10 billion revenue on a product to be viable with. Again, gate utilization and parametric yields.

Also, as could be seen in your graph, the difference in cost between 28 nm vs 14 nm was not that big anyway.

Yes, per transistor. A 100mm2 14nm die will cost over double that of a 100mm2 28nm die.

Of course. But that's the case for all manufacturers of discrete GFX cards (effectively AMD and nVidia), not just AMD. Also, the cost in your graph is just at one point in time. The cost per transistor for new nodes goes down over time, just as it has for all previous nodes. That's the whole deal with Moore's law. Relax. ;)

It seems you don't understand the issue at all. Time will not fix 14nm cost.
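The per-mm² vs per-transistor distinction being argued here can be sketched with simple arithmetic. All figures below are illustrative assumptions, not actual foundry pricing:

```python
# Illustrative sketch: a 14nm die can cost over 2x per mm^2 yet be
# roughly flat per transistor. All numbers here are made up.

cost_per_mm2_28nm = 1.0   # normalized cost of 28nm silicon per mm^2
cost_per_mm2_14nm = 2.2   # assumed: >2x due to extra multipatterning steps

density_28nm = 1.0        # normalized transistors per mm^2
density_14nm = 2.1        # assumed: ~2x density from the shrink

cost_per_transistor_28nm = cost_per_mm2_28nm / density_28nm
cost_per_transistor_14nm = cost_per_mm2_14nm / density_14nm

# A fixed 100mm^2 die costs over double on 14nm...
print(cost_per_mm2_14nm * 100 / (cost_per_mm2_28nm * 100))   # ~2.2x

# ...but per transistor the gap nearly vanishes.
print(cost_per_transistor_14nm / cost_per_transistor_28nm)   # ~1.05x
```

Both sides of the argument can be "right" at once: the per-area cost claim and the per-transistor cost claim differ only by the assumed density gain.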

Finally, you never answered my question regarding your statement that all companies buying from foundries at 14 nm would have to use EUV due to unspecified "cost issues". What was that all about?

Could you elaborate? The point of EUV is to remove the need for multipatterning and increase both gate utilization and parametric yields.
 

Fjodor2001

Diamond Member
Feb 6, 2010
It's been explained multiple times. It's about design time and design cost to deal with the multipatterning issues. Something you need in the area of $10 billion revenue on a product to be viable with. Again, gate utilization and parametric yields.
Any source confirming that $10B figure?

And anyway, if it requires $10B of revenue per product long term at 14 nm, there won't be many companies using it. That contradicts the products several companies have announced for 14 nm.

Yes, per transistor. A 100mm2 14nm die will cost over the double of a 100mm2 28nm die.
Of course, have I stated otherwise?

The point is that it won't be much more expensive for AMD to have Zen on 14 nm than on 28 nm. It will be the same number of transistors regardless of node, and 14 nm is not much more expensive than 28 nm per transistor.
It seems you dont understand the issue at all. Time will not fix 14nm cost.
So Moore's law is dead?
Could you elaborate? The point of EUV is to remove the need for multipatterning and increase both gate utilization and parametric yields.
Well, it was you who said that "We are not just talking about new, we are talking about cost issues that only EUV can possible solve for the fabless." regarding the 14 nm that was being discussed. So I wondered if you could clarify how EUV comes into play at 14 nm, when no one has announced that EUV will be used for 14 nm? And why would that only apply to fabless companies anyway?
 

ShintaiDK

Lifer
Apr 22, 2012
Any source confirming that $10B figure?

And anyway, if it requires $10B of revenue per product long term at 14 nm, there won't be many companies using it. That contradicts the products several companies have announced for 14 nm.

We have been over this before, and either you don't get it or you don't want to. It's about getting lower transistor cost through higher gate utilization and parametric yield. That needs high design cost and time.

http://electroiq.com/petes-posts/20...long-design-cycles-may-delay-20nm-and-beyond/

Of course, have I stated otherwise?

The point is that it won't be much more expensive for AMD to have Zen on 14 nm than on 28 nm. It will be the same number of transistors regardless of node, and 14 nm is not much more expensive than 28 nm per transistor.

So AMD won't increase the transistor budget? Yet we're gonna see this miracle product come to life? Remember the design cost is still ~4x more expensive. Not to mention mask costs.

So Moore's law is dead?

For companies that can't afford the design cost and masks, don't have the time, or don't have the volume for it? Yes. But that's nothing new. There is a reason why so much is made on old nodes today.

Well, it was you who said that "We are not just talking about new, we are talking about cost issues that only EUV can possible solve for the fabless." regarding the 14 nm that was being discussed. So I wondered if you could clarify how EUV comes into play at 14 nm, when no one has announced that EUV will be used for 14 nm? And why would that only apply to fabless companies anyway?

I am not saying EUV comes into play at 14nm.
 

Ancalagon44

Diamond Member
Feb 17, 2010
The problem with having a gaming APU is that you're also power limited. Mainstream desktop CPUs are rated at 95W max. A modern gaming computer will have at least a 95W CPU and a 200W GPU.

So a gaming APU would be limited to 95W max in total for both CPU and GPU. Suitable for laptops and low-end gaming, but not much else.
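The power-budget argument above can be made concrete. The TDP split below is a hypothetical example, not a real product spec:

```python
# Hypothetical TDP budgeting for a 95W gaming APU vs a discrete build.

DISCRETE_CPU_W = 95
DISCRETE_GPU_W = 200
discrete_total = DISCRETE_CPU_W + DISCRETE_GPU_W   # 295W for the pair

APU_PACKAGE_W = 95
# Assume the APU dedicates roughly 2/3 of its package budget to
# graphics while under GPU-bound load (an assumption, not a spec).
apu_gpu_budget = APU_PACKAGE_W * 2 // 3            # 63W for the iGPU

# The iGPU gets well under a third of a midrange discrete card's power.
print(apu_gpu_budget / DISCRETE_GPU_W)             # 0.315
```

Even under that generous split, the integrated GPU has less than a third of the power headroom of a single 200W discrete card, which is the poster's point about low-end gaming.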
 

therealnickdanger

Senior member
Oct 26, 2005
If AMD can incorporate HBM into their APUs...*snip*

HBM will be a nice boost to AMD IGP, given the impact seen simply by using faster system RAM, but Intel is really stepping up their game in this area (Iris Pro), albeit at a higher cost.

I don't know when/if we'll see Micron's Hybrid Memory Cube (HMC) architecture work its way down to mainstream CPUs, but I have to believe that Intel's Knights Landing (14nm) is a sign of things to come. Does DX12 open up the world of graphics to Phi (or smaller future iterations of Iris Pro) in a way not seen before via ray tracing (like NVIDIA's Mech Ti demo)?

There's an avalanche of changes in the next two years. Pretty exciting!
 

Avalon

Diamond Member
Jul 16, 2001
Yes, per transistor. A 100mm2 14nm die will cost over double that of a 100mm2 28nm die.

Honest question, but if a 14nm transistor costs 10% more than a 28nm transistor, then for a chip using say 1 billion transistors, wouldn't the 14nm chip still only be 10% more expensive than the 28nm chip? Or am I missing something?
 

ShintaiDK

Lifer
Apr 22, 2012
Honest question, but if a 14nm transistor costs 10% more than a 28nm transistor, then for a chip using say 1 billion transistors, wouldn't the 14nm chip still only be 10% more expensive than the 28nm chip? Or am I missing something?

Actually more, due to design cost. But overall, yes.

1 billion transistors is cheaper on 28nm than it is on 14nm.

The big problem is that better performance depends largely on higher transistor budgets. Especially for graphics.

So either the company will earn less or the consumer will pay more. It's really a death race. Unless you are part of the exception(s), like Intel.
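The trade-off described in this exchange (per-transistor silicon cost plus a one-time design cost amortized over volume) can be sketched like this. All dollar figures are assumptions for illustration only:

```python
# Sketch: total cost per chip = silicon (transistor) cost
#         + one-time design/mask cost amortized over unit volume.
# All numbers below are illustrative assumptions.

def cost_per_chip(transistors_bn, cost_per_bn_transistors, design_cost, volume):
    """Per-unit silicon cost plus amortized one-time design cost."""
    return transistors_bn * cost_per_bn_transistors + design_cost / volume

# Assumed inputs: 14nm silicon 10% pricier per transistor,
# design cost ~4x higher on the newer node.
chip_28 = cost_per_chip(1.0, 10.0, design_cost=50e6, volume=10e6)
chip_14 = cost_per_chip(1.0, 11.0, design_cost=200e6, volume=10e6)

print(chip_28)  # 15.0 -> $10 silicon + $5 amortized design
print(chip_14)  # 31.0 -> $11 silicon + $20 amortized design

# At high volume the design cost washes out and the gap shrinks:
print(cost_per_chip(1.0, 11.0, 200e6, 100e6))  # 13.0
```

This is why both posters can be partly right: at low volume the fixed design cost dominates the per-chip price of the new node, while at very high volume the modest per-transistor difference is all that remains.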
 

Fjodor2001

Diamond Member
Feb 6, 2010
We have been over this before, and either you don't get it or you don't want to. It's about getting lower transistor cost through higher gate utilization and parametric yield. That needs high design cost and time.

http://electroiq.com/petes-posts/20...long-design-cycles-may-delay-20nm-and-beyond/

That article also said:

"We think 20nm, if it does go into volume production, it will not be in 2014. Potentially 2015 and maybe 2016,” he said.

"Similarly, Handel believes there will be a postponement of 16/14nm. “We expect initial production in late 2016, beginning of 2017.

And yet we had 20 nm products in 2014, and 14 nm products in 2015(!). So I don't know how much faith to put in that article. :\

Also, then how come AMD claims it only cost $33M to migrate their designs to 14 nm as mentioned in the OP?
So AMD won't increase the transistor budget? Yet we're gonna see this miracle product come to life? Remember the design cost is still ~4x more expensive. Not to mention mask costs.
Most likely they will, but they would increase the transistor budget regardless of whether the design was on 14 nm or 28 nm.

So in the end it's the price per transistor on 28 vs 14 nm that matters. And as your graph showed, the difference is small.
For companies that can't afford the design cost and masks, don't have the time, or don't have the volume for it? Yes. But that's nothing new. There is a reason why so much is made on old nodes today.
So you think the design costs are so big that they're the determining factor in whether Moore's law continues at 14 nm for companies like AMD? Really?

I'd like to see a calculation proving that, and how much you envision the difference to be between them.

Also, I think the design costs will differ quite a lot depending on the design. GPUs for example contain lots of transistors, but they're basically just multiple instantiations of the same IP block (a GPU core). So I imagine the process tech design related costs being lower in such cases, relative to the cost of the complete product. I.e. in such cases the price per transistor will have a much greater impact on the total cost of the product.
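The replication argument above can be put in numbers. Both figures below are hypothetical, chosen only to show the shape of the effect:

```python
# Hypothetical: design effort scales with *unique* logic, while the
# chip's transistor count scales with how many times a block is
# instantiated. Replication dilutes design cost per transistor.

UNIQUE_BLOCK_DESIGN_COST = 100e6  # assumed one-time cost to design one GPU core
TRANSISTORS_PER_BLOCK = 0.1e9     # assumed 100M transistors per core

for copies in (1, 16, 64):
    total_transistors = copies * TRANSISTORS_PER_BLOCK
    design_cost_per_bn = UNIQUE_BLOCK_DESIGN_COST / (total_transistors / 1e9)
    print(copies, design_cost_per_bn)
# 1 copy   -> ~$1000M of design cost per billion transistors
# 16 copies -> ~$62.5M per billion transistors
# 64 copies -> ~$15.6M per billion transistors
```

Under these assumptions, a heavily replicated GPU design spreads its node-specific design cost so thin that per-transistor wafer cost dominates, which is the poster's claim.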
I am not saying EUV comes into play at 14nm.
So what did you mean by that statement then!?
 

ShintaiDK

Lifer
Apr 22, 2012
Also, then how come AMD claims it only cost $33M to migrate their designs to 14 nm as mentioned in the OP?

That's not what AMD said. They said they wasted $33M on 20nm.


You choose to ignore all reports on the matter. So why continue?

So what did you mean by that statement then!?

EUV removes the current need for multipatterning below 28nm.

LithoCost.jpg
 

Fjodor2001

Diamond Member
Feb 6, 2010
That's not what AMD said. They said they wasted $33M on 20nm.
Here's what AMD actually said:

"Additionally the company anticipates GAAP gross margin to be further impacted by a one-time charge of approximately $33 million associated with a technology node transition from 20 nanometer (nm) to FinFET. The company started several product designs in 20nm that will instead transition to the leading-edge FinFET node."


I.e. the $33M was for transitioning the designs from 20 to 14 nm. But I agree that it is somewhat unclear exactly what those costs are associated with. Does it include only 20 nm-specific costs, only 14 nm-specific costs, or both? Anyway, it indicates that the process tech related design costs for a specific node do not seem to be that high.
You choose to ignore all reports on the matter. So why continue?
All you have provided as evidence for your claims is an old speculative article, which has already been found to contain lots of incorrect guesses (e.g. about what year 20 nm and 14 nm would be available). On top of that article you have added your own speculation.

So based on this we're supposed to take everything you're saying as truth? Come on...
EUV removes the current need for multipatterning below 28nm.

LithoCost.jpg
Yeah, and so...? How is that different for foundries vs Intel (which you claimed)?

And what does it have to do with AMD transitioning their designs from 20 to 14 nm? EUV will not be used for 14 nm anyway. o_O
 

Snafuh

Member
Mar 16, 2015
You choose to ignore all reports on the matter. So why continue?

We could also look at this recent graph
eezbRGE.jpg


But you will ignore it because it's based on AMD's internal research. All the slides you posted were made before the GF/Samsung license deal. GF uses their older tools for the Samsung process. That's the reason why they are late, but the process will be cheaper.
 

mrmt

Diamond Member
Aug 18, 2012
We could also look at this recent graph
eezbRGE.jpg


But you will ignore it because it's based on AMD's internal research. All the slides you posted were made before the GF/Samsung license deal. GF uses their older tools for the Samsung process. That's the reason why they are late, but the process will be cheaper.

Would you trust AMD for understanding broad market trends?
 

Phynaz

Lifer
Mar 13, 2006
Here's what AMD actually said:

"Additionally the company anticipates GAAP gross margin to be further impacted by a one-time charge of approximately $33 million associated with a technology node transition from 20 nanometer (nm) to FinFET. The company started several product designs in 20nm that will instead transition to the leading-edge FinFET node."


I.e. the $33M was for transitioning the designs from 20 to 14 nm. But I agree that it is somewhat unclear exactly what those costs are associated with. Does it include only 20 nm-specific costs, only 14 nm-specific costs, or both? Anyway, it indicates that the process tech related design costs for a specific node do not seem to be that high.

You don't seem to understand finance very much. The one-time charge is a write-off of stuff that can no longer be used.

It's saying they had a value on their books of $33M that is now worthless. Therefore it's being written off. It is in no way related to the costs of developing 14nm. That would be an expense, not a write-down.

You keep trying to interpret what AMD said in your own terms. You need to interpret it in business finance terms.
 

Snafuh

Member
Mar 16, 2015
Would you trust AMD for understanding broad market trends?

Do you think AMD shows investors some fantasy numbers?

20141218_b1.gif

In 2013 AMD showed another forecast, but things have changed since then. There was the GF/Samsung deal, and TSMC's 16nm process is basically dead because it can't compete with Samsung's 14nm process.
Using two-year-old figures for a 2016/2017 product and ignoring recent numbers makes no sense.
 

shady28

Platinum Member
Apr 11, 2004
Better be more than 8GB in 2017. And did you consider the actual cost and target audience of such a chip? Not to mention what market share AMD may have left by then.

APUs have been a massive flop for AMD in the PC segment. And sales are getting close to 0. And we are reaching a point where the ultimate question may be if Zen launches at all. Q2 will show a semicustom company with a PC division that is tacked on.

To be fair to AMD, APUs seemed to do pretty well up until end of 2014. They were constantly all over top sellers at Amazon, Best Buy, etc.

What I saw was in 2014, many OEMs started sticking extremely weak laptop APUs into desktop PCs. I think that flat out destroyed their reputation, because even in graphics those chips can't compete with desktop Intel parts (and were never meant to). Word of mouth gets around quickly these days, and APUs were labelled a ripoff.

I suspect that kind of marketing / merchandising flub is why Lisa Su immediately axed certain marketing / merchandising VPs.
 

Fjodor2001

Diamond Member
Feb 6, 2010
You don't seem to understand finance very much. The one-time charge is a write-off of stuff that can no longer be used.

It's saying they had a value on their books of $33M that is now worthless. Therefore it's being written off. It is in no way related to the costs of developing 14nm. That would be an expense, not a write-down.

You keep trying to interpret what AMD said in your own terms. You need to interpret it in business finance terms.

The article doesn't say it's a write-off, nor even imply it, so we don't know that for sure. And even if it was, it would indicate that the 20 nm process tech related design costs were only $33M. The corresponding 14 nm costs may be more than that, but not much more, let's say $50M. So in the end, the whole idea that designs would be so expensive on 14 nm that AMD could not afford it is debunked.

PS. You're very keen on telling everyone else on this forum that they don't understand this and that. It's kind of funny. You should step down from your high tower and direct that towards yourself, where it's more correctly aimed, since you're usually the one on this forum who gets things wrong.
 

Fjodor2001

Diamond Member
Feb 6, 2010
To be fair to AMD, APUs seemed to do pretty well up until end of 2014. They were constantly all over top sellers at Amazon, Best Buy, etc.

What I saw was in 2014, many OEMs started sticking extremely weak laptop APUs into desktop PCs. I think that flat out destroyed their reputation, because even in graphics those chips can't compete with desktop Intel parts (and were never meant to). Word of mouth gets around quickly these days, and APUs were labelled a ripoff.

I suspect that kind of marketing / merchandising flub is why Lisa Su immediately axed certain marketing / merchandising VPs.

I agree, OEMs have made unnecessarily poor designs based on AMD CPUs, which has not benefited AMD.

Also, I'd say that from 2014 (or even earlier) onwards, Zen has been the main focus at AMD. Better to spend the R&D resources there than on tweaking the Bulldozer-based uArch. That's likely also part of the explanation.
 

ShintaiDK

Lifer
Apr 22, 2012
To be fair to AMD, APUs seemed to do pretty well up until end of 2014. They were constantly all over top sellers at Amazon, Best Buy, etc.

What I saw was in 2014, many OEMs started sticking extremely weak laptop APUs into desktop PCs. I think that flat out destroyed their reputation, because even in graphics those chips can't compete with desktop Intel parts (and were never meant to). Word of mouth gets around quickly these days, and APUs were labelled a ripoff.

I suspect that kind of marketing / merchandising flub is why Lisa Su immediately axed certain marketing / merchandising VPs.

That's not what AMD's sales numbers showed.

Example:
Q1 2012: $1203M CPU revenue.
Q1 2013: $751M CPU revenue.
Q1 2014: $663M CPU revenue.
Q1 2015: $532M CPU+GPU revenue.

It's a massive flop.
 

shady28

Platinum Member
Apr 11, 2004
That's not what AMD's sales numbers showed.

Example:
Q1 2012: $1203M CPU revenue.
Q1 2013: $751M CPU revenue.
Q1 2014: $663M CPU revenue.
Q1 2015: $532M CPU+GPU revenue.

It's a massive flop.

What you can't see there is the collapse of AMD's non-APU line, especially in the server space but also non-APU CPUs. Virtually no one is buying FX or Opteron anymore. For that matter, the GPU segment has fallen off a cliff too; you can see that in the Steam hardware survey: the entire R9/R7 2xx series is a major flop.

If they broke down APUs, I think you'd see a bell curve.

The problem is they eviscerated the other segments of their business first, so those can't carry them at all now, and then destroyed the reputation of their new 'core' line of APUs.

It's the most basic law of the jungle, don't let go of vine #1 until you have a firm grasp on vine #2.
 

Enigmoid

Platinum Member
Sep 27, 2012
What you can't see there is the collapse of AMD's non-APU line, especially in the server space but also non-APU CPUs. Virtually no one is buying FX or Opteron anymore. For that matter, the GPU segment has fallen off a cliff too; you can see that in the Steam hardware survey: the entire R9/R7 2xx series is a major flop.

If they broke down APUs, I think you'd see a bell curve.

The problem is they eviscerated the other segments of their business first, so those can't carry them at all now, and then destroyed the reputation of their new 'core' line of APUs.

It's the most basic law of the jungle, don't let go of vine #1 until you have a firm grasp on vine #2.

Llano had a pretty large presence in the laptop market, for example. I remember seeing quite a few models online and in the Best Buy catalog. Now, looking online and in catalogs, I see very few Kaveri notebooks. Trinity also did decently. Kaveri seems cannibalized by Beema.

Their APUs most definitely did crater in sales, perhaps not as dramatically as their server segment, but it has been a slow, steady decline.