Handel Jones: cost per transistor flat from 28 to 7nm

witeken

Diamond Member
Dec 25, 2013
3,866
3
106
#1


http://www.eetimes.com/author.asp?section_id=36&doc_id=1329887

The number of gates or transistors per unit area increases as feature dimensions shrink, and it grows at a higher rate than wafer costs. Conversely, systematic and parametric yields decrease as feature dimensions shrink, which is what drives gate cost up.

In an ideal environment where yield per unit area stays constant as feature dimensions shrink, gate cost can decrease. This, however, is not the reality, due to the increasing impact of overlay and other factors that affect yield. There are also improvements in performance and reductions in total power consumption with smaller feature dimensions, but the penalty is higher gate cost.
The foundry market at 180nm is still in high-volume production. The 300mm wafer volume at 28nm will be above 150K WPM for the next 10 to 15 years. Consequently, new process technology options can have a lifetime of 20 to 30 years.
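The relation the excerpt describes can be written as a toy model: cost per good gate is wafer cost divided by (gates per wafer × yield). All numbers below are invented purely for illustration; they are not from the article.

```python
# Toy model of the cost-per-gate relation from the excerpt above.
# Every number here is made up for illustration only.

def cost_per_gate(wafer_cost, gates_per_wafer, yield_fraction):
    """Cost per good gate: wafer cost spread over the gates that actually work."""
    return wafer_cost / (gates_per_wafer * yield_fraction)

# Hypothetical 28nm baseline: mature node, healthy yield.
base = cost_per_gate(wafer_cost=4_000, gates_per_wafer=1e10, yield_fraction=0.90)

# Hypothetical 7nm, ideal case: wafer cost up 2.5x, density up 8x, yield held constant.
ideal = cost_per_gate(wafer_cost=10_000, gates_per_wafer=8e10, yield_fraction=0.90)

# Same node, but yield erosion (overlay, parametrics) eats the density gain.
eroded = cost_per_gate(wafer_cost=10_000, gates_per_wafer=8e10, yield_fraction=0.25)

print(ideal < base)   # constant yield: cost per gate falls with scaling
print(eroded > base)  # eroded yield: cost per gate is flat or rises
```

This is just the arithmetic skeleton of the argument: density gains cut cost per gate only to the extent that yield holds up against rising wafer cost.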
Interesting they're still saying costs go up from 28nm to 20nm. I thought that was outdated and cost per transistor went down at least a little bit. If this is indeed the case, then Intel will gain quite an advantage, since they're projecting a steady decline from 22nm to 7nm, at least as big as historically. But of course that's only with constant yield. Then again, the same goes for TSMC/Samsung: this graph assumes yield decreases just as much as density goes up, which sounds like quite a lot.
 

NTMBK

Diamond Member
Nov 14, 2011
8,251
229
126
#2
Scary to be looking at the end of an era. :( Hope the brilliant minds at Intel/TSMC/GloFo (lol j/k) make some breakthrough and we see another 50 years of exponential improvements. It would be strange to see technology stagnate in my lifetime.

EDIT: Wait, is this not lifted from a 2-year-old paper paid for by Soitec? http://www.soitec.com/pdf/WP_handel-jones.pdf That explains why the 20nm number is out of date.
 

wpcoe

Senior member
Nov 13, 2007
586
0
81
#3
Quantum computing on the horizon? That would be a whole new paradigm. How much does a qubit cost? :hmm:
 
Aug 11, 2008
10,457
66
126
#5


http://www.eetimes.com/author.asp?section_id=36&doc_id=1329887

Interesting they're still saying costs go up from 28nm to 20nm. I thought that was outdated and cost per transistor went down at least a little bit. If this is indeed the case, then Intel will gain quite an advantage, since they're projecting a steady decline from 22nm to 7nm, at least as big as historically. But of course that's only with constant yield. Then again, the same goes for TSMC/Samsung: this graph assumes yield decreases just as much as density goes up, which sounds like quite a lot.
After the 14nm debacle, I take anything Intel says about their process nodes with a very big grain of salt. Why should the cost go down for Intel and be flat for everyone else?

However, it should put to rest those who keep ripping Intel for not cutting prices as chips get smaller on each new process node. Even though the chips are smaller, they still may not be cheaper to produce. It also calls into question the argument that Zen will be cheap because it is such a small chip.
 

Dresdenboy

Golden Member
Jul 28, 2003
1,730
0
136
citavia.blog.de
#6


http://www.eetimes.com/author.asp?section_id=36&doc_id=1329887

Interesting they're still saying costs go up from 28nm to 20nm. I thought that was outdated and cost per transistor went down at least a little bit. If this is indeed the case, then Intel will gain quite an advantage, since they're projecting a steady decline from 22nm to 7nm, at least as big as historically. But of course that's only with constant yield. Then again, the same goes for TSMC/Samsung: this graph assumes yield decreases just as much as density goes up, which sounds like quite a lot.
Interesting times ahead. I think we're already seeing some trends (as archs/uarchs developed recently or now will be part of that future):
  • chips in similar pricing categories will simply get smaller to keep the number of transistors in check
  • well-chosen sets of IP blocks -> they might become more general, like DSP blocks for video and audio
  • more performance per transistor via:
    • higher clocks (outweighing the power savings)
    • more efficient logic, e.g. do more with the same number of shaders, distribute work better, etc.


Quantum computing on the horizon? That would be a whole new paradigm. How much does a qubit cost? :hmm:
That is a different world.
 

Nothingness

Golden Member
Jul 3, 2013
1,881
32
106
#7
After the 14nm debacle, I take anything Intel says about their process nodes with a very big grain of salt. Why should the cost go down for Intel and be flat for everyone else?
Because the laws of physics don't apply to Intel :ninja:

I wonder how credible that graph is. After all it seems the study wants to prove FD SOI is competitive so anything FinFET related might be twisted to reach the "proof".
 
Mar 10, 2006
11,719
120
126
#8
Because the laws of physics don't apply to Intel :ninja:

I wonder how credible that graph is. After all it seems the study wants to prove FD SOI is competitive so anything FinFET related might be twisted to reach the "proof".
Handel Jones seems to be promoting FD-SOI quite a bit; look through his other analyses.
 

Dresdenboy

Golden Member
Jul 28, 2003
1,730
0
136
citavia.blog.de
#10
Do these cost predictions actually account for the "risk" (at least to the predictions) of some unexpected EUV development success?
 

sm625

Diamond Member
May 6, 2011
8,176
1
106
#14
And yet Apple is able to get A9 chips for $58. Who actually believes that the price of the A10 is going to be 30% higher with 30% more transistors? I bet it will be the same $58 +/- $5.
 
Mar 10, 2006
11,719
120
126
#15
And yet Apple is able to get A9 chips for $58. Who actually believes that the price of the A10 is going to be 30% higher with 30% more transistors? I bet it will be the same $58 +/- $5.
$58 seems really high for the A9.
 

coercitiv

Diamond Member
Jan 24, 2014
3,112
388
136
#16
And yet Apple is able to get A9 chips for $58. Who actually believes that the price of the A10 is going to be 30% higher with 30% more transistors?
Even silicon obeys Apple propaganda these days: Thou shalt not leak!
 
Dec 17, 2008
1,848
7
91
#17
$58 seems really high for the A9.
TLDR: That was the old BOM of the A9 plus an external Qualcomm modem. The newer BOM price of just the SoC is $22, and the modem is $15; we got a new data point because the iPhone SE was released 7 months after the iPhone 6s.

SM625 said:
And yet apple is able to get A9 chips for $58. Who actually believes that the price of A10 is going to be 30% higher with 30% more transistors? I bet it will be the same $58 +/- 5.
That is the price estimate for the SoC plus external Qualcomm modem in the iPhone 6s, which came out in 9/2015. The BOM estimate comes from IHS, which specializes in this: they publish BOMs for some big-name devices as free marketing, but they make their money selling this analysis as a contracting business ("I want to build X; before I try to build X, I hire IHS to figure out my estimated BOM and possible ways to make it cheaper or find new suppliers").

This number is now out of date, since we have a newer data point from the iPhone SE. The iPhone SE uses the same SoC as the iPhone 6s and was released in 4/2016 (i.e. 7 months later), and when IHS did that breakdown they separated SoC and modem, valuing the SoC at $22.00 and the modem at $15 (more on the modem later). If I were to guess the reasons for the price reduction of these two parts, it would be better yields, plus the chip-design R&D already being paid off, leaving mostly the material cost of making the SoC.


The same firm that did the price estimate for the iPhone 6s puts the Qualcomm 820 in the Samsung Galaxy S7 at an estimated $62. Remember, with the 820 you get SoC and modem on the same silicon.


Now, the Galaxy S6 edge, which uses the Exynos SoC that competed with the Qualcomm 810, had a CPU BOM of $29.50 and a modem BOM of $15; note that the modem in the Galaxy S6 edge is the same modem Apple used in the iPhone 6s.

If you were to buy the modem by itself, it would be about $15 based on the breakdown for the iPhone SE (the small 4" phone with the A9 processor), making the total for the combo $37.00 (i.e. a price reduction of $21). That said, the comparison is not quite apples to apples: the iPhone SE modem is the one used by the iPhone 6, the Qualcomm MDM9625M, not the newer one in the 6s, the Qualcomm MDM9635M. But since IHS listed the newer modem, the MDM9635M, at the same cost in the Galaxy S6 edge, I bet the total price for a modem is around $15 either way.

I have no clue why Apple went with the MDM9625M for the iPhone SE, since the newer chip is smaller, uses less power, and has more bands and a higher maximum download speed. If I were to guess, either Qualcomm is trying to dump the very old MDM9625M and gave Apple a discount to unload inventory, or Apple had already accrued excess MDM9625M inventory and wanted a device to use those theoretical excess modems up.

----


Remember that in the near future Intel will be supplying many of the iPhone modems, except in the Verizon and China models, which will retain the Qualcomm chips. (On some GSM carriers Apple may still go with Qualcomm in some markets.) Thus the modem price will likely go down, or stay roughly the same, with multiple suppliers and Intel and Qualcomm competing against each other for this business.

Thus, to sum it all up, the SoC/CPU cost for Apple is anywhere from $22 to $45, plus an additional $15 for the modem.
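As a quick sanity check on the figures quoted in this post (the dollar values are the IHS estimates cited above, nothing more):

```python
# Recomputing the BOM arithmetic from the post above.
# Dollar figures are the IHS estimates quoted in the post.
a9_soc = 22.00       # iPhone SE teardown: A9 SoC alone
modem  = 15.00       # modem estimate (iPhone SE / Galaxy S6 edge breakdowns)
combo  = a9_soc + modem

old_estimate = 58.00  # earlier iPhone 6s estimate for SoC + modem together

print(combo)                 # 37.0  -> the "$37.00 for the combo" figure
print(old_estimate - combo)  # 21.0  -> the "price reduction of $21"
```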
 
Sep 19, 2000
10,204
12
91
#18
Quantum computing on the horizon? That would be a whole new paradigm. How much does a qubit cost? :hmm:
Doesn't solve the same problem. It's similar to saying, "Well, maybe we can just add more cores." And as most people should know by now, more cores don't translate directly into higher performance in all cases, even when used optimally.

In fact, it is highly likely that quantum computing will benefit very few aspects of general-purpose computing. There are a few novel applications that benefit from it, but it isn't something you could, for example, plug into your GPU to get super awesome magic performance increases.
 

know of fence

Senior member
May 28, 2009
555
0
71
#19
This graph is to a large degree a projection based on assumptions and variables that will change in the future. The presented cost is a kind of long-term average, backed by the assertion that the fab will remain in production for a long time, right?

But if fabs remain in production even longer, and early chips can maintain a high level of pricing for longer, then the biggest item on the books, depreciation (investment / time), will also decrease.

I'm not an accountant, but it's hard to imagine the business working without a decrease in price/transistor.
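The depreciation argument above can be sketched as a toy model; the investment and volume numbers below are invented purely for illustration.

```python
# Toy model of the depreciation argument: spreading a fixed fab investment
# over more years (and thus more wafers) lowers the per-wafer share.
# All numbers are illustrative only.

def depreciation_per_wafer(fab_investment, years, wafers_per_month):
    """Fab investment divided across every wafer produced over its lifetime."""
    total_wafers = years * 12 * wafers_per_month
    return fab_investment / total_wafers

short_life = depreciation_per_wafer(10e9, years=5,  wafers_per_month=50_000)
long_life  = depreciation_per_wafer(10e9, years=15, wafers_per_month=50_000)

print(short_life > long_life)  # True: a longer fab lifetime cuts cost per wafer
```

With per-wafer depreciation falling as fab life extends, cost per transistor can still decline even if wafer processing cost itself stays flat, which is roughly the poster's point.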
 

plopke

Senior member
Jan 26, 2010
212
4
101
#20
Well, let's hope stuff like this:

http://www2.imec.be/be_en/press/imec-news/vlsi-junctionless-device-iiiv.html

GAA-NWFETs (with the gate fully wrapped around the device body for optimum electrostatic control) are considered one of the most promising candidates for enabling (sub-)5nm CMOS scaling. Moreover, junctionless devices offer great process simplicity, as they do not require junctions.
will work out over the next couple of years; it was presented June 16th somewhere in Hawaii.
 

SAAA

Senior member
May 14, 2014
398
0
56
#22
I just hope transistors die off as fast as possible and we find something else to do the logic work. It was mechanical first, then electronic: vacuum tubes, and next transistors.

If this era is really ending after 50 years (we had a hint with the end of frequency scaling), it's better to look around at all the innovative solutions.
Intel's 3D XPoint is vaguely interesting, considering it completely ignores transistors and uses the extremely advanced lithography knowledge we have: 5nm may look small, but why the heck do you still need thousands of atoms or more (speaking of volume here) just to decide 1 or 0?

That's the paradigm we have to fight, well before quantum computers or similar things.
It still baffles me that we haven't gotten past that and introduced more complex, yet faster, mechanisms when we can already work with few-nanometer precision.
Is analog computing still dead? What about multiple voltages on the same circuit line? So many physical possibilities still open, exciting times ahead! :D
 
Nov 20, 2005
14,612
2
126
#23
In fact, it is highly likely that quantum computing will benefit very few aspects of general purpose computing.
Yeah but you could argue that "general purpose computing" doesn't need it.

Seriously, what great "killer app" is there for PCs that can't be done today? I know personally I am doing basically the same tasks I did a decade ago, only faster.

"The next big thing" according to the way that companies like Google are investing is AI, and AI is greatly helped by quantum computers.
 

sm625

Diamond Member
May 6, 2011
8,176
1
106
#24
Thus, to sum it all up, the SoC/CPU cost for Apple is anywhere from $22 to $45, plus an additional $15 for the modem.
It's crazy that an SoC not much smaller than a $350 Intel CPU only costs that much. But this sort of makes my point: who actually expects Apple SoC costs to rise every generation? Because that is what has to happen if cost per transistor stops falling, since we know each new SoC is going to have significantly more transistors. They always do.
 
Oct 14, 2003
5,932
141
126
#25
5nm may look small but why the heck do you still need thousands of atoms or more (speaking of volume here) just to decide 1-0?
It's not that easy, nowhere near it. In the big picture it is, but actually doing it is not.

It's not like those molecules are big red bricks you can just move around. Just as athletes can't shave much more time off world records because they are limited by human capabilities, the physicists and scientists working on things like CPUs are limited by physics.

The reason we call those guys smart is that they make such complex things easy to understand for the rest of us. And if they can't do it, do we have any reason to complain? :)
 
