The end of Moore's Law according to DARPA

seitur

Senior member
Jul 12, 2013
383
1
81
Generally I do agree.

ALTHOUGH

What imo could restart Moore's Law is if some new substance created a whole new dimension of performance in computing. What do I mean?
For example: if the problems with graphene were someday overcome, and if it really achieved what some say it might (like CPUs running at a few hundred GHz), then computing as a whole could change, providing possibilities that exceed current technology and in turn restarting Moore's Law.

Of course, it doesn't have to be new substances; it could also be new ways of doing computing: organic-based processing, quantum computing, systems other than binary, and so on.

Anyway, as for current progress based on silicon systems and shrinkage: yes, it's coming to an end.
 
Last edited:

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Physics will end Moore's law eventually regardless of economics.
Well, duh. Natural laws of the universe are predetermined -- we can't play God and write our own rules. The best we can do is figure out what the rules are -- which is the point of mathematics, physics, and other sciences.

On another note: Hold up, Hot Chips is going on right now?!? How the hell did I forget that?!?
______________________________

Question: Why doesn't Intel buy ASML? I'm ignoring the question of whether they have the finances to do so or not for now, but wouldn't it be tremendously advantageous for them?
 
Last edited:

JimmiG

Platinum Member
Feb 24, 2005
2,024
112
106
What about new materials? We've been building chips in pretty much the same way since integrated circuits were invented, like 50 years ago. Sure, the techniques are incredibly refined now compared to early microchips, but there must be a better way...
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
In direct response to the topic, I do think that Moore's Law will end within the next decade, unless some particularly massive changes are made to the economic side of the equation.

First, let me point out how small Intel is. Yes, you read that correctly. Their annual revenue is $53 billion. Walmart and Exxon sit around the $450 billion mark.

Relatively speaking, Intel is a large corporation, seeing as they're ranked 54th in the Fortune 500. But they're still tiny compared to the bigger corporations out there, and they're absolutely minuscule in comparison to the US government (which has annual income tax revenue of $1.5 trillion).

Moore's Law could certainly continue on, or could even be beaten -- it's all a question of money.

Fun hypothetical scenario: If the US government had spent $1 trillion on semiconductors instead of the Iraq War, we'd have 7nm chips in our hands right now (or better). I'm not intending for this to be a political debate, but consider that we could be 8+ years ahead with a "simple" shift in priority. In reality, such a move is anything but simple, but if you were to stretch the truth, you could say that the Iraq War cost the US 8 years of technological progress. Now, was the Iraq War worth it? Perhaps, but that is a debate that needs to be avoided on this forum.

In order for Moore's Law to continue, Intel's profits and revenues are going to have to increase dramatically. More funding is going to need to go into photolithography firms like ASML.
What about new materials? We've been building chips in pretty much the same way since integrated circuits were invented like 50 years ago. Sure the techniques are incredibly refined now compared to early microchips, but there must be a better way..
New materials suffer from the same problem: money. There are better ways... we just don't have the money to put a graphene/[insert supermaterial here] SoC in everybody's hands.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Fun hypothetical scenario: If the US government had spent $1 trillion on semiconductors instead of the Iraq War, we'd have 7nm chips in our hands right now (or better).

I wonder if that'd really happen. There have to be limits to how much you can accelerate a development schedule by throwing more money at it, similar to how there are limits to how quickly you can solve most problems by throwing more cores at them.
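The cores analogy has a formal counterpart in Amdahl's law, which caps the speedup you can buy with more workers once part of the job is inherently serial. A minimal sketch (the 10% serial fraction is just an illustrative number, not anything from the thread):

```python
# Amdahl's law: speedup from n workers when a fraction s of the job is
# inherently serial. The analogy: some R&D steps (experiments, validation)
# can't be parallelized no matter how much money you throw at them.
def speedup(n_workers, serial_fraction):
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_workers)

print(speedup(16, 0.10))     # 6.4x with 16 workers
print(speedup(10**6, 0.10))  # ~10x: a million workers barely helps
```

Even with unlimited workers, a 10% serial fraction caps the speedup at 10x, which is why the schedule doesn't scale arbitrarily with budget.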
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Yep, there's more to it than just money when it comes to the speed of process progress.

One hypothetical situation I toyed with in the past was that GlobalFoundries would end up the eventual fab winner simply by being bankrolled by an oil-rich nation. I wonder, if that were to happen (say Intel runs out of cash at 10nm and GF threatens to go lower), whether the US government would step in and save Intel, based on the simple fact that they would rather not have Abu Dhabi in control of chip manufacturing.

I could think of worse things to spend trillions of dollars on, as Homeles indicated. We'll see, but I have a feeling there might be nationalizing of fabs at one point in the future.

I'm sure I was getting hell on the forum recently for even suggesting Intel might stop with fab progression btw! :D
 
Last edited:

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
I wonder if that'd really happen. There have to be limits to how much you can accelerate a development schedule by throwing more money at it, similar to how there are limits to how quickly you can solve most problems by throwing more cores at them.

It's no different from the Manhattan Project (the development of the atomic bomb) or the moon landings... if you take away the requirement that the endeavor be economically viable in an open-market sense, then you can sweep decades of "slow-and-steady" development aside and get things done rather quickly.

Moore's law is literally nothing but a measure of the rate at which node development can be done in an economically viable fashion.

Why was the pace initially 12 months when Moore first observed it? Why did it then become 18 months, and then 24 months?

Why was a node shrink itself roughly defined to be that which yields ~70% linear shrink, why not 80%, why not 40%?

These are all rhetorical questions of course because the answer is really quite simple (and should be obvious).

The very observation of Moore's law by Moore was simply a measure of the absolute dollar amount that execs were willing to pump into R&D at the time versus the economic viability of the output from that R&D.

Implicit in that was the requirement that R&D's budget would grow at whatever CAGR necessary so as to sustain the 12 month pace, and implicit in the assumption that execs would continue to sign-off on such an R&D budget CAGR was the requirement that the TAM for the semiconductor market itself would grow at a commensurate CAGR.

A DARPA employee noting that economics seems to be critical to Moore's law is akin to someone noting that oxygen appears to be critical to the survival of a human being. It isn't news to anyone in the field, but I suppose it might come as a shock for anyone still coming to terms with how R&D and process node development actually gets done.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Of course more money would accelerate it to some extent; I just wonder what the practical limits are. I think we can both agree it doesn't scale arbitrarily. The space race really is a striking illustration, but there it wasn't so much a matter of moving to constrained economic viability as moving to almost zero economic viability, at least until recently. And it's not like they would have gotten a man on the moon in a year if the money had been 20x greater. I also don't think they would have managed with fewer accidents if they had pushed it faster with more money; validation in particular is one of the more serializing steps. Those $20k parts with extensive screening and 50-week lead times probably couldn't be delivered in 5 weeks even if you paid $1M.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
I wonder if that'd really happen. There have to be limits to how much you can accelerate a development schedule by throwing more money at it, similar to how there are limits to how quickly you can solve most problems by throwing more cores at them.
Well, EUV has been technically possible for a while now, and working EUV machines have been available since 2011. They simply cost too much, and can't process enough wafers per hour to be economical.

The current problem is one that could be solved by throwing money at it. Also, I believe Intel's 10nm R&D is all but finished from a technical standpoint. Now it's just a matter of porting the designs over and getting the defect density down.
 
Last edited:

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
A DARPA employee noting that economics seems to be critical to Moore's law is akin to someone noting that oxygen appears to be critical to the survival of a human being. It isn't news to anyone in the field, but I suppose it might come as a shock for anyone still coming to terms with how R&D and process node development actually gets done.
It doesn't come as a shock, per se, but I had never considered the Intel factor before. Unless something changes with semiconductor fabs in the next decade, such that Intel is no longer the world's leading producer, Moore's Law is over when Intel says it's over. I'm accustomed to thinking more broadly -- that Moore's Law is over when no one can find a way to further improve fabrication. So the fact that Intel really holds the keys is something of a minor revelation.
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
Clearly, anyone who thinks about it a bit and does the math can quickly understand that things that double every two years can't continue forever -- or we'd have more transistors on a chip than there are atoms on Earth. In fact, if my back-of-the-envelope estimate is about right, we'd have more transistors on a chip than atoms on Earth in about 160 years -- but, as happens every time I do math on the internet, I'm sure I'm wrong and someone will rub my face in it in a few posts. :) Regardless of the actual answer, it's a clear fact that anything "doubling every two years" can't last forever -- or even very long at all -- before hitting ludicrously high numbers. So, yes, Moore's Law will end -- and no one denies this. And, similarly, as IDC posted, the idea that economics is tied to Moore's Law is also fairly obvious to anyone thinking about it. In fact, the first figure in Gordon Moore's original paper was about the cost per transistor over time -- so he was focused primarily on manufacturing costs. (Original paper here: http://web.eng.fiu.edu/npala/EEE5425/Gordon_Moore_1965_Article.pdf -- fun reading, actually.)
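For what it's worth, that back-of-the-envelope check can be done in a couple of lines. The figures below are just illustrative assumptions (roughly 1.3e50 atoms in the Earth, a ~5-billion-transistor chip as the circa-2013 starting point), not numbers from the thread:

```python
import math

atoms_on_earth = 1.3e50   # commonly cited order-of-magnitude figure (assumption)
transistors_now = 5e9     # high-end chip, circa 2013 (assumption)

# Number of doublings until transistor count exceeds atoms on Earth,
# then convert to years at a 2-year doubling cadence.
doublings = math.log2(atoms_on_earth / transistors_now)
print(2 * doublings)  # roughly 270 years
```

Under these assumptions it comes out closer to ~270 years than 160, but the conclusion is the same either way: a doubling every two years cannot continue for very long on any meaningful timescale.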

But those two points aside, one of the most common things for technologists to prognosticate about is when "The End" will come. For as many people who try to predict the end of the world (last Dec, for example), there seem to be even more who are predicting the end to Moore's Law.

Coincidentally, I spent my lunch today watching an internal webcast from a researcher at Intel about all the methods under investigation to continue to drive semiconductor fabrication well under 10nm, and I'm increasingly convinced of the fairly obvious point of the original article -- that we will hit an economic limit before we hit a physics limit. But one thing the people who predicted the end of scaling at 1 micron decades ago failed to credit is that people are very, very ingenious... particularly when there's a lot of money involved. Gordon Moore once said, and I paraphrase, "At any given point, we would look ahead 10 years, and it was very unclear how scaling could continue beyond that. The next 10 years forward were pretty obvious to us, but beyond 10 years the obstacles seemed insurmountable. And then 10 years would pass, and once again we could see clearly the path for another 10 years." And I can echo Dr. Moore's words... from where I sit, Moore's Law will continue for another 10 years, but beyond that it's unclear how it can continue. But undoubtedly there will be a lot more researchers publishing papers declaring "the end is nigh".

It might be interesting to plot the number of published papers and articles declaring the end of Moore's Law as a function of time. I wonder if it doubles every 18 months...
 
Last edited:

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Moore's Law can only carry on for 11 more-ish full nodes anyway, and that's an overly-optimistic number.

Given that 14nm lands next year, and assuming the van der Waals radius of a silicon atom is 210pm, we end up being 21 years away (2034) from the point where shrinking further means we're not even dealing with silicon anymore, or ~16 years away (2029) if we assume an 18-month cadence.

However, we'll hit the EOT limit well before then. The 5nm node is currently seen as the end of the road for Moore's Law. After that, physicists will have to get creative.

Could they get creative in time? Sure, but the end of the road as we can see it is around 2020, with the 5nm node. Even if Moore's Law doesn't end in 2020, there's only 7 full nodes left.
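The node arithmetic above can be checked directly, using the figures from the post and the conventional ~0.7x linear shrink per full node (the 2014 start year is an assumption based on "14nm lands next year"):

```python
import math

feature_nm = 14.0                # 14nm node, landing ~2014
atom_diameter_nm = 2 * 0.210     # van der Waals diameter of a silicon atom

# How many 0.7x linear shrinks until the feature size reaches one atom?
nodes_left = math.log(atom_diameter_nm / feature_nm) / math.log(0.7)
print(nodes_left)                     # ~9.8 full nodes
print(2014 + 2 * round(nodes_left))   # 2034 at a 2-year cadence
```

That reproduces the ~2034 figure in the post, and about 10-11 full nodes before the feature size is literally a single silicon atom.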
 
Last edited:

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
It doesn't come as a shock, per-se, but I had never considered the Intel factor before. Unless something changes with semiconductor fabs in the next decade, such that Intel is no longer the world's leading producer, Moore's Law is over when Intel says it's over. I'm accustomed to thinking more broadly, and that Moore's Law is over when no one can find a way to further improve fabrication. So the fact that Intel really holds the keys is something of a minor revelation.

It is true, but only if you take a static snapshot of the marketspace today and extrapolate from there.

Had you done that in the 1960's for example then you'd have concluded Eastman Kodak would surely be a monopoly in the digital camera market today.

Intel is THE leading-edge game in town...today.

But they are by no means the only well-monied player in the semiconductor market.

It wouldn't take much to motivate an Apple ($145B cash on hand) or a Qualcomm ($22B cash on hand) to dive into process technology and fab ownership if it became painfully obvious that their future was otherwise in jeopardy.

Moore's Law can only carry on for 11 more-ish full nodes anyway, and that's an overly-optimistic number.

Given that 14nm lands next year, and assuming the van der Waals radius of a silicon atom is 210pm, we end up being 21 years away (2034) from the point where shrinking further means we're not even dealing with silicon anymore, or ~16 years away (2029) if we assume an 18-month cadence.

However, we'll hit the EOT limit well before then. The 5nm node is currently seen as the end of the road for Moore's Law. After that, physicists will have to get creative.

Could they get creative in time? Sure, but the end of the road as we can see it is around 2020, with the 5nm node. Even if Moore's Law doesn't end in 2020, there's only 7 full nodes left.

Remember, Moore's law isn't about scaling...that is simply a means to the end.

Moore's law is about cutting the cost in half, per component, in an IC.

To date the means to that end has been scaling. Don't conflate scaling (and the implications of the physics when doing that with atoms) with Moore's law as if they were interchangeable terms, because they really are not.

Look no farther than V-NAND for a traditional example of how one can continue to reduce costs per component (bit, in this example) without relying on dimensional scaling.

Moore's law dies when people run out of ideas on how to make things cost less on a per-component basis. That is all one can say about it if one is talking about Moore's law.
 
Last edited:

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
It is true, but only if you take a static snapshot of the marketspace today and extrapolate from there.

Had you done that in the 1960's for example then you'd have concluded Eastman Kodak would surely be a monopoly in the digital camera market today.

Intel is THE leading-edge game in town...today.

But they are by no means the only well-monied player in the semiconductor market.

It wouldn't take much to motivate an Apple ($145B cash on hand) or a Qualcomm ($22B cash on hand) to dive into process technology and fab ownership if it became painfully obvious that their future was otherwise in jeopardy.
Indeed, that's a very good point. But Intel has been the leading semiconductor manufacturer for what, 20 or 30 years now?

Given what we know about their roadmaps they certainly aren't going to let the gap narrow in the next couple of years; if anything their lead is growing. Beyond that anything is possible, but I certainly wouldn't expect Intel to fall before the economics of maintaining Moore's Law become as big a concern as DARPA is predicting.
 
Last edited:

GreenChile

Member
Sep 4, 2007
190
0
0
The driving force behind Moore's Law has been economies of scale. Basically, what it boils down to is engineering a way to add more transistors to a chip without the cost to produce it going up. Traditionally this has been done by shrinking the area of each transistor by approximately 50% per node, which should allow, all other things being equal, twice the number of transistors for the same die size.
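In numbers: the traditional "full node" is a ~0.7x shrink in each linear dimension, which is what produces the roughly 50% area reduction and transistor doubling described above. A quick sketch:

```python
# A ~0.7x linear shrink per node halves each transistor's area,
# so a fixed die holds roughly twice as many transistors.
linear_shrink = 0.7
area_ratio = linear_shrink ** 2   # ~0.49: each transistor takes half the area
density_gain = 1 / area_ratio
print(density_gain)               # ~2.04x transistors per die
```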

This works great until the physical limits of shrinkage are reached. But then what? One obvious alternative will be to start building the transistors in multiple layers.

Up to this point I've only heard about chip stacking, which involves building two separate chips, grinding the backside of one down so it's really thin, and gluing it onto the top of the other; the two are then electrically connected. But this technique is flawed for scaling purposes in that two dies' worth of silicon are used to make one chip.

I foresee the future of silicon scaling involving the building up of multiple layers of transistors. This can have the same effect of doubling the number of transistors for a given die area without the need to shrink the physical dimensions of the transistor. So Moore's Law does not have to die just because we reach the limits of shrinking.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Up to this point I've only heard about chip stacking, which involves building two separate chips, grinding the backside of one down so it's really thin, and gluing it onto the top of the other; the two are then electrically connected. But this technique is flawed for scaling purposes in that two dies' worth of silicon are used to make one chip.

I foresee the future of silicon scaling involving the building up of multiple layers of transistors. This can have the same effect of doubling the number of transistors for a given die area without the need to shrink the physical dimensions of the transistor. So Moore's Law does not have to die just because we reach the limits of shrinking.

Check out V-NAND :) It's not stacked chips or stacked wafers, but truly stacked transistors on the same chip.

Cool Samsung video here (definitely check out the video)

And slides from Goto-san.

[Slide: 01.png]

60% CAGR is a heck of a ride, and it is what pays for the ballooning R&D expense.

[Slide: 05.png]

V-NAND will combine both linear scaling (just not as aggressively as in the past) as well as stack height scaling to lower the cost-per-bit for the foreseeable future.

[Slide: 06.png]

The "vertical view" is a stack of 24 NAND gates (rather impressive proof-of-concept stuff for the future).

[Slide: 16.jpg]

Another nice cross-section view of the device which is already in production:

[Slide: 13.jpg]

And what is the motivation? Money, naturally!

[Slide: 23.png]

^ so long as the CAGR for NAND continues to grow at absurdly high rates (like the current 60%), the TAM will grow right along with it, and with the TAM growth goes the growth in economically viable R&D budgets.
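A 60% CAGR compounds remarkably fast; a quick sketch of what that rate implies:

```python
import math

# At a 60% compound annual growth rate, the TAM (and with it, the
# economically viable R&D budget) doubles in well under two years.
cagr = 0.60
doubling_time = math.log(2) / math.log(1 + cagr)
print(doubling_time)    # ~1.47 years to double
print((1 + cagr) ** 5)  # ~10.5x growth over five years
```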
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Remember, Moore's law isn't about scaling...that is simply a means to the end.

Moore's law is about cutting the cost in half, per component, in an IC.

To date the means to that end has been scaling. Don't conflate scaling (and the implications of the physics when doing that with atoms) with Moore's law as if they were interchangeable terms, because they really are not.

Look no farther than V-NAND for a traditional example of how one can continue to reduce costs per component (bit, in this example) without relying on dimensional scaling.

Moore's law dies when people run out of ideas on how to make things cost less on a per-component basis. That is all one can say about it if one is talking about Moore's law.
I know what Moore's Law is supposed to be about; the issue is that it's already dead when you look at it in that light. Cost per transistor has essentially leveled off for fabless companies -- I'd imagine interest in TSMC's 20nm right now is for performance and space advantages, and has little to do with lowering cost/transistor.
 
Last edited:

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
It's no different from the Manhattan Project (the development of the atomic bomb) or the moon landings... if you take away the requirement that the endeavor be economically viable in an open-market sense, then you can sweep decades of "slow-and-steady" development aside and get things done rather quickly.

Moore's law is literally nothing but a measure of the rate at which node development can be done in an economically viable fashion.

Why was the pace initially 12 months when Moore first observed it? Why did it then become 18 months, and then 24 months?

Why was a node shrink itself roughly defined to be that which yields ~70% linear shrink, why not 80%, why not 40%?

These are all rhetorical questions of course because the answer is really quite simple (and should be obvious).

The very observation of Moore's law by Moore was simply a measure of the absolute dollar amount that execs were willing to pump into R&D at the time versus the economic viability of the output from that R&D.

Implicit in that was the requirement that R&D's budget would grow at whatever CAGR necessary so as to sustain the 12 month pace, and implicit in the assumption that execs would continue to sign-off on such an R&D budget CAGR was the requirement that the TAM for the semiconductor market itself would grow at a commensurate CAGR.

A DARPA employee noting that economics seems to be critical to Moore's law is akin to someone noting that oxygen appears to be critical to the survival of a human being. It isn't news to anyone in the field, but I suppose it might come as a shock for anyone still coming to terms with how R&D and process node development actually gets done.

Eh, there's a limit to getting things done fast. There are only so many experts in the world in a given field, and they can only theorize and experiment so fast.
Building tools, and rebuilding them when they're wrong, takes a certain amount of time too.
Plus, isn't TSMC already funded by its government? Same with the Chinese fabs? Free money alone isn't a guarantee of success; there needs to be an established knowledge base, and those experts need a certain minimum amount of time.
 

GreenChile

Member
Sep 4, 2007
190
0
0
Checkout V-NAND :) It is not stacked chips or stacked wafers, but truly stacked transistors on the same chip.

Cool Samsung video here (definitely check out the video)

And slides from Goto-san.
Wow, that is a truly innovative technique Samsung is demonstrating there. It's a very good representation of the direction scaling will need to go in the not-so-far-off future. You can be sure it's something all the chip manufacturers are researching right now.

One obvious concern with this technique is heat buildup. Each additional layer multiplies the current density for a given area of silicon, so temperatures can potentially ramp out of control quickly. You'll need some very low-power transistors for this to be a viable option.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,637
748
126
Remember, Moore's law isn't about scaling...that is simply a means to the end.

Moore's law is about cutting the cost in half, per component, in an IC.

To date the means to that end has been scaling. Don't conflate scaling (and the implications of the physics when doing that with atoms) with Moore's law as if they were interchangeable terms, because they really are not.

Look no farther than V-NAND for a traditional example of how one can continue to reduce costs per component (bit, in this example) without relying on dimensional scaling.

Moore's law dies when people run out of ideas on how to make things cost less on a per-component basis. That is all one can say about it if one is talking about Moore's law.

That's right. However, I'm more interested in the rate of future performance increases, since at some point the price doesn't matter much (e.g. whether a desktop CPU costs $50 or $100 is not that big a deal). There might not be a name for that law, but what can we expect? It seems like we've more or less already hit the wall, based on recent years' performance improvements.

Also, a lower price for the same CPU does not sell new computers. So what are Intel and the other semiconductor companies going to do about that? I mean, if I already have a desktop computer whose CPU cost $100 at the time of purchase, I'm not going to buy a new one just because it now costs $50, if the performance is the same... ;)
 
Last edited:

jhu

Lifer
Oct 10, 1999
11,918
9
81
Eh, there's a limit to getting things done fast. There are only so many experts in the world in a given field, and they can only theorize and experiment so fast.
Building tools, and rebuilding them when they're wrong, takes a certain amount of time too.
Plus, isn't TSMC already funded by its government? Same with the Chinese fabs? Free money alone isn't a guarantee of success; there needs to be an established knowledge base, and those experts need a certain minimum amount of time.

Taiwan provided half of TSMC's startup costs in the beginning (back in the 1980s). Now I don't think they get anything more than tax breaks, just like other companies.