Celebrating 50 Years of Moore's Law (IEEE Spectrum)

witeken

Diamond Member
Dec 25, 2013
SPECIAL REPORT: 50 YEARS OF MOORE’S LAW

Moore's Law, arguably the best-known but least-understood law in all of technology, is celebrating its 50th anniversary in 2015, an enormous milestone for such an exponential trend -- and it isn't going to end for at least another 10 years, according to Intel fellow Mark Bohr.

"In development and research, we see scaling continues at least another 10 years, which is the same answer we gave 10 and 30 years ago. It’s [always] hard to see beyond 10 years.

"Although it gets harder every time, we are still developing technology that’s lower cost-per-transistor than the previous node. I remember when one micron was terrifying to us. Today our 22 nm process is Intel’s highest-yielding, lowest-defect technology. In a year or so, our 14 nm process will match that, but it will take a lot of work."

The heady challenges of design at 14 nm made the node later than expected for Intel, closer to a three-year than to Intel’s typical two-year cadence. “We don’t expect we’ll have similar problems at 10 nm, because we’ve learned and we’re trying harder,” he said.

7 nm is already on the roadmap with no slowdown planned (time to market may slip, but we'll see), with or without EUV.


For this anniversary, IEEE Spectrum is publishing a series of articles over the coming months. Of those already online, I liked The Multiple Lives of Moore’s Law the most.

This isn't the first article on Moore's Law, though. IEEE Spectrum (and other sites, of course) has published numerous articles on the subject, including The Status of Moore's Law: It's Complicated and Shrinking Possibilities. For people who want to see Moore's Law in reality, Chipworks has a 2-part blog with images.


A few words on Moore's Law. Moore's Law is often confused with Dennard scaling. IEEE Spectrum also has some great articles related to that, including The Amazing Vanishing Transistor Act, The High-k Solution, Transistor Wars (FinFET), and Changing the Transistor Channel. Dennard scaling, although almost unknown even to people who know Moore's Law, is what should actually matter most to enthusiasts: it dictates speed and power, whereas Moore's Law is really an economic law. Loosely speaking, Moore's Law states that the cost of a transistor decreases exponentially, which is achieved by shrinking transistors so that more of them fit on a die -- especially helpful for manycore designs like GPUs -- but it says nothing about transistor performance or power.
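To make the distinction concrete, here's a minimal sketch of what idealized Dennard scaling predicts per node. The scaling factor k = sqrt(2) is the textbook value, not a number from any roadmap:

```python
# Idealized Dennard scaling: shrink every linear dimension and the supply
# voltage by k per node (k = sqrt(2) is the textbook assumption here).
k = 2 ** 0.5

# Moore's Law side of the story: the same die area holds k^2 = 2x transistors.
density_gain = k ** 2                       # ~2.0x transistors per area

# Dennard's side: each transistor also gets faster and cheaper to power.
delay_scale = 1 / k                         # gate delay drops ~1.4x
power_scale = 1 / k ** 2                    # power per transistor halves
power_density = density_gain * power_scale  # ~1.0: stays constant

print(density_gain, delay_scale, power_density)
```

The last line is the whole point: as long as voltage scaled with dimensions, chips got denser and faster at constant power density. When voltage scaling stalled in the mid-2000s, Dennard scaling ended while Moore's Law (the cost/density part) continued.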

One effect of Moore's Law, however, has been industry consolidation as costs have kept (and still keep) increasing. It's a common misconception that this will slow or end Moore's Law through reduced competition; the reverse is true: consolidation concentrates revenue in a few companies, which helps perpetuate Moore's Law because those companies can invest more. It's also worth noting that Moore's Law is good for those companies, since it reduces their costs.

Moore's Law isn't only used for transistors. Other exponential trends have been observed, like energy efficiency: Outperforming Moore's Law. More articles can also be found on ExtremeTech. One last notable link concerning Moore's Law is this presentation from former Intel CEO Craig Barrett: A Historical Perspective on Semiconductors and Moore's Law.


Intel will also be celebrating Moore's Law, presumably at Computex and surely at IDF in August. IDF Shenzhen is next week. I'll leave you with this one:

“The first one is Moore's Law, and I'll talk a little bit about that this morning, but Bill's gonna give you a really fun, in-depth discussion of Moore's Law for those who really enjoy it. [laughing audience] I do wanna make a point though: next year is the 50th anniversary of Moore's Law, and I don't think a lot of people think about this being a law that's been around for 50 years. And through that time, my 30-year engagement with that 50 years of myself personally as an engineer at Intel, I can tell you many times people've talked about the Law ending. Our job at Intel is to make sure it lives on for as long as possible. But 50 years is a momentous milestone and we will be doing some things next year to recognize that.” --Brian Krzanich, CEO Intel, IM’14

PS: When do you think Moore's Law will end? ;)
 

AnandThenMan

Diamond Member
Nov 11, 2004
Moore's Law, arguably the most-known but least-understood law in all of technology, is celebrating its 50th anniversary in 2015...
One thing that is certainly misunderstood is that "Moore's Law" is not actually a law at all but an observation. As for it ending, we're getting there: node shrinks are getting more and more difficult, and at some point (relatively soon) there will be nowhere to go as far as increasing transistor density.
 

witeken

Diamond Member
Dec 25, 2013
One thing that is certainly misunderstood is that "Moore's Law" is not actually a law at all but an observation.
No, this is also misunderstood. Moore's Law isn't an observation, it's a prediction. The exponential doubling had only been going on for a few years (which isn't difficult when transistor counts are <50) when he made his prediction.

As for it ending, we're getting there: node shrinks are getting more and more difficult, and at some point (relatively soon) there will be nowhere to go as far as increasing transistor density.
There will be lots of innovation left even after the ending of density scaling.
 

AnandThenMan

Diamond Member
Nov 11, 2004
No, this is also misunderstood. Moore's Law isn't an observation, it's a prediction.
http://en.wikipedia.org/wiki/Moore's_law

"Moore's law" is the observation that, over the history of computing hardware, the number of transistors in a dense integrated circuit doubles approximately every two years.

Although this trend has continued for more than half a century, "Moore's law" should be considered an observation or conjecture and not a physical or natural law.
It's an observation that morphed into a prediction, but it is not a law.
 

positivedoppler

Golden Member
Apr 30, 2012
True. Laws exist in nature/physics and are backed by indisputable mathematics. It's more like a near-linear path that Moore guessed at and recommended the industry follow. It's actually disappointing that there has been no true breakthrough in technology that allows us to leapfrog his expectation of 50 years ago. I want robots and holodecks by now.
 

Fjodor2001

Diamond Member
Feb 6, 2010
Dennard scaling, although almost unknown even to people who know Moore's Law, is what should actually matter most to enthusiasts: it dictates speed and power, whereas Moore's Law is really an economic law. Loosely speaking, Moore's Law states that the cost of a transistor decreases exponentially, which is achieved by shrinking transistors so that more of them fit on a die -- especially helpful for manycore designs like GPUs -- but it says nothing about transistor performance or power.

Correct. But up until the last 5-10 years, Moore's Law also correlated with increases in CPU performance. So previously it used to be correct to say something like "CPU performance approximately doubles every 18 months", even though that was not what Moore's Law explicitly stated. I guess that fact has added to the confusion. I.e. although the performance prediction was not what Moore's Law was talking about, it proved correct anyway for a long period of time. ;)
 

witeken

Diamond Member
Dec 25, 2013
http://en.wikipedia.org/wiki/Moore's_law


It's an observation that morphed into a prediction, but it is not a law.

Well, it's just semantics. You can't make (scientific) predictions without first having some data (from observations). Moore already made his famous prediction in that 1965 paper. It's a law insofar as, if you shrink the transistor's linear dimensions by a factor of sqrt(2), you can fit 2x as many transistors on the die (at least that part really is a law), which you can keep repeating at regular intervals -- until you can't anymore.
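The compounding in that last sentence is easy to make concrete. A quick sketch, using the common 2-year doubling cadence (Moore's original 1965 formulation said every year; he revised it to two years in 1975):

```python
import math

# One sqrt(2) linear shrink doubles transistor density; repeating it on a
# regular cadence compounds. Here: 50 years of 2-year doublings.
per_node = math.sqrt(2) ** 2      # 2x transistors per shrink
years, cadence = 50, 2
doublings = years // cadence      # 25 doublings in 50 years
growth = per_node ** doublings    # 2^25, roughly a 33-million-fold increase

print(f"{growth:.3g}")
```

Which is about the right order of magnitude: from a few dozen transistors in the mid-1960s to billions today.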
 

krumme

Diamond Member
Oct 9, 2009
Intel spokesperson:
“Moore’s law is dead, we killed it.”
...
“Creating new process nodes takes large and consistent investments into leading edge technologies and facilities. With our new business plan and corporate cost structure we have a chance to dispense with those investments. We currently have the brightest and most technically capable employees money can buy. Fortunately for our investors the board of directors with the help of their friends at large financial institutions had the courage to challenge our need for those expenditures. With the ground breaking technical and cultural success that Broadwell has proven itself to be, we no longer see a reason to keep chasing the Dragon.”
...
“Intel sees its core strengths as its marketing, management, and human resources staff.”

Source SA
http://semiaccurate.com/2015/04/01/intel-announces-the-end-of-its-core-product-line/
 

ninaholic37

Golden Member
Apr 13, 2012
I always thought it was called a "Law" because Intel made a rule that it must do it (double the transistor count every 2 years or whatever), as a challenge to themselves to keep the process evolving. Sort of like the "law" that you have to renew your fishing or driver's license every x amount of years. So if they failed, they'd be breaking their own law or code or whatever. Celebrating 50 years of Moore's Law, to me, is saying that they managed to succeed at it for 50 years. I never really paid much attention, though.
 

witeken

Diamond Member
Dec 25, 2013
Intel expects to continue supplying the PC market with its Broadwell chips indefinitely and will complete the roll-out of Broadwell’s derivative products like Broadwell-E, EP, and EX within the next two years. “This is not the end of microprocessors, but it is the beginning of soaring profitability and corporate margins.”

Awesome, best April Fools' joke yet.

To meet the promises that Intel made during its last quarterly call to boost gross margins by a full ten basis points the company has announced pricing increases across its product portfolio. As of May 1st the price of Intel’s top-tier Core i7-5960X chip will be increasing from an austere $1,049 to a luxurious $9,999 and 3.14159 Bitcoins.

“We want our marketing to match our products and our pricing. The days of selling processors as cheap as chips are over. If our customers want Extreme Edition CPUs then they’re going to want to pay an extreme price for that honor.”

But the lack of any mention of Skylake makes it suspicious.
 

Fjodor2001

Diamond Member
Feb 6, 2010
Moore's Law is really an economic law. Loosely speaking, it states that the cost of a transistor will decrease exponentially
At what rate? If I remember correctly, the transistor cost should be cut in half every 18 months.

Doesn't seem to be that way in reality, though, when looking at some rough estimates:

2008Q1: Core 2 Quad Q9300, 456 million transistors, $200. => ~$439 per billion transistors.
2013Q2: Haswell Core i7-4770K, 1.4 billion transistors, $350. => $250 per billion transistors.

So the price in 2013 was 250/439 = 0.57 of that in 2008. But 63 months passed between those dates, so the ratio should be 0.5^(63/18) = 0.5^3.5 = 0.088 according to Moore's Law.

In other words, the price per transistor is 0.57/0.088 = 6.5 times as high as Moore's Law would predict.

Having said this, the calculations above are of course the price to the end customer. And I assume Moore's law is talking about the production cost per transistor? So does it mean Intel is making more profit per sold transistor these days, or what's the explanation?

Disclaimer: The calculations above are just estimates, and other CPUs could probably have been chosen for the calculations. But I think it should be enough to prove the point.
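For anyone who wants to redo the arithmetic, here is the same estimate as a quick script. The prices and transistor counts are the rough figures from this post (retail prices, not production costs):

```python
# Rough price-per-transistor comparison from the two data points above.
price_2008, transistors_2008 = 200, 456e6   # Core 2 Quad Q9300, 2008Q1
price_2013, transistors_2013 = 350, 1.4e9   # Core i7-4770K, 2013Q2

dollars_per_billion_2008 = price_2008 / (transistors_2008 / 1e9)  # ~$439
dollars_per_billion_2013 = price_2013 / (transistors_2013 / 1e9)  # $250

actual = dollars_per_billion_2013 / dollars_per_billion_2008  # ~0.57
predicted = 0.5 ** (63 / 18)  # halving every 18 months, over 63 months: ~0.088

print(f"actual x{actual:.3f} vs predicted x{predicted:.3f}; "
      f"gap ~{actual / predicted:.1f}x")
```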
 

witeken

Diamond Member
Dec 25, 2013
Fjodor2001: I think you can do better than that.

A node shrink roughly doubles density (not sure if anyone wants to research the average for Intel over the past few nodes, but according to Intel it's ~0.53x area scaling), while wafer cost rises only ~1.1x, and more recently closer to 1.3x. The gap between new nodes is slightly longer than 2 years, which gives a slightly lower than 2x density improvement per 2 years and a somewhat smaller cost-per-transistor improvement (on average).
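In cost terms, the trade-off just described works out like this. A sketch using the 2x density and 1.1x/1.3x wafer-cost figures quoted above (actual node economics vary):

```python
# Cost per transistor across one node shrink: wafer cost up, but the
# wafer now holds roughly twice as many transistors.
density_gain = 2.0  # transistors per wafer roughly double per node

for wafer_cost_gain in (1.1, 1.3):
    cost_ratio = wafer_cost_gain / density_gain
    print(f"wafer cost x{wafer_cost_gain}: cost/transistor x{cost_ratio:.2f}")
```

So even at 1.3x wafer cost, each node still cuts cost per transistor by ~35%, down from the ~45% cut you'd get at 1.1x.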

I don't really know how Intel does it, unless prior nodes were quite a bit below the 2x, but apparently that 1.3x wafer cost doesn't stop them from accelerating the cost reduction trend:

[Attached image: Intel cost reduction trend, fall 2014]
 

Idontcare

Elite Member
Oct 10, 1999
Is that a real answer or an allusion to the fact that people always claim Moore's Law is 10 years away from ending?

It was a joke; there is no limit to how inexpensive a transistor can be. Sometimes they are so cheap you actually get paid to take them!
 

Mortius

Junior Member
Dec 4, 2013
At what rate? If I remember correctly, the transistor cost should be cut in half every 18 months.

Doesn't seem to be that way in reality, though, when looking at some rough estimates:

2008Q1: Core 2 Quad Q9300, 456 million transistors, $200. => ~$439 per billion transistors.
2013Q2: Haswell Core i7-4770K, 1.4 billion transistors, $350. => $250 per billion transistors.

So the price in 2013 was 250/439 = 0.57 of that in 2008. But 63 months passed between those dates, so the ratio should be 0.5^(63/18) = 0.5^3.5 = 0.088 according to Moore's Law.

In other words, the price per transistor is 0.57/0.088 = 6.5 times as high as Moore's Law would predict.

Having said this, the calculations above are of course the price to the end customer. And I assume Moore's law is talking about the production cost per transistor? So does it mean Intel is making more profit per sold transistor these days, or what's the explanation?

Disclaimer: The calculations above are just estimates, and other CPUs could probably have been chosen for the calculations. But I think it should be enough to prove the point.

You are assuming that either
a) Production cost is the sole cost involved in releasing a CPU. (Good news: sales and marketing, customer support, and actually designing the thing are free.)
or
b) Every other cost scales in proportion to design cost.

Let's assume that for the Q9300, 100 engineers were responsible for designing the layout of the transistors for each square mm of the chip. That makes each engineer responsible for (456e6/16400) ~= 27,800 transistors.

Now assume that Haswell was assigned engineers at the same rate.
Each engineer would be responsible for (1.4e9/17700) ~= 79,100 transistors.

Is it reasonable to assume that Intel CPU design engineers of 2013 are capable of doing the work of ~3 2008 design engineers? Are those engineers happy with not having had a pay rise in 5 years? (If we use your scaling factor, each 2013 engineer does the same amount of work as ~11 2008 ones.)
(Ignoring any difference in CPU sales with which to amortise costs)
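That back-of-envelope can be spelled out as follows. The 100-engineers-per-mm² staffing rate is the hypothetical from above; the die areas (~164 and ~177 mm²) are implied by the divisors 16400 and 17700 in the post:

```python
# Engineer-productivity back-of-envelope, under a hypothetical staffing
# rate of 100 layout engineers per square mm of die.
eng_per_mm2 = 100
q9300 = (456e6, 164)    # (transistors, die area in mm^2), 2008
haswell = (1.4e9, 177)  # 2013

def transistors_per_engineer(transistors, area_mm2):
    return transistors / (area_mm2 * eng_per_mm2)

t2008 = transistors_per_engineer(*q9300)    # ~27,800 transistors/engineer
t2013 = transistors_per_engineer(*haswell)  # ~79,100 transistors/engineer

print(round(t2008), round(t2013), round(t2013 / t2008, 1))  # ratio ~2.8x
```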

Repeat for sales and marketing, logistics, customer support and research engineers that develop the new process. I don't think anyone claims that the latter gets cheaper.

Other than that
1. Inflation, exchange rate fluctuations, purchasing power parity etc are all ignored. A 2008 dollar is worth more than a 2013 dollar (again ignoring exchange rate fluctuations).
2. That wasn't the Q9300's launch price. In the old days Intel used to discount processors when they released one with a higher multiplier.
3. Comparing the least expensive Penryn based Core 2 Quad (at release) with the most expensive Haswell (at release) may not be the fairest comparison.
4. The Haswell CPU now incorporates what used to be the Northbridge. (An extra $26 worth of value for the G45 Northbridge, which may not have been release cost and which may or may not have been absorbed into the PCH cost.)
 

Fjodor2001

Diamond Member
Feb 6, 2010
You are assuming that either
a) Production cost is the sole cost involved in releasing a CPU. (Good news: sales and marketing, customer support, and actually designing the thing are free.)
or
b) Every other cost scales in proportion to design cost.

Let's assume that for the Q9300, 100 engineers were responsible for designing the layout of the transistors for each square mm of the chip. That makes each engineer responsible for (456e6/16400) ~= 27,800 transistors.

Now assume that Haswell was assigned engineers at the same rate.
Each engineer would be responsible for (1.4e9/17700) ~= 79,100 transistors.

Is it reasonable to assume that Intel CPU design engineers of 2013 are capable of doing the work of ~3 2008 design engineers? Are those engineers happy with not having had a pay rise in 5 years? (If we use your scaling factor, each 2013 engineer does the same amount of work as ~11 2008 ones.)
(Ignoring any difference in CPU sales with which to amortise costs)

Repeat for sales and marketing, logistics, customer support and research engineers that develop the new process. I don't think anyone claims that the latter gets cheaper.

Other than that
1. Inflation, exchange rate fluctuations, purchasing power parity etc are all ignored. A 2008 dollar is worth more than a 2013 dollar (again ignoring exchange rate fluctuations).
2. That wasn't the Q9300's launch price. In the old days Intel used to discount processors when they released one with a higher multiplier.
3. Comparing the least expensive Penryn based Core 2 Quad (at release) with the most expensive Haswell (at release) may not be the fairest comparison.
4. The Haswell CPU now incorporates what used to be the Northbridge. (An extra $26 worth of value for the G45 Northbridge, which may not have been release cost and which may or may not have been absorbed into the PCH cost.)

As I wrote, it was a rough estimate. Your considerations above are valid. But do they compensate enough for the fact that the price per transistor to the consumer is about 6-7x higher than it should be if the cost reduction progress from Moore's law was passed on to the customer? I have a hard time seeing how that could be the case...
 

Fjodor2001

Diamond Member
Feb 6, 2010
Fjodor2001: I think you can do better than that.

A node shrink roughly doubles density (not sure if anyone wants to research the average for Intel over the past few nodes, but according to Intel it's ~0.53x area scaling), while wafer cost rises only ~1.1x, and more recently closer to 1.3x. The gap between new nodes is slightly longer than 2 years, which gives a slightly lower than 2x density improvement per 2 years and a somewhat smaller cost-per-transistor improvement (on average).

I don't really know how Intel does it, unless prior nodes were quite a bit below the 2x, but apparently that 1.3x wafer cost doesn't stop them from accelerating the cost reduction trend:

[Attached image: Intel cost reduction trend, fall 2014]

So the conclusion of combining your post with mine is that:

1. Production cost per transistor has been reduced in accordance with Moore's law.
2. The cost reduction in 1) has not been passed on to the consumer at the same rate.
3. Intel is making more profit per transistor than before? :confused:
 

dealcorn

Senior member
May 28, 2011
As I wrote, it was a rough estimate. Your considerations above are valid. But do they compensate enough for the fact that the price per transistor to the consumer is about 6-7x higher than it should be if the cost reduction progress from Moore's law was passed on to the customer? I have a hard time seeing how that could be the case...

Good point, except that no one wants to purchase an Intel 8088 chip for a fraction of a penny. Folks demand many, many transistors in a contemporary chip, and they are willing to pay for the development work necessary to make that happen. The 8088 was a fantastic chip compared to the alternative of no chip, but it is only of historic interest to today's consumers.
 

Idontcare

Elite Member
Oct 10, 1999
So the conclusion of combining your post with mine is that:

1. Production cost per transistor has been reduced in accordance with Moore's law.
2. The cost reduction in 1) has not been passed on to the consumer at the same rate.
3. Intel is making more profit per transistor than before? :confused:

Yes.

Intel's prices to the consumer are based on a supply/demand curve set by the consumer.

Intel does not get to set the supply/demand curve unless they choose to operate in a supply-limited capacity, like the DRAM market (thank you, Samsung and Hynix).

So the price you get, the price Intel sets, is very much about optimizing Intel's return on investment. It has nothing to do (unless coincidentally) with optimizing your return on purchase.

Moore's Law (the thread topic) has absolutely nothing to do with consumer prices. It is 100% about manufacturing cost.

You know this, right?