
Moore's Law - Is it for real?

Muse

Lifer
Yesterday in the Technology section of the San Francisco Chronicle:

Article on Moore's Law

First articulated in 1965, Gordon Moore's dictum had the number of transistors in a given space doubling every year. Ten years later, he decided that it should be every 2 years. I've seen it described as every year and a half.
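To put numbers on how much those three versions of the dictum differ, here's a quick back-of-envelope sketch (the starting count and time span are illustrative, not Moore's actual figures):

```python
# Rough sketch: projected transistor counts under the three commonly
# quoted doubling periods. Starting count is illustrative only.
def transistors(start, years, doubling_period):
    """Projected count after `years`, doubling every `doubling_period` years."""
    return start * 2 ** (years / doubling_period)

start = 2300  # roughly the Intel 4004 (1971), just for scale
for period in (1.0, 1.5, 2.0):
    print(f"doubling every {period} yr -> {transistors(start, 20, period):,.0f} after 20 years")
```

After 20 years, doubling every year gives about a thousand times more transistors than doubling every two years, so the choice of period matters a great deal.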

People say it's uncanny how true this has been borne out. Myself, I've always wondered how much of this comes from planning by the computer manufacturing industry. In other words, if things change at a measured pace, it will clearly be easier for all concerned. Well, not necessarily all concerned, but surely for the manufacturers of computer hardware, the producers of computer software, and the computer industry in general. If the development and evolution of these things happens at a fairly predictable rate, people have some idea what to expect, and what is expected of them. If things change in fits and starts, however, it's a lot more stressful. Is Moore's Law largely the result of a sort of self-moderation by the industry (even if unconscious), or is it basically intrinsic to some natural and real progression of engineering science? I hope this makes sense to people! 😀
 
The way I understand it, to some extent it's a self-fulfilling prophecy. I believe there's a body that lays out a roadmap for semiconductor technology: the ITRS (International Technology Roadmap for Semiconductors). This roadmap lays out which nodes should be hit by the industry and when (more or less, really; I think it's only a guideline). I'm not sure exactly how they come up with this roadmap, but I'm sure Moore's Law plays a role in determining which technology nodes go where. In turn, this means that lithography equipment manufacturers base their development on this roadmap, and chip manufacturers then use this equipment to manufacture chips that meet the roadmap.

Since creating a process for Intel, IBM, or AMD is so expensive, and involves a bunch of manufacturing partners, they really can't "skip ahead" all that far, and so Moore's Law lives on. I'm sure some of the posters that work in the industry can give you more information on the details of how it all works 🙂.
 
It was self indulgent hype and I'm sick of hearing about it. It ain't a "law", it's a freakin opinion. Could the world please get over it.
 
Originally posted by: Greenman
It was self indulgent hype and I'm sick of hearing about it. It ain't a "law", it's a freakin opinion. Could the world please get over it.


ouch. where's the love?

you got a problem with doubling transistor density or somethin?
 
Just as the fuel consumption of cars tends to decrease and the installed power of engines tends to increase, so the number of transistors keeps increasing. You could also make a law about the fuel consumption of cars, or their power, or something like that.
It's just that "Moore's Law" turned out to be easier to follow in reality.
 
It's an observation of a curve fit. Mr. Moore was graphing chip progress for a meeting where he needed to guess what the future would hold; he extrapolated by drawing a curve fit through the points he'd plotted.
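For what it's worth, that kind of extrapolation is easy to reproduce: fit a line to log2(transistor count) versus year, and the slope gives you the doubling period. The data points below are illustrative early Intel chips, not Moore's actual 1965 figures:

```python
import math

# (year, transistor count) for a few early chips, used only as an example
data = [(1971, 2300), (1974, 6000), (1978, 29000), (1982, 134000)]

# Least-squares fit of log2(count) = slope * year + intercept.
n = len(data)
xs = [year for year, _ in data]
ys = [math.log2(count) for _, count in data]
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum((x - xbar) ** 2 for x in xs)

doubling_period = 1 / slope  # years per doubling of transistor count
print(f"doubling period ~ {doubling_period:.1f} years")
```

With these points the fit comes out close to the famous "every couple of years," which is essentially all Moore's original graph showed.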

Gordon Moore himself used to cringe when he heard it referred to as his "law". He said he only got used to it enough to refer to it himself in speeches within the last 10 years or so.

As far as it being a self-fulfilling prophecy, I don't know that I would agree with that wording exactly. I don't believe that the industry rolls out new processes every 24 months or so just to make sure that they are staying on track with "Moore's Law". There are huge savings to be had in being able to use half as much area for a given chip every couple of years.
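A rough illustration of those savings, using a standard dice-per-wafer approximation (the wafer size and die area here are made-up numbers, just for the arithmetic):

```python
import math

wafer_diameter_mm = 300   # illustrative wafer size
die_area_mm2 = 100.0      # illustrative die area

def dice_per_wafer(wafer_d, die_area):
    """Crude classic estimate: wafer area over die area, minus a simple
    correction for partial dice lost at the wafer edge."""
    r = wafer_d / 2
    return int(math.pi * r * r / die_area - math.pi * wafer_d / math.sqrt(2 * die_area))

full = dice_per_wafer(wafer_diameter_mm, die_area_mm2)
shrunk = dice_per_wafer(wafer_diameter_mm, die_area_mm2 / 2)
print(full, shrunk)  # halving die area roughly doubles candidate dice
```

Since wafer processing cost is roughly fixed, getting twice as many dice per wafer roughly halves the cost per chip, which is incentive enough without any "law" to chase.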

As far as moderation... I am sure that the industry would accelerate the curve if they could... in fact at several points the curve has slowed and then sped up later. There are numerous technical challenges to new process technologies that require solutions that take literally years of research. Clearly it cannot go on forever, but it looks like it will continue to hold for at least the next 10 years.


Still, there is a "roadmap" that the industry uses to make sure that everything will line up: http://public.itrs.net/Files/2003ITRS/Home2003.htm

If anyone thinks this stuff is easy, however, look at page 6 of the lithography section: http://public.itrs.net/Files/2003ITRS/Litho2003.pdf and note how much is highlighted in red, indicating: "manufacturable solutions are not known"
 
As I understand it, a guy named Moore saw a trend, and word got out. It soon became known as Moore's Law. It has as much bearing on the real world's workings though as Murphy's Laws do.
 
Originally posted by: The Boston Dangler
Originally posted by: Painkiller
:thumbsup: Agreed. It's not a law, because it will not stand the test of time.



ummm, it already has.

Wow, with time being so long, I had no idea that lasting 40 years proved the "test of time." Transistors can only get so small. Once they are down to the size of electrons, there will be no more doubling of transistors. It will not prove true over the long run. Simply put, technology was advancing at that time at such a rate that transistor density could be doubled every 18 months. It was pure extrapolation on a graph, and nothing more. It has held true by pure coincidence, and once manufacturers realize they can't cram any more transistors onto a die, the extrapolation will die.
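Just as a back-of-envelope check on that limit argument (my numbers here are rough assumptions, not anything from a datasheet):

```python
import math

# How many halvings of linear feature size fit between a 90 nm process
# and something near silicon's atomic spacing (~0.2 nm)? Both figures
# are rough, order-of-magnitude assumptions.
feature_nm = 90.0   # a current-generation process node
atom_nm = 0.2       # rough order of silicon's interatomic spacing
halvings = math.log2(feature_nm / atom_nm)
print(f"~{halvings:.0f} more halvings of linear feature size before atomic scale")
```

So even on the pessimistic reading, there are a number of halvings left before anything like an atomic wall, though each one gets harder.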
 
Originally posted by: Jeff7
As I understand it, a guy named Moore saw a trend, and word got out. It soon became known as Moore's Law. It has as much bearing on the real world's workings though as Murphy's Laws do.

I think that Murphy's Law is probably more scientific😛
 
Originally posted by: complacent
Originally posted by: The Boston Dangler
Originally posted by: Painkiller
:thumbsup: Agreed. It's not a law, because it will not stand the test of time.



ummm, it already has.

Wow, with time being so long, I had no idea that lasting 40 years proved the "test of time." Transistors can only get so small. Once they are down to the size of electrons, there will be no more doubling of transistors. It will not prove true over the long run. Simply put, technology was advancing at that time at such a rate that transistor density could be doubled every 18 months. It was pure extrapolation on a graph, and nothing more. It has held true by pure coincidence, and once manufacturers realize they can't cram any more transistors onto a die, the extrapolation will die.


40 years happens to be the majority of the entire history of electronic computers.
Transistors and other components can shrink to the size of small molecules. By the time physical size becomes the absolute minimum, technology will move on, abandoning the IC as we know it.

Both AMD and Intel have independently confirmed and announced that Moore's Law will remain on target for another 20 years.
 
Once they are down to the size of electrons, there will be no mor doubling of transistors.
While I agree with you that there is a fundamental limit, there are more ways to go once we hit it (and it will be well above the size of electrons): we can increase the size of the die itself, or stack dies in the vertical dimension. If we are down to atomic sizes at that point, stacking in the third dimension will still allow quite a lot of "doubling".

But yes, Moore's Law is merely an observation - not a Law, and eventually it will run out... unless we can figure out how to tuck the extra transistors into one of the extra dimensions around us. 🙂

I think that Murphy's Law is probably more scientific
Ouch. Full credit goes to Mr. Moore for being the first to observe this trend, as well as being directly responsible for several of the data points graphed along it.
 
Umm, when Moore originally theorized this "law," he was first quoted as saying it would double every three years; then he re-evaluated and said every two years, and later said every 18 months. It is not a law because, as someone said, transistors can only get so small; even Hawking thinks there is a limit on how small a particle can be. Also, what happens if humans die out, or transistors become useless (what with quantum logic gates)? Then the law is broken. This will happen, hence it is not a law, just an observation. But then again, "Moore's Observation" doesn't sound half as cool.
 
As said above, Moore himself never called it a law. It was just an idea he proposed in a 1965 article. People who read it thought it was so fantastic (and optimistic) at the time that they coined the term "Moore's Law."

Any sort of "law" in the engineering world based on exponential growth is doomed to fail. We are quickly running out of options with silicon (at best 10 years by current estimates). It used to be that just coming up with new manufacturing techniques (e.g., new doping methods, more precise etching, purer fabrication) could double the transistor count. Now it's a matter of trying to transcend the physical properties of silicon.


In short, Moore's Law was and still is a guideline for the semiconductor industry. It's just becoming more and more difficult to continue following that guideline.
 
Originally posted by: jagec
Originally posted by: Jeff7
As I understand it, a guy named Moore saw a trend, and word got out. It soon became known as Moore's Law. It has as much bearing on the real world's workings though as Murphy's Laws do.

I think that Murphy's Law is probably more scientific😛

And it will hold up better to the test of time 😛
 
What are the hard physical limits of today's technology? Electron wavelength is in the picometer range IIRC; is there anything to keep us from getting to that neighborhood?
 
Originally posted by: Gibsons
What are the hard physical limits of today's technology? Electron wavelength is in the picometer range IIRC; is there anything to keep us from getting to that neighborhood?


Uhmmm....are you suggesting that we can control individual electrons? Because that's something probably more suited for a Star Trek episode.

What is there to keep us from making picometer electrical components? The fact that we don't even completely understand individual electron behavior. Just look at the conflicts that arise between quantum mechanics and classical relativity.


EDIT: OK, I reread the question and realized I misunderstood it. Current technology uses UV beams to etch transistors. While we are leaning towards electron-beam and ion-beam lithography, it's still too expensive to be commercially viable.

Also, when transistors are on the order of 1E-12 meters, leakage currents become a major problem because the insulating walls are so thin. Energy is wasted, lowering efficiency and generating excess heat.
 
Originally posted by: glorygunk
Originally posted by: Gibsons
What are the hard physical limits of today's technology? Electron wavelength is in the picometer range IIRC; is there anything to keep us from getting to that neighborhood?


Uhmmm....are you suggesting that we can control individual electrons? Because that's something probably more suited for a Star Trek episode.


What I meant was, we can't make working circuits unless they are X amount larger than the wavelength of the electrons. So I see that as one possible "hard limit" on how far we can go with our current approach. I was wondering if there was some other physics that would occur at a larger size which would prevent even reaching the level where electron wavelength is a problem.


 
What are the hard physical limits of today's technology?
I don't think that anyone really knows. About two decades ago, no one thought that CMOS silicon would ever scale below 2um (2000nm); we are now at 0.09um (90nm). Every time someone makes a guess at how small things can get before we hit a hard limit, something changes and we move on. For example, a limit that people have been tossing around recently is that the gate of a CMOS transistor can't be made thinner than a couple of atoms, and this is true and would seem to be an insurmountable barrier to progress below 45nm. But now people are talking about "FinFETs" and high-k dielectric gate materials, essentially changing the rules a bit, and now it doesn't seem like this is the barrier that one might think at first.
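To put those numbers in perspective: classic CMOS scaling shrinks linear dimensions by roughly 0.7x per generation (which halves area), so going from 2um down to 90nm represents a surprising number of generations. A quick sketch, with the 0.7 factor being the usual rule of thumb rather than an exact figure:

```python
import math

# How many ~0.7x linear shrinks does it take to get from 2000 nm to 90 nm?
shrink_per_generation = 0.7   # rule of thumb: area halves each process node
generations = math.log(90 / 2000) / math.log(shrink_per_generation)
print(f"~{generations:.0f} process generations from 2 um to 90 nm")
```

Each of those generations looked impossible to somebody at the time, which is pm's point about guessed limits not holding up.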

Gordon Moore once said (and I am paraphrasing here) that at any given point in his career he would look forward about 12-15 years and think "well, the game will be over when we reach that point," and then about 12 years later he'd look back and think "I should have been able to see what we would do," and then he'd look forward another dozen years and think it would all be over again. And it never is.

There is a huge amount of money and people backing the semiconductor industry. Things that seemed unimaginable 20 years ago are now in wide use. There is enough money, talent, and inertia to keep the industry moving forward for some time to come. I admit that when I look forward 10 years, I am a bit stumped as to how we are going to deal with leakage, noise, and quantum-mechanical issues in a high-volume manufacturable solution. But people have been thinking like this for literally decades now, and still the machine rumbles on.

If there is one thing that is patently obvious, it is that eventually Moore's Law will end. But I would caution anyone against thinking that we are close to the end and that in 10 years or so the IC industry will stagnate... because most likely you will be proven wrong.

If there is one thing that has been steadily happening, it is that we are gradually moving away from silicon and towards more exotic materials: the channel becoming a mix of silicon and germanium, interconnects becoming a complex stack of layered materials. Moving forward, I can only see this trend continuing. The gate will change from silicon nitride to something else that is unlikely to be primarily silicon. The interconnect dielectric material will change from FSG to something that will likely contain much less silicon. Now people are talking about replacing polysilicon salicides with a metallic compound. Whatever limits silicon hits will probably be sidestepped with shifts to alternative materials.
 
I think one needs to keep in mind that there is huge difference between what can be done on a commercial scale and what can be done in experiments where you only need a few samples.

Going below 90 nm is absolutely no problem using e-beam lithography; it has been done for many years (the JEOL lithography system we use most of the time is almost 15 years old), so we know what happens when you go to smaller dimensions (we can make structures with a linewidth of a few nm). We have been doing single-electronics for maybe ten years, and using high-k materials (which are now on the roadmap) for about as long.
We also know a lot about other semiconductors: GaAs, GaN, InP, SiC, SiGe, plus about a dozen others.

So basically the main challenge is to figure out how to use these techniques on a commercial scale. There are plenty of new ideas and technologies, but as pm has already pointed out, in order to use them we will have to start moving away from Si, and that will be painful.



 
Interesting points, pm. One thing I wonder about is whether something like GaAs or SiGe could ever be the basis for ULSI circuits, or whether Si will continue to be the basis for these. I guess it all comes down to economics, and since Si-based CMOS gets by far the most attention, everything else is comparatively expensive. In the end, I wonder if at some time in the future it may be more cost-effective for the industry to switch over to GaAs or InP rather than creating increasingly complicated workarounds for Si.
 
I see gallium arsenide used very sparingly now, mostly in studio and headend-related equipment. Is that the most likely successor to silicon? It has to go at some point.
 


Originally posted by: The Boston Dangler
Originally posted by: Painkiller
Agreed. It's not a law, because it will not stand the test of time.


ummm, it already has.


Has it? Albert Einstein's theory of relativity is still a theory, and it has not been disproven in 100 years.

 