Interesting Speculation on the Evolution of Processors

GZFant

Senior member
Feb 18, 2003
437
0
76
My brother and I were just having an interesting conversation on the path of CPUs. I am definitely technologically illiterate, but I was curious to know where Intel/AMD/whoever will go when the manufacturing process shrinks down to 1 or 2nm. What will they do then?
My brother was speculating that there has to be a large change in CPUs when they get that small, because, where do you go from there? He and I obviously came to no conclusion; we were just entertaining different thoughts.

This is all speculation of course. I am curious to hear what all of you think.
 

bobsmith1492

Diamond Member
Feb 21, 2004
3,875
3
81
Did you read the article on the new 45nm process? Intel redesigned CMOS transistors to be able to make that shrink. Without further redesign (which has only happened a few times in history), they're not going much smaller. Obviously, when you have only a couple of atoms left, something has to give.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: GZFant
My brother and I were just having an interesting conversation on the path of CPUs. I am definitely technologically illiterate, but I was curious to know where Intel/AMD/whoever will go when the manufacturing process shrinks down to 1 or 2nm. What will they do then?
My brother was speculating that there has to be a large change in CPUs when they get that small, because, where do you go from there? He and I obviously came to no conclusion; we were just entertaining different thoughts.

This is all speculation of course. I am curious to hear what all of you think.

The large change has to occur before the physics require it as the economics are prohibitive to scale devices to the limits of physics.

Alternatives already exist, and have for many years and in some cases decades. Higher performance (switching speeds, drive currents, etc.) has always been available in non-Si-based integrated circuits...but it comes with substantially higher cost. SiGe and InGaAs, for example. (SOI is also an example, versus bulk Si integration.)

So what you are really asking is: when does it become cost-prohibitive for the consumer-driven portion of the semiconductor industry to keep scaling contemporary silicon-based CMOS devices smaller and smaller, relative to the cost of designing integrated circuits for whatever alternative integration schemes exist by then?

My expectation is that this type of philosophical conversation is really a complete waste of brain utilization...much as the guys in the 1940's wasted their time and energy fretting over how they were going to continue to scale the land-line telephone into the 21st century...because along came GSM and CDMA (disruptive technology) and completely eliminated the need to keep evolving the then-modern landline telephone.

I very much doubt that in 2030 anyone is really going to care that the era of personal computing phased out of existence (due to lack of need) circa 2020 as it was replaced entirely by some disruptive technology that was invented in China or Korea in 2014.
 

GZFant

Senior member
Feb 18, 2003
437
0
76
No bobsmith1492, I did not read that article. I will google it now.

It is probably a waste of "brain usage" to speculate on these types of things. Although it was fun because we were playing poker and drinking scotch. It was a thought since we take no part in changing the course of history involving microprocessors. Honestly, I understand about 10% of what you were talking about Idontcare. I will definitely use the internet resource and educate myself to a level of understanding. Thank you for the info.

The information regarding Gordon Moore and Moore's law was very interesting. As he said, I would also love to come back 100 years from now and see the changes. I think we all would though. The fact that hafnium and the high-K gates have only prolonged Moore's law is very interesting.

Gluing processors to our brains to give us more efficient and greater use of our brain functions? Who knows?

Wonderful information, I hope there will be more thoughts!
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: GZFant
No bobsmith1492, I did not read that article. I will google it now.

It is probably a waste of "brain usage" to speculate on these types of things. Although it was fun because we were playing poker and drinking scotch. It was a thought since we take no part in changing the course of history involving microprocessors. Honestly, I understand about 10% of what you were talking about Idontcare. I will definitely use the internet resource and educate myself to a level of understanding. Thank you for the info.

The information regarding Gordon Moore and Moore's law was very interesting. As he said, I would also love to come back 100 years from now and see the changes. I think we all would though. The fact that hafnium and the high-K gates have only prolonged Moore's law is very interesting.

Gluing processors to our brains to give us more efficient and greater use of our brain functions? Who knows?

Wonderful information, I hope there will be more thoughts!

I did not mean to imply that you were wasting your or anyone else's brains/time by contemplating the subject matter.

I was more alluding to why the folks (the big corporations) that have the money to worry about this stuff are, for the most part, not bothering themselves to worry about it. (save for Intel but even then not really)

I.e. explaining why you will not likely find much official information on the topic outside the realm of people expressing personal opinions.

I too enjoy far-off conversation over spirits or a good micro-beer; I didn't mean to put a damper on the discussion with my easily misinterpretable choice of words above.
 
soccerballtux

Dec 30, 2004
12,553
2
76
TBH, I think it's like other problems-- ignore it until you have to deal with it. Keeping it hush-hush means Wall Street doesn't think about it, which means people keep buying Intel stock.

If they learned that in ~10-15 years (depending on how fast the market goes this last stretch) we'd be hitting the 4-5nm barrier (where you're basically down to counting atoms...and you can't get much smaller than that), I bet they'd be selling their stock pretty quick.
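
(For a rough sense of scale - a back-of-envelope using silicon's lattice constant of ~0.543nm and a crude assumed atoms-per-cell figure, so treat the counts as order-of-magnitude only:)

```python
# Roughly how many atoms span a feature a few nanometres wide.
# Assumes bulk silicon (lattice constant ~0.543 nm) and ~2 atoms per
# lattice constant across; crude, but good enough for a sanity check.
SI_LATTICE_NM = 0.543
ATOMS_PER_CELL_EDGE = 2

for width_nm in (45, 15, 5, 2):
    cells = width_nm / SI_LATTICE_NM
    atoms = cells * ATOMS_PER_CELL_EDGE
    print(f"{width_nm:>2} nm ~ {cells:5.1f} unit cells ~ {atoms:4.0f} atoms across")
```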

A friend of mine made the good point that Biotech is probably the next major industry. EE is more or less out, biotechnology (I'm so not a bio/chem person, so forgive me if I get the terms wrong) is in. At least your emag understanding of reflection coefficients still applies to cell/plasma membranes etc., so we won't be completely obsolete.

Possible directions for my own future (being halfway through an EE major myself) most likely involve an MBA; perhaps some biology-based degree as well. Or a master's in physics, weighted towards optics, could be an option. Who knows?
 

hokiealumnus

Senior member
Sep 18, 2007
332
0
71
www.overclockers.com
The next logical step (aside from shrinking the process even more) is probably quantum computing. They're working on that now, but it's still a long way off. In any case, that's about as small as it can get.
 

Borealis7

Platinum Member
Oct 19, 2006
2,901
205
106
once you're done shrinking, start putting more cores on a CPU.
dedicated physics cores, dedicated graphics cores, communication cores, etc...

it may not be useful for everyday home use, but there could be many applications in research or in great big virtualization servers.

 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Originally posted by: soccerballtux
TBH, I think it's like other problems-- ignore it until you have to deal with it. Keeping it hush-hush means Wall Street doesn't think about it, which means people keep buying Intel stock.

If they learned that in ~10-15 years (depending on how fast the market goes this last stretch) we'd be hitting the 4-5nm barrier (where you're basically down to counting atoms...and you can't get much smaller than that), I bet they'd be selling their stock pretty quick.

A friend of mine made the good point that Biotech is probably the next major industry. EE is more or less out, biotechnology (I'm so not a bio/chem person, so forgive me if I get the terms wrong) is in. At least your emag understanding of reflection coefficients still applies to cell/plasma membranes etc., so we won't be completely obsolete.

Possible directions for my own future (being halfway through an EE major myself) most likely involve an MBA; perhaps some biology-based degree as well. Or a master's in physics, weighted towards optics, could be an option. Who knows?

No one is ignoring it; they are just waiting for the solutions. As you say, people are working on them with things like bio and quantum stuff; it's just not ready for prime time yet.
Intel doesn't really specialise in biotech or anything, but you can be reasonably sure that when bio/quantum is the way forward, they will be right there using it, after transitioning away from pure EE. You can be pretty sure they are thinking that far ahead.
IBM is developing light-based interconnects, etc.
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
I think we will be in trouble long before 1-2nm. Leakage currents tend to get exponentially worse as you make barriers thinner because it becomes much easier for electrons to jump (tunnel) across. What Intel did with high-k was to use a physically thicker barrier for part of the transistor, but made of a different material that still "looks" thin to electric fields. Unfortunately there isn't really a similar trick for the length of a transistor (the 45nm in a 45nm process). A 1-2nm transistor would probably not turn off well enough to be usable.
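
To put rough numbers on that (a sketch with assumed film thicknesses, dielectric constants and decay length, not Intel's actual stack): the usual comparison is equivalent oxide thickness, EOT = t_physical x (k_SiO2 / k_high-k), while direct tunneling falls off roughly exponentially with the physical thickness.

```python
import math

# Hedged illustration of the high-k trick: an assumed hafnium-based film
# (k ~ 25) can be physically thick (far less tunneling) while still
# "looking" electrically thinner than a ~1.2 nm SiO2 oxide.
# All thicknesses, k values and the decay length are assumptions.
K_SIO2 = 3.9

def eot_nm(t_phys_nm, k):
    """Equivalent oxide thickness of a gate dielectric layer."""
    return t_phys_nm * (K_SIO2 / k)

def relative_tunneling(t_phys_nm, decay_nm=0.1):
    """Very crude exp(-t/decay) stand-in for direct-tunneling leakage."""
    return math.exp(-t_phys_nm / decay_nm)

for name, t, k in (("SiO2", 1.2, K_SIO2), ("high-k", 3.0, 25.0)):
    print(f"{name:6s} t={t:.1f} nm  EOT={eot_nm(t, k):.2f} nm  "
          f"relative leakage ~ {relative_tunneling(t):.1e}")
```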

Now, it turns out that there are other limits that will come into play even sooner. There's a paper called "limits to binary logic switch scaling - a gedanken model" that basically argues that at around 15nm you hit a point where circuits become theoretically impossible to cool due to the power density (not just power... even 1 watt is impossible to cool if it is coming from a small enough source). The paper also argues that switching away from silicon isn't a solution, because the limits are an inherent part of the whole idea of switches that use 1s and 0s. I can't paste the URL for that paper because I don't know how to copy/paste on the device I'm using to type this.
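
A hedged back-of-envelope along the same lines (my own assumed numbers, not the paper's model) shows how quickly power density runs past what practical cooling can remove, roughly ~100 W/cm^2 for air:

```python
# Crude power-density estimate for a densely packed logic array.
# Every input below is an assumption for illustration; the gedanken
# paper makes a more careful thermodynamic argument to a similar end.
density_per_cm2 = 1e10        # assumed transistors per cm^2 near ~15 nm
energy_per_switch_j = 1e-16   # ~C*V^2 with C ~ 0.1 fF and V ~ 1 V (assumed)
frequency_hz = 3e9            # assumed clock
activity = 0.1                # assumed fraction of devices switching per cycle

power_density = density_per_cm2 * energy_per_switch_j * frequency_hz * activity
print(f"~{power_density:.0f} W/cm^2")  # ~300 W/cm^2 with these assumptions
```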

Edit: the autocorrect on this device does strange things... Fixing typos.
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
No one is ignoring it - there have been several keynote speeches at large international conferences on this exact subject ( ftp://download.intel.com/research/silicon/Gordon_Moore_ISSCC_021003.pdf ) - and I've been to several others personally but can't find weblinks - although if you want more proof that people are talking about it, I should be able to hit you with at least a dozen other major presentations... although they don't get much bigger than ISSCC (International Solid State Circuits Conference). The industry is well aware of the problem - the idea that people in the semiconductor industry are ignoring it isn't remotely true.

As CTho said, this is likely to hit the industry fairly soon - within 10 years or sooner (45nm -> 30nm -> 15nm). I would expect to see a gradual slowdown in the rate of progress towards smaller process technology nodes. One thing I would note is that there is a lot of money involved in the semiconductor industry - it's a huge industry - and there's going to be a lot of pressure to find workarounds. Moving to a 3D-shaped gate like a "tri-gate FET" (http://www.intel.com/technolog...-gate-demonstrated.htm and http://en.wikipedia.org/wiki/Trigate_transistors ) should help with a lot of the leakage issues and the need for extremely small channels.



As we move to 15nm and smaller, some of the musings I would offer on future directions are: 3D multilayer stacking, multi-bit storage and optical interconnects.

By "3D" I mean, to move up in the third dimension - ie. transistor stacking. Right now transistors are laid out planarly on the lower layers of the fabrication process, so just add in more layers of transistors. The chips get thicker, and there are plenty of issues with doing this (heat being a primary concern), but it seems like a logical next step - if you can't shrink any further in 2 dimensions, move into the third dimension. This would only likely allow a "doubling" for two or three generations (or maybe more) but that's still another 8-10 years... (ish).

Multi-bit approaches are along the lines of doing more with the transistors that we have. So, if you can somehow store twice as much data in the same number of transistors, you have effectively doubled the number of transistors. Ideas along these lines are multi-bit electrical levels in cells and implementing memory with something smaller than a 6T cell. Right now a memory bit is usually 6 transistors, but people are working on 1T cells ( http://en.wikipedia.org/wiki/1T-SRAM and http://en.wikipedia.org/wiki/Z-RAM ).
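
As a tiny illustration of the multi-bit idea (my own sketch, not a description of any particular cell design): bits per cell go as log2 of the number of distinguishable levels, so four levels doubles effective density without adding a single transistor.

```python
import math

# Effective density gain from multi-level cells.  Purely illustrative;
# real multi-level storage pays for this in noise margin and sensing.
def bits_per_cell(levels):
    return math.log2(levels)

for levels in (2, 4, 8):
    b = bits_per_cell(levels)
    print(f"{levels} levels/cell -> {b:.0f} bits ({b:.0f}x a 1-bit cell)")
```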

The other approach that's not too improbable based on where we are currently is connecting everything with optical interconnects. Companies have demoed silicon LEDs, laser diodes and receivers. Using optical interconnects one could transfer data at vastly higher rates with lower latency and synchronize the clocks across parts accurately. This could allow monolithic packaged units to be built that communicate across multiple ICs almost as if they were one massive IC.
 

Hulk

Diamond Member
Oct 9, 1999
4,872
3,266
136
There isn't always a leap to a new technology.

Look at the internal combustion engine in automobiles. It has been evolving over the past 100 years. Only now are we experimenting with electric vehicles using batteries or fuel cells for energy storage.

So it's possible that current silicon processes could continue to evolve for the next 30 or more years. Continued refinement in processes, big enhancements in architecture, and even possibly more focus in software optimization if hardware development begins to slow.
 

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Originally posted by: Hulk
There isn't always a leap to a new technology.

Look at the internal combustion engine in automobiles. It has been evolving over the past 100 years. Only now are we experimenting with electric vehicles using batteries or fuel cells for energy storage.

So it's possible that current silicon processes could continue to evolve for the next 30 or more years. Continued refinement in processes, big enhancements in architecture, and even possibly more focus in software optimization if hardware development begins to slow.

Slightly OT, but the electric car has been competing with the internal combustion engine since the invention of the automobile.
 

pm

Elite Member Mobile Devices
Jan 25, 2000
7,419
22
81
Originally posted by: Acanthus
Originally posted by: Hulk
There isn't always a leap to a new technology.

Look at the internal combustion engine in automobiles. It has been evolving over the past 100 years. Only now are we experimenting with electric vehicles using batteries or fuel cells for energy storage.

So it's possible that current silicon processes could continue to evolve for the next 30 or more years. Continued refinement in processes, big enhancements in architecture, and even possibly more focus in software optimization if hardware development begins to slow.

Slightly OT, but the electric car has been competing with the internal combustion engine since the invention of the automobile.

Yep. There's an idea that electric cars are new, but they've been around since before the internal combustion engine.

http://en.wikipedia.org/wiki/Electric_car

BEVs were among some of the earliest automobiles; electric vehicles predate gasoline and diesel. Between 1832 and 1839 (the exact year is uncertain), Scottish businessman Robert Anderson invented the first crude electric carriage. Professor Sibrandus Stratingh of Groningen, the Netherlands, designed the small-scale electric car, built by his assistant Christopher Becker in 1835.

Hybrids are also an old idea - the first series hybrid electric car was designed by Ferdinand Porsche in 1901 and won several races. The first parallel hybrid was designed in 1915.


But regardless, the idea that there doesn't need to be a revolution to make progress is one that I agree with. I don't think that the industry will grind to a halt in 10 years - in fact, I find that idea impossible to comprehend - the industry will just adapt by changing the design to work around the impossibilities imposed by physics or the difficulties imposed by economics.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: Acanthus
Originally posted by: Hulk
There isn't always a leap to a new technology.

Look at the internal combustion engine in automobiles. It has been evolving over the past 100 years. Only now are we experimenting with electric vehicles using batteries or fuel cells for energy storage.

So it's possible that current silicon processes could continue to evolve for the next 30 or more years. Continued refinement in processes, big enhancements in architecture, and even possibly more focus in software optimization if hardware development begins to slow.

Slightly OT, but the electric car has been competing with the internal combustion engine since the invention of the automobile.

That was my first thought when I saw that post, too.

In fact the automobile's power plant went through some amazing disruptive transitions. Electric autos predated the use of the internal combustion engine (in terms of using gasoline).

And coal-powered steam engines (a la locomotives) were the very first power trains for the horseless carriage.

Generally speaking, yes, you can use the power of the English language and paint history with as broad a brush as you like so as to ensure your point is technically valid.

But proving that there can be an exception to every rule was not the purpose of this thread.

Rather, the point of the thread is to discuss the very real and obvious need for disruptive technology, because sadly atoms don't scale in the non-relativistic reference frame within which we want to operate our CPUs; they are just bitchy like that most days.
 

SsupernovaE

Golden Member
Dec 12, 2006
1,128
0
76
There will probably have to be a paradigm shift in computation. We'll have to build fundamentally different computers, such as ones that exploit different quantum properties. Hopefully, when the Large Hadron Collider comes online, we will discover something new and useful, or at least learn how to use what's left in the Standard Model for computation.
 

BonzaiDuck

Lifer
Jun 30, 2004
15,995
1,645
126
It is difficult, even for the well-educated, to have a clear vision of the future. Our best vision is what we see in hindsight.

In 1983, the microcomputer revolution was still a fairly new phenomenon. What it meant for me at that time was pretty straightforward: I didn't need to go down to the university computer center and wait in line before I could use my econometrics professor's "two-stage least-squares regression" program on the school mainframe -- I could connect from home with a modem. I didn't need to go to the computing center and wait in line to program my own statistical algorithms -- I could do it at home.

A year later, another professor, noting the enthusiasm with which economics graduate students were embracing the new technology, suggested that eventually there would be "cults" of people wearing headbands like hippies in the '60s, but that these headbands would also sport Intel CPU chips.

Six years ago, I ran into an Australian engineer working in Seattle for Intel. We were sitting in a jazz club in Tacoma with my cousin, and the Aussie remarked with expressions of great awe that "CPU-design was getting down to the molecular level."

Two years later, visiting my cousin, I ran into the Aussie at my cousin's church during Sunday service. He said he'd quit Intel; wasn't pursuing his profession in engineering anymore; couldn't handle the rate of change in the technology; and had switched lifestyles and employments. I think he was trying to start a business. He didn't work much with computers anymore, as he put it.

We've been used to the PCI bus, born of ISA and EISA -- now being transformed into PCI-E. We've become used to the ATX standard for case-design, motherboard-design, and power-supply design. And the rectangular slots in an old IBM AT case for expansion cards are still compatible in their relative positions and in reference to motherboard "slots" -- you can simply use a drill and tap to make threaded holes for motherboard standoffs and make an ATX motherboard fit a case from that era.

There may be a paradigm shift on the horizon. It may be foretold by the signs made in these small "devices" we're seeing in peoples' pockets, or in the number of cores that can be extrapolated as a "status-quo" for CPU design two, four, or six years from now.

But it will be the people who not only have a vision for consumer demand or orientation, but who can shape that demand and orientation through their own innovative processes, who will drive it.

To be in such a state of mind includes elements of anal-retentiveness, near-myopic focus and perseverance, and the contrary ability to see beyond today's paradigm of computing.

The rest of us are just mechanics, who would otherwise need to be "retrained" once the current paradigm becomes obsolete.
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
Originally posted by: pm
Gordon Moore (Intel co-founder and of "Moore's Law" fame) has been saying something similar recently.

http://blog.wired.com/business...9/idf-gordon-mo-1.html
http://www.i4u.com/article11589.html

No exponential is forever...

I was there when he gave a talk on that at ISSCC and it was a good talk. People have been predicting the end of "Moore's Law" for a long time - there was a time when sub-micron transistors were thought to be impossible! There's no doubt that it WILL eventually have to stop, but it's nice to see how far we can take it.

Throwing in my 2 cents: the scaling of transistors is becoming less beneficial than it used to be, to the point that we may need to change how we look at scaling. Moore's Law and downsizing the gate length typically resulted in power reduction, transistor speed improvements and smaller chip area. But maybe we eventually end up choosing a path that improves only 2 out of 3. So a new FinFET design may reduce power and speed up transistors, but the area it takes to draw a CMOS design may not be any smaller than it used to be.
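
To make the "2 out of 3" point concrete, here's a toy sketch of classic constant-field (Dennard-style) scaling, where a shrink by factor s used to improve area, delay and power together; the numbers are idealized, not measurements of any real process:

```python
# Idealized constant-field (Dennard) scaling by a factor s: dimensions,
# voltage and delay all shrink by s, so area and per-transistor power go
# as 1/s^2 and power density stays flat.  Modern nodes no longer follow
# this, which is where the "pick 2 out of 3" trade-off comes from.
def dennard_scale(s):
    return {
        "area": 1 / s**2,                  # smaller transistors
        "delay": 1 / s,                    # faster switching
        "power_per_transistor": 1 / s**2,  # C*V^2*f with C, V down and f up
        "power_density": 1.0,              # unchanged in the ideal case
    }

print(dennard_scale(1.4))  # one full-node shrink (~0.7x linear dimensions)
```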
 

Cogman

Lifer
Sep 19, 2000
10,283
134
106
I believe a few things will change, and they will be hard changes for most.

1. I believe that programming languages will inherently have to change from their current form and move to a more thread/multitasking-friendly model. Current languages are just too sequential and don't take enough advantage of the available resources (see the sketch after this list).

2. The next step will be specialization: instead of having a single core that does everything, we will see multiple cores that are dedicated to specific tasks. Already we have that with current GPUs/CPUs, but they are still very general in their processing abilities. I'm thinking that we will have maybe one core that does division very well, another that calculates common physics equations very well and yet another that does basic arithmetic very well (somewhat already implemented with the ALU, etc.).

3. After that, major advances will be more software-driven and less hardware-driven. Eventually, someone will need the software to do a task that the hardware can't handle; when that point is reached we will likely see more specialization and even more cores dedicated to said tasks.
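
As a trivial sketch of point 1 (just an illustration in Python; any language with decent concurrency support works the same way), this is the kind of split-the-work-across-cores pattern that languages will need to make far more natural:

```python
from concurrent.futures import ProcessPoolExecutor

# Toy example: a sum of squares split across worker processes.
# Processes rather than threads, since CPython threads don't run
# CPU-bound Python code in parallel.
def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_squares(n, workers=4):
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(parallel_sum_squares(1_000_000))
```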

Quantum computing is a nice idea, but I don't see it being very practical for the common home user. Heck, for most computer users a 733MHz Celeron CPU is plenty of processing power.

Really, we are reaching an interesting stage. I believe that 16nm is about the maximum die shrink we can feasibly get to before we have to start using some other material (the walls will be too thin and leakage too great).

Another possibility is room-temperature superconductors. While really not all that likely (I won't be holding my breath), it is possible that we find that magic material, learn to manufacture it on the nm scale, and then shrink it till it really is just one atom (or compound) next to the other. Again, this isn't the likely solution.

One other alternative is that we just swallow the power costs and use a material that can handle tons of power (carbon nanotubes). While people might not like it, we could potentially have some pretty big chips composed entirely of carbon nanotubes. CNTs can withstand high temperatures and have fairly low leakage. That might be the key to keeping computers going faster and faster.

*cliffs*
Nobody can predict the future. These are just some of the techs and directions that might be pursued. I am not a specialist in any of these fields, so take what I have said with a grain of salt. (I also don't have a spell checker on this computer, so please don't crucify me for any blatant spelling errors that are probably present.)
 

Cogman

Lifer
Sep 19, 2000
10,283
134
106
Originally posted by: BladeVenom
Processors didn't evolve, they are the product of intelligent design. :p

LOL, that made my day. Look, Evolution is Thwarted!
 

Foxery

Golden Member
Jan 24, 2008
1,709
0
0
Originally posted by: pm

The other approach that's not too improbable based on where we are currently is connecting everything with optical interconnects. Companies have demo'd silicon LED's, laser diodes and receivers.

This. Quantum computing is a long way off, but I believe the next revolution in CPUs is most likely optical signaling.

This thread left out one of the major roadblocks in ramping up processor speed: heat. Modern CPUs can't be pushed beyond 4GHz because of a combination of electrical leakage (tunneling) and the inability to keep the chip cold enough. Light has neither of these problems.
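
A rough sketch of why that curve bends upward so sharply (assumed voltages and capacitance, not AnandTech's measurements): dynamic power goes as C*V^2*f, and pushing frequency higher also forces the voltage up, so power grows much faster than linearly with clock speed:

```python
# Toy model of dynamic CPU power, P = C_eff * V^2 * f.
# The frequency/voltage pairs and C_eff are made-up illustrative values;
# the point is that the required voltage rises with frequency, so power
# climbs roughly with the cube of frequency near the top of the range.
C_EFF = 2e-8  # assumed effective switched capacitance in farads

points = [  # (frequency in GHz, assumed required core voltage)
    (2.0, 1.00),
    (3.0, 1.10),
    (3.6, 1.25),
    (4.0, 1.40),
]

for f_ghz, v in points:
    p = C_EFF * v**2 * (f_ghz * 1e9)
    print(f"{f_ghz:.1f} GHz @ {v:.2f} V -> ~{p:.0f} W dynamic power")
```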

We haven't completely figured out how to produce a fully functional chip this way yet, but IBM, among others, is working on it, and it won't require nearly the level of painful theoretical physics that quantum machines do :)

AnandTech recently grabbed an unlocked Intel processor and graphed both the power draw and heat output over several overclocked frequencies. (Great article) Page 2 shows how both curve upwards quickly after 3GHz, and depicts why getting current 45nm processors past 4GHz is nearly impossible without exotic cooling solutions. Skip to page 11 for some eye-opening graphs as well.