CPU gate size limits

Status
Not open for further replies.

GWestphal

Golden Member
Jul 22, 2009
1,120
0
76
With 22 nm parts coming soon and 14 nm parts in the not-so-distant future, when will shrinking silicon technology come to an end, even if we consider graphene or carbon nanofibers? I can hardly fathom a stable gate of less than 10 nm, much less the reported 1 nm graphene ones. I know there is some atom diffusion even in materials with a crystal structure. When you get to less than 100 atoms wide, diffusion of a single atom would change the resistance by 1% or more, and over some period of time you could see 10-20% or larger fluctuations in the resistance of a gate. Then you get into tunneling and leakage due to the atomic proximity. Could you really have long-term, stable function with these effects?

Seems to me like we're approaching the hard limit of gate size at 5-10 nm in practical, mass-producible terms. What happens then? If you can't make it smaller, where will they innovate? Just start making bigger chips with more instructions? Cache optimization? Will it finally be time for software to catch up to hardware?
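To put the "less than 100 atoms wide" worry in numbers, here's a rough back-of-envelope sketch using silicon's lattice constant (a known physical value); the idea that one displaced cell shifts resistance by about 1/N of the width is my own crude assumption, not a device model:

```python
A_SI = 0.543  # silicon lattice constant in nm (known physical value)

def cells_across(gate_nm):
    """Rough count of silicon unit cells spanning a gate of the given length."""
    return gate_nm / A_SI

for gate in (22, 14, 10, 5):
    n = cells_across(gate)
    # Crude assumption: one displaced cell changes the cross-section by ~1/N
    print(f"{gate} nm gate: ~{n:.0f} unit cells across; "
          f"one cell is ~{100 / n:.1f}% of the width")
```

Even at 22 nm a gate is only ~40 unit cells across, so single-atom effects stop being negligible well before 5 nm.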
 

Matt1970

Lifer
Mar 19, 2007
12,320
3
0
It's not that far away. 11-16 nm seems to be the accepted limit, and by Moore's Law that gives us 10 or 12 more years. They have reduced leakage at the logic gate by using a hafnium-based insulator with a higher dielectric constant than traditional silicon dioxide, giving them a 100-fold reduction in gate leakage.

DNA computing looks promising. They have already created logic gates made of DNA. "Logic gates are a vital part of how your computer carries out functions that you command it to do. These gates convert binary code moving through the computer into a series of signals that the computer uses to perform operations."
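The reason a high-k hafnium insulator helps is that it can be physically thicker while behaving electrically like a much thinner SiO2 layer (the "equivalent oxide thickness", or EOT). A minimal sketch with textbook ballpark permittivities, not figures for any specific process:

```python
K_SIO2 = 3.9   # relative permittivity of SiO2
K_HFO2 = 25.0  # approximate relative permittivity of HfO2 (ballpark)

def eot(physical_thickness_nm, k):
    """Electrically equivalent SiO2 thickness for a dielectric of permittivity k."""
    return physical_thickness_nm * K_SIO2 / k

# A 3 nm HfO2 film acts like ~0.47 nm of SiO2 for capacitance,
# but tunneling leakage (roughly exponential in physical thickness)
# still "sees" the full 3 nm, hence the large leakage reduction.
print(f"EOT of 3 nm HfO2: {eot(3.0, K_HFO2):.2f} nm")
```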
 
Last edited:

GWestphal

Golden Member
Jul 22, 2009
1,120
0
76
I think much less than 10 years. If they follow the roughly 50% area rule, then we have 32 nm right now, 22 nm next year, and 14 nm the year after that. So, three years and we hit the limit?
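That cadence can be sketched directly: a ~50% area shrink per node means linear dimensions scale by ~0.7 each step. Starting from the 32 nm figure above (the floor and starting point are assumptions from this thread, not a roadmap):

```python
node, steps = 32.0, 0
while node > 10:   # the ~10 nm floor doubted above
    node *= 0.7    # ~0.7x linear shrink = ~50% area per node
    steps += 1
print(f"{steps} node steps to pass 10 nm, landing at ~{node:.1f} nm")
```

With one node per year that's only a few years out; with the historical two-year cadence it stretches closer to a decade.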

A quick glance at Google tells me that hafnium may not be great due to its rarity. Though maybe this isn't an issue; IIRC, doesn't all silicon for ICs come from a single quarry in Israel or something to that effect? Apparently they have amazingly pure silicon there. Does this hafnium insulator create 100x less gate leakage at current sizes, and does the benefit scale linearly with size reduction?

To me it seems like DNA computing and quantum computing are pipe dreams at this point. There is not a single consumer device that can perform even rudimentary calculations with either. I would put both of those at 20-40 years out for the mainstream.
 

GWestphal

Golden Member
Jul 22, 2009
1,120
0
76
I suppose it would be great from a consumer standpoint. Foundries would all have the same process capability, so they couldn't compete with each other in that arena; basically that leaves price competition.
 

Soccerman06

Diamond Member
Jul 29, 2004
5,830
5
81
Isn't 16 nm slated for sometime in 2013-2014 with Rockwell? So I wouldn't say 16 nm is that close. 11 nm is expected sometime in 2016, 8 nm in 2018, and so on, but I would assume fabs will slow down after 11 nm since we still don't have a cheap and viable replacement for silicon (or do we?), so I see a 3D structure coming in one way or another. Since heat seems to be becoming less of an issue than it used to be, and the max operating temp seems to keep rising with each gen, I can see 3D architecture with either a built-in water-radiator setup or high-quality copper heatsinks.
 

Matt1970

Lifer
Mar 19, 2007
12,320
3
0
LOL Hey soccerman, I use that same avatar in Pokerstars :)

Everything I have read says 8 nm is too thin.
 

GWestphal

Golden Member
Jul 22, 2009
1,120
0
76
I see that as helping with inter-core communication, but I don't think it will replace silicon logic. The 3D ICs could be interesting; having 100 cores in a cube with water cooling would be pretty nifty. Though for mobile applications I'm not sure it would work well.
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
Honestly, aside from obvious physics, one of the biggest things holding CPUs back is the artificial limit Intel/AMD decided to settle on for CPU power consumption. I'm glad GPU makers decided to give the PCI-e standard the middle finger. If CPU makers could do the same I could see both stock and overclocked frequencies going up significantly.

I don't see that happening until they hit a huge wall in process technology.
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
Honestly, aside from obvious physics, one of the biggest things holding CPUs back is the artificial limit Intel/AMD decided to settle on for CPU power consumption. I'm glad GPU makers decided to give the PCI-e standard the middle finger. If CPU makers could do the same I could see both stock and overclocked frequencies going up significantly.

I don't see that happening until they hit a huge wall in process technology.

300W processors? No thanks.
 

Matthiasa

Diamond Member
May 4, 2009
5,755
23
81
The power limit does in part have to do with physics, though. :p
Too hot or too cold and semiconductor devices don't work. :p
On top of that, higher-wattage CPUs would need significantly better/stronger cooling (and thus would most likely be louder).
High-power parts are extremely loud as it is. :(
 

wuliheron

Diamond Member
Feb 8, 2011
3,536
0
0
I'd bet that a 3D IC would cost a fortune.

There are already rumors that Intel has managed to piggy-back 1 GB of VRAM onto an Ivy Bridge CPU/GPU for bandwidth roughly equal to a Radeon 5770. That kind of modest 3D architecture is already possible and should come down in price in the near future. It also follows the general trend of moving everything on the motherboard except the system RAM and storage onto a single chip, and it doesn't require extreme cooling.

The only reasons to continue with silicon are price and the already-available infrastructure and research. Graphene can do 100 GHz, but that's still nothing compared to optical circuitry, which could manage 60 THz. Quantum computers are the only thing theoretically faster, but only for specific types of calculations. Unfortunately, silicon isn't very good for optical circuitry, and keeping the chip features small requires advanced plasmonic physics.
 

A5

Diamond Member
Jun 9, 2000
4,902
5
81
Honestly, aside from obvious physics, one of the biggest things holding CPUs back is the artificial limit Intel/AMD decided to settle on for CPU power consumption. I'm glad GPU makers decided to give the PCI-e standard the middle finger. If CPU makers could do the same I could see both stock and overclocked frequencies going up significantly.

I don't see that happening until they hit a huge wall in process technology.

I think 130W (Prescott, Nehalem, and Westmere all maxed out here) is a reasonable limit for CPUs - once you get past that point you're talking more exotic cooling methods (either large air coolers or water) and major OEMs aren't willing to add $20+/unit to get acceptable cooling for a small gain in performance.

Overclockers already ignore TDP limits, so I think the only gain would be in stock clocks. Intel could put a 5GHz Sandy Bridge on the market, but they'd have to give it some crazy TDP number like 200W (I don't know the actual SB power draw at the necessary voltage to get there - I'm sure someone here does though), which means no OEM would touch it and no overclockers would get it because it would cost $1500...
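The 200 W guess is roughly what the standard dynamic-power relation (P ~ C·V²·f) predicts. A sketch with illustrative baseline numbers, not measured Sandy Bridge figures:

```python
def scaled_power(p_base, f_base, f_new, v_base, v_new):
    """Dynamic power scaling: P ~ C * V^2 * f, with capacitance held fixed."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Illustrative guess: a 95 W part at 3.4 GHz / 1.20 V pushed to
# 5.0 GHz / 1.45 V (the extra voltage is what really hurts).
p = scaled_power(95, 3.4, 5.0, 1.20, 1.45)
print(f"~{p:.0f} W")
```

The quadratic voltage term is why a ~50% clock bump can land in the ~200 W ballpark.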
 