Originally posted by: socketman1
This seems more like a "Highly Technical" forum question, but here I go.
A few years ago Stanford University was hosting video files on all types of technical subjects, one of them being computers. Intel engineers gave a lecture somewhere about the barriers they face. Here is a highly butchered and marginally inaccurate summary of the video (it's been a couple of years since I've seen it).
Quantum tunneling:
As you know, electrons flow through a circuit and cross gates. These gates either allow electrons to pass or they don't. As microprocessor dies have shrunk, so has the width of these gates: the die shrinks, more transistors are added, and narrower gates are just a natural step. Now, quantum physics says electrons can "tunnel," i.e. randomly jump from one place to another over very short distances. Gate lengths are now so short that they feel the effect of these jumps: charge leaks across gates that should be closed, and the circuit effectively shorts itself out.
Engineers do have solutions for this, but the smaller (and hence faster) we get, the more it's going to happen.
Source of this info
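To put rough numbers on why thinner gates leak more, here's a quick back-of-the-envelope sketch (not from the lecture) using the standard WKB-style estimate for tunnelling through a rectangular barrier. The barrier height and thicknesses below are generic assumptions for illustration, not actual Intel process figures.

```python
import math

HBAR = 1.055e-34   # reduced Planck constant, J*s
M_E = 9.109e-31    # electron mass, kg
EV = 1.602e-19     # one electron-volt in joules

# Assumed barrier height in eV (ballpark for a SiO2 gate oxide); purely illustrative.
BARRIER_EV = 3.1

def tunnel_probability(thickness_nm):
    """WKB-style estimate: T ~ exp(-2 * kappa * d) for a rectangular barrier."""
    kappa = math.sqrt(2 * M_E * BARRIER_EV * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2 * kappa * thickness_nm * 1e-9)

for d in (3.0, 2.0, 1.5, 1.0):
    print(f"{d:.1f} nm barrier -> tunnelling probability ~ {tunnel_probability(d):.1e}")
```

The point is the exponential: every nanometre shaved off the barrier buys several orders of magnitude more leakage, which is why this gets worse with each shrink.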
Photo masking:
Forgive me here, I'm paraphrasing a very complicated concept from memory, but the principle is the important thing. All circuits are etched onto the wafer by various means, typically using a very short-wavelength laser. As the traces, i.e. the width of the circuit paths the electrons flow down, get smaller, so must the wavelength of the light used to print them. Once that wavelength hits a certain point, you can't guarantee the trace will work. Something about the Heisenberg uncertainty principle and the traces needing to be smaller than the shortest-wavelength laser we have.
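The wavelength limit the engineers were describing is usually quantified with the Rayleigh criterion for diffraction-limited optics. Here's an illustrative sketch; the k1 factor and numerical-aperture values are my own assumptions, not numbers from the lecture.

```python
def min_feature_nm(wavelength_nm, numerical_aperture, k1=0.4):
    """Rayleigh criterion: smallest printable feature ~ k1 * wavelength / NA."""
    return k1 * wavelength_nm / numerical_aperture

# 193 nm ArF lithography with an assumed projection-lens NA
print(f"193 nm light, NA 0.93 -> features around {min_feature_nm(193, 0.93):.0f} nm")

# 13.5 nm EUV for comparison, again with an assumed NA
print(f"13.5 nm light, NA 0.33 -> features around {min_feature_nm(13.5, 0.33):.0f} nm")
```

Roughly speaking, that's why shrinking features below the wavelength of the light takes tricks like shorter wavelengths, immersion optics and multiple patterning.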
So there are two obstacles that need addressing for current and future CPUs.
If any forum members remember those video lectures from 3+ years ago, please chime in here; they were really interesting.