I thought this was a good one -
http://www.pcpro.co.uk/features/360292/how-much-smaller-can-chips-go
The PC philosophy of piling everything through a CPU, instead of creating dedicated processors for specific tasks as ARM does with smartphones, makes the PC particularly susceptible to the dark silicon problem.
"The PC architecture has taken any intelligence out of peripheral devices and runs it on the processor," he claimed. "Something like an Ethernet controller has been dumbed down. For a low-power architecture, that's the wrong approach. That leads you to having one big, hot processor."
Very true; at a certain point almost everything becomes either a fashion statement or a commodity. I suspect that desktop PCs are almost there. I have an 18-month-old PC that I bought for $600, and it can run most games at settings that most people would have a tough time distinguishing from maximum. Like an oven or a refrigerator, there are things that a more expensive model would do better (encode that video in 12 minutes instead of 20, etc.), but for 99% of tasks the difference is already academic.
On a positive note, there may not be a need to go that much smaller.
What really drives the need for faster computing is the vast amounts of data processing needed for higher resolution video/images, and we're nearing the point where the eye can't distinguish it anyway, which is a natural limit we can't really go beyond.
So after EUV, can they go to X-ray and gamma ray? LOL. My personal guess is 8 nm will be the limit. Will CPUs or GPUs reach the limit faster, though? I assume GPUs will hit it faster because they are larger and more complex. In fact, GPUs may only reach 12-16 nm while CPUs will get to 8 nm. I wonder what the next avenue will be to increase performance after the limit is reached. Will Intel need to branch out into other areas as we get beyond 2020? Or will prices go up to compensate? Stay tuned.
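For a rough sense of why 8 nm is only a few generations away: each process generation has historically shrunk linear feature size by about 0.7x (halving transistor area), with a new generation roughly every two years. The starting node and year below are illustrative assumptions, not figures from the article:

```python
# Sketch of generational scaling: ~0.7x linear shrink per generation,
# one generation every ~2 years. Starting point is an assumption for
# illustration only.
node_nm = 22.0      # assumed starting node (nm)
year = 2012         # assumed starting year

while node_nm > 8.0:
    node_nm *= 0.7  # ~0.7x linear shrink halves transistor area
    year += 2       # one generation every ~2 years
    print(f"{year}: ~{node_nm:.0f} nm")
```

On these assumptions, 8 nm arrives after just three more generations, which is why the commenters above treat it as a near-term wall rather than a distant one.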
8 nm being the limit? Yes for scaling every 2 years; no for the ultimate level, which will be a single-atom transistor. Check back with me in 50 years.
There will never be a single-atom transistor, and you can quote me on that. There are few places in engineering where I can say never; this is one of them.
To have a transistor as we use them, AT LEAST three atoms are necessary (base, collector, emitter). And frankly, it is highly unlikely that we will ever get to the state of a tri-atom transistor (unless we somehow create a chemical compound that transfers electrons like a transistor).
Not a transistor I suppose, but it's plausible to do computation with a single molecule. My mistake.
So, when they run up against the physical limits, we shift from bits to qubits, and the acceleration continues.
As long as I can custom build these things, I'll stay in the game.
Why does it have to stop with atoms? Why not subatomic particles? Really, the current theoretical limit is the Planck length. It might take us a few thousand (or million) years to get there, but it's at least theoretically possible.
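A back-of-envelope calculation shows just how far off that limit is: counting how many times you would have to halve an 8 nm feature before reaching the Planck length (~1.6e-35 m). This is purely illustrative arithmetic; nothing suggests devices can actually scale anywhere near that far:

```python
import math

# How many halvings of linear feature size separate an 8 nm
# transistor from the Planck length? Illustrative only.
feature_m = 8e-9        # 8 nm in metres
planck_m = 1.616e-35    # Planck length in metres

halvings = math.log2(feature_m / planck_m)
print(f"~{halvings:.0f} halvings of feature size")
```

The answer is on the order of ninety halvings, which at one halving every couple of years would indeed take centuries, consistent with the "few thousand (or million) years" hedge above.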
I remember when they were saying they could never get the CPU over 1 GHz. Funny.
I remember when they were saying 10 GHz was in our future. It sort of looks like that will never happen now.