What happens when we switch to light CPUs?


TecHNooB
Diamond Member, Sep 10, 2005
Okay, "light-powered CPUs" might not be the technical term, but you know what I mean. What happens to all the VLSI guys? What happens when any technology gets phased out and it happens to be your focus? Back to school?
 

esun
Platinum Member, Nov 12, 2001
Good employees learn as technologies change so they can keep their jobs. I'm sure the guys who learned digital design back in the '70s have had to adapt a lot as technologies changed. Furthermore, old technologies still have plenty of use. 32 nm may be cutting edge for processors, but that doesn't mean every digital device needs 32 nm technology; I can assure you that many, many chips are still produced on 0.25 um processes. Even if some new technology were to overtake CMOS in highly scaled, very-high-performance applications, people would still use older, cheaper technologies where performance is less critical.
 

bobsmith1492
Diamond Member, Feb 21, 2004
Design disciplines like VLSI won't disappear. The gates on the devices will simply change. You probably won't even know the difference, other than... it runs at 100 GHz instead of 100 MHz! :)
 

Vee
Senior member, Jun 18, 2004
I think light-based logic will be superior or competitive only in limited areas.
For a traditional general-purpose CPU, I believe current transistor technology is already beyond what is achievable with light.

Theoretically, light allows phenomenally fast logical state changes. But for more complex processing, the results of the 'gates' need to propagate through a logic lattice, and it seems to me that the speed of light and the wavelength will impose pretty stiff limits on what can ultimately be achieved with light.
Granted, solutions can probably be found that replace current ways of doing things with methods that make good use of raw switching speed instead, but ultimately that still cuts into relative performance. Finally, there is the manufacturing side. Do you really see light-based CPUs being manufactured in a way that competes with ICs, the neat and mature process that today puts some half a billion transistors in your hands for about $100? And what about the future? Current transistor technology is a moving target.

I can't. But OK, I'm old and have already retired once. And I really don't know much about the state of optical circuits. I only know that the wavelength of light is around 300-900 nm, and that light travels only about 60 mm during one clock cycle at 5 GHz. At the 100 GHz switching speed proposed elsewhere in this thread, that shrinks to 3 mm.
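
Here is that arithmetic as a quick sanity check (rough Python; it simply assumes light in vacuum at about 3e8 m/s, and on-chip waveguides would be slower still):

```python
# Distance light travels (in vacuum) during one clock cycle.
C = 3.0e8  # speed of light, m/s (approximate)

for clock_hz in (5e9, 100e9):
    period_s = 1.0 / clock_hz          # one clock cycle, seconds
    distance_mm = C * period_s * 1e3   # metres -> millimetres
    print(f"{clock_hz / 1e9:.0f} GHz -> {distance_mm:.0f} mm per cycle")

# Prints:
# 5 GHz -> 60 mm per cycle
# 100 GHz -> 3 mm per cycle
```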

As I said, I don't know much about optical computing, but I suspect that the editors of the popular science/tech publications that have held up optical computing as the holy grail for decades don't really have a clue either. And keep in mind that the signal already travels at close to the speed of light in traditional electric circuits.

 

spikespiegal
Golden Member, Oct 10, 2005
From what I've read on the topic from the engineers' standpoint (not that I pretend to understand 100% of it), we might see quantum-gate processors before light-based (wave-state / transphasor-based) processors.

Current transistor technology is based on basic electron propagation, and while it's hitting a lot of technological walls, like clock speed, other improvements in processor design are keeping efficiency moving forward. With a light-wave-based circuit, even though you have a lot less energy and heat to deal with, there's also less to work with. Flipping a light-based transistor on and off might seem easy in the lab at small scale (I've seen it done with simple crude polarizers or magneto-optical devices), but scaling up to millions of gates creates a lot of hurdles.

One scientist made the interesting point that there just isn't a practical or economical way to migrate to optical circuits in the short term, and that's the holdback. Then again, Microsoft will need 1,000 GHz computers for Windows 9 just to run the GUI.

I had a great conversation with a buddy of mine back in the early '90s while he was at Rensselaer Polytechnic. Basically, we didn't see the need to develop quantum-based processors, because one in the future had already calculated every particle/location/energy state in the universe. Now I'm starting to get dizzy and need to stop typing.
 

Comdrpopnfresh
Golden Member, Jul 25, 2006
Originally posted by: Vee
[...] I only know that the wavelength of light is around 300-900 nm, and that light travels only about 60 mm during one clock cycle at 5 GHz. At the 100 GHz switching speed proposed elsewhere in this thread, that shrinks to 3 mm.

Laboratory experiments have been able to slow light to a near stop, and optical gates have been made that can hold on to a stream of light; both of those would negate the speed of light being an issue. The wavelength would only be an issue when trying to shrink things down further, and experiments and ideas have been developed that take care of that too.
 

Vee
Senior member, Jun 18, 2004
Originally posted by: Comdrpopnfresh
Laboratory experiments have been able to slow light to a near stop, and optical gates have been made that can hold on to a stream of light; both of those would negate the speed of light being an issue. The wavelength would only be an issue when trying to shrink things down further, and experiments and ideas have been developed that take care of that too.

I think you misunderstood the estimate I was trying to make. My point was that lightspeed is too slow. The physical constraints on a general-purpose optical processor make it ultimately unable to compete with a transistor IC: the gain in achievable switching speed cannot compensate for lower complexity and larger dimensions. Not for general-purpose processors, and not for vector processors.
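
To make that concrete, here is the sort of back-of-envelope comparison I have in mind (a Python sketch; the ~1 um optical gate pitch, the 120 nm CMOS pitch, the clock figures, and the 0.5c wire velocity are illustrative assumptions of mine, nothing more):

```python
import math

# Rough illustration: faster switching can't buy back what larger
# feature sizes cost. All figures are illustrative assumptions.
C = 3.0e8      # speed of light in vacuum, m/s
GATES = 500e6  # gate count, echoing the "half billion transistors" figure

# (gate pitch in m, clock in Hz, signal velocity in m/s)
scenarios = {
    "optical, 1 um pitch, 100 GHz": (1.0e-6, 100e9, C),      # light at ~c
    "CMOS, 120 nm pitch, 5 GHz":    (120e-9, 5e9, 0.5 * C),  # wires at ~0.5c
}

for name, (pitch_m, clock_hz, signal_v) in scenarios.items():
    side_m = math.sqrt(GATES) * pitch_m       # side of a square die holding GATES gates
    cycles = (side_m / signal_v) * clock_hz   # cross-die signal latency, in clock cycles
    print(f"{name}: die side {side_m * 1e3:.1f} mm, "
          f"cross-die latency ~{cycles:.1f} cycles")

# Prints:
# optical, 1 um pitch, 100 GHz: die side 22.4 mm, cross-die latency ~7.5 cycles
# CMOS, 120 nm pitch, 5 GHz: die side 2.7 mm, cross-die latency ~0.1 cycles
```

Even if you grant optics a 20x clock advantage, the physically larger die spends several of its own cycles just moving a signal across itself, while the CMOS die is crossed in a fraction of a cycle.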

 

KIAman
Diamond Member, Mar 7, 2001
To the OP: I think there will be enough VLSI work to keep all the VLSI guys happy, light-based CPUs or not. In the end, everything is determined by cost. If a company has invested millions in silicon-based infrastructure, it will continue to operate and maintain it unless there is a critical requirement that only light CPUs can satisfy. It's little wonder the word "legacy" pops up so often in conjunction with IT and technology.

By the time the majority of CPUs are light-based and there's no longer a need to maintain legacy technology, today's VLSI guys will be long dead. Keep in mind that the majority of CPUs aren't even in computers.
 
Dec 30, 2004
We won't switch outright; more likely we'll just move to optical interconnects between all the vital components. Or everything might get so small that we put it all on the same chip, and then we have no need for optics.
 