Light based circuitry

oldman420

Platinum Member
May 22, 2004
2,179
0
0
I have a feeling that in the future computers will go to a light-based vs. an electron-based way of processing information.
Am I insane?
 

thermalpaste

Senior member
Oct 6, 2004
445
0
0
Originally posted by: oldman420
I have a feeling that in the future computers will go to a light-based vs. an electron-based way of processing information.
Am I insane?

Copper reaches a saturation point where it cannot sustain extremely high bandwidth, because of stray EMI, interference, et al. One way to overcome this is to try something the 'RAMBUS' way, by serializing data. But this poses one problem: high latency. The other popular way is the 'DDR' way, where data is transferred on both the rising edge and the falling edge of a clock cycle. However, at a certain point in time we may require more bandwidth.
Fibre optics can be used; let me give you a really simple example. If we have a strand of fibre optic cable with an LED attached at one end and an LDR (light-dependent resistor) at the other, the LED when switched on represents a binary 1 and changes the resistance value; when off, it is a 0. We are transferring data this way over a fibre optic cable.
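The LED/LDR link described above is essentially on-off keying. A minimal sketch in Python — purely illustrative; the resistance values and decision threshold are assumptions, not real hardware parameters:

```python
# Toy model of the LED -> fibre -> LDR link: the LED being on (binary 1)
# drops the LDR's resistance; off (binary 0) leaves it high.

LIT_RESISTANCE = 1_000      # ohms when light hits the LDR (assumed value)
DARK_RESISTANCE = 500_000   # ohms in darkness (assumed value)
THRESHOLD = 100_000         # receiver's decision threshold (assumed value)

def transmit(bits):
    """LED side: each bit becomes a light level (1 = on, 0 = off)."""
    return [bit == 1 for bit in bits]

def receive(light_levels):
    """LDR side: map the resulting resistance back to bits."""
    out = []
    for lit in light_levels:
        resistance = LIT_RESISTANCE if lit else DARK_RESISTANCE
        out.append(1 if resistance < THRESHOLD else 0)
    return out

message = [1, 0, 1, 1, 0, 0, 1, 0]
assert receive(transmit(message)) == message
```

This captures only the logical scheme; a real link would also need clock recovery and a photodetector much faster than an LDR.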
 

RearAdmiral

Platinum Member
Jun 24, 2004
2,280
135
106
I believe some place in Israel built something like this very recently — something that can process info with light instead of electrons. Quantum computing will most likely be the future of processing, in my opinion.
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
Originally posted by: thermalpaste
Originally posted by: oldman420
I have a feeling that in the future computers will go to a light-based vs. an electron-based way of processing information.
Am I insane?

Copper reaches a saturation point where it cannot sustain extremely high bandwidth, because of stray EMI, interference, et al. One way to overcome this is to try something the 'RAMBUS' way, by serializing data. But this poses one problem: high latency. The other popular way is the 'DDR' way, where data is transferred on both the rising edge and the falling edge of a clock cycle. However, at a certain point in time we may require more bandwidth.
Fibre optics can be used; let me give you a really simple example. If we have a strand of fibre optic cable with an LED attached at one end and an LDR (light-dependent resistor) at the other, the LED when switched on represents a binary 1 and changes the resistance value; when off, it is a 0. We are transferring data this way over a fibre optic cable.

Serializing and DDR do nothing to fix THIS problem (but they do fix others). Assuming a typical PAM-2 transmission, meaning 1 bit per symbol and 2 possible levels, if you want to achieve Gbps transfer rates you'll be sending symbols through the channel at very high frequencies. The problem is that the channel has serious degradation and reflection problems which may corrupt the incoming signals to the point where they're unreadable. We can continue using pre-emphasis and receiver equalization to deal with it, but at some point it's just not gonna work anymore. For very short distances I don't see optical connections as necessary, but for long traces, perhaps.

Edit: Ahh.. but for light based processing... I dunno. It would be rather interesting.
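To put numbers on the PAM-2 point: with 1 bit per symbol, the symbol (baud) rate on the wire must equal the bit rate, which is what pushes the channel so hard. A rough sketch — the PAM-4 comparison is my addition, not from the post above:

```python
import math

def symbol_rate(bit_rate_bps, levels):
    """Baud rate needed for a given bit rate with `levels` amplitude levels.
    Bits per symbol = log2(levels)."""
    bits_per_symbol = math.log2(levels)
    return bit_rate_bps / bits_per_symbol

# 10 Gbps over PAM-2 (2 levels, 1 bit/symbol): 10 Gbaud on the wire.
assert symbol_rate(10e9, 2) == 10e9

# The same 10 Gbps over PAM-4 (4 levels, 2 bits/symbol): only 5 Gbaud,
# at the cost of a smaller voltage margin between adjacent levels.
assert symbol_rate(10e9, 4) == 5e9
```

More levels trade channel bandwidth for noise margin, which is why equalization and pre-emphasis only postpone, rather than remove, the copper limit the post describes.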
 

Fencer128

Platinum Member
Jun 18, 2001
2,700
1
91
Hi,

The big problem in using photons (as opposed to electrons) is that of interaction. If you wish to send separate channels of data over large distances with minimal interference, then photons are great. If you want to have two channels interact in order to produce a third channel (i.e. a computation), then photons are next to useless. It is extremely difficult to get multiple photons to interact. Sure, non-linear optics could be used (second harmonic generation, Kerr lensing, etc.) but none of this works anywhere near the scale of the electronic IC, or at power levels equivalent to those circuits. Add to this the lack of a photonic equivalent of electronic IC memory, and the current state of affairs means you shouldn't expect to see photon-based CPUs anytime soon. Also, at the current rate of development, it would take something equivalent to the transistor in the field of photonics to give a photon-based CPU any hope of catching up with conventional electronic technology. Now, a spintronic-based CPU is much more viable...

It's not a question of cost per se - it's more the lack of technology.

Cheers,

Andy
 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
Originally posted by: Fencer128
Hi,

The big problem in using photons (as opposed to electrons) is that of interaction. If you wish to send separate channels of data over large distances with minimal interference, then photons are great. If you want to have two channels interact in order to produce a third channel (i.e. a computation), then photons are next to useless. It is extremely difficult to get multiple photons to interact. Sure, non-linear optics could be used (second harmonic generation, Kerr lensing, etc.) but none of this works anywhere near the scale of the electronic IC, or at power levels equivalent to those circuits. Add to this the lack of a photonic equivalent of electronic IC memory, and the current state of affairs means you shouldn't expect to see photon-based CPUs anytime soon. Also, at the current rate of development, it would take something equivalent to the transistor in the field of photonics to give a photon-based CPU any hope of catching up with conventional electronic technology. Now, a spintronic-based CPU is much more viable...

It's not a question of cost per se - it's more the lack of technology.

Cheers,

Andy

Yep, Andy's explanation is pretty spot-on. There is work being done on optical transistors, but so far they are very much experimental and pretty much built one at a time. Actually, from what I've been told, a lot of the time they're grown, since they're often crystal-based. I think there are also micro-mirror schemes being worked on, but I'm not positive whether they have found any success with them.

In addition, the optical buffer storage problem doesn't have any solution in sight. In short, optical computing is a pipe dream right now.
 

thermalpaste

Senior member
Oct 6, 2004
445
0
0
Originally posted by: TuxDave
Originally posted by: thermalpaste
Originally posted by: oldman420
I have a feeling that in the future computers will go to a light-based vs. an electron-based way of processing information.
Am I insane?

Copper reaches a saturation point where it cannot sustain extremely high bandwidth, because of stray EMI, interference, et al. One way to overcome this is to try something the 'RAMBUS' way, by serializing data. But this poses one problem: high latency. The other popular way is the 'DDR' way, where data is transferred on both the rising edge and the falling edge of a clock cycle. However, at a certain point in time we may require more bandwidth.
Fibre optics can be used; let me give you a really simple example. If we have a strand of fibre optic cable with an LED attached at one end and an LDR (light-dependent resistor) at the other, the LED when switched on represents a binary 1 and changes the resistance value; when off, it is a 0. We are transferring data this way over a fibre optic cable.

Serializing and DDR do nothing to fix THIS problem (but they do fix others). Assuming a typical PAM-2 transmission, meaning 1 bit per symbol and 2 possible levels, if you want to achieve Gbps transfer rates you'll be sending symbols through the channel at very high frequencies. The problem is that the channel has serious degradation and reflection problems which may corrupt the incoming signals to the point where they're unreadable. We can continue using pre-emphasis and receiver equalization to deal with it, but at some point it's just not gonna work anymore. For very short distances I don't see optical connections as necessary, but for long traces, perhaps.

Edit: Ahh.. but for light based processing... I dunno. It would be rather interesting.



Perhaps I didn't frame the answer properly. I was actually referring to the increase in bandwidth and the saturation point... and you have explained it correctly.
 

oldman420

Platinum Member
May 22, 2004
2,179
0
0
What about using different frequencies of light?
This way, for instance, a receiver gets a red and a blue photon (0 1) and detects them as purple (10), or a slightly different frequency, and then outputs accordingly.
I am in way over my head, huh?
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
Originally posted by: oldman420
What about using different frequencies of light?
This way, for instance, a receiver gets a red and a blue photon (0 1) and detects them as purple (10), or a slightly different frequency, and then outputs accordingly.
I am in way over my head, huh?

Umm... our EYES may see them as purple, but an optical receiver won't see purple. It'll just see a blue and a red photon. Light of two different frequencies normally does not interfere with each other. But anyways, for the most part, that is how fiber optic systems work. They find a frequency range that the optical fiber can support, then they try their best to pack as many different frequencies into that range as possible to get higher throughput.
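That last point is wavelength-division multiplexing: independent bit streams ride on different wavelengths in the same fibre and are filtered apart at the far end. A toy Python sketch, assuming ideal per-wavelength filters (real WDM channel plans and filter crosstalk are far more involved):

```python
def multiplex(channels):
    """Combine per-wavelength bit streams into one 'fibre' signal:
    at each time slot the fibre carries the set of wavelengths that are lit."""
    length = len(next(iter(channels.values())))
    return [{wl for wl, bits in channels.items() if bits[t]} for t in range(length)]

def demultiplex(signal, wavelengths):
    """Ideal receiver: a perfect filter per wavelength recovers each stream."""
    return {wl: [1 if wl in slot else 0 for slot in signal] for wl in wavelengths}

# Two independent streams on (hypothetical) 1550 nm and 1310 nm carriers.
channels = {"1550nm": [1, 0, 1, 1], "1310nm": [0, 1, 1, 0]}
recovered = demultiplex(multiplex(channels), channels.keys())
assert recovered == channels
```

Because the two wavelengths don't interfere, throughput scales with the number of usable carriers — which is exactly why they "pack in" as many frequencies as the fibre supports.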
 

cirthix

Diamond Member
Aug 28, 2004
3,616
1
76
a 'purple photon' is just a photon that has a wavelength of whatever purple is (around 400nm or somethin like that), it is NOT a blue and a red photon combined or next to each other or anything like that.
 

Vee

Senior member
Jun 18, 2004
689
0
0
Originally posted by: oldman420
I have a feeling that in the future computers will go to a light-based vs. an electron-based way of processing information.
Am I insane?

We're most surely going to see more and more optical communications and transfers, and most posts in this thread seem to be about that. Some kinds of related filtering and processing could conceivably be done with optical logic.

As for complex optical processing as in a processor, using standing light waves as logical elements and interference as logic operations — no, I don't think so. I believe today's existing transistor CPUs are already way ahead of what is possible to imagine with light, due to the limitations of light waves' physical size, wave divergence, and the speed of light.

The crucial building block of a computer is some switching element. Standing light waves and interference might have seemed like an exciting idea 18 years ago, but they aren't today. The reason it was exciting then was that it seemed switching could be done so much faster with light-wave interference.

Today, a consumer-class CPU has about 120 million transistors and switches at about 3GHz. At 3GHz, light only travels about 4 inches between clocks. Light then has to have a wavelength at which it can interact with solid-state components, to form some kind of logic lattice with mirrors, half-mirrors and whatever. My guess is that will be a rather red light, but the problems are so big anyway that it doesn't matter much.
Both the light wave's physical size and the fact that narrow light beams diverge a lot place physical size demands on a switching CPU light lattice — so severe that light speed will constrict performance.
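The "4 inches per clock" figure above checks out; a quick calculation, assuming nothing beyond the quoted 3 GHz clock:

```python
c = 299_792_458          # speed of light in vacuum, m/s
clock_hz = 3e9           # 3 GHz, as in the post

distance_per_clock_m = c / clock_hz               # metres per clock period
distance_per_clock_in = distance_per_clock_m / 0.0254  # convert to inches

print(f"{distance_per_clock_m * 100:.1f} cm "
      f"= {distance_per_clock_in:.1f} inches per clock")
# Roughly 10 cm, i.e. about 4 inches — and less in any real medium,
# where light travels at c/n for refractive index n.
```

So any optical logic lattice spanning more than a few centimetres would burn whole clock cycles just on propagation, which is the size constraint being described.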

As if this were not enough, you then also have to compete with the elegance, convenience, and economy of the integrated-circuit manufacturing process.

I think the next switching element could be molecular transistors. Molecular semiconductors can be, and already have been, produced today. That's not the difficult part. The hard thing is connecting them together to form a functioning logical circuit of some complexity, at affordable production cost.
 

RaynorWolfcastle

Diamond Member
Feb 8, 2001
8,968
16
81
Originally posted by: Vee
Originally posted by: oldman420
I have a feeling that in the future computers will go to a light-based vs. an electron-based way of processing information.
Am I insane?

We're most surely going to see more and more optical communications and transfers, and most posts in this thread seem to be about that. Some kinds of related filtering and processing could conceivably be done with optical logic.

As for complex optical processing as in a processor, using standing light waves as logical elements and interference as logic operations — no, I don't think so. I believe today's existing transistor CPUs are already way ahead of what is possible to imagine with light, due to the limitations of light waves' physical size, wave divergence, and the speed of light.

The crucial building block of a computer is some switching element. Standing light waves and interference might have seemed like an exciting idea 18 years ago, but they aren't today. The reason it was exciting then was that it seemed switching could be done so much faster with light-wave interference.

Today, a consumer-class CPU has about 120 million transistors and switches at about 3GHz. At 3GHz, light only travels about 4 inches between clocks. Light then has to have a wavelength at which it can interact with solid-state components, to form some kind of logic lattice with mirrors, half-mirrors and whatever. My guess is that will be a rather red light, but the problems are so big anyway that it doesn't matter much.
Both the light wave's physical size and the fact that narrow light beams diverge a lot place physical size demands on a switching CPU light lattice — so severe that light speed will constrict performance.

As if this were not enough, you then also have to compete with the elegance, convenience, and economy of the integrated-circuit manufacturing process.

I think the next switching element could be molecular transistors. Molecular semiconductors can be, and already have been, produced today. That's not the difficult part. The hard thing is connecting them together to form a functioning logical circuit of some complexity, at affordable production cost.


I'm not sure why you'd say that the speed of light would be a limiting factor; photons have no rest mass and therefore almost always travel faster than electrons (although there are cases where this is not true).

Also, there is definitely still research being done on optical transistors. I had a professor who gave an informal talk last year about his research; it mostly had to do with growing crystals whose structure gave them properties making them the optical equivalent of a transistor. That said, these transistors were still at the one-transistor-at-a-time, hand-built stage, and thus nowhere near commercial viability.

Essentially, the problem is that with current technologies, photonics are better for communication while electronics (and spintronics) are better for data processing.
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
Originally posted by: cirthix
a 'purple photon' is just a photon that has a wavelength of whatever purple is (around 400nm or somethin like that), it is NOT a blue and a red photon combined or next to each other or anything like that.

Exactly. If you send out a red and a blue photon, you'll get a red and a blue photon, not a purple photon. Our eyes may see the pair as purple, but there is no true purple photon there.
 

Vee

Senior member
Jun 18, 2004
689
0
0
Originally posted by: RaynorWolfcastle
I'm not sure why you'd say that the speed of light would be a limiting factor; photons have no rest mass and therefore almost always travel faster than electrons (although there are cases where this is not true).
You're absolutely right. Electrons in an electrical circuit move very slowly indeed. But I mentioned this indirectly: light logic is interesting because it promises to switch much faster. For a transistor to switch, a certain number of electrons has to move into it.

But in processor performance there are two different speeds to consider: the speed at which a signal travels, which is pretty much light speed for both electricity and light, and the speed at which a 'switch' propagates through the circuitry. The switch propagation through light logic could be pretty much instantaneous. While this would mean light logic is much different from electric logic — where we avoid having overly long logical chains of transistors that need to switch — my guess is still that the sheer physical size of light logic will mean that signal speed limits performance. To the extent that even the best theoretically conceivable design (though I have only made a rough, dilettante estimate) still can't match what is already being achieved in current mass-production, consumer-class items today.

But that doesn't mean light logic won't be vastly superior for many uses. I just don't see it ever competing in the CPU arena.

 

thermalpaste

Senior member
Oct 6, 2004
445
0
0
Originally posted by: Gannon
Couldn't optics be used for buses/trace replacements though?




Exactly, I was thinking along these lines. The speed of light is 3*10^8 m/s. Let us assume that we are using a simple LED as a data transmitter, optic fibre as the medium, and an LDR as a receiver. There are no bandwidth limitations on the transmitter side, and EMI/stray signals won't interfere.
There is one limitation, though: the receiver has to be quick enough to pick up the bits and feed them to the corresponding circuitry. One way to overcome this limitation is to have a semi-parallel transmission.
If we are transferring 64 bits of data at a given time, we can have 16 LEDs (16 bits), 16 receivers, and 4 cycles for the complete 64-bit transfer.

If somebody has more information on this, please do the honours... I am simply speculating and haven't come across this anywhere.
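The semi-parallel scheme above (64 bits over 16 optical lanes in 4 cycles) can be modelled directly. A toy sketch using the post's numbers; the lane count and word size are just those figures, not any real bus spec:

```python
WORD_BITS = 64
LANES = 16                      # 16 LEDs paired with 16 receivers
CYCLES = WORD_BITS // LANES     # 4 cycles per 64-bit word

def send_word(bits):
    """Transmitter: split a 64-bit word into 4 successive 16-bit lane groups."""
    assert len(bits) == WORD_BITS
    return [bits[i * LANES:(i + 1) * LANES] for i in range(CYCLES)]

def receive_word(cycle_groups):
    """Receivers: reassemble the lane groups back into the original word."""
    return [bit for group in cycle_groups for bit in group]

word = [i % 2 for i in range(WORD_BITS)]         # arbitrary test pattern
assert len(send_word(word)) == CYCLES            # 4 transfers on the fibre bundle
assert receive_word(send_word(word)) == word     # round-trips intact
```

The trade-off is the usual serial-vs-parallel one: fewer lanes means more cycles per word, while more lanes means more transmitters, receivers, and fibres to keep skew-aligned.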
 

f95toli

Golden Member
Nov 21, 2002
1,547
0
0
The problem is that you have to convert the electrical signal to light and then back, and unfortunately it is difficult to do this really fast (I mean REALLY fast, on the order of a few ps, which is what you need in a real circuit).
I know of several projects where they are trying to interface high-speed electronics (>100 GHz) using light, but progress has been slow (note that I am not referring to parallel data streams which you can multiplex into a single fiber; in that case you can reach 100 Gbps quite easily, AFAIK).
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,597
6,076
136
Two answers to faster computing... SUPERCONDUCTORS and DIAMOND!
 

f95toli

Golden Member
Nov 21, 2002
1,547
0
0
Superconducting circuits are unfortunately not THAT fast; if I remember correctly, the current record is about 300 GHz for a very simple circuit.
"Real" circuits can run at about 60 GHz at best, which is (unfortunately) not much better than semiconductor-based circuits.
 

thermalpaste

Senior member
Oct 6, 2004
445
0
0
Originally posted by: ariafrost
Two answers to faster computing... SUPERCONDUCTORS and DIAMOND!

I have only 2 words to describe ultra-fast computing in the future:
serialize, progress.
Serialize: allows faster data transfer over 'long' distances (long = distance between buses, etc.).
Progress: find a better medium to propagate data at high speeds.
 

Alistar7

Lifer
May 13, 2002
11,978
0
0
Originally posted by: oldman420
I have a feeling that in the future computers will go to a light-based vs. an electron-based way of processing information.
Am I insane?

Nope, you are right on target — unless someone comes up with a traditional bus that can exceed the speed of light...