IBM's electronic-photonic integrated chip

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
IBM has become the first company to integrate electrical and optical components on the same chip, using a standard 90nm semiconductor process. These integrated, monolithic chips will allow for cheap chip-to-chip and computer-to-computer interconnects that are thousands of times faster than current state-of-the-art copper and optical networks. Where current interconnects are generally measured in gigabits per second, IBM’s new chip is already capable of shuttling data around at terabits per second, and should scale to peta- and exabit speeds.

http://www.extremetech.com/computin...commercially-viable-silicon-nanophotonic-chip
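
Back-of-the-envelope on how you get from gigabit links to terabit chips (illustrative numbers, not figures from the article): the photonics side multiplexes many wavelength channels onto a single waveguide, so the aggregate bandwidth is the channel count times the per-channel rate, e.g.

$$ 40\ \text{wavelengths} \times 25\ \mathrm{Gb/s\ per\ channel} = 1\ \mathrm{Tb/s} $$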

so... first step to a photon-based CPU?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91

Photon-based computing is possible but the devices would be absolutely huge.

What makes electron-based computing possible is that the wavelength of a free electron in copper or silicon is quite small, on the order of a nanometer.
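
A quick sanity check on that number (my arithmetic, assuming a conduction electron at roughly copper's ~7 eV Fermi energy): the de Broglie wavelength is

$$ \lambda = \frac{h}{\sqrt{2 m_e E}} = \frac{6.63 \times 10^{-34}\ \mathrm{J\,s}}{\sqrt{2 \times (9.11 \times 10^{-31}\ \mathrm{kg}) \times (7 \times 1.6 \times 10^{-19}\ \mathrm{J})}} \approx 0.5\ \mathrm{nm} $$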

Try to confine the electron to a space smaller than that and you find that the energy required begins to rise dramatically (as one would expect, given the binding energy of an electron to a single proton in the hydrogen atom).

Confining photons to such a tiny space requires even more energy, such that you are looking at using x-ray photons (even more energetic than EUV, which has a wavelength of 13.5nm) if you want circuits comparable in size to today's circuits.
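
Putting numbers on that (my arithmetic): photon energy goes as $E = hc/\lambda \approx 1240\ \mathrm{eV\,nm}/\lambda$, so

$$ \lambda = 13.5\ \mathrm{nm}\ (\text{EUV}) \Rightarrow E \approx 92\ \mathrm{eV}, \qquad \lambda = 1\ \mathrm{nm} \Rightarrow E \approx 1.24\ \mathrm{keV}\ (\text{x-ray}) $$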

Instead, what you end up with is a chip that is absolutely humongous, because the optical waveguides and related structures needed for opto-computing with IR or visible light end up driving design rules on the order of 0.5-1um.
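
A sketch of where that figure comes from (standard diffraction-limit reasoning, not something from the original post): a dielectric waveguide can only confine light down to roughly half the wavelength in the material, $\lambda/(2n)$. For 1550 nm telecom light in silicon ($n \approx 3.5$):

$$ \frac{\lambda}{2n} = \frac{1550\ \mathrm{nm}}{2 \times 3.5} \approx 220\ \mathrm{nm} $$

Add bend radii, coupler lengths, and waveguide-to-waveguide spacing on top of that minimum feature, and practical layout pitches land in the 0.5-1um range.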

Ridiculously faster than using electrons, sure, but also silly-large chips that fundamentally cannot benefit from Moore's Law-style scaling in the economics department.

For now, the electron is ideally suited for nanoelectronics owing to the very properties that make this lepton the sub-atomic particle that it is.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,116
136
Photons for interconnects and electron quantum effects for compute is the way to go :thumbsup:
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
Idontcare said:
Photon-based computing is possible but the devices would be absolutely huge. [...]

Why do I think this is Ph.D. territory? o_O
 

Haserath

Senior member
Sep 12, 2010
793
1
81
IBM is using the photons as an interconnect only.

Still, it would be interesting to see a chip running at terahertz speeds, even if it weren't as wide as modern processors. I could imagine photons becoming a competitor to electrons if a process could be established for them.
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
Yet the CPU still has to run through a million different variables just to get to the end of a calculation; modern computing is grossly inefficient. Molecular computing is the next plausible step, unless you prefer exposing yourself to large amounts of radiation while browsing the internet.
 