How does optical computing fit in here? I understand graphene and nanotubes, but is optical computing pretty much tiny fiber optics? I read somewhere that optical tech at today's process node would give us the equivalent of something like a 30,000,000 GHz processor.
If that's anywhere near true, and anywhere within the next 20 years, that would be awesome.
Optical computing has a number of distinct advantages, but also some drawbacks.
The biggest advantage is bandwidth. An electrical wire can only carry one signal at a time: the voltage and current have exactly one value at any given point along it. If you want another signal, you need another wire.
In an optical system, you can have a HUGE number of signals in the same physical space, because light at different wavelengths doesn't interfere. This is wavelength-division multiplexing (WDM): typical counts for fiber telecom systems are 40x, where each individual piece of glass carries 40 distinct, separable, independently switchable signals at once. The same idea can theoretically be applied to an optical processor, which gets around some of the size issues involved: if one channel carries 40 signals, that channel can be 40 times the physical size of a single-signal wire and still deliver the same overall throughput, as in the sketch below.
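To make the throughput arithmetic concrete, here's a quick back-of-the-envelope sketch in Python. The 40-channel count comes from the paragraph above; the 10 Gb/s per-channel rate is just an assumed illustrative figure, not a spec of any particular system:

```python
# Back-of-the-envelope WDM throughput comparison.
# Assumption: 10 Gb/s per wavelength channel (illustrative only).
channels = 40            # WDM channel count from the discussion above
per_channel_gbps = 10    # assumed per-wavelength data rate

# One fiber carrying 40 multiplexed signals:
fiber_gbps = channels * per_channel_gbps
print(f"One fiber, {channels} channels: {fiber_gbps} Gb/s")  # 400 Gb/s

# To match that electrically, you need one wire per signal:
wires_needed = fiber_gbps // per_channel_gbps
print(f"Equivalent single-signal wires needed: {wires_needed}")  # 40

# So the optical channel can occupy ~40x the cross-section of one wire
# and still break even on throughput per unit of physical space.
```

Real systems obviously complicate this (per-channel rates vary, and dense WDM systems can pack in far more than 40 channels), but the scaling argument is the point.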
The downside is that electronics is a very well-developed ecosystem at this point, from design to manufacture. There's a reason we use it: it's easy to make switches, transistors, etc., and we know how to set them up to do what we want. Optical computing not only requires new manufacturing processes, it requires new individual components, new design architectures, everything. What does a de-multiplexer at the scale of an integrated circuit look like? How do we design architectures that play to the strengths of optical components and mitigate their weaknesses? There's a lot of engineering to be done before optical computing even approaches the maturity of existing semiconductor-based electronics.