
Optical Chip Enables New Approach to Quantum Computing

OCGuy

Lifer
This is a little above my pay grade, but maybe a few of our engineering-major types can comment on the possibilities for consumers in the next 20 years?

ScienceDaily (Sep. 16, 2010) — An international research group led by scientists from the University of Bristol has developed a new approach to quantum computing that could soon be used to perform complex calculations that cannot be done by today's computers.

http://www.sciencedaily.com/releases/2010/09/100916145049.htm
 

Idontcare

Elite Member
"It is widely believed that a quantum computer will not become a reality for at least another 25 years," says Professor Jeremy O'Brien, Director of the Centre for Quantum Photonics. "However, we believe, using our new technique, a quantum computer could, in less than ten years, be performing calculations that are outside the capabilities of conventional computers."

OK, so you know how "nanotechnology" was all the rage in the media some 3-4 years ago, and everything that could be equated to "nano" was the new "hot" in tech?

And then, rightfully so, professionals in many other industries came out saying "wait a minute, we've been doing nanotechnology for decades, if not centuries... everything relating to chemistry operates at the nano level, from paints to pharmaceuticals to refining petroleum, etc."

Not to mention the people working in the semiconductor industry were saying "welcome to the mid-'90s, when we scaled the gate oxide to less than 100nm :rolleyes:".

So a similar situation is happening with this "quantum" computing hype wagon.

Transistor design has been incorporating and comprehending quantum effects in the actual device functionality for more than a decade now. The CPU in your computer right now functions only because the engineers who designed it were able to properly factor in the quantum effects that manifest at the dimensions involved. (Not to mention that nothing happens without a quantum mechanical "cause" to explain the "event".)
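To put some ballpark numbers on that (my own back-of-the-envelope, nothing from the article): electron tunneling through the gate oxide is exactly the kind of quantum effect designers have to budget for, and it blows up exponentially as the oxide gets thinner. A toy estimate in Python, assuming a simple rectangular barrier, the free-electron mass, and a ~3.1 eV Si/SiO2 barrier height:

```python
import math

# Toy gate-oxide tunneling estimate (illustrative numbers only):
# transmission through a rectangular barrier falls off roughly as
# T ~ exp(-2 * kappa * d), with kappa = sqrt(2*m*(V - E)) / hbar.
hbar = 1.055e-34      # reduced Planck constant, J*s
m_e = 9.11e-31        # free-electron mass, kg (real devices use an effective mass)
eV = 1.602e-19        # joules per electron-volt
barrier = 3.1 * eV    # rough Si/SiO2 conduction-band offset

kappa = math.sqrt(2 * m_e * barrier) / hbar
for d_nm in (3.0, 2.0, 1.2):
    T = math.exp(-2 * kappa * d_nm * 1e-9)
    print(f"oxide {d_nm} nm -> tunneling probability ~ {T:.1e}")
```

Every few angstroms of oxide shifts the leakage by orders of magnitude, which is why you can't design at these dimensions without doing the quantum mechanics.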

So what do you suppose a "conventional" cpu is going to entail in 10-25yrs from now anyway?

What these guys are trying to get at, and doing a poor job of it from what I can tell, is that the fundamental difference here is not the presence of quantum effects but rather their employment as an alternative to the binary transistor-transistor logic we use in our CPUs.

Our conventional CPUs harness quantum effects to enable conventional binary computing; these guys are building chips that use the fuzzy realm of quantum effects to define the very basis of the computing structure (qubits instead of binary bits).
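If you want a feel for the difference (my own toy sketch, nothing to do with the Bristol chip): a classical bit is exactly 0 or 1, while a qubit carries two complex amplitudes and only collapses to 0 or 1 when you measure it. Simulating a single qubit in Python:

```python
import numpy as np

# A classical bit is always exactly 0 or 1.
bit = 1

# A qubit is a pair of complex amplitudes (alpha, beta) with
# |alpha|^2 + |beta|^2 = 1; measuring gives 0 with probability |alpha|^2.
qubit = np.array([1.0, 0.0], dtype=complex)  # starts in state |0>

# A Hadamard gate puts it into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
qubit = H @ qubit

probs = np.abs(qubit) ** 2
print(probs)  # [0.5 0.5] -- genuinely both until measured

# Each measurement collapses the qubit to a definite 0 or 1.
print(np.random.choice([0, 1], size=10, p=probs))
```

The interesting part isn't the randomness, it's that gates act on the amplitudes themselves, so a register of qubits can be steered through interference in ways a bag of random bits can't.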

And they are right, quantum computing does enable certain calculations that a conventional computing system can't touch. It is kind of like analog versus digital CMOS... an analog circuit can give you resolution and capability that no digital circuit could ever touch.
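One way to see where that capability comes from (again, my own illustration, not the article's): n qubits are described by 2^n complex amplitudes that all evolve together under each gate, so the state you're computing with grows exponentially while the hardware only grows linearly:

```python
# Number of amplitudes a quantum state carries vs. qubit count
# (illustrative only; a classical simulator has to track all of them).
for n in (1, 2, 10, 30, 50):
    print(f"{n:>2} qubits -> 2**{n} = {2 ** n:,} amplitudes")
```

Tracking 2^50 amplitudes classically already means petabytes of state, which is the sense in which certain calculations "can't be touched" by a conventional machine.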

But the existence of one does not negate the utility of the other. Quantum computing will never displace conventional computing because there are certain computations that simply aren't amenable to calculation by quantum computing, and vice versa.

So what does this mean to people like you and me? Well, quantum computing makes things like database queries super-fast, as well as encryption and security. So in 10-25 years, things like trusted computing and trusted platforms would see a real benefit from a quantum-based co-processor in our rigs (be that what they may in 25 years).
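The "database queries" part is presumably a reference to Grover's search algorithm, which finds a marked item among N unsorted entries in roughly sqrt(N) steps instead of N (and it's Shor's factoring algorithm that makes people nervous about encryption). A classical toy simulation of Grover's iteration, just to show the shape of it (my sketch, not anything from the Bristol paper):

```python
import numpy as np

# Grover's search over N items for the index `target`.
N = 16
target = 11

# Start in a uniform superposition over all N basis states.
state = np.full(N, 1 / np.sqrt(N))

# The optimal iteration count is about (pi/4) * sqrt(N).
iterations = int(round(np.pi / 4 * np.sqrt(N)))

for _ in range(iterations):
    state[target] *= -1                 # oracle: flip the target's sign
    state = 2 * state.mean() - state    # diffusion: reflect about the mean

print(np.argmax(state ** 2))                # 11 -- found in ~sqrt(N) steps
print(round(float(state[target] ** 2), 3))  # ~0.961 success probability
```

That's a quadratic speedup rather than an exponential one, but on a big enough haystack it still matters.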

But you won't want to be using a quantum chip to do your 1+1=2 simple stuff.