The transistor was invented in December 1947. By early 1953, just five years later, it was already being used in commercially sold hearing aids. Granted, there were continual advances after that, especially the switch from germanium to silicon.
The thing is, the transistor actually functioned in 1947 -- terahertz processors do not function now. All we have is a material that is semi-stable at -81°F (about -63°C). That material isn't even in a functional transistor yet. We are a long, long way from getting this concept into a single working chip, let alone mass production of chips, which now involves a multi-year validation and production ramp-up.
My biggest thoughts with this are:
1) With charge filling and draining the material 1000 times faster, where does that power come from, and how do we dissipate the resulting heat? Especially since the material has to stay exposed to laser light, so a conventional heat sink won't work.
2) Are we really going to be using computers colder than -81°F for any purpose outside of specialized labs?
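To put a rough number on point 1: the standard CMOS dynamic-power relation P ≈ C·V²·f says switching power scales linearly with clock frequency. This is only a back-of-envelope model, and the capacitance and voltage figures below are made up purely for illustration, but it shows why a 1000x clock jump at fixed voltage is a 1000x power problem:

```python
def dynamic_power(c_farads, v_volts, f_hertz):
    """Classic CMOS dynamic (switching) power estimate: P = C * V^2 * f."""
    return c_farads * v_volts**2 * f_hertz

# Hypothetical figures: 1 nF effective switched capacitance, 1 V supply.
C, V = 1e-9, 1.0
p_modern = dynamic_power(C, V, 5e9)    # a ~5 GHz chip today
p_thz    = dynamic_power(C, V, 5e12)   # the same chip at 5 THz

print(f"{p_modern:.1f} W at 5 GHz vs {p_thz:.0f} W at 5 THz "
      f"({p_thz / p_modern:.0f}x)")
# prints: 5.0 W at 5 GHz vs 5000 W at 5 THz (1000x)
```

Real designs would lower V and C to compensate, but nothing lowers them by three orders of magnitude, which is why the heat-removal question matters so much.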