Originally posted by: BFG10K
AFAIK I don't think it should matter in the grand scheme of things, because all we need to know is whether there's a signal or not. Shine two torches across each other and both will still hit their respective opposing walls. Just pick wavelengths to ensure the signal isn't cancelled and it should be OK.

Light is both a wave and a particle, and waves interfere with each other.
Now I understand the context of your diamond comment, but I'd like to add that reduced leakage doesn't help the heat/power issue.

If the leakage current at small transistor feature sizes could be significantly reduced, then silicon-based devices could continue to scale down. It would be interesting to use atomic-scale CVD or something similar to build semiconductors on a crystalline-carbon (aka diamond) substrate instead of crystalline silicon.
Umm, how so?
Say the chip needs 100W of power (random numbers), so you have 100W that will all end up as heat.
Say you have 30W of leakage: you actually need to supply 130W to give the chip the 100W it needs, as 30W is wasted.
You're producing 130W of heat and using 130W of power.
Now say you reduce leakage so it only leaks 10W.
You now need 110W and produce 110W of heat.
How would reducing leakage NOT reduce power consumption and heat production? Have I oversimplified it? (I know the difference may not be huge, but it would still help, unless you mean that reducing leakage by itself wouldn't be enough to solve the issue because the gain would only be small.)
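Just to spell out the arithmetic in one place, here's a minimal sketch of the model I'm assuming (total draw = useful power + leakage, and everything drawn eventually becomes heat). The figures are the same made-up 100W/30W/10W numbers from above, not measurements from any real chip:

```python
# Sketch of the leakage arithmetic: the chip dissipates everything it draws,
# which is the power the logic actually needs plus whatever leaks away.

def total_power(useful_w: float, leakage_w: float) -> float:
    """Power drawn (and dissipated as heat): useful power plus leakage."""
    return useful_w + leakage_w

USEFUL_W = 100.0  # illustrative: what the logic needs to do its job

before = total_power(USEFUL_W, leakage_w=30.0)  # 130W drawn, 130W of heat
after = total_power(USEFUL_W, leakage_w=10.0)   # 110W drawn, 110W of heat

print(f"Before: {before:.0f}W drawn and dissipated")
print(f"After:  {after:.0f}W drawn and dissipated")
print(f"Saved:  {before - after:.0f}W")  # less leakage -> less power and less heat
```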