Obviously cost is the main concern... not just the cost of the materials directly, but also the costs associated with mass manufacturing. And it's pretty much impossible to predict how they'll be able to cope with native defects (inherent in the material).

Originally posted by: Jeff7181
Wingznut... as someone who actually has some sort of qualification to provide insight into this question... what do you think of synthetic diamond being used as a semiconductor in the future?
I think it's more likely that diamond will be used within the silicon somehow. Either as a substrate, or maybe even as an insulator in the substrate, like SOI (silicon-on-insulator).
Although we always hear cries that Si is on its last legs, I don't think that's true for at least another 15-20 years. Maybe longer.
That's a tough one, and your guess is probably as good as mine. I will say that I see some people on these forums freaking out over the rumored 100 W+ of Prescott. Well, you probably ought to get used to it. CPUs are going to get hotter and hotter. It wouldn't surprise me if someday they were regulated by the heat they produce rather than by clock speed.

I know you're not an expert, but from an engineering standpoint, what kind of temperatures would just be too hot to use in a home computer? I mean, obviously, even if the CPU could handle it, you can't really have a processor that runs 600 degrees C sitting on your desktop.
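For a rough sense of when a chip gets "too hot": die temperature scales with power through the cooler's thermal resistance, roughly T_junction = T_ambient + P * theta_JA. Here's a minimal back-of-envelope sketch in Python; the ~0.4 C/W cooler figure and the ~100 C silicon junction limit are ballpark assumptions typical of the era, not numbers from this thread.

```python
# Back-of-envelope die-temperature estimate: T_junction = T_ambient + P * theta_JA.
# theta_JA (junction-to-ambient thermal resistance, in C per watt) and the junction
# limit below are assumed ballpark figures for early-2000s air cooling.

def junction_temp(power_w: float, theta_ja: float, ambient_c: float = 35.0) -> float:
    """Estimated die temperature for a given power draw and cooler."""
    return ambient_c + power_w * theta_ja

THETA_JA_GOOD_AIR = 0.4   # assumed: a decent heatsink/fan combo, in C/W
TJ_MAX_SILICON = 100.0    # assumed: typical max junction temp for silicon CPUs

for watts in (60, 100, 150, 200):
    tj = junction_temp(watts, THETA_JA_GOOD_AIR)
    status = "OK" if tj <= TJ_MAX_SILICON else "too hot for this cooler"
    print(f"{watts:>3} W -> ~{tj:.0f} C ({status})")
```

The takeaway: the practical ceiling isn't hundreds of degrees. Silicon dies are already limited to roughly 100 C, so the cooler's thermal resistance determines how many watts fit under that ceiling, which is why parts could end up regulated by heat output rather than clock speed.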
