This is just a curiosity question on my part; I make no claim to being a materials expert.
Years ago, I helped a girlfriend with a college paper about GaAs (gallium arsenide) IC technology, and it's bugged me ever since. There seemed to be numerous reasons why using GaAs instead of silicon to make a CPU made perfect sense: lower power consumption, faster switching speeds, etc.
So my question to those of you who are WAY smarter than me about this sort of thing is: why was GaAs passed over? With all the talk about power consumption, the resulting heat dissipation issues, clock speed "ceilings", etc. with today's silicon CPUs, why isn't GaAs a good way to extend the existing paradigm that much further, rather than going with multi-core and the like?
Not that I'm against multi-core chips, mind you; I'm just curious and looking for discussion.
R