- Apr 7, 2003
- 2,021
- 0
- 0
Edit:
MODS: ***I also posted this question in the General Hardware section, here, along with an explanation for the cross-posting. Thanks!***
---------------------------------------
I know this is a very broad question, but I'm only looking for broad answers.
From my layman's understanding of operating systems, their basic structure has remained relatively unchanged since Unix appeared in 1969. That is, you've got hardware -> kernel -> software/OS ("the current system"). Is this the ideal (or at least the most efficient) way to use the hardware? Do you think this long-standing structure has negatively impacted innovation? That is, have other, more creative or promising methods of interacting with hardware been passed over because of the sheer momentum of the current system?
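To make concrete what I mean by that layering, here's a tiny C sketch (just my own illustration of the idea, assuming a Unix-like system): a user program never drives the disk or display hardware itself; it asks the kernel to do so through a system call like write(2).

```
/* Minimal sketch of the hardware -> kernel -> software layering:
 * user code stays on its side of the system-call boundary and
 * lets the kernel talk to the actual device. */
#include <unistd.h>
#include <string.h>

int main(void)
{
    const char *msg = "hello via the kernel\n";

    /* write() traps into the kernel; the kernel owns the hardware,
     * performs the I/O, and hands the result back to user space. */
    write(STDOUT_FILENO, msg, strlen(msg));
    return 0;
}
```

As I understand it, that narrow system-call boundary is the part of the structure that hasn't changed much since 1969, which is what prompts my question.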
In what ways does our current system fail? Or does it? Are there any alternatives? Not just other computer architectures or systems, but entirely new ways of processing data? If CPUs and transistors suddenly ceased to exist, what other methods might we use to achieve the same results? Could we?
Where do you see "tech" in 5, 10, 25, even 50 years from now? Based on the history of computing thus far, can you draw any conclusions about the future?
Apologies if this is the wrong section, but I figured the folks who hang out in "Highly Technical" would be the best equipped to abstract out the nitty-gritty of computer engineering into broader, more qualitative answers I'd be able to understand.