The Future of Kernels, Operating Systems, and "Tech"

PaperclipGod

Banned
Apr 7, 2003
2,021
0
0
MODS: **I posted this same question in the Highly Technical forum, here. However, if moderators are allowed some discretion, please consider not deleting this thread as a duplicate -- I'm specifically interested in the opinions of people who would NOT normally browse the "highly technical" forum. i.e., regular computer users. ;) Thanks!**

------------

I know this is a very broad question, but I'm only looking for broad answers.

From my layman's understanding of OS's, their basic structure has remained relatively unchanged since Unix appeared in 1969. That is, you've got the hardware -> kernel -> software/OS ("current system"). Is this the ideal (or at least the most efficient) way to use the hardware? Do you think that this long-standing structure has negatively impacted innovation? i.e. have other more creative or promising methods of interacting with hardware been passed by because of the sheer momentum that our current system has?
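
Just to make the layering I'm describing concrete (this is a bare-bones sketch I put together from my layman's reading, so take it loosely): even something as simple as reading a file never touches the disk itself. The program hands the request to the kernel through a system call, and only the kernel talks to the hardware. The file path below is just an arbitrary example.

```c
/* Rough illustration of the hardware -> kernel -> software layering:
 * this program never drives the disk controller itself; it asks the
 * kernel via the open()/read() system calls and gets bytes back.
 * Should build on any Unix-like system. */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>

int main(void)
{
    char buf[64];

    /* open() traps into the kernel, which locates the file for us */
    int fd = open("/etc/hostname", O_RDONLY);   /* arbitrary example file */
    if (fd < 0) {
        perror("open");
        return 1;
    }

    /* read() is another trip through the kernel; the kernel's driver
     * touches the actual hardware (or its cache) and copies data out */
    ssize_t n = read(fd, buf, sizeof(buf) - 1);
    if (n > 0) {
        buf[n] = '\0';
        printf("the kernel handed back: %s", buf);
    }

    close(fd);
    return 0;
}
```

The point being: the application never decides how the disk gets talked to. The kernel sits in the middle for everything.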

In what ways does our current system fail? Or does it? Are there any alternatives? Not just other computer architectures or systems, but other entirely new ways of processing data? If CPU's and transistors suddenly ceased to exist, what other methods might we use to achieve the same results?

Where do you see "tech" in 5, 10, 25, even 50 years from now? Based on the history of computing thus far, can you draw any conclusions about the future?
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
HUH?

:confused:

Us "regular computer users" cannot comprehend issues that complex.

;)

Honestly, it may have stifled us somewhat - otherwise we might be like the Borg by now. Neural jacks & all that. But seriously, my computer goes faster now than it did two years ago and much, much faster than it did ten years ago. As long as this trend continues, most "regular computer users" aren't going to even worry about the interface between their hardware & software (or even realize there is one, for that matter).

:beer:
 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
I see more applications making use of efficient instruction sets.

I see a complete reinvention of nonvolatile memory. Faster, cheaper, stable for thousands of years.

I see further increases in energy efficiency so that server farms can be more powerful and widespread yet still manageable and affordable. Eventually I see self-powered server farms running on fuel cells that only need to be refueled once every 10 years or so.

I see better and more widespread implementation of multiplexing over optic fibers using different frequencies of light.

10 years from now I predict we will be much better connected, and cloud computing will reach a peak. No more boundaries between devices... work computer/home computer/HTPC/media center/mobile devices will all be synced at all times, ultra-secure, with near-infinite storage.

20 years from now I see computers ditching the internal wires and sending purely optical signals instead of voltage signals. Thermal efficiency will go up by an order of magnitude.

I predict enhanced networking of ultra-cheap embedded sensors, such as weather/traffic monitoring systems.

I don't know... I could go on for days. Research is currently being done in a few of the areas I mentioned... one of the benefits of being a grad student is that you get exposure to cool ideas like this.
 

PaperclipGod

Banned
Apr 7, 2003
2,021
0
0
Cheers, Denithor. ;)

Did that first post come off as rude? I wasn't really sure how to word it. I mean, in the auto industry the opinions of the mechanics (the folks in Highly Technical) are just as valued as those of the consumers (the folks in General Hardware). One isn't "better" than the other, especially with questions as abstract as I tried to make these. Besides, everyone in General Hardware is probably more computer literate than I am anyway... I majored in liberal arts. lol

But as for your comments about the hardware/software interface: True, most people are quite happy with the current trend of increasing computer speed and capability. But that doesn't mean a better approach isn't possible, right? For example, a "fast" trip from New York to San Francisco in the early 19th century took 6 months by ship. By 1860, clippers were doing it in 3 months. To people living at the time, that's an incredible improvement -- how could you ask for more? But an entirely new method of transportation - rail - obliterated that record just a few years later. In 1869, the transcontinental railroad was completed, allowing coast-to-coast travel in just one week. Even after cutting the Americas in half with a canal and replacing sails with engines, ships today still can't come close to that record. Of course, the incredible speed of rail travel was eclipsed as well... not by faster locomotives, but again by a whole new method of transportation - air travel. What once took a week by rail can now be done in 8 hours by air.

The point of that digression was to illustrate that it's usually an entirely new approach to a problem that makes the greatest stride forward, not just a better version of the same old technology. So, if the past 40 years of computer design is our "sailing ship," what will play the role of the railroad? The airplane? I know it's not a perfect analogy, but I'm just hoping to spark some creative juices in you guys. ;)
 
Nov 26, 2005
15,194
403
126
Originally posted by: PaperclipGod
I know this is a very broad question, but I'm only looking for broad answers.

From my layman's understanding of OS's, their basic structure has remained relatively unchanged since Unix appeared in 1969. That is, you've got the hardware -> kernel -> software/OS ("current system"). Is this the ideal (or at least the most efficient) way to use the hardware? Do you think that this long-standing structure has negatively impacted innovation? i.e. have other more creative or promising methods of interacting with hardware been passed by because of the sheer momentum that our current system has?

In what ways does our current system fail? Or does it? Are there any alternatives? Not just other computer architectures or systems, but other entirely new ways of processing data? If CPU's and transistors suddenly ceased to exist, what other methods might we use to achieve the same results?

Where do you see "tech" in 5, 10, 25, even 50 years from now? Based on the history of computing thus far, can you draw any conclusions about the future?

Very good question.
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
Originally posted by: PaperclipGod
But that doesn't mean a better approach isn't possible, right?

I didn't say it wasn't possible, just that most people would never even consider that there *is* an interface between the hardware and their desktop UI. Making a change at that level, no matter how innovative, just isn't going to register in your average user's consciousness whatsoever.

Where do I see computing going? Kinda like wired said, in a few years I think cloud computing will seriously lift off. In terms of consumer-level systems, I don't even know if people will have discrete computers after a point; there will be massive computational power available "offsite," so we will probably be able to use simple terminals/netbook-esque devices to access heavy-duty systems for any crunching we need. Can you say gaming piped directly into your television set, with no need for anything but a screen & controller in your home? And maybe not even the screen...
 

PaperclipGod

Banned
Apr 7, 2003
2,021
0
0
Do you think privacy concerns will hamper the move to mainframe/terminal systems for the home user?

Do you, personally, think the current interface between hardware and software is optimal? Is the hardware itself still optimal for today's needs? In what ways are computers incapable of working on data? i.e., what tasks do we still need a human mind for?

I'm not trying to drag this into a discussion on "AI", but just a more general discussion of our entire approach to data manipulation. Are computers, as we define them today, the best tool for the job?
 

Denithor

Diamond Member
Apr 11, 2004
6,298
23
81
Quantum keys for security?

Honestly, I don't know enough about the interface between hardware/kernel/OS/software to say if it's optimal. I can say it's at least adequate for the moment. My computer does lots of stuff faster than I can keep up with, so no worries just yet on that front.

But what you're getting at is exactly AI. Today's computers can crunch numbers, run simulations, potentially even cure cancer (F@H) faster & more efficiently than the best human minds out there (making them "the best tools for the job" in those cases). However, that's all they do - no innovation, no "thinking" or "consciousness" present - which is viewed by many as a good thing. Borg & Terminator, etc.
 

Nothinman

Elite Member
Sep 14, 2001
30,672
0
0
Originally posted by: PaperclipGod
That is, you've got the hardware -> kernel -> software/OS ("current system"). Is this the ideal (or at least the most efficient) way to use the hardware?

It's not the most efficient, but it's probably the ideal. You need those layers of separation to keep things sane. DOS/Win9x are prime examples of how bad things can get when software is allowed to get around the OS and touch the hardware directly.
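
If you want a concrete taste of what that separation buys you, here's a rough sketch (an illustration only; it assumes x86 Linux, since ioperm()/inb() come from <sys/io.h>). Under DOS any program could poke an I/O port directly; on a modern OS the kernel refuses unless you explicitly ask for port access and have the privileges to get it.

```c
/* Illustration only: try to read a legacy I/O port from user space.
 * Assumes x86 Linux (<sys/io.h> provides ioperm()/inb()).
 * As a normal user the kernel denies the request; under DOS any
 * program could read or write this port with no questions asked. */
#include <stdio.h>
#include <sys/io.h>

#define KBD_STATUS_PORT 0x64   /* legacy keyboard controller status port */

int main(void)
{
    /* Ask the kernel for permission to touch a single I/O port.
     * Without root/CAP_SYS_RAWIO this fails with EPERM. */
    if (ioperm(KBD_STATUS_PORT, 1, 1) != 0) {
        perror("ioperm");
        printf("kernel said no -- no direct hardware access for you\n");
        return 1;
    }

    /* Only reachable once the kernel has granted access: now we really
     * are talking to the chip, DOS-style. */
    unsigned char status = inb(KBD_STATUS_PORT);
    printf("keyboard controller status byte: 0x%02x\n", status);
    return 0;
}
```

Run it as a regular user and the kernel turns you away at the door; run it as root and you're reading the keyboard controller directly, which is exactly the free-for-all every DOS program lived in.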