The Future of Kernels, Operating Systems, and "Tech"

PaperclipGod

Banned
Apr 7, 2003
2,021
0
0
Edit:
MODS: ***I also posted this question in the General Hardware section, here, along with an explanation for the cross-posting. Thanks! :)***

---------------------------------------


I know this is a very broad question, but I'm only looking for broad answers.

From my layman's understanding of OS's, their basic structure has remained relatively unchanged since Unix appeared in 1969. That is, you've got the hardware -> kernel -> software/OS ("current system"). Is this the ideal (or at least the most efficient) way to use the hardware? Do you think that this long-standing structure has negatively impacted innovation? i.e. have other more creative or promising methods of interacting with hardware been passed by because of the sheer momentum that our current system has?

In what ways does our current system fail? Or does it? Are there any alternatives? Not just other computer architectures or systems, but other entirely new ways of processing data? If CPU's and transistors suddenly ceased to exist, what other methods might we use to achieve the same results? Could we?

Where do you see "tech" in 5, 10, 25, even 50 years from now? Based on the history of computing thus far, can you draw any conclusions about the future?

Apologies if this is the wrong section, but I figured the folks who hang out in "Highly Technical" would be the best equipped to abstract out the nitty-gritty of computer engineering into broader, more qualitative answers I'd be able to understand. ;)
 

yhelothar

Lifer
Dec 11, 2002
18,409
39
91
New interfaces...
I think we'll move to touch screens and BCIs (brain-computer interfaces).

 

KIAman

Diamond Member
Mar 7, 2001
3,342
23
81
The following is all in my humble opinion.

It is all about priority.

Back in the day, the priority of the OS was to proliferate usage. This led to inefficient bloating of the OS in hopes of increasing user friendliness.

Then hardware capabilities increased much faster than software cycles, and the inefficient, bloated OS didn't seem so bad, but it had already achieved its initial goal of proliferation. We got to the point where the hardware development lifecycle was shorter than the software development lifecycle. The cost of producing hardware also dropped exponentially, while the cost of producing software increased.

Now hardware has reached its speed limit, and instead of gunning for the next speed hill, it has broadened itself. Instead of being a 200mph speedbike, hardware is now a 200mph Greyhound bus. Software has yet to catch up to these capabilities.

In the future, hardware will continue to outpace software and the OS will eventually become an integrated, programmable hardware item. Instead of general-purpose computing, we will go back to specialized, individual appliances because hardware is so cheap.

The priority of software will shift from usability to accessibility. Software will find its strength in cloud computing and revolutionize software as we know it. Software will be subscription- and support-based for revenue, while hardware will eventually converge with the OS.
 

PaperclipGod

Banned
Apr 7, 2003
2,021
0
0
What makes you so sure that hardware development will continue to outpace software development? Is it conceivable that hardware will eventually become fast enough for programming languages to reach new heights of abstraction, to the point where a dictionary is the only list of "commands" a programmer will ever need to know? I.e., checking your code for bugs might be as simple as running a spell checker, and even your mom is writing "code"?

As for BCIs, what's the fastest "human bus"? Nerves? And once you get to that point, would it be the brain/nerves that become the "bottleneck"? I mean, neurons still rely on chemical reactions to transmit their signals... if you had a computer directly interfaced with your brain, couldn't you think about moving your hand and, before the signal even reaches your muscles, the computer screen is already telling you which hand you're going to move? I don't really have a point with all of that, just thinking out loud. :p
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
The more immediate future will see a push to shorten power-on-to-usable times. They've made fair strides in this area; however, I expect some bigger movement towards faster OS loading in the future.

After that, the GUI will be the main focus. Ease of use is a big deal for most people, so being more intuitive will be the order of the day. At one time I thought a 3D desktop might make it; however, now I'm convinced it never will. 3D desktops are just too much info for most people.

We might see brain interface devices become a mass consumer market; however, I somewhat doubt it for a couple of reasons. First, the initial models are going to be harder to use than current input devices. This will really hurt them. Who wants to dump a ton of money into research that is almost guaranteed to result in a product most people don't want to use? There will be a HUGE jump in prosthetics before a viable brain interface becomes available. That might take a while. I might even go so far as to say that we will probably have cancer cured before we get a workable brain input device. (And by that, I mean a device I could type this post with, not OCZ's device that can move the mouse with a little brain power while constantly monitoring facial muscles.) It's a complex problem because every individual's mind functions a little differently from everyone else's.

On the hardware side of things, it wouldn't surprise me to see even more dedicated hardware devices. I could see a dedicated video decoder going onto the northbridge, for example.

Memory is cheap; I expect the OS to find more and more ways to consume it for speed. Vista's SuperFetch idea isn't a bad one, regardless of what the haters say.
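
To illustrate the "trade memory for speed" idea with a toy sketch of my own (this is just the general principle, not how SuperFetch actually works): pay the cost of an expensive operation once, keep the result in memory, and serve every repeat request for free.

/* Toy sketch: trade memory for speed by remembering expensive results.
 * (Purely illustrative; has nothing to do with SuperFetch internals.) */
#include <stdio.h>

#define MAX_N 64

static double cache[MAX_N];
static int cached[MAX_N];

/* stand-in for something slow, e.g. reading and parsing a big file */
static double expensive(int n) {
    double x = 0.0;
    for (long i = 0; i < 100000000L; i++)
        x += (double)(n % 7) * 1e-9;
    return x;
}

static double cached_expensive(int n) {
    if (n < 0 || n >= MAX_N)
        return expensive(n);        /* outside the cache: just compute */
    if (!cached[n]) {
        cache[n] = expensive(n);    /* first request: pay the full cost */
        cached[n] = 1;
    }
    return cache[n];                /* repeat requests: served from memory */
}

int main(void) {
    printf("%f\n", cached_expensive(3));  /* slow */
    printf("%f\n", cached_expensive(3));  /* effectively instant */
    return 0;
}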
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
I think the future of operating systems lies in the cloud. Time and again we see the features of new OSes being restricted by government antitrust policies, so services and additional features are being moved to the cloud. I think there will come a time when the entire OS is hosted in the cloud as a software and online service delivery system, which has more room to expand than a static offline OS.

From an end-user perspective, things will be much easier. The future of storage doesn't matter, as the cloud will take care of everything: the movies and music you buy will be streamed rather than stored locally, applications will run online, gaming becomes hardware-independent, system requirements won't matter, and web browser usage will go down as each web service runs as an online application. The operating system stored on the computer will become more like system software embedded in the hardware that runs in the background. Computers will become passive machines feeding on big computers sitting in the cloud somewhere, just as human beings become more dependent on machines to do their dirty work.

But what does all this tell you? It tells you that the more advanced man becomes, the more his mind degenerates. That's my theory of devolution.
 

Skyclad1uhm1

Lifer
Aug 10, 2001
11,383
87
91
10 years ago people thought we'd have done away with the keyboard and mouse by now. Yet here they still are. Touchscreens are nice for some applications, but if you have a 30" screen and have to sort stuff and click a lot, you'd have RSI within half an hour if you had to use a touchscreen interface. So I see us with faster PCs, with touchscreen and voice-operated systems for specific purposes, but still largely mouse and keyboard for most work and leisure.

For gaming there will probably be an increase in Wii-style controllers in the near future, but this will only be an additional feature for most systems rather than a replacement.

Regarding the terminal/server solution that comes up every so often: if you can choose between servers powerful enough to run all the heavy programs everyone in the company needs from their $600 terminals, or servers at a tenth of the cost paired with $700 terminals that can run the same programs but don't cause downtime if the server or the network fails, guess which most companies choose.
When a terminal that could run everything cost $30,000 and a bare terminal was only $10,000, it was worth it, but for most companies it no longer is.

More companies will distribute their software over the internet, yes, but not everyone will run it over the network. At least not in the next 5 years.
 

PaperclipGod

Banned
Apr 7, 2003
2,021
0
0
It seems a bit odd that we've gone from mainframes/terminals, to PC's, and now back to mainframes/terminals. If you look at it on a large scale, it seems to mimic the yearly changes in consumer PC hardware -- is the bottleneck this year the CPU, or the GPU? HDD? Memory? On the larger scale, what's the bottleneck that initially made mainframes/terminals the best solution? What's the bottleneck that made PC's the next solution? And today, what's the bottleneck causing us to move back to the mainframe/terminal system?

Can you envision a reason why we might, one day, see the momentum shift once again from the "cloud" back to PC's?

And to take a step back even further, is the entire methodology currently used to process data - a logical circuit - the best tool for the job? As I asked in the thread in GH, are there other ways to organize and process data that we might be overlooking simply because of the sheer momentum that our current solution has? Like moving from a ship to a plane, or a train to a rocket, how else might the problem of "data" be solved? The human brain is not a purely logical device, so why is our data treated as purely logical? Without turning this into a discussion of "AI", can you conceive a system whereby data is organized using anything less than 100% logic?

Then again, is pure logic the best method for data manipulation, and it's the human mind which is at fault for not always acting logically? Does a logic-bound computer seem faulty to us only because our own minds cannot process data so strictly?
 

Aberforth

Golden Member
Oct 12, 2006
1,707
1
0
Originally posted by: PaperclipGod
It seems a bit odd that we've gone from mainframes/terminals, to PC's, and now back to mainframes/terminals. If you look at it on a large scale, it seems to mimic the yearly changes in consumer PC hardware -- is the bottleneck this year the CPU, or the GPU? HDD? Memory? On the larger scale, what's the bottleneck that initially made mainframes/terminals the best solution? What's the bottleneck that made PC's the next solution? And today, what's the bottleneck causing us to move back to the mainframe/terminal system?

Can you envision a reason why we might, one day, see the momentum shift once again from the "cloud" back to PC's?

And to take a step back even further, is the entire methodology currently used to process data - a logical circuit - the best tool for the job? As I asked in the thread in GH, are there other ways to organize and process data that we might be overlooking simply because of the sheer momentum that our current solution has? Like moving from a ship to a plane, or a train to a rocket, how else might the problem of "data" be solved? The human brain is not a purely logical device, so why is our data treated as purely logical? Without turning this into a discussion of "AI", can you conceive a system whereby data is organized using anything less than 100% logic?

Then again, is pure logic the best method for data manipulation, and it's the human mind which is at fault for not always acting logically? Does a logic-bound computer seem faulty to us only because our own minds cannot process data so strictly?

Bottlenecks don't matter when you find a better way of accomplishing tasks, and grid computing is one such thing. If you look closely at the basic function of devices such as the CPU and GPU, they are each trying to accomplish specific tasks, and their speed is limited. But add a thousand such devices running in parallel, serving multiple users and running multiple tasks, and (taking cloud computing as an example) the network bandwidth becomes the biggest limitation.
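
A toy sketch of my own to make the parallel point concrete (illustrative only; a real grid spreads the workers across machines, which is exactly where the network becomes the limit): one job split across several worker threads instead of one faster unit.

/* Toy sketch: split one job across N workers instead of making one unit faster.
 * In a real grid the workers sit on different machines, so the network,
 * not the CPU, becomes the bottleneck. Build with: cc -pthread parallel.c */
#include <pthread.h>
#include <stdio.h>

#define N_WORKERS 4
#define N_ITEMS   1000000L

static long long partial[N_WORKERS];

static void *worker(void *arg) {
    long id = (long)arg;
    long long sum = 0;
    for (long i = id; i < N_ITEMS; i += N_WORKERS)  /* each worker takes its own slice */
        sum += i;
    partial[id] = sum;
    return NULL;
}

int main(void) {
    pthread_t t[N_WORKERS];
    for (long i = 0; i < N_WORKERS; i++)
        pthread_create(&t[i], NULL, worker, (void *)i);

    long long total = 0;
    for (long i = 0; i < N_WORKERS; i++) {
        pthread_join(t[i], NULL);
        total += partial[i];
    }
    printf("total = %lld\n", total);
    return 0;
}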

About human logic: the human mind relies on previous experience to build logic and to do things in a better way in order to gain more free time. That means its computational capacity is limited to what it experiences; if there isn't anything new, or nothing else to compute, it goes into standby mode. For example: you already know that if you touch a sharp knife it's going to hurt, but how do you know that? The mind recalls previous experience on the subject, computes that data against other similar experiences, and gives back the result. So we can be quite sure that the brain and other natural interfaces won't be the primary and most preferred way of computing, because it's such a mind job and contradicts the purpose of the mind. I think if the speed of the CPU stops doubling, we might make the switch to quantum computing. We might also see advancements in better algorithms, because there is always a better way of doing things. Take today's multitasking, for example: it's highly inefficient and limited by software; when you come up with better code, the processor will schedule threads in a better way. Computing is nothing but little macros of the human mind.
 

videogames101

Diamond Member
Aug 24, 2005
6,783
27
91
Well, an easy one is moving away from mechanical drives and going all solid state. But I have a feeling that software is going to be playing catch-up to parallel CPUs for a long time to come.
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
Originally posted by: videogames101
Well, an easy one is moving away from mechanical drives and going all solid state. But I have a feeling that software is going to be playing catch-up to parallel CPUs for a long time to come.

I hope that one day we will see MRAM take over some market. At the very least it would be nice to see an MRAM L3 cache (imagine having a 256 MB cache on your CPU; in older applications you could almost eliminate the need for RAM. :))

It's just that the DRAM, SRAM, and NAND markets are so entrenched in our current consumer markets that we may never see the day when a fast alternative memory type is available. (Imagine your flash drive being faster than your RAM; then ReadyBoost would offer some pretty big speed increases.)
 

pcy

Senior member
Nov 20, 2005
260
0
0
Hi,

To answer the original question:


The basic structure will never change.


The reason for this is that the basic structure is not driven by technology, but by the fundamental nature of a computer as a tool used by humans.

What we have is:

Hardware

OS: Device Drivers, Event handling, Scheduling, Task Management, API.

Applications and UI



This description is utterly abstract, with no dependence on any particular technology. It will not change, unless we hypothesize developments even broader than your question, for example:
1. The evolution of computers into genuinely intelligent beings with whom we would interact in the same way that we interact with other (intelligent) humans
2. The evolution of the human race to incorporate device drivers for computer hardware and direct connections to such hardware via the sub-ether


The User interfaces with Applications (i.e., the tools he chooses to use) via the User Interface. Note that this is the definition of "User Interface": however much the User Interface changes, the means by which the User interfaces with the Application will remain the User Interface.


The Applications need to exist in order to allow each User a choice of tools. That's a reflection of the fact that people are not all identical and have multiple and diverse interests.

The OS exists to manage the physical machine and make its resources available to Applications, i.e. to make the system more reliable and more flexible, to make it easier to write the Applications, and also to make it possible to run multiple Applications on a single machine concurrently. Again, the need to do this is a direct consequence of the diversity of humanity and human interests.
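
To make the layering concrete with a trivial sketch of my own (nothing more than an illustration): the Application never touches the disk itself; it asks the OS through the API, and the kernel's driver does the physical work.

/* Trivial sketch of the layering: Application -> OS API (system call) ->
 * driver -> hardware. This program never drives the disk itself; it only
 * asks the kernel, via open()/write(), to do so on its behalf. */
#include <fcntl.h>
#include <unistd.h>

int main(void) {
    int fd = open("example.txt", O_WRONLY | O_CREAT | O_TRUNC, 0644); /* system call into the kernel */
    if (fd < 0)
        return 1;
    write(fd, "hello\n", 6);  /* the kernel schedules the real disk I/O through its driver */
    close(fd);
    return 0;
}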



So... the fundamental structure will not change.




Peter
 

Gibsons

Lifer
Aug 14, 2001
12,530
35
91
One thing that interests me isn't so much what I'll have on my desktop, but where processors will go. Basically, what's the 10-year roadmap for Atoms and the like? 10 years from now, you might have a Core 2 Quad equivalent for a few bucks with a very small footprint; if the power needs go down as well, you could do... well, a lot of, uh... stuff... with it. Maybe it fits into a wristwatch or sunglasses, or micro bots... Maybe someone else can figure out the killer app for these.
 

PaperclipGod

Banned
Apr 7, 2003
2,021
0
0
Great points, pcy!

Gibsons - how would you make use of all that the chip in your watch has to offer? All that power isn't much good if you can't efficiently use it, right?
 

gsellis

Diamond Member
Dec 4, 2003
6,061
0
0
With regard to the OS, we are in a comfortable spot. The kernel and quasi-kernel architectures are fairly efficient at supporting diverse platforms. There seems to be no big bottleneck that needs to be overcome.

What we may see is a new networking model come along. We now have people with families of devices that they will want to all communicate with one another. The line is blurring a bit between mobile phones, GPS units, and other personal devices. Look for the ability to have a couple of storage devices (memory on a phone, for example) carry music, places, and locations. Something will need to act as the central store and give other devices access to that information. A new key-sharing scheme will be needed to secure the devices amongst themselves.

So something Blackberry-ish will become a communications hub between personal computers, cars, and home electronics. Or a mini-netbook (OLED rolling screens?). The concept of an AID seems to be just down the road.
 

Gibsons

Lifer
Aug 14, 2001
12,530
35
91
Originally posted by: PaperclipGod
Great points, pcy!

Gibsons - how would you make use of all that the chip in your watch has to offer? All that power isn't much good if you can't efficiently use it, right?

I really don't know. Put it into sunglasses with earpieces, with displays on the lenses, you can have a MP3 player/smartphone/movie player/nav system all in one? All voice or even thought actuated? I guess you'd need a HD of some sort though... hm.
 

KIAman

Diamond Member
Mar 7, 2001
3,342
23
81
Originally posted by: pcy
So... the fundamental structure will not change.

Peter

In my opinion (I know people hate that leading statement, but I have to write this to make sure people understand that I am not making a statement of fact), the reason our current fundamental structure has remained the same is that our priority for computing has remained the same.

Accessibility and Usability leading to proliferation by shortening development time for "software."

Let me riddle you this. What would be the consequence of having hardware with no operating system? Software that incorporates all the tools to utilize the hardware AS WELL AS the software function itself. This leads to severely increased software development time and cost.

But what if our priority shifted to efficiency and speed? Assuming the cost of hardware continues to be driven down, it wouldn't be far-fetched to imagine an "internet" machine with just a single chip integrating the CPU, GPU, IO, memory, etc., and with the browsing "software" integrating its own tools to utilize that hardware, reducing hardware cost, power cost, and storage cost, and increasing speed, with no bloated OS offering a million different functions and APIs (and their cost, storage, and overhead) for such a specialized machine.

Anybody remember the days of OpenGL vs Direct3D in 1996?
 

pcy

Senior member
Nov 20, 2005
260
0
0
Hi,

Originally posted by: KIAman
Let me riddle you this. What would be the consequence of having hardware with no operating system? Software that incorporates all the tools to utilize the hardware AS WELL AS the software function itself. This leads to severely increased software development time and cost.


The answer to that is simple. As well as the increased development costs, the result would be unstable and slow.

If Word and Photoshop both had their own HD driver (for example), there is no way they would co-operate properly. They would lock each other out, steal each other's IRQs, and crash. The OS exists to provide - indeed, ultimately it is - the collection of software needed to allow multiple applications to share the resources of the physical machine.
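
A toy illustration of that arbitration (my own sketch; the mutex simply stands in for the role the kernel plays): two "applications" sharing one device, serialized by a single arbiter instead of fighting over it.

/* Hypothetical sketch: two "applications" sharing one device. Without a
 * single arbiter their accesses would interleave and corrupt each other;
 * here a mutex plays the kernel's role and serializes them.
 * Build with: cc -pthread share.c */
#include <pthread.h>
#include <stdio.h>

static pthread_mutex_t device_lock = PTHREAD_MUTEX_INITIALIZER;

static void *app(void *name) {
    pthread_mutex_lock(&device_lock);    /* the "OS" grants exclusive access */
    printf("%s: writing to the shared device\n", (const char *)name);
    pthread_mutex_unlock(&device_lock);  /* released for the next application */
    return NULL;
}

int main(void) {
    pthread_t a, b;
    pthread_create(&a, NULL, app, "Word");
    pthread_create(&b, NULL, app, "Photoshop");
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    return 0;
}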

Moreover, because of the high leverage (just one driver per physical device, but used in millions of machines and by millions of application instances), it is cost-effective to write stable, efficient drivers. Load the task of writing the driver onto multiple application development projects and just watch the cost skyrocket.

Originally posted by: KIAman
But what if our priority shifted to efficiency and speed? Assuming the cost of hardware continues to be driven down, it wouldn't be far-fetched to imagine an "internet" machine with just a single chip integrating the CPU, GPU, IO, memory, etc., and with the browsing "software" integrating its own tools to utilize that hardware, reducing hardware cost, power cost, and storage cost, and increasing speed, with no bloated OS offering a million different functions and APIs (and their cost, storage, and overhead) for such a specialized machine.


There is no change at all in this scenario. So the UI has become your browser, the API is whatever is offered by the web development packages, and the physical computer is distributed around the universe. So what? Read my previous post carefully, and note the abstract nature of the language I used. So... the OS has become a collection of schedulers and device drivers, each located wherever is appropriate in this cloud of hardware resources located anywhere except close to your home. The OS still has to exist, and it still has to perform the same function. I deliberately wrote my response so as not to exclude possibilities such as these, even though I think they are unlikely to come to pass.


Originally posted by: KIAman
Anybody remember the days of OpenGL vs Direct3D in 1996?


After my time, I'm afraid.



Peter

 

Nathelion

Senior member
Jan 30, 2006
697
1
0
Speaking of brain interfaces, I think it is important to specify what is actually meant by such a device. One could imagine a nerve interface: for example, you could insert sensors along the arm to read input through the nerves that govern hand movements (and perhaps even provide feedback through sensory nerves). All you would have to do then is build a control interface "protocol" based on those inputs, and you would have a "brain interface" of a sort. I think this sort of thing will become feasible long before any of the more esoteric options do. Granted, such an interface would not be intuitive (unless the control protocol were directly analogous to that of a human hand), but if it were useful enough, people would learn it, just like we currently learn to type quickly on QWERTY keyboards. This kind of interface would allow people to operate computer systems, and whatever physical systems they are connected to, effectively as extensions or replacements for their own body.
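
To show what I mean by a control "protocol", here is a completely made-up sketch (the channels, values, and thresholds are all hypothetical): raw nerve readings get mapped onto a handful of discrete commands, and the user learns which "gesture" produces which reading.

/* Completely hypothetical sketch of a nerve-input "protocol": a couple of
 * sensor channels produce activation levels, and simple thresholds map them
 * onto discrete commands. A real system would need per-user calibration and
 * serious signal processing; this only illustrates the mapping idea. */
#include <stdio.h>

enum command { CMD_NONE, CMD_CLICK, CMD_SCROLL_UP, CMD_SCROLL_DOWN };

/* hypothetical per-channel activation levels, normalized to 0..1 */
struct nerve_sample {
    double flexor;    /* "close hand" channel */
    double extensor;  /* "open hand" channel  */
};

static enum command decode(struct nerve_sample s) {
    if (s.flexor > 0.8 && s.extensor < 0.2) return CMD_CLICK;
    if (s.extensor > 0.8 && s.flexor < 0.2) return CMD_SCROLL_UP;
    if (s.flexor > 0.5 && s.extensor > 0.5) return CMD_SCROLL_DOWN;
    return CMD_NONE;
}

int main(void) {
    struct nerve_sample demo[] = { {0.9, 0.1}, {0.1, 0.9}, {0.7, 0.7}, {0.2, 0.1} };
    const char *names[] = { "none", "click", "scroll up", "scroll down" };
    for (int i = 0; i < 4; i++)
        printf("sample %d -> %s\n", i, names[decode(demo[i])]);
    return 0;
}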

There is also the rather primitive stick-an-electrode-in-someone's-brain-and-see-what-regions-activate-when-doing-x approach that is currently used, for example in the OCZ device, albeit in an extremely primitive version. These approaches are already being pursued; the only problem is that to get sufficient resolution, electrodes must be surgically implanted into a person's brain. For that reason, research is currently limited to animal experiments. I think this line of research might be viable, but I can also see problems, such as that every action has to be performed and recorded ("taught" to the system) before the interface will recognize it, and there are differences in brain activation patterns between individuals. The approach outlined in the previous paragraph is essentially the same approach, but with single-nerve resolution and done elsewhere in the body. The reason I separated the two is that I am inclined to believe single-nerve resolution will require surgical modification and disassembly of surrounding tissue that is not feasible inside the brain, for obvious reasons.

Then there is the sci-fi idea of the computer reading your thoughts directly and acting on them. I'm not sure exactly what such an approach would entail, because I don't really know what a thought is. No one does, not in any sense useful in this context, at any rate. While such a thing may be theoretically possible, and may even be practical some day in the distant future, it would require an understanding of the brain and of the very fundamental mechanisms of consciousness that we simply don't have today, and won't have for a looong time to come. My guess is that by the time the brain and the mind are so well understood that this is possible, we will be able to migrate human minds to non-organic computational machinery anyway, rendering the debate academic.


 

SunSamurai

Diamond Member
Jan 16, 2005
3,914
0
0
I highly doubt the cloud is the future of computing on any level but business. The simple matter of how people torrent can tell you that. People do not want private stuff maintained by people who aren't them.


Also, if you want to look at the "subscription" or "licensed" future of software, look no further than the grave EA is digging with many of their games. People want to OWN their goods unless renting them comes with some VERY GOOD added features.