Originally posted by: MrChad
Originally posted by: VirtualLarry
...
:Q I haven't seen you around here in ages. How's it going?
I'm not sure how I would even begin explaining, so I won't, but I guess the answer to your question is, "not so well". But I'm hoping things will get better soon. Both myself, and the state of the art in OSes. I'm getting to the point that I can barely stand computers these days. I'm either a dreamer, or a madman, or both. Software technology needs to make a leap soon, otherwise we will all be mired in a world filled with crapware (malware, bloated useless apps, etc.). My dream is to "clean things up", so to speak, in the software world. Everything is designed so utterly, horribly, backwards. :|
Or to put it another way: I was on the verge of one of the biggest breakthroughs of my existence, or so I thought, when I had a bit of a breakdown myself. So I'm in the process of recuperating, and trying to get back up to speed.
Software, and OSes in particular, need to be organic, self-similar, robust in the presence of failure, and still easy for end-users to manipulate. No small task. Interestingly, I came across a paper discussing the origins of virtual memory and the "working set" back in the '60s, and discovered that several techniques I had envisioned for my OS design were already invented way back then, and then fell into disuse.
The idea was that every subroutine has its own protected memory space, sort of an object-oriented virtual-memory design, which also effectively used segmented pointers, a sort of global:local hybrid. Really, from a CS POV it all makes perfect sense, but it boggles the mind that current systems apparently aren't even designed with these sorts of principles in mind. Here's the link:
Origin of Virtual Memory and Working Set (PDF)
My idea, building on top of that, was that the OS could in fact undo and re-route subroutine calls to an alternate implementation with an identical interface contract, should one of the implementations encounter an exception. If done properly, it would also make it trivial to hibernate/persist individual processes, and then migrate those processes to other systems. Just like relocatable, self-relative object/machine code can be moved to different memory addresses and still function, so too can self-relative processes and their resources be moved to different systems. To say nothing of the utility of a save/load-state feature for individual apps. (Much like game emulators let you save the state of the system.) That would also likely serve as the basis for a periodic point-in-time snapshot feature, in case there was any sort of exception or incident. Likewise, global system state changes should be fully transactional; the global system state should never be allowed to sit in an indeterminate/intermediate state.
But the software guys seemingly NEVER LEARN from the hardware guys, who learn from the real (theoretical) CS guys. Sure, software systems are complex and getting more so, but that doesn't change the fundamental underpinnings of system architecture and design. I mean, who just starts building houses without understanding gravity and the other forces of physics, and how the load-bearing elements of the structure have to be placed for the building not to collapse? And yet, our computer systems still sometimes collapse and fold up like a house built from a deck of cards. (That was more true in the days of Win9x.)
I'm just rambling again, but there are all these beautiful ideas, most of them heavily (or fully) researched in the '60s and '70s. Then the PC revolution happened, and it seems like most "cowboy" programmers decided to forget all about real CS and just go their own way, and that's how we ended up with the disorganized kitchen-sink OS we call "Windows", where every programmer seemingly created their own new API call to suit their need.
HRESULT STDCALL _WashDishes(void * kitchenSink, void ** pUnknown);
I wouldn't be surprised to find out something like that actually existed, somewhere deep in the bowels of NT...