Well I think they certainly NEED a MUCH BETTER OS than their current offerings for
desktop / workstation uses (I won't even START on SERVER issues).
On the other hand, their current offerings SUCK SO BADLY (e.g. VISTA) that it's questionable whether
they have the competence / capacity to do the radical improvement / reengineering needed in
a reasonable span of time.
The hardware capabilities are VERY far ahead of the OS's ability to take advantage of them now, and
that gap will be 3x worse by the time the next generation OS can be out even if that's 2 years from
now.
If it is going to take more like 5-7 years to do it with quality, they might as well start from
scratch and just plan to run today's legacy applications in a VM / emulator / sandbox / something.
What are the basic functions of an OS?
Control / manage the system memory / hardware: FAIL.
they didn't even get 64 bit working for the MAINSTREAM users with VISTA 64 or XP64; 64 bit PCs
have been the STANDARD CPUs for like 6 years now for anyone who's built a relatively "modern" /
"full featured" desktop. Beyond pervasive 64 bit, they'll have to start intelligently using memory,
manage things like sleep / hibernate better, do MUCH BETTER than the pathetic
readyboost / superfetch type stuff. We're looking at potentially common RAM capacities of 32GB+ and
solid state disk capacities into the terabyte level for a next generation OS.
Control / manage storage: FAIL
The average new affordable commodity disk drive today is bumping around the 500GB-1TB mark,
and clearly it'll be able to be 2TB+ per drive including extensive hybrid / SSD options too if
technology progresses as we may expect within a couple of years. So people will need to be
managing on the order of 2TB-10TB data stores just on their personal PCs.
This is just PHENOMENALLY overwhelmingly transcendentally beyond ANY
BACKUP, SEARCH, DATA ORGANIZATION, SECURITY technologies that exist in any
desktop/workstation OS from Microsoft, and certainly also beyond the capacity of any
APPLICATION INTERFACE they've provided to let people USE their data.
Currently XP / VISTA FAILS MISERABLY beyond the ~260 character MAX_PATH limit for
a file: backups stop working, copies mysteriously fail on those parts of the data, applications
can't open the files, applications don't even DISPLAY the path/filename properly, etc.
How much worse would it be when one has 10x the amount of storage as today's typical amount?
The very linear concept of a tree structured filesystem is wholly inadequate for that capacity
of general purpose storage for a workstation. You're not going to often care about what the
file NAME or even PATH is, you're going to care about the CONTENT and its metadata,
WHO made it, what's the TITLE, what's the SUBJECT, what's the DATE, what's the VERSION,
is there an UPDATE, WHEN you need to respond to it, WHEN you want to watch it (e.g. multimedia),
WHAT it relates informationally to, etc. Using your own 'disk drive' will be more like surfing the
internet, using Wikipedia, using TIVO / DVR, etc. You've got to be able to
NAVIGATE TO WHAT you want, download / install it as needed, have organization / backup be
essentially automatic, FIND content, and ACCESS it topically, relationally.
You should essentially NEVER *LOSE DATA*, and NEVER NEED to ERASE DATA
once you've got it, unless you go out of your way to block / reject / purge it.
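A toy sketch of what navigating by metadata instead of by path could look like (all names here are hypothetical, nothing like a real WinFS API): content is described by WHO / WHAT / WHEN fields and tags, and you query those fields rather than remembering a filename:

```python
from dataclasses import dataclass, field


@dataclass
class Item:
    """A stored object described by metadata, not by its path."""
    title: str
    author: str
    date: str               # ISO date string, e.g. "2008-06-01"
    subject: str
    tags: set = field(default_factory=set)


class Catalog:
    """Toy metadata catalog: find content by WHO / WHAT / WHEN, not filename."""

    def __init__(self):
        self._items = []

    def add(self, item):
        self._items.append(item)

    def find(self, **criteria):
        """Return items where every given field matches; 'tag' tests
        membership in the item's tag set instead of equality."""
        hits = []
        for item in self._items:
            ok = True
            for key, value in criteria.items():
                if key == "tag":
                    ok = value in item.tags
                else:
                    ok = getattr(item, key) == value
                if not ok:
                    break
            if ok:
                hits.append(item)
        return hits
```

So `catalog.find(author="Grandma", tag="photos")` replaces digging through `C:\Documents and Settings\...\My Pictures\New Folder (3)`.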
WINFS was SUPPOSED to be a step in VISTA to start using file metadata, search more
intelligently to organize, access, manage, categorize, annotate your data. They couldn't
even manage to deliver its limited functions in all the years of development, and instead canceled
the feature. Even in its "wish-list" form it would have been inadequate for today's needs, and
extremely so for the future, but would still have been orders of magnitude better than the
present almost useless status quo. You could *literally* almost spend the rest of your natural LIFETIME
just LOOKING for information on a 2TB data store filled with things like texts, emails, web-pages, etc.
given poor organizational / metadata / search / database tools. It's GOT to be MUCH more automated.
It was easier to use the paper card catalog in a library 40 years ago than to find a PDF / Word processor
document / hypertext document you're looking for on your OWN COMPUTER today. Pathetic.
Security, stability? Come on, it has to be basically intrinsically correct and robust. There's
just NO EXCUSE for things like buffer overflow / data type overflow security / reliability problems
to continue to exist in 2008, certainly not in 2010. It should just NOT be possible to have ordinary
input data corrupt the functionality of the whole system. It should just NOT be possible to have
data, once stored, be corrupted or deleted / modified without the user's intention. Can you
IMAGINE losing 10TB of data? EVERY photograph your family has EVER taken, all the
baby pictures, school graduations, birthdays, memories with deceased relatives, etc?
EVERY music/movie you "own" since it seems like physical media "copies" of these things will
be getting rarer and rarer. EVERY email, EVERY document? It doesn't matter if your hard
drive crashes and needs to be replaced, it doesn't matter if there's a fire, it doesn't matter
if you download malicious "end user" programs, that's the kind of data preservation DESIGN
we'll need IMHO going into the next generations of computing. Seamless backups, restores,
merges, re-organizations, indexing, searching, sharing, ubiquitous access locally, over the
internet, wireless, to portable devices, whatever. Use ECC, use RAID technologies, use
distributed filesystem technologies, use encryption, etc. Just as you can surf your favorite
web sites and not be afraid you'll accidentally DELETE them or CORRUPT them if YOUR computer
gets a virus or your browser flakes out, you should have the same confidence that you can always
navigate / find / access your content regardless of hardware failures, upgrades, replacements,
software glitches, etc. It's not beyond current software or hardware technologies to accomplish
these things, but the architecture by which we use these things must fundamentally
evolve in a quantum leap NOW.
Communications / Accessibility / Ubiquity? As has long been said, "the network is the computer".
Even the *INTERNET* is pathetically bad at actually efficiently helping people find and
access information. Google? Give me a break, it's a pathetic hack. It's time for the SEMANTIC
WEB, it's time that tools in the OS, in Office Suites, in publishing tools, etc. start to work with the
SEMANTIC content of and metadata ABOUT information. Looking for a recipe for chocolate cake?
OK, that should be able to be found without resort to string searching terabytes of unformatted
data looking for keywords. Microsoft controls a lot of the OS market, and has controlled a lot of
the office document editing / authoring market for years. In a next generation OS / office application /
authoring / workflow type system they'll have to provide integrated AI as well as UI architecture
so that informational resources are efficiently tagged / codified from the start. Don't build
pathetic parsing searches into the OS except as a last resort, get things right from the start,
natural language processing, semantic extraction / indexing, speech recognition, handwriting
recognition, library science, etc. Ok, paper-clip, puppy, Google, tell ME what informational
resources on the NET or locally or in the library exist that relate to what projects I'm working on.
No, I don't want to click 13000 times for the next 20 results, I want it all in a database,
I want all my files in a database, I want all my projects, workflows, requirements, interests, studies
in a database. Make it semantic. Make it correlative. Make it language neutral as much as possible.
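The "string searching terabytes" complaint has a well-known answer: build the index once, then answer queries against it instead of re-scanning everything. A toy inverted index (plain keyword matching, nowhere near true semantic search, but it shows the shape of the database-backed approach being demanded):

```python
import re
from collections import defaultdict


class InvertedIndex:
    """Toy word -> document index: ingest documents once, then answer
    queries without re-reading any document text."""

    def __init__(self):
        self._postings = defaultdict(set)

    def add(self, doc_id, text):
        """Tokenize the text and record which document each word appears in."""
        for word in re.findall(r"[a-z0-9]+", text.lower()):
            self._postings[word].add(doc_id)

    def query(self, *words):
        """Return the IDs of documents containing ALL the given words."""
        sets = [self._postings.get(w.lower(), set()) for w in words]
        if not sets:
            return set()
        result = sets[0].copy()
        for s in sets[1:]:
            result &= s
        return result
```

Looking up that chocolate cake recipe becomes a set intersection over postings lists, not a crawl over the whole data store.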
Ok, so I've got basically a supercomputer of 20 years ago on my desk and more e-storage capacity
than a typical city library has book-space for. Hello, Rosetta stone, show me what I've been
missing in Mandarin, Greek, Gaelic, search, translate, correlate, heck teach me new
languages, new fields of study. MIT's had a lot of its courseware online free for years,
I don't want next-generation paperclip / search doggie to teach me how to FIND the
PRINT MENU in WORD, I want it to teach me Russian and topology with the assistance
of 3D models / CAD / computer algebra in my DX12 GPU, my 16-Core 100 GFLOP CPU, my
32GB memory, and my 32 channel XXX-Fi surround sound system and new 3D holo-projector.
Ok, maybe that's asking a lot, but a man's reach should always exceed his grasp; the
hardware technologies are either here or soon to be so, and the software's certainly the same.
Why am I being inconvenienced about not even being able to burn a DVD in VISTA 64
(a problem that ought to have been solved like 15 years ago), when THESE are the kinds
of challenges OS, APPLICATION designers should have ALREADY been tackling and
will certainly NEED to do so ASAP?
Running an OS like VISTA on a computer 5 years from now will be like booting DOS on your
current quad-core 8GB RAM / 500 GB disk system... yeah you CAN OF COURSE do all that
petty stuff, but, really, it's all like BELOW your dignity / usefulness to even WANT to micro-manage
anything at the level of a mere text mode program, 8.3 tree filesystem, trivial utilities / programs, etc.
Ask not what you can do ON your computer, ask what your computer can do FOR YOU!
Don't DESIGN your OS / major Application Suite for *yesterday's hardware*;
design it for something more like HAL-9000, THEN add in some simplification layers
to help adapt the few areas where the software / hardware is lacking in the next 10 years
until we're fully there on the SW/HW front.
Because at the current rate I'm betting they'll still be fighting about BLU-RAY vs HD-DVD
(i.e. WHO THE HECK CARES, it's like arguing about PUNCHED CARD vs PAPER TAPE
for your next VISTA laptop) when things like quantum computing, DNA computing,
holographic storage, etc. all come to practical fruition, very possibly within the next 5-15 years.
Even without those technologies at the very LEAST MOORE'S LAW will give us like 10x improvement
over current supercomputer node level technology by then just using traditional
multi-core designs.
Don't FRET about the RIBBON UI design for OFFICE 2009 because you should really be
working on AI PERSONAL SECRETARY 2010 that's like at least MAX-Headroom if not quite HAL yet.
The true value of a next generation OS / Application design isn't going to be in throw-away
non-reusable CODE, it's going to be in the TIMELESSLY VALUABLE *enduring human
knowledge / heuristics / "real world" data* they put in it. A Dewey Decimal paper catalog
index interface to a decently stocked LIBRARY is a far more useful information
RESOURCE than VISTA 2.0 on a personal supercomputer if its main benefits are
a prettier GUI, more DRM, and 3D video games. Add in content that's about REAL
information, REAL world concerns, then it becomes a useful TOOL.
e.g. google earth / world wind / GIS vs. a paper map.