mmntech
Lifer
I think computers have reached the point of being good enough for the majority of people. The focus these days is on the mid-range and low end. Truth be told, the iPad is plenty of computer for your average user. It's not like the 90s when people needed the latest and greatest. Most people just use their systems to surf the web, watch YouTube vids, and IM. You don't need an i7 for that.
True. x86 is getting a little long in the tooth. Revisions aside, most computers today are built on technology that's 30 years old. There's probably a better way of doing it, but logistics get in the way: everything has to be built from the ground up again, and people want that legacy support, especially businesses. Just going to 64-bit wasn't exactly smooth. Even if we do develop a new architecture, it's going to have to incorporate some sort of x86 virtualization.
I disagree. It is easy to write something that taxes the hardware beyond what it is currently capable of (x264, for example, can easily bring an i7 down to a slow 1.5 fps, or lower with the right filters). I think it is more of a problem with architectural design. Just about every corner that could be cut, every known optimization, etc. has been implemented in hardware. Now they're stagnating, adding more and more cores in the hope that some magical enhancement will come along.
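The diminishing returns from piling on cores have a name: Amdahl's law. A quick sketch of the math (the numbers here are mine, purely for illustration, not from anyone's benchmark):

```python
# Amdahl's law: if a fraction p of the work parallelizes, the speedup
# on n cores is capped at 1 / ((1 - p) + p / n).

def amdahl_speedup(p, n):
    """Theoretical speedup on n cores when fraction p of the work is parallel."""
    return 1.0 / ((1.0 - p) + p / n)

# Even with 90% of the work parallelized, doubling cores quickly stops paying off:
for cores in (2, 4, 8, 16, 64):
    print(f"{cores:3d} cores -> {amdahl_speedup(0.9, cores):.2f}x")
# The ceiling is 1 / (1 - 0.9) = 10x, no matter how many cores you add.
```

That last serial 10% is exactly why "just add cores" stops being a magical enhancement.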