I just don't think x86 is viable at <15 W, and because innovations in heat dissipation and battery capacity come so slowly, the easiest way to improve performance is efficiency. At some point Intel should have looked at x86, realized it wouldn't be viable there, and built something different. Instead, they attempted to engineer around the problem with their process tech (Bay Trail).
I may be wrong, and I know most people think I am, but I think this is an area where PC geeks have their minds clouded by irrational hatred for Cupertino.
First and foremost: I don't have any hatred, let alone an irrational one, for Apple. I've had an iPhone (two, actually), had an iPad, and my family members currently have iPhones, iPads, and even my pa has that new shiny iMac. It's neat.
I just personally don't use Apple hardware (anymore), don't find it to have a particular draw (for me), and think they're on a bit of a different ecosystem (they... are).
I agree. ARM has some inherent benefits over x86 in the <15W range. There's no getting around that.
But gotta stick something out front and center: Windows isn't on ARM. Sad to say it again, but if the software ain't there, neither are 99% of people. People don't walk around rendering on their Android tabs for a reason, and it's not a lack of power; it's that you feasibly can't on ARM. It's an entirely different ecosystem meant for an entirely different purpose. I'm not denying it: ARM has inherent performance and efficiency benefits in the <15W space. The proof is in the pudding. But the software unfortunately isn't on the ARM side and won't be for a very nice chunk of time.
Intel will have quite a bit of longevity with x86_64, at least until Android or some other ARM-based OS (let's be real here, it'll be Android or iOS, but most likely Android) has all the software most any consumer could want, which is still quite far away. Until then, I'll love window shopping W8 tabs yet likely won't buy one, because the form factors I want only come with 1GB of DDR3 instead of the 2GB really needed to make the system fully usable (and trust me, the CPU isn't the issue here; it's OEMs saving a couple dollars and making an inferior product).
ARM has its draws and benefits, and I can certainly see there being a day when I'm using an ARM-only device as my primary device. But until I get a Windows-like experience (i.e., one where I get all the software I could want) in the ARM ecosystems, I, like most people, won't be boarding the ARM train and will have to stick with x86.
And in x86, Intel has a "bit" of the upper hand.
Small tidbit: don't devalue process. For some reason people around here seem to think that using your lithography process' advantages is some kind of "you're cheating! That's not equal!" trick, when in reality Intel worked really hard to make a good process, and they have. Let them use the damn thing.