To frame my next response, I want to point out a couple of things that happened not all that long ago and didn't concern Intel at all.
AMD almost launched a 4850 that would have been utterly obliterated in the market; it wouldn't have been viable at launch in the $100 price segment. Very late in the game Wavey (Dave B) convinced them to make changes to the board, increasing its performance a very healthy amount and turning it into a part that made some serious waves in the market. They seriously underestimated where nVidia was going to be and almost embarrassed themselves by launching a part that would have been too slow, with too little RAM, to be remotely competitive.
nVidia, not all that long ago, launched a couple of different parts at more than $100 over where they needed to be because they underestimated the competition (or perhaps accurately estimated where ATi would have been if not for those changes) and assumed they would have a clear lock on the first few tiers of high-end performance in the graphics market.
AMD and nVidia are, beyond any doubt, the top two companies in the field of graphics hardware today. No one else has come out with anything remotely in their league, and many companies have rolled over and died in their wake.
Both of them can and do make some fairly large errors of judgment.
It does naturally beg a degree of self-assessment though - unless we presume ourselves to be superiorly intelligent to the decision makers at Intel, we must assume Intel knew ALL of this before they even assembled the layout team for Larrabee nearly 3 yrs ago.
Three years ago Intel made several very poor predictions. They did not keep these secret, so it's pretty easy to tell what they were. For many years GPUs trailed CPUs by several process nodes, with both AMD and Intel sitting two to five steps ahead of the GPUs. As time moved on, the GPUs got closer and closer to the CPUs, and eventually we ended up where we are today: 40nm GPUs are arriving at about the same time volume shipment of 32nm CPUs should start. Intel saw this as a sign that GPUs were not going to be able to continue their torrid increases in performance, since they were pushing up against the same type of limitations Intel themselves had.
Intel decided that the days of GPUs scaling were over.
Those are not my words, those are Intel's. We know, as a point of fact, that they were wrong. Anyone with even a moderate level of capacity and understanding knew they were wrong back when Intel first made that call. It did not take consultation with the top engineers in the world to conclude that, at best, Intel were idiots for making it. Not borderline, either; they were flat-out morons.
Another very poor prediction they made was that rasterizers were running into scaling limitations. Nothing ever indicated this to be true in any way whatsoever; there has never been even the slightest evidence of it, so I honestly can't say where their logic came from. I suppose, from the standpoint that we were closing in on being 'fillrate complete' (a term I made up; a simple way of saying fillrate is a 'solved' issue, and while we aren't there yet we are close), they could state that rasterizers couldn't improve much on that front. But when something already handles a problem nearly perfectly, it doesn't seem like a good plan of attack to go after that area. A company with any real knowledge of the graphics industry would have known that the moment raw fill was no longer a concern, the die space devoted to shaders would see an exponential increase. But this is the same company that made the staggering mistake of declaring that the days of GPUs scaling were over. We know, for a fact, that they were wrong in no uncertain terms.
Just saying we've got to be giving ourselves a lot of credit in the grey matter department and assuming Intel's decision makers are operating with a commensurate depletion of it in order to be so confident as to assume we know what they don't and that we foresee in their competition something which they do not.
I've been making calls like this for over a decade on these forums (my registration date was the day these particular forums went live; I was here for a while before that). I have no problem at all standing by my record on predictions relating to the graphics industry and the many flawed attempts by all sorts of companies with FAR more expertise in the graphics industry than Intel. Of course, every time I point out what is certain to be a rather terrible failure, the loyalists for that company get extremely upset and always bring up the argument that the people making these choices know a lot more than I do. Of course, those people have all since lost their jobs because, clearly, they didn't.
In reality, the biggest mistake companies make is their own arrogance when undertaking a project such as this.
This is where we get to the biggest mistake Intel made with Larry. They thought ray tracing would be great because it's what they use in movies. They looked over where they were and where they were likely to be three years out, and decided that they could build a real-time ray tracer using software emulation to allow for complete flexibility, and that no one would be able to do anything remotely comparable. This is a lesson they should have learned from Itanic: developers aren't going to switch unless you give them a BIG reason to do so. Even with a custom software rendering engine hand-tuned by Abrash himself (who I would wager heavily is the top person they can hope to enlist; he taught Carmack tricks back in the day), Larry still will not be able to compete with GPUs. Could it have, if Intel's prediction years ago that GPUs were no longer scaling had been accurate? Yes, it could have. But Intel was so shockingly wrong with their initial prediction that it isn't really close.
Getting game developers to move from rasterization to ray tracing is as much of a hurdle as getting developers to move their mainstream applications from x86 to IA64 was, and we know how great that worked out for Intel.
Itanium? i740? NetBurst.... well, OK, I didn't know in advance that NetBurst was going to be a huge failure, but the other two were very obvious from the outset. They weren't close calls, either; the entire concept for market acceptance was at best moronic.
Intel is, without a doubt, utterly brilliant in the x86 market. Outside of that, their record places them far closer to 'utterly inept and bankrupt' than to 'dominating market force'.