Let's look at this long term, say five or so years out, the design cycle of a modern CPU. As we've noted earlier, the x86 CPU is about to take a radical turn, and the designs you will see at the turn of the decade won't resemble anything you see now. What do we mean by that? Mini-cores and Larrabee.
Until Sun came out with Niagara, modern CPUs were big, fast, hot out-of-order execution (OoO) beasts that ran a single thread as fast as possible. Programmers were stupid creatures that had to have their work done for them in hardware, and elegance was the domain of game developers of yore. Fat cores were in.
Then came Sun with a hard left turn: lots of little, stupid cores that can do more in aggregate than a single big core. It had been tried in the past, but not with a modern ISA for mainstream use. If your application fit the bill, which in Sun's case meant, more than anything else, little to no FP code, it simply flew. If it did not fit, well, you had problems. Can we offer you one of our other more conventional products?
The first salvo in the modern mini-core wars was fired, and the world changed. Now, Sun is on the verge of releasing Niagara II, and Niagara III is sure to follow. Intel was not about to let this winning strategy go unchallenged, and now has enough mini-core projects going to fill a phone book.
Kevet and Keifer were a mini-core and a CPU made of 32 of those cores respectively, aimed at server workloads. Keifer aimed at four times what Niagara was reaching for, but also five years later. Intel is going for the swarm-of-CPUs-on-a-slab approach to high performance CPUs, and more importantly, is going to upgrade the chips on a much swifter cycle than we've been used to.
With 32 small and simple cores, you can design each core much more quickly than a normal CPU. Design complexity, verification and other headaches grow almost geometrically with the size of a core. A small core cut and pasted 32 times can mean smaller teams doing more real work instead of busy work, and more teams tweaking things for niches.
We think Intel is aiming at a much more rapid design upgrade cycle, most likely yearly, and much more niche-aimed CPUs. If you can make a new core with 1/10th the effort, and put it in an already existing and verified infrastructure/interconnect, then you can revamp your lineup with a rapidity that would be flat-out impossible today.
Now, if you add GPU functionality to the cores, not a GPU on the die, but integrated into the x86 pipeline, you have something that can, on command, eat a GPU for lunch. A very smart game developer told me that with one quarter of the raw power, a CPU can do the same real work as a GPU due to a variety of effects, memory scatter-gather being near the top of that list. The take-home message is that the GPU is the king of graphics in today's world, but with the hard left turn Sun and Intel are taking, it will be the third nipple of the chip industry in no time.
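To make the scatter-gather point concrete, here is a minimal sketch (illustrative Python, not anyone's actual pipeline) of the access pattern in question. Every element is read from, or written to, a data-dependent address, the kind of irregular memory traffic a general-purpose core with big caches handles naturally, and which streaming-oriented GPU memory systems of the day handled poorly.

```python
def gather(table, indices):
    """Gather: read from arbitrary, data-dependent addresses.
    Each output element comes from table[indices[i]]."""
    return [table[i] for i in indices]

def scatter(values, indices, size):
    """Scatter: write to arbitrary, data-dependent addresses.
    Each values[i] lands at out[indices[i]]."""
    out = [0] * size
    for v, i in zip(values, indices):
        out[i] = v
    return out

# The addresses are not known until the index data is read,
# so the accesses cannot be arranged into neat sequential streams.
print(gather([10, 20, 30], [2, 0, 1]))   # reads in a shuffled order
print(scatter([5, 6], [1, 3], 4))        # writes land at indices 1 and 3
```

The function names are just illustrative; the point is only that the memory addresses depend on the data itself, which is what makes the pattern hostile to hardware built around predictable streaming access.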
Basically, GPUs are a dead end, and Intel is going to ram that home very soon. AMD knows this, ATI knows this, and most likely Nvidia knows this. AMD has to compete; if it doesn't, Intel will leave it in the dust, and the company will die. AMD could develop the talent internally to make that GPU functionality, hunt down all the patents, licensing, and all the minutiae, and still start out a year behind Intel. That is if all goes perfectly, and the projects are started tomorrow.
The other option is to buy a team of engineers that produce world-class products, are battle tested, and have a track record of producing product on the same yearly beat Intel is aiming for. There are two of these in existence, ATI and Nvidia. Nvidia is too expensive, and has a culture that would mix with AMD like sand and Vaseline. That leaves ATI, undervalued and just as good.
So, build versus buy for long-term strategic competitiveness? The choice is obvious: you have to buy. This will put AMD about 12-18 months behind the first of the mini-cores from Intel, about the range AMD is behind for everything else. Intel bites the bullet and proves the market, then AMD steps in. Here, AMD is going to let Intel do the heavy lifting, and then waltz in at the right time.
Long term, buying ATI is the only thing AMD can do to survive. It will bring some short term pain, and Wall Street will simply not have a clue once again, but there is no doubt that it is a necessary thing.
The more interesting time is mid-term, in the year to three year range. ATI has two sets of deep engineering knowledge that AMD can suck in and benefit from, memory controllers and PCIe. AMD is integrating both into the CPU, so ATI engineering can help greatly there. On the flip side, AMD has world-class manufacturing facilities that ATI can make GPUs and chipsets on without paying an arm and a leg to use. This is a win/win.