Originally posted by: postmortemIA
It is the opposite: graphics cards have higher processing power and constant innovation (new cores come out every 6-9 months).
As for intel innovation, the C2D is a Pentium III Coppermine-like core, with many tweaks of course, but nevertheless. And the AMD Athlon 64 series core has been around for 3-4 years now...
No matter how much money intel invests, they cannot overcome the difference in technology that ATi and nVidia possess. It is like saying a technologically backward nation can become a global leader in one year.
This move is more out of fear of AMD: just in case AMD pulls off something like a Fusion CPU-GPU, intel won't be far behind.
Like I said, intel has been making graphics chips for about as long as ATI has, and actually a lot longer than nVidia has. I'm pretty sure they already had DX10 integrated graphics in the works, and again, with graphics chips, once you have the core feature set covered they're very scalable: the designers can just add on tons of pipelines, widen the memory bus, etc., and the chip will usually scale much better than a CPU would (again, because they tend to be "simpler", although with DX10 they are approaching CPU-like complexity).
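To put rough numbers on the "just add pipelines" point, here's a quick back-of-envelope sketch. All the figures (pipeline counts, clocks, bus widths) are made-up assumptions, not real parts; the point is just that peak fill rate and memory bandwidth scale roughly linearly with pipeline count and bus width, which is what makes a GPU core comparatively easy to scale up or down.

```python
# Back-of-envelope sketch of why GPU designs scale by "adding on" units.
# All figures below are illustrative assumptions, not real chip specs.

def gpu_peak_numbers(pixel_pipelines, core_clock_mhz, bus_width_bits, mem_clock_mhz):
    """Rough peak throughput for a hypothetical GPU configuration."""
    fill_rate_mpix = pixel_pipelines * core_clock_mhz                 # Mpixels/s, scales linearly with pipelines
    bandwidth_gbs = bus_width_bits / 8 * mem_clock_mhz * 2 / 1000     # GB/s, assuming DDR memory (x2 per clock)
    return fill_rate_mpix, bandwidth_gbs

# A "base" part vs. a scaled-up part: double the pipelines, double the bus width.
configs = {
    "base part":   (8, 500, 128, 700),
    "scaled part": (16, 500, 256, 700),
}

for name, cfg in configs.items():
    fill, bw = gpu_peak_numbers(*cfg)
    print(f"{name}: {fill:,.0f} Mpix/s fill rate, {bw:.1f} GB/s memory bandwidth")
```

With these hypothetical numbers the scaled part simply doubles both figures; a CPU can't buy single-threaded performance anywhere near that cheaply by adding execution units.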
If you look at nVidia, an upstart with the TNT2 and GeForce cards, they did amazingly well in a short period of time and pretty much took over the market. Intel should never be doubted or second-guessed, as AMD recently learned with the C2D.
The C2D is miles ahead of a P3 and far more advanced in many ways.
The fact that CPU designs only get a major product-line refresh every 3-4 years doesn't mean much to this discussion.
I'm interested to see what intel can crank out in terms of a graphics core using their advanced fab processes. :beer: