Originally posted by: JasonCoder
<div class="FTQUOTE"><begin quote>
Originally posted by: Yanagi
Why would discrete graphics cards be obsolete just because we have a few extra general-purpose cores in our systems?</end quote></div>
Several reasons, actually:
<div class="FTQUOTE"><begin quote>Now, if you add in GPU functionality to the cores, not a GPU on the die, but integrated into the x86 pipeline, you have something that can, on a command, eat a GPU for lunch. A very smart game developer told me that with one quarter of the raw power, a CPU can do the same real work as a GPU due to a variety of effects, memory scatter-gather being near the top of that list. The take home message is that a GPU is the king of graphics in today's world, but with the hard left turn Sun and Intel are taking, it will be the third nipple of the chip industry in no time.
Basically, GPUs are a dead end, and Intel is going to ram that home very soon. AMD knows this, ATI knows this, and most likely Nvidia knows this. AMD has to compete; if it doesn't, Intel will leave it in the dust, and the company will die. AMD can develop the talent internally to make that GPU functionality, hunt down all the patents, licensing, and all the minutiae, and still start out a year behind Intel. That is if all goes perfectly, and the projects are started tomorrow.</end quote></div>
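For anyone wondering what the "memory scatter-gather" point in that quote actually means: a gather reads from arbitrary (non-contiguous) memory locations and a scatter writes to them, which CPUs with flexible memory systems handle more gracefully than GPUs of this era. Here's a minimal sketch in Python just to show the access pattern (the function names and the vertex/index-buffer example are mine, not from any real graphics API):

```python
def gather(src, indices):
    # Gather: read from arbitrary, possibly non-contiguous positions in src.
    return [src[i] for i in indices]

def scatter(dst, indices, values):
    # Scatter: write values to arbitrary positions in dst.
    for i, v in zip(indices, values):
        dst[i] = v
    return dst

# Hypothetical example: fetch vertices through an index buffer (a gather),
# then write results back to scattered output slots (a scatter).
vertices = [10, 20, 30, 40, 50]
index_buffer = [4, 0, 2]
fetched = gather(vertices, index_buffer)        # -> [50, 10, 30]
out = scatter([0] * 5, index_buffer, fetched)   # -> [10, 0, 30, 0, 50]
```

On real hardware these aren't function calls but memory operations: the cost is in the irregular addresses, which defeat the wide, sequential memory fetches GPUs are built around.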
This is essentially one of the motivations AMD had for purchasing ATI, but there's also some good info and links there on the future of discrete graphics solutions... or lack thereof.