I think Azn has a point, though we must keep in mind that HT3 and CSI will redefine what it means for a component to be integrated. Both HT3 and CSI should let you plug coprocessors and/or add-in cards into a board in much the same way that processors plug into multi-socket SMP systems today. It will make more sense for mobo manufacturers to sell boards with an extra processor socket (or, in the case of AM3 boards, extra sockets or HTX slots, or both) so that end-users and/or OEMs can plug in barebones GPUs that use the main system's memory subsystem. Such a part would work like an integrated GPU, but it would be swappable for a new GPU down the line, and mobo costs would stay about where they are now.
Some folks might think that would make for lousy video memory performance, as is often the case with current integrated graphics solutions, but keep in mind that the industry has already shipped DDR3 SDRAM that runs at a stock DDR3-1600 and can be overclocked to DDR3-2000 speeds. It is not outside the realm of possibility that we will see DDR3-2500 or higher soon, which should offer at least decent memory bandwidth for a GPU attached via a CSI or HT link.
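To put some rough numbers behind that claim, here's a back-of-the-envelope bandwidth sketch. The formula (transfers per second × bytes per transfer × channels) is standard; the comparison figure for a mid-range discrete card is an assumption on my part, not a vendor spec.

```python
def ddr_bandwidth_gb_s(transfers_mt_s, bus_width_bits, channels=1):
    """Peak theoretical DRAM bandwidth in GB/s.

    transfers_mt_s: data rate in megatransfers per second (e.g. 2000 for DDR3-2000)
    bus_width_bits: width of one channel in bits (64 for standard DDR3 DIMMs)
    channels:       number of memory channels ganged together
    """
    bytes_per_transfer = bus_width_bits / 8
    return transfers_mt_s * 1e6 * bytes_per_transfer * channels / 1e9

# Dual-channel DDR3-2000 on 64-bit channels:
print(ddr_bandwidth_gb_s(2000, 64, channels=2))  # -> 32.0 (GB/s)
```

32 GB/s of shared system bandwidth is in the same ballpark as (or better than) the local memory bandwidth of typical low- and mid-range discrete cards of this generation, though the GPU would of course be contending with the CPU for it.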
Alternatively, we could see octal-core CPUs pushed to market with hybrid core layouts. Notice how many people have commented that we don't need 4, 8, or more general-purpose cores on anything but high-end number crunchers and servers? Well, that may be true, to a point . . . but why not build 1-4 specialized GPU cores into the die, precluding the need for semi-integrated barebones GPUs? Let's face it, most folks out there are having trouble utilizing the 4 general-purpose cores in their Q6600s as it is . . . if Intel continues to roll out advanced chips at low prices, say a $266 octal-core chip a year from now, what do you think the average user would rather get in that octal-core package: 8 general-purpose cores, or 4 general-purpose cores plus 4 GPU/vector-processing cores? OEMs would love it, researchers might be able to use the GPU cores for number crunching anyway (and might like them better than general-purpose cores), gamers would probably like it . . .
The fact remains that the most powerful GPU solutions will stay on their own cards, with dedicated memory and a memory controller, for some time to come. But it should be very easy and very cheap for companies like Intel and AMD to produce bare-bones add-in GPUs or multi-core CPU/GPU hybrids whose graphics performance is competitive with low- and mid-range video cards in any given generation of GPUs. Nvidia makes a lot of money selling low-end and mid-range cards to end-users and OEMs, and that is a lot of income to lose if Intel and AMD eat up that market share with integrated or semi-integrated GPUs.