The era of the APU has only just begun; give it a couple of years and 80% of people will no longer need dedicated cards for what they do on their computers.
It will take more than a couple of years. The external GPU will die when it becomes cheap enough to include enough RAM bandwidth while still allowing RAM expansion.
Though I think the low-end discrete GPU market will eventually die out.
It almost has. Adding displays, and/or getting driver features Intel doesn't offer, already accounts for a lot of it. If the IGP in Haswell Xeons gets some MCA support, you can expect the low-end Quadro and FirePro market to shrivel up, too: those cards only just added ECC, which Intel's will have "for free," so if Intel can get the app certs and add logged errors for the hardware, only the cards offering many times the IGP's performance will be worth it.
The days of MMOs have been and gone.
The next gen is going to push games more than ever, and that means more powerful GPUs.
There is no need to play games on cheap PCs when consoles are going to be rocking decent graphics, unless you want to play PC-only games.
...and they have a keyboard, mouse/trackball, and can have other programs of mine running in the background? You can't play the same game on a console, no matter what the graphics (in many cases, difficulty/balance mods can come to the rescue for multiplatform games). Some genres just don't work as well with a pad, and/or aren't as fun with an aimbot, and so on and so forth. Some games could benefit, though: imagine Dwarf Fortress with the streamlined menu system that gamepad use would require.
A decent GPU has 3-5x the power requirements of the fastest desktop CPU, and that will always be the case. BF3 looks OK today, but in three years' time it will look like crap compared to what's new on the market. You will never fit all that power inside an APU.
Show me the DIMMs. Cheap RAM technology is already a limiting factor for current APUs, and there's no sign of that ending on the horizon, unless AMD pulls some kind of coup along with the XB720, offering API-exposed embedded RAM on COTS x86 processors, or something like that. Failing that, we will need the next major DRAM spec to improve bandwidth several times over (2x every ~4 years is just keeping up with the increasing requirements of CPUs alone) before it becomes feasible. It will happen, but the technology to make cards obsolete is not apparent. The win, for now, is not having to buy a card due to the IGP being total crap.
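The bandwidth gap is easy to put rough numbers on. A quick back-of-envelope sketch (theoretical peak figures from spec sheets; the specific parts chosen, dual-channel DDR3-1600 and a 256-bit GDDR5 card, are just illustrative):

```python
# Back-of-envelope peak memory bandwidth, GB/s.
# peak = transfers/s * bytes per transfer * channels
def peak_bandwidth_gbps(mt_per_s, bus_width_bits, channels):
    return mt_per_s * (bus_width_bits / 8) * channels / 1000

# Dual-channel DDR3-1600: what a typical APU has to feed CPU *and* IGP.
apu = peak_bandwidth_gbps(1600, 64, 2)    # 25.6 GB/s
# Midrange discrete card: 256-bit GDDR5 at 5 GT/s effective.
gpu = peak_bandwidth_gbps(5000, 256, 1)   # 160.0 GB/s
print(apu, gpu, gpu / apu)                # 25.6 160.0 6.25
```

A ~6x gap against a mere midrange card is why doubling DRAM bandwidth every spec generation doesn't close the distance.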
Solve the bandwidth/pin problem in a way that doesn't require coding to one specific HW platform, and we'll get there quickly (and maybe Intel has some driver magic that can do that with their DRAM). Until then, it will remain around the next corner, and the next, and the next...
We used to have a math co-processor; history teaches well.
So, you think we'll have a video card that, when plugged in, replaces the CPU?