You can easily fit an AMD APU + a discrete GPU into a console for hybrid CrossFire. The HD7950M is a slightly underclocked desktop HD7870 2GB part and uses only 50W of power.
http://www.notebookcheck.net/AMD-Radeon-HD-7950M.72676.0.html
With an AMD APU and a discrete mobile HD7950M you'd come in well below 170W of power consumption. The question is not whether it's physically possible but whether it's cost effective to do so.
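For a rough sanity check on that claim, here's a minimal back-of-the-envelope sketch of the power budget. Every component wattage below is my own ballpark assumption (APU TDP class, memory and board overhead), not a measured or official figure:

```python
# Rough console power-budget sanity check.
# All component wattages are ballpark assumptions for illustration, not official specs.
components_watts = {
    "AMD APU (CPU + IGP, ~65W TDP class)": 65,
    "Discrete mobile HD7950M-class GPU":   50,
    "GDDR5 memory":                        10,
    "Motherboard, storage, fans, I/O":     20,
}

total_draw = sum(components_watts.values())
budget = 170  # the 170W ceiling discussed above

for part, watts in components_watts.items():
    print(f"{part:40s} {watts:4d} W")
print(f"{'Total (estimated)':40s} {total_draw:4d} W  (budget: {budget} W)")
print("Fits within budget" if total_draw <= budget else "Over budget")
```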
The launch Xbox 360 used 180W of power in gaming. The launch PS3 used 240W of power in gaming.
The 50W HD7950M is comparable to the 45W GeForce Go 7900 GTX, which is very close to the RSX GPU used in the PS3.
You guys need to take into account that you cannot compare the power consumption of desktop GPUs to mobile GPUs. Mobile parts are an entirely different breed as they are binned (think of binned Intel Core i5/i7 Ivy Bridge chips).
With 4GB of GDDR5, a GTX680M with its 1344 SPs @ 720MHz uses just 100W of power. Would a desktop GTX670 underclocked to 720MHz use just 100W of power? Not a chance.
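The reason underclocking alone isn't enough comes down to how dynamic power scales: roughly frequency times voltage squared, so a binned mobile chip that also drops its core voltage saves far more than a desktop chip that only lowers clocks. A toy illustration, with made-up clock/voltage numbers just to show the scaling (not real GTX670/680M operating points):

```python
# Toy illustration of dynamic power scaling: P_dyn ~ f * V^2
# (the capacitance term cancels when comparing the same silicon).
# Clocks and voltages below are made-up example values, not real GPU specs.

def relative_dynamic_power(freq_mhz: float, volts: float,
                           base_freq_mhz: float, base_volts: float) -> float:
    """Dynamic power relative to a baseline operating point."""
    return (freq_mhz / base_freq_mhz) * (volts / base_volts) ** 2

BASE_F, BASE_V = 980.0, 1.175   # hypothetical desktop operating point

scenarios = {
    "Desktop clocks/voltage":                  (980.0, 1.175),
    "Underclocked only (desktop voltage)":     (720.0, 1.175),
    "Binned mobile (lower clock AND voltage)": (720.0, 0.95),
}

for name, (f, v) in scenarios.items():
    rel = relative_dynamic_power(f, v, BASE_F, BASE_V)
    print(f"{name:42s} -> {rel:.2f}x of baseline dynamic power")
```

Dropping the clock alone gets you to roughly 0.73x of baseline dynamic power; dropping clock and voltage together gets you to roughly 0.48x, which is why binning matters so much.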
If anyone is going to estimate the power consumption of next-generation console GPUs, you have to start with mobile parts, since neither MS nor Sony will ever put a desktop GPU in a console, and no desktop GPU was used last generation either. Mobile GPUs get entirely different bins, voltages, and clocks, and halving the memory bus is always on the table to curb power consumption (exactly what was done with RSX, dropping it to 22.4 GB/s versus 44.8 GB/s for the desktop variant).
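Those bandwidth numbers follow directly from the bus width: peak bandwidth is just bus width in bytes times the effective transfer rate. A quick sketch, assuming the ~1.4 GT/s effective GDDR3 rate implied by the 22.4/44.8 GB/s figures above:

```python
# Peak memory bandwidth = (bus width in bytes) * (effective transfer rate).
# The 1.4 GT/s effective GDDR3 rate is assumed here to match the
# 22.4 / 44.8 GB/s figures quoted above.

def peak_bandwidth_gbps(bus_width_bits: int, transfer_gtps: float) -> float:
    return (bus_width_bits / 8) * transfer_gtps

EFFECTIVE_RATE = 1.4  # GT/s (700 MHz GDDR3, double data rate)

print(f"RSX, 128-bit bus:     {peak_bandwidth_gbps(128, EFFECTIVE_RATE):.1f} GB/s")  # 22.4
print(f"Desktop, 256-bit bus: {peak_bandwidth_gbps(256, EFFECTIVE_RATE):.1f} GB/s")  # 44.8
```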
The GPUs that end up in consoles are not only highly binned mobile-class chips, but custom-made to extract even higher power efficiency.
I am not saying that an APU will be used, but power consumption is definitely not the limitation. Based on the estimated power consumption of the RSX GPU in the PS3, it doesn't take much math to see that the PS3's Cell was not really more power efficient than AMD's current APUs, and Sony had no problem dealing with its power draw. No matter what happens, Sony can hardly end up with an inferior console on the CPU side, since even a $60 AMD CPU will mop the floor with Cell in modern gaming code. The key is going to be the GPU, not the CPU. Until we know what GPUs will end up in the next Xbox or the PS4, there isn't much point discussing the CPU side. As we saw with PS3 vs. Xbox 360, a Cell CPU that was superior on paper counted for squat: the PS3 fell flat on its face 90% of the time because it was GPU-limited anyway.
This is a theme that persisted for most of the PS3's life over the last 6 years:

"Both games aim for 30 frames per second, dropping v-sync if the target is not met - and it's immediately apparent that it's the PS3 version that has the most issues in maintaining that goal."

Even the best CPU in the world cannot save you if your GPU is slower, and it showed with PS3 vs. 360 in probably 90% of console ports.
Whichever console has the better GPU setup next generation will have the best graphics most of the time. The key to the best graphics is a discrete/dedicated GPU, which is why I see a discrete GPU as a must for next-generation consoles unless MS and Sony plan to sell them for $299. Not going with a discrete GPU is suicide, because if the competitor does, your console is toast even with AMD's best APU. That's a lot of risk to take. You could end up with a 384-shader Trinity going up against a 1280 SP Pitcairn, and then you might as well quit unless you are Nintendo.
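To put that shader-count gap in perspective, theoretical single-precision throughput is roughly shaders x 2 ops x clock. The clocks below are my own assumptions (Trinity's IGP at around 800MHz, a console-binned Pitcairn held to a conservative 800MHz), just to show the order-of-magnitude difference:

```python
# Theoretical single-precision throughput: shaders * 2 ops (FMA) * clock.
# Clock speeds are assumptions for illustration, not confirmed console specs.

def peak_gflops(shader_count: int, clock_ghz: float) -> float:
    return shader_count * 2 * clock_ghz

gpus = {
    "Trinity IGP (384 shaders @ ~0.8 GHz)":     (384, 0.8),
    "Pitcairn-class (1280 shaders @ ~0.8 GHz)": (1280, 0.8),
}

for name, (shaders, clock) in gpus.items():
    print(f"{name:44s} ~{peak_gflops(shaders, clock):6.0f} GFLOPS")
```

Even at the same clock, that's roughly a 3x gap in raw throughput before you even account for bandwidth.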
I can see the GPU and CPU on the same package, as in the Wii U, but it won't be exclusively an APU (i.e., CPU+GPU on the same die like Ivy Bridge or Trinity). AMD's current-generation APU is just not fast enough at 384 shaders to last another 6-7 years. Another reason it won't be exclusively an APU is that the Wii U never used one; knowing how cost-conscious Nintendo is, I am sure they evaluated the possibility, so it's doubtful MS and Sony will take such a compromised approach. And finally, probably the most important reason it won't be exclusively an APU is that AMD has nothing between Trinity and Kaveri. Kaveri is only expected to have 512 GCN shaders, which is still fewer than the 640 in an HD7770. It's also supposed to have Steamroller CPU cores, but that CPU is unlikely to launch in 2013 based on current rumors of delays to 2014.
MS could just as easily go with a more advanced PowerPC architecture, like Nintendo did.