HD6850 uses around 100W at load.
HD6870 uses around 130W in a game.
Source
If they shrink either of those designs to 28nm, you could easily get a GPU way more powerful than the low-end HD5770 and still keep GPU power consumption under 100W.
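A quick back-of-envelope check of that claim, assuming a full node shrink (40nm to 28nm) cuts load power by roughly 40% at constant performance. That 0.6 factor is an assumption for illustration, not a guaranteed figure:

```python
# Back-of-envelope estimate of load power after a 40nm -> 28nm
# shrink. The 0.6 scaling factor is an assumption: a full node
# shrink has historically cut power by roughly 40% at constant
# performance, but the real figure depends on the process.
POWER_SCALE_40_TO_28 = 0.6

def shrunk_power(watts_40nm, scale=POWER_SCALE_40_TO_28):
    """Estimated load power of the same design ported to 28nm."""
    return watts_40nm * scale

for name, watts in [("HD6850", 100), ("HD6870", 130)]:
    print(f"{name}: ~{shrunk_power(watts):.0f}W at 28nm")
```

Even the HD6870 lands well under 100W under that assumption, which is the point: a shrunk Barts-class GPU fits a console power budget.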
They don't need anything that powerful. When you program a game for a console, you know that every single console will have the exact same hardware. This lets you eke out a lot of additional performance because you can target very specific hardware. Look at a game like Uncharted 3. It looks amazing. Now look at the hardware that's running it.
That's wishful thinking. Bulldozer hasn't even launched on 32nm. AMD has had way too many problems just getting a Bulldozer core out on its own, so you suddenly think they'll be able to add a GPU inside a Bulldozer in 2012 on 32nm? Btw, Trinity is still a ways out (probably won't launch until 2013), and that's if it doesn't get delayed.
Trinity is scheduled for H1 2012. Whether it slips remains to be seen. The next generation of consoles from Microsoft and Sony is a ways from launching. Next year at E3 is probably the first time we'll hear about either, and they won't ship for another year after that. Plenty of time for GF's process to mature, assuming they use GlobalFoundries to make their chips.
Bulldozer is a 32nm CPU. Next-generation GPUs are going to be 28nm. How do you do a die shrink on a 32nm CPU and a 28nm GPU? You'd need to move both to 22nm, and that's not happening until 2013 if you take both into account.
If they did build an APU, it would be from the ground up. Anything that they make will be a custom design. It could even be some kind of strange design where it's a BD module that has 4 BD cores fed by a single front end. It's not going to be an existing product and whatever they might make will be designed for a single process, most likely the 28 nm TSMC process because that will be the most mature at the time.
You missed my point entirely. A $500 console costs that much because the components inside it are expensive. If they put a $50 GPU into the console, what exactly is going to cost the other $450? Blu-Ray drives no longer cost $230 (the price of a Blu-Ray drive when the PS3 debuted). They can squeeze a $120-150 GPU design into that price, so it makes no sense to put such a cheap GPU into a 2012 PS4 or Xbox3. Also, please explain why I wouldn't put an HD6800M into a PS4 instead? My $400-500 console budget easily covers it, and that GPU doesn't break any power consumption limits.
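To make the budget argument concrete, here's an illustrative split. Every component cost below is a made-up placeholder for the sake of argument, not a real BOM; the point is just how much room a ~$450 retail price leaves for the GPU once everything else is covered:

```python
# Illustrative bill-of-materials split for a ~$450 console.
# All component costs here are assumed placeholder values, not
# real figures; only the arithmetic matters for the argument.
retail_price = 450
components = {
    "CPU": 80,
    "Blu-Ray drive": 40,  # far below the ~$230 it cost at PS3 launch
    "RAM": 50,
    "storage": 40,
    "PSU, board, case, assembly": 90,
}
gpu_budget = retail_price - sum(components.values())
print(f"Left over for the GPU: ${gpu_budget}")
```

Under those assumed numbers, $150 is left for the GPU, which is exactly the $120-150 class of part being argued for.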
As Sony found out, a $600 console hurts sales quite a bit, and as Microsoft found out, cutting corners can also blow up in your face. These consoles won't need high-end graphics cards. They're going to be used to run games at 60 fps at 1080p, and that doesn't take a lot of power.
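The raw pixel throughput implied by that target is simple arithmetic (shader cost per pixel varies enormously by game, so this is only the floor, not the whole story):

```python
# Raw pixel throughput implied by a 1080p / 60 fps target.
# This is only a lower bound: per-pixel shading cost, overdraw,
# and post-processing multiply the real workload per frame.
width, height, fps = 1920, 1080, 60
pixels_per_second = width * height * fps
print(f"{pixels_per_second / 1e6:.1f} Mpix/s")  # 124.4 Mpix/s
```

Roughly 124 million pixels per second is well within reach of a mid-range GPU, which supports the argument that a high-end part isn't required.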
If AMD could have easily put an HD5770 GPU inside Llano, then Llano wouldn't have debuted with a 400 SP 6620 GPU. It's obviously very expensive and not economically feasible on the 32nm process. In fact, AMD wasn't even able to fit a full-fledged Phenom II X4 core with L3 cache alongside a GPU; they had to use a smaller Athlon II X4 core to keep the die size from ballooning.
It turns out that putting in anything more than they did would have resulted in a system so bottlenecked that the extra SPs would have yielded minimal performance gains. Additionally, you don't want to design big chips on a new process if you can help it; it really eats into the yields.
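The yield penalty for big dies can be sketched with the standard Poisson yield model. The defect density used here (0.5 defects/cm^2) is an assumed illustrative value; an immature process runs far higher than a mature one, which is exactly why large dies early in a process's life are so costly:

```python
import math

# Poisson yield-model sketch of why big dies on an immature
# process hurt. The defect density (0.5 defects/cm^2) is an
# assumed illustrative value, not a real process figure.

def die_yield(area_cm2, defects_per_cm2):
    """Fraction of dies with zero defects under a Poisson model."""
    return math.exp(-defects_per_cm2 * area_cm2)

for area in (1.0, 2.0, 3.0):
    print(f"{area:.0f} cm^2 die: {die_yield(area, 0.5):.0%} yield")
```

Yield falls exponentially with area, so tripling the die size drops the fraction of good dies from about 61% to about 22%, and the cost per working chip rises even faster than the area does.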
However, by the time Sony and MS start building their new consoles, the processes will be more mature and the design constraints that limited Llano won't be in place.
How is it worth it? It'd be more cost effective to design a CPU + GPU on an embedded two-die package instead. The cost of increasing die size with a modern GPU + CPU grows exponentially. Did you not see that the Xbox 360 went through 5 revisions from the 90nm process before it even became cost effective to shrink the two components into one package? If you start off with an APU design, you severely limit both your CPU and GPU performance right off the bat (not to mention your TDP envelope is no more than 150W). There is no reason console makers need to be constrained by these limits. Consoles are sold at a loss, with game and accessory sales making up the difference over time. From that perspective, both MS and Sony will want to put the fastest possible CPU + GPU into the console. If you start with an APU design, you literally start with the slowest GPU you could put in. Sure, one of the consoles may choose to go that route, but it would be a huge mistake imo.
Not really. Manufacturing silicon isn't terribly expensive once you get the ball rolling. Also, since these are chips for a console, they don't need to be as large or complicated as the ones driving PC gaming rigs. A single chip makes the board design simpler and the cooling system simpler. Considering that the chip can be designed differently from current APUs and make assumptions such as a wide-ass memory bus connected to GDDR5 memory, the current problems with APUs don't need to exist.
Microsoft and Sony sell their consoles at a loss, and it really hasn't worked out that well for them. Nintendo has beaten both and has probably made money on each console sale. Until we move to larger televisions, the performance target for consoles isn't moving beyond a certain point, and the longer MS or Sony wait, the cheaper it will be to deliver that performance.
I don't expect Microsoft or Sony to release anything above $400. I think that they've both learned that there's a price ceiling for consoles and that going above it can really hurt sales. The only reason to surpass that point is because the initial stock of devices is extremely limited.
Let's not forget that game developers have a say in this too. They've spent the last 4-5 years learning how to utilize IBM's CPUs on the PS3 and Xbox 360. They'd likely want a more powerful IBM CPU so they can keep leveraging what they've learned. If you start with Bulldozer, you're starting from scratch all over again.
Yes, just like they worked on an Intel chip for the Xbox and the Emotion Engine for the PS2 before that. They'll learn new architectures and like every console generation before them, they'll get better at utilizing the newest generation as time goes on. It's not as though they haven't gone through this before.