The 360 fiasco definitely was a big part in this. It clearly showed the highest TDP they could reliably fit into a console-sized package- namely, slightly lower than that of the launch Xbox 360. Remember that even the hottest original incarnation of the 360 only drew 180W while gaming- tiny compared to a modern PC PSU!
Well, right, but an A8-7600 can go as low as 45W, and that includes CPU and GPU on one die . . . mind you, that is one third as many GCN cores as you get from the PS4 chip, at a lower clock speed. They totally could have done 3x A8-7600 in a 130W power envelope (see the quick arithmetic below), and those chips run pretty cool too. Whether or not they would have wanted to deal with a 3P monstrosity is an entirely different matter.
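For what it's worth, that arithmetic roughly checks out. A quick sketch, assuming the A8-7600's 384 GCN shaders (one third of the PS4's 1152, per the figure cited further down):

$$3 \times 384 = 1152 \text{ shaders}, \qquad 3 \times 45\,\mathrm{W} = 135\,\mathrm{W} \approx 130\,\mathrm{W}$$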
While console power consumption has fallen since the last generation, gaming PC power consumption has massively increased. The entire PS4 only consumes 140W at most- less than the TDP of a single HD 7870, and less than half the power consumption of an R9 290X.
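To put numbers on that comparison- assuming the commonly cited board powers, which aren't stated above (roughly 175W for the HD 7870 and roughly 290W for the R9 290X):

$$140\,\mathrm{W} < 175\,\mathrm{W}, \qquad 140\,\mathrm{W} < \frac{290\,\mathrm{W}}{2} = 145\,\mathrm{W}$$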
Gaming PC power consumption has increased massively on the GPU side (ignoring the latest Nvidia offerings). For anyone running an APU, power consumption is quite low, and you can still run at decent resolution/quality settings with a Kaveri chip. Based on what AMD has been able to do on the PC with Kaveri, it almost looks like the PS4 and Xbox One have overbalanced their APUs by using too many GCN cores. 1152 shaders? Yowza. That's got to be a huge slice of the 140W power consumption figure you listed.
TDP was never an issue. Cooling has also improved dramatically since then, plus there's throttling behaviour if it should overheat.
Well yes, but my claim was that it was more of a psychological issue than a technical issue. Some executive proclaims "no more thermal hardware failures! AND we're spending less on components this time!", and voilà, the engineers roll their eyes and spec a lower overall TDP so they can go in front of the brass and claim lower power consumption, lower temperatures, and lower hardware failure rates without having to improve the cooling solutions.
I have a feeling they went with the 8 slower cores due to PR. A 2M/4T chip just wouldn't look good on paper when you'd had a 3C/6T and an "8 core" Cell before. The cores are also so weak that a whole 2 of them end up dedicated to non-gaming tasks.
That may well have been a factor. History has shown us many examples of engineering decisions in the PC sector being made based on marketing hype (such as the old "MHz myth").
It's a real shame, though, because it impacts games in such a negative manner.
I'm still a little confused as to why Ubisoft is having so much trouble, though that issue is being well-discussed elsewhere, so I'll leave that to others.