But in the context of nV vs. AMD offerings, how do you justify putting a GTX460 instead of an HD6870 or say a GTX560 instead of an HD6950 2GB?
None of those parts are going to be used, as they are produced today, in a next gen console. There is zero chance. The largest chips we will see in the next generation of the HD twins will be 28nm parts.
Tessellation brings a major performance hit in games.
Tessellation brings a major performance hit in PC games. This seems to be a major point for non-gamers, or PC-exclusive gamers, to wrap their heads around. Tessellation, if built in with proper support on the consoles, is likely to be used for everything it reasonably can be, which is a significant portion of the scene data in any game. In the PC space we haven't seen anything that takes this approach, or anything close to it, in games. Console devs use the tools they are given far more effectively than the best PC devs could ever hope to; going by Carmack's guidelines you can expect a x00% performance boost on consoles doing the same tasks, with 100% being the low end. Highly optimized code designed to run on one exact platform will always significantly outperform generic code, always.
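To put that last point in concrete terms, here is a minimal C++ sketch of my own, not console or engine code, with x86 SSE standing in for whatever fixed vector unit the hardware actually ships with: a generic loop that has to run correctly anywhere, next to a hand-tuned path a console dev can always take because the hardware never changes.

// Minimal sketch of the "one exact platform" point. Illustrative only;
// x86 SSE is a stand-in for the console's actual vector unit.
#include <xmmintrin.h>   // SSE intrinsics
#include <cstddef>
#include <cstdio>
#include <vector>

// Generic path: has to run correctly on any CPU the game might meet.
void scale_generic(float* v, std::size_t n, float s) {
    for (std::size_t i = 0; i < n; ++i)
        v[i] *= s;
}

// Fixed-platform path: vector width, alignment and cache behaviour are
// known at ship time, so this route can always be taken.
void scale_fixed_platform(float* v, std::size_t n, float s) {
    const __m128 vs = _mm_set1_ps(s);
    std::size_t i = 0;
    for (; i + 4 <= n; i += 4) {
        __m128 x = _mm_loadu_ps(v + i);
        _mm_storeu_ps(v + i, _mm_mul_ps(x, vs));
    }
    for (; i < n; ++i)   // scalar tail
        v[i] *= s;
}

int main() {
    std::vector<float> a(1000, 1.0f), b(1000, 1.0f);
    scale_generic(a.data(), a.size(), 2.0f);
    scale_fixed_platform(b.data(), b.size(), 2.0f);
    std::printf("%f %f\n", a[0], b[0]);   // same result, very different cost at scale
    return 0;
}

On PC the engine either ships the generic path or has to detect the CPU at runtime; on a console the tuned path is the only path, which is a big part of where that x00% figure comes from.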
AMD don't supply MS with any chips; they sold the design and get royalties for each 360 sold. Putting the blame on AMD for MS's problems with their GPU is pure BS.
nVidia has never manufactured a GPU; TSMC does. By your precise logic, bumpgate wasn't nVidia's fault. I honestly don't see the RRoD issue as being AMD's fault; it is the XClamp that fails (I've fixed enough of them to know). But if you blame nVidia for bumpgate and you have any integrity, you have to blame AMD for the RRoD issues and the billions of dollars of losses they caused.
My hope is that the console designers are aiming to get 28nm parts. If that's the case, then AMD has a better track record of rolling out new designs at a price point that's palatable for inclusion in a console, say $125 for the graphics core or thereabouts.
Your pricing is off by ~$100. At that price a console maker would be spending the majority of the cost of the entire system on one chip; there is zero chance of that happening. People should keep that in mind when discussing these things: is it worth it for these companies to spend the R&D on these chips given the potential RoI?
I am guessing there is a 95% likelihood the next gen PS4/720 will have an HD5000 or HD6000 derivative GPU.
Forgot to mention this one earlier: there is somewhere between 0% and 0% chance of that happening. The next gen consoles will not launch with current parts.
The 8800 series launched less than a week before the PS3, and the RSX was based on the core that was nVidia's highest end offering a week prior to it shipping (it was scaled down). There is no chance of the next PS or XB hitting before the end of next year (and even then the odds are very low); the more likely scenario is 2013 or 2014. There is absolutely no chance whatsoever that the next gen consoles are going to be using current GPUs, none.