This has got to be one of the most biased threads I've ever read on AAT. Even if the company uses the 6670, it is going to do WORLDS better than a 6670 does in today's computers. Game devs won't need to worry about compatibility with different drivers or brands, or about how a game is going to scale; instead they can focus on making the game work for THAT specific piece of hardware.
We aren't comparing a PC with an HD6670 vs. a console with an HD6670. Of course a console will run games much faster, since developers optimize for the console. What we are talking about is that the PS3/360, relative to PCs at their respective launch dates, had roughly mid-range to upper-mid-range GPUs in them. I already linked in this thread that MS used a $141 R500-based graphics chip from ATI in their $399 Xbox 360. The "equivalent" to that GPU would be at least an HD7850 by the end of 2013. This is the main reason people are disappointed; it's a matter of perspective. HD6670-level performance, even HD7750-level, can already be had for $50-80 today on Newegg. Based on these rumours, MS is barely spending any of their Bill of Materials budget on the GPU. If these next-gen consoles also last 7-8+ years, this will directly impact all PC gamers. The faster the GPU in the consoles, the better it is for us: not because we are "competing with consoles," but because it would allow more developers to make much better looking games, since the 100-million installed console user base would be able to play them.
Again, this isn't about consoles vs. PCs or being biased against consoles. It's about how such a slow GPU in consoles may negatively impact game development overall, which in turn will affect us. Think about it: it wasn't long before the PS3 and Xbox 360 started to limit game engine development and adoption. It would be absurd to deny this. This is another reason why, even though the HD5870 was DX11-capable in Fall 2009, it has taken almost 3 years for DX11 games with real DX11 features to come out. The few developers that have put effort into next-generation graphics are those who wanted to make a name for themselves (Crytek, CD Projekt), those with $$ trees in their office (DICE), or small, passionate groups of programmers who picked up a publishing license (STALKER: COP, Metro 2033).
If next-gen consoles start out of the gate with gimped graphics hardware, it will surely affect us PC gamers negatively, since developers will be less likely to include killer next-generation graphical features such as bokeh DOF or tessellation when those would cripple the HD6670 in a console.
We have already seen that most games are developed for the lowest common denominator, i.e. the current consoles. This is why we are discussing how this may be very bad for us.
The problem is that it would still be lame by new-console standards. When the last console gen came out, they used top-end GPUs -- castrated to a 128-bit memory bus, but still top end. To match that pattern the 720 would need something like a 6970. A 6670 pales in comparison.
I couldn't have summarized this better. I wrote a huge post and you put it down succinctly. :thumbsup:
=============
"According to developers, Microsoft has shipped versions of the new Xbox hardware, due at the end of 2013, to programmers to start coding games for the system.
What they’ve apparently seen is a console with a 16-core IBM PowerPC CPU, a massive jump from the three-core CPU in the current Xbox, as well as an AMD Radeon HD 7000-series GPU. It also comes with a built-in Blu-ray player. The next version of Kinect supposedly requires 4 of those 16 CPU cores, so we know one reason why Microsoft has gone core-happy." ~
Source