AMD GPUs to power all 3 major next-gen consoles?

podspi

Golden Member
Jan 11, 2011
1,982
102
106
Yes, very surprising. On the other hand, AMD is probably willing to play hardball to get business. Interesting that the PS4 might use a BD-based APU. I guess Sony actually wants to make some $$$ off of these this time around...
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
Nvidia screwed themselves over by not budging on the pricing of their chips for the original Xbox. I could have sworn the next-gen PlayStation would be using Nvidia-based graphics, though; perhaps that isn't the case any longer.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I believe Nintendo already confirmed that the Wii U has an RV770-style GPU in it. Chances are that Microsoft is going to continue with AMD as well. That only leaves Sony as the "unknown". Since consoles have limited cooling space and the launch Xbox 360 was heavily criticized for being loud, I believe there will be an added emphasis on power efficiency/heat. In this case, both AMD 5000 and 6000 series chips are simply superior in terms of performance/watt compared to the GTX 4xx/5xx series. Therefore, it only makes logical sense that AMD is the preferred supplier for consoles in the next generation.

In addition, NV's higher-end chips need a more costly 320/384-bit memory bus, which will add significant cost to the console motherboard design. It would be far cheaper to implement a 256-bit memory interface with GDDR5, which again AMD provides.
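To put some rough numbers behind the bus-width point: each GDDR5 chip typically provides a 32-bit slice of the bus, so a wider bus means more memory chips and more PCB traces to route. A quick sketch (data rate and chip width are illustrative assumptions, not figures from any actual console design):

```python
# Rough sketch of why bus width drives board cost: each 32-bit GDDR5
# chip occupies one "lane" of the bus, so a wider bus means more chips
# and more traces. The 4 Gbps effective data rate is a hypothetical,
# era-plausible figure.

def memory_config(bus_width_bits, per_chip_bits=32, data_rate_gbps=4.0):
    """Return (chip count, peak bandwidth in GB/s) for a GDDR5 bus."""
    chips = bus_width_bits // per_chip_bits
    bandwidth_gbs = bus_width_bits / 8 * data_rate_gbps  # bytes per transfer * rate
    return chips, bandwidth_gbs

for width in (256, 320, 384):
    chips, bw = memory_config(width)
    print(f"{width}-bit bus: {chips} chips, {bw:.0f} GB/s peak")
```

So going from 256-bit to 384-bit buys bandwidth, but at the cost of roughly 50% more memory chips and routing, which is exactly the kind of per-unit cost console makers fight over.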

Either way, it's far more important to see how powerful the GPUs will be, rather than what vendor will provide them. I mean if they put an HD6770 or an HD6550 as part of the APU into the PS4, I will not be impressed, regardless of how efficient that chip is. So I am waiting until exact specs are released. At least if all 3 GPUs will be provided by AMD, there will be no more fanboy arguments over which GPU is faster since you would be able to compare them across AMD architectures :)

It is unfortunate that the consoles are launching during a "stagnant GPU" period, though. GPUs haven't really improved in performance that much since the HD5870 was released in Sept 2009. I am still hoping either Sony or Microsoft squeezes a 28nm Kepler or 28nm HD7000 series chip into their 2012-2013 consoles, even if it is one of the mid-range offerings (given the limitations on power consumption). But that's probably unlikely considering development is well under way for both. I am guessing there is a 95% likelihood the next-gen PS4/720 will have an HD5000 or HD6000 derivative GPU.
 
Last edited:

Voo

Golden Member
Feb 27, 2009
1,684
0
76
Whoa, IBM still makes CPUs?
And they still make more than enough money from that, plus the support contracts for the hardware and its software. They just got out of the consumer market while they still could - much higher margins elsewhere.

As for which chip will finally land in it - I'd think that depends pretty much solely on who offers the best deal ("you can save $X per console, or use this vastly superior architecture" - who wants to guess which way the decision makers will go?). As a consequence, nobody makes much money on these contracts - look at how much AMD got out of its deals for the current-gen consoles.
 

tommo123

Platinum Member
Sep 25, 2005
2,617
48
91
This time round, I just hope the consoles have an output that can feed my 30" monitor.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,393
8,552
126
Whoa, IBM still makes CPUs?

yup, consoles and big iron mostly.

Power and PowerPC chips are found all over the place. Freescale does most of the embedded work, though.
 

Madcatatlas

Golden Member
Feb 22, 2010
1,155
0
0
You're referring to the Red Ring of Death (RROD), I think!?

Not sure what caused that, but I'd hate to see an Nvidia bumpgate-style failure, or a wave of Nvidia card-killing driver releases, in next-gen consoles.

The fact of the matter is that AMD's Radeon 5xxx and 6xxx series, and very likely the 7xxx series, are superior to Nvidia's GeForce in power usage/heat output/efficiency and, of course, price/performance.

You don't have to be a genius to understand who gets the contracts with such clear-cut advantages...

Of course, things are not that easy; Nvidia might make Sony or MS a "deal they cannot refuse"...
 

tommo123

Platinum Member
Sep 25, 2005
2,617
48
91
Of course, things are not that easy; Nvidia might make Sony or MS a "deal they cannot refuse"...

Wouldn't that mean removing a ton of transistors that exist for CUDA reasons? It would need to, wouldn't it, to get anywhere near AMD in thermals and performance/watt?
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Since consoles have limited cooling space and the launch Xbox 360 was heavily criticized for being loud, I believe there will be an added emphasis on power efficiency/heat. In this case, both AMD 5000 and 6000 series chips are simply superior in terms of performance/watt compared to the GTX 4xx/5xx series. Therefore, it only makes logical sense that AMD is the preferred supplier for consoles in the next generation.

In addition, NV's higher-end chips need a more costly 320/384-bit memory bus, which will add significant cost to the console motherboard design. It would be far cheaper to implement a 256-bit memory interface with GDDR5, which again AMD provides.

I will have to disagree,

Nvidia can make a Fermi GF110 derivative chip with no 64bit FP support (consoles don’t need it) and 256bit memory that will have much better performance/watt and cost less than current GF110 chips.

And don't forget that the GF100/110 design is superior in performance/watt in DX11 tessellation, and that makes them much more desirable for next-gen console designs.

So the decision to use an AMD chip could be more about a better deal than performance/watt.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I will have to disagree,

Nvidia can make a Fermi GF110 derivative chip with no 64bit FP support (consoles don’t need it) and 256bit memory that will have much better performance/watt and cost less than current GF110 chips.

And don't forget that the GF100/110 design is superior in performance/watt in DX11 tessellation, and that makes them much more desirable for next-gen console designs.

So the decision to use an AMD chip could be more about a better deal than performance/watt.

I would assume that it's both, a better deal (it would have to be) and better perf/W. This is the first time I've seen anyone try and make nVidia's tessellation performance somehow translate into their chips being more efficient than AMD's.

Look at it this way. Both companies make chips that are capable of doing the job. AMD's are smaller and use less power. They are therefore, cheaper to buy and use.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I would assume that it's both, a better deal (it would have to be) and better perf/W. This is the first time I've seen anyone try and make nVidia's tessellation performance somehow translate into their chips being more efficient than AMD's.

Look at it this way. Both companies make chips that are capable of doing the job. AMD's are smaller and use less power. They are therefore, cheaper to buy and use.

Well, I believe next gen consoles will use Tessellation and in that department GF110 clearly has the advantage.
 

AnandThenMan

Diamond Member
Nov 11, 2004
3,991
626
126
I will have to disagree,

Nvidia can make a Fermi GF110 derivative chip with no 64bit FP support (consoles don’t need it) and 256bit memory that will have much better performance/watt and cost less than current GF110 chips.
If this is true, then why didn't Nvidia do this in the first place?
And don't forget that the GF100/110 design is superior in performance/watt in DX11 tessellation, and that makes them much more desirable for next-gen console designs.
I have never seen a chart that correlates this, got a link?
So the decision to use an AMD chip could be more about a better deal than performance/watt.
Key word is could. It could also be because the overall package is better, meaning cost/performance/watt.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
I believe Nintendo already confirmed that the Wii U has an RV770-style GPU in it. Chances are that Microsoft is going to continue with AMD as well. That only leaves Sony as the "unknown". Since consoles have limited cooling space and the launch Xbox 360 was heavily criticized for being loud, I believe there will be an added emphasis on power efficiency/heat. In this case, both AMD 5000 and 6000 series chips are simply superior in terms of performance/watt compared to the GTX 4xx/5xx series. Therefore, it only makes logical sense that AMD is the preferred supplier for consoles in the next generation.

In addition, NV's higher-end chips need a more costly 320/384-bit memory bus, which will add significant cost to the console motherboard design. It would be far cheaper to implement a 256-bit memory interface with GDDR5, which again AMD provides.

Either way, it's far more important to see how powerful the GPUs will be, rather than what vendor will provide them. I mean if they put an HD6770 or an HD6550 as part of the APU into the PS4, I will not be impressed, regardless of how efficient that chip is. So I am waiting until exact specs are released. At least if all 3 GPUs will be provided by AMD, there will be no more fanboy arguments over which GPU is faster since you would be able to compare them across AMD architectures :)

It is unfortunate that the consoles are launching during a "stagnant GPU" period, though. GPUs haven't really improved in performance that much since the HD5870 was released in Sept 2009. I am still hoping either Sony or Microsoft squeezes a 28nm Kepler or 28nm HD7000 series chip into their 2012-2013 consoles, even if it is one of the mid-range offerings (given the limitations on power consumption). But that's probably unlikely considering development is well under way for both. I am guessing there is a 95% likelihood the next-gen PS4/720 will have an HD5000 or HD6000 derivative GPU.

These GPUs are based on a design that is in the wild, but they aren't replicas. Meaning Nvidia or AMD could build a memory subsystem around the GPU that works for the console makers. I believe the ATI chip in the Xbox 360 has high-speed eDRAM attached to it. Or they can add or remove features that are desired or not needed.

Considering the console-centric nature of video game development, I hope that if AMD wins all three designs, they improve tessellation and physics performance. Otherwise we can simply forget great leaps forward on the PC side for a decade.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
If this is true, then why didn't Nvidia do this in the first place?

I'm talking about consoles, not PCs or workstations, and as I have said before, I believe next-gen consoles will use tessellation; that's why I mentioned it.

AMD cut 64-bit FP from the HD68xx series and installed 57xx-series memory controllers in order to get the chip smaller and get better performance/watt than the 58xx series; the same can be done with GF110/114 for console use.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Well, I believe next gen consoles will use Tessellation and in that department GF110 clearly has the advantage.

That advantage only comes into play in the GTX470/480/570 and 580. The lower-end chips have no tessellation advantage over the 6900 series because they have far fewer tessellation engines. The problem is that NV's chips with the tessellation performance consume a lot more power to begin with. The HD6950 2GB consumes less power than even the GTX560 Ti.

It gets even worse for NV from there as we go down the food chain. The HD6870 consumes about the same power as the GTX460 but has superior performance to begin with. Alternatively, the HD6850 has similar performance to the 460, but consumes less power.

Also your argument about removing double-precision and reducing bandwidth can also apply to AMD cards. AMD's current and last generation is simply superior in performance per watt.

From a cost perspective: if we assume either AMD or NV were to shift current generation of chips to 28nm, you'll get even higher yields with a much smaller AMD die. That will reduce your manufacturing costs as well.
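The yield point can be sketched with the standard Poisson defect model, where yield falls off exponentially with die area at a given defect density. All numbers below are hypothetical, chosen only to illustrate why a smaller die is cheaper to manufacture on an immature node:

```python
# Back-of-envelope die-yield sketch using the Poisson defect model:
# yield ~= exp(-defect_density * die_area). The defect density and die
# areas are hypothetical illustration values, not real foundry data.

import math

def die_yield(area_mm2, defects_per_mm2):
    """Fraction of dies expected to be defect-free."""
    return math.exp(-defects_per_mm2 * area_mm2)

d0 = 0.005  # hypothetical defects per mm^2 on a young process
small = die_yield(255, d0)  # roughly a mid-size AMD die
large = die_yield(520, d0)  # roughly a big GF110-class die
print(f"smaller die yield: {small:.0%}, larger die yield: {large:.0%}")
```

Even with identical defect rates, the smaller die yields several times better per wafer, before you even count that more small dies fit on each wafer in the first place.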

AMD simply has superior performance per watt at nearly every price level below a GTX580. For the consoles this is probably the most important metric since fitting a GTX570/580 is just not an option based on their power consumption.

The other key advantage AMD has is a track record of implementing eDRAM which enhances AA performance. NV hasn't shown that they can do this (which I am sure they can). But, the reality is that AMD has a track record of successfully designing custom GPUs for consoles. What about the RSX? That's just a cut-down 7900/7950GT with half the memory bandwidth of the desktop version. NV didn't even bother doing any optimization for AA on that chip (so PS3 got stuck with a slower GPU 1 year out of the gate compared to the 360!).

I hope if AMD wins all three designs they improve tesselation and physics performance. Or we can simply forget great leaps forward on the PC side for a decade.

We are probably at least 3-4 full GPU generations away from being able to use PhysX and Tessellation properly in videogames. PhysX might as well not exist since it's a proprietary standard. That just leaves us with Tessellation. On that front, it would make little difference for PC game development if they put a GTX580 or an HD6970 in the PS4. NV needed 3 GTX480s to run their lighthouse tessellation demo....
 
Last edited:

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Well, I believe next gen consoles will use Tessellation and in that department GF110 clearly has the advantage.

Isn't the tessellator a separate part of the graphics chip from the core shaders, though? So if tessellation is an issue, it shouldn't be too hard for AMD to put an improved tessellator in the chip.