Could the next Xbox use an AMD APU?

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
No rumors or anything, just an idea for discussion.

The performance of Llano is clear and, despite slow performance on the CPU side due to outdated architecture, the graphics performance is extraordinary when compared to traditional integrated graphics or even Sandy Bridge's HD Graphics 3000. If equipped with a newer, better CPU architecture, do any of you think an AMD APU would be a good choice for the next Xbox?

The benefits of the AMD APU are:
Cost-effective due to the components being on one die, and being from the same designer. While not a top priority in a high-end gaming PC, it is important when trying to make a gaming console the average consumer can afford.
Power-efficient -- very important considering the 360's record of failing due to overheating, and being noisy in general.

The downsides are:
Weak CPU architecture (which again, could be resolved by a new architecture. C'mon, Bulldozer...)
Low memory bandwidth. However, this may not be as much of an issue as on a PC. The Xbox 360 already shares its 512 MB of GDDR3 memory between the Xenon processor and Xenos graphics chip. On a fixed system like a gaming console, the system memory can be optimized for gaming. To further remedy the issue, an eDRAM chip could be added as a daughter die to the APU, similar to how the Xenos has 10 MB of eDRAM.

I think it's very possible that the next Xbox will use an APU, if not one entirely designed by AMD. Microsoft certainly sees the benefit to such a design even now -- the current Xbox 360 S SKU has the processor and GPU integrated onto the same die.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
I could see fusion being on the next Xbox...
But it would probably be a custom design, owned by Microsoft, based on existing AMD designs.
 

(sic)Klown12

Senior member
Nov 27, 2010
572
0
76
My guess is that the only way this would happen is if Microsoft licenses a GPU from AMD and a CPU from IBM, and works with AMD to integrate the GPU and CPU as seamlessly as possible. Then they take the finished design to whoever will fabricate it the cheapest. I really can't see MS going back to an x86 design after their first experience.
 
Dec 30, 2004
12,553
2
76
Fusion has no place in the consoles IMHO. Just doesn't fit. It's for low-power devices, not 250-watt monstrosities.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Current Fusion chips are aimed at low-power devices, with a GPU no bigger than a Radeon HD 5670's. The GPU in a console APU would be much more powerful. 250 watts is small potatoes compared to what high-end or even mid-range PCs can draw, so power efficiency would be very much desired.
 

Via

Diamond Member
Jan 14, 2009
4,670
4
0
I don't know why console makers don't come up with some sort of upgradable GPU. Just make it a slot-type deal that makes the GPU easily exchangeable.

Programmers could then allow for different settings like they do on the PC.

And Microsoft, Sony, and Nintendo would make billions of dollars more off of the console crowd over the life of the system.
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
I don't know why console makers don't come up with some sort of upgradable GPU. Just make it a slot-type deal that makes the GPU easily exchangeable.

Programmers could then allow for different settings like they do on the PC.

And Microsoft, Sony, and Nintendo would make billions of dollars more off of the console crowd over the life of the system.

exactly, like a PCIe external slot such as some laptops have.

And it wouldn't be the first time consoles have been upgradeable; the N64 had expandable memory through its Expansion Pak.

There is a huge multi-million-dollar market for this if implemented correctly.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Backwards compatibility would be my guess. With game consoles any user-dependent variables could be disastrous.
 

Via

Diamond Member
Jan 14, 2009
4,670
4
0
You're probably right, but in a closed system it seems like that wouldn't be too hard to control.

Besides - we deal with that on a daily basis as PC gamers, and we still love our platform.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Yeah, that's why we have PCs. (Most) console gamers just want to plug in the console to a wall socket and TV, turn it on, pop in the disc and play.

Anyways, it's a moot point right now, since none of the current consoles have provisions for hardware expansion past USB ports.
 

mnewsham

Lifer
Oct 2, 2010
14,539
428
136
They would never allow user-changeable graphics, period, unless it is a simple change (like 512MB to 1GB for higher resolutions). The fact of the matter is that consoles are meant to STANDARDIZE games; it is a LOT easier for companies to optimize a game for 2 consoles (PS3 and Xbox 360) than for an almost unlimited combination of PC parts. If you add user-upgradable GPUs to the equation, you have just added more work for the programmers, and that will add cost to the game, time to the development cycle, and piss off the gamers who have the "standard" console and can't play games that require the stage 3 turbo GPU.
 

Via

Diamond Member
Jan 14, 2009
4,670
4
0
We'll see.

Who knows? The next gen of console might be more "PC-like".

Besides - the simple act of increasing GPU power shouldn't affect standardization.

I assume it's the short-sighted suits up top who can't move their brains outside of a simple box that would never give the financial go-ahead to something like that.
 

mnewsham

Lifer
Oct 2, 2010
14,539
428
136
We'll see.

Who knows? The next gen of console might be more "PC-like".

Besides - the simple act of increasing GPU power shouldn't affect standardization.

I assume it's the short-sighted suits up top who can't move their brains outside of a simple box that would never give the financial go-ahead to something like that.

It isn't just increasing GPU power. Compare the 6770 to a 6970: it isn't just a simple increase in core clock and memory, it is a completely different design. In the end they are trying to build them cheap and build them fast. It IS a business, after all.
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
The performance of Llano is clear and, despite slow performance on the CPU side due to outdated architecture, the graphics performance is extraordinary when compared to traditional integrated graphics or even Sandy Bridge's HD Graphics 3000. If equipped with a newer, better CPU architecture, do any of you think an AMD APU would be a good choice for the next Xbox?
When is the next Xbox supposed to come out? 2012 or 2013? If later rather than sooner, and if AMD's plan to release Trinity sooner rather than later pans out, it won't be Llano that would be considered for console duty, and that would "fix" your Llano CPU performance concerns.

I don't really have an opinion on whether it would be good for next-gen consoles or not, because it is not out yet. It is supposedly an improvement in both CPU and GPU, so who knows.

Whatever goes into next-gen consoles, I hope it will be a big big jump, though, so I may not be rooting too hard for even a Trinity APU. Better console graphics will go a long way to lifting the minimum standard in graphics in PC games as well, so that would be a win for both console and PC gamers.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
**** IF ****

Microsoft wanted to give the Xbox Windows 8 (or something)... I could see it using an APU.

It would be a gaming console, but it could be marketed as an "all you need" device.

For when you aren't gaming, you could load up Windows and use it for Word/Excel stuff, or browsing the internet, etc.

1 box, able to play both PC games and Xbox games.
Able to steal "PC" sales, due to browsing capabilities, along with Word etc.
(due to it running an actual Windows OS)
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
No rumors or anything, just an idea for discussion.

The performance of Llano is clear and, despite slow performance on the CPU side due to outdated architecture, the graphics performance is extraordinary when compared to traditional integrated graphics or even Sandy Bridge's HD Graphics 3000. If equipped with a newer, better CPU architecture, do any of you think an AMD APU would be a good choice for the next Xbox?

The benefits of the AMD APU are:
Cost-effective due to the components being on one die, and being from the same designer. While not a top priority in a high-end gaming PC, it is important when trying to make a gaming console the average consumer can afford.
Power-efficient -- very important considering the 360's record of failing due to overheating, and being noisy in general.

The downsides are:
Weak CPU architecture (which again, could be resolved by a new architecture. C'mon, Bulldozer...)
Low memory bandwidth. However, this may not be as much of an issue as on a PC. The Xbox 360 already shares its 512 MB of GDDR3 memory between the Xenon processor and Xenos graphics chip. On a fixed system like a gaming console, the system memory can be optimized for gaming. To further remedy the issue, an eDRAM chip could be added as a daughter die to the APU, similar to how the Xenos has 10 MB of eDRAM.

I think it's very possible that the next Xbox will use an APU, if not one entirely designed by AMD. Microsoft certainly sees the benefit to such a design even now -- the current Xbox 360 S SKU has the processor and GPU integrated onto the same die.
It probably wouldn't be a good idea, because it would limit how powerful the CPU and GPU could be. It would be better to just have the CPU and GPU on separate dies.

I don't know why console makers don't come up with some sort of upgradable GPU. Just make it a slot-type deal that makes the GPU easily exchangeable.

And Microsoft, Sony, and Nintendo would make billions of dollars more off of the console crowd over the life of the system.
Sega kind of tried something like that and it didn't work out. If it uses a whole new GPU, then it's not really the same console anymore.

I'm never going to be a console gamer again, largely because they lack the gaming feature set of PCs. But for most people who only play consoles, simplicity is the beauty of consoles. However, that's a downside for me.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
If equipped with a newer, better CPU architecture, do any of you think an AMD APU would be a good choice for the next Xbox?

If it's x86, they will lose BC. Current x86 CPUs don't look all that great running native code against the now six-year-old PPC offering (not that they aren't better, just not by all that much in relative terms), and they have no hope of emulating it properly with x86 for several years at least (likely a decade or more; the SNES only just now got a truly accurate emulator, although using hacks and whatnot you can get partial emulation much more easily).

I don't know why console makers don't come up with some sort of upgradable GPU.

It would be shockingly dumb to do that. Listen to Carmack's latest keynote: he discusses how difficult it is to get decent performance out of the PC because you must deal with so many abstraction layers. On the console side, you know exactly how your code will run at any point. If you added differing hardware configurations, you would bring the enormous downside of PC development into the console space. PCs deal with this by throwing more and more power at it, along with significantly higher development costs (for like results). Put that anchor around the neck of the consoles and they may perform close to as poorly as PCs with comparable specs (which would be a rather huge downgrade).
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
If it's x86, they will lose BC. Current x86 CPUs don't look all that great running native code against the now six-year-old PPC offering (not that they aren't better, just not by all that much in relative terms), and they have no hope of emulating it properly with x86 for several years at least (likely a decade or more; the SNES only just now got a truly accurate emulator, although using hacks and whatnot you can get partial emulation much more easily).


It would be shockingly dumb to do that. Listen to Carmack's latest keynote: he discusses how difficult it is to get decent performance out of the PC because you must deal with so many abstraction layers. On the console side, you know exactly how your code will run at any point. If you added differing hardware configurations, you would bring the enormous downside of PC development into the console space. PCs deal with this by throwing more and more power at it, along with significantly higher development costs (for like results). Put that anchor around the neck of the consoles and they may perform close to as poorly as PCs with comparable specs (which would be a rather huge downgrade).
BC doesn't matter; MS and Sony have already shown us that with the current generation. MS has simply had poor BC all generation, and Sony REMOVED BC from their products. I don't see how you can factor it in as a major influence on decision making when BC hasn't been a focus of design recently anyway. Sure, it's a nice bonus, but losing it or not having it shouldn't really be an issue.

And as for the earlier post about APUs not being for 250w systems: guess what, 250w is a lot of power, more than most people were anticipating for the current consoles, and it was a bad thing that they used so much. For the manufacturers, low power is better, because it means lower costs in many areas. If there were an APU that gave 90% of the performance at 100w compared to something at 200w, they would probably want to use the 100w product over the 200w one, assuming it was "fast enough". Given that current "HD" consoles aren't even truly HD, console makers again have no problem ignoring perceived product focus points and doing what ends up being best for them.
 

Via

Diamond Member
Jan 14, 2009
4,670
4
0
It would be shockingly dumb to do that. Listen to Carmack's latest keynote: he discusses how difficult it is to get decent performance out of the PC because you must deal with so many abstraction layers. On the console side, you know exactly how your code will run at any point. If you added differing hardware configurations, you would bring the enormous downside of PC development into the console space. PCs deal with this by throwing more and more power at it, along with significantly higher development costs (for like results). Put that anchor around the neck of the consoles and they may perform close to as poorly as PCs with comparable specs (which would be a rather huge downgrade).

Why does everyone equate more GPU power with a completely different hardware configuration? It's not even remotely similar.

If anything, it's PSU draw that prevents this.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
The only way I could see this happening is if they do one of the following:

1. create an 800+ core graphics side for the "Fusion" chip
2. CrossFire one of their current 400-core APUs with a separate 400-core chip

You never know, though. AMD might be able to create a serious gaming APU at this point. All it would need is some GDDR5 and a ton of stream processors, along with a nice hexacore. I think they could do it. It would be ambitious, but it would be nice.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
The last generation of consoles had GPUs that were close to the top-end PC GPUs of the time.
AMD will need to significantly increase the processing and graphics capability of their APUs to make it happen.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
I'd argue that the PS3 showed that BC _is_ important -- to developers, in terms of code libraries, engines and developer experience. By forcing developers to work with a new and unfamiliar design they gave the 360 even more of a head start.

360 devs now have 6 years of working with PPC cores, with millions of lines of code already written for it. Telling them to switch to x86 next week might go well if they have no assembly code and the x86 behavior is close enough not to break their C/C++ code.

They also have 6 years of working with the 360's AMD GPU. If the new GPU can transparently emulate that behavior while offering 4x - 8x the power for new code written for it, the developers can drop in their existing Unreal 3 engine code and finally crank up the settings to 1080p with more eye candy.
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
It's not as though developers don't have experience developing for x86, though. Look at all the PC games; there are a ton of cross-platform games out there.

There's a learning curve involved with any new console.

Is IBM even still developing the PPC platform?
 

RobertPters77

Senior member
Feb 11, 2011
480
0
0
No, unless AMD makes a PPC Bulldozer-'type' APU. MS does not want people buying Xboxes and installing Windows on them, because (a) they will probably take a loss per console sold, and (b) that would directly affect software sales, and as such the license fees per copy sold.

Now, it took AMD 5 years to design BD and Llano. How long do you think it would take them to design a PPC-type BD APU? Maybe 2-3 years tops.

Or... AMD & MS are in a 'cabal' of sorts, because MS got tired of waiting for Intel to make a decent instruction set for x86 and went with AMD, which forced Intel's hand in selling CPUs to Apple, which pissed off MS. That is when AMD got the go-ahead from MS to buy out ATI, because MS could not renegotiate with Nvidia and Intel to cut the cost of the chips for the original Xbox, and as such they kept losing money on that console even after EOL. And MS was worried about an unfriendly 'third' party buying out ATI and either charging out the ass for the Xenos chip in the 360, or terminating the contract and forcing MS to go crawling back to Nvidia.

And... AMD already has the CPU+GPU contract for the next Xbox, and they have the PPC Bulldozer APU already taped out and going into mass production for MS.

So MS and AMD are best friends as of now. And if MS could, they'd buy out AMD and cut the cost of Xbox chips by a shitload. But MS does not want to piss off Intel, so they aren't doing that yet.

Or... MS and AMD are working on their own instruction set to displace x86 in both performance and power consumption.


But that's just a conspiracy theory.
 

rangda

Member
Nov 20, 2006
60
0
0
I don't know why console makers don't come up with some sort of upgradable GPU. Just make it a slot-type deal that makes the GPU easily exchangeable.

Because it takes away one of the biggest strengths of the platform: stability. Developers are drawn to consoles because they are all the same; write your game for console X and it's guaranteed to work on all of them. If you suddenly have consoles X.1 and X.2, you are either writing games for both of them or picking one or the other. What you'd end up with is a small percentage of games supporting the .2 hardware, and doing so in ways that are very trivial, i.e. the same thing we have now with so many PC games being quick & dirty console ports. Of course you'd see a few showcase games on the .2 hardware, but the majority would choose the easy path, just like they do now.

It would also have a pretty big impact on hardware costs. Part of what makes consoles cheap to make is that they are (in theory anyway) built in a state of equilibrium. Just enough cooling to cool what you've got, a case just big enough to hold it, etc. If you have to worry about future hardware upgrades you have to worry about things like overspec'ing the PSU to power it, making the case larger to accommodate the slot, adding a new connector to the board, allowing for increased air flow etc. In short you drive the hardware costs up. In an environment where the hardware is sold at a loss for years this is a very bad thing.