Next Gen XBOX to use AMD 28nm CPU/GPU Fusion chip - Rumor

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.
Aug 11, 2008
10,451
642
126
I think it's more like DX11 is important; the x86 part isn't as big a deal to port. Although I think MS will do both, effectively making the cost of porting to PC nil. This would be an excellent move; it's like a DX11 PC in a box. I just wish they would open up the peripheral upgrade options, so people could add any HDD or Blu-ray drive as they wish. But somehow I doubt MS will allow it.

This would be nice, but despite what they say, Microsoft seems to be doing everything they can to minimize PC gaming. I don't think they want it to be easy to port to the PC with full functionality. If you could do that, they would be afraid of hurting Xbox sales. I think they would rather have the current situation, in which most PC ports are really lame and give very little reason to play on the PC.
This is kind of off the subject, but I am a dedicated PC gamer and have never owned a console except the Wii. So I would love to see PC gaming make a strong showing. However, I must admit that with the current games that are coming out, if I were to start over for gaming only, I would buy an Xbox and rent the games.
 

Vette73

Lifer
Jul 5, 2000
21,503
9
0
If I remember correctly, the first Xbox was supposed to have an AMD chip. Right up until they were about to start, they went with Intel instead, as Intel underbid them by a lot.

So until it's on retail shelves with that chip, you never know.
Right now Intel and Nvidia are making up, so it could be that Intel has plans as well, since they were locked out of the current units.
 

Schadenfroh

Elite Member
Mar 8, 2003
38,416
4
0
Anyone else tired of there always being 3 consoles available? Sure wish one of them would just drop out. Talking about you, Nintendo. With Move and Kinect out you're no longer relevant, please ditch the hardware and just focus on making quality games.

I dislike three consoles as well, but it seems that it would be much more logical for either Microsoft or Sony to drop out of the business considering the amount of money that their xbox1 / 360 division and ps3 division have hemorrhaged.

360 / PS3 are both perceived as catering to "traditional" gamers and the Wii (and by extension Nintendo) caters to the casual market.
 

sandorski

No Lifer
Oct 10, 1999
70,778
6,338
126
I dislike three consoles as well, but it seems that it would be much more logical for either Microsoft or Sony to drop out of the business considering the amount of money that their xbox1 / 360 division and ps3 division have hemorrhaged.

360 / PS3 are both perceived as catering to "traditional" gamers and the Wii (and by extension Nintendo) caters to the casual market.

All 3 Consoles should drop out. :awe:
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
He was joking, darkly.

The term is "half joking"... he is slightly exaggerating, but only slightly.

@OP: if the next gen Xbox has a Fusion APU, then it is quite a step down in relative performance. The world's first true fusion CPU was actually developed by MS (though it wasn't an original design): it was the fusion of the CPU and GPU from the Xbox 360, along with a special component designed to artificially and intentionally introduce communication delay, to make sure it's not faster than previous Xbox 360 designs. The AMD and Intel fusion chips are their LOW end chips; their high end products are separate to allow much greater performance. If the next gen Xbox is indeed a fusion chip, it will be all that much weaker at launch, and I have to say I find the notion disappointing.
 
Last edited:

Joseph F

Diamond Member
Jul 12, 2010
3,522
2
0
The term is "half joking"... he is slightly exaggerating, but only slightly.

@OP: if the next gen Xbox has a Fusion APU, then it is quite a step down in relative performance. The world's first true fusion CPU was actually developed by MS (though it wasn't an original design): it was the fusion of the CPU and GPU from the Xbox 360, along with a special component designed to artificially and intentionally introduce communication delay, to make sure it's not faster than previous Xbox 360 designs. The AMD and Intel fusion chips are their LOW end chips; their high end products are separate to allow much greater performance. If the next gen Xbox is indeed a fusion chip, it will be all that much weaker at launch, and I have to say I find the notion disappointing.

I haven't heard of this "lag generator" between the cores before. Why the hell would they want to make the new Xbox the same speed as the old one when it wouldn't cost any more money? (It probably cost more money in development to design this delay device.)
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I haven't heard of this "lag generator" between the cores before. Why the hell would they want to make the new Xbox the same speed as the old one when it wouldn't cost any more money? (It probably cost more money in development to design this delay device.)

I forget where I read about it, but it is true: they had to add a chip that would add the delay between the CPU and GPU to keep older games working properly.

Consoles are fixed-function; you can't increase or decrease performance on a whim unless you're willing to accept the downsides of doing so (see the PSP clock boost from 222MHz to 333MHz).

In newer games it was required (e.g., God of War: Chains of Olympus), but in older games (Lumines, I believe, was one of them) it caused syncing issues. Sony had to go back and patch the firmware to force games released before a certain date to revert to 222MHz.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I doubt being slightly faster will break compatibility with older games, and besides, you could very easily test every single older game out there and program the new consoles to specifically run slower for them (since there is a finite number of older games). On occasion you even have games lag on their native console.

I believe this is more to ensure that newer games developed now work equally well on older consoles, despite being developed and tested on newer ones. There is also the marketing issue: remember that console users believe the console is "fixed hardware" despite there being constant (and fairly large) hardware revisions. It isn't fixed hardware at all; it is merely similar hardware with "fixed performance". That is, they intentionally do not increase performance on newer consoles, going as far as including a special lag-generating device (it isn't a separate chip, it is a portion of the CPU+GPU chip; remember they are now a single die). Whether you bought an Xbox 360 on day 1 or have the second, third, fourth or fifth hardware revision, you are guaranteed good (and thus equal) performance with the latest games.
 
Last edited:

omek

Member
Nov 18, 2007
137
0
0
I've read that the next gen consoles aren't going to premiere for another 5 years.

This is why PC gaming will not and did not 'die'. It now, again, brings something new, much better and more evolved to the table, reminiscent of those frantic fanboy 'PC gaming is teh ded' articles a few years ago. They are about to have their asses poached by the dying travesty that is the 360.

Anyhoo, fuck consoles. They exist to monopolize the market now (despite the originality of consoles in the 90's). People will be pushed back to the PC when they realize the 360 now renders like a Tonka toy.
/60% drunken post


Cussing is not allowed in the technical forums.

Moderator Idontcare
 
Last edited by a moderator:

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Heh, PC gaming has been declared dead every year since the 1980s. It's here to stay.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
I doubt being slightly faster will break compatibility with older games, and besides, you could very easily test every single older game out there and program the new consoles to specifically run slower for them (since there is a finite number of older games). On occasion you even have games lag on their native console.

I believe this is more to ensure that newer games developed now work equally well on older consoles, despite being developed and tested on newer ones. There is also the marketing issue: remember that console users believe the console is "fixed hardware" despite there being constant (and fairly large) hardware revisions. It isn't fixed hardware at all; it is merely similar hardware with "fixed performance". That is, they intentionally do not increase performance on newer consoles, going as far as including a special lag-generating device (it isn't a separate chip, it is a portion of the CPU+GPU chip; remember they are now a single die). Whether you bought an Xbox 360 on day 1 or have the second, third, fourth or fifth hardware revision, you are guaranteed good (and thus equal) performance with the latest games.

I completely agree with your post, and I failed to think of that. However, going with the trends I see here, I will refuse to admit I agree, call you a 360 romantic, and claim the PS3 is better at grating cheese :eek:

My biggest question is which vendor Sony will go with for the PS4. Maybe it will go back to its old methods and not even use a standard dedicated GPU (the PS1/PS2 didn't use anything from nVidia/ATI).

But I feel doing so would throw more monkey wrenches into their SDKs, as the Cell did this time around (the RSX was a last minute change).
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Heh, with the PS3 it is even more pronounced in some ways. Early designs had poor yields, resulting in one of the cores on the Cell being disabled on all PS3s. Yields are good enough now that all cores could be enabled, but they chose not to, in order to ensure equality with older consoles. Every PS3 out there has one disabled core, and on most new ones it is a perfectly functional one.
They have made some changes, though: the original PS3 included a separate chip (the PS2's GPU) used to run PS2 games; it was replaced with software emulation in later models, which doesn't work as well.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Except for Starcraft 2, apparently. After factoring in two expansions we're looking at ~$140?
...and more like 20-40 hours single player, not counting replaying on different difficulties and branches. Assuming it's the same per episode, that ends up being a decent value. <=4 hrs? That wouldn't be worth paying for at all. DLC is fine, but it needs to be truly added value, not something to make a crippled game whole.

Personally, I would be surprised if they didn't go with PPC, again. While it ends up costing more for game development (they don't pay for that part!), the custom chips let them get good real-world performance at a lower per-unit cost, and get good perf/watt. I would, however, not at all be surprised at it having a direct connection to the GPU, either both hitting the same memory management hardware, or even so far as each being able to directly manage a shared cache, maybe even with the CPU being able to manipulate GPU register files.

Anyone else tired of there always being 3 consoles available? Sure wish one of them would just drop out. Talking about you, Nintendo. With Move and Kinect out you're no longer relevant, please ditch the hardware and just focus on making quality games.
Yeah, drop the one set-top console that's been tempting me. Nintendo can make people like me, who like blowing up things with plenty of gore, want to jump around as a pink puffball in a world of crochet.

MS did their research, and got their me-too act right. Sony copied Nintendo too directly, and thus uselessly, and became the butt of bad sex toy jokes at the same time as being the brunt of good old, "you just don't get it," commentaries. Sony also has a history of screwing up.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Considering the close ties between IBM and AMD in terms of tech swapping, maybe it could be a PowerPC-based system with DX11-class graphics hardware on the same die? I don't see it as out of the question. MS could license the chip designs, get the separate teams to create this "Fusion" chip, and just pay royalties to AMD and IBM. I'm sure AMD is marketing Fusion very hard to Microsoft, but there is no denying that hardware backwards compatibility is a must, and going with a PowerPC system would make the transition to the new hardware much easier. 6 or 8 Power7 cores running at 3.0 GHz, plus 1440 or more Radeon 6xxx-series-type SPs, would be pretty sweet with a 128-bit bus connected to 4 GB of XDR2 main memory. 256-bit buses are too expensive and make it harder to shrink component size and manufacturing cost, so 128-bit is (and often has been) the proper choice, especially if XDR2 achieves twice the bandwidth of GDDR5 for the same width of memory bus.
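The bandwidth claim above is easy to sanity-check with back-of-the-envelope arithmetic. A sketch, assuming illustrative per-pin data rates (5 GT/s for GDDR5 and double that for XDR2; neither is a confirmed console spec):

```python
# Peak bandwidth = bus width in bytes x effective transfer rate.
# The per-pin data rates below are illustrative assumptions only.

def bandwidth_gb_s(bus_bits: int, data_rate_gt_s: float) -> float:
    """Peak theoretical bandwidth in GB/s for a given bus width."""
    return (bus_bits / 8) * data_rate_gt_s

gddr5_128 = bandwidth_gb_s(128, 5.0)    # 128-bit GDDR5 at 5 GT/s -> 80 GB/s
xdr2_128 = bandwidth_gb_s(128, 10.0)    # 128-bit XDR2 at twice the rate -> 160 GB/s
gddr5_256 = bandwidth_gb_s(256, 5.0)    # 256-bit GDDR5 for comparison -> 160 GB/s

print(gddr5_128, xdr2_128, gddr5_256)
```

On these assumed rates, a 128-bit XDR2 bus matches a 256-bit GDDR5 bus, which is exactly the cost argument the post is making: same bandwidth, half the pins.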
 
Last edited:

Acanthus

Lifer
Aug 28, 2001
19,915
2
76
ostif.org
Anyone else tired of there always being 3 consoles available? Sure wish one of them would just drop out. Talking about you, Nintendo. With Move and Kinect out you're no longer relevant, please ditch the hardware and just focus on making quality games.

Yeah, because that totally worked for Atari and Sega...
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
The high end fusion makes sense. A console can only have the power budget of a high end, desktop replacement laptop, and the top end fusion chips (in 2012!) could easily be pushing that. A quad (or octo?) core bulldozer, combined with a graphics chip in the 1 to 2 billion transistor range would make a nice console chip, comparable to today's top end pc hardware.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
My biggest question is which vendor Sony will go for with the PS4. Maybe it will go back to its old methods and not even use a standard dedicated GPU (PS1/PS2 didn't use anything from nVidia/ATI.)

Sony has a tradition of going for something out there and completely impossible to code for. With this in mind, it wouldn't surprise me if they did something silly like having just a GPU and a minimal CPU, i.e. a basic ARM CPU to boot the thing up, and then expect the game to run entirely on a highly programmable GPU (something along the lines of nVidia's Fermi with its C++ support, but not necessarily nVidia).
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
People are overestimating the power of APUs.

Just like people on consoles have a very skewed idea of the performance/capabilities of their current console GPUs.

A PS3 has a 7800GTX for a GPU.
We are talking several generations of performance behind the PC.

Doesn't matter if it's on 28nm... it doesn't magically make the GPU any faster, just cheaper to produce.
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
The only question remaining in my mind is whether they go for x86 or a RISC (PowerPC) architecture. The advantage of RISC is that it makes emulation harder, and I also know that MS is very keen for consumers not to be too mindful of the fact that a console is in fact a specialized computer. And in fact, it's only really specialized in terms of its software and input peripherals.

Although perhaps if an IBM PowerPC CPU could deliver better performance per watt, they would just go with that.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
The high end fusion makes sense. A console can only have the power budget of a high end, desktop replacement laptop, and the top end fusion chips (in 2012!) could easily be pushing that. A quad (or octo?) core bulldozer, combined with a graphics chip in the 1 to 2 billion transistor range would make a nice console chip, comparable to today's top end pc hardware.

The only thing that makes it unlikely is the rumor that it'll be based on a 28nm Fusion chip, which is a Bobcat successor. A full node shrink allows roughly 2x Brazos performance. Will it be enough?

On the other hand, the Bulldozer-based Fusion chips are 32nm. I doubt they can stick a 1-2 billion transistor GPU onto the CPU when Llano, with a ~200mm2 die, has only 1 billion transistors in total on the same 32nm.
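The die-budget argument can be put in numbers. A sketch using the ~1B transistor / ~200mm2 Llano figures from the post, and the idealized rule that a full shrink scales area with the square of the node ratio (real processes fall short of this, so treat it as a best case):

```python
# Idealized density scaling: area per transistor shrinks with the
# square of the process-node ratio. Real shrinks do worse than this.

def area_scale(old_nm: float, new_nm: float) -> float:
    """Best-case area multiplier going from old_nm to new_nm."""
    return (new_nm / old_nm) ** 2

llano_transistors = 1.0e9   # ~1B transistors (figure from the post)
llano_area_mm2 = 200.0      # ~200 mm^2 die (figure from the post)

density_32 = llano_transistors / llano_area_mm2   # ~5e6 transistors/mm^2
density_28 = density_32 / area_scale(32, 28)      # best case at 28nm

# Die area a 2B-transistor GPU alone would need at 28nm, best case:
gpu_area_mm2 = 2.0e9 / density_28
print(gpu_area_mm2)
```

Even under ideal scaling, a 2B-transistor GPU alone would need roughly 300mm2 at 28nm, before adding any CPU cores, which supports the scepticism above.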
 

tyl998

Senior member
Aug 30, 2010
236
0
0
...and more like 20-40 hours single player, not counting replaying on different difficulties and branches. Assuming it's the same per episode, that ends up being a decent value. <=4 hrs? That wouldn't be worth paying for at all. DLC is fine, but needs to be truly added value, not something to make a crippled game whole.
I never said the $$$ isn't worth it in the case of SC2. I'll happily pay for the expansions :wub:

Must...have...more...Kerrigan...
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
A PS3 has a 7800GTX for a GPU.
We are talking several generations of performance behind the PC.

Doesn't matter if it's on 28nm... it doesn't magically make the GPU any faster, just cheaper to produce.

I am pretty sure the PS3 has a 7950 GT GPU (GPU clock speed of 550MHz and memory clock speed of 1400MHz GDDR3). The memory bandwidth is only 22.4GB/sec and not 44.8GB/sec because the memory bus is 128-bit and not 256-bit as on the desktop part.
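Those two figures check out with simple arithmetic (GDDR3 is double data rate, so the quoted 1400MHz effective rate means 1.4e9 transfers per second):

```python
# Peak bandwidth = bus width in bytes x effective transfers per second.
# GDDR3 is double data rate: a 700 MHz memory clock gives 1.4e9 T/s.

def bandwidth_gb_s(bus_bits: int, transfers_per_s: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return (bus_bits / 8) * transfers_per_s / 1e9

rsx = bandwidth_gb_s(128, 1.4e9)       # PS3's 128-bit bus -> 22.4 GB/s
desktop = bandwidth_gb_s(256, 1.4e9)   # desktop 256-bit bus -> 44.8 GB/s

print(rsx, desktop)
```

Halving the bus width at the same memory clock halves the peak bandwidth, which is exactly the 22.4 vs 44.8 GB/s gap described above.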

Since the PS3 launched November 11, 2006, and the 7950GT launched around September 6, 2006, the GPU in the PS3 (and the Xbox 360's) was actually among the most powerful at the time.

So if new consoles repeat this pattern, by the time the Xbox 720 / PS4 launch in 2013-2014 or whatever, they would have GPUs a lot more powerful than the GTX580 / HD6970. The PS3 actually cost Sony about ~900 USD to manufacture when it first launched. Since Blu-ray component prices have come down over the last 5 years, adding Blu-ray to the PS4 will no longer cost them a lot of money, which leaves more room for a very powerful GPU+CPU combo. The question is, will Microsoft have to pay royalties to Sony for using Blu-ray?

I wouldn't be surprised if the GPU is at least an HD7000 derivative on 28nm.
 
Last edited: