Digital Foundry: next-gen PlayStation and Xbox to use AMD's 8-core CPU and Radeon HD


inf64

Diamond Member
Mar 11, 2011
3,706
4,050
136
I have yet to see a game developer who complains about today's desktop CPUs being weak...

ALL of them say desktop CPUs are very, very overpowered... they can write the shittiest code ever and it will still run OK.

Remember guys, the most CPU-intensive games today are the very badly optimized ones... and they all run well on three-year-old CPUs.

*cough* PlanetSide 2 *cough* :D
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
Uhhh k...:confused:


lol yes
the only thing it shows is that, apart from the Core 2 and Bobcat, the others are all fast enough under those conditions to hit the limit imposed by the graphics card...

but I wouldn't really read much into how the parts perform using PC software...
just look at the PS3/360 and graphics cards from the same era...
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
*cough* PlanetSide 2 *cough* :D

yep, one game... name me another 34 games and I will happily change my point

mmm... looks like CPUs that can't use the SSE4.1 or 4.2 ISA choke pretty hard in this game
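
For context, a minimal sketch of what runtime ISA dispatch looks like, assuming a GCC/Clang build on x86; the dispatched code paths are purely illustrative:

```c
/* Hypothetical sketch: choose a SIMD code path at runtime instead of assuming
 * SSE4.1/4.2 is present. __builtin_cpu_supports() is a GCC/Clang builtin. */
#include <stdio.h>

int main(void)
{
    if (__builtin_cpu_supports("sse4.2"))
        puts("dispatching SSE4.2 code path");
    else if (__builtin_cpu_supports("sse4.1"))
        puts("dispatching SSE4.1 code path");
    else
        puts("dispatching scalar/SSE2 fallback path"); /* older CPUs land here */
    return 0;
}
```

A game that skips the fallback path, or leaves it badly optimized, is exactly the kind of thing that would make pre-SSE4 CPUs "choke".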
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
lol yes
the only thing it shows is that, apart from the Core 2 and Bobcat, the others are all fast enough under those conditions to hit the limit imposed by the graphics card...

but I wouldn't really read much into how the parts perform using PC software...
just look at the PS3/360 and graphics cards from the same era...

You should always be hitting the limit of the graphics card, otherwise you should just have bought a less powerful card.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
really? if you go by individual multiplatform games, the most common thing to see is the Xbox clearly outselling everything else; take a look at COD or some other random game. Also, MS makes money with things like "Xbox Live".

http://arstechnica.com/gaming/2012/12/xbox-360-dominates-start-of-gamings-key-holiday-sales-season/

sure, in Europe and especially Japan things may not look so great, but I think the graphs you posted look unreal; also, over the last few years I've only been reading negative news about the Wii and sales going down.

So Nintendo innovating while only doing a small upgrade over the GameCube was great (and now they're using the same CPU design from 15 years ago, 64-bit DDR3 and a slow GPU), but MS/Sony using Jaguar seems like a bad choice? o_O

Their financials don't lie. If you think so, you should go complain to the SEC about accounting fraud.

Nintendo had innovation, truly innovation. That's why they can get away with lower-specced hardware. As I also wrote before, if you don't have innovation, you need hardware to compensate.

Also, what you could play on the Nintendo was severely limited. Yet now that it's an AMD 1.6GHz turtle CPU, it's all OK and anything can happen? We're talking about a CPU that's most likely slower than a Celeron 420 per core.

PC gaming is rebounding for the same reasons.
 
Last edited:

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Yes, Nintendo sold a ton of Wii units. That does not mean it's the best, not by a long shot; I would easily use a 360 over it any day for practical use. In the end it all boils down to software - and the 360 and PS3 have it soundly beat in this respect.

IMO, the 720 and PS4 will have better software support than the Wii U. I could be wrong, though; it is interesting that Nintendo has defied the odds so many times - everyone counted them down and out with the Wii, and they were actually very successful with it. Perhaps that will happen again with the Wii U.
 
Last edited:

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
You should always be hitting the limit of the graphics card, otherwise you should just have bought a less powerful card.

the thing is, the test only shows some very limited scenarios; try BF3 multiplayer or many others... or try different settings and resolutions

Their financials don't lie. If you think so, you should go complain to the SEC about accounting fraud.

Nintendo had innovation, truly innovation. That's why they can get away with lower-specced hardware. As I also wrote before, if you don't have innovation, you need hardware to compensate.

Also, what you could play on the Nintendo was severely limited. Yet now that it's an AMD 1.6GHz turtle CPU, it's all OK and anything can happen? We're talking about a CPU that's most likely slower than a Celeron 420 per core.

PC gaming is rebounding for the same reasons.

maybe it should be analyzed with more caution; perhaps there are other factors involved besides how successful the 360 or the Wii is at the moment? The link I posted is pretty clear: the 360 is selling a huge number of games. If the high Xbox hardware sales numbers are due to the hardware failures, then I don't even know what to say; the smaller user base that MS has really buys a lot of games!

as for the CPU, if Jaguar is a turtle, I can't even think of what you should call the Wii and Wii U CPUs; it's not even funny, there is a huge difference...

what innovation? the Wii U controller? really? in that case you should also consider Kinect and other things (even Xbox Live)

if you could buy an 8-core 420 (with the same price and power requirements), maybe it would be a good CPU for some uses!?
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Since it was brought up: it's no secret that the hardware itself is initially sold at a loss, but the money is made in other areas, not the hardware. Software and services are vastly more profitable.

Aside from that, I'm under the impression that both Sony and MS broke even several years ago and started making a net profit on hardware, albeit a small one. I can't imagine that the 720 or PS4 will be profitable initially.
 
Last edited:

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
Also, what you could play on the Nintendo was severely limited. Yet now that it's an AMD 1.6GHz turtle CPU, it's all OK and anything can happen? We're talking about a CPU that's most likely slower than a Celeron 420 per core.

don't get me wrong, I would be glad to see a powerful CPU there... but consoles are very power-limited, so trading CPU speed for a beefy GPU is, IMO, a good thing

that's where Jaguar cores come in handy; they seem to be among the best perf/watt/mm² CPUs out there
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
don't get me wrong, I would be glad to see a powerful CPU there... but consoles are very power-limited, so trading CPU speed for a beefy GPU is, IMO, a good thing

that's where Jaguar cores come in handy; they seem to be among the best perf/watt/mm² CPUs out there

Not really. The first PS3 used over 200W, and the Xbox 360 above 175W. Not to mention they both idled at around the same.

It would make a lot more sense to go with a faster quad-core, for example something like the Athlon II X4 620e if you wish to stay with AMD, shrunk to 28nm. Or simply a faster-clocked quad-core Jaguar if TDP is the concern. But Jaguar is still only a 2-issue-wide uarch. This is gaming, after all, not encoding or something similar.

This is what AMD themselves think of the cat family:
http://xtreview.com/images/jaguar%20%20x86-compatible%20AMD%2002.png
 
Last edited:

ChronoReverse

Platinum Member
Mar 4, 2004
2,562
31
91
You know, despite the power of the PS3 and 360, they were really built grossly wrong for the needs of game code.

I don't know if Sony had really intended to have Cell carry the load of compute as well as graphics (were they really designing it like previous-generation consoles?) and was forced to put in the Nvidia GPU, but Cell has a ton of power in the wrong places: massive DSP-crunching ability, but a dinky little PPE (which also had to manage the SPEs) for all the decision-making code like AI.

The 360 was a bit better, but as already noted, its CPU was rather inferior to what was available at the time for game code.

We're not even talking about a comparison to the top end here, just mainstream desktop CPUs (e.g., a not-super-hot Athlon 64 X2 would have been significantly more powerful).

So in that sense, history is just repeating itself if the 720 and PS4 aren't able to touch the highest-end systems. It might be a wise decision, since the peak power of the consoles isn't as important, plus consoles might not be the huge money-maker they once were, relatively speaking.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
But Jaguar is still only a 2-issue-wide uarch. This is gaming, after all, not encoding or something similar.

yes, that's why "moar cores" is better... 1 core for each different thing, like AI, physics, audio, the OS... etc.

mmm... a mix of small and big cores (à la big.LITTLE) actually seems pretty good ^^
beefy ones for AI, physics and the game itself, and small ones for the OS, audio and peripherals :eek:
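
A rough Linux sketch of the "one core per subsystem" idea; the core numbers and the physics/audio split below are purely illustrative assumptions:

```c
/* Hypothetical sketch: pin game subsystem threads to specific cores
 * (Linux/glibc thread affinity; compile with -pthread). */
#define _GNU_SOURCE
#include <pthread.h>
#include <sched.h>
#include <stdio.h>

static void pin_current_thread(int core)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);
    pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
}

static void *physics_thread(void *arg)
{
    (void)arg;
    pin_current_thread(1);              /* assumed dedicated "big" core */
    puts("physics pinned to core 1");
    return NULL;
}

static void *audio_thread(void *arg)
{
    (void)arg;
    pin_current_thread(6);              /* assumed housekeeping "small" core */
    puts("audio pinned to core 6");
    return NULL;
}

int main(void)
{
    pthread_t p, a;
    pthread_create(&p, NULL, physics_thread, NULL);
    pthread_create(&a, NULL, audio_thread, NULL);
    pthread_join(p, NULL);
    pthread_join(a, NULL);
    return 0;
}
```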

but still..... 200W is what a mid-range GPU alone uses

This is what AMD themselves think of the cat family:
http://xtreview.com/images/jaguar%20%20x86-compatible%20AMD%2002.png

yep, low-power devices
 

jpiniero

Lifer
Oct 1, 2010
14,698
5,329
136
Not really. The first PS3 used over 200W, and the Xbox 360 above 175W. Not to mention they both idled at around the same.
And that was a mistake. Jaguar will help keep the power down to realistic levels at launch, and only get better as they shrink it. Plus it allows them to stuff more GPU cores on the APU.

Edit: Say something like a ~250mm² power-optimized APU that draws 80-90W by itself and is cheap enough that they can sell the consoles for $299.
 

boxleitnerb

Platinum Member
Nov 1, 2011
2,601
2
81
You should always be hitting the limit of the graphics card, otherwise you should just have bought a less powerful card.

That statement can easily be made for the CPU, as well. Aside from that, it is NOT important for the GPU to be at 100% all the time. What is important is that your fps are high enough.

If one component is very much underutilized most of the time, that's bad. But if you don't hit 100% all of the time, that is not really a problem.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
Not really. The first PS3 used over 200W, and the Xbox 360 above 175W. Not to mention they both idled at around the same.

It would make a lot more sense to go with a faster quad-core, for example something like the Athlon II X4 620e if you wish to stay with AMD, shrunk to 28nm. Or simply a faster-clocked quad-core Jaguar if TDP is the concern. But Jaguar is still only a 2-issue-wide uarch. This is gaming, after all, not encoding or something similar.

This is what AMD themselves think of the cat family:
http://xtreview.com/images/jaguar%20%20x86-compatible%20AMD%2002.png

I believe the consoles want to go down in power use. Ditch the red ring of death meme and such.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
I believe the consoles want to go down in power use. Ditch the red ring of death meme and such.

The RROD was just bad quality and sloppy work. The PS3, for example, "only" had a 10% RMA rate. However, the labour cost associated with the PS3 was almost $40, while labour for the Xbox 360 was $6.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
I don't think sloppiness is the sole reason; it definitely has to do with trying to minimize noise as well as cost while drawing ~200W. Less power draw means lower cost and less hassle meeting noise profiles.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,370
1,604
136
A very important point about next-gen is that, unlike last time, the consoles will be limited more by their power envelopes than by the cost of silicon. The console vendors will decide how much power they want to cool in a small box in a living room, and then split that power among all the devices in the box. This means that every watt spent on the CPU is effectively a watt not spent on the GPU.

Multiplatform games tend to scale well to utilize a better GPU, but don't scale much to utilize a better CPU. This is why skimping on the CPU in favor of a fatter GPU makes sense.
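
As a rough illustration of that budget argument, a back-of-envelope sketch; every number below is an assumption, not a spec:

```c
/* Hypothetical fixed power budget for a living-room box, split CPU vs. GPU.
 * All figures are illustrative assumptions. */
#include <stdio.h>

int main(void)
{
    const double box_budget_w = 120.0;  /* assumed total power the box can cool */
    const double misc_w       = 20.0;   /* assumed RAM, I/O, drive, fans, ... */
    const double jaguar_cpu_w = 25.0;   /* assumed low-power 8-core Jaguar */
    const double big_cpu_w    = 65.0;   /* assumed bigger desktop-class quad */

    printf("GPU budget with Jaguar-class CPU: %.0f W\n",
           box_budget_w - misc_w - jaguar_cpu_w);
    printf("GPU budget with big quad-core:    %.0f W\n",
           box_budget_w - misc_w - big_cpu_w);
    return 0;
}
```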
 

Concillian

Diamond Member
May 26, 2004
3,751
8
81
I'm really excited about these future consoles. I like games, but I really don't want to spend tons of cash on hardware to play them. At this rate, the only thing I'll have to upgrade is my graphics card :D

I've got news for you: those days are already here. I've been running an overclocked i3-530 for years, and outside of PC-specific games like BF3, it's pretty damned adequate for 1920x1200 PC gaming when coupled with my 7850.

Now, this proposed console CPU is a 1.6 GHz 8-core Jaguar compared with my 4.0 GHz 2-core + HT i3. Consider the i3 to be equivalent to a 2.4-2.8 GHz quad or so in decently multithreaded apps... Now give the Jaguar a little penalty for 8-core scaling inefficiency and AMD's slightly worse IPC (my i3 is way behind Sandy/Ivy in terms of IPC, so it's much closer to AMD's IPC), and you have two CPUs that should be reasonably competitive in terms of overall CPU power. That makes it fairly competitive in CPU power with current-generation i3 CPUs, which can't be overclocked.
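
The napkin math behind that equivalence, with the HT, scaling and IPC factors as rough assumptions rather than measurements:

```c
/* Back-of-envelope "quad-core equivalent" comparison; all factors are assumed. */
#include <stdio.h>

int main(void)
{
    /* i3-530 @ 4.0 GHz: 2 cores + HT; assume HT adds ~25%. */
    double i3_quad_equiv = 4.0 * 2 * 1.25 / 4;            /* ~2.5 GHz quad */

    /* 1.6 GHz 8-core Jaguar: assume ~85% 8-core scaling and ~0.8x the i3's IPC. */
    double jaguar_quad_equiv = 1.6 * 8 * 0.85 * 0.8 / 4;  /* ~2.2 GHz quad */

    printf("i3 (2C+HT)  ~ %.1f GHz quad equivalent\n", i3_quad_equiv);
    printf("Jaguar (8C) ~ %.1f GHz quad equivalent\n", jaguar_quad_equiv);
    return 0;
}
```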

This should be adequate CPU performance for current games. Is it a powerhouse? No, absolutely not, but it's certainly adequate to create a decent gaming experience... especially when you consider that on a closed platform developers will be able to squeeze significantly more performance from a given piece of hardware (no matter what that hardware is). In practice, this CPU may end up somewhere between an i3-3120 and an i5-3570. When you consider the pricing pressure on console hardware, and the pressure on "being green" and power consumption in general that didn't exist six years ago, this really doesn't seem like that bad a CPU to end up with.

IMO, the goal of this generation of consoles is to push the native resolution up to 1080p while keeping hardware prices and power consumption within reason. If my analysis is correct, and these rumors are true, then I think they have a CPU in the right range to provide decent performance.

However, the more exciting thing to see with consoles, and the real reason there MUST be a new generation of consoles, is MEMORY. Developers are begging for memory increases. With the most flexible of the two consoles at 1GB of memory shared between CPU and GPU, in an era where the WEAKEST PCs have 2GB shared and most opt for at least 4GB shared, you can certainly understand where they're coming from. I wonder how much of the development cycle was spent budgeting every byte, and how much testing was necessary to ensure that in all conditions the game never overran the 512MB the PS3 gives the CPU...

The new consoles should see significant quality increases just from porting current games over and being able to use something like eight times the texture memory without penalty. This should, at least for the time being, make the first year or two of console games take less hardcore optimization to get working on the consoles, which may free up resources for porting to PC. Also, the similarity of Jaguar's microarchitecture to a typical PC CPU should make PC porting an easier process, which should result in better PC games regardless of the performance differences between platforms.

Let's not forget games like Skyrim; that game was HORRIBLY CPU-bound on PCs in outdoor areas because the developers did a half-assed porting job. That particular game saw enough developer support to be patched later to fix the PC CPU issues, and the CPU bottlenecks completely disappeared after that. Other games were not so fortunate: GTA IV was an abysmally bad port in terms of CPU performance, and the poster child for a terrible PC port. If the CPUs of all three platforms (PC, Xbox and PlayStation) are structured more similarly, we may not need that kind of extra attention on the PC side, so the overall experience on PC may end up much better, with most games porting easily and developers who pay extra attention to the PC offering experiences more like The Witcher 2, where the PC version is graphically all kinds of awesome.

We aren't in a world where PC gaming is king anymore. We have to assume games will be released on all platforms. With these consoles getting extremely "PC-like" hardware, it is absolutely a win for PC gaming regardless of the console CPU speeds.
 
Last edited:

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
The real reason there MUST be a new generation of consoles, though, is MEMORY. Developers are begging for memory increases. With the most flexible of the two consoles at 1GB of memory shared between CPU and GPU, in an era where the WEAKEST PCs have 2GB shared and most opt for at least 4GB shared, you can certainly understand where they're coming from.
Minor nitpick: current consoles are at 512MB, not 1GB. The 360 has 512MB of GDDR3 (okay, and that bit of eDRAM), while the PS3 has 256MB of GDDR3 + 256MB of XDR. If these things had 1GB, developers would be ecstatic. ;)
 

Fx1

Golden Member
Aug 22, 2012
1,215
5
81
The RROD was just bad quality and sloppy work. The PS3, for example, "only" had a 10% RMA rate. However, the labour cost associated with the PS3 was almost $40, while labour for the Xbox 360 was $6.

No, it didn't have a 10% failure rate at all.

The PS3 was more like a 1% return rate.