Digital Foundry: next-gen PlayStation and Xbox to use AMD's 8-core CPU and Radeon HD


RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Starcraft, WoW, Diablo 3

Blizzard games quad-core threaded? Hahaha. Blizzard makes some of the most CPU-limited games around because they don't scale beyond 2 cores.

A 3.1GHz Core i3 outperforms a Core i7 930:

[Chart: StarCraft II CPU benchmark]

[Chart: Diablo III CPU benchmark]
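
To put rough numbers on why extra cores don't help here, a quick back-of-the-envelope using Amdahl's law (the 50% parallel fraction below is just an assumption for illustration, not measured from these games):

Code:
// Illustrative Amdahl's law calculation: if only part of the frame work
// can run in parallel, extra cores add little, while a faster clock
// speeds up the whole frame. The 0.5 parallel fraction is an assumption.
#include <cstdio>

double amdahl_speedup(double parallel_fraction, int cores) {
    // Serial part is unchanged; parallel part is divided across cores.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores);
}

int main() {
    const double p = 0.5;  // assumed parallel fraction of the frame
    for (int cores : {1, 2, 4, 8}) {
        std::printf("%d cores -> %.2fx speedup\n", cores, amdahl_speedup(p, cores));
    }
    // Prints 1.00x, 1.33x, 1.60x, 1.78x: going from 2 to 8 cores gains
    // ~33%, which is why a higher-clocked dual-core i3 can beat an i7 930.
    return 0;
}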


The disparate architectures found in the current-gen consoles are partly responsible for the lack of multi-threaded PC games up to now. "Getting a common game architecture to run across both [Xbox 360 and PS3] is no easy feat and you have to take 'lowest common denominator' sometimes. This can mean that your engine, which is supposed to be 'wide' (ie. runs in parallel across many cores) ends up having bottlenecks where it can only run on a single core for part of the frame" (for reference, the PS3 has six SPUs plus one core with two hardware threads).

"This (Sony) approach of more cores, lower clock, but out-of-order execution will alter the game engine design to be more parallel. If games want to get the most from the chips then they have to go 'wide'... they cannot rely on a powerful single-threaded CPU to run the game as first-gen PS3 and Xbox 360 games did."
http://www.eurogamer.net/articles/digitalfoundry-future-proofing-your-pc-for-next-gen
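
To make "going wide" concrete, here is a minimal sketch (standard C++ threads, hypothetical job names and empty workloads, just to show the shape) of a frame split into parallel jobs with one single-core section of the kind the article describes as a bottleneck:

Code:
// Minimal sketch of a "wide" frame update: independent systems run as
// parallel jobs, but one serial section still holds up the whole frame.
// Job names and workloads are placeholders for illustration only.
#include <cstdio>
#include <future>
#include <vector>

void animation_job() { /* update skeletons */ }
void physics_job()   { /* integrate rigid bodies */ }
void ai_job()        { /* run behaviour trees */ }
void particles_job() { /* simulate particle systems */ }

void serial_section() {
    // e.g. single-threaded scripting or scene-graph update: this part
    // runs on one core and caps how "wide" the frame can actually go.
}

void run_frame() {
    std::vector<std::future<void>> jobs;
    jobs.push_back(std::async(std::launch::async, animation_job));
    jobs.push_back(std::async(std::launch::async, physics_job));
    jobs.push_back(std::async(std::launch::async, ai_job));
    jobs.push_back(std::async(std::launch::async, particles_job));
    for (auto& j : jobs) j.get();  // wait for the parallel portion

    serial_section();              // the "lowest common denominator" part
}

int main() {
    for (int frame = 0; frame < 3; ++frame) run_frame();
    std::puts("done");
    return 0;
}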

It's not reasonable to look at existing PC games and extrapolate that next gen console games will still only use 2-4 cores. Killzone Shadow Fall already uses 5 cores.

Right now it's pointless to speculate on the performance of the PS4/XB1 until we give developers 2-3 years to learn the hardware. We won't be able to tell the full potential of those consoles for a while. Only one compute job was used in the Killzone demo, and that's used for memory defragmentation, which means developers can squeeze a lot more out of the PS4 long-term. Also, once the 1st wave of next gen titles comes out on the PC (Watch Dogs, etc.), we'll be able to tell which PC graphics cards are necessary to run those games at a similar level of graphics to the PS4.

I think many people quickly forgot that the original rumored specs for the PS4 included an HD 6670/7670-style GPU. What we got is a huge upgrade from what was rumored for a long time, since the GPU is between an HD 7850 and 7870 in performance. In light of the console only costing $400, they did a pretty good job. The PS3 cost nearly $800 to manufacture and Sony sold it for $599. It was going to be impossible to release a PS4 with much more powerful hardware without raising the price, Sony taking on hundreds of millions in losses, or the console getting delayed (allowing time for prices on 28nm GPUs to fall to try to squeeze an HD 7950/GTX 660 Ti/GTX 760 in there).
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Right now it's pointless to speculate on the performance of the PS4/XB1 until we give developers 2-3 years to learn the hardware. We won't be able to tell the full potential of those consoles for a while. Also, once the 1st wave of next gen titles comes out on the PC (Watch Dogs, etc.), we'll be able to tell which PC graphics cards are necessary to run those games at a similar level of graphics to the PS4.

Ubisoft develops their next gen games on a "high-end" PC. The E3 demo ran on a GTX 680. They do not use a PS4/XO dev kit to make the games.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Ubisoft develops their next gen games on a "high-end" PC. The E3 demo ran on a GTX 680. They do not use a PS4/XO dev kit to make the games.

That's only 1 firm, and it doesn't address anything I said in my post. Looking at the 1st wave of PS4 games is not an indication of its long-term full potential. Every single console is the same - graphics potential only starts to show itself in the 2nd and 3rd wave of next gen console games. Will a GTX 780 give us superior graphics in Watch Dogs compared to the PS4? For sure it will, but what about in the 2nd and 3rd wave of PS4 titles?

The console vs. PC argument is pointless. One reason for a PC gamer to buy a console is to play exclusives that won't be out on PC, even if console graphics are inferior. Even if I bought 3 Titans, I still couldn't play The Last of Us. The increase in processing power on the PS4 is good enough to have next gen exclusives that are by far graphically superior to the PS3's. The point of a console is not to have graphics that rival GTX 780 SLI for $400-500.

As PC gamers, we will also upgrade over the next 7-8 years. For that reason it really doesn't matter how much more powerful the GTX 780/Titan/HD 7990 are vs. the PS4, because we'll have upgraded many times to newer generations of cards by 2021. Anyone future-proofing with GTX 780 SLI or similar for PS4/XB1 games is just wasting money. The best way to future-proof is to upgrade more frequently over the life of the console -- Maxwell ---> Volta. By the time we see those 2nd and 3rd generational waves of games on PS4/XB1, we'll be on Volta already, with probably 2-3x the power of a GTX 780.
 

tulx

Senior member
Jul 12, 2011
257
2
71

Yep, read that as well. This made me think - keeping in mind that correlation does not imply causation - the Xbox 360 was more GPU-focused than the PS3 and that did coincide with the shift to GPUs as the performance-defining element over the CPU in the PC space as well.

Now that the Xbox One seems to be more CPU-focused than the PS4 (with DDR3 as opposed to GDDR5 - which benefits the CPU - a clock boost to the CPU, and lower clocks and bandwidth on the GPU), could this encourage developers to utilize the CPU for most of the workload and focus on multithreading, as opposed to offloading almost all the processing to the GPU, which seems to have been the case during the last few years?
 
Aug 11, 2008
10,451
642
126
Yep, read that as well. This made me think - keeping in mind that correlation does not imply causation - the Xbox 360 was more GPU-focused than the PS3 and that did coincide with the shift to GPUs as the performance-defining element over the CPU in the PC space as well.

Now that the Xbox One seems to be more CPU-focused than the PS4 (with DDR3 as opposed to GDDR5 - which benefits the CPU - a clock boost to the CPU, and lower clocks and bandwidth on the GPU), could this encourage developers to utilize the CPU for most of the workload and focus on multithreading, as opposed to offloading almost all the processing to the GPU, which seems to have been the case during the last few years?

I have no doubt games will be more multithreaded; how that will translate to the PC remains to be seen. However, I wouldn't say either of the new consoles is CPU-focused. The GPU is relatively much stronger than the CPU in both of the new consoles. I would rather say the CPU is just "good enough" to get by while most of the thermal budget is directed towards the GPU. In fact, the whole idea of HSA is to offload processing from the CPU to the GPU. Again, how well this will be utilized, either in the consoles or in the wider PC landscape, remains to be seen.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Since the XB1 is in full production today, they changed the clocks way back - they simply announced them now.
 

Sheep221

Golden Member
Oct 28, 2012
1,843
27
81
I have no doubt games will be more multithreaded; how that will translate to the PC remains to be seen. However, I wouldn't say either of the new consoles is CPU-focused. The GPU is relatively much stronger than the CPU in both of the new consoles. I would rather say the CPU is just "good enough" to get by while most of the thermal budget is directed towards the GPU. In fact, the whole idea of HSA is to offload processing from the CPU to the GPU. Again, how well this will be utilized, either in the consoles or in the wider PC landscape, remains to be seen.
Heavily multithreaded games are still not readily available, mainly because games are GPU-bound: the sound, probably the AI, network communication and the game code itself are just small pieces of the program executed by the CPU, and the rest is handled by the GPU. Technically, games are passive software - players are situated in a static world created beforehand - which means they only require more RAM to store the model and texture assets, and more video RAM and GPU power to display duplicated effects and characters. If the game world were generated in real time, it would be useful (and possible) to use the CPU for it, but not for pre-made game worlds.
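
As a rough illustration of that last point, here is a small sketch (plain C++, with a made-up noise function and grid size) of real-time world generation that actually spreads across CPU cores:

Code:
// Rough sketch: generating a heightmap at runtime on the CPU, split
// across hardware threads. The noise function and grid size are made up
// for illustration; a real game would use proper noise and streaming.
#include <cmath>
#include <cstdio>
#include <thread>
#include <vector>

static float fake_noise(int x, int y) {
    return std::sin(x * 0.1f) * std::cos(y * 0.1f);  // stand-in for real noise
}

int main() {
    const int size = 1024;
    std::vector<float> heightmap(size * size);

    unsigned threads = std::thread::hardware_concurrency();
    if (threads == 0) threads = 4;
    std::vector<std::thread> workers;

    // Each thread fills an interleaved set of rows of the map.
    for (unsigned t = 0; t < threads; ++t) {
        workers.emplace_back([&, t] {
            for (int y = static_cast<int>(t); y < size; y += static_cast<int>(threads))
                for (int x = 0; x < size; ++x)
                    heightmap[y * size + x] = fake_noise(x, y);
        });
    }
    for (auto& w : workers) w.join();

    std::printf("generated %dx%d heightmap on %u threads\n", size, size, threads);
    return 0;
}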
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Since the XB1 is in full production today, they changed the clocks way back - they simply announced them now.

Microsoft said:
In fact, we just updated the CPU performance to 1.75 GHz

All dev systems to this point have been 1.6GHz. If they had changed the clocks "way back", then at least some dev systems would have reflected that.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
My guess is that MS determined they had some thermal room left over, so they bumped the clocks up some. It's not like a 150MHz boost (under 10% on top of 1.6GHz) is going to add much to overall performance. But if it's free, why not?
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
1: The APUs must be produced first, before Microsoft can assemble/manufacture the XB1.

2: A large-volume launch in 13 countries on 22 November 2013.

3: For this to happen they must already have final silicon; the XB1 is in full production now.

4: It will take 2-3 months for TSMC to produce the APUs, thus the final clock decision was not made recently.
 

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
It will be interesting to see how MS's and Sony's different approaches play out in real life. The GPU seems to be strong enough for the application, but we keep hearing about MHz increases on the CPU, which suggests it might be lacking some punch.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I'll bet chips are binning better than expected so they could up the clocks.

I doubt it. They have increased both the GPU and CPU clocks now, and at the same time axed a lot of release countries from the initial list, and are having to give people free games as compensation for preorders.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
It will be interesting to see how MS's and Sony's different approaches play out in real life. The GPU seems to be strong enough for the application, but we keep hearing about MHz increases on the CPU, which suggests it might be lacking some punch.

Indeed. Sony played it smart with GDDR5 and a GPU that's almost 50% faster. MS had almost nothing going for them. First they upped the GPU clock and now the CPU clock, so at least MS now has the CPU advantage going for them.
 

Smoblikat

Diamond Member
Nov 19, 2011
5,184
107
106
Yep, read that as well. This made me think - keeping in mind that correlation does not imply causation - the Xbox 360 was more GPU-focused than the PS3 and that did coincide with the shift to GPUs as the performance-defining element over the CPU in the PC space as well.

Now that the Xbox One seems to be more CPU-focused than the PS4 (with DDR3 as opposed to GDDR5 - which benefits the CPU - a clock boost to the CPU, and lower clocks and bandwidth on the GPU), could this encourage developers to utilize the CPU for most of the workload and focus on multithreading, as opposed to offloading almost all the processing to the GPU, which seems to have been the case during the last few years?

That's a very interesting thought; I never looked at it like that. Personally, I hope they optimize for the CPU but keep the GPU as the main powerhouse for the processing, only because CPUs are always running more than just a game - they handle RAID, audio, the OS, memory, etc. - while the GPU is only processing the graphics of the game.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
And yet Microsoft says it was. Both you and Abwx seem to always know more than the actual product manufacturers themselves.

Where did they say that?

I find it very hard to believe they would change the clock specs this close to the product launch. The APUs should have been in production for a few months now in order to keep XB1 volume high for the launch on 22 November.
 
Apr 20, 2008
10,067
990
126
Where did they say that?

I find it very hard to believe they would change the clock specs this close to the product launch. The APUs should have been in production for a few months now in order to keep XB1 volume high for the launch on 22 November.

How is that difficult? These APUs likely come unlocked and their speeds are changed dynamically by the BIOS. I'd consider it just another clock profile that they chose to enable.

Changing the memory size, RAM type, cache sizes or CPU die would be difficult. Changing CPU clock speeds is the easiest tweak they could have done.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
How is that difficult? These APUs likely come unlocked and their speeds are changed dynamically by the BIOS. I'd consider it just another clock profile that they chose to enable.

Changing the memory size, RAM type, cache sizes or CPU die would be difficult. Changing CPU clock speeds is the easiest tweak they could have done.

I don't believe that's how it works. AMD has to know how many chips can do 1.75GHz, which means yields will be different than for chips at 1.6GHz. AMD also needs to validate those chips at 1.75GHz and then TSMC has to produce them.
Also, changing the specs will have an impact on thermals and power consumption. Remember, they changed both the CPU and GPU clocks. Keeping the same heatsink/fan will increase dB noise as well, or you will have to change the HSF, and more.