Digital Foundry: next-gen PlayStation and Xbox to use AMD's 8-core CPU and Radeon HD


Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
They obviously cheaped out on the CPUs this time around (even more so than usual). Here's hoping we can at least get less watering down of our games on the graphics side for PC.

It looks like our new consoles are CPU limited from the get-go.

And let's be real about the threading hopes. Cell in the PS3 and the tri-core in the Xbox 360, and we still have only a handful of games out now that can utilize more than 2 cores, unless it is a PhysX tack-on or some equally gimmicky feature.

What do we have that actually utilizes quad cores now? Civ 5, StarCraft, WoW, Diablo 3, PlanetSide 2. None of those are console-centric titles.

Of the console titles, we've got Crysis, GTA4, Dragon Age Origins?

Don't even get me started on SimCity not being threaded. What a joke.

You may have just contradicted yourself...
No one can pair "PlanetSide 2" with "cheaped out on the CPU".
In case you missed it: PlanetSide 2 is coming to the PS4.
The game is known as a CPU hog. Giant battles with hundreds of players can bring even a high-end gaming PC to its knees.
Apparently that "tablet CPU" is more powerful than you may think.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
It can get worse:
4.5GB + 512MB of "flexible memory" now (not 4.5GB + 1GB of flexible memory anymore).

Digital Foundry said:
For practical game applications, the correct figures for this story, as we understand it now, are a guaranteed 4.5GB for development and a further 512MB from the flexible pool. We have updated the headline accordingly.

Right now, PS4/X1 game developers have access to: 6 Jaguar cores (rumours point to 1.6GHz), 5GB of unified RAM shared between CPU and GPU (4.5GB + 0.5GB GDDR5 vs 5GB DDR3 + 32MB ESRAM), and GPUs with 12-18 CUs @ 800MHz (there are rumours of a 10% GPU reservation on the X1).
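A quick back-of-envelope check on what those CU counts mean, assuming standard GCN CUs (64 ALUs each, one FMA per ALU per clock) and the rumoured 800MHz clock; treat it as a sanity check on the rumours, not a confirmed spec:

Code:
// Rough theoretical GPU throughput from the rumoured figures above.
// Assumes standard GCN CUs: 64 ALUs per CU, 2 FLOPs (one FMA) per ALU per clock.
#include <cstdio>

int main() {
    const double clock_ghz     = 0.8;  // rumoured 800MHz GPU clock
    const int    alus_per_cu   = 64;   // GCN: 64 stream processors per CU
    const int    flops_per_clk = 2;    // fused multiply-add = 2 FLOPs

    for (int cus : {12, 18}) {
        double gflops = cus * alus_per_cu * flops_per_clk * clock_ghz;
        std::printf("%2d CUs @ 800MHz -> %.2f TFLOPS\n", cus, gflops / 1000.0);
    }
    // Prints ~1.23 TFLOPS for 12 CUs and ~1.84 TFLOPS for 18 CUs.
    return 0;
}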
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Flexible memory is a terrible concept in a console. What is the requirement for using it? And if you can use it, why not have it all the time?

Flexible memory sounds like the user has to do something so that game X will work. Consoles work best when everything, so to speak, is static.
 

dagamer34

Platinum Member
Aug 15, 2005
2,591
0
71
Flexible memory is a terrible concept in a console. What is the requirement for using it? And if you can use it, why not have it all the time?

Flexible memory sounds like the user has to do something so that game X will work. Consoles work best when everything, so to speak, is static.

We are just going to have to wait for the system to be released, more Indies to get access to the docs, and someone more liberal with the NDA to find out the actual truth behind the matter.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Flexible memory is a terrible concept in a console. What is the requirement for using it? And if you can use it, why not have it all the time?

Flexible memory sounds like the user has to do something so that game X will work. Consoles work best when everything, so to speak, is static.

It all sounds like devkit features. Development tools taking 4 gigs of RAM? Sounds about right. And being able to allocate more resources to the game engine (512MB of flexible RAM) can be useful when doing test runs, etc.

I just can't see this being an operating system RAM reserve. Consoles barely have operating systems.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
I'm not sure why this is such a problem tbh, even if true. Check your Windows memory usage while playing any game you want and it's doubtful the game will be using much more than 1-2 GB max. How many people here have 4GB VRAM on their graphics cards? And of those who do, how many have ever run into a situation where 4GB wasn't enough?

Look at this 4K framebuffer information on hexus - http://hexus.net/tech/reviews/displays/57849-asus-pq321q-4k-gaming-tried-tested/?page=7

That's four 1080p screens' worth of pixels in Far Cry 3 at ultra settings with 4xMSAA, and it's only using 2.6GB of VRAM.
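For perspective, the render targets themselves are only a small slice of that; here is a rough upper bound, assuming 32-bit colour plus 32-bit depth/stencil and naive 4x multisample storage (real drivers compress, so actual usage is lower):

Code:
// Rough upper bound on 4K render-target memory with 4xMSAA.
// Assumes 4 bytes/pixel colour + 4 bytes/pixel depth-stencil, 4 samples each,
// plus a resolved single-sample colour buffer.
#include <cstdio>

int main() {
    const long long w = 3840, h = 2160;
    const long long bytes_color = 4, bytes_depth = 4, samples = 4;

    long long msaa_targets = w * h * (bytes_color + bytes_depth) * samples;
    long long resolve      = w * h * bytes_color;
    double total_mb = (msaa_targets + resolve) / (1024.0 * 1024.0);

    std::printf("4K + 4xMSAA render targets: ~%.0f MB\n", total_mb);  // ~285 MB
    // The other ~2.3GB of that 2.6GB reading is textures, geometry, etc.
    return 0;
}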

4GB of VRAM is for Eyefinity/NV Surround types of resolution, not 1080p and below. Even going forward, over the next 2-3 years 4GB of VRAM will be more than enough, so IF this is true, it'll be 3-4 years before it is even remotely an issue at 1080p.

What it doesn't explain is why the OS would need so much memory, but if both have independently decided to go down the same route then it's probably for an important reason.
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
I'm not sure why this is such a problem tbh, even if true. Check your Windows memory usage while playing any game you want and it's doubtful the game will be using much more than 1-2 GB max. How many people here have 4GB VRAM on their graphics cards? And of those who do, how many have ever run into a situation where 4GB wasn't enough?

Look at this 4K framebuffer information on hexus - http://hexus.net/tech/reviews/displays/57849-asus-pq321q-4k-gaming-tried-tested/?page=7

That's four 1080p screens' worth of pixels in Far Cry 3 at ultra settings with 4xMSAA, and it's only using 2.6GB of VRAM.

4GB of VRAM is for Eyefinity/NV Surround types of resolution, not 1080p and below. Even going forward, over the next 2-3 years 4GB of VRAM will be more than enough, so IF this is true, it'll be 3-4 years before it is even remotely an issue at 1080p.

What it doesn't explain is why the OS would need so much memory, but if both have independently decided to go down the same route then it's probably for an important reason.

It's not just VRAM, it's RAM + VRAM.

4.5GB of shared RAM for the game process + GPU.

There are games that are almost butting up against that wall now; it certainly is low for a console expected to last at least 6 years.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
I would wager that a single core on the PS4 or Xbox One is faster than all the threads on the Xbox 360.

I could be wrong, but look at what devs are able to do on ancient CPUs in the Xbox 360. People need to seriously stop comparing the hardware to high-end PCs and start comparing it to the current consoles. It paints a far more optimistic picture about the upcoming console generation.

I'd be willing to bet that the 360's CPU isn't actually that much slower at all. I have a hard time believing that a slimmed down, netbook/tablet core would be much faster clock for clock than a full-fledged CPU core from 7 years ago. The 360 is clocked almost twice as high on top of it.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
It's not just VRAM, it's RAM + VRAM.

4.5GB of shared RAM for the game process + GPU.

Yep, so take one of the recent AAA titles, run it at 4K with ultra settings and 4xMSAA, and it's still not using 4.5GB of VRAM and RAM combined. It's likely only around 3.5GB. We're talking about a recent game at the max settings you can realistically expect to play, at 4x the resolution (this is the important part) of the consoles' maximum. This amount of VRAM usage is years away from mainstream.

There are games that are almost butting up against that wall now; it certainly is low for a console expected to last at least 6 years.
What games?
 
Last edited:

The Alias

Senior member
Aug 22, 2012
647
58
91
I'd be willing to bet that the 360's CPU isn't actually that much slower at all. I have a hard time believing that a slimmed down, netbook/tablet core would be much faster clock for clock than a full-fledged CPU core from 7 years ago. The 360 is clocked almost twice as high on top of it.
There is so much more to it than clocks...
 

inf64

Diamond Member
Mar 11, 2011
3,698
4,018
136
Jaguar is miles ahead of the lowly PowerPC in the 360 in terms of IPC. We have discussed this many times before, with links to developers talking about it and articles covering the CPU microarchitecture in the Xbox 360.
 

Ventanni

Golden Member
Jul 25, 2011
1,432
142
106
I'd be willing to bet that the 360's CPU isn't actually that much slower at all. I have a hard time believing that a slimmed down, netbook/tablet core would be much faster clock for clock than a full-fledged CPU core from 7 years ago. The 360 is clocked almost twice as high on top of it.

I'd be willing to bet that the Xenon CPU is about as good as what you'd find in your average A9 smartphone (not clock for clock). It's a highly clocked yet simplistic in-order execution CPU that can run 2 threads per core (3 cores, 6 threads). By comparison, Jaguar is a fully out-of-order CPU with significantly higher IPC and instruction capabilities (6 cores, 6 threads). You also have to add in the fact that the GPU in the new PS4/XBone is significantly more powerful and versatile than in the consoles they're replacing, which can be useful for offloading some compute-heavy tasks.

Jaguar cores are also a lot beefier than Intel's Atom platform.
 
Last edited:

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
I'd be willing to bet that the 360's CPU isn't actually that much slower at all. I have a hard time believing that a slimmed down, netbook/tablet core would be much faster clock for clock than a full-fledged CPU core from 7 years ago. The 360 is clocked almost twice as high on top of it.

I take it you haven't been keeping up with the times. 7 years is a LONG time in semiconductors, in both process and design.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I'd be willing to bet that the 360's CPU isn't actually that much slower at all. I have a hard time believing that a slimmed down, netbook/tablet core would be much faster clock for clock than a full-fledged CPU core from 7 years ago. The 360 is clocked almost twice as high on top of it.
The bolded would be true. Neither the XB360 nor the PS3 had such CPU cores. They had high-GHz, low-IPC, hard-real-time-capable, in-order CPU cores whose only bright spot was single-precision vector arithmetic performance. The 360 definitely had too little cache to share amongst its threads, too.

NDAs have kept us from getting a complete picture of the performance, but there's no way such a CPU can handle branchy goodness well, and devs never contradicted that, either.
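As a purely illustrative sketch of the kind of code that punishes a narrow in-order core (not from any console codebase): pointer chasing with data-dependent branches stalls the pipeline on every cache miss and misprediction, while an out-of-order core like Jaguar can keep at least some other work in flight.

Code:
// Illustrative only: branchy, pointer-chasing game-logic style loop.
// An in-order core stalls on every dependent load and mispredicted branch;
// an out-of-order core can overlap some of that latency with other work.
#include <cstddef>

struct Entity {
    Entity* next;   // next pointer comes from the previous load (dependent chain)
    int     hp;
    bool    active;
};

int count_live_entities(const Entity* e) {
    int alive = 0;
    while (e != nullptr) {                 // loop branch depends on loaded data
        if (e->active && e->hp > 0) {      // data-dependent, hard-to-predict branch
            ++alive;
        }
        e = e->next;                       // cache-miss-prone dependent load
    }
    return alive;
}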
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Yep, so take one of the recent AAA titles, run it at 4K with ultra settings and 4xMSAA, and it's still not using 4.5GB of VRAM and RAM combined. It's likely only around 3.5GB. We're talking about a recent game at the max settings you can realistically expect to play, at 4x the resolution (this is the important part) of the consoles' maximum. This amount of VRAM usage is years away from mainstream.

What games?

Not really. BF3 multiplayer uses close to that amount.

Not to mention that these are, well, current games, and the PC versions are based on the console versions. With the next generation, expect more geometry, better animations, better lighting, increased physics, and larger, more open-world games, significantly increasing RAM requirements.

Most games today are using around 0.5-1.5GB of RAM for the game process and around 1-2GB of VRAM; 1.5-3.5GB in total.

Modern Games?

Company of Heroes 2

[Chart: Company of Heroes 2 system RAM usage]

[Chart: Company of Heroes 2 VRAM usage]


2.2GB of RAM for the game process and 2.5GB of VRAM at 1080p high; 4.7GB total. Even on low, this game is still chewing through nearly 3.5GB of combined RAM.

And let's not think about now; let's think 3+ years out.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Company of Heroes 2 is one of the worst-coded games in existence, and the thing here is that even a Titan can't hit 60fps at medium settings, 1080p.

[Chart: Company of Heroes 2 GPU benchmark, 1920x1080 medium]

The 7870 is at 40fps. The PS4 is going to run into a shader limit long before it gets near a VRAM limit of 3GB. It's a bit like putting 2GB on a GTX 650: giving it double the memory it'll ever be able to effectively use.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
Company of Heroes 2 is one of the worst-coded games in existence, and the thing here is that even a Titan can't hit 60fps at medium settings, 1080p.

[Chart: Company of Heroes 2 GPU benchmark, 1920x1080 medium]

The 7870 is at 40fps. The PS4 is going to run into a shader limit long before it gets near a VRAM limit of 3GB. It's a bit like putting 2GB on a GTX 650: giving it double the memory it'll ever be able to effectively use.


I think it's a CPU bottleneck; they used a 3960X for the GPU tests.

[Chart: Company of Heroes 2 CPU benchmark]

PC CoH2 on the PS4's CPU would probably run at 20fps anyway.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Company of Heroes 2 is one of the worst-coded games in existence, and the thing here is that even a Titan can't hit 60fps at medium settings, 1080p.

The 7870 is at 40fps. The PS4 is going to run into a shader limit long before it gets near a VRAM limit of 3GB. It's a bit like putting 2GB on a GTX 650: giving it double the memory it'll ever be able to effectively use.

It's a console; 40fps is sufficient. The CPU should be able to hit 30fps.

And there are times where >1GB is useful on a 650 (I have a 2GB 660M, and in Crysis 3, BioShock Infinite, Metro 2033, and Metro: Last Light I have seen more than a GB used at playable settings).
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Sony commented on it:
http://vr-zone.com/articles/sony-we...nding-ps4s-ram-pool-for-developers/47932.html

We would like to clear up a misunderstanding regarding our “direct” and “flexible” memory systems. The article states that “flexible” memory is borrowed from the OS, and must be returned when requested – that’s not actually the case.


The actual true distinction is that:
  • “Direct Memory” is memory allocated under the traditional video game model, so the game controls all aspects of its allocation
  • “Flexible Memory” is memory managed by the PS4 OS on the game’s behalf, and allows games to use some very nice FreeBSD virtual memory functionality. However this memory is 100 per cent the game’s memory, and is never used by the OS, and as it is the game’s memory it should be easy for every developer to use it.
.............

“We understand that this is a 1GB virtual address space, split into two areas – 512MB of on-chip RAM is used (the physical area) and another 512MB is ‘paged,’ perhaps like a Windows swap file,” the site said. “But to be clear, of the 8GB of GDDR5 on PS4, our contention is that 5GB of it is available to developers.”

It seems both Sony and MS are now slowly realising that their consoles are fitted with too little memory. MS is considering going to 12GB. And Sony now uses flexible memory, including paging, to address it.
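To make the "flexible memory" idea a bit more concrete: the Sony statement above says the OS manages this memory on the game's behalf and exposes FreeBSD virtual-memory functionality. Here is a minimal sketch of what that could look like using only standard FreeBSD/POSIX calls; the real PS4 SDK API is not public, so the calls and sizes are purely illustrative of the direct-vs-flexible distinction:

Code:
// Illustrative sketch only: an OS-managed, pageable "flexible" allocation via
// standard virtual-memory calls. "Direct" memory, by contrast, would be a fixed
// physical allocation the game manages entirely by itself.
#include <sys/mman.h>
#include <cstddef>
#include <cstdio>

int main() {
    const std::size_t FLEXIBLE_POOL = 512ull * 1024 * 1024;  // rumoured 512MB flexible pool

    // Reserve a virtual range; the OS backs it with physical pages on demand
    // and is free to page parts of it when memory is tight.
    void* flex = mmap(nullptr, FLEXIBLE_POOL, PROT_READ | PROT_WRITE,
                      MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (flex == MAP_FAILED) {
        std::perror("mmap");
        return 1;
    }

    // Tell the kernel a sub-range is no longer needed so its pages can be reclaimed.
    madvise(static_cast<char*>(flex) + 256ull * 1024 * 1024,
            128ull * 1024 * 1024, MADV_DONTNEED);

    munmap(flex, FLEXIBLE_POOL);
    return 0;
}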
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Steam Hardware Survey - http://store.steampowered.com/hwsurvey

Most popular VRAM = 1GB with 33%. Second most popular = 512MB with 15%. Less than 20% of people have more than 1GB (and I suspect many of those are APUs using system RAM), and of those, less than 1.5% have more than 3GB. It will be 3 years or longer before 2GB becomes the most popular, and even then it'll still be at around the same level 1GB is at today. Probably 5 years or longer before 3GB reaches the 33% popularity stage.

Needing 5GB or more (GPU and system memory combined) just won't be an issue for PC gamers over the next 5 years, let alone console gamers.
 

itsmydamnation

Platinum Member
Feb 6, 2011
2,769
3,144
136
Sony commented on it:
http://vr-zone.com/articles/sony-we...nding-ps4s-ram-pool-for-developers/47932.html



It seems both Sony and MS are now slowly realising that their consoles are fitted with too little memory. MS is considering going to 12GB. And Sony now uses flexible memory, including paging, to address it.


No they haven't; you're just a troll who is like 1-2 months late on stuff. While you're fixated on the amount of memory, you're forgetting the variable that's just as important, if not more so, with memory pools of the size we are talking about here.

You also have to take into account the reduction in memory footprint from a unified address space when comparing to a PC, and also consider what is now part of the OS that in previous gens was "game memory".

Hey now, take it easy. There's no reason to be this hostile
-ViRGE
 
Last edited by a moderator:

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Needing 5GB or more (GPU and system memory combined) just won't be an issue for PC gamers over the next 5 years, let alone console gamers.
In other news, 640K is enough for everybody.

We already have issues with 1GB of VRAM (see those textures? See how they're not really sharp? Exactly!). The primary reason we don't have more on average has nothing to do with not being able to use it, and everything to do with cost. If games used and required more, it would matter more. If we had more on average, games would use more, instead of shipping with down-scaled textures. It's a chicken-and-egg problem, unless you play games that are amenable to mods, in which case >1GB has been useful for a few years, and more wouldn't hurt.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
Yep, also interesting how he "missed" this part of the article:

The users at NeoGAF have a different story, however: an insider by the name of Thuway has hinted that the PlayStation 4 might reserve as much as six gigs of memory for developers, with two gigs left over for the OS. Thuway posted an update wherein he outright disclosed that there are “[PS4]titles in development that are using 6GB’s of RAM”.
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
In other news, 640K is enough for everybody.

We already have issues with 1GB of VRAM (see those textures? See how they're not really sharp? Exactly!). The primary reason we don't have more on average has nothing to do with not being able to use it, and everything to do with cost. If games used and required more, it would matter more. If we had more on average, games would use more, instead of shipping with down-scaled textures. It's a chicken-and-egg problem, unless you play games that are amenable to mods, in which case >1GB has been useful for a few years, and more wouldn't hurt.

It's not a chicken-and-egg problem, it's a balance problem. All that bandwidth and all that memory need powerful shaders to push them as well. The point here is that the shader performance of the PS4 is not high enough for 4GB+ of VRAM to be a requirement. If it were, don't you think Pitcairn would be using 4GB?

Look at the conclusion of this review of the 7850 4GB -

http://www.guru3d.com/articles_pages/his_radeon_7850_ipower_iceq_4gb_review,25.html

2GB is sufficient enough for a mainstream graphics card. You can't see any performance benefit unless you pass 2560x1600 combined with silly high AA levels ... and at such resolutions we doubt very much you'd be opting a Radeon HD 7850 based graphics solution. It is a cool feature though and we are certain that in the following year there will be a title that eats more than 2 GB (GTA anyone?). But realistically, overall you are going to see very little benefit from going from 2GB towards 4GB unless you'd be in the extreme resolution ranges.
And yes, that's about cost. Just as soon as we all have Titan-level graphics performance, the devs will be making games that use it. Until then, the majority with much lower specs will get what they want. It's always been that way and always will be.
 
Last edited:

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
It's not a chicken-and-egg problem, it's a balance problem. All that bandwidth and all that memory need powerful shaders to push them as well. The point here is that the shader performance of the PS4 is not high enough for 4GB+ of VRAM to be a requirement. If it were, don't you think Pitcairn would be using 4GB?
No. I think you missed the point, which was about why games currently only typically need X amount of VRAM (unmodded). With 1GB of VRAM being typical, higher-res textures must be loaded and unloaded a lot, often streamed partly from disk, or must use GPU time to decompress (like Civ V).

It's not a bandwidth problem, unless you already assume that it would be impossible to make use of more VRAM at some given resolution; i.e., if you assume that 1GB is enough for 1080p, then more VRAM is only needed for pushing more pixels to more buffers. I don't even play at 1080p (1680x1050), and I often get annoyed at only having 1GB with a now-lowly GTX 460. Some games I play can use >2GB now, and more already exist if you're talking multiplayer, and surely more are on the way. What happens if you run up against the limit is that the game gets jerky, due to having to swap textures in and out of VRAM all the time. The answers are either pop-in/out due to streaming, or using lower-res textures. More VRAM would do the job better, without any more GPU power or higher settings.

I'm talking about going up to something and it being fuzzy. The artist who made it likely had a much higher-res texture, but you'll never see it. Not because you don't have enough VRAM bandwidth, but because system RAM is very far away, so the less swapping the better, yet you don't have much VRAM to swap to and from. The bandwidth and latency problem is on PCIe and your CPU, not the video card; the video card end mostly has a capacity/cost problem. Even with good asset re-use there are limits without simply having more RAM. But a game developer simply cannot ignore what most people are running. Some, like Crytek, I applaud for having options that can push the limits of what's available today. But I can also understand and respect the decision by most devs/publishers not to.
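As a purely illustrative sketch of that compromise (not any real engine's code): when the VRAM budget is tight, a streaming system ends up keeping only a lower-resolution mip of the artist's texture resident, which is exactly the fuzziness described above.

Code:
// Illustrative only: pick the highest texture resolution that fits a VRAM budget.
// A tight budget means blurrier textures up close (or pop-in while streaming).
#include <cstdint>
#include <cstdio>

// Bytes for a square RGBA8 texture of size 'dim', including its mip chain
// (the chain adds roughly one third on top of the base level).
std::uint64_t texture_bytes(std::uint32_t dim) {
    std::uint64_t base = static_cast<std::uint64_t>(dim) * dim * 4;
    return base + base / 3;
}

// Drop mip levels (halving resolution each time) until the texture fits.
std::uint32_t pick_resident_size(std::uint32_t authored_dim, std::uint64_t budget_bytes) {
    std::uint32_t dim = authored_dim;
    while (dim > 64 && texture_bytes(dim) > budget_bytes) {
        dim /= 2;   // each drop is a visibly blurrier texture up close
    }
    return dim;
}

int main() {
    // The artist authored 4096x4096, but only ~8MB of budget is left for it.
    std::uint32_t shown = pick_resident_size(4096, 8ull * 1024 * 1024);
    std::printf("Resident: %ux%u of an authored 4096x4096\n",
                static_cast<unsigned>(shown), static_cast<unsigned>(shown));
    // With a larger budget (more VRAM), the full-res version could stay resident.
    return 0;
}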

In other words, the chicken-and-egg thing is about why 1GB is usually enough. It's not because the people making assets for games couldn't make them more detailed, but because, without more people having more VRAM, there isn't enough incentive to make them more detailed, and possibly a disincentive towards trying (notice how games that push HW boundaries get called "poorly coded" and such, until the hardware catches up?). Yet because game makers choose to limit their content, there also isn't enough incentive for users [of unmodded games] to get more VRAM, unless they need to fill more buffers faster, because the devs have already made compromises to make <=1GB work well, even at the cost of noticeable IQ for objects that aren't at a great distance.
 
Last edited: