Digital Foundry: next-gen PlayStation and Xbox to use AMD's 8-core CPU and Radeon HD


gorobei

Diamond Member
Jan 7, 2007
4,117
1,624
136
the only chance physx had of becoming a de facto standard was when they got UE3 to support it natively and the next-gen console gpu was up for grabs. if nv had been able to get their gpus into all the consoles, all the game devs would have used physx since it was part of UE, more or less free, and available on all the systems of the installed base of console-users/game-buyers.

that chance died when amd got all 3 consoles. if Havok is available on OpenCL at low enough cost, coding for cpu/apu game dynamics will likely be more prevalent, since any port back to pc means you can use intel execution units or amd apu/gcn, as they are guaranteed to be present.

game devs are opportunistic. the most common denominator hardware spec determines the maximum number of possible sales. cutting the possible sales in half or to a third by requiring physx for non-trivial dynamics (walls destroyed, bridges burned, rubble blocking a door) is pure suicide unless nv or amd pays them enough to offset the loss in sales. that's why they code 2 paths: non-physx (for 100% of users/buyers) and physx (for an xx% subset of users/buyers), so they don't lose sales and because nv pays for putting in the physx code. see the sketch below.
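To make the two-path point concrete, here is a minimal C++ sketch of the kind of runtime branch a shipping game ends up with. Every name in it is a hypothetical stand-in, not the real PhysX API:

```cpp
// Minimal sketch of the two-path approach described above.
// All names here are hypothetical stand-ins, not the real PhysX API.
#include <iostream>

enum class PhysicsPath { CpuBaseline, GpuAccelerated };

// Stand-in for a real capability query (e.g. checking for a suitable GPU).
bool gpuPhysicsAvailable() { return false; }  // assume no vendor GPU present

PhysicsPath selectPhysicsPath() {
    // Path 1: baseline CPU physics -- ships to 100% of buyers.
    // Path 2: GPU-accelerated extras -- only for the subset with the right card.
    return gpuPhysicsAvailable() ? PhysicsPath::GpuAccelerated
                                 : PhysicsPath::CpuBaseline;
}

int main() {
    switch (selectPhysicsPath()) {
        case PhysicsPath::GpuAccelerated:
            std::cout << "Enabling extra debris/cloth/fluid effects\n";
            break;
        case PhysicsPath::CpuBaseline:
            std::cout << "Gameplay-critical physics only, on the CPU\n";
            break;
    }
}
```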
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
I would love for Intel to allow its iGPUs to run Havok physics and allow discrete GPUs to do the rendering grunt work. No need to add another discrete GPU to take over the physics calculations. That would be epic on the PC and the true death knell of PhysX.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
False, the bolded part...but nice fallacy, taking your subjective, personal view and presenting it like a fact. :thumbsdown:

What has happened is that the CPU has held back physics for years.
And you are still advocating letting the lesser performing hardware do the job...:thumbsdown:
What PC game uses the GPU for integral physics, that cannot be configured to be off, or to a lesser or greater degree? How does it make use of both Geforces and Radeons? I have yet to hear about this game. It's not a subjective matter. Either the game has mechanics that utilize the accelerated physics, or it doesn't. When you can turn it off to get some FPS, that's a clear sign it doesn't. While fewer in number, there have been Havok games of this nature, just like the common PhysX ones (SC II is the most notable example I can think of).

Until the above exists, any argument that the CPU is holding GPUs back is completely bogus. The CPU is used because it makes more sense to use it. Intel's CPUs are getting ever better at SIMD, too, and will soon rival mid-range graphics cards in peak performance. With fast access to the shared LLC, they will leave GPUs in the dust for work like physics.

I advocate for something that works. I had hopes for GPGPU, like everybody else, but for physics it just doesn't do the best job, and every time it looks like it might be getting closer, our CPUs get far better at it. So no, I advocate for the hardware that's closest to the main game engine code: the vector scheduling units and registers of the CPU.
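As a rough illustration of the kind of SIMD physics work being described here (a sketch, not anyone's engine code): integrating particle positions eight floats at a time with AVX intrinsics. Build with -mavx:

```cpp
// Sketch of SIMD physics on the CPU: integrate positions 8 lanes at a time.
// Standalone illustration; compile with -mavx.
#include <immintrin.h>
#include <cstdio>

void integrate(float* pos, const float* vel, int n, float dt) {
    __m256 vdt = _mm256_set1_ps(dt);
    int i = 0;
    for (; i + 8 <= n; i += 8) {                      // 8 floats per iteration
        __m256 p = _mm256_loadu_ps(pos + i);
        __m256 v = _mm256_loadu_ps(vel + i);
        p = _mm256_add_ps(p, _mm256_mul_ps(v, vdt));  // p += v * dt
        _mm256_storeu_ps(pos + i, p);
    }
    for (; i < n; ++i) pos[i] += vel[i] * dt;         // scalar tail
}

int main() {
    float pos[16] = {0}, vel[16];
    for (int i = 0; i < 16; ++i) vel[i] = 1.0f + i;
    integrate(pos, vel, 16, 0.016f);                  // one ~60 Hz timestep
    std::printf("pos[15] = %f\n", pos[15]);
}
```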
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
What PC game uses the GPU for integral physics, that cannot be configured to be off, or to a lesser or greater degree? How does it make use of both Geforces and Radeons? I have yet to hear about this game. It's not a subjective matter. Either the game has mechanics that utilize the accelerated physics, or it doesn't. When you can turn it off to get some FPS, that's a clear sign it doesn't. While fewer in number, there have been Havok games of this nature, just like the common PhysX ones (SC II is the most notable example I can think of).

Until the above exists, any argument that the CPU is holding GPUs back is completely bogus. The CPU is used because it makes more sense to use it. Intel's CPUs are getting ever better at SIMD, too, and will soon rival mid-range graphics cards in peak performance. With fast access to the shared LLC, they will leave GPUs in the dust for work like physics.

I advocate for something that works. I had hopes for GPGPU, like everybody else, but for physics it just doesn't do the best job, and every time it looks like it might be getting closer, our CPUs get far better at it. So no, I advocate for the hardware that's closest to the main game engine code: the vector scheduling units and registers of the CPU.

The game engine physics are far LESS complicated than what NV PhysX is coded to run on the GPUs. CPUs may be able to calculate bullet trajectories and other simple physics, but the calculations required to run NV PhysX effects (liquids, destruction etc) will bring any current single-socket desktop CPU to its knees.

It is a pity that we don't have more games utilizing more physics and NV PhysX effects. I hope that OpenCL will let more physics be used in future games, able to run on both AMD and NV hardware.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
The game engine physics are far LESS complicated than what NV PhysX is coded to run on the GPUs. CPUs may be able to calculate bullet trajectories and other simple physics, but the calculations required to run NV PhysX effects (liquids, destruction etc) will bring any current single-socket desktop CPU to its knees.

It is a pity that we don't have more games utilizing more physics and NV PhysX effects. I hope that OpenCL will let more physics be used in future games, able to run on both AMD and NV hardware.
Has PhysX fixed its call-per-entity problem (too many calls, instead of a batch list to be sent to the HW; see the sketch below)?

Has PhysX gotten to work on non-Geforces (answer: no)?

Why haven't the console versions had similar problems, doing it on the CPU?

And, again, what game has used physics on the GPU (PhysX is one of only 2 options available, but this is not a problem limited to PhysX) to implement game mechanics (there have been a few CPU-based ones)? TMK, no commercial game has done this (there have been some user mods that have, though, IIRC). To date, has any Bullet game used GPU OpenCL, for instance?

It looks fine on paper, but when it counts, it ends up being done on the CPU (the PS3's PPU did it better than most of our desktops). And the CPU is only getting faster: 2x last gen, then another 4-8x this year. My bet is on AVX2, much more than on whatever the GPU will have. Of course, Sony won't have that on the PS4, so they will have to work with 128-bit AVX vs. the GPU.
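For readers unfamiliar with the call-per-entity problem mentioned above, here is a hedged sketch; the two submit functions are hypothetical stand-ins for a real physics/driver API boundary:

```cpp
// Sketch of the "call-per-entity" problem: hypothetical submit functions
// standing in for a real physics/driver API.
#include <vector>

struct Body { float pos[3]; float vel[3]; };

// Hypothetical: each call crosses the API/driver boundary (per-call overhead).
void submitOne(const Body&) { /* overhead paid on every call */ }

// Hypothetical: one call hands the hardware a whole batch (overhead amortized).
void submitBatch(const Body* bodies, size_t count) { (void)bodies; (void)count; }

int main() {
    std::vector<Body> bodies(10000);

    // Anti-pattern: 10,000 boundary crossings per frame.
    for (const Body& b : bodies) submitOne(b);

    // Better: one crossing, contiguous data the hardware can stream.
    submitBatch(bodies.data(), bodies.size());
}
```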
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
I think you're the one confused? I think everyone knows that consoles tend to get more out of the same hardware. It's kinda obvious. But it's still going to be less than top-of-the-line PC capabilities from the same period.

The 8800gts was top of the line 3 days BEFORE the PS3 came out. The PS3 has not matched that level. You simply don't seem to get that the PS3 has to cut tons of corners just to get playable framerates. 1------->It's hilarious to use FC3 as another example since that one was cut back on the PS3 as well (it's running at 704p like BF3 and can't even maintain 30FPS). Looks like an 8800gts can at least get 30FPS on medium settings at 720p from a bit of googling (people complaining the PS3 version looks horrible in comparison too). So even this unoptimized-for-PC game is beaten.



2------->The top of the line right now is the GTX Titan. It's out at least a few months before the PS4 will be out. The "7860" in the PS4 at the end of its lifetime will at best only match the Titan.

1. Dude, I never even mentioned FC3, I SAID CRYSIS 3.

2. omg this is what I was arguing the whole time, and now you're repeating it after arguing against me? lol wtf??? I never said a Titan though, that might be pushing it; I said it'd equal a 7950/7970 in terms of performance.

I refuse to discuss anything with you though, 'cause at the end of all that you basically said what I was saying! in fact I'm staying the hell away from this thread, people are just arguing for the sake of arguing even though we all agree the PS4 is gonna be great for PC. and yet some of you guys still insist on arguing nonsensical details.
 

ChronoReverse

Platinum Member
Mar 4, 2004
2,562
31
91
No, you're not saying what I'm saying. A 7970 isn't the top of the line. The GTX Titan is, and it's almost certain that the PS4 will never reach the level the GTX Titan can, just like the PS3 never reached what the 8800gts did.

It's nice that you're going to refuse to "discuss", because your point is incorrect no matter what engine we're talking about. The PS3 version of Crysis 3 is also a 720p game with much reduced graphical quality compared to the higher resolutions, larger textures, greater draw distances, more complex shaders and 60FPS standard that PC gaming aims for.

The standard on the PC is much greater than that except when dealing with integrated graphics. Again, the PS4 is a very interesting console, and unless Sony does something strange, it should be a good console. But the performance advantage will still remain with the (more expensive) PC platform.
 
Apr 20, 2008
10,067
990
126
I can't believe some of you are calling the Jaguar CPU slow for what it will be doing. As stated numerous times, the overhead of Windows (and DirectX/OpenGL) is very, very high.

I played Madden 2008 on the Xbox console. When I inherited a Pentium 3 866MHz with 128MB RAM, I quickly loaded XP on it and slapped in a Geforce 5500 (256MB, 128-bit DDR, MUCH faster than the Geforce 3 500 in the Xbox). All in all, much better specs than the original Xbox.

Since Madden on the Xbox was running at high settings at 640x480, I decided to put those settings on my PC as well. What happened when I entered a game? It was a damn slideshow. Completely unplayable. Seemingly 3-6fps. The Xbox was running at 30FPS. Even on all low, the fps couldn't have been even 10. It was god awful.

P3 733 vs P3 866
64MB shared RAM vs 128MB system RAM
Geforce 3 500 shared RAM vs Geforce 5500 256MB 128-bit
Dedicated audio chip vs Audigy 1
Xbox OS vs Windows XP SP1
30fps constant vs 6fps

See what I'm getting at?
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I can't believe some of you are calling the Jaguar CPU slow for what it will be doing. As stated numerous times, the overhead of Windows (and DirectX/OpenGL) is very, very high.

I played Madden 2008 on the Xbox console. When I inherited a Pentium 3 866MHz with 128MB RAM, I quickly loaded XP on it and slapped in a Geforce 5500 (256MB, 128-bit DDR, MUCH faster than the Geforce 3 500 in the Xbox). All in all, much better specs than the original Xbox.

Since Madden on the Xbox was running at high settings at 640x480, I decided to put those settings on my PC as well. What happened when I entered a game? It was a damn slideshow. Completely unplayable. Seemingly 3-6fps. The Xbox was running at 30FPS. Even on all low, the fps couldn't have been even 10. It was god awful.

P3 733 vs P3 866
64MB shared RAM vs 128MB system RAM
Geforce 3 500 shared RAM vs Geforce 5500 256MB 128-bit
Dedicated audio chip vs Audigy 1
Xbox OS vs Windows XP SP1
30fps constant vs 6fps

See what I'm getting at?


Yup...bad porting...
 
Apr 20, 2008
10,067
990
126
Yup...bad porting...

You're joking, right? Madden, along with all the 2002-2010 EA Sports PC games, is not a port. All are compiled for x86, 32-bit processors. There is no "porting" of this game.

Porting is recompiling code for a different platform, and bad porting is when little to no effort at code optimization is observed. See GTA4 on Xbox 360 vs PC for an example of bad porting.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
You're joking, right? Madden, along with all the 2002-2010 EA Sports PC games, is not a port. All are compiled for x86, 32-bit processors. There is no "porting" of this game.

Porting is recompiling code for a different platform, and bad porting is when little to no effort at code optimization is observed. See GTA4 on Xbox 360 vs PC for an example of bad porting.

http://forum.ea.com/eaforum/posts/list/8534394.page

Why is the truth always absent when people defend consoles?
 
Apr 20, 2008
10,067
990
126
http://forum.ea.com/eaforum/posts/list/8534394.page

Why is the truth always absent when people defend consoles?

Defend consoles? Hardly. I have a (modded) Wii and my gaming PC. That's it. And I have no real interest in the next set of consoles either. With the architecture going back to x86 and heavily multithreaded, I'm quite happy to stay a PC gamer.

Also, I really don't know what you're getting at with that link. Once again you're being vague, irrelevant and likely inflammatory towards others, much like you are in VC&G. If you have a response, make it a real fucking response, and not just some rhetorical troll shit. The only thing in that thread that supports your point is that some random poster (and I mean random, as in a dude with 14 posts) says that it is a PS2 port, when it wasn't. The 2001 PC variant was still the old PS1/N64-style Madden on the PC. The 2002 PC version incorporated the Xbox version, and it kept going from there.

If you were a sports gamer, and a follower of the franchise over multiple consoles and years, you would know this.
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
No, you're not saying what I'm saying. A 7970 isn't the top of the line. The GTX Titan is, and it's almost certain that the PS4 will never reach the level the GTX Titan can, just like the PS3 never reached what the 8800gts did.

It's nice that you're going to refuse to "discuss", because your point is incorrect no matter what engine we're talking about. The PS3 version of Crysis 3 is also a 720p game with much reduced graphical quality compared to the higher resolutions, larger textures, greater draw distances, more complex shaders and 60FPS standard that PC gaming aims for.

The standard on the PC is much greater than that except when dealing with integrated graphics. Again, the PS4 is a very interesting console, and unless Sony does something strange, it should be a good console. But the performance advantage will still remain with the (more expensive) PC platform.

lol are you reading what you're writing? You said:

"The "7860" in the PS4 at the end of its lifetime will at best only match the Titan."

YOU ARE SAYING THE PS4 MIGHT, "AT BEST", MATCH A TITAN. and then you argue with me that it won't match a 7970 because it's not as good as a 7950, but it might be as good as a Titan?? You DO realize a Titan outperforms a 7970, right? PLEASE put down the crack pipe, WE ARE ARGUING THE SAME THING.

Now look at my system specs and answer me this: DO I GIVE A RAT'S ASS? I'm gonna be gaming with my GTX 670 and 16GB RAM & 4.4GHz Sandy Bridge till well into 2015, and it'd be nice to see something challenge it over the next 2 years 'cause EVERYTHING runs @ 60fps on my setup. That's the only reason I'm interested in the PS4.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Defend consoles? Hardly. I have a (modded) Wii and my gaming PC. That's it. And I have no real interest in the next set of consoles either. With the architecture going back to x86 and heavily multithreaded, I'm quite happy to stay a PC gamer.

Also, I really don't know what you're getting at with that link. Once again you're being vague, irrelevant and likely inflammatory towards others, much like you are in VC&G. If you have a response, make it a real fucking response, and not just some rhetorical troll shit. The only thing in that thread that supports your point is that some random poster (and I mean random, as in a dude with 14 posts) says that it is a PS2 port, when it wasn't. The 2001 PC variant was still the old PS1/N64-style Madden on the PC. The 2002 PC version incorporated the Xbox version, and it kept going from there.

If you were a sports gamer, and a follower of the franchise over multiple consoles and years, you would know this.

I know Madden has always been ported badly from consoles to the PC...you don't need to be a console gamer to know that...just a geek.
Try googling a bit...your FUD about "Madden" not being a console port is a joke.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
False, the bolded part...but nice fallacy, taking your subjective, personal view and presenting it like a fact. :thumbsdown:

What has happened is that the CPU has held back physics for years.
And you are still advocating letting the lesser performing hardware do the job...:thumbsdown:

It's because the CPU is the realistic place to do the job, where all your code can be executed and latency isn't an issue. Realtime GPGPU with fast but accurate collision is difficult due to the latency between the CPU and GPU. An APU benefits not only from the combination of a CPU and GPU but from homogenized RAM addressing. The APU was Sony's best option for the PS4 for expanding on Cell's GFLOPS potential while simultaneously providing the necessary hardware flexibility and performance for developers without an expensively customized core design.

I hope it trickles down to the PC space. It would be cool to see devs actually support high-end physics on AMD APUs thanks to the headway made on the PS4.
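A minimal sketch of the latency problem described here, with hypothetical stand-in functions for a real compute API; the point is the synchronous read-back that game logic has to wait on every frame:

```cpp
// Sketch of why gameplay physics on a discrete GPU is latency-bound:
// game logic cannot advance until results cross the bus back to the CPU.
// All functions are hypothetical stand-ins for a real compute API.
struct CollisionResults { /* contact points, etc. */ };

void uploadSceneToGpu() {}                 // CPU -> GPU copy over PCIe
void launchCollisionKernel() {}            // asynchronous GPU work
CollisionResults readBackResults() {       // GPU -> CPU copy; CPU stalls here
    return CollisionResults{};
}
void runGameLogic(const CollisionResults&) {}  // needs the results this frame

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        uploadSceneToGpu();
        launchCollisionKernel();
        // A 60 fps frame is ~16 ms; two bus round trips plus kernel launch
        // overhead eat a meaningful slice of that before any logic can run.
        CollisionResults r = readBackResults();
        runGameLogic(r);
    }
}
```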
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
It's because the CPU is the realistic place to do the job, where all your code can be executed and latency isn't an issue. Realtime GPGPU with fast but accurate collision is difficult due to the latency between the CPU and GPU. An APU benefits not only from the combination of a CPU and GPU but from homogenized RAM addressing. The APU was Sony's best option for the PS4 for expanding on Cell's GFLOPS potential while simultaneously providing the necessary hardware flexibility and performance for developers without an expensively customized core design.

I hope it trickles down to the PC space. It would be cool to see devs actually support high-end physics on AMD APUs thanks to the headway made on the PS4.

An APU ain't much different from a CPU+GPU. It's just thrown on the same package, but the internal link is the same. Less latency, yes, but not any revolution either. The closest thing yet is Intel's shared L3. But the CPU still wastes quite a lot of cycles waiting on the GPU.

I can only say I agree with Cerb. Physics besides glitter will always be CPU based.
 

Stoneburner

Diamond Member
May 29, 2003
3,491
0
76
Stupid Question: What's the purpose of the Jaguar GPU if the "7860" will be running the graphics? Will they use the GPU compute functions of AMD APUs?
 

inf64

Diamond Member
Mar 11, 2011
3,884
4,692
136
Jaguar is just the name of the x86 core. The name of the SOC that has Jaguar and a GCN GPU on the same die is Kabini/Temash. This is what AMD will launch for the mobile segment.
The PS4's custom APU is a Jaguar x86 core + special IMC + Pitcairn-like GPU, all on the same die. There will also be some dedicated HW blocks not in Kabini. This SOC won't be sold separately but only within the PS4.
 

blckgrffn

Diamond Member
May 1, 2003
9,687
4,348
136
www.teamjuchems.com
Jaguar is just the name of the x86 core. The name of the SOC that has Jaguar and a GCN GPU on the same die is Kabini/Temash. This is what AMD will launch for the mobile segment.
The PS4's custom APU is a Jaguar x86 core + special IMC + Pitcairn-like GPU, all on the same die. There will also be some dedicated HW blocks not in Kabini. This SOC won't be sold separately but only within the PS4.

Slow down there. You aren't allowed to bring actual information in here anymore, just some speculation. That is all ;)

I, for one, don't see what the concern is. It's still going to be vastly better than what was there before, have a lot of FPU power, some special and exclusive go juice, etc.

Crap folks, you know that cross-platform titles will have to work on the WiiU, right? It's likely this thing will only be able to stretch its legs on exclusives.

Or maybe the WiiU is about to get cut out like the Wii did, if the Xbox 3/PS4 are similar enough?
 
Apr 20, 2008
10,067
990
126
Slow down there. You aren't allowed to bring actual information in here anymore, just some speculation. That is all ;)

I, for one, don't see what the concern is. It's still going to be vastly better than what was there before, have a lot of FPU power, some special and exclusive go juice, etc.

Crap folks, you know that cross-platform titles will have to work on the WiiU, right? It's likely this thing will only be able to stretch its legs on exclusives.

Or maybe the WiiU is about to get cut out like the Wii did, if the Xbox 3/PS4 are similar enough?

Sadly, yes.

I foresee most of the current developers keeping a team or two each to keep developing games on the PowerPC architecture or taking ports from the Xbox and PS4 and scaling them down to work on the Wii U.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
Slow down there. You aren't allowed to bring actual information in here anymore, just some speculation. That is all ;)

I, for one, don't see what the concern is. It's still going to be vastly better than what was there before, have a lot of FPU power, some special and exclusive go juice, etc.

Crap folks, you know that cross-platform titles will have to work on the WiiU, right? It's likely this thing will only be able to stretch its legs on exclusives.

Or maybe the WiiU is about to get cut out like the Wii did, if the Xbox 3/PS4 are similar enough?

Digital Foundry did some investigating into the GPU in the Wii U, and found that it seems closest to the Radeon HD 4670. No, I did not mistype that. It's essentially an adapted RV730. It's got 320 shaders, probably VLIW5. There's no way it can compete against the 1152 GCN shaders in the PS4 or even the speculated 768 GCN shaders in the next Xbox.

Hardware tessellation and compute functionality will be very limited, if present at all (the Radeon HD 4000 series did have an AMD proprietary pre-DX11 hardware tessellator, so the Wii U probably does too. But then, the Xbox 360's Xenos GPU also has a hardware tessellator. No idea if anyone does/will actually use either).

The Wii U has only 8 ROPs -- 8! -- and 16 texture units. The PS4 probably has the Pitcairn-standard 32 ROPs. With that few ROPs it's doubtful that the Wii U can actually handle demanding games smoothly at 1080p (rough fill-rate numbers below). And while the PS4 and next Xbox's 8 Jaguar cores may seem a little lean on the clock speed side, they probably leave the Wii U's three 1.2 GHz PowerPC cores in the dust.

The thing is barely better than an Xbox 360. In their quest for efficiency, Nintendo will again be a whole generation behind their competition. It's worse this time around, really. At least with the last generation, everything was PowerPC based, so there was a common code base that made it easier to program/port things over to the Wii. With the PS4 and next Xbox being x86, that's another reason for developers to just pass on trying a Wii U port.
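A back-of-the-envelope fill-rate comparison of the ROP counts above. The ~550 MHz Wii U GPU clock is the commonly reported figure and is an assumption here; the PS4 GPU runs at 800 MHz:

```cpp
// Rough peak pixel fill-rate comparison: ROPs x GPU clock.
// The ~550 MHz Wii U clock is an assumption (commonly reported figure).
#include <cstdio>

int main() {
    double wiiu = 8 * 550e6;    // 8 ROPs at ~550 MHz  -> ~4.4 Gpixel/s
    double ps4  = 32 * 800e6;   // 32 ROPs at 800 MHz  -> ~25.6 Gpixel/s
    std::printf("Wii U peak fill: %.1f Gpixel/s\n", wiiu / 1e9);
    std::printf("PS4  peak fill: %.1f Gpixel/s (%.1fx)\n",
                ps4 / 1e9, ps4 / wiiu);
}
```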
 

Stoneburner

Diamond Member
May 29, 2003
3,491
0
76
Jaguar is just the name of the x86 core. The name of the SOC that has Jaguar and a GCN GPU on the same die is Kabini/Temash. This is what AMD will launch for the mobile segment.
The PS4's custom APU is a Jaguar x86 core + special IMC + Pitcairn-like GPU, all on the same die. There will also be some dedicated HW blocks not in Kabini. This SOC won't be sold separately but only within the PS4.

Never mind, I read the PS4 information incorrectly. It will be Jaguar CPU cores with an integrated "7860". I was under the impression the "7860" would be separate.

7850+ levels of performance on an APU... God, I hope they make an FM2 version of this.
 

NTMBK

Lifer
Nov 14, 2011
10,526
6,051
136
Never mind, I read the PS4 information incorrectly. It will be Jaguar CPU cores with an integrated "7860". I was under the impression the "7860" would be separate.

7850+ levels of performance on an APU... God, I hope they make an FM2 version of this.

It would never work with FM2. 7850 levels of performance require 7850 levels of memory bandwidth, and dual-channel DDR3 isn't going to cut it (rough numbers below). They could mask some of it with on-package DRAM like Haswell GT3e (and supposedly the next Xbox), but that would of course make it more expensive.

I can see it being used in BGA boards though, with soldered-on GDDR5. Do it like graphics cards: make a reference Mini-ITX board, but let partners customise it if they like.
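Rough numbers behind the bandwidth point, as a quick calculation (dual-channel DDR3-1600 and the 7850's stock 4.8 GT/s effective GDDR5 assumed):

```cpp
// Back-of-the-envelope memory bandwidth comparison.
#include <cstdio>

int main() {
    // Dual-channel DDR3-1600: 2 channels x 64-bit x 1600 MT/s
    double ddr3 = 2 * (64.0 / 8) * 1600e6 / 1e9;   // ~25.6 GB/s

    // Radeon HD 7850: 256-bit GDDR5 at 4.8 GT/s effective
    double hd7850 = (256.0 / 8) * 4.8e9 / 1e9;     // ~153.6 GB/s

    std::printf("Dual-channel DDR3-1600: %.1f GB/s\n", ddr3);
    std::printf("HD 7850 GDDR5:          %.1f GB/s\n", hd7850);
}
```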
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
An APU ain't much different from a CPU+GPU. It's just thrown on the same package, but the internal link is the same. Less latency, yes, but not any revolution either. The closest thing yet is Intel's shared L3. But the CPU still wastes quite a lot of cycles waiting on the GPU.

I can only say I agree with Cerb. Physics besides glitter will always be CPU based.

it's more than that... APUs have some data-moving features, called pin-in-place and zero copy... both save a lot of bandwidth

don't forget that GCN has even more features for APUs, like x86 pointer support

yes, the PS4 will use its bandwidth more efficiently than a regular 7850 ;)
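For the curious, a hedged sketch of what zero copy looks like through standard OpenCL 1.x buffer flags. Whether the driver actually avoids the copy depends on the hardware (shared-memory APUs can read host allocations in place); error handling is omitted for brevity:

```cpp
// Sketch of zero copy on shared-memory APUs via standard OpenCL buffer flags.
// Error handling omitted; link with -lOpenCL.
#include <CL/cl.h>
#include <vector>

int main() {
    cl_platform_id platform; cl_device_id device;
    clGetPlatformIDs(1, &platform, nullptr);
    clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);
    cl_context ctx = clCreateContext(nullptr, 1, &device,
                                     nullptr, nullptr, nullptr);

    std::vector<float> particles(1 << 20, 0.0f);
    size_t bytes = particles.size() * sizeof(float);

    // Discrete-GPU style: the driver allocates its own memory and the data
    // must be copied across the bus to reach it.
    cl_mem copied = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                   bytes, particles.data(), nullptr);

    // APU style: wrap the existing host allocation. On shared-memory parts
    // the GPU can read it in place -- no copy, no bandwidth spent moving it.
    cl_mem zeroCopy = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_USE_HOST_PTR,
                                     bytes, particles.data(), nullptr);

    clReleaseMemObject(copied);
    clReleaseMemObject(zeroCopy);
    clReleaseContext(ctx);
}
```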