"Our semi-custom APUs" = Xbox 720 + PS4?


NTMBK

Lifer
Nov 14, 2011
10,479
5,895
136
I'm betting on not using an APU.

I'm curious, why are you betting on that? Both companies are looking for a lower cost solution this time around, and combining CPU and GPU into an APU from the start makes perfect sense. The latest version of the 360 already uses an APU (the chips were combined in the "stealth" revision to reduce costs and simplify cooling).
 

NTMBK

Lifer
Nov 14, 2011
10,479
5,895
136
Well, supply links to these magical chips? Is something like that taped out for the PC? Llano is essentially using a 5xxx series GPU with DDR3.

Jaguar is being paired with GCN cores, and should be taped out by now (unless the schedule has gone drastically wrong, which is a possibility with the shambles currently managing AMD).

EDIT: And of course, it's nowhere near the die size of a 7970, I imagine.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I'm curious, why are you betting on that? Both companies are looking for a lower cost solution this time around, and combining CPU and GPU into an APU from the start makes perfect sense. The latest version of the 360 already uses an APU (the chips were combined in the "stealth" revision to reduce costs and simplify cooling).

That's still not an APU. APU is an AMD term for AMD CPUs with GPUs on-die.

My 3570K isn't an APU, for sure, even though it's more integrated GPU-wise than AMD will be for the next few years. My smartphone doesn't use an APU either. Nor does my TV, even though both have a GPU on-die with the CPU.
 

NTMBK

Lifer
Nov 14, 2011
10,479
5,895
136
That's still not an APU. APU is an AMD term for AMD CPUs with GPUs on-die.

My 3570K isn't an APU, for sure, even though it's more integrated GPU-wise than AMD will be for the next few years. My smartphone doesn't use an APU either. Nor does my TV, even though both have a GPU on-die with the CPU.

APU was a term used by other companies long before AMD used it, and is a generic term for CPUs plus an accelerator on the same die (in this case, a GPU).
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
Even Charlie Demerjian doesn't seem to believe in AMD either.

http://semiaccurate.com/2012/01/18/xbox-nextxbox-720-chips-in-production/

So, what is the XBox Next? SemiAccurate has been saying for a while that all signs were pointing toward a PowerPC, specifically an IBM Power-EN variant. The dark horse was an x86 CPU, but it was a long shot. It looks like the long shot came through, moles are now openly talking about AMD x86 CPU cores and more surprisingly, a newer than expected GPU. How new? HD7000 series, or at least a variant of the GCN cores, heavily tweaked by Microsoft for their specific needs.
http://semiaccurate.com/2012/09/04/microsoft-xbox-next-delay-rumors-abound/ ;)

That's still not an APU. APU is an AMD term for AMD CPUs with GPUs on-die.

My 3570K isn't an APU, for sure, even though it's more integrated GPU-wise than AMD will be for the next few years. My smartphone doesn't use an APU either. Nor does my TV, even though both have a GPU on-die with the CPU.

man, it's easier to say APU than "GPU on-die with the CPU"... now deal with it
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Welcome to how Charlie posts. And whatever ends up as the result, he will pick the article that fits and say "see, I told you so all along".

Everything rational points against AMD-based x86 cores. Not to mention, consoles are expected to last 10 years. That's not something I would trust AMD to do currently.
 

NTMBK

Lifer
Nov 14, 2011
10,479
5,895
136
Everything rational points against AMD-based x86 cores. Not to mention, consoles are expected to last 10 years. That's not something I would trust AMD to do currently.

Why does everything rational point against AMD cores? Honestly, I'm curious.

As for the 10 years issue, either AMD will have already sold the IP to their console customers, or they could sell it if they were facing bankruptcy.

APU is a term that AMD marketing has specifically adopted to downplay their weaknesses.

now deal with it.

Given that over half of their Trinity die is given over to graphics, emphasising that it's more than just a CPU makes sense.

Now deal with it.
 

zlatan

Senior member
Mar 15, 2011
580
291
136
If they are APUs (and not necessarily all AMD), it may lead to more game developers making use of APIs like OpenCL for non-graphics tasks.
I think OpenCL is not a good option for consoles, and this is true for all "universal APIs". There are specific APIs for these devices, and this won't change in the future. Also, if AMD APUs win the new PS and Xbox designs, it will be a big win for HSA. One of the biggest challenges for programmers is creating optimised multiplatform games. This is not a challenge on the technical side, but it is on the financial side. If developers can write HSA applications for the two major consoles, then porting that code to PC and some mobile devices will be much easier.

I presume that there will be a unified address space and a single shared memory buffer, similar to the 360's memory layout; experience has shown that game developers find it much easier to extract performance from that sort of layout than from the PS3 style of lots of separate, explicitly managed memory buffers. GCN working in the x86 address space, with good computational power and no copying overhead between address spaces, would make GPGPU calculations much, much easier.

The Xbox 360 just shares the memory "physically"; in a logical view, the CPU and GPU memory are separate. There are also some functions to prevent data copies, but fully shared coherent memory would be much more advanced. Regardless, you see the problem well: fully shared coherent memory and a unified address space will be worth more than all the TFLOPS.
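
To make the copy overhead concrete, here's a rough sketch of the discrete-style round trip that a unified address space removes, in plain OpenCL host code (a hypothetical illustration, not anything from a console SDK; error handling omitted):

Code:
/* The classic discrete-GPU dance: data has to cross address spaces
 * twice before the CPU can see the result. */
#include <CL/cl.h>
#include <stdlib.h>

#define N (1024 * 1024)

int main(void) {
    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    float *host = malloc(N * sizeof(float));     /* lives in CPU space */
    cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE,
                                N * sizeof(float), NULL, NULL);

    /* Copy #1: CPU memory -> GPU memory, before the kernel can run. */
    clEnqueueWriteBuffer(q, buf, CL_TRUE, 0, N * sizeof(float), host,
                         0, NULL, NULL);
    /* ... enqueue the kernel on buf here ... */
    /* Copy #2: GPU memory -> CPU memory, before the CPU can read it. */
    clEnqueueReadBuffer(q, buf, CL_TRUE, 0, N * sizeof(float), host,
                        0, NULL, NULL);

    free(host);
    return 0;
}

With one coherent address space, the kernel would simply work on the same pages the CPU wrote, and both enqueue calls (and the latency they add) disappear.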
 

zlatan

Senior member
Mar 15, 2011
580
291
136
Everything rational points against AMD-based x86 cores. Not to mention, consoles are expected to last 10 years. That's not something I would trust AMD to do currently.

This is not about the x86 cores. The real evolutionary leap for the next-gen is to share fully coherent memory between the GPU and the CPU. The only company that can do it now is AMD.
 

NTMBK

Lifer
Nov 14, 2011
10,479
5,895
136
The Xbox 360 just shares the memory "physically"; in a logical view, the CPU and GPU memory are separate. There are also some functions to prevent data copies, but fully shared coherent memory would be much more advanced.

Ah, thanks! I've not got experience developing for the 360, so forgive my layperson's perspective on that front. :)
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
This is not about the x86 cores. The real evolutionary leap for the next-gen is to share fully coherent memory between the GPU and the CPU. The only company that can do it now is AMD.

The company with the least integrated GPU in the industry? Please enlighten me on how.
 

NTMBK

Lifer
Nov 14, 2011
10,479
5,895
136
The company with the least integrated GPU in the industry? Please enlighten me on how.

Because Kaveri (and presumably also Jaguar, since they are both using GCN) will have a fully unified address space.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Why does everything rational point against AMD cores? Honestly, I'm curious.

As for the 10 years issue, either AMD will have already sold the IP to their console customers, or they could sell it if they were facing bankruptcy.



Given that over half of their Trinity die is given over to graphics, emphasising that it's more than just a CPU makes sense.

Now deal with it.

MS/Sony can't manufacture the CPUs. And they can't buy the design, hence they are 100% dependent on AMD, like MS was with Intel for the Xbox. And looking at the BOM for the PS3/Xbox 720, the CPU cost is very low; we're talking below Celeron cost. Needing to depend on a 3rd-party company for 10 years, both for supply and for any cost reduction, is suicidal, especially with a company that is heading directly into the ground.
 

SPBHM

Diamond Member
Sep 12, 2012
5,068
423
126
Everything rational points against AMD-based x86 cores. Not to mention, consoles are expected to last 10 years. That's not something I would trust AMD to do currently.

Interesting, but what about Nintendo choosing an AMD GPU? And even you accept that the most likely GPU choice for the consoles is going to be from AMD...

It's pretty clear that there are strong rumors about x86 AMD CPUs in one or both consoles, much stronger than there ever were for the Wii U, so...
we will need to wait and see...
 

NTMBK

Lifer
Nov 14, 2011
10,479
5,895
136
MS/Sony can't manufacture the CPUs. And they can't buy the design, hence they are 100% dependent on AMD, like MS was with Intel for the Xbox. And looking at the BOM for the PS3/Xbox 720, the CPU cost is very low; we're talking below Celeron cost. Needing to depend on a 3rd-party company for 10 years, both for supply and for any cost reduction, is suicidal, especially with a company that is heading directly into the ground.

AMD has no foundries either. What is your point?
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Interesting, but what about Nintendo choosing an AMD GPU? And even you accept that the most likely GPU choice for the consoles is going to be from AMD...

It's pretty clear that there are strong rumors about x86 AMD CPUs in one or both consoles, much stronger than there ever were for the Wii U, so...
we will need to wait and see...

AMD can contractually give Nintendo provisions to keep manufacturing a GPU (via a foundry agreement that Nintendo can make) if/when AMD ceases to exist.

They cannot do that with x86 and would have to go begging to Intel.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Interesting, but what about Nintendo choosing an AMD GPU? And even you accept that the most likely GPU choice for the consoles is going to be from AMD...

It's pretty clear that there are strong rumors about x86 AMD CPUs in one or both consoles, much stronger than there ever were for the Wii U, so...
we will need to wait and see...

AMD has no foundries either. What is your point?

With GPUs you can sell the entire design. When MS/Sony/Nintendo buys a design from nVidia/AMD, it doesn't belong to nVidia/AMD anymore. The same happens with PPC, since IBM sells the design the same way to Nintendo/MS/Sony.

You can't do that with x86, because AMD doesn't hold the IP needed.
 
Last edited:

zlatan

Senior member
Mar 15, 2011
580
291
136
The company with the least integrated GPU in the industry? Please enlighten me on how.
I don't know what your opinion is, but AMD has the most advanced integration approach.
Llano has the Onion bus, through which the GPU can access pinned pages in CPU memory. This is exposed as an OpenCL feature on AMD APUs (zero-copy).
For Trinity, AMD goes even further: with the FCL, the GPU can access CPU memory and the CPU can access GPU memory.
The next leap is Kaveri, with fully shared memory: the CPU and the GPU can work in the same coherent memory space, and the GPU can write to CPU memory.

Intel just uses a shared LLC for the CPU cores and the iGPU, but it's pretty pointless, because data-parallel applications mostly share 30-100 megabytes of data. You can use this feature, but you must be aware that the iGPU will write to the LLC without any control. If you don't write very special code for it, the iGPU will thrash the whole LLC. This is the main reason why developers don't optimise for it.
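
For reference, this is roughly what that zero-copy path looks like from the host side. A minimal sketch, assuming (as AMD's OpenCL documentation describes for its APUs) that CL_MEM_ALLOC_HOST_PTR is backed by pinned host memory the GPU can read directly:

Code:
/* Zero-copy on an APU (sketch): the buffer is allocated in pinned
 * host memory, so mapping it hands the CPU a direct pointer and the
 * GPU later reads the very same pages, with no WriteBuffer/ReadBuffer
 * step in between. */
#include <CL/cl.h>

#define N (1024 * 1024)

int main(void) {
    cl_platform_id plat;
    cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    /* ALLOC_HOST_PTR requests host-visible (pinned) memory. */
    cl_mem buf = clCreateBuffer(ctx,
                                CL_MEM_READ_WRITE | CL_MEM_ALLOC_HOST_PTR,
                                N * sizeof(float), NULL, NULL);

    /* On an APU the map returns a pointer to the actual pages,
     * not to a staging copy. */
    float *p = clEnqueueMapBuffer(q, buf, CL_TRUE, CL_MAP_WRITE, 0,
                                  N * sizeof(float), 0, NULL, NULL, NULL);
    for (int i = 0; i < N; i++)
        p[i] = (float)i;               /* CPU fills the buffer in place */
    clEnqueueUnmapMemObject(q, buf, p, 0, NULL, NULL);

    /* ... enqueue the kernel on buf; the GPU reads the same memory ... */
    return 0;
}

On a discrete card the same map call may silently go through a staging copy; the APU is where the map is actually free, which is exactly the Onion/FCL direction described above, and Kaveri's coherent version goes one step further by letting the GPU write results back the same way.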

MS/Sony can't manufacture the CPUs.
GlobalFoundries and TSMC can.

You can't do that with x86, because AMD doesn't hold the IP needed.
AMD holds the AMD64 IP, so x86 doesn't matter for the consoles. Also, the unified address space feature will need the 64-bit operating mode, so they can disable the AMD64 compatibility mode if that is a problem for Sony or Microsoft.
 
Last edited:

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Welcome to how Charlie posts. And whatever ends up as the result, he will pick the article that fits and say "see, I told you so all along".

Everything rational points against AMD-based x86 cores. Not to mention, consoles are expected to last 10 years. That's not something I would trust AMD to do currently.

AMD's products are not completely terrible in a general sense; they are just inferior to Intel in certain workloads. But for APUs, the GFLOPS of the x86 cores and the graphics array will be what matters.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I'm curious, why are you betting on that? Both companies are looking for a lower cost solution this time around, and combining CPU and GPU into an APU from the start makes perfect sense. The latest version of the 360 already uses an APU (the chips were combined in the "stealth" revision to reduce costs and simplify cooling).

Cause you want to wow your customers. Without doing FULL TIME 1080p rendering with advanced (DX11) features, it won't be much of a wow factor. It needs to be able to run Battlefield 4 at what would be considered Ultra on the PC, IMO; otherwise there's no real draw for many. Show me an APU that can do BF3, Crysis, etc. at 1080p with playable framerates at something ABOVE low detail settings. It doesn't exist. Trinity can't even do that. All the reviews use no AA/AF and resolutions below 1080p at medium or low detail, and in the case of Crysis 2 at 1680x1050 it gets 23 fps in High Quality using DX11. That is not good enough. You're going to stagnate the industry even more, and a lot of gamers who also own PCs will bow out and not upgrade their consoles, especially when Sony is going to support the PS3 with games for a while yet.
 
Last edited:

Arzachel

Senior member
Apr 7, 2011
903
76
91
Cause you want to wow your customers. Without doing FULL TIME 1080p rendering with advanced (DX11) features, it won't be much of a wow factor. It needs to be able to run Battlefield 4 at what would be considered Ultra on the PC, IMO; otherwise there's no real draw for many. Show me an APU that can do BF3, Crysis, etc. at 1080p with playable framerates at something ABOVE low detail settings. It doesn't exist. Trinity can't even do that. All the reviews use no AA/AF and resolutions below 1080p at medium or low detail, and in the case of Crysis 2 at 1680x1050 it gets 23 fps in High Quality using DX11. That is not good enough. You're going to stagnate the industry even more, and a lot of gamers who also own PCs will bow out and not upgrade their consoles, especially when Sony is going to support the PS3 with games for a while yet.

...Do you honestly believe that they're going to use off-the-shelf parts?

This thread is painful to read.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I don't know what your opinion is, but AMD has the most advanced integration approach.
Llano has the Onion bus, through which the GPU can access pinned pages in CPU memory. This is exposed as an OpenCL feature on AMD APUs (zero-copy).
For Trinity, AMD goes even further: with the FCL, the GPU can access CPU memory and the CPU can access GPU memory.
The next leap is Kaveri, with fully shared memory: the CPU and the GPU can work in the same coherent memory space, and the GPU can write to CPU memory.

Intel just uses a shared LLC for the CPU cores and the iGPU, but it's pretty pointless, because data-parallel applications mostly share 30-100 megabytes of data. You can use this feature, but you must be aware that the iGPU will write to the LLC without any control. If you don't write very special code for it, the iGPU will thrash the whole LLC. This is the main reason why developers don't optimise for it.

GlobalFoundries and TSMC can.


AMD holds the AMD64 IP, so x86 doesn't matter for the consoles. Also, the unified address space feature will need the 64-bit operating mode, so they can disable the AMD64 compatibility mode if that is a problem for Sony or Microsoft.

So Intel's ring-bus approach is so far behind AMD's method of just bolting the GPU on via a PCIe- or HT-style bus (the Fusion Control Link)? Yeah, right...

TSMC and GloFo can't produce any x86 CPUs without AMD (or VIA) being the company behind it.

Also, you can't run x64 without Intel IP. And Intel has free use of x64.
 
Last edited: