"Our semi-custom APUs" = Xbox 720 + PS4?


NTMBK

Lifer
Nov 14, 2011
It doesn't matter, there is no magic APU that can do full-time 1080p with AA at high levels of detail (high-ultra equivalent on PC) with playable framerates. If it existed then surely you would find it available in a PC somewhere, because AMD sure as hell needs a magic shot in the arm.

The main issue with pushing PC APUs towards higher graphics performance is, of course, memory bandwidth. The benchmarks showing almost linear scaling of Trinity graphics performance with memory speed are a clear indication of this. There is no real reason why they couldn't make a 7850-sized die with 4 Jaguar cores and 7770-level graphics, but with DDR3 it would be choked without a (prohibitively expensive) quad-channel memory controller. In a console, though, it could be paired with GDDR5 for main memory (in the same way that the 360 uses GDDR3), giving it enough memory bandwidth for high-performing graphics.
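To put rough numbers on that bandwidth argument, here is a quick back-of-envelope sketch in Python. The transfer rates are round figures I'm assuming from public specs, not measurements:

```python
# Rough peak-memory-bandwidth comparison for an APU's iGPU.
# Figures are approximate public specs, not measurements.

def bandwidth_gb_s(bus_bits: int, transfer_rate_mt_s: float) -> float:
    """Peak bandwidth = bus width (bytes) * transfers per second."""
    return (bus_bits / 8) * transfer_rate_mt_s * 1e6 / 1e9

# Dual-channel DDR3-1866: 2 x 64-bit channels at 1866 MT/s
ddr3 = bandwidth_gb_s(128, 1866)      # ~29.9 GB/s, shared with the CPU

# Discrete HD 7770: 128-bit GDDR5 at 4500 MT/s effective
hd7770 = bandwidth_gb_s(128, 4500)    # ~72 GB/s, GPU-exclusive

# Hypothetical console APU: 256-bit GDDR5 at 5000 MT/s
console = bandwidth_gb_s(256, 5000)   # ~160 GB/s

print(f"Dual-channel DDR3-1866 : {ddr3:6.1f} GB/s")
print(f"HD 7770 (128-bit GDDR5): {hd7770:6.1f} GB/s")
print(f"256-bit GDDR5 console  : {console:6.1f} GB/s")
```

Even before the CPU takes its cut, the DDR3 configuration offers well under half the bandwidth of the discrete card it would need to match.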
 

NTMBK

Lifer
Nov 14, 2011
You don't need to do it that way. Consoles have already fused the two together. MS and Sony just need to buy the GPU (AMD) and the CPU (IBM). Same as with SoCs, for that matter.

Yup, they could indeed do an APU with mixed AMD and IBM parts. The issue there, though, is that they throw away GCN's ability to work in the x86 address space, making it a bit more difficult to program for.
 

NTMBK

Lifer
Nov 14, 2011
Will they? Or is that a pure guess?

According to this post, they will.

Kaveri will take that idea even further with shared memory and a unified address space. The company is not yet talking about how it will specifically achieve this with hardware, but a shared on-die cache is not out of the question — a layer that has been noticeably absent from AMD’s APUs. Phil Rogers, AMD Corporate Fellow, did state that the CPU and GPU would be able to share data between them from a single unified address space. This will prevent the relatively time-intensive need to copy data from CPU-addressable memory to GPU-addressable memory space — and will vastly improve performance as a result.

Would you kindly stop accusing me of making things up?
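For a sense of scale on the copy cost that quote mentions, here's a minimal sketch; the ~6 GB/s effective PCIe figure is my assumption, not something from the article:

```python
# Back-of-envelope: cost of copying buffers between CPU- and
# GPU-addressable memory each frame. All figures are assumptions.

PCIE_GB_S = 6.0          # assumed effective PCIe 2.0 x16 throughput
FRAME_BUDGET_MS = 16.7   # one frame at 60 fps

def copy_ms(megabytes: float) -> float:
    """Milliseconds to move a buffer across the bus, one way."""
    return megabytes / 1024 / PCIE_GB_S * 1000

for mb in (8, 64, 256):
    ms = copy_ms(mb) * 2  # upload plus readback
    share = ms / FRAME_BUDGET_MS
    print(f"{mb:4d} MB round trip: {ms:6.2f} ms ({share:.0%} of a 60 fps frame)")

# With a unified, coherent address space the GPU dereferences the same
# pointers the CPU wrote, so this cost term drops out entirely.
```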
 

psoomah

Senior member
May 13, 2010
From an ExtremeTech article on Kaveri - "AMD gave two examples of programs and situations where the heterogeneous architecture can improve performance — and how shared memory can push performance even further. The first example involved face detection algorithms. The algorithm involves multiple stages where the image is scaled down and the search square remains the same. In each stage, the algorithm looks for facial features (eyes, chins, ears, nose, etc.) If it does not find facial features, it discards the image and continues searching further scaled-down images. The first stage of the workload is very parallel so the GPU is well-suited to the task. In the first few stages, the GPU performs well, but as the stages advance (and there are more and more dead ends), the performance of the GPU falls until it is eventually much slower than the CPU at the task. It was at this point that Phil Rogers talked up the company's heterogeneous architecture and the benefits of a "unified, shared, coherent memory." By allowing the individual parts to play to their strengths, the company estimates 2.5 times the performance and up to a 40% reduction in power usage versus running the algorithm on either the CPU or GPU only."

That has Kinect II written all over it. Could be a two-chip solution: Kaveri dedicated to running game code and a Bobcat (or ARM) chip running everything else. An unhindered Kaveri running dedicated code would be capable of astounding graphics at 1080p, and with HSA just getting off the ground it would provide substantial future-proofing.
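The ExtremeTech example above reads as a classic rejection cascade: early stages are wide and parallel (GPU territory), late stages are branchy with few survivors (CPU territory). A toy Python sketch of that scheduling idea; the stage count, rejection rate, and crossover threshold are invented for illustration, not AMD's numbers:

```python
import random

def detect_faces(windows, stages, crossover=0.10):
    """Run a rejection cascade, switching devices as work thins out.

    Early stages see many candidate windows (wide, parallel: GPU-friendly);
    late stages see few, divergent ones (branchy: CPU-friendly).
    """
    survivors = list(windows)
    total = len(survivors)
    for i, stage in enumerate(stages):
        device = "GPU" if len(survivors) / total > crossover else "CPU"
        survivors = [w for w in survivors if stage(w)]
        print(f"stage {i} on {device}: {len(survivors)} windows survive")
    return survivors

# Toy usage: six stages, each rejecting ~70% of the remaining windows.
stages = [lambda w: random.random() < 0.3] * 6
detect_faces(range(10_000), stages)
```

And with shared, coherent memory the survivor list never has to be copied across when the work moves from GPU to CPU.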
 

cytg111

Lifer
Mar 17, 2008
I guess memories get weak when it comes to remembering the failed hype rumours.

http://www.pcgameshardware.de/Ninte...e-Vielleicht-doch-mit-Intels-Larrabee-702080/
http://gonintendo.com/?p=108637
http://forums.gametrailers.com/viewtopic.php?f=38&t=987238

And so on and on.

There will be no APU in PS4 and Xbox720.

Oh wait, you people are talking about the Audio Processing Unit? Yes, it will most likely be AMD-based ;)

Every single positive rumor in regards to AMD, and you'll be there to put it down... no proof, no link, no evidence... just fanbois trying to make dreams come true.

Riddle me this.

"There will be no APU in PS4 and Xbox720."

The only possible deduction, unless this is 'fanboyish wishful thinking' on your part, is that you *know* it to be true! As in, you *know* something else is going into those consoles.
So. Where's the proof, the links, the evidence?

Riddle me that.

If you *don't* know, then there is a plausible future where both those consoles end up with an AMD APU. Deny that and you must have missed Statistics 101.

So what is it: fanboi, knowledge, or missed Stat 101?
 

ShintaiDK

Lifer
Apr 22, 2012
Every single positive rumor in regards to AMD, and you'll be there to put it down... no proof, no link, no evidence... just fanbois trying to make dreams come true.

Riddle me this.

"There will be no APU in PS4 and Xbox720."

The only possible deduction, unless this is 'fanboyish wishful thinking' on your part, is that you *know* it to be true! As in, you *know* something else is going into those consoles.
So. Where's the proof, the links, the evidence?

Riddle me that.

If you *don't* know, then there is a plausible future where both those consoles end up with an AMD APU. Deny that and you must have missed Statistics 101.

So what is it: fanboi, knowledge, or missed Stat 101?

When was the last time the hype held true? Was it Phenom being 50% faster than Core 2? Was it BD? Was it PD? Was it reverse hyperthreading? Was it AMD making massive cash on GPUs (since HD3xxx)? Or something else? Tech sites happily post that kind of crap because people want to believe and keep on clicking.

If you said consoles would have an x86 Intel CPU, I would tell you just the same: that you have completely lost your mind.

You might just see more of this type with AMD since it's a company in such deep trouble that some miracle needs to happen. And people are searching for that miracle.
 

cytg111

Lifer
Mar 17, 2008
That's just spin. You're doing exactly the same as the fanbois you're accusing.
 

Blitzvogel

Platinum Member
Oct 17, 2010
The main issue with pushing PC APUs towards higher graphics performance is, of course, memory bandwidth. The benchmarks showing almost linear scaling of Trinity graphics performance with memory speed are a clear indication of this. There is no real reason why they couldn't make a 7850-sized die with 4 Jaguar cores and 7770-level graphics, but with DDR3 it would be choked without a (prohibitively expensive) quad-channel memory controller. In a console, though, it could be paired with GDDR5 for main memory (in the same way that the 360 uses GDDR3), giving it enough memory bandwidth for high-performing graphics.

Arguably, 4 Jaguar cores + Cape Verde-level performance really wouldn't be enough of a push for next-gen, considering how long this generation has been drawn out. If the Xbox 3 were to come out right now, that level of performance would be acceptable, at least on the graphics front. On the CPU side, not so much, unless GPGPU is a planned part of the equation (which would only take performance away from the graphics array).
 

Arzachel

Senior member
Apr 7, 2011
No, it won't, not even in the realm of it. It may have a *slight* performance advantage, but it would be so small that the mass of console gamers would laugh, call it a piece of sh!t and not buy it. Recent AMD APUs still lose to an 8800GT; they aren't even a generation ahead of what Sony and MS have in the six-to-seven-year-old consoles. Console gamers expect order-of-magnitude performance jumps between generations; they are going to laugh at ~15%.

If Sony and MS really wanted to go stupid cheap, they could just use high-end ARM SoCs, saving them a lot of money over the expensive, hot and huge APUs, eliminating the worry of AMD imploding to the point of failure, and giving them more options for moving their code base around (Win8 is banking on ARM success already).

And you manage to absolutely ignore the root cause of why the current APU performance is as bad as it is. Some things that are prohibitively expensive on an off-the-shelf desktop/laptop platform you can do comparatively easily in a console, and feeding the iGPU is one of them. An iGPU equivalent to Cape Verde would be what, six or seven times faster than Xenos? As for the CPU, while raw throughput for Xenon was pretty impressive at the time, it came with absolutely horrible latency, so Jaguar would most definitely be a huge improvement.
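For what it's worth, published peak-FLOPS figures put that guess in the right ballpark. A quick check in Python (peak numbers flatter the newer part, so treat this as an upper bound):

```python
# Peak single-precision throughput comparison from published figures.

def gflops(alu_lanes: int, clock_mhz: float, flops_per_lane: int = 2) -> float:
    """Peak GFLOPS = ALU lanes * clock * FLOPs per lane per cycle (MAD = 2)."""
    return alu_lanes * clock_mhz * flops_per_lane / 1000

xenos      = gflops(240, 500)    # Xbox 360 GPU: 48 vec4+scalar ALUs = 240 lanes
cape_verde = gflops(640, 1000)   # HD 7770 GHz Edition: 640 SPs @ 1 GHz

print(f"Xenos      : {xenos:7.0f} GFLOPS")
print(f"Cape Verde : {cape_verde:7.0f} GFLOPS")
print(f"Ratio      : {cape_verde / xenos:.1f}x")
```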
 

BenSkywalker

Diamond Member
Oct 9, 1999
And you manage to absolutely ignore the root cause of why the current APU performance is as bad as it is.

We can just look at non-framebuffer-access-limited situations: best-case scenario, we see a 100% performance improvement. That is insanely bad for a console generation. To put this in perspective, going from point-filtered pre-Voodoo1 graphics to DirectX 9 took a total of three generations (there was only one generation between those two points). Tiny marginal generational increases may be OK in the PC space; they are not acceptable in the console arena for anyone who is looking to buy a 720 or PS4.

An iGPU equivalent to Cape Verde would be what, six or seven times faster than Xenos?

Not even remotely close. I don't think people are properly grasping how stupidly bad APU parts are, and it isn't just a matter of bandwidth. The GF 640 sodomizes APUs, a part that we all laugh at as pathetic and clearly not a gaming card at all. Bandwidth is only one of the massive flaws APUs have.

If you want to think about this in the proper context: when the PS3 released it was one generation behind the top desktop parts, while the 360 was *ahead* of the top desktop parts when it released. That is where console gamers expect their consoles to be, not getting violently beaten down by HTPC parts.

As for the CPU, while raw throughput for Xenon was pretty impressive at the time, it came with absolutely horrible latency, so Jaguar would most definitely be a huge improvement.

1000% is your baseline for being *average*. Jaguar would be an epically massive failure for MS; it would be closer to a PS2.5 for Sony.
 

Ferzerp

Diamond Member
Oct 12, 1999
Has there ever been an AMD CPU in a console? Even the Wii U has an IBM chip in it.

The only console ever to use x86 was the original Xbox (and that wasn't chosen due to technical or cost merits either, but more due to time-to-market convenience).
 

cytg111

Lifer
Mar 17, 2008
The only console ever to use x86 was the original Xbox (and that wasn't chosen due to technical or cost merits either, but more due to time-to-market convenience).

If there were a new "Cell" going into either console at this point, wouldn't we have known about it? If the next-gen consoles are coming in '13/'14, they have got to be cooked from tech already in the oven.
 

SPBHM

Diamond Member
Sep 12, 2012
The only console ever to use x86 was the original Xbox (and that wasn't chosen due to technical or cost merits either, but more due to time-to-market convenience).

Yes, but I think the Xbox 1 was a different beast. MS was buying CPUs from Intel (it was basically a mobile P3 with less L2 cache), not just licensing IP, which was a big mistake and caused the early death of the XB1. I think they did the same with the Nvidia GPU/chipset...

But the XB1 was clearly the most capable console of its gen.
 

BenSkywalker

Diamond Member
Oct 9, 1999
If there were a new "Cell" going into either console at this point, wouldn't we have known about it?

IBM was saying they were working on it years ago. Doesn't mean it *will* end up being in either console, but it was in development based on the Power8 architecture.
 

NTMBK

Lifer
Nov 14, 2011
Not even remotely close. I don't think people are properly grasping how stupidly bad APU parts are, and it isn't just a matter of bandwidth. The GF 640 sodomizes APUs, a part that we all laugh at as pathetic and clearly not a gaming card at all. Bandwidth is only one of the massive flaws APUs have.

You're looking at the current off-the-shelf parts. The top-level Trinity chip has 384 shaders, which is between an HD6450 and an HD6670 in terms of shader count, or a mere fifth of the HD6970. (That generation is the closest to Trinity's architecture, which is why I am comparing it.) They can't realistically put more shaders on a current PC APU, as dual-channel DDR3 would stop them getting more performance out of it.

What is the reason that will stop them making an APU die the size of a 7850 die, with the number of GCN cores that a 7770 has, paired with GDDR5 main memory? And what will stop that chip from having the performance of today's discrete 7770? Because I can't see any theoretical reason why APU performance would have to be inherently worse than CPU + GPU performance, and I can even see a few reasons why it could be (and indeed should be) better, especially in a tightly controlled console.
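One way to make the bandwidth ceiling concrete is bytes of bandwidth per FLOP of shader throughput. A small sketch using published peak figures, used loosely; the doubled-shader row is a hypothetical of mine:

```python
# Bytes of memory bandwidth available per FLOP of shader throughput.
# Peak figures from public specs; the last entry is hypothetical.

parts = {
    # name: (peak GFLOPS, peak GB/s, memory shared with the CPU?)
    "Trinity A10 iGPU (384 sh.)": (614,  29.9, True),
    "HD 7770 (640 sh., GDDR5)":   (1280, 72.0, False),
    "Doubled-shader DDR3 APU":    (1228, 29.9, True),
}

for name, (gf, gbs, shared) in parts.items():
    ratio = gbs / gf  # GB/s per GFLOPS == bytes per FLOP
    note = ", shared with the CPU" if shared else ""
    print(f"{name:27s}: {ratio:.3f} B/FLOP{note}")
```

Trinity's iGPU already sits at roughly the same bytes-per-FLOP as the discrete card while sharing its bus with the CPU; doubling the shader count on the same DDR3 halves the ratio, while a GDDR5 console bus restores it.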

If you want to think about this in the proper context: when the PS3 released it was one generation behind the top desktop parts, while the 360 was *ahead* of the top desktop parts when it released. That is where console gamers expect their consoles to be, not getting violently beaten down by HTPC parts.

It's been pretty commonly mooted that both MS and Sony aren't looking for the same bleeding-edge tech that was in the 360 and PS3 at launch. The Wii was far, far behind them technologically, but it sold gangbusters, and Nintendo were able to make a profit on every console sold, from launch. Nintendo handily won the last round of the console war; don't you think that the other two teams are going to be stealing a few pages from their playbook?
 

HurleyBird

Platinum Member
Apr 22, 2003
What is the reason that will stop them making an APU die the size of a 7850 die, with the number of GCN cores that a 7770 has, paired with GDDR5 main memory?

There's no reason they couldn't, besides the fact that it would be a pretty hefty undertaking, of course.
 

BenSkywalker

Diamond Member
Oct 9, 1999
What is the reason that will stop them making an APU die the size of a 7850 die, with the number of GCN cores that a 7770 has, paired with GDDR5 main memory?

First, you'd need ~64MB-128MB of eDRAM; on top of a 7770-level GPU and the CPU, you are talking ~GTX 480 die size. How are you going to cool that in a console? You could say pairing GDDR5 with a 512-bit bus would work, giving up the need for eDRAM and getting it down to simply an enormously sized die, but the production costs for the mobo and the cost of the RAM would end up being more than the cost of the APU. If the goal here is supposed to be saving money, a custom MIPS or POWER part with a dedicated GPU would be considerably cheaper than either of those setups, and it would rather handily best them in performance too.
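To put some loudly hedged numbers on the area argument, a quick tally; only the Cape Verde and GF100 die sizes below are published figures, the eDRAM density and uncore entries are my guesses:

```python
# Very rough 28nm die-area tally for the hypothetical console APU.
# Only the Cape Verde and GF100 figures are published; the rest are guesses.

blocks_mm2 = {
    "Cape Verde-class GPU":      123,  # HD 7770 die, published
    "4x Jaguar cores + 2MB L2":   26,  # ~3.1 mm^2 per core plus cache (approx.)
    "96 MB eDRAM":               128,  # assuming ~0.75 MB/mm^2 incl. overhead
    "Memory PHYs, NB, I/O":       60,  # guess; wide-bus pads are area-hungry
}

total = sum(blocks_mm2.values())
for name, area in blocks_mm2.items():
    print(f"{name:26s}: {area:4d} mm^2")
print(f"{'Total':26s}: {total:4d} mm^2   (GF100/GTX 480: ~529 mm^2)")
```

Wherever the exact total lands under these assumptions, it is a far bigger, hotter die than anything AMD has shipped as an APU to date.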

Because I can't see any theoretical reason why APU performance would have to be inherently worse than CPU + GPU performance

There are two reasons that aren't theoretical: die size and memory throughput. Dedicated parts for each allow *significantly* more flexibility in terms of die considerations, and having separate memory buses significantly reduces that bottleneck, not to mention allowing you to have a much larger cache on your CPU *and* eDRAM on your GPU. There are massive reasons an APU can't best a dedicated CPU + dedicated GPU; the only reason APUs are ever viable is if they are "good enough". The problem with that mentality: if, say, MS decides to go with "good enough" and Sony decides to go with good, MS is going to be obliterated (the inverse is also true; MS just has a bit more history working with x86 vendors than Sony, so I used them).

It's been pretty commonly mooted that both MS and Sony aren't looking for the same bleeding-edge tech that was in the 360 and PS3 at launch.

Sony's big expense at launch was their Blu-ray player. I'm not saying that it is reasonable for MS to release a console with a GPU a generation ahead of the PCs like they did last time, but thinking they are going to use a castrated two-year-old low-to-mid-range product? Not seeing a lot of good logic to that.

Nintendo handily won the last round of the console war- don't you think that the other two teams are going to be stealing a few pages from their playbook?

Do you not follow the console market?

http://miami.cbslocal.com/2012/07/28/game-over-for-nintendo/

Nintendo's strategy was short-sighted and doomed to fail over the long haul from the start. Everyone who follows the market knew that before they launched, and it happened. Their tie rate is the lowest of this generation (which is where most of the profits from consoles normally derive), they have been forced to launch at least a year ahead of everyone else despite coming out last (barely), and they have gone from ~55% of sales for the category to under 21%. Nintendo had a short-term strategy last generation, and it worked... in the short term.
 

Ajay

Lifer
Jan 8, 2001
It's been pretty commonly mooted that both MS and Sony aren't looking for the same bleeding-edge tech that was in the 360 and PS3 at launch. The Wii was far, far behind them technologically, but it sold gangbusters, and Nintendo were able to make a profit on every console sold, from launch. Nintendo handily won the last round of the console war; don't you think that the other two teams are going to be stealing a few pages from their playbook?

That's what I've heard, and it makes sense, especially in light of a globally weak economy.
 

Olikan

Platinum Member
Sep 23, 2011
There are two reasons that aren't theoretical: die size and memory throughput. Dedicated parts for each allow *significantly* more flexibility in terms of die considerations, and having separate memory buses significantly reduces that bottleneck, not to mention allowing you to have a much larger cache on your CPU *and* eDRAM on your GPU. There are massive reasons an APU can't best a dedicated CPU + dedicated GPU; the only reason APUs are ever viable is if they are "good enough".

The reason an APU can't beat a dedicated CPU + dedicated GPU is power consumption and bandwidth...
APU = 100-125W
CPU+GPU = 250-370W (troll mode: more if using Fermi)

Take note: Brazos beat the crap out of Atom+ION at the same TDP
 

psoomah

Senior member
May 13, 2010
You're looking at the current off-the-shelf parts. The top-level Trinity chip has 384 shaders, which is between an HD6450 and an HD6670 in terms of shader count, or a mere fifth of the HD6970. (That generation is the closest to Trinity's architecture, which is why I am comparing it.) They can't realistically put more shaders on a current PC APU, as dual-channel DDR3 would stop them getting more performance out of it.

Which is why the next-gen MS and Sony consoles will be based on Kaveri, which combines Steamroller, 2nd-gen GCN, shared memory and a unified address space. A game developer's dream, with several times the power of the Xbox 360 at a crazily low research, development and chip cost.

In Sony's case they have little choice. Not only are they in a vastly different financial position than they were when the PS3 was being developed, but it's pretty obvious they were caught with their pants around their ankles when word broke that MS was well along the path to a new console in 2013, and whatever grandiose expanded Cell plans they had for a 2014/2015 timeframe crumbled. That arrogance and hubris cost them dearly. Now it's catch-up time, it's all about the $$$, and the obvious, if not only, solution is to cut a deal with AMD for their own 'semi-custom' version of whatever AMD APU Microsoft is going with, which minimises development costs in a number of ways and MAYBE allows them to get a console out in 2013. Kinect II is going to be mind-blowingly awesome, and giving MS a year's head start to establish that as the standard would be bad news, ESPECIALLY as MS has a complete PC/tablet/phone/console ecosystem to drive Xbox sales and Sony has ... *chirp* ... to drive theirs.

It also allows for shorter console refresh cycles while maintaining 100% backward compatibility.
 

BlockheadBrown

Senior member
Dec 17, 2004
Which is why the next-gen MS and Sony consoles will be based on Kaveri, which combines Steamroller, 2nd-gen GCN, shared memory and a unified address space. A game developer's dream, with several times the power of the Xbox 360 at a crazily low research, development and chip cost.

In Sony's case they have little choice. Not only are they in a vastly different financial position than they were when the PS3 was being developed, but it's pretty obvious they were caught with their pants around their ankles when word broke that MS was well along the path to a new console in 2013, and whatever grandiose expanded Cell plans they had for a 2014/2015 timeframe crumbled. That arrogance and hubris cost them dearly. Now it's catch-up time, it's all about the $$$, and the obvious, if not only, solution is to cut a deal with AMD for their own 'semi-custom' version of whatever AMD APU Microsoft is going with, which minimises development costs in a number of ways and MAYBE allows them to get a console out in 2013. Kinect II is going to be mind-blowingly awesome, and giving MS a year's head start to establish that as the standard would be bad news, ESPECIALLY as MS has a complete PC/tablet/phone/console ecosystem to drive Xbox sales and Sony has ... *chirp* ... to drive theirs.

I hope you really don't believe what you've posted here.
 

psoomah

Senior member
May 13, 2010
It's been pretty commonly mooted that both MS and Sony aren't looking for the same bleeding-edge tech that was in the 360 and PS3 at launch. The Wii was far, far behind them technologically, but it sold gangbusters, and Nintendo were able to make a profit on every console sold, from launch. Nintendo handily won the last round of the console war; don't you think that the other two teams are going to be stealing a few pages from their playbook?

Nintendo has no more Wand controller rabbits to pull out of its hat, which is what caught the public's imagination and drove those sales. The Wii U controller ain't no rabbit, it's an albatross.

The reason Kinect set 'fastest ever' sales records was mothers. Mothers ecstatic at the idea of no more controllers underfoot, lost, fought over, needing replacing or spider-webbing the TV. And that Kinect pretty much required their children to be on their feet and exercising. Sony caught on fast, and it didn't take long for rumours to surface that it was looking for companies and IP to buy to get into the game.

Nintendo instead went with a SINGLE, expensive-to-replace controller, WITH A SCREEN, that was guaranteed to be fought over. A mother's nightmare. It doesn't even double as a standalone game machine. So it's essentially ceding its dominance of the 'casual' market, and its drive to capture some of the 'serious' gamer market is going to founder on its underpowered processor relative to what MS and Sony will be coming out with.

Game developers are going to love programming for the powerful, streamlined MS and Sony machines, will program for those machines first, and are going to hate trying to squish their code back down to fit Nintendo's far less powerful and differently architected machine. How will the Wii U fare better than the Wii in the 'serious' gaming department? Nintendo hit the ball clear out of the batting box with the Wii U.
 

psoomah

Senior member
May 13, 2010
I hope you really don't believe what you've posted here.

Why not? AMD recently said they have working Kaveri silicon in hand, and Kaveri is slated for a 2Q 2013 release. It's not a stretch that 'semi-custom' Kaveri variants would be doable for 2013 console releases.

It's known AMD is sourcing multiple vendors for its 28nm APUs.

The latest rumor, that Kaveri is being delayed and AMD will push out a refreshed Trinity instead, makes sense if AMD is prioritising MS and Sony Kaveri variants to make sure they meet console release deadlines in sufficient quantities.

It's surely in AMD's interest to prioritise resources towards what will give them a substantial edge over Nvidia and Intel into the future, and having their APUs in the Xbox 720 and PS4 is just such an advantage.
 

BlockheadBrown

Senior member
Dec 17, 2004
Why not? AMD recently said they have working Kaveri silicon in hand, and Kaveri is slated for a 2Q 2013 release. It's not a stretch that 'semi-custom' Kaveri variants would be doable for 2013 console releases.

It's known AMD is sourcing multiple vendors for its 28nm APUs.

The latest rumor, that Kaveri is being delayed and AMD will push out a refreshed Trinity instead, makes sense if AMD is prioritising MS and Sony Kaveri variants to make sure they meet console release deadlines in sufficient quantities.

It's surely in AMD's interest to prioritise resources towards what will give them a substantial edge over Nvidia and Intel into the future, and having their APUs in the Xbox 720 and PS4 is just such an advantage.

That's not exactly what I was referring to. Your previous statements read almost as if they were from the P.T. Barnum of Microsoft. There's more to it all than what you're relaying. There are plenty of solid places where you can read up on this stuff.

Just some friendly advice.