AMD Bulldozer in PS4 - rumor surfaces


AnandThenMan

Diamond Member
Nov 11, 2004
3,991
627
126
AMD's stock price is off 5% today. If there was the slightest chance of anything about this rumor being true that wouldn't be the case.
The stock market rarely if ever follows "fanboy" logic, or what you and I would consider logical reasons for a stock price to fluctuate. I've seen many times a company announce healthy growth and revenue for a quarter, and the stock price dropped 10-15% the next day.
 

grimpr

Golden Member
Aug 21, 2007
1,095
7
81
Somehow this has more to do with AMD's Fusion System Architecture plans (future APUs with on-chip RAM via through-silicon vias, plus Graphics Core Next) than with the Bulldozer x86 core itself.
 

Mopetar

Diamond Member
Jan 31, 2011
8,526
7,786
136
So much baseless speculation here...

All the presently sold consoles use an "APU", or a single chip with both the CPU and GPU. This significantly saves on fixed costs everywhere in the system, and it would be very unlikely for them to go back to two separate chips if they had any way to avoid it.

The primary reason Llano's GPU is so puny is the memory interface. Of the present-gen consoles, the XBOX360 uses a single unified 128-bit interface with GDDR3 RAM. This is probably how they will do it for the next gen -- a single 128-bit interface with GDDR5. This is enough to get to the performance of 5770 or so, which is probably good enough.

As for the CPU, I don't think you get just how awful the CPUs in present-gen consoles really are. They are all in-order monstrosities plugged into high-latency memory systems with little cache -- Brazos would probably beat them in most of the tasks they really perform, at half the clock speed. They were just as awful when they shipped -- even considering how much CPUs have advanced in the past 5 years, BD really would be a major step up.

But I find its use doubtful. The primary determinant of what HW they will use is not how good it is, but supply security and how cheap they can make it. This is why it will most likely be built at TSMC, and the CPU is unlikely to be anything good enough that they'd actually have to pay much for licensing it.

Pretty much spot-on in my opinion.

It's also worth considering that the consoles would likely use a very customized design based on AMD technology. This means the APU seen in a console could be a lot different from one designed for traditional PCs. The chip would have a much heavier focus on the GPU, with overall specs geared towards providing a good gaming experience at 1080p.
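The bandwidth reasoning in the quoted post is easy to sanity-check with back-of-the-envelope arithmetic. This little sketch (the effective data rates are assumptions based on typical GDDR3/GDDR5 parts of the era) shows why a 128-bit GDDR5 interface lands roughly in HD 5770 territory:

```python
def bandwidth_gb_s(bus_width_bits, effective_rate_gbps):
    """Peak memory bandwidth: bus width in bytes times per-pin data rate."""
    return bus_width_bits / 8 * effective_rate_gbps

# 128-bit bus with GDDR3 at ~1.4 Gbps effective (XBOX360-class)
print(bandwidth_gb_s(128, 1.4))  # ~22.4 GB/s

# Same 128-bit bus with GDDR5 at ~4.8 Gbps effective (HD 5770-class)
print(bandwidth_gb_s(128, 4.8))  # ~76.8 GB/s
```

Roughly a 3.4x bandwidth jump on the same bus width, which is the quoted post's whole point: GDDR5 lets a next-gen console keep the cheap 128-bit interface and still feed a much bigger GPU.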
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Stupid? Not quite. Could you imagine the massive spread of multi-threaded games that would come about? Don't get me wrong I'm thinking it's far fetched also but AMD has a lot of pull in the console industry by now.
2005 has come and gone. What's so far fetched about what we have had since around '07?

*IF* this is true it would be a big step in console hardware. People always b tch about how consoles hold back the PC, this would definitely lend a hand in that department.
A console with an x86 CPU is still a console. It still only gets replaced after 5+ years. The problem isn't x86 v. PPC or ARM, it's that it is a piece of clearly defined hardware, that does not change in any performance or feature metric throughout its lifetime.
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
If they could sandwich two dies together, it's feasible. The problem is that, if they build a two-module BD with a suitable GPU, you'd be looking at 3-4 billion transistors. Or more. There just isn't any way that yields would be high enough for consoles, since it would push prices way up. Given the level of performance that consoles need, they can only do it if the CPU and GPU are separately fabbed and then combined later. They would also need some on-die video memory or the GPU will suck hard, like Llano's GPU does (compared to similarly specced GPUs with dedicated memory).
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
Umm... not quite.

http://en.wikipedia.org/wiki/Wii_U#Technical_specifications

"based on" doesn't mean it's a full featured one, and that's only a rumor.

The 4730 is "based on RV770", and that's not exactly radeon 4870 class graphics.

http://www.engadget.com/2011/06/14/wii-u-has-last-gen-radeon-inside-still-more-powerful-than-ps3-a/

"At its heart is a chip similar to the R770 found in AMD's last-gen cards like the 4890"

Not sure what the 4730 is. Never heard of that GPU. RV770 on the desktop was a 4850 or 4870, both with 800 SPs, 40 TMUs and 16 ROPs. The other version was the 4830 with 640 SPs, 32 TMUs and 16 ROPs. Any of the 4830/4850/4870 chips are at least 2x faster than Llano's GPU.


So much baseless speculation here...

All the presently sold consoles use an "APU", or a single chip with both the CPU and GPU. [...] This is enough to get to the performance of 5770 or so, which is probably good enough.

I was under the impression that the Xbox360 included a GPU + CPU on 1 package but 2 separate silicon dies. I heard that MS was supposed to launch a new revision that would finally combine the GPU+CPU on 1 die. But even if you are correct, it took 5 revisions preceding the Vejle revision before the GPU+CPU were combined into an APU-style setup. So it actually makes more sense that they start with 2 separate dies and then, once technology reaches 11nm, try to combine them once more.

Consider that HD5770 on 40nm occupies 170mm^2, how do you propose they'll combine such a large GPU on top of a Bulldozer (4-6-8 core), which itself hasn't even launched due to delays? But not only that, by the time PS4 and Xbox3 launch in 2012-2013, an HD5770/HD6770 GPU will probably be $50 on the market (considering they are $100 today). It would be laughable to include such a slow GPU in a PS4 / Xbox3, which are probably going to launch at $400-500. It's going to be even harder to integrate a GPU faster than the HD5770 inside a CPU die on a 28-32nm process. Consider that Llano's GPU is just half the complexity....so obviously it's extremely expensive and difficult to do.

I think you guys are too optimistic about the APU setup - you have memory bandwidth limitations and TDP constraints, and a large die actually increases costs and reduces yields relative to 2 smaller dies until the manufacturing process has matured to the point where a single die is actually cheaper to produce. However, combining a modern quad-core 32nm CPU like Bulldozer or IBM's processors with a modern 32nm GPU design is going to result in a huge die - think 400-500mm^2.

If they choose APU design at this time, the GPU performance will take a huge performance hit. The fact that Wii U didn't even go with an APU style design sends a signal that it's even less likely that PS4 and Xbox3, which are likely to include a more powerful GPU, will follow that route either.
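The yield argument above can be made concrete with a toy Poisson defect model. The defect density below is purely an assumed figure for an immature process, not real foundry data; the point is the structural conclusion, which holds for any defect density:

```python
import math

def die_yield(area_mm2, defect_density):
    """Poisson yield model: probability a die of this area has zero defects."""
    return math.exp(-area_mm2 * defect_density)

D = 0.005  # defects per mm^2 -- ASSUMED value for an immature process

# Single 400 mm^2 APU: wafer area consumed per good chip.
apu_area = 400 / die_yield(400, D)

# Two 200 mm^2 dies (CPU + GPU): good dies from each pool can be
# mixed and matched, so each yields independently.
split_area = 200 / die_yield(200, D) + 200 / die_yield(200, D)

print(apu_area / split_area)  # ~2.72x more silicon per good single-die APU
```

Under this model the big die burns roughly e times (~2.7x) more wafer area per working part, which is exactly why the post argues for two smaller dies until the process matures.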
 
Last edited:

-Slacker-

Golden Member
Feb 24, 2010
1,563
0
76
It could happen

-Modified Trinity chip with an over-emphasized GPU
-Shared fast GDDR5 RAM instead of crappy DDR3 RAM at 1333MHz.
-Other wacky videogame-oriented stuff that consoles have that PCs don't.

I could see this at cypress level of performance.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
It could happen

-Modified Trinity chip with an over-emphasized GPU

That chip wouldn't stand a chance against an embedded discrete GPU + CPU package. But also, from a business risk perspective, would you bet all your marbles on a product that hasn't even been tested, manufactured in large quantities, or sold on the market for your new console launch? Bulldozer has been delayed for 2-3 years and still hasn't launched on 32nm (and that's without a GPU onboard). Who would trust AMD to launch Trinity on time and with good yields by 2012?

Think about it: if YOU were designing the PS4 right now, would you choose a known variable with a proven sales/reliability/yield history such as the GTX560M / 6800M + IBM CPU, or an "unknown Trinity product with unknown CPU and GPU performance"?

Even if you do choose that, your competitor may choose to do a discrete GPU setup, in the process destroying your GPU in performance due to dedicated GDDR5, dedicated cooling, dedicated TDP and much beefier specs. Your main competitive advantage of "World's most advanced graphics" will be ancient history. For the Wii U this isn't a problem since they have unique 1st party support, but what exactly is going to separate the PS4 from the Xbox3? If one of those consoles is way more powerful, the other one is toast.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
No way they're gonna use an APU design from the start; perhaps in 2-4 years after the original launch.

If they (MS and Sony) launch the new consoles in 2012-13, then they could use IBM's 32nm, TSMC's 28nm or GloFo's 28nm.

Since the first XboX 360 used close to 180W and the PS3 close to 200W when playing a game, I believe both MS and Sony will keep almost the same TDPs for their next consoles.

Welcome to Valhalla: Inside the New 250GB Xbox 360 Slim



http://en.wikipedia.org/wiki/PlayStation_3_hardware

We do know that TSMC 28nm is a full node (from 40nm), which means that if MS uses AMD, they could use a shrunk-down 28nm Cayman (HD6970) derivative chip or even a mid-range GCN (HD7870 ??) design.

One thing: just because they could use such GPU technologies doesn't mean they will choose them for their next consoles; they could choose a different path.
 

zebrax2

Senior member
Nov 18, 2007
977
70
91
Consider that HD5770 on 40nm occupies 170mm^2, how do you propose they'll combine such a large GPU on top of a Bulldozer (4-6-8 core), which itself hasn't even launched due to delays? But not only that, by the time PS4 and Xbox3 launch in 2012-2013, an HD5770/HD6770 GPU will probably be $50 on the market (considering they are $100 today). It would be laughable to include such a slow GPU in a PS4 / Xbox3, which are probably going to launch at $400-500. It's going to be even harder to integrate a GPU faster than the HD5770 inside a CPU die on a 28-32nm process. Consider that Llano's GPU is just half the complexity....so obviously it's extremely expensive and difficult to do.

Sure, the 5770 may not be the top end, but you must remember that the power consumption of current cards is vastly different from cards 5 or so years ago. Looking at Xbit Labs' power consumption figures, high-end cards from the 7xxx and X19xx era consumed around 100W, compared to recent cards that consume 200-300W. You can't just shove a 200W GPU in a console.
 

nenforcer

Golden Member
Aug 26, 2008
1,782
24
81
I think it's highly unlikely Sony switches from an IBM PowerPC RISC architecture back to an AMD legacy x86 architecture.

Console manufacturers purposely want to use a proprietary architecture or custom chipset, not only to own it from top to bottom and move away from a commodity PC, but also to curb any potential piracy.

They would essentially be undoing what Microsoft did in moving away from the Intel Pentium 3 in the original XBOX.

We've seen Sony has gone from a hardware to a software layer to handle playing PS2 games on the PS3 before finally removing it altogether. Microsoft went to quite a bit of effort to write an x86 to PowerPC emulator to run as many original XBOX games on the 360.

I don't know if Sony is willing to sacrifice backwards compatibility, or to write a software layer to run PS3 games on a PS4 with some new CPU.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
An APU could happen for another reason:
Consoles are TDP limited. They basically fall into the same class as desktop replacement laptops: the entire system can use, at most, ~200W. 200W is higher than all but the beastliest laptops, but it's low enough that a discrete setup (GPU+CPU) may not have an advantage over a fusion setup.
 

sawtx

Member
Dec 9, 2008
93
0
61
http://www.engadget.com/2011/06/14/wii-u-has-last-gen-radeon-inside-still-more-powerful-than-ps3-a/

"At its heart is a chip similar to the R770 found in AMD's last-gen cards like the 4890"

Re-read that quote. The 4890 does not have a R770 but an RV790, there isn't even an R770. You really going to believe Engadget when they are going off of a translation of a site that is trying to guess at what is in the WiiU and can't even get their GPU facts straight? Again Engadget is misunderstanding what Game Watch was trying to say, basically it will use something from the HD4000 generation.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,732
432
126
Re-read that quote. The 4890 does not have a R770 but an RV790, there isn't even an R770. You really going to believe Engadget when they are going off of a translation of a site that is trying to guess at what is in the WiiU and can't even get their GPU facts straight? Again Engadget is misunderstanding what Game Watch was trying to say, basically it will use something from the HD4000 generation.

R770 is clearly RV770, of which there were the RV770 XT (4870), RV770 Pro (4850) and RV770 LE (4830). R700 was the engineering codename for the 4800 series and later became the codename for the 4870x2.
 

psoomah

Senior member
May 13, 2010
416
0
0
The 'Windows 8 to run Xbox 360 games' rumor, if true, would be a pretty compelling reason for Microsoft to go with a full AMD solution. It would considerably simplify and reduce console/PC/tablet game development costs going forward.

Microsoft appears to be taking a page from Apple's book and, with Windows 8, intends to provide a unified, interoperable environment across PCs, consoles, tablets and phones. AMD has or will have APU solutions for PCs and tablets, and possibly consoles.

Considering the hyper-competitive future landscape both are facing, they would seem natural synergistic allies in implementing Microsoft's 'unifying' vision.
 

sawtx

Member
Dec 9, 2008
93
0
61
R770 is clearly RV770, which you have RV770 XT (4870), RV770 Pro (4850) and RV770 Le (4830). Radeon R700 was the engineering codename for the 4800 series and later became the code name for the 4870x2.

He is going off of an Engadget article that did not understand what Game Watch was saying, and that put so little effort in that they mistyped the name of a GPU and attributed it to a video card that doesn't use an RV770. We don't know what the GPU is right now for the Wii U, but it is not likely to be an RV770 based on the tech demo, console size, and Nintendo's history.
 

yottabit

Golden Member
Jun 5, 2008
1,672
874
146
This would be ludicrously stupid, same as if they were rumored to be using a next-gen Intel 8-core cpu. Too much heat/power/wasted general purpose processing power/cache.

I guess that explains why the original Xbox, which used a Pentium III CPU, had so many more overheating and power consumption problems than the Xbox 360, which used a more dedicated PowerPC CPU.

Oh, hrm... wait....
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
The stock market rarely if ever follows "fanboy" logic, or what you and I would consider logical reasons for a stock price to fluctuate. I've seen many times a company announce healthy growth and revenue for a quarter, and the stock price dropped 10-15% the next day.

That's because stock prices are forward looking, not backwards looking.

Do you really believe that some random tech site has managed to scoop the big institutional investors? Joe Shmo blogger has more inside info than the investment bankers? Notice nobody in finance or business has picked up this story.

Because there isn't a story.
 
Last edited:

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
The 'Windows 8 to run Xbox 360 games' rumor, if true, would be a pretty compelling reason for Microsoft to go with a full AMD solution. It would considerably simplify and reduce console/PC/tablet game development costs going forward.
Windows 8 already officially will run on x86 and ARM, and MS has been supporting either XP or 2k3, minus most of the user-land goodies, on Xbox360. C++ on current and future Windows platforms will not be hard to port, and MS is rather dedicated to .NET. Most of those costs have already been sunk. Having removed the ties to x86 for ARM, PPC would be a breeze, easily costing more time in QA than the rest of development. Visual Studio might need some tweaks, but nothing more than the typical 2-3 year version improvements.

MS could easily use fat binaries, a la Apple's x86 transition period, and make the Xbox360/Win8 rumor true. Or, they could be stretching the truth by .NET (which already allows games to run on many Windows hardware platforms, though you're stuck without hardware access). Or, they could have the software side of things so close that it would only need a HW target change at compilation time, allowing for entirely separate binaries. And all that is only if that rumor has any truth to it at all. Whether it does or does not is far more of a business issue for MS than a technical issue to work out, and will have little to nothing to do with the chosen hardware.

But, you know, here's a better explanation, which makes way more sense than any of the sensationalism: Windows all share a similar kernel, and much driver and service code, which now has a very linear history, and tables of errors could exist on all versions for all platforms.
 
Last edited:

Gikaseixas

Platinum Member
Jul 1, 2004
2,836
218
106
An APU could happen for another reason:
Consoles are TDP limited. They basically fall into the same class as desktop replacement laptops: the entire system can use, at most, ~200W. 200W is higher than all but the beastliest laptops, but it's low enough that a discrete setup (GPU+CPU) may not have an advantage over a fusion setup.

I had an ASUS desktop replacement (18") with an Intel Q9000 that could be OCed to 2.8GHz, two mobile Radeon 4870s in CF, and a PSU rated at 240W. If that's the way they want to go, we could very well see consoles consume that much.
 

notty22

Diamond Member
Jan 1, 2010
3,375
0
0
I find the Windows 8/Xbox 360 rumor more plausible than BD in a console. IMO, it was leaked to counter whispers that BD might be a bad gaming option. People on the fringe will have been exposed to conflicting tidbits.
Back to the x360 games, I wonder if game developers sign over all sorts of rights to Microsoft that would allow this to happen.
 

Mopetar

Diamond Member
Jan 31, 2011
8,526
7,786
136
Consider that HD5770 on 40nm occupies 170mm^2, how do you propose they'll combine such a large GPU on top of a Bulldozer (4-6-8 core), which itself hasn't even launched due to delays?

Die shrink. Also, if they were going to use BD, it wouldn't be the same BD that goes into normal computers. It would be optimized for console use.

But not only that, by the time PS4 and Xbox3 launch in 2012-2013, an HD5770/HD6770 GPU will probably be $50 on the market (considering they are $100 today). It would be laughable to include such a slow GPU in a PS4 / Xbox3, which are probably going to launch at $400-500.

Because a console needs to play games at resolutions greater than 1080p? Something with the performance of a 5770 would be fairly ideal.

It's going to be even harder to integrate a GPU faster than the HD5770 inside a CPU die on a 28-32nm process. Consider that Llano's GPU is just half the complexity....so obviously it's extremely expensive and difficult to do.

It's mostly just a question of cost and the yields of a new process. AMD could easily pair a 5770-class GPU with Llano, but they would need to alleviate the existing memory bottleneck before it became useful to do so. Also consider that it's their first part on the 32 nm process, which means that smaller chips will have better yields.

It's merely an engineering problem. It might be costly, but it's worth it. First, they gain further experience building APUs, which improves their future products; and second, a lot of the cost will be offset by the large number of chips that will end up in future consoles.
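The "die shrink" answer earlier in this post can also be quantified with a quick sketch. This assumes ideal linear-dimension scaling (real shrinks achieve less than this), using the 170mm^2 HD 5770 figure quoted upthread:

```python
def shrunk_area(area_mm2, old_node_nm, new_node_nm):
    """Ideal die area scaling: area shrinks with the square of feature size."""
    return area_mm2 * (new_node_nm / old_node_nm) ** 2

# A 170 mm^2 HD 5770-class GPU (40nm) ideally shrinks to ~83 mm^2 at 28nm,
# leaving room to pair it with CPU cores on a single console chip.
print(round(shrunk_area(170, 40, 28)))  # ~83
```

That halving of area is what makes a 5770-class GPU plus a console-tuned CPU plausible as one die on a newer node, which is the crux of the disagreement in this exchange.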
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
Refer to this chart for my comment: http://stockcharts.com/h-sc/ui?s=AMD&p=W&st=2000-06-24&en=200-06-24&id=p39823290236

AMD's stock hit a high of around $42 in February 2006. The first week of march 2006, it dropped to $36. This was the week that anandtech did the "Conroe Preview: Intel Regains the Performance Crown" AMD's stock spent all of February 2006 hovering around $40, despite many market insiders clearly knowing full well that Core 2 was going to outperform it by a large margin. AMD's stock was trading at $15 less than a 9 months earlier, in 2005. In other words, the market knows next to nothing about tech. If it did, AMD's stock would have been falling during the 2nd half of 2005 into 2006. Now if you want a real head scratcher look at PCLN. Most worthless scam ripoff company ever. All you need to do is cancel once and you lose whatever savings you may have ever gotten for years of use, and those savings are questionable anyway. That company is a zero.