GDDR5 RAM vs On-Package Cache RAM to improve IGP performance?


VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
The entire reason games are moving up to $70 is not to subsidize the hardware in the PS4, but to pay for the cost of developing the game you are buying. PS3 and Xbox 360 games are ridiculously expensive to make. PS4 and Xbox Next games are going to be even more expensive to create. They now have hardware that is capable of far more than what they were building for. They need to hire more artists, programmers, directors, sound designers, etc. to build these new next-gen worlds.

Sony has stated that they will not lose money on hardware this console generation. It's also already been said that the increase in game price to $70 is to cover the additional cost of building games that utilize all the added performance/capabilities of the new consoles.
 

NTMBK

Lifer
Nov 14, 2011
10,447
5,819
136
They're still using custom parts, and that will kill the early BoM, just like always, though probably less this gen than last. AMD probably gave them great deals, compared to getting something else made, like a SMP PPC470 SoC. IBM (and LSI?) are fat and full, while AMD is hungry for cash flow.

The real problem last time was that they used old methods and ideas, which were already patently bad, from the millennium-era resurgence of, "stupid simple hardware can go fast, you guys, so let's make all stupid simple hardware!" that plagued crappy RISCs. They were bad ideas from the start, their failures in implementation were well-predicted, and their supposed appearance in high-performance hardware was as a means to an end (neither MIPS nor Alpha, way back when, suffered from treating RISC/simplicity as a religion; it just happened to be one of the many ways they were able to improve performance, back then, with much more limited xtor technology at their disposal).

MS did a better job of it overall, but neither had a CPU that was well-prepared for the memory wall the way our desktop and laptop CPUs were, and MS ended up using too little eDRAM to make the most of it (probably too costly to have 20+MB at the time). The Cell needed a more powerful main CPU, a larger cache, and larger local stores; MS's needed a larger cache, plus eDRAM of greater capacity and flexibility. With the long time between generations, not looking at it anew, and having cloistered server guys and real-time embedded guys helping the design, IBM/Sony/Toshiba made just the wrong decisions for the halo implementation. That left MS stuck with poor CPU core options, but they made what good they could out of it.

They don't just get a lower BoM by not going custom from the CPU on up, they also get accumulated wisdom from engineers who've worked on practical general-purpose CPUs, which the Cell, and thus Xenon's core, very much lacked. Either of them could afford the money for a custom CPU, but neither would have the capability to determine who would be good at managing such a project, without trying to poach a known quantity from their current employer :).

This, times 100. Anyone in this thread who hasn't, go read The Race For The New Game Machine. It's a first-hand account written by senior IBM people who were in charge of the PowerPC core that powered the PS3 and 360. It's fascinating seeing just how easily a massive project like that can go way off target. They were chasing clock speeds like 4-5GHz, but in the process throwing out so much useful stuff and making choices that were just daft for a games console. Heck, the way that book tells it, they threw out out-of-order execution and put in multithreading because one engineer wanted to make his mark on the project.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,222
589
126
So what is the general consensus on the BOM for the PS4? Reading the latest posts some say it can be sold with a profit even at $350-$400, and some say it will have to be subsidized at that price. But is the general opinion that the cost of the hardware will at least not be too far away from the expected $350-$450 sales price?

The reason I wonder is that if they can put 8 GB of GDDR5 RAM in the PS4 at that price point, is there any reason not to assume it could also make sense to do so in a $600-800 Kaveri laptop or AIO PC?
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,222
589
126
One thing I also find interesting is that Microsoft and Sony have decided to go for different solutions to solve the common memory bandwidth bottleneck problem. And that is despite the fact that both are expected to use very similar hardware otherwise (the main exception being that the PS4 is said to have 18 Radeon GCN units, the XBOX720 only 12).

To me that seems like an indication that neither option, 8 GB GDDR5 RAM (PS4) vs 8 GB DDR3 & 32 MB ESRAM (XBOX720), is obviously better in every aspect (cost, performance, etc.). Otherwise we could assume both would have chosen the same solution.

With the Haswell HD5200 On-Package Cache RAM vs Kaveri GDDR5 RAM there are other hardware differences between those CPUs (and also other aspects such as what process technology Intel vs AMD have available), which could explain why they have taken different routes. But with the PS4 and XBOX720 there are basically no such differences, and yet they have decided to go for different solutions.
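Plugging in the rumored figures makes the trade-off concrete. This is a minimal sketch using the widely reported but unconfirmed specs (256-bit GDDR5 at 5.5 Gbps/pin for the PS4, 256-bit DDR3-2133 plus ESRAM for the XBOX720); treat all numbers as illustrative:

```python
# Rough peak-bandwidth comparison of the two rumored memory setups.
# All figures here are rumored/illustrative, not confirmed specs.

def peak_bandwidth_gbs(bus_width_bits, rate_gbps_per_pin):
    """Peak bandwidth in GB/s: bus width (bits) x per-pin rate (Gbit/s) / 8."""
    return bus_width_bits * rate_gbps_per_pin / 8

ps4_gddr5 = peak_bandwidth_gbs(256, 5.5)    # 256-bit GDDR5 @ 5.5 Gbps/pin
xb_ddr3   = peak_bandwidth_gbs(256, 2.133)  # 256-bit DDR3-2133

print(f"PS4 GDDR5:  {ps4_gddr5:.0f} GB/s")  # ~176 GB/s
print(f"XB720 DDR3: {xb_ddr3:.1f} GB/s")    # ~68 GB/s, with the 32 MB ESRAM
                                            # covering hot render targets
```

On these assumed numbers the GDDR5 pool has roughly 2.5x the raw bandwidth, which is presumably what the ESRAM is there to compensate for.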
 
Last edited:

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
The reason I wonder is that if they can put 8 GB of GDDR5 RAM in the PS4 at that price point, is there any reason not to assume it could also make sense to do so in a $600-800 Kaveri laptop or AIO PC?
Yeah. There's something called "economies of scale." A PS4 is going to be a much higher volume part than any Kaveri laptop.
 

cplusplus

Member
Apr 28, 2005
91
0
0
So what is the general consensus on the BOM for the PS4? Reading the latest posts some say it can be sold with a profit even at $350-$400, and some say it will have to be subsidized at that price. But is the general opinion that the cost of the hardware will at least not be too far away from the expected $350-$450 sales price?

The reason I wonder is that if they can put 8 GB of GDDR5 RAM in the PS4 at that price point, is there any reason not to assume it could also make sense to do so in a $600-800 Kaveri laptop or AIO PC?

From the things I've read, the BOM on the PS4 should probably be around $400-500, with the GDDR5 accounting for roughly $125-$175 of that. If they price it at $399 it'll still be a loss, but nowhere near the $200-$300 they were losing when they first released the PS3. So I would think it'd be possible to put 4 GB into a laptop without it being ridiculously expensive.
 

Olikan

Platinum Member
Sep 23, 2011
2,023
275
126
The entire reason games are moving up to $70 is not to subsidize the hardware in the PS4, but to pay for the cost of developing the game you are buying. PS3 and Xbox 360 games are ridiculously expensive to make. PS4 and Xbox Next games are going to be even more expensive to create. They now have hardware that is capable of far more than what they were building for. They need to hire more artists, programmers, directors, sound designers, etc. to build these new next-gen worlds.

So true. As games become more realistic, they need a more realistic art style...
Today you can't have 4 tree models and repeat them across a forest; people will whine and boycott the brand...

Far Cry 3 probably had more designers working on trees, rocks, leaves and grass than working on the game engine :mad:
 

cytg111

Lifer
Mar 17, 2008
26,159
15,580
136
Stupid Question Of The Day(tm).

GDDR5, alright, but why wouldn't an approach like we see on LGA 2011 work in this setup? Upping the number of channels from two to four (like LGA 2011), or even six, and keeping the cheaper DDR3?
 

Piroko

Senior member
Jan 10, 2013
905
79
91
Stupid Question Of The Day(tm).

GDDR5, alright, but why wouldn't an approach like we see on LGA 2011 work in this setup? Upping the number of channels from two to four (like LGA 2011), or even six, and keeping the cheaper DDR3?
It would work, definitely. It may even be the cheaper approach at current prices. The big question is whether it would still be cheaper in one or two years.

There is probably a reason both AMD and Nvidia have chosen to go with the smallest possible GDDR5 memory buses and instead up the speed to 6-6.4 GHz effective. Higher-speed GDDR5 doesn't seem as expensive as higher pin counts, slightly larger die sizes and maybe PCBs with more layers.

This isn't directly translatable to DDR3 <-> GDDR5, but it does show a general trend.
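The trade-off above can be sketched numerically: peak bandwidth is just bus width times per-pin rate, so a narrow-fast configuration can match a wide-slow one. The widths and rates below are illustrative, not any particular card's specs:

```python
# Peak bandwidth is width x rate, so two very different bus designs
# can land on the same number. Figures are illustrative only.

def bandwidth_gbs(width_bits, rate_gbps):
    """Peak bandwidth in GB/s for a bus of width_bits at rate_gbps per pin."""
    return width_bits * rate_gbps / 8

wide_slow   = bandwidth_gbs(384, 4.0)  # wider bus, slower GDDR5
narrow_fast = bandwidth_gbs(256, 6.0)  # narrower bus, 6 Gbps GDDR5

# Both come out to 192 GB/s, but the 256-bit version needs fewer pads,
# fewer pins, and a simpler PCB, which is where the cost argument lives.
print(wide_slow, narrow_fast)  # 192.0 192.0
```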
 
Last edited:

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Stupid Question Of The Day(tm).

GDDR5, alright, but why wouldn't an approach like we see on LGA 2011 work in this setup? Upping the number of channels from two to four (like LGA 2011), or even six, and keeping the cheaper DDR3?
4 module laptops? Ack. No thank you. I would like my laptop to be able to fit on my lap.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Stupid Question Of The Day(tm).

GDDR5, alright, but why wouldn't an approach like we see on LGA 2011 work in this setup? Upping the number of channels from two to four (like LGA 2011), or even six, and keeping the cheaper DDR3?

Cost and size. Look how much more expensive LGA 2011 boards are compared to LGA 1155.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
And, AFAIK, DDR4 has 48 more pins than DDR3, but the same data width :(
Guess we'll see what advantage point-to-point has over a shared bus.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
And, AFAIK, DDR4 has 48 more pins than DDR3, but the same data width :(
Guess we'll see advantage P2P has over a bus.
Well, same data width for now. DDR3 kind of exceeded expectations, so it's stealing DDR4's thunder.
 

lamedude

Golden Member
Jan 14, 2011
1,230
68
91
One thing I also find interesting is that Microsoft and Sony have decided to go for different solutions to solve the common memory bandwidth bottleneck problem. And that is despite the fact that both are expected to use very similar hardware otherwise (the main exception being that the PS4 is said to have 18 Radeon GCN units, the XBOX720 only 12).

To me that seems like an indication that neither option, 8 GB GDDR5 RAM (PS4) vs 8 GB DDR3 & 32 MB ESRAM (XBOX720), is obviously better in every aspect (cost, performance, etc.). Otherwise we could assume both would have chosen the same solution.
My guess is MS wants to shrink theirs. The smallest 256-bit GPU is the RV670 at 192mm², so the point where you become pad-limited is likely around there. That means no 14nm, 100mm² PS4s without switching to a 128-bit bus at something like 11 GHz effective.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,667
2,537
136
Stupid Question Of The Day(tm).

GDDR5, alright, but why wouldn't an approach like we see on LGA 2011 work in this setup? Upping the number of channels from two to four (like LGA 2011), or even six, and keeping the cheaper DDR3?

The primary problem with this is that there won't be enough room for pads on the chip. Each bit line in the memory controller needs at least one pad on the chip. Using the standard way chips are integrated these days, pads take up room on the sides of the chip, and can only be so small before soldering things to them becomes infeasible. This means that the chip circumference puts a hard limit on how many memory channels you can fit on it. (And the need to keep distances within the chip small limits how "off-square" you can make it, you don't want to have a very narrow chip.)

This is especially bad for consoles, because they are expected to be shrunk several times during their lifetimes. The 7970 or Titan can use a very wide memory bus, because they will never be shrunk as-is. The next chip will be about as large, using the process transition to add more transistors instead of reducing cost. If a PS4 chip uses all its room for pads on first release, it means that the chip will never be shrunk and will never become cheaper.

Also, adding more pads and lines on the board adds quite a bit of manufacturing cost, and that is a cost that does not go down over time like silicon costs do. So wishing for wider DIMMs is misguided. It will not bring performance advantages independent of cost.


What will help is 3d and 2.5d integration. Lines on a silicon interposer are much cheaper than ones on a pcb, and using new technologies (including TSVs), chips can be bonded to interposers and each other with bit lines on the whole surface of the chip, not just the edges. This will allow much wider interfaces, which will help bandwidth and power. (No DRAM runs at GHz speeds. DRAM typically runs at much slower speeds, but they are internally very wide. To fit data from this wide interface on a narrow bus, they make the bus run very fast. Faster signals take more power, so passing the data through wider interfaces instead saves quite a bit of it.)
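The pad-limit argument above can be put into back-of-the-envelope arithmetic: the die perimeter, divided by the pad pitch, caps the number of I/O pads, and each data bit needs more than one pad once power/ground/strobe pads are counted. The pitch and per-bit overhead below are made-up illustrative numbers, not real process figures:

```python
import math

# Back-of-the-envelope: how die perimeter caps a pad-limited memory bus.
# Pad pitch and pads-per-bit overhead are illustrative assumptions.

def max_bus_bits(die_area_mm2, pad_pitch_um=60.0, pads_per_bit=2.0):
    """Pads sit along the die edges; each data bit also needs power,
    ground and strobe pads (folded into pads_per_bit)."""
    side_mm = math.sqrt(die_area_mm2)       # assume a square die
    perimeter_um = 4 * side_mm * 1000
    total_pads = perimeter_um / pad_pitch_um
    return int(total_pads / pads_per_bit)

# Shrinking the die shrinks the perimeter, and with it the widest bus
# that still fits, which is exactly the console-shrink problem:
for area in (350, 180, 90):
    print(area, "mm^2 ->", max_bus_bits(area), "pad-limited bits")
```

With these assumed numbers, a die shrunk to a quarter of its area only supports about half the bus width, since the perimeter scales with the square root of the area.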
 

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
Well, same data width for now. DDR3 kind of exceeded expectations, so it's stealing DDR4's thunder.

Well, I guess the choice was higher clocks or wider data paths - pick one. At least DDR4 is baking stacked RAM into its spec.
 

cytg111

Lifer
Mar 17, 2008
26,159
15,580
136
The primary problem with this is that there won't be enough room for pads on the chip. Each bit line in the memory controller needs at least one pad on the chip. Using the standard way chips are integrated these days, pads take up room on the sides of the chip, and can only be so small before soldering things to them becomes infeasible. This means that the chip circumference puts a hard limit on how many memory channels you can fit on it. (And the need to keep distances within the chip small limits how "off-square" you can make it, you don't want to have a very narrow chip.)

This is especially bad for consoles, because they are expected to be shrunk several times during their lifetimes. The 7970 or Titan can use a very wide memory bus, because they will never be shrunk as-is. The next chip will be about as large, using the process transition to add more transistors instead of reducing cost. If a PS4 chip uses all its room for pads on first release, it means that the chip will never be shrunk and will never become cheaper.

Also, adding more pads and lines on the board adds quite a bit of manufacturing cost, and that is a cost that does not go down over time like silicon costs do. So wishing for wider DIMMs is misguided. It will not bring performance advantages independent of cost.


What will help is 3d and 2.5d integration. Lines on a silicon interposer are much cheaper than ones on a pcb, and using new technologies (including TSVs), chips can be bonded to interposers and each other with bit lines on the whole surface of the chip, not just the edges. This will allow much wider interfaces, which will help bandwidth and power. (No DRAM runs at GHz speeds. DRAM typically runs at much slower speeds, but they are internally very wide. To fit data from this wide interface on a narrow bus, they make the bus run very fast. Faster signals take more power, so passing the data through wider interfaces instead saves quite a bit of it.)

Wow, thanks :).
 

Piroko

Senior member
Jan 10, 2013
905
79
91
The primary problem with this is that there won't be enough room for pads on the chip. Each bit line in the memory controller needs at least one pad on the chip. Using the standard way chips are integrated these days, pads take up room on the sides of the chip, and can only be so small before soldering things to them becomes infeasible. This means that the chip circumference puts a hard limit on how many memory channels you can fit on it. (And the need to keep distances within the chip small limits how "off-square" you can make it, you don't want to have a very narrow chip.)

This is especially bad for consoles, because they are expected to be shrunk several times during their lifetimes. The 7970 or Titan can use a very wide memory bus, because they will never be shrunk as-is. The next chip will be about as large, using the process transition to add more transistors instead of reducing cost. If a PS4 chip uses all its room for pads on first release, it means that the chip will never be shrunk and will never become cheaper.
While I agree that room for pads is a technological hard limit, I don't think it applies here. There have been chips with 512-bit buses, like the R600 at only 408 mm² in 2007, and the Tegra 4 is said to be about 80mm² with a dual-channel 32-bit DDR3L interface.
Imho it really boils down to an economic decision.

And yes, Homeles, two smartphones worth of PCB will still fit into your laptop :biggrin:
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,667
2,537
136
While I agree that room for pads is a technological hard limit, I don't think it applies here. There have been chips with 512-bit buses, like the R600 at only 408 mm² in 2007, and the Tegra 4 is said to be about 80mm² with a dual-channel 32-bit DDR3L interface.
Imho it really boils down to an economic decision.

R600 had the PHY on every edge of the chip; in other words, it was exactly as small as it could possibly be and still fit a 512-bit interface (plus debug and PCIe interfaces). If a console were to ship with a 512-bit interface, the main chip alone would cost in the low hundreds of dollars and never go down in price.

Not a winning proposition for a market where consoles are expected to only cost $500 for the first year or so.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
Well, I can't argue against that. The PS4 might live through 2 or 3 shrinks after all, its APU could be tiny by then.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
(No DRAM runs at GHz speeds. DRAM typically runs at much slower speeds, but they are internally very wide. To fit data from this wide interface on a narrow bus, they make the bus run very fast. Faster signals take more power, so passing the data through wider interfaces instead saves quite a bit of it.)
GDDR5 runs at GHz speeds, does it not? (I'm pretty sure GeForces are coming with it at 1.5GHz.)