Just wondering, why do you think no GPU maker has integrated eDRAM onto the card?


lamedude

Golden Member
Jan 14, 2011
Bitboys tried it. :)
[Image: Bitboys Axe wafer]

It's probably cheaper to go wider than to add the 100+MB of eDRAM needed for 1080p+/AA/triple buffering. A 256-bit memory bus limits how far you can shrink your die, so on consoles it makes sense.
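Quick napkin math on that 100+MB figure (my own assumptions: RGBA8 color, 32-bit depth/stencil, 4x MSAA, triple-buffered color):

```python
# Back-of-the-envelope framebuffer size at 1080p with AA and triple buffering.
# All parameters are assumptions, not specs from any particular GPU.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 4   # RGBA8 color; same size assumed for depth/stencil
MSAA = 4              # 4x multisampling stores 4 samples per pixel
COLOR_BUFFERS = 3     # triple buffering

pixels = WIDTH * HEIGHT
color = pixels * BYTES_PER_PIXEL * MSAA * COLOR_BUFFERS
depth = pixels * BYTES_PER_PIXEL * MSAA   # one depth/stencil buffer

print(f"~{(color + depth) / 2**20:.0f} MB")  # ~127 MB
```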
 

Vesku

Diamond Member
Aug 25, 2005
When you've already invested in the memory bus with large, fast GDDR, adding the amount of eDRAM a performance card would require just isn't realistic unless it's some sort of $1000+ trophy SKU.

It makes sense for consoles, and perhaps for console-like integrated graphics, where the main memory interface is considerably slower, limited in size, or both.
 

tipoo

Senior member
Oct 4, 2012
SRAM replacement, that sounds weird o_O

How come? IBM's POWER7 does the same thing; eDRAM needs something like 6x or 8x (I forget which) fewer transistors and less die area than the equivalent SRAM would.
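The ratio falls out of simple cell counts: SRAM needs 6 transistors per bit, eDRAM just 1 transistor plus a capacitor. A rough sketch (peripheral logic like sense amps and decoders ignored, which is why real-world area ratios land lower than the raw 6x):

```python
# Rough transistor-count comparison for a 32 MB on-die memory.
# 6T SRAM cell vs. 1T1C eDRAM cell; peripheral circuitry ignored.
bits = 32 * 1024 * 1024 * 8

sram_transistors  = bits * 6   # classic 6-transistor SRAM cell
edram_transistors = bits * 1   # 1 transistor + 1 trench capacitor per bit

print(f"SRAM : {sram_transistors / 1e9:.2f}B transistors")   # 1.61B
print(f"eDRAM: {edram_transistors / 1e9:.2f}B transistors")  # 0.27B
```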

Even so, the rumored 32MB eDRAM size doesn't seem like it could fit on the CPU's 30mm² die at 45nm; the 3MB of eDRAM rumored for the CPU makes more sense.

Also, if the GPU doesn't have its own eDRAM, that 12.8GB/s of memory bandwidth is cutting it extremely close, and I'm not sure that makes sense given that developers never complained about the GPU's speed and it seems to do fine, or slightly better than the PS360. With half the memory bandwidth, and part of that taken by the CPU, how would it do that?
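For scale, here's a crude estimate of ROP (color+depth) traffic alone at 720p/60fps if everything hit main memory; the overdraw and per-pixel byte counts are my guesses, not measurements:

```python
# Crude estimate of render-target traffic at 720p/60fps with no eDRAM.
WIDTH, HEIGHT, FPS = 1280, 720, 60
OVERDRAW = 3.0                 # avg. writes per pixel per frame (a guess)
BYTES_PER_TOUCH = (4 + 4) * 2  # 4B color + 4B depth, read + write

for msaa in (1, 4):
    gb_s = WIDTH * HEIGHT * FPS * OVERDRAW * msaa * BYTES_PER_TOUCH / 1e9
    print(f"{msaa}x MSAA: ~{gb_s:.1f} GB/s")
# 1x MSAA: ~2.7 GB/s
# 4x MSAA: ~10.6 GB/s
```

And that's before textures, geometry, and CPU traffic share the bus; the 360 kept exactly this traffic on its eDRAM daughter die instead of its 22.4GB/s main memory bus.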

My best guess remains that both the CPU and GPU have eDRAM for different purposes, I just can't find an official source that says so yet. And there's still the matter of the third mystery die on the package: is that the ARM security core, or could it be a standalone eDRAM package?

I wish someone would take apart the chips and look at them under a microscope already, like Chipworks does for Apple chips.
 

Zodiark1593

Platinum Member
Oct 21, 2012
I believe the Nintendo 3DS also has a few megs of eDRAM, about 6 MB if rumors are correct.

The consoles probably use eDRAM as a fast framebuffer, relying less on bandwidth to external VRAM. It even allows a UMA (unified memory architecture) similar to IGPs, as opposed to dedicated memory, reducing both cost and complexity.
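Something like this toy placement policy, with made-up numbers (the 10MB is the 360 daughter die's capacity; a 720p color+depth target at 8 bytes/pixel is about 7MB, which is why it fits without AA):

```python
# Toy allocation policy for a UMA console with a small eDRAM pool.
# Sizes and names are illustrative only.
EDRAM_MB = 10

def place(resource, size_mb, is_render_target):
    """Render targets that fit go to eDRAM; everything else shares main memory."""
    if is_render_target and size_mb <= EDRAM_MB:
        return f"{resource}: eDRAM ({size_mb} MB)"
    return f"{resource}: shared main memory ({size_mb} MB)"

print(place("720p color+depth", 7, True))   # fits -> eDRAM
print(place("texture pool", 256, False))    # -> UMA main memory
```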
 

BFG10K

Lifer
Aug 14, 2000
This would be the correct answer. When your memory pools are split, it's something the programmer has to manage when developing the game. Since most existing games aren't written for split pools, the new GPUs would rather suck at them.
I don't think you necessarily need that level of granularity. Just like games don’t directly program GPU caches, the eDRAM could be used behind the scenes by the driver as a rasterizer cache of sorts.

Let the driver manage it automatically as another level in the memory hierarchy, while programmers keep treating the GPU as a combined memory pool.
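A minimal sketch of what I mean, with invented names and a simple LRU policy; a real driver would be far smarter about which surfaces to promote:

```python
# Sketch: driver transparently keeps hot render targets in eDRAM and evicts
# on capacity pressure, while the app sees one flat memory pool.
from collections import OrderedDict

class EdramCache:
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.used = 0
        self.resident = OrderedDict()  # surface -> size, in LRU order

    def touch(self, surface, size_mb):
        """Called whenever a surface is bound as a render target."""
        if surface in self.resident:
            self.resident.move_to_end(surface)  # refresh LRU position
            return "hit: already in eDRAM"
        while self.used + size_mb > self.capacity and self.resident:
            victim, vsize = self.resident.popitem(last=False)  # evict LRU to VRAM
            self.used -= vsize
        if size_mb <= self.capacity:
            self.resident[surface] = size_mb
            self.used += size_mb
            return "miss: promoted to eDRAM"
        return "too large: stays in VRAM"

cache = EdramCache(capacity_mb=32)
print(cache.touch("gbuffer", 24))    # miss: promoted to eDRAM
print(cache.touch("shadowmap", 16))  # evicts gbuffer, then promoted
print(cache.touch("gbuffer", 24))    # miss again (was evicted)
```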
 

Flyingcircus

Junior Member
Dec 9, 2012
Even so, the rumored 32MB eDRAM size doesn't seem like it could fit on the CPU's 30mm² die at 45nm; the 3MB of eDRAM rumored for the CPU makes more sense.
That's correct: eDRAM manufactured at 40nm is about 1MB per 2mm².
As far as I'm aware the actual size of the eDRAM hasn't been confirmed yet though... or am I mistaken about that?
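Plugging that density into the rumored sizes (using the 40nm figure as an approximation for the 45nm die):

```python
# Area check using the ~2 mm^2 per MB eDRAM density quoted above.
MM2_PER_MB = 2.0
CPU_DIE_MM2 = 30

for size_mb in (3, 32):
    area = size_mb * MM2_PER_MB
    verdict = "fits" if area < CPU_DIE_MM2 else "does not fit"
    print(f"{size_mb} MB -> {area:.0f} mm^2 ({verdict} in a {CPU_DIE_MM2} mm^2 die)")
# 3 MB -> 6 mm^2 (fits)
# 32 MB -> 64 mm^2 (does not fit)
```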


Also, if the GPU doesn't have its own eDRAM, that 12.8GB/s of memory bandwidth is cutting it extremely close, and I'm not sure that makes sense given that developers never complained about the GPU's speed and it seems to do fine, or slightly better than the PS360. With half the memory bandwidth, and part of that taken by the CPU, how would it do that?
More importantly, no developer has ever complained about memory speed, and you can be quite sure they would have if 12.8GB/s of bandwidth were everything the GPU could rely on.


My best guess remains that both the CPU and GPU have eDRAM for different purposes, I just can't find an official source that says so yet. And there's still the matter of the third mystery die on the package: is that the ARM security core, or could it be a standalone eDRAM package?
eDRAM is by definition always on-die, and that third die is much too small to house any significant amount of eDRAM anyway.
It actually seems a bit small even for an ARM core, especially since it's apparently a tri-core according to marcan, but maybe, yeah... who knows what it's for. If it really is just handling DRM as marcan claims, then that could be it.
It could also be a north/southbridge equivalent.

I wish someone would take apart the chips and look at them under a microscope already, like Chipworks does for Apple chips.
Let me know if you find someone who does. :)
 