Impact of upcoming 3D DRAM on CPU and overall computing market

seitur

Senior member
Jul 12, 2013
383
1
81
http://www.computerworld.com/s/article/9242664/Micron_ships_Hybrid_Memory_Cube_that_boosts_DRAM_15X

Micron Wednesday said that it's started shipping engineering samples of its 3D Hybrid Memory Cube (HMC) to high-performance computing and network equipment makers.
[...]
The first HMC boards will deliver 2GB and 4GB of capacity, providing aggregate bi-directional bandwidth of up to 160GBps compared to DDR3's 11GBps of aggregate bandwidth and DDR4, with 20GBps to 24GBps of aggregate bandwidth.


"For quite some time memory and logic was tracking well, following Moore's Law. But, as the processor market took a lead in performance, we started falling behind in trying to keep up with improvements generation over generation," said Mike Black, chief technology strategist for Micron's Hybrid Memory Cube team. "Processors with multiple cores were being starved for memory."

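The article's "15X" headline falls straight out of the numbers quoted above. A quick back-of-envelope check (using the aggregate bandwidth figures from the article; the variable names are just for illustration):

```python
# Aggregate bandwidth figures quoted in the Computerworld article, in GB/s.
hmc_bw = 160    # first-generation HMC boards
ddr3_bw = 11    # DDR3 aggregate bandwidth
ddr4_bw = 24    # upper end of the quoted DDR4 range (20-24 GB/s)

# Ratio of HMC to each DDR generation.
print(f"HMC vs DDR3: {hmc_bw / ddr3_bw:.1f}x")  # ~14.5x, i.e. the rough "15X" claim
print(f"HMC vs DDR4: {hmc_bw / ddr4_bw:.1f}x")  # ~6.7x even against DDR4's best case
```

So the "15X" is measured against DDR3; against DDR4 the advantage shrinks to roughly 7x, which is still a generational leap.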

Micron, Hynix, and Samsung are all knee-deep in this, and the tech already seems quite mature (engineering samples are already shipping).

Thoughts?
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
Is this a different manufacturing process than NAND flash? How much retooling/upgrading would be needed to build these new chips?
And what of capacities and densities?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Looks promising. It could also finally be the death of parallel DIMMs, if density and price can follow. Otherwise it will just be another "GPU memory".

And an IGP revolution.

But I don't see it happening until after DDR4's lifespan at the earliest. So... 2018, 2019? Around the same time we might see graphene CPUs? :p
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
I bet these chips end up in iPhones and iPads long before they end up in PCs, given how slow and dumb Microsoft and Intel are... We still don't have integrated SSD controllers, NAND on DIMMs, etc.

We still have no preparation for all the new types of hybrid memory coming down the pipe: there is no shared high-bandwidth bus for volatile and nonvolatile memory. So when something like PCM hits, which could replace NAND and DRAM with just one type of memory, there will be no hardware or OS support for it. But Apple will immediately begin to use it, and then there will be just one type of memory and their OS will fly... imagine not having to load anything. Making use of such a feature requires a complete OS redesign. Imagine microsuck doing this... hahhaha
 

seitur

Senior member
Jul 12, 2013
383
1
81
ShintaiDK said:
Looks promising. It could also finally be the death of parallel DIMMs, if density and price can follow. Otherwise it will just be another "GPU memory".

And an IGP revolution.

But I don't see it happening until after DDR4's lifespan at the earliest. So... 2018, 2019? Around the same time we might see graphene CPUs? :p
An IGP revolution, definitely.

Not really - it doesn't have to wait until the end of the "2D" DDR4 lifespan. This is "just" 3D stacking plus interconnect technology.
It is NOT a sort of DDR5 - it uses the DDR3 or DDR4 standards underneath, in a similar way to how Samsung's 3D flash is normal flash, "just" stacked.

Here's info from April, when the specification was finalized:
http://www.computerworld.com/s/arti...mory_Cube_spec_to_boost_DRAM_bandwidth_by_15X

Adoption will depend on price. If it can be made cheap enough relatively fast, then adoption rates will be fast as well. The competitive advantage for any IGP or SoC maker offering products that use it is obvious. It all depends on whether the price will be low enough that users are willing to pay it.
Of course it can have its uses in dGPUs too.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
So the long-speculated full integration of PC parts into one tight bundle may actually happen in the 5-10 year time frame. It really is starting to come together as ARM pushes up and x86 pushes down. If these stacked modules can push 160GB/sec with DDR3-class DRAM, this configuration will be spot-on for integrated CPU+GPU memory. I wonder what sort of bandwidth they'll reach using DDR4 or a yet-to-be-defined DDR5.

There will be demand for new methods of keeping everything cool, unless the industry just settles on <10W computing.
 