AMD 2015 Roadmap: Toronto, Cambridge, and DDR4


ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Why would DDR4 be so much more expensive? Supply/demand related? Higher complexity? New-tech price gouging?

ROI is the short answer. It's always a troublesome area. If supply is too high, the memory makers lose money, so they won't do that. If they demand too high a price, volume might go through the floor. And then changing production lines costs money too. Plus they want the R&D money back as well.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,224
589
126
ROI is the short answer. It's always a troublesome area. If supply is too high, the memory makers lose money, so they won't do that. If they demand too high a price, volume might go through the floor. And then changing production lines costs money too. Plus they want the R&D money back as well.

Yes, but those issues apply to DDR3 as well. Except that in that case, the R&D money has to a large degree already been earned back.

I think the question is whether DDR4 really will be that much more expensive once it becomes mainstream with Carrizo / Skylake / [...]. And will the difference be large enough to really affect the total price of a PC, e.g. when adding 8 GB of DDR3 vs. DDR4 RAM?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Yes, but those issues apply to DDR3 as well. Except that in that case, the R&D money has to a large degree already been earned back.

I think the question is whether DDR4 really will be that much more expensive once it becomes mainstream with Carrizo / Skylake / [...]. And will the difference be large enough to really affect the total price of a PC, e.g. when adding 8 GB of DDR3 vs. DDR4 RAM?

You need to think like an OEM: millions of PCs that will cost $10-20 more, but that you can't sell at a higher price.

DDR4 starts at 2133 MHz, and DDR3-2133 already carries a $20-25 premium over DDR3-1600.

For DDR3, higher densities carried a huge premium for years as well, because servers took all the supply.

And remember, DDR4 allows only 1 DIMM per memory channel. All non-LGA2011-3 DDR4 desktop boards will feature only 2 DIMM slots.
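
To put rough numbers on what that slot limit means for capacity (a minimal sketch; the DIMM sizes are just illustrative assumptions):

```python
# Rough sketch of how the DIMM-slot limit caps total capacity.
# DIMM sizes here are illustrative assumptions, not spec-sheet values.

def max_capacity_gb(channels, dimms_per_channel, dimm_size_gb):
    """Maximum RAM with every slot populated."""
    return channels * dimms_per_channel * dimm_size_gb

# Typical DDR3 desktop board: 2 channels x 2 DIMMs per channel, 8 GB DIMMs
print(max_capacity_gb(2, 2, 8))  # 32 GB

# DDR4 desktop board per the 1-DIMM-per-channel claim: 2 channels x 1 DIMM
print(max_capacity_gb(2, 1, 8))  # 16 GB
```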
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,224
589
126
You need to think like an OEM: millions of PCs that will cost $10-20 more, but that you can't sell at a higher price.

DDR4 starts at 2133 MHz, and DDR3-2133 already carries a $20-25 premium over DDR3-1600.

For DDR3, higher densities carried a huge premium for years as well, because servers took all the supply.

And remember, DDR4 allows only 1 DIMM per memory channel. All non-LGA2011-3 DDR4 desktop boards will feature only 2 DIMM slots.

Well, the point of DDR4 is that it should eventually provide higher speeds than DDR3. Then you can also sell a PC with DDR4 at a higher price, so the extra $10-20 (or even more) should be easy to justify then, I think.

Also, as you mentioned, DDR4 is point-to-point, so if you have multiple memory sticks the performance can improve dramatically compared to DDR3. E.g. with 2 sticks you get a 2x speedup, with 4 sticks 4x, and so on. That is of course assuming a sufficient number of memory channels is supported.
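
To make the scaling concrete, here is a minimal sketch of the theoretical peak-bandwidth math (transfer rate in MT/s times 8 bytes per 64-bit channel, times the channel count); real-world gains are of course smaller:

```python
# Theoretical peak bandwidth of DDR-style memory:
# MT/s * 8 bytes per 64-bit channel * number of channels.

def peak_bandwidth_gbs(mts, channels):
    """Peak bandwidth in GB/s for a given transfer rate and channel count."""
    return mts * 8 * channels / 1000

for channels in (1, 2, 4):
    gbs = peak_bandwidth_gbs(2133, channels)
    print(f"DDR-2133, {channels} channel(s): {gbs:.1f} GB/s")
# 1 channel:  17.1 GB/s
# 2 channels: 34.1 GB/s
# 4 channels: 68.3 GB/s
```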
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Well, the point of DDR4 is that it should eventually provide higher speeds than DDR3. Then you can also sell a PC with DDR4 at a higher price, so the extra $10-20 (or even more) should be easy to justify then, I think.

Also, as you mentioned, DDR4 is point-to-point, so if you have multiple memory sticks the performance can improve dramatically compared to DDR3. E.g. with 2 sticks you get a 2x speedup, with 4 sticks 4x, and so on. That is of course assuming a sufficient number of memory channels is supported.

The performance is not worth the cost difference. So you can't increase the price.

You will not get more memory channels, so no change there besides the new limitation of only 2 DIMM slots on desktop boards. DDR4-2133 will perform like DDR3-2133, and DDR3 will most likely be cheaper at the start. But again, the price needs to come down. Else it simply won't be used.

That's why it's servers only: DDR4 offers better densities plus lower power consumption, and you can use switch chips.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,224
589
126
The performance is not worth the cost difference. So you can't increase the price.
I think the price difference can be justified if paired with an iGPU that would otherwise be severely memory-bottlenecked. Something we'll likely see happening with e.g. Broadwell / Skylake and Carrizo.
You will not get more memory channels, so no change there. DDR4-2133 will perform like DDR3-2133, and DDR3 will most likely be cheaper at the start. But again, the price needs to come down.

That's why it's servers only: DDR4 offers better densities plus lower power consumption, and you can use switch chips.

Hmmm... I'm not sure what you mean by there not being more memory channels? Are you talking about some specific product/SKU? Because surely there will be multi-channel DDR4 SKUs available going forward.

A Carrizo / Skylake / or later APU paired with 4 memory channels would be really nice. A BGA solution with the DDR4 memory soldered onto the motherboard could be suitable, so you wouldn't need 4 individual memory sticks. Then you'd get a 4x speedup compared to a single DDR3 stick at the same frequency. Would there even be a need to pair an APU with eDRAM then?
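
As a rough sanity check on that eDRAM question, here is a hypothetical comparison; the eDRAM figure is an assumption in the ballpark reported for Intel's Crystal Well, not a number for any AMD part:

```python
# Hypothetical comparison: quad-channel DDR4 vs. an on-package eDRAM cache
# for an iGPU. The eDRAM bandwidth is an assumed ballpark figure.

def ddr_peak_gbs(mts, channels):
    return mts * 8 * channels / 1000  # 8 bytes per 64-bit channel

quad_ddr4 = ddr_peak_gbs(2133, 4)  # ~68 GB/s, shared with the CPU cores
edram_assumed = 50.0               # GB/s per direction, assumed value

print(f"Quad-channel DDR4-2133: {quad_ddr4:.1f} GB/s (shared)")
print(f"Assumed eDRAM cache:    {edram_assumed:.1f} GB/s (dedicated)")
```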
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I think the price difference can be justified if paired with an iGPU that would otherwise be severely memory-bottlenecked. Something we'll likely see happening with e.g. Broadwell / Skylake and Carrizo.

Let me know when the OEMs agree. You know, the same OEMs that sell PCs with only 1 memory channel populated on IGP products to save cost ;)

Hmmm... I'm not sure what you mean by there not being more memory channels? Are you talking about some specific product/SKU? Because surely there will be multi-channel DDR4 SKUs available going forward.

A Carrizo / Skylake / or later APU paired with 4 memory channels would be really nice. A BGA solution with the DDR4 memory soldered onto the motherboard could be suitable, so you wouldn't need 4 individual memory sticks. Then you'd get a 4x speedup compared to a single DDR3 stick at the same frequency. Would there even be a need to pair an APU with eDRAM then?

Many things would be really nice, but they still don't happen. 2 channels will be the max for non-LGA2011-3 chips on the desktop, and there are no products even planned to change that.

Stacked memory or eDRAM is the solution for IGPs, not quad-channel DDR4. DDR4 is also the last standard we'll see there as standalone memory. GDDR5 is certainly the last on graphics; MCDRAM/HBM etc. take over there.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,224
589
126
Many things would be really nice, but they still don't happen. 2 channels will be the max for non-LGA2011-3 chips on the desktop, and there are no products even planned to change that.

Stacked memory or eDRAM is the solution for IGPs, not quad-channel DDR4. DDR4 is also the last standard we'll see there as standalone memory. GDDR5 is certainly the last on graphics; MCDRAM/HBM etc. take over there.

Are you sure AMD is heading in that direction too? I don't think hUMA works that well with separate dedicated GPU graphics memory / eDRAM. From what I've heard there are latency issues with hUMA when transferring data between such dedicated IGP/GPU RAM and the normal system RAM.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Are you sure AMD is heading in that direction too? I don't think hUMA works that well with separate dedicated GPU graphics memory / eDRAM. From what I've heard there are latency issues with hUMA when transferring data between such dedicated IGP/GPU RAM and the normal system RAM.

AMD is working towards HBM. But if need be, I am sure HSA/hUMA would die in favour of eDRAM/eSRAM.
 

sefsefsefsef

Senior member
Jun 21, 2007
218
1
71
Stacked memory or eDRAM is the solution for IGPs, not quad-channel DDR4. DDR4 is also the last standard we'll see there as standalone memory. GDDR5 is certainly the last on graphics; MCDRAM/HBM etc. take over there.

You might be right about GDDR5, but I don't think DDR4 is the last we'll see of socketable, DIMM-style memory. The problem with HMC and HBM is that the GB/$ ratio is super bad. Servers will always want more and more memory, and we will need memory in DIMM form factors to get there. I can actually see a return to *lower*-bandwidth, higher-capacity DDR memory channels in the next 10 years that use some fancy HBM or HMC as a caching and/or prefetching "memory controller."
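
A toy model of that caching idea; every number below is an invented placeholder, just to show the shape of the trade-off:

```python
# Toy model of a two-tier memory system: a small, fast HBM/HMC layer acting
# as a cache in front of large, cheaper DDR. All latencies are invented
# placeholder values, not measurements.

def effective_latency_ns(hit_rate, fast_ns, slow_ns):
    """Average access latency for a given cache hit rate."""
    return hit_rate * fast_ns + (1 - hit_rate) * slow_ns

HBM_NS = 50  # assumed latency of the stacked-memory tier
DDR_NS = 90  # assumed latency of the capacity DDR tier

for hit_rate in (0.5, 0.8, 0.95):
    eff = effective_latency_ns(hit_rate, HBM_NS, DDR_NS)
    print(f"hit rate {hit_rate:.0%}: ~{eff:.0f} ns average access")
```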
 

FalseChristian

Diamond Member
Jan 7, 2002
3,322
0
71
Everybody rags on AMD! Remember Intel's awful Pentium 4 and AMD's wonderful Athlon 64 FX? AMD had Intel beat back then. Why couldn't they do it once more?

DDR4 is a gimmick right now. With people using DDR3-2133 in dual channel, they don't need DDR4. CPU and especially GPU power is what matters, for gamers at least.
 

meloz

Senior member
Jul 8, 2008
320
0
76
Everybody rags on AMD! Remember Intel's awful Pentium 4 and AMD's wonderful Athlon 64 FX? AMD had Intel beat back then. Why couldn't they do it once more?

The fab gap between Intel and AMD was not as large back then as it is now. Another reason is that AMD does not have the kind of cash needed to develop Haswell- or Broadwell-beating CPUs.

DDR4 is a gimmick right now. With people using DDR3-2133 in dual channel, they don't need DDR4. CPU and especially GPU power is what matters, for gamers at least.

Intel does not make CPUs just for gamers. The DC/server segment will appreciate DDR4, as will the mobile segment once low-power DDR4 becomes available. Speeds might not be better than DDR3's at launch, but this is how progress happens, step by step.
 

Hans de Vries

Senior member
May 2, 2008
347
1,177
136
www.chip-architect.com
Looks like mobile phones might be getting DDR4 before desktop PCs:

"Samsung's 8Gb LPDDR4 Chip Brings 4GB DDR4 to Mobile"

http://www.tomshardware.com/news/samsung-8gb-20nm-processing-4gb-memory,25554.html

I guess it's a sign of the times... :)

Samsung said its 64-bit processor (expected in Feb 2014 for the Galaxy S5) will be compatible with these (3200 Mbps) LPDDR4 memories, with 4266 Mbps coming for mobile in 2015. Samsung already showed 5000 Mbps I/O at VLSI 2013, at 750 mW for a 64-bit bus.
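
For scale, those per-pin rates translate to aggregate bus bandwidth roughly like this (a minimal worked example using the 64-bit bus width quoted above):

```python
# Convert a per-pin data rate (Mbps) into aggregate bandwidth for a bus.
def bus_bandwidth_gbs(mbps_per_pin, bus_width_bits):
    return mbps_per_pin * bus_width_bits / 8 / 1000  # GB/s

print(bus_bandwidth_gbs(3200, 64))  # 25.6 GB/s for the LPDDR4 parts above
print(bus_bandwidth_gbs(5000, 64))  # 40.0 GB/s for the VLSI 2013 demo
```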

Hans
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
The fab gap between Intel and AMD was not as large back then as it is now.

As hard as it is to believe today, there was once a time when AMD actually had a manufacturing process advantage over Intel. With the 180nm (or, as it was then called, 0.18-micron) process used circa 2000, AMD was able to get the Athlon Thunderbird up to 1.4 GHz, while Intel struggled to get the Pentium III Coppermine on the same process node past 1.0 GHz, and yields were low even at that. (They attempted to release a low-volume Pentium III at 1.13 GHz, but it proved to be too unstable.)

AMD had the better CPUs almost across the board between 2000 (when the first Socket-A Athlons were released) and 2006 (when Intel brought out the Core 2 Duo). Since then things have been reversed. To a large extent this is because a combination of name recognition and anti-competitive agreements enabled Intel to rack up sales even when their products (Willamette and Prescott especially) were complete garbage. Had AMD been able to capitalize on their engineering success and obtain the profits they deserved, the x86 market today would look very different.
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
Everybody rags on AMD! Remember Intel's awful Pentium 4 and AMD's wonderful Athlon 64 FX? AMD had Intel beat back then. Why couldn't they do it once more?

DDR4 is a gimmick right now. With people using DDR3-2133 in dual channel, they don't need DDR4. CPU and especially GPU power is what matters, for gamers at least.
This is so off the mark that I don't even know where to begin.
Had AMD been able to capitalize on their engineering success and obtain the profits they deserved, the x86 market today would look very different.
Different, yes. Very different, no. Let's not forget how much Ruiz screwed the company over.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
As hard as it is to believe today, there was once a time when AMD actually had a manufacturing process advantage over Intel. With the 180nm (or, as it was then called, 0.18-micron) process used circa 2000, AMD was able to get the Athlon Thunderbird up to 1.4 GHz, while Intel struggled to get the Pentium III Coppermine on the same process node past 1.0 GHz, and yields were low even at that. (They attempted to release a low-volume Pentium III at 1.13 GHz, but it proved to be too unstable.)

AMD had the better CPUs almost across the board between 2000 (when the first Socket-A Athlons were released) and 2006 (when Intel brought out the Core 2 Duo). Since then things have been reversed. To a large extent this is because a combination of name recognition and anti-competitive agreements enabled Intel to rack up sales even when their products (Willamette and Prescott especially) were complete garbage. Had AMD been able to capitalize on their engineering success and obtain the profits they deserved, the x86 market today would look very different.

That's not the definition of a process advantage. Willamette did 2 GHz on 180 nm, for example.
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,736
156
106
I remember articles around the time of 130nm suggesting AMD's process tech was only about 6-8 months behind Intel's. I tend to see those Socket A days as AMD's best.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
I remember articles around the time of 130nm suggesting AMD's process tech was only about 6-8 months behind Intel's. I tend to see those Socket A days as AMD's best.

As a consumer, yes. The K8 and X2s were awesome, but much more expensive (which was better for AMD). If Intel were charging $500+ for a 4770K, AMD would probably be profitable. Intel now saves that pricing level for HEDT, where AMD simply can't compete.
 

zir_blazer

Golden Member
Jun 6, 2013
1,259
573
136
As a consumer, yes. The K8 and X2s were awesome, but much more expensive (which was better for AMD).
Actually, the Athlon 64 X2 and the Pentium D were similar in price. The only exception was the Pentium D 805, which was priced under the A64 X2 3800+ when dual cores were still brand new.
Besides, there are some places where Intel gets a hefty price premium. I recall that while on Newegg you had things like an Athlon 64 3000+ and a Pentium 4 Prescott 3 GHz at the same price, here Intel would command a 20-30% price premium just because of brand recognition. (AMD as a brand had an extremely bad reputation here due to the crappy PC Chips + K6-II combos, and later the soldered Duron "Pro" on even more PC Chips motherboards. But everyone blamed the processor.)
 

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
Actually, the Athlon 64 X2 and the Pentium D were similar in price. The only exception was the Pentium D 805, which was priced under the A64 X2 3800+ when dual cores were still brand new.

Much more expensive than the "Socket A" Athlons, I mean. I wasn't referencing Intel with that remark, though you are correct.
 

Soulkeeper

Diamond Member
Nov 23, 2001
6,736
156
106
As a consumer, yes. The K8 and X2s were awesome, but much more expensive (which was better for AMD). If Intel were charging $500+ for a 4770K, AMD would probably be profitable. Intel now saves that pricing level for HEDT, where AMD simply can't compete.

I really don't wanna respond to that. Thanks
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Actually, the Athlon 64 X2 and the Pentium D were similar in price. The only exception was the Pentium D 805, which was priced under the A64 X2 3800+ when dual cores were still brand new.
Besides, there are some places where Intel gets a hefty price premium. I recall that while on Newegg you had things like an Athlon 64 3000+ and a Pentium 4 Prescott 3 GHz at the same price, here Intel would command a 20-30% price premium just because of brand recognition. (AMD as a brand had an extremely bad reputation here due to the crappy PC Chips + K6-II combos, and later the soldered Duron "Pro" on even more PC Chips motherboards. But everyone blamed the processor.)

That's not how it was.

http://techreport.com/review/8616/amd-athlon-64-x2-3800-processor
http://techreport.com/review/8295/amd-athlon-64-x2-processors

The X2 performed very well indeed, but the price premium was really high as well. When AMD was in front, they took all the money they could get, and it was one of the most expensive times we ever had, if not the most expensive. AMD could enjoy being capacity-limited while having the best desktop CPUs, and you had to pay a healthy premium for that.
 

DaZeeMan

Member
Jan 2, 2014
103
0
0
Actually, the Athlon 64 X2 and the Pentium D were similar in price. The only exception was the Pentium D 805, which was priced under the A64 X2 3800+ when dual cores were still brand new.
Besides, there are some places where Intel gets a hefty price premium. I recall that while on Newegg you had things like an Athlon 64 3000+ and a Pentium 4 Prescott 3 GHz at the same price, here Intel would command a 20-30% price premium just because of brand recognition. (AMD as a brand had an extremely bad reputation here due to the crappy PC Chips + K6-II combos, and later the soldered Duron "Pro" on even more PC Chips motherboards. But everyone blamed the processor.)

What always amazed me about the K6+ series of chips (K6-2 and -3) was how they took an Intel-designed motherboard and socket and managed to get roughly twice the MHz out of the same motherboard (Super Socket 7 could still accommodate Pentium chips). Sure, FP performance was noted as lacking, but the K6-III chips in particular managed to perform quite well. They were also noted for having Level 3 cache, which helped considerably in benchmarks versus the Pentium Pros and such. L3 cache is now pretty much an industry standard, but it was a big deal back in the SS7 days...

This also helped keep the prices of the Intel chips around the late '90s down a bit. If AMD hadn't been around, Intel would have charged an even higher premium than they did.

As for AMD 'gouging' the market: as has been noted before, going to a new process node is expensive, and AMD NEVER had a large share of the market. Intel has always held 60% or much more of the overall market versus AMD. Lower market share means less money for R&D, hence the need to eke out as much as they could in those 'heady days' of the Athlon architecture.

What you have AMD to thank for is 'awakening the sleeping Intel giant'. Once Intel realized they were in danger of falling behind, they kicked their CPU development up a notch and have been steadily increasing performance ever since.

And since Intel has a LOT more money to throw at new process nodes/die shrinks than AMD, it is inevitable that they are now about 2 nodes ahead.

Just remember, you have AMD to thank for shaking Intel's confidence in its superiority just enough to give you the wonderful Haswell etc. CPUs you see today.

AMD also helped spearhead consumer-oriented multicores, 64-bit consumer processors, on-die memory controllers, and APUs. All good things for the consumer on the price/performance front, I'd say.

Even Intel's on-die graphics are getting MUCH better, thanks to AMD taking a gamble on ATI a few years back.

So, kick AMD all you want. While I agree that Intel is pretty much the undisputed market leader in CPU performance these days, it took AMD nipping at their heels to get them there... in a much shorter amount of time than if they had just 'rested on their laurels' and taken their sweet time.

Via was there too, of course, but I don't think anyone ever took Via that seriously outside of Asia...

Motorola made a more serious effort, but I remember us 'PC guys' regularly deriding the much inferior performance of Macs back in the late '90s (Apple cultists, anyway...). Even Motorola eventually had to step things up, which was probably too little, too late. Apple is now in the Intel camp, something that came as a bit of a shock to Apple enthusiasts when it happened...

We now have ARM and other up-and-coming CPU manufacturers as well, which helps keep Intel serious about their own efforts and about maintaining their market share. Which is a good thing, I think.

That being said, I'm an unabashed AMD supporter. I like choice, I don't need the 'absolute fastest performing consumer CPU on the planet', and AMD delivers sufficient performance for my needs. The only 'generation' of AMD I haven't owned since the K6-III series is the Slot A generation (I do have a Socket A). So you can thank us 'underdog supporters' for your wonderful Intel CPUs being what they are today.

I could bring up the multiple Intel socket changes in the early-to-mid 2000s that so annoyed many of you, and maybe that Rambus thing, but I think both AMD and Intel are doing much better, and about equally well, on socket longevity of late.


Anyway, my main reason for replying here was to point out this (now dated) article, as it is relevant to the roadmap mentioned in the first post:
http://www.xbitlabs.com/news/other/...nm_FinFET_Chips_Within_Next_Two_Quarters.html

I don't think we will see any 20nm AMD APUs/CPUs before 2015, but at least they are making progress. Intel should have their 14nm products coming online in 2015. Intel's deep pockets should keep them ahead in the node wars for some time to come, barring some utterly stupid management decisions (and I don't see that happening).
 