[Deutsche Bank Conference] AMD's New x86 Core is Zen, Will Launch With K12

Page 2

blake0812

Senior member
Feb 6, 2014
788
4
81
Hopefully they'll step it up to Intel's level. 2016 seems far away, but 2014 is close to its end.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
AMD needs to do something completely game changing. They need to completely redesign their GPU ISA. The entire concept of a video driver needs to vanish. They need to enable the writing of game code that is processed entirely by the CPU. All the SIMDs of the GPU need to be directly linked into the CPU cores, just as the current vector units are. If they can manage to give users that type of low latency access to 4000+ shaders, then they will truly have a game changer. But it's been 8 years and there's still no sign that they are anywhere near this point. It is entirely possible Intel will get there first.

And then there is storage. They need to force OEMs to use high speed flash for OS storage. They need to upgrade their memory controller to have ULLtraDIMM-style capabilities, i.e. the ability to cache DDR with NAND. They need to do something. The era of dog slow 5400 rpm hard drives that take 3 minutes to load Windows needs to end. And it is not going to end until someone forces these OEMs to end it. Namely, by providing a lower cost alternative.
 

lefty2

Senior member
May 15, 2013
240
9
81
I was thinking that a Steamroller or Excavator FX part would be a good stopgap until this Zen thing is ready. AMD depends a lot on FX revenue (said to be hundreds of thousands of units per quarter). Even though they don't perform well versus Haswell, they have an advantage in that there is no iGPU, which means a smaller die, and AMD can afford to sell them cheaper (especially if they do it on 20nm).
Taking into account that there will be no budget Broadwell desktop CPUs, AMD would be in a good position to compete in the budget enthusiast market.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Hopefully they'll step it up to Intel's level. 2016 seems far away, but 2014 is close to its end.
They can't. Not gonna happen. But they could make something good, and I hope they do. Kaveri, for instance, is what BD should have been from the start. Intel is at the top of their game these days, so matching them is just wishful thinking. But even Piledriver, and now Kaveri, aren't bad at all in some niches, and historically good server CPUs have tended to be decent to good gaming CPUs, so it's not like Intel has every corner fully covered (especially given what they charge for 4C8T). And as good as Haswell's GPU driver quality is, the Radeons still have options, optimizations, and institutional experience that Intel lacks.

What they need in 2016 is to come out with something brand new: the equivalent of Kaveri having been the initial release, or at least the first respin, of BD back ~3 years ago.
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
I was thinking that a Steamroller or Excavator FX part would be a good stopgap until this Zen thing is ready. AMD depends a lot on FX revenue (said to be hundreds of thousands of units per quarter). Even though they don't perform well versus Haswell, they have an advantage in that there is no iGPU, which means a smaller die, and AMD can afford to sell them cheaper (especially if they do it on 20nm).
Taking into account that there will be no budget Broadwell desktop CPUs, AMD would be in a good position to compete in the budget enthusiast market.

Currently AMD is selling a 315 mm^2 CPU for less than $180, not counting all the binned 6xxx series and 4xxx series. FX is very likely profit neutral due to the absurdly low prices that they must sell them at.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
AMD needs to do something completely game changing. They need to completely redesign their GPU ISA. The entire concept of a video driver needs to vanish. They need to enable the writing of game code that is processed entirely by the CPU. All the SIMDs of the GPU need to be directly linked into the CPU cores, just as the current vector units are. If they can manage to give users that type of low latency access to 4000+ shaders, then they will truly have a game changer. But its been 8 years and still no sign that they are anywhere near this point. It is entirely possible Intel will get there first.
Directly tying the SIMDs together would be hard. But they could get 99% of the way there by sharing address space and caches past L1. If they get even close to that, then it's all software.

A video driver being a thin layer on top of firmware would be ideal, IMO. Having the abstraction layers they do provides them power and flexibility, and is not something to get rid of. JIT is not ideal for all cases, but since GPUs mostly do the same thing over and over and over, it's great for them, and allows code to be forward- and backward-compatible without necessarily restricting the hardware design. Moderate-performance, low-power dedicated cores running firmware to handle 99% of it could relieve the CPU of a lot of work, without having to use any "real" assembly. It could end up being pie in the sky, though.

And then there is storage. They need to force OEMs to use high speed flash for OS storage. They need to upgrade their memory controller to have ULLtraDIMM-style capabilities, i.e. the ability to cache DDR with NAND. They need to do something. The era of dog slow 5400 rpm hard drives that take 3 minutes to load Windows needs to end. And it is not going to end until someone forces these OEMs to end it. Namely, by providing a lower cost alternative.
That's OEMs and the market. Apple has the right idea: just not offering parts that are too low end. 3D NAND might be able to get dense enough quickly enough to be that alternative. Let us hope it is.
 
Mar 10, 2006
11,715
2,012
126
Currently AMD is selling a 315 mm^2 CPU for less than $180, not counting all the binned 6xxx series and 4xxx series. FX is very likely profit neutral due to the absurdly low prices that they must sell them at.

I'm sure FX chips are gross margin positive even after manufacturing, boxing up, shipping, etc. The problem is that they probably don't sell all that well given that they are attractive only to a pretty small niche of desktop PC buyers.

When a friend of mine needed to upgrade his PC, I looked at both Intel and AMD offerings, but for the workloads that he and the majority of people do (gaming, surfing the web, etc.), the Intel product stack just had better products at just about every price point we were looking at.

For users with big multithreaded needs and a tight budget, I could see where the FX 8350, for instance, could make sense.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I was thinking that a Steamroller or Excavator FX part would be a good stopgap until this Zen thing is ready. AMD depends a lot on FX revenue (said to be hundreds of thousands of units per quarter). Even though they don't perform well versus Haswell, they have an advantage in that there is no iGPU, which means a smaller die, and AMD can afford to sell them cheaper (especially if they do it on 20nm).
Taking into account that there will be no budget Broadwell desktop CPUs, AMD would be in a good position to compete in the budget enthusiast market.
Not having an IGP means fewer sales, and so is not an advantage, once you move beyond the enthusiast market. FX CPUs are server leftovers, and the platform is cheaper due to having already been implemented years ago.
 

Atreidin

Senior member
Mar 31, 2011
464
27
86
Are we going to see Excavator-based APUs, CPUs, anything before this new core comes out?

I am very interested to see how this new design turns out, and how much influence AMD's talent acquisitions had.
 

lefty2

Senior member
May 15, 2013
240
9
81
Not having an IGP means fewer sales, and so is not an advantage, once you move beyond the enthusiast market. FX CPUs are server leftovers, and the platform is cheaper due to having already been implemented years ago.
Enthusiasts don't use the iGPU. Why should you pay for something that you will never use?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Enthusiasts don't use the iGPU. Why should you pay for something that you will never use?

I assume you define enthusiast from a purely price perspective :p

Unused IGP is also dark silicon and can be very handy.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Enthusiasts don't use the iGPU. Why should you pay for something that you will never use?
Because you are a small part of the market (probably a minority even among enthusiasts, today), and the greater masses do use it. Therefore, it's going to be cheaper to primarily develop for the mass market. If you are offered CPUs with the IGP turned off at a discounted price, that is an arbitrary decision for marketing reasons, and has basically nothing to do with AMD's costs, and thus their ability to sell the parts at a lower ASP. You basically pay for everything AMD develops and produces, as part of paying for any AMD product.

IoW, your statement is the reason the Athlon series is cheaper than an APU of similar CPU performance. But each of those Athlons costs AMD around as much as an A10 does. The lower prices to sell those (most of which likely have fine GPUs, though it does offer an outlet for chips with defective ones) have to be made up for by higher prices elsewhere, because they will not reduce costs, at least not substantially (there may be a minor cost reduction for CPUs in which the IGP is never tested, as they are filling a quota for Athlons to maintain their pricing structure, but that can't be much, if it's the case at all). Since most PCs demand an IGP for cost reasons, it also makes the FX line less than ideal for AMD; FX will be, and likely has been many times, passed over for Pentiums and Core i3s, due not to perf/W (Acer doesn't care once you buy it, right?), but to machine cost with acceptable video features and performance.

There's good reason to have a set of CPUs with no IGP, which may include those with broken IGPs. But I don't see any point in a mainstream CPU not having an IGP as part of it; it's all negatives. Now that GPGPU is finally starting to be accepted for server uses, thanks more to team green than team red, even that's not a good excuse (and that's without it being ubiquitous, which AMD could have helped make happen, but chose not to).
 
Last edited:
Aug 11, 2008
10,451
642
126
Not having an IGP means fewer sales, and so is not an advantage, once you move beyond the enthusiast market. FX CPUs are server leftovers, and the platform is cheaper due to having already been implemented years ago.

Agreed. If you got really exceptional CPU performance, that would be one thing. But except for certain heavily threaded applications, you get better CPU performance and a free IGP with Intel. The lack of an IGP really limits the attractiveness of FX for OEMs, since you have to add a dGPU along with the CPU.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
I was thinking that a Steamroller or Excavator FX part would be a good stopgap until this Zen thing is ready. AMD depends a lot on FX revenue (said to be hundreds of thousands of units per quarter). Even though they don't perform well versus Haswell, they have an advantage in that there is no iGPU, which means a smaller die, and AMD can afford to sell them cheaper (especially if they do it on 20nm).
Taking into account that there will be no budget Broadwell desktop CPUs, AMD would be in a good position to compete in the budget enthusiast market.

AMD doesn't have a die size advantage. The FX die is 315mm^2, while 4C Haswell is around 180mm^2. Same with their APUs, which are 240mm^2, while 2C Haswell goes from roughly 120mm^2 to 180mm^2. Because of that, they can't sell cheaper than Intel without showing extremely low gross margins. They are not in a good position at all, despite Intel skipping Broadwell for desktops.
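To make the die-area argument concrete, here's a rough dies-per-wafer estimate. The wafer price below is an invented placeholder (real foundry pricing is confidential) and the edge-loss formula is the usual textbook approximation; only the die areas come from the figures quoted above, and the comparison ignores that Haswell is built on Intel's own process with different wafer costs.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Approximate gross dies per wafer: usable wafer area divided by die
    area, minus a correction for partial dies lost along the wafer edge.
    Ignores scribe lines and reticle limits."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

WAFER_COST = 5000  # hypothetical wafer price in dollars, for illustration only

for name, area in [("FX (315 mm^2)", 315), ("4C Haswell (~180 mm^2)", 180)]:
    n = dies_per_wafer(area)
    print(f"{name}: {n} gross dies/wafer, ~${WAFER_COST / n:.0f} per die")
```

Even before yield is considered, the smaller die gets nearly twice as many candidates per wafer, which is the core of the margin problem being described.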
 

lefty2

Senior member
May 15, 2013
240
9
81
AMD doesn't have a die size advantage. The FX die is 315mm^2, while 4C Haswell is around 180mm^2. Same with their APUs, which are 240mm^2, while 2C Haswell goes from roughly 120mm^2 to 180mm^2. Because of that, they can't sell cheaper than Intel without showing extremely low gross margins. They are not in a good position at all, despite Intel skipping Broadwell for desktops.
Yeah, I was thinking that they could make it smaller if they put it on 20nm, but it actually says in that conference call that they are going to stay on 28nm for a long time. Ah well.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Yeah, I was thinking that they could make it smaller if they put it on 20nm, but it actually says in that conference call that they are going to stay on 28nm for a long time. Ah well.
Lacking funds, they probably want to milk 28nm, which they're now used to and which is mature, until they can go with fins, which will not be cheap (I don't know enough to say why, but that's been a universal consensus for developing with FinFET).
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Yeah, I was thinking that they could make it smaller if they put it on 20nm, but it actually says in that conference call that they are going to stay on 28nm for a long time. Ah well.

Transistor cost is higher on 20nm, so the manufacturing cost would increase.
 

lefty2

Senior member
May 15, 2013
240
9
81
Transistor cost is higher on 20nm, so the manufacturing cost would increase.
It's actually cheaper to manufacture provided the yield is high. Of course, TSMC are going to charge a high price for a leading edge node regardless of manufacturing cost.
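The yield caveat can be sketched with a simple Poisson defect model: a shrink packs more dies per wafer, but each die only counts if it is defect-free, so the node is only cheaper per *good* die once defect density matures. All wafer costs, die areas, and defect densities below are invented purely for illustration.

```python
import math

def poisson_yield(die_area_cm2, defects_per_cm2):
    """Fraction of dies with zero defects under a Poisson defect model."""
    return math.exp(-die_area_cm2 * defects_per_cm2)

def cost_per_good_die(wafer_cost, wafer_area_cm2, die_area_cm2, d0):
    gross = wafer_area_cm2 / die_area_cm2  # gross dies, ignoring edge loss
    return wafer_cost / (gross * poisson_yield(die_area_cm2, d0))

WAFER_AREA = math.pi * 15**2  # 300mm wafer, in cm^2

# Hypothetical scenario: the same chip is 2.4 cm^2 on 28nm and shrinks to
# roughly 0.64x that on 20nm; the 20nm wafer costs more, and its defect
# density starts high before maturing.
scenarios = [("28nm, mature", 4500, 2.40, 0.10),
             ("20nm, early",  6000, 1.55, 0.50),
             ("20nm, mature", 6000, 1.55, 0.10)]
for name, wafer_cost, area, d0 in scenarios:
    c = cost_per_good_die(wafer_cost, WAFER_AREA, area, d0)
    print(f"{name}: {poisson_yield(area, d0):.0%} yield, ${c:.2f} per good die")
```

With these made-up numbers, early 20nm is more expensive per good die than mature 28nm despite the denser die, while mature 20nm ends up cheaper, which captures both sides of the disagreement above.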
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Yeah, I was thinking that they could make it smaller if they put it on 20nm, but it actually says in that conference call that they are going to stay on 28nm for a long time. Ah well.

Yes, Rory Read said that, but he didn't comment on the reasons for sticking with 28nm that long. 28nm will indeed stick around for a long time with the industry, but that's Rory's smokescreen to avoid addressing the elephant in the room: Globalfoundries. Every other bleeding edge player (like AMD used to be) is already moving to 20nm and below, but no other bleeding edge player is tied to Globalfoundries the way AMD is. That's why the bleeding edge guys are heading out of 28nm and AMD isn't.

And 20nm could really help AMD. While I don't think it could make Bulldozer or its derivatives shine, I think it could allow for a better business case. But in order to do that, they needed their foundry partner to commercially deploy the node, and Globalfoundries didn't deceive anyone on this: GLF 20nm is following SOP to the letter, meaning delayed and underperforming.

20nm will be another dud for GLF, maybe even worse than 28nm was. By the time GLF 20nm is ready to deploy, the other foundries will have their FinFET processes around the corner, and AMD cannot outsource CPU production to TSMC anymore because of the Wafer Supply Agreement they have with GLF. AMD is doing what it can: dumping prices on their old line and praying that GLF will have their copycat "14nm" in place by the time Samsung does.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
It's actually cheaper to manufacture provided the yield is high. Of course, TSMC are going to charge a high price for a leading edge node regardless of manufacturing cost.

Gate cost is higher on 20nm, so the same chip shrunk to 20nm costs more.

The only reason to do 20nm would be the electrical properties, to perhaps increase the price to offset the extra cost.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Gate cost is higher on 20nm, so the same chip shrunk to 20nm costs more.

The only reason to do 20nm would be the electrical properties, to perhaps increase the price to offset the extra cost.

That should be more a symptom of TSMC deploying 20nm in a less mature state than previous nodes than something intrinsic to the node itself. Intel is getting *a lot* of cost improvements with the jump to 22nm and 14nm; the other foundries should too when they reach the same maturity levels.
 

lefty2

Senior member
May 15, 2013
240
9
81
Yes, Rory Read said that, but he didn't comment on the reasons for sticking with 28nm that long. 28nm will indeed stick around for a long time with the industry, but that's Rory's smokescreen to avoid addressing the elephant in the room: Globalfoundries. Every other bleeding edge player (like AMD used to be) is already moving to 20nm and below, but no other bleeding edge player is tied to Globalfoundries the way AMD is. That's why the bleeding edge guys are heading out of 28nm and AMD isn't.

And 20nm could really help AMD. While I don't think it could make Bulldozer or its derivatives shine, I think it could allow for a better business case. But in order to do that, they needed their foundry partner to commercially deploy the node, and Globalfoundries didn't deceive anyone on this: GLF 20nm is following SOP to the letter, meaning delayed and underperforming.

20nm will be another dud for GLF, maybe even worse than 28nm was. By the time GLF 20nm is ready to deploy, the other foundries will have their FinFET processes around the corner, and AMD cannot outsource CPU production to TSMC anymore because of the Wafer Supply Agreement they have with GLF. AMD is doing what it can: dumping prices on their old line and praying that GLF will have their copycat "14nm" in place by the time Samsung does.
I understood Globalfoundries abandoned 20nm. They no longer mention it in any of their marketing materials. Instead they are focusing on 14nm FinFET (which they claim will be ready in 1H 2015).
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
That should be more a symptom of TSMC deploying 20nm in a less mature state than previous nodes than something intrinsic to the node itself. Intel is getting *a lot* of cost improvements with the jump to 22nm and 14nm; the other foundries should too when they reach the same maturity levels.

It requires billions in design, though, and the tools. That's something TSMC isn't delivering. Hence all the reports that 20nm will cost more. Even Samsung says so as well.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I understood Globalfoundries abandoned 20nm. They no longer mention it in any of their marketing materials. Instead they are focusing on 14nm FinFET (which they claim will be ready in 1H 2015).

They license Samsung's 14FF, and they are not going to come out faster than Samsung.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Every other bleeding edge player (like AMD used to be) is already moving to 20nm and below, but no other bleeding edge player is tied to Globalfoundries the way AMD is. That's why the bleeding edge guys are heading out of 28nm and AMD isn't.

Not a single player has 20nm products in retail now; most of them will have them shipping in early 2015. That's when AMD will have 20nm products as well, so according to you AMD is another bleeding edge player in the industry ;)

Not only that, but most of the bleeding edge players will still have the majority of their volume manufactured at 28nm in 2015 and 2016. But I believe you already knew that :whistle:

http://seekingalpha.com/article/232...arnings-call-transcript?page=5&p=qanda&l=last
Lisa Su - SVP and COO: Hello John, so let me take a stab at that. I think when you look at what's important to us, clearly process technology is an important element, but we have invested quite a bit in architecture, design techniques, new IP, software. So I wouldn't say that process technology is the first and primary determinant for us. It is important that we are on competitive technology, so we have said before and I will say again that 20 nanometer is an important node for us. We will be shipping products in 20 nanometer next year, and as we move forward obviously FinFET is also important. So if you look at our business, it is quite a bit more balanced between the semi-custom, embedded, sort of commercial PRO Graphics growth portions as well as the more traditional sort of client and graphics pieces of our business. So technology plays in all of those businesses.