[Techpowerup] AMD "Zen" CPU Prototypes Tested, "Meet all Expectations"


Where do you think this will land, performance-wise?

  • Intel i7 Haswell-E 8 CORE

  • Intel i7 Skylake

  • Intel i5 Skylake

  • Just another Bulldozer attempt


Results are only viewable after voting.

ultimatebob

Lifer
Jul 1, 2001
25,134
2,450
126
No surprise really since this screams of clickbait and nothing else. We all remember engineers dancing in the aisles as well.

And remember, LinkedIn info can be faked easily. And beyond that, it's only some random forum posts. Oh lord. We all remember the last AMD Zen slides that were leaked. All fake.

The RWT convo with lurker also shows he is pretty much just spewing out random caca.

Desperation and impatience are the fuel.

Clickbait? More like stock price manipulation. Someone was trying to pull a good 'ol "pump and dump" with their AMD shares or options.
 

DrMrLordX

Lifer
Apr 27, 2000
23,178
13,265
136
6-8 core FX CPUs were never sold in big numbers. Despite failing, APUs are still the bread and butter.

They were just cut-down Opterons. That was always the main focus with Bulldozer and Piledriver. Desktop was secondary.

AMD's server market share continued its collapse from 2012 through 2015, going from 4.5% to ~1.5%. So I have to wonder, what was worth more revenue to AMD in the long run? That missing 3% of server market share (or, *gasp*, an improvement in market share courtesy of SR and XV not having some of the ugly bugs in BD/PD), or Kaveri? The question is basically academic, since AMD simply could not continue with development of Opterons without a new node to match.

Also, that implies that AMD could not have released such Opterons AND Kaveri within a reasonable time frame. They probably could have, but only if they put all their eggs in the Construction Core basket, instead of worrying about future products like Zen. And that might have proven to be a critical error.

It's 14LPP.

Okay, that's actually sort-of good news for Zen then. Maybe! That assumes that GF can do as well with the same general node.
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,815
1,294
136
Things are weird at GlobalFoundries..

Small blurbs from a memo:
28SLP/28HPP -> 22FDX [Base]
14LPe -> 22FDX [ULP]
14LPP -> 22FDX [UHP]

There is a hint of a new node with it, but no EDA tools or early PDK yet.
Premium node [IBM] -> 14LPX

14LPX => ETSOI FinFET w/ FBB/RBB via back biasing

So..
[Mainstream] Leading Edge = 22FDX
[Premium] Leading Edge = 14LPX
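
For context on the FBB/RBB note: back biasing lets the chip steer transistor threshold voltage at runtime through the back gate. As a rough first-order sketch only (the coupling coefficient k_BB below is a generic stand-in, not a published GlobalFoundries number):

$$ V_T \approx V_{T0} - k_{BB}\,V_{BB} $$

Forward back bias (FBB, V_BB > 0 for an nFET) lowers V_T, buying switching speed at the cost of leakage; reverse back bias (RBB) raises V_T, trading speed for lower standby power.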

The Samsung-GlobalFoundries partnership was ultimately over before it even began. Samsung and GlobalFoundries partnered in 2014, only for the deal to be superseded by the IBM acquisition in 2015.
 
Last edited:

Azuma Hazuki

Golden Member
Jun 18, 2012
1,532
866
131
NostaSeronx, where are you getting this information? None of this appears on any roadmap I've been able to find.

It looks interesting of course. I've wondered what a 22nm FDSOI or 14nm FinFET version of a CMT uarch would perform like, given that Vishera still holds up to a modern i5 in performance if not efficiency.
 
Mar 10, 2006
11,715
2,012
126
Things are weird at GlobalFoundries..

Small blurbs from a memo:
28SLP/28HPP -> 22FDX [Base]
14LPe -> 22FDX [ULP]
14LPP -> 22FDX [UHP]

There is a hint of a new node with it, but no EDA tools or early PDK yet.
Premium node [IBM] -> 14LPX

14LPX => ETSOI FinFET w/ FBB/RBB via back biasing

So..
[Mainstream] Leading Edge = 22FDX
[Premium] Leading Edge = 14LPX

The Samsung-GlobalFoundries partnership was ultimately over before it even began. Samsung and GlobalFoundries partnered in 2014, only for the deal to be superseded by the IBM acquisition in 2015.

It's pretty neat how you manage to make all of this stuff up.
 

Dresdenboy

Golden Member
Jul 28, 2003
1,730
554
136
citavia.blog.de
Clickbait? More like stock price manipulation. Someone was trying to pull a good 'ol "pump and dump" with their AMD shares or options.
The internet has too many news sites sharing the shrinking ad revenue. So to pay the bills, writers have to create lots of news articles, often repeated from other sites, on an ad-cluttered site. At least I don't need the money.
 

beginner99

Diamond Member
Jun 2, 2009
5,320
1,768
136
Yeah, it's almost as if making something out of billions of low-double-digit-nanometer transistors was REALLY freaking hard. People like to point the finger at the failings of tech companies (GloFo, AMD, Nvidia, TSMC, IBM, Intel, etc.), but it's downright amazing what the industry at large has achieved.

The amazing thing is that you can buy such a complicated, very fast chip/SoC for $100. The tech behind it is somewhat mind-boggling. At the same time you pay $200 for a nice wooden table. For me this is ridiculous.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
The amazing thing is that you can buy such a complicated, very fast chip/SoC for $100. The tech behind it is somewhat mind-boggling. At the same time you pay $200 for a nice wooden table. For me this is ridiculous.

One is mass-produced, largely by robots, and ships in mass quantities with ease. The other has much more labor per unit, and transportation is much more costly.
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,815
1,294
136
I've wondered what a 22nm FDSOI or 14nm FinFET version of a CMT uarch would perform like, given that Vishera still holds up to a modern i5 in performance if not efficiency.
A micro-architecture that utilizes clustered multithreading would perform based on the resources it is given.

No shrink is required to "fix" the 15h architectures.

- The ALUs in AGLUs would have to do all EX operations minus branches, popcnt/lzcnt, muls, and divs.
- The FPU would have to put p2 into p1, then replicate p0|p1 into 256-bit units. Going efficiency route p1 does loads, p3 does load/stores.

That is all 15h really would need to be competitive. It doesn't even need to be a whole from-scratch architecture. In fact, if you run that in an FPGA simulation, it would actually run faster and be more efficient than Zen simulated. The guys and girls at AMD have really messed this upcoming generation up.
 
Last edited:

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
202
106
A micro-architecture that utilizes clustered multithreading would perform based on the resources it is given.

No shrink is required to "fix" the 15h architectures.

- The ALUs in AGLUs would have to do all EX operations minus branches, popcnt/lzcnt, muls, and divs.
- The FPU would have to put p2 into p1, then replicate p0|p1 into 256-bit units. Going efficiency route p1 does loads, p3 does load/stores.

That is all 15h really would need to be competitive. It doesn't even need to be a whole from-scratch architecture. In fact, if you run that in an FPGA simulation, it would actually run faster and be more efficient than Zen simulated. The guys and girls at AMD have really messed this upcoming generation up.

No offence, but AMD has several really, really talented engineers. Jim Keller himself was there while Zen was being designed.

If they thought that 15H could be made better than Zen fairly cheaply, they would have done so. They would have run simulations of their own, and probably built engineering samples and tested them.

So, I think they would have made the right decision given the information they had. Let's not forget - to design a new architecture from scratch would be insanely expensive. So imagine if they went to the board and said, we want to build Zen but it will cost XXX million dollars. The board would say, "How can you be sure it is worth it?"

And they would have to prove that Zen would be worth the effort. If it weren't, they would have just done the 15H improvement project.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
They were just cut-down Opterons. That was always the main focus with Bulldozer and Piledriver. Desktop was secondary.

AMD's server market share continued its collapse from 2012 through 2015, going from 4.5% to ~1.5%. So I have to wonder, what was worth more revenue to AMD in the long run? That missing 3% of server market share (or, *gasp*, an improvement in market share courtesy of SR and XV not having some of the ugly bugs in BD/PD), or Kaveri? The question is basically academic, since AMD simply could not continue with development of Opterons without a new node to match.

You try and justify something that was never an option. FX8xxx sales, for example, pretty much peaked at around 125K units a quarter, while the FX6xxx ended up around 200K units a quarter.
 

ultimatebob

Lifer
Jul 1, 2001
25,134
2,450
126
The internet has too many news sites sharing the shrinking ad revenue. So to pay the bills, writers have to create lots of news articles, often repeated from other sites, on an ad-cluttered site. At least I don't need the money.

Of course... tech blogs "recycle" each other's stories and post reworded press releases all the time.

Note the stock price bump in AMD stock around 11/3, though. I still think that someone pulled off a pump and dump here.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Also, that implies that AMD could not have released such Opterons AND Kaveri within a reasonable time frame. They probably could have, but only if they put all their eggs in the Construction Core basket, instead of worrying about future products like Zen. And that might have proven to be a critical error.

Which AMD? Today's AMD wouldn't be able to carry on these projects, but the AMD of 2010 would. The company spent a lot more on R&D and had a much larger work force than today. It wasn't lack of resources that hindered AMD in the first place, but the limitations of a failed concept (CMT) coupled with a failed architecture (Bulldozer) and failed product conception (APU).

Had AMD had more resources to spare, and had they decided to spend them on the CMT lineup, the result would just be a deeper hole.
 

MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
In what way? Intel copied the concept BTW.

An APU is a little different from an iGPU. I don't think Intel's ever really pushed hard on GPU acceleration using their on-die GPU. AMD's public vision was always that there would be great integration between the CPU and GPU. It's a great idea, but thus far pretty much everything treats it as a CPU with a GPU stuck on it. Back almost 5 years ago, at the Llano launch, big things were planned for HSA:

Anandtech Llano Review said:
AMD is looking to change that with the arrival of its first Fusion APUs. These APUs marry one or more AMD x86 cores with dozens if not hundreds of Radeon "cores" on a single die. While today the APU is little more than a cohabitation of these two computing architectures, the end goal is something far more integrated:
[Image: evolving2.jpg]


We're still at integration, really. Same place we were when Llano launched.
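
To make the cohabitation-vs-integration point concrete, here's a minimal C sketch. It deliberately uses no real GPU API; run_kernel() is a hypothetical stand-in for whatever the Radeon side would execute, so the two functions only contrast the memory models, not actual offload:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical stand-in for work the GPU would do. */
static void run_kernel(float *data, size_t n)
{
    for (size_t i = 0; i < n; i++)
        data[i] *= 2.0f;
}

/* "Cohabitation": the GPU owns a separate buffer, so data is staged in and out. */
static void offload_with_copies(float *host, size_t n)
{
    float *dev = malloc(n * sizeof *dev);   /* stand-in for a device-side allocation */
    memcpy(dev, host, n * sizeof *dev);     /* host -> device copy */
    run_kernel(dev, n);
    memcpy(host, dev, n * sizeof *dev);     /* device -> host copy */
    free(dev);
}

/* The heterogeneous end goal: CPU and GPU share one coherent address space,
 * so the very same pointer is handed to the kernel with no staging at all. */
static void offload_shared_coherent(float *shared, size_t n)
{
    run_kernel(shared, n);
}

int main(void)
{
    float data[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    offload_with_copies(data, 4);
    offload_shared_coherent(data, 4);
    printf("%g %g %g %g\n", data[0], data[1], data[2], data[3]);
    return 0;
}

The Fusion pitch was essentially that the second function becomes the normal case: the GPU consumes the same coherent pointers the CPU uses, so there is nothing left to stage. Five years after Llano, most software still looks like the first function.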
 

Essence_of_War

Platinum Member
Feb 21, 2013
2,650
4
81
The internet has too many news sites sharing the shrinking ad revenue. So to pay the bills, writers have to create lots of news articles, often repeated from other sites, on an ad-cluttered site. At least I don't need the money.

http://idlewords.com/2015/11/the_advertising_bubble.htm

The prognosis for publishers is grim. Repent! Find a way out of the adtech racket before it collapses around you. Ditch your tracking, show dumb ads that you sell directly (not through a thicket of intermediaries), and beg your readers for mercy. Respect their privacy, bandwidth, and intelligence, flatter their vanity, and maybe they'll subscribe to something.
Or else just sit back, crack open a cool Smirnoff Ice™, and think about more creative ways to fund online publishing.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
In what way? Intel copied the concept BTW.

If you were right then AMD copied the concept also. Texas Instruments created the first microprocessor with on-chip graphics a couple of decades before AMD.

Anyway, Intel released an "APU" a year before AMD.

They don't teach that in comp sci anymore?
 
Last edited:
Aug 11, 2008
10,451
642
126
Actually, if I recall correctly, Intel was the first to market (vs AMD) with an iGPU. Granted it sucked, but I believe they were first to market.
 

BigDaveX

Senior member
Jun 12, 2014
440
216
116
Actually, if I recall correctly, Intel was the first to market (vs AMD) with an iGPU. Granted it sucked, but I believe they were first to market.

Well, if we're talking x86 products, the first wasn't AMD or Intel, it was actually Cyrix back with the MediaGX in 1997. :p

But yeah, Intel had the first iGPU with Clarkdale, albeit in a very clumsy, hack-job form; Sandy Bridge was the first one to have it properly integrated.