[Deutsche Bank Conference] AMD's New x86 Core Is Zen, Will Launch With K12

Page 6

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Resources are, as you say, a combination of budget and scope. And we know nothing about the scope. Yes, it looks starved from the outside, especially because of the software burden as you say, but the devil is in the deep complexity, and there is nothing we can know about that. Look how fast the K12 is coming. It shows the scope compared to BD is quite a bit smaller - perhaps by a magnitude.

Given that resources are a lot less than before and the time frame is given (2016, 4 years in R&D), we can only deduce that the scope will be much smaller than Bulldozer's was. I'd say that it will be closer to the cat cores than to Bulldozer in terms of raw performance and other parameters. It shouldn't be remotely compared to Haswell.

Nobody really knew how bad BD arch was

Sorry, but AMD management knew, otherwise they wouldn't have taken the then-dramatic steps of axing the FX/Opteron line, bringing in a new chief architect and designing a new uarch from scratch. If they didn't know how bad Bulldozer was, I don't think the old management team would have been fired at all. I think that at least in December 2010, *before* Dirk Meyer was fired, AMD had a clear idea of the train wreck on their hands. They just couldn't be open about it, because otherwise their marketing efforts would unnecessarily suffer from the bad press it would generate, and they didn't know what to do about it. Only in 2013, with sales foundering and the ARM strategy in place, could they be candid about how bad it was (the unmitigated failure comment from Andrew Feldman).
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
TechReport's power efficiency graph is based on their updated x264 encoder, which takes advantage of AVX2 for 20% better perf than IB; that's a very favourable case for HW. Their other x264 bench, which does not use AVX2, would show quite different numbers for SB relative to HW.

There are programs that are better than Fritz, particularly Houdini, which sees a 19% improvement over IB (and Stockfish to a lesser extent), but the FX also gets a boost, to the point that its perf/watt is even closer than in Fritz. In WinRAR or 7-Zip the FX-8370E has better perf/watt than said 4670K.

Edit:

They use this graph and bench for their perf/watt:

x264.png


Not this one:

handbrake.png


http://techreport.com/review/26996/amd-fx-8370e-processor-reviewed/5

I agree, but it's the most favorable situation for the BD arch. In the larger scheme it is of minor importance, as the die is still far larger.
Add the idle power disadvantage and e.g. the single-thread perf disadvantage to get a more correct picture.
 

NTMBK

Lifer
Nov 14, 2011
10,525
6,050
136
Sorry, but AMD management knew, otherwise they wouldn't have taken the then-dramatic steps of axing the FX/Opteron line, bringing in a new chief architect and design a new uarch from the scratch. If they didn't know how bad Bulldozer was, I don't think the old management team would have been fired at all. I think that at least in december 2010, *before* Dirk Meyer was fired, AMD had a clear idea of the train wreck in their hands. They just couldn't be open about it because otherwise their marketing efforts would unnecessarily suffer from the bad press it would generate and they didn't know what to do about it. Only in 2013 with sales foundering and the ARM strategy in place is that they could be candid about how bad it was (the unmitigated failure comment from Andrew Feldman).

Sounds about right, given the lead time required on a new arch. AMD have pretty clearly been in a holding pattern on the big core line, putting in minimal effort to slow the bleeding a little.
 

krumme

Diamond Member
Oct 9, 2009
5,956
1,596
136
Given that resources are a lot less than before and the time frame is given (2016, 4 years in R&D), we can only deduce that the scope will be much smaller than Bulldozer's was. I'd say that it will be closer to the cat cores than to Bulldozer in terms of raw performance and other parameters. It shouldn't be remotely compared to Haswell.

I don't know about performance. It is only one measure of quality. Production cost, TTM, projected revenue/profit, side effects on e.g. consoles, all matter as much as performance does. It can be a very high performance part using existing building blocks. And it can be a high performance part with very bad market appeal nonetheless. What we know is that resources are extremely scarce, so a slim high-perf part with low power, like Intel can do, is not doable.

It's a bad situation for AMD. If, say, they go for the lower-end server market, Intel can easily cut prices in half, or even to 25%, and still make a solid profit. They need to address new markets or segments.

Sorry, but AMD management knew, otherwise they wouldn't have taken the then-dramatic steps of axing the FX/Opteron line, bringing in a new chief architect and designing a new uarch from scratch. If they didn't know how bad Bulldozer was, I don't think the old management team would have been fired at all. I think that at least in December 2010, *before* Dirk Meyer was fired, AMD had a clear idea of the train wreck on their hands. They just couldn't be open about it, because otherwise their marketing efforts would unnecessarily suffer from the bad press it would generate, and they didn't know what to do about it. Only in 2013, with sales foundering and the ARM strategy in place, could they be candid about how bad it was (the unmitigated failure comment from Andrew Feldman).

Yes, the board must have known BD was bad at tapeout or thereabouts. And it was even postponed for 32nm. But the improvements we saw afterwards were bad imo. Selling the mobile GPU tech for - wasn't it 45M USD? - didn't help Meyer either :) - the major problem with BD was not performance imo, it was how big it was - look at the huge non-core - and how extremely power hungry it was. At launch it was expected they could raise frequencies radically in the next revisions, but that never happened. The supposed high frequencies never appeared.

But my point is, AMD was in a very bad situation before BD hit. Of course the BD launch was covered in lipstick, but that only lasted 4 hours. The lipstick on the financial results of the years before took some time to hit.
 

DrMrLordX

Lifer
Apr 27, 2000
23,231
13,314
136
Killer hardware without killer hardware support is meaningless, and this is exactly the HSA problem here. None of the big software companies are backing AMD. It's the complete opposite of the x64 situation, when Microsoft backed AMD's solution.

Exactly my point. Their engineers did the job, they have the hardware they need to win an impressive number of benchmarks, but they can't do it because they won't support developers the way they need to in order to win those benchmarks. It's crazy that we're speculating about what it is that AMD would need to do to compete with Intel when, to a great extent, they've already done it on the hardware side. They either can't, or won't, provide the software support necessary to seal the deal.

Even one big open source bench win would open some eyes and make developers take another look. And with open source, they don't have to win over developers. They can just fork the project or submit a code update for consideration in the next update. Why don't they do a rewrite on Blender to utilize HSA? Blender can use Nvidia cards via CUDA, but HSA is MIA. And that is 100% AMD's fault.

I think Kaveri was held back by its pricing for most of its existence -- I realize there was a lot of Richland inventory to sell through, which is why they were pricing Kaveri stuff at such a premium at launch.... With their recently launched A6/A8 Kaveri chips, they finally hit some more reasonable price points -- so I think sales will improve.

Sales may or may not improve, but the important thing here is for AMD to start selling APUs based on their compute capabilities.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Exactly my point. Their engineers did the job, they have the hardware they need to win an impressive number of benchmarks, but they can't do it because they won't support developers the way they need to in order to win those benchmarks. It's crazy that we're speculating about what it is that AMD would need to do to compete with Intel when, to a great extent, they've already done it on the hardware side. They either can't, or won't, provide the software support necessary to seal the deal.

AMD has a very specific conundrum to solve. If they adhere to Intel's playbook, they have to beat Intel on their own turf, which is not feasible if Intel is executing its pipeline correctly, and they will always be trailing Intel in features. But if they stray too much from Intel's playbook, that means AMD has to develop the entire support infrastructure, which is kinda hard given their resource limitations. This becomes readily apparent in some of their initiatives. Their professional GPU drivers are subpar, they have nothing comparable to CUDA in terms of ease of use, and they do not have their own Android implementation to show.

I guess AMD isn't really serious about developing solid software support for their initiatives. Some of the rationale behind going ARM is exactly to benefit from others' software R&D without being feature-handicapped as they are with Intel.

Rory did indeed change AMD's marketing focus. They are shifting from competing and reacting against Intel to developing chips for other markets outside Intel's area of interest, but the real transformation I'd like to see from AMD is the company building and delivering complete solutions, hardware and software, like Nvidia does in some market segments. This doesn't seem to be the path chosen by Rory and his team, though.
 
Last edited:

pTmdfx

Member
Feb 23, 2014
85
0
0
Rory did indeed change AMD's marketing focus. They are shifting from competing and reacting against Intel to developing chips for other markets outside Intel's area of interest
Though the markets they are diversifying into all have or will have Intel's presence. Well, (sort of) except professional graphics.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Though the markets they are diversifying into all have or will have Intel's presence. Well, (sort of) except professional graphics.

Nope. I don't think Intel will chase 15-20% gross margin embedded projects like AMD did with game consoles.
 

pTmdfx

Member
Feb 23, 2014
85
0
0
Nope. I don't think Intel will chase 15-20% gross margin embedded projects like AMD did with game consoles.
Which is just one of the new businesses. In regard to this particular business, even if you are right about integrated custom chips for game consoles, what about things other than these boxes? Atom is strong in the lower-end embedded space, while Xeon has a strong presence in the embedded networking MPU market.
 
Last edited:

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Not all monopolies are broken up nor illegal, btw. The courts have long upheld this. The 1920 antitrust case vs. U.S. Steel is one instance in which the govt. allowed that monopoly to stand while others were broken up at the same time.

Yeah, I'm sure Microsoft would agree with you -- not.

Just because there is no competitor on the market doesn't mean it will save you from legal trouble. US Steel was nearly 100 years ago, and the political landscape is vastly different....

The Breakup of Bell was far more recent (1982) and by far more relevant:
http://en.wikipedia.org/wiki/Breakup_of_the_Bell_System

Intel is probably thanking their lucky stars that Via and AMD are at least trying to compete -- so they don't have to deal with government intervention.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Yeah, I'm sure Microsoft would agree with you -- not.

Just because there is no competitor on the market doesn't mean it will save you from legal trouble. US Steel was nearly 100 years ago, and the political landscape is vastly different....

The Breakup of Bell was far more recent (1982) and by far more relevant:
http://en.wikipedia.org/wiki/Breakup_of_the_Bell_System

Intel is probably thanking their lucky stars that Via and AMD are at least trying to compete -- so they don't have to deal with government intervention.

Microsoft wasn't broken up.

There are a lot more competitors in the CPU market than just AMD.

Intel holds a 95% market share in servers. Why hasn't the justice department taken action?
 

Abwx

Lifer
Apr 2, 2011
12,034
4,996
136
I agree, but it's the most favorable situation for the BD arch. In the larger scheme it is of minor importance, as the die is still far larger.
Add the idle power disadvantage and e.g. the single-thread perf disadvantage to get a more correct picture.

I specified at the CPU level, and I also pointed to MT and the power-hungry chipset; it's not that I'm underestimating the weaknesses. One could also point out that the 8370E has been released for 95W MBs, which are often not as power hungry at the chipset level, although that's still not down to recent FM2 FCHs' TDPs.


AMD didn't expect HW to go backward in perf/watt, and they were not insightful enough to improve their AM3+ platform's consumption, even though it was obvious that it was easier to improve the system perf/watt by updating this part. After all, the 990FX + SB950 total TDP is 25.6W; at the mains level, with PSU losses, this puts the chipset's maximal footprint at 32W, which is about 20% of the system power consumption, and even more when the CPU is an 8370E. An 8-10W TDP chipset would have globally improved the perf/watt of the AM3+ platform by roughly 15%, given that all the additional controllers that palliate for USB3 and SATA3 would also see their consumption reduced.
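The chipset arithmetic above can be sketched in a few lines. Note the assumptions: the ~80% PSU efficiency and the 9W replacement chipset (midpoint of the 8-10W range) are plugged-in estimates consistent with the figures quoted, not measured values.

```python
# Back-of-the-envelope sketch of the chipset-tax argument above.
# Assumptions (not measurements): ~80% PSU efficiency, 9 W replacement chipset.
chipset_tdp = 25.6                            # 990FX + SB950 combined TDP, watts
psu_efficiency = 0.80
chipset_wall = chipset_tdp / psu_efficiency   # ~32 W at the mains, as in the post
system_wall = chipset_wall / 0.20             # chipset is ~20% of system -> ~160 W

new_chipset_wall = 9.0 / psu_efficiency       # ~11.25 W at the mains
new_system_wall = system_wall - chipset_wall + new_chipset_wall  # ~139 W

# Same performance at lower power means the perf/watt gain is the power ratio:
gain = system_wall / new_system_wall - 1
print(f"{gain:.0%}")  # prints "15%"
```

The result lands right on the "roughly 15%" figure in the post, which suggests this is the calculation behind it.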


Edit: The die size is not that important anymore; the process is 3 years old, and the big size is due to increased distance between components, which doesn't increase the defect rate.
 
Last edited:
Aug 11, 2008
10,451
642
126
All this haswell regression in performance per watt is a red herring. Even if one accepts your theory, it is still miles ahead of AMD on the desktop.

However, Haswell was focused on mobile, and the integration of the PCH and other low-power optimizations led to longer battery life in the vast majority of use cases. So it was not a step back, like you are trying to spin it to be, in the market it was aimed at.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
Microsoft wasn't broken up.

There are a lot more competitors in the CPU market than just AMD.

Intel holds a 95% market share in servers. Why hasn't the justice department taken action?

Even though Microsoft didn't get broken up, they dealt with one major headache after another and have racked up quite the legal bill at this point.

With ARM now trying to break into servers, I don't see the DOJ getting involved. With Sun, AMD, PowerPC.... It's not like Intel is the only thing available to buy.
 
Last edited:

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
So then why would Intel be the only thing available to buy, and face anti-trust action, if AMD were gone?

After all, there's ARM, SUN, POWER, MIPS, etc.

Your words, not mine.
 

Abwx

Lifer
Apr 2, 2011
12,034
4,996
136
All this haswell regression in performance per watt is a red herring. Even if one accepts your theory, it is still miles ahead of AMD on the desktop.

That's not a theory; these are real numbers. Intel clearly has the lead in ST perf/watt, but on MT the picture is quite different. As a hint about the chipset tax, compare the numbers to Intel's Z97, which is 4.1W, and X99, a big platform, which is about 6W.

However, Haswell was focused on mobile, and the integration of the PCH and other low-power optimizations led to longer battery life in the vast majority of use cases. So it was not a step back, like you are trying to spin it to be, in the market it was aimed at.

You can't design a chip that will be good for its purpose at both 5W and 100W TDPs; there are inherent compromises, all the more so with large cores. For mobile, chips like BT or Mullins have much better perf/watt; the large cores are beyond the point where more IPC is less rewarding than keeping IPC at an average level and increasing the core count instead. You can see this with Haswell, where this tendency extended up to the higher TDP levels. Core M numbers also point to a perf/watt that is barely better than the small cores I mentioned, and this despite a node shrink.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
All this haswell regression in performance per watt is a red herring. Even if one accepts your theory, it is still miles ahead of AMD on the desktop.

It's not a red herring, it's an outright lie. Don't forget that AMD was wiped out in servers with the 32nm Sandy-Bridge E/EP, because on the majority of workloads the Opterons wouldn't be able to match Intel in performance/watt.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
So then why would Intel be the only thing available to buy, and face anti-trust action, if AMD were gone?

After all, there's ARM, SUN, POWER, MIPS, etc.

Your words, not mine.

None of those companies are selling desktops -- only servers.

It would be a genuine problem for Intel if AMD went away for desktop PCs.
 
Last edited:


mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Numerous.... They are launching SSDs, pushing hard into embedded and expanding professional graphics. They clearly are tired of having all their eggs in the one (CPU) basket.... This process IMO started long ago, probably when they first bought ATi.

AMD is not manufacturing SSDs, it is rebranding them, and in a very cheapskate way. Intel won't compete here. Same with embedded: Intel doesn't compete in the same segments as AMD. High-complexity silicon with 15-20% margins isn't Intel's cup of tea; it is AMD's. Last, but not least, AMD always had a professional business; they just couldn't get it right because they can't fix their software support and devrel.

There aren't many new businesses; the only effectively new business is embedded, and this is because AMD is now willing to swallow extremely low margins to sell there.
 

MiddleOfTheRoad

Golden Member
Aug 6, 2014
1,123
5
0
AMD is not manufacturing SSDs, it is rebranding them, and in a very cheapskate way. Intel won't compete here. Same with embedded: Intel doesn't compete in the same segments as AMD. High-complexity silicon with 15-20% margins isn't Intel's cup of tea; it is AMD's. Last, but not least, AMD always had a professional business; they just couldn't get it right because they can't fix their software support and devrel.

You're pretty much wrong on all fronts -- AMD didn't have a professional graphics business before they bought ATi.

AMD is diversifying their revenue streams.... They are trying to reduce their reliance on the fickle CPU market. Even Intel felt a lot of pain last year -- their profits dropped 29% in Q2. The fastest way for AMD to enter new markets is rebranding existing products, so AMD decided to rebrand some SSDs. Intel already sells SSDs. Intel and AMD have been in embedded for years (AMD Geode), too. Of course they are competing.... They are competing on price. It really is the semi-custom chips that are new for AMD (Xbox / PS4).
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
There aren't many new businesses; the only effectively new business is embedded, and this is because AMD is now willing to swallow extremely low margins to sell there.

Embedded is not only game consoles.
 

TreVader

Platinum Member
Oct 28, 2013
2,057
2
0
If it's faster than Intel x86, I still won't buy it because it won't be in a MacBook.

I do, however, appreciate the effort. Godspeed AMD, Godspeed
 

Abwx

Lifer
Apr 2, 2011
12,034
4,996
136
It's not a red herring, it's an outright lie. Don't forget that AMD was wiped out in servers with the 32nm Sandy-Bridge E/EP, because on the majority of workloads the Opterons wouldn't be able to match Intel in performance/watt.

I would be very cautious about branding as a liar someone who provides numbers, while myself using a hollow argument that is useful neither to man nor to beast when it comes to objective comparisons.

Consumption at the 12V rail for the FX-8370E and the 4670K is 72W and 63.6W respectively, measured with Fritz chess, which is one of the most power-hungry integer programs. The 4670K scores 10% better, which translates to 23% better perf/watt; but in a task like 7-Zip the FX has 26% better perf, and hence about 13% better perf/watt than the 4670K, assuming both CPUs are fully loaded, which is 100% sure for the 4670K.

http://www.hardware.fr/focus/99/amd-fx-8370e-fx-8-coeurs-95-watts-test.html
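For what it's worth, the arithmetic above can be checked directly. This is a sketch: the perf ratios and 12V-rail draws are the ones quoted in the post, the 7-Zip power draws are assumed equal to the Fritz ones (which the post itself flags as uncertain), and rounding in the inputs shifts the results a couple of points from the quoted 23% and 13%.

```python
def perf_per_watt_ratio(perf_a, watts_a, perf_b, watts_b):
    """How much better A's perf/watt is than B's, as a ratio."""
    return (perf_a / watts_a) / (perf_b / watts_b)

# Fritz chess: 4670K scores ~10% higher; 12V-rail draws of 63.6 W vs 72 W.
fritz = perf_per_watt_ratio(1.10, 63.6, 1.00, 72.0)
print(f"4670K over FX-8370E in Fritz: {fritz - 1:.0%}")    # ~25% better perf/watt

# 7-Zip: FX is ~26% faster; power draws assumed unchanged from Fritz.
sevenzip = perf_per_watt_ratio(1.26, 72.0, 1.00, 63.6)
print(f"FX-8370E over 4670K in 7-Zip: {sevenzip - 1:.0%}") # ~11% better perf/watt
```

Either way, the direction of the comparison holds: the Intel chip leads in Fritz, the FX leads in 7-Zip.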

So who is posting outright lies?
 
Last edited: