With the current rate of Intel CPU performance increases, could AMD be catching up?


mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
AMD has obviously been able to bring out Llano, Trinity, and soon Kaveri despite the economic constraints you mentioned. Why shouldn't they be able to continue like that going forward?

Llano and Trinity were financed by Phenom II revenues, Phenom II was financed by Phenom, and Phenom was financed by Athlon 64... I think you get the idea. For Excavator, AMD would need cash generated in the Llano time frame.

So when you factor in AMD's current situation, where free cash flow is almost zero to negative and they are pretty much shut out of the debt market, where do you think AMD can get the funds to develop a new architecture, and to do it on a cutting-edge node? More importantly, why would they sink whatever cash reserves they still have if they can't make any money?
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
There's a host of reasons why 1155 is far more comparable to FX than 2011. But this debate is par for the course really; what I mean is AtenRa agreeing with AtenRa.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Llano and Trinity were financed by Phenom II revenues, Phenom II was financed by Phenom, and Phenom was financed by Athlon 64... I think you get the idea. For Excavator, AMD would need cash generated in the Llano time frame.

By your logic Excavator would need cash from SR, and Kaveri (SR) would need cash from Trinity (PD). So how will they release Kaveri (SR) when, as you believe, they didn't make money from Llano/Trinity?

What you don't understand is that they have already spent a lot of money on the Bulldozer microarchitecture; they don't have to spend the same high amount to upgrade that architecture into the next core designs (PileDriver, SteamRoller, Excavator).

And we already know that they have started designing the next microarchitecture that will most probably replace the BD microarchitecture.

http://www.amd.com/us/press-releases/Pages/JimKellerJoinsAMD-2012aug01.aspx
SUNNYVALE, Calif. — 8/1/2012 — AMD (NYSE: AMD) announced today that Jim Keller, 53, has joined the company as corporate vice president and chief architect of AMD’s microprocessor cores, reporting to chief technology officer and senior vice president of technology and engineering Mark Papermaster. In this role, Keller will lead AMD’s microprocessor core design efforts aligned with AMD’s ambidextrous strategy with a focus on developing both high-performance and low-power processor cores that will be the foundation of AMD’s future products.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Just to add that AMD has already spent $1.3B on R&D in 2012, and they are going to spend close to $300M every quarter in 2013. They cut almost $60M from R&D by moving from SOI to bulk for the next generation of CPUs, starting with Kaveri (Q4 2013).
The doom posts about not having money for R&D for future microarchitecture designs are baseless.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Rrrriiiiight... Let's see here. A nearly worthless stock, posting losses quarter after quarter, shrinking sales, and engineers and executives leaving the company left and right.

Here is the reality, in the dimension everyone [else] lives in.

1) AMD has less money to play with now than in years past.
2) R&D isn't getting any cheaper (AMD spending less is because they have less to spend, not because they figured out how to make awesome products without researching and developing awesome products).
3) They have fewer engineers to do the R&D.
4) If they happen to overcome 1-3, they're still selling into a shrinking market, which means less ROI, which means problems 1-3 will continue to exist and be even bigger issues than they are now.

Hardly baseless, and the fact that they're spending significantly less in 2013 than they did in 2012 is even more evidence that their R&D budget is tight and shrinking.

I'd really love to know how you conjure up some of your conclusions. I'm clearly not the only one who thinks they make no sense. Not this one, and not the LGA 2011 to FX comparison.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,224
589
126
With all the R&D money that Intel has, I have to say it's really crappy to only be able to produce CPUs that provide an ~8% performance increase per year. I know they currently focus on lowering power consumption and improving the iGPU, but still.

In light of this it's even more impressive that AMD, which is on a limited budget, currently trumps Intel when it comes to delivering CPU performance increases per year (looking at what Trinity brought, and what Kaveri is expected to bring).
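As a back-of-the-envelope illustration (the ~8% figure is the poster's rough estimate, not an official number), a fixed yearly gain compounds rather than adds:

```python
# Compound effect of a fixed yearly performance increase.
# The 8% rate is the poster's estimate, used purely for illustration.

def cumulative_gain(rate_per_year: float, years: int) -> float:
    """Total speedup after `years` of compounding at `rate_per_year`."""
    return (1.0 + rate_per_year) ** years

# Five years at ~8%/year compounds to roughly 1.47x,
# a bit more than the 1.40x a naive 5 * 8% sum would suggest.
print(round(cumulative_gain(0.08, 5), 2))
```

Even so, five years of compounding barely reaches a 1.5x speedup, which is the complaint being made here.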
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Rrrriiiiight... Let's see here. A nearly worthless stock, posting losses quarter after quarter, shrinking sales, and engineers and executives leaving the company left and right.

Here is the reality, in the dimension everyone [else] lives in.

1) AMD has less money to play with now than in years past.
2) R&D isn't getting any cheaper (AMD spending less is because they have less to spend, not because they figured out how to make awesome products without researching and developing awesome products).
3) They have fewer engineers to do the R&D.
4) If they happen to overcome 1-3, they're still selling into a shrinking market, which means less ROI, which means problems 1-3 will continue to exist and be even bigger issues than they are now.

Hardly baseless, and the fact that they're spending significantly less in 2013 than they did in 2012 is even more evidence that their R&D budget is tight and shrinking.

I'd really love to know how you conjure up some of your conclusions. I'm clearly not the only one who thinks they make no sense. Not this one, and not the LGA 2011 to FX comparison.

You are thinking in two dimensions, in black and white.

They may spend less in R&D as a whole, but they are spending enough for future microarchitectures.

Your R&D is greatly dictated by your future planning. If you will not use SOI in your future products, you can cut R&D by laying off the engineers and cutting the funds spent developing the 20nm SOI process. That R&D money will not be taken away from microarchitecture design.
So to conclude, AMD's entire R&D spending may be shrinking, but they are still spending on future microarchitectures.
 

cytg111

Lifer
Mar 17, 2008
26,201
15,605
136
You are thinking in two dimensions, in black and white.

They may spend less in R&D as a whole, but they are spending enough for future microarchitectures.

Your R&D is greatly dictated by your future planning. If you will not use SOI in your future products, you can cut R&D by laying off the engineers and cutting the funds spent developing the 20nm SOI process. That R&D money will not be taken away from microarchitecture design.
So to conclude, AMD's entire R&D spending may be shrinking, but they are still spending on future microarchitectures.

- That makes zero sense in the context in which you are replying... unless you are agreeing with the man.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
1) AMD has less money to play with now than in years past.
2) R&D isn't getting any cheaper (AMD spending less is because they have less to spend, not because they figured out how to make awesome products without researching and developing awesome products).
3) They have fewer engineers to do the R&D.

To give you an idea of the magnitude of the problem: AMD's R&D budget was around $1.4 billion in 2010, $1.5 billion in 2011, and $1.35 billion in 2012, and for 2013 they are expecting $1.2 billion. So what we see here is a decreasing budget when we should be seeing a growing one.

In 2010, AMD had to worry only about their GPU architecture, Bulldozer, and Bobcat, which was treated as a low-cost project. Now in 2013 they have less money and fewer engineers, but they have to treat Kabini as a high-end project, they have to evolve their GPU architecture, they have to improve software support for HSA, they have a new ARM core to develop, and they have to improve their SeaMicro interconnect, because if they don't, their $300 million investment is done for. This would be quite an undertaking if they were already fully funded and fully staffed, and they aren't. It is obvious that they can't afford, on top of all this, to develop a big-core processor to compete with Intel.
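The budget figures quoted above can be tallied directly; this small sketch (using the poster's approximate numbers, in billions of USD) shows the year-over-year trend:

```python
# Year-over-year change in AMD's R&D budget, using the approximate
# figures from the post above (billions of USD, not audited numbers).
budgets = {2010: 1.40, 2011: 1.50, 2012: 1.35, 2013: 1.20}

years = sorted(budgets)
for prev, cur in zip(years, years[1:]):
    change = (budgets[cur] - budgets[prev]) / budgets[prev] * 100
    # Rises ~7% into 2011, then falls ~10% and ~11% in the two years after.
    print(f"{prev} -> {cur}: {change:+.1f}%")
```

Two consecutive double-digit percentage cuts, right when the project list is growing, is the core of the argument here.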
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
With all the R&D money that Intel has, I have to say it's really crappy to only be able to produce CPUs that provide ~8% CPU increase per year. I know they currently focus on lowering power consumption and improving iGPU, but still.

I think you are looking at the wrong market. I really expect IVB-EP/EX to be a revolutionary processor for those markets. 15 cores at a reasonable TDP is game-changing.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
To give you an idea of the magnitude of the problem: AMD's R&D budget was around $1.4 billion in 2010, $1.5 billion in 2011, and $1.35 billion in 2012, and for 2013 they are expecting $1.2 billion. So what we see here is a decreasing budget when we should be seeing a growing one.

In 2010, AMD had to worry only about their GPU architecture, Bulldozer, and Bobcat, which was treated as a low-cost project. Now in 2013 they have less money and fewer engineers, but they have to treat Kabini as a high-end project, they have to evolve their GPU architecture, they have to improve software support for HSA, they have a new ARM core to develop, and they have to improve their SeaMicro interconnect, because if they don't, their $300 million investment is done for. This would be quite an undertaking if they were already fully funded and fully staffed, and they aren't. It is obvious that they can't afford, on top of all this, to develop a big-core processor to compete with Intel.

Thank you mrmt, your post deftly and succinctly highlights the specific issue.

Less money combined with rising costs.

It will cost nearly 2x as much to develop a 20nm big-core IC as it cost to develop a 32nm big-core IC. AMD doesn't have 2x the budget.

How people cannot see the clear and obvious parallels between the rock and the hard place that VIA came up against and the same that AMD has come up against is beyond me. It is staring at us right there, in the numbers.

Plus it is obvious in what Rory speaks to on conference calls. If the guy had any intention of selling big cores at 20nm then he'd be prioritizing his time to talk about that roadmap trajectory. Instead he spends all his time discussing the trajectory of the low-power roadmap.

The writing is plain as day on the wall; you've got to be a true believer not to see it.
 
Aug 11, 2008
10,451
642
126
With all the R&D money that Intel has, I have to say it's really crappy to only be able to produce CPUs that provide an ~8% performance increase per year. I know they currently focus on lowering power consumption and improving the iGPU, but still.

In light of this it's even more impressive that AMD, which is on a limited budget, currently trumps Intel when it comes to delivering CPU performance increases per year (looking at what Trinity brought, and what Kaveri is expected to bring).


It should be easy to get some relatively large IPC increases when your new flagship microarchitecture is a step back from your previous one. It's much easier to make improvements from an inefficient level than from a highly efficient one.
 

Vesku

Diamond Member
Aug 25, 2005
3,743
28
86
There's a host of reasons why 1155 is far more comparable to FX than 2011. But this debate is par for the course really; what I mean is AtenRa agreeing with AtenRa.

When actually comparing the chip and node size, I think AtenRa is on target regarding the FX 8350 and i7 3820: iGP-less, similar die size, 32nm GloFo vs 32nm Intel. When comparing these two chips we can see where AMD has made budgetary compromises. Under Dirk they really were pushing to make gains in workstation and server with Bulldozer, with a smaller R&D budget than Intel, and made decisions accordingly. FX 8350 vs i7 3820, while not a completely ideal comparison, does serve adequately, imo. How they are priced and how they sell is only relevant to this thread in the generic sense that AMD needs to make money to stay in business and keep making chips.
 

cytg111

Lifer
Mar 17, 2008
26,201
15,605
136
"funny" thing is that, if anyone should have been on the front of x86/amd64 with stuff like TSX, it *should* have be the company that touts the 8 core "beast".
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
When actually comparing the chip and node size, I think AtenRa is on target regarding the FX 8350 and i7 3820: iGP-less, similar die size, 32nm GloFo vs 32nm Intel.

GloFo should have better density than Intel.
 

Pilum

Member
Aug 27, 2012
182
3
81
What you don't understand is that they have already spent a lot of money on the Bulldozer microarchitecture; they don't have to spend the same high amount to upgrade that architecture into the next core designs (PileDriver, SteamRoller, Excavator).
That's wrong. The design team and server farm requirements don't get smaller for development of successive CPU generations. Just because the CPU block diagrams are similar doesn't mean that the low-level implementation stays nearly the same and can simply be recycled from generation to generation. The whole design needs to get reworked: if you change the frontend you have to change the backend; if you change the L1 cache you have to change the interfaces to the core and L2; for changes in L2 you have to change L1 and L3; changes in the decoder imply changes in fetch and the scheduler interface; if you change the execution units you often end up having to change the micro-ops, which requires changes from decode to retire; and so on.

So you have to work on all parts of the CPU at once, and thus the design team will be basically the same size for a new chip as it is for an improved version of an old one.

You also need to consider the simulation requirements for development and validation. These requirements rise with the complexity of the architecture. A thread on RWT stated that Intel has something like 10,000 4-core servers for these purposes now, and that the number is rising annually. Buying and running such server farms requires a lot of capital, and AMD simply doesn't have the budget to compete with Intel in this regard. But if your designers have to get by with less processing power, their design can't be tested and optimized as well. The results of this can be observed with each new CPU generation.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,224
589
126
I think you are looking at the wrong market. I really expect IVB-EP/EX to be a revolutionary processor for those markets. 15 cores at a reasonable TDP is game-changing.

Not a mainstream desktop CPU, which is the primary topic of this thread. Please read the original post.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
"funny" thing is that, if anyone should have been on the front of x86/amd64 with stuff like TSX, it *should* have be the company that touts the 8 core "beast".

They should have, but they cannot afford it. It's cheaper to copy Intel's implementation 3 years down the road.
 

pyjujiop

Senior member
Mar 17, 2001
243
0
76
Remember the headroom. 77W IB vs a 140W+ PD. AMD is already way beyond spec and beyond P4 in consumption.

Well, that was one of my points. AMD can't just catch Intel by ramping up to enormous clock speeds, even if the architecture is theoretically capable of doing so. They don't have access to Intel's production facilities, and they're often a process node behind, or at least well behind in development on the same node.

I know how much power an FX draws. I'm using an 8350 at 4.3 GHz right now. If I overclock it much further, it makes the old P4 Prescott look efficient. I hate to think how bad Bulldozer must have been. I never had one of those.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
They should have, but they cannot afford it. It's cheaper to copy Intel's implementation 3 years down the road.

Isn't it interesting that AMD was most successful when they were challenging Intel with leading-edge innovation in x86? AMD's leadership in iGPU/APU came at a cost to x86 performance and hasn't provided the same kind of benefit in the general market.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
It will cost nearly 2x as much to develop a 20nm big-core IC as it cost to develop a 32nm big-core IC.

That is correct, but since 2009 AMD doesn't have to spend a cent of R&D on new process development; GlobalFoundries does. AMD also doesn't have to spend a cent on fab machinery; GloFo does.
So, although the cost of new IC development (design + masks + EDA tools) at 20nm is double that of 32/28nm, the cost still remains fairly low (close to $1B, maybe?). AMD still has the R&D budget to spread that over the 2-3 years the new CPU design needs.

Also, don't forget that as of now they only have the Bulldozer microarchitecture for their server, high-end desktop AND high-end APU lines. Not being able to develop and bring SteamRoller and Excavator to market would mean AMD would be dead by the end of the year, since Kabini and Temash alone would not be enough to sustain them until they could bring out a new microarchitecture (after 2016).
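A back-of-the-envelope check of the spread-the-cost argument above (both figures are the thread's rough estimates, not official numbers):

```python
# Rough check of whether a ~$1B new-IC design cost fits inside AMD's
# expected annual R&D spend when amortized over a multi-year design
# cycle. Both figures are the thread's approximations (billions of USD).

design_cost = 1.0       # assumed total cost of a 20nm big-core design
annual_rd_budget = 1.2  # expected 2013 R&D spend, per the thread

for years in (2, 3):
    per_year = design_cost / years
    share = per_year / annual_rd_budget * 100
    print(f"over {years} years: ${per_year:.2f}B/yr, {share:.0f}% of the R&D budget")
```

On these assumptions the design would consume somewhere between a quarter and a half of the total R&D budget each year, which is why the two sides of this thread read the same numbers so differently.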