AMD’s Research Budget Reaches a 10-Year Low

desprado

Golden Member
Jul 16, 2013
1,645
0
0
R&D budget is a very relevant indicator when it comes to measuring a semiconductor company’s progress, potential and current standing. Taping out wafers is extremely expensive, and the R&D budget usually reflects this fact. New research (via Sweclockers.com) has surfaced focusing on three companies: AMD, Intel and Nvidia. The study shows that AMD’s research budget is at a record low while Nvidia’s and Intel’s are at all-time highs.

Let’s start with Intel. Blue has a quarterly budget of approximately 3 billion dollars – roughly 12 times AMD’s budget and roughly 9 times Nvidia’s. While the difference is to be expected given that Intel maintains its own fabrication facilities, the budget shows just how big a gap Intel enjoys over its only rival in the main x86 ecosystem.

Nvidia’s R&D budget recently surpassed red’s and is now at a record high of 348 million dollars, while AMD’s has fallen from roughly this amount to 238 million dollars. To put that into perspective, this level of R&D budget was last seen back in 2004, more than ten years ago. The reason for that is, of course, brutal competition with Nvidia and constant price cuts on current products. Nvidia and AMD currently have a budget difference of almost 100 million dollars – a significant amount and one that will definitely have practical implications.
Since Nvidia and AMD do not need to worry about in-house fabrication, most of their R&D spending goes directly to silicon and wafer design. The more the spending, the better the chances of a more powerful architecture and a more efficient die once TSMC comes into play. To be fair, the R&D budget itself doesn’t paint the complete picture – R&D spending does. Theoretically speaking, it is possible for AMD to fund R&D costs from somewhere else, but that is admittedly a long shot. Combine that with the rumor of Samsung allegedly acquiring AMD and you have got yourself a very interesting turn of affairs.
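As a quick sanity check on the ratios quoted above, using the article’s own figures (these are the rough numbers given in the text, not exact financials):

```python
# Approximate quarterly R&D budgets cited in the article, in millions of USD.
intel_rd = 3000   # "approximately 3 billion dollars"
nvidia_rd = 348   # Nvidia's record high
amd_rd = 238      # AMD's 10-year low

print(round(intel_rd / amd_rd, 1))    # Intel vs AMD  -> 12.6 ("approximately 12 times")
print(round(intel_rd / nvidia_rd, 1)) # Intel vs Nvidia -> 8.6 ("approximately 9 times")
print(nvidia_rd - amd_rd)             # Nvidia-AMD gap -> 110 ("almost 100 million")
```

The article’s “12 times” and “9 times” hold up roughly; the Nvidia–AMD gap is in fact slightly over 100 million on these numbers.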

 

xpea

Senior member
Feb 14, 2014
458
156
116
They forget to say that AMD’s portfolio is much wider than Nvidia’s. While the green team concentrates only on GPUs and a bit on SoCs, the red boys have CPUs, chipsets, GPUs and SoCs/APUs. The really interesting data would be how much Intel/AMD/NV spend solely on GPU R&D. Intel and Nvidia shouldn’t be far apart...
 

csbin

Senior member
Feb 4, 2013
908
614
136
intel :sneaky:

1amd_nvidia_intel.png
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
Anyone have a graph showing intel vs amd research budgets before 2010?
 

Shehriazad

Senior member
Nov 3, 2014
555
2
46
Isn't it rather shocking that a company like Intel, which seems to mainly focus on CPUs (no matter the market), with that insane budget of 3 billion... isn't already ahead by like 500%+ in performance?

I'm just sayin'. Intel probably has more than 20 times the R&D funds allocated for CPUs... yet their CPUs aren't anywhere near that far ahead, it seems... so what are they actually doing with all that research money O_O"?
 

tviceman

Diamond Member
Mar 25, 2008
6,734
514
126
www.facebook.com
Isn't it rather shocking that a company like Intel, which seems to mainly focus on CPUs (no matter the market), with that insane budget of 3 billion... isn't already ahead by like 500%+ in performance?

I'm just sayin'. Intel probably has more than 20 times the R&D funds allocated for CPUs... yet their CPUs aren't anywhere near that far ahead, it seems... so what are they actually doing with all that research money O_O"?

They also have to research and pay for their own node advancements, they have cellular modem technology, they compete in the ultra-low-power space, they compete in HPC, they compete in servers. Intel has more products and competes in at least as many areas as Nvidia and AMD combined, and that's before factoring in their manufacturing facilities. The factories probably account for a very big chunk of R&D. Nodes and 3D transistors don't create themselves.
 
Dec 30, 2004
12,553
2
76
Eff you, Hector!!! You and your MCM Phenom killed AMD! Imagine if they had just slapped two duals together like the Q6x00s.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
Isn't it rather shocking that a company like Intel, which seems to mainly focus on CPUs (no matter the market), with that insane budget of 3 billion... isn't already ahead by like 500%+ in performance?

I'm just sayin'. Intel probably has more than 20 times the R&D funds allocated for CPUs... yet their CPUs aren't anywhere near that far ahead, it seems... so what are they actually doing with all that research money O_O"?

I could be wrong, but I suspect that the lion's share of Intel's R&D goes to research on fabs, not the CPU division. Modern fabs are insanely expensive and complicated. AMD has the disadvantages of being fabless, and the idiotic WSA keeps them from realizing some of the advantages of it, but at least they don't have to pay for fab R&D out of their limited budget.

I also suspect Intel's corporate culture is more friendly to "blue sky" R&D initiatives that might (or might not) pay off in the distant future, while AMD and Nvidia are more concerned with things that are immediately relevant to the next product cycle.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I could be wrong, but I suspect that the lion's share of Intel's R&D goes to research on fabs, not the CPU division. Modern fabs are insanely expensive and complicated. AMD has the disadvantages of being fabless, and the idiotic WSA keeps them from realizing some of the advantages of it, but at least they don't have to pay for fab R&D out of their limited budget.

I also suspect Intel's corporate culture is more friendly to "blue sky" R&D initiatives that might (or might not) pay off in the distant future, while AMD and Nvidia are more concerned with things that are immediately relevant to the next product cycle.

Very unlikely. Here is the reason why:
bulletin20150224Fig01.png
 
Jun 18, 2000
11,219
783
126
In other news, building your own fabs with your own process technology is ridiculously expensive. Intel's R&D budget mostly shows how much those costs have increased over the years. The article just glosses past this like it doesn't matter.
Since Nvidia and AMD do not need to worry about in-house fabrication, most of their R&D spending goes directly to silicon and wafer design.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
In other news, building your own fabs with your own process technology is ridiculously expensive. Intel's R&D budget mostly shows how much those costs have increased over the years. The article just glosses past this like it doesn't matter.

Did you remember to look at Samsung's, Micron's and TSMC's R&D budgets? Or at companies missing from the list, like Qualcomm and Broadcom?

Chip design is the absolute biggest cost.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
In other news, building your own fabs with your own process technology is ridiculously expensive. Intel's R&D budget mostly shows how much those costs have increased over the years. The article just glosses past this like it doesn't matter.

It is ridiculously expensive, but the main disbursement is CAPEX, not R&D expense.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
AMD gets the most out of their research budget, Intel not so much.

I say both companies get what they pay for. AMD with its shoestring budget got an unmitigated failure, a gift that keeps on giving. Intel on the other hand got the Core line, which brought billions in-house. That's why I don't have high hopes for Zen.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
If you adjust for inflation, you may have to go back to 2001-2002 to find a similar R&D budget.

One only needs to look at their product list to see that it hurts.

Carrizo for mobile only. Rebrands on the desktop, cat cores, the 310-370 series GPUs. The 380 and 390 are still nowhere to be seen – we don't even know what they will be. All with continually declining market share in both GPUs and CPUs.

Sure, they're getting the most out of it. No wonder they are in full retreat into the semi-custom business. You can't survive in the PC segment by selling old technology over and over again.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
That's an understatement. It amazes me that they support both their CPU tech and GPU tech with a budget lower than Nvidia's.

R&D pays off years down the line. What you are seeing now is the result of the R&D budget of 3+ years ago.

This is why cutting R&D is bad. It doesn't affect you right now, but it will certainly affect you 5 years down the line. At that point your products are non-competitive and you end up bleeding money... which you can't stop, because it takes several years to bring anything decent to market.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
R&D pays off years down the line. What you are seeing now is the result of the R&D budget of 3+ years ago.

This is why cutting R&D is bad. It doesn't affect you right now, but it will certainly affect you 5 years down the line. At that point your products are non-competitive and you end up bleeding money... which you can't stop, because it takes several years to bring anything decent to market.

This. What we are seeing now is the result of the decisions Seifert and Rory took circa 2011-2012, and the result is that AMD's lineup today is much smaller than it was at the time: Opteron is a goner, their HEDT platform is MIA, their desktop APU is also a goner, and the cat cores are falling behind. The results of the 2013 and 2014 cuts will be felt circa 2017-2018. If Rory made the right bets, AMD will make it into the 2020s on an ascending curve; if he didn't, well, VIA will have a buddy. But then again, you don't fire a CEO if you trust he made the right bets.