The future of AMD in graphics


turtile

Senior member
Aug 19, 2014
633
315
136
AMD's cost to tape out chips will still be lower than Nvidia's because they can share development costs between CPU and GPU. They've also made a deal with Synopsys to help lower the cost of future cutting-edge designs (licensing and shared IP). The software cost is the largest chunk of the design. I assume this is the main reason AMD released the 7nm Vega. They needed a test product to make sure 7nm CPU designs have no issues.
 
  • Like
Reactions: DarthKyrie

jpiniero

Lifer
Oct 1, 2010
16,823
7,264
136
I assume this is the main reason AMD released the 7nm Vega. They needed a test product to make sure 7nm CPU designs have no issues.

It was more about the margins - the Instinct margins are way better than even Epyc. Not to mention they badly needed a competitive DP part if they were serious about competing with nVidia in the datacenter.

Now the VII is not making AMD much if anything, but is better than the trash can.
 
  • Like
Reactions: turtile

Zstream

Diamond Member
Oct 24, 2005
3,395
277
136
CPU has enough revenue in the total addressable market for AMD to survive.

GPU, on the other hand, does not. The consumer graphics market is worth around 5 to 6 billion a year.

[Image: nano3.png, from the SemiEngineering article linked below]


https://semiengineering.com/big-trouble-at-3nm/

At 3nm, they estimate it will be 1.5 billion dollars for a complex GPU, which is something Nvidia can barely afford at the moment; they will need their data center revenue to continue growing.

At 7nm, costs are tripling compared to 16nm.

These costs are significant and have a bearing on how many chips AMD releases. At 28nm, AMD initially launched 3 different chips, and over the lifetime of the node released 8 different designs.

During 16/14nm, AMD initially launched only two chips, and by the time this generation is over, AMD will only have released 4 different chip designs.

If we count Vega 20 as a 2018 design, since that is when it was released, and Navi at the end of this year as a 2019 design, AMD will be trickling out only one 7nm card a year; if we look at the costs of design and engineering hours, we see why.

What this means for the future is that AMD will have to get by with fewer designs and have each chip cover a larger market. This puts them at a disadvantage because it means compromises on the chip: a chip that might be too small to address the mainstream market Polaris 10 covers, but too large to go into laptops. What AMD needs to do between now and 5nm is make a chip design like Ryzen 2, where multiple chips share the same design so one chip can cover the entire market. The difficulty is that GPU workloads are more latency-sensitive than CPU ones. If AMD can't do this, they won't be able to afford any nodes beyond 7nm, because their revenue is too low and is not growing enough.

Without significant revenue growth to offset the development costs of newer nodes, it simply makes less and less sense for AMD to make consumer graphics. The CPU market is worth 10x as much as the GPU market, meaning the return on investment is much better for AMD there. The GPU market, by comparison, is small.

Lisa Su sees this, which is why the cost-cutting measures have been focused on AMD's graphics division, which had largely been the most successful and profitable part of the company for years. Most people would ask: why slowly kill the most successful part of your business? That view is short-sighted. The reason is that a 60 billion dollar market has room for two players, while a 6 billion dollar market is going to be a struggle. The former has the value and potential to keep up with the ongoing and increasing cost of new nodes; the latter does not, unless you are Nvidia and in a dominant position.

If Su had not focused the company's resources on CPUs, the graphics division's profits would eventually have been unable to cover the R&D cost of the CPU division, the CPU division would have died, and it would have taken AMD graphics down with it because of the ongoing development costs of smaller nodes.

Going to 5nm, development cost rises from about 100 million at 16/14nm to about 540 million, a 5.4x increase. Will AMD's graphics revenue have increased that much to keep up with the cost? Certainly not in the consumer market. One of the big reasons AMD could afford even the transition from 28nm to 16nm was the shift from a largely North American GPU development workforce to a Chinese one. With nowhere left to pinch pennies and the graphics division struggling without mining, the RTG group needs rapid revenue growth to make sense for AMD. This is why AMD has targeted the data center over the gamer segment. I expect even Nvidia to start struggling at 5nm and likely to move part of its workforce to China. Revenue growth will be critical for them to continue using new nodes, because a 1.5 billion dollar 3nm design is too expensive for them right now.
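Putting those design-cost figures next to a revenue number makes the squeeze obvious. Here is a rough back-of-the-envelope in Python, using only the numbers quoted in this post (the assumed revenue is the ~$300-400M per quarter consumer dGPU figure discussed below, taken at its midpoint), so treat it as illustrative rather than official:

```python
# Back-of-the-envelope: design cost per node vs. AMD's consumer dGPU revenue.
# All figures are the rough ones quoted in this post, not official data.
design_cost_musd = {
    "16/14nm": 100,    # ~$100M per complex design
    "7nm": 300,        # roughly 3x the 16nm cost
    "5nm": 540,        # ~5.4x the 16nm cost
    "3nm": 1500,       # SemiEngineering's high-end estimate for a complex GPU
}

annual_revenue_musd = 350 * 4      # assumed ~$350M/quarter consumer dGPU revenue

for node, cost in design_cost_musd.items():
    ratio = cost / annual_revenue_musd
    print(f"{node:>7}: ${cost:>4}M design cost = {ratio:.2f}x a year of consumer dGPU revenue")
```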

During 7nm, and including the development costs, AMD will make money off of Ryzen 2 and the various server chips. The consumer graphics division, on the other hand, is likely to struggle: revenue will probably be similar to the Polaris generation without mining, but with the added R&D costs the graphics division will struggle to make a profit.

Remember, the margin on a chip has to pay not only for the cost of the wafers themselves but also for the design cost. If Intel presents a threat, AMD will find it very difficult to make a profit.

Right now, the discrete consumer graphics market is worth around 1.5 billion dollars quarterly, or 6 billion annually without mining. AMD gets around 300-400 million of that per quarter, considering their roughly 30% market share versus Nvidia's 70% and Nvidia's higher average selling prices. The before-and-after of mining showed that growth expectations in the consumer graphics market have been overly optimistic.

If Intel manages to be successful, takes 25 percent of the market, and pulls roughly 375 million per quarter away from the two incumbents, that is about 187 million each from Nvidia and AMD. Both are going to feel the pinch, but in AMD's case it is much more significant. If this were to happen, Nvidia would be making around 913 to 1013 million and AMD around 113 to 213 million in revenue (not profit). While Nvidia can still make a profit from that, AMD won't be able to (Intel won't either, but they are willing to fight a war of attrition with Nvidia). For the RTG group to survive, Intel has to fail with its consumer chip launch. There is not enough room for three players when chip designs cost 300 million at 7nm and wafers get more expensive each generation.
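The same scenario as a quick sanity check; every figure below is the rough per-quarter number from this post, not market research:

```python
# Quick check of the Intel-entry scenario described above.
# All figures are $M per quarter, taken from the post itself (not market data).
market = 1500                       # ~$1.5B/quarter consumer discrete GPUs
amd_now = (300, 400)                # AMD's current slice (assumed range)
nv_now = (market - amd_now[1], market - amd_now[0])   # Nvidia gets the rest

intel_take = 0.25 * market            # Intel captures 25% of the market -> $375M
per_incumbent = int(intel_take // 2)  # ~$187M from each incumbent, as rounded above

amd_after = (amd_now[0] - per_incumbent, amd_now[1] - per_incumbent)
nv_after = (nv_now[0] - per_incumbent, nv_now[1] - per_incumbent)
print(f"AMD:    {amd_now} -> {amd_after}")    # (300, 400) -> (113, 213)
print(f"Nvidia: {nv_now} -> {nv_after}")      # (1100, 1200) -> (913, 1013)
```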

The China group has always represented a degradation of design and focus. The team has never brought forth a competing product.

AMD screwed themselves by moving all R&D into China. It's a cheap immediate win that has lost them the long war. It does nothing to bring parity between themselves and Nvidia. No one to blame but AMD.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136

Polaris is OK, but many of the gains come from the die shrink. In terms of architecture it isn't that much better than Hawaii and Tonga.

IMO, the best Polaris product isn't a consumer product at all, but the Radeon WX 5100 workstation card. In terms of compute performance, nothing else <75W (i.e. without a PCIe connector) can beat it. I suppose Nvidia might manage to finally surpass it with a Quadro-based TU116 or TU117 card if they release one, but the best they can do currently is the Quadro P2000 and it isn't nearly as strong (though it comes close in gaming due to Nvidia's DX11 perf/TFlop advantage).

You can see AMD's problems begin as soon as they laid off ATi's GCN development team in Canada and replaced them with cheap Chinese laborers who don't understand the architecture. GCN 1.1 (Hawaii), the last Canadian architecture, was a big improvement over 1.0. GCN 1.2 (Tonga/Fiji), which was the Chinese team's first attempt at a GCN redesign, was just a small incremental change, and Tonga was not even that much better than 1.0 Tahiti in terms of die utilization or perf/watt. GCN 1.3 (Polaris) added little more than a better memory controller (which was probably licensed anyway) and primitive discard acceleration. GCN 1.4 (Vega) added a lot of stuff, but it added up to very little in terms of real added performance; Vega basically offers about the performance you'd expect from Fiji at the same clock rates. The current Chinese team doesn't have enough knowledge to optimize GCN for gaming or remove the 4-shader-unit limitation.
 

tajoh111

Senior member
Mar 28, 2005
346
388
136
Polaris is OK, but many of the gains come from the die shrink. In terms of architecture it isn't that much better than Hawaii and Tonga.

IMO, the best Polaris product isn't a consumer product at all, but the Radeon WX 5100 workstation card. In terms of compute performance, nothing else <75W (i.e. without a PCIe connector) can beat it. I suppose Nvidia might manage to finally surpass it with a Quadro-based TU116 or TU117 card if they release one, but the best they can do currently is the Quadro P2000 and it isn't nearly as strong (though it comes close in gaming due to Nvidia's DX11 perf/TFlop advantage).

You can see AMD's problems begin as soon as they laid off ATi's GCN development team in Canada and replaced them with cheap Chinese laborers who don't understand the architecture. GCN 1.1 (Hawaii), the last Canadian architecture, was a big improvement over 1.0. GCN 1.2 (Tonga/Fiji), which was the Chinese team's first attempt at a GCN redesign, was just a small incremental change, and Tonga was not even that much better than 1.0 Tahiti in terms of die utilization or perf/watt. GCN 1.3 (Polaris) added little more than a better memory controller (which was probably licensed anyway) and primitive discard acceleration. GCN 1.4 (Vega) added a lot of stuff, but it added up to very little in terms of real added performance; Vega basically offers about the performance you'd expect from Fiji at the same clock rates. The current Chinese team doesn't have enough knowledge to optimize GCN for gaming or remove the 4-shader-unit limitation.

Indeed, the team from China has produced significantly worse products than the North American team.

The last purely North American design was Hawaii, the 290X, and that card was fire (in a good way). Not only was the gaming performance competitive, it had FP64 and really caught up to Nvidia, and later surpassed Kepler.

Pretty much every launch after this has been a disappointment, generally performing at the bottom end of people's expectations or vastly disappointing them, as was the case with Vega. However, the savings were quite significant for AMD and were really the only way graphics could continue.

Let's look at the healthy North American salaries:

https://www.indeed.com/cmp/Amd/salaries

Average salary is around 130k. This is nothing like the Chinese salaries.

Chinese salaries start around $2,250 USD per month and top out at about $5,500 USD per month, or about 66k a year.

https://www.glassdoor.ca/Salary/AMD...15.0,3_IL.4,12_IM999.htm?countryRedirect=true

The highest engineer salary in China is lower than the lowest salary in North America. (The Glassdoor figures are listed in RMB, so divide by about 6.67 to get USD.)

As you can see, the savings are enormous; because of this disparity in salaries, AMD can afford 3,000 people working on graphics. This isn't without repercussions, and there has been a decrease in product quality since AMD Shanghai started developing graphics.

However, without this change, staff working on the CPU side would have had to go, or graphics would be down to a much smaller team of 600-800 people.
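As a rough check of that headcount trade-off, here is the arithmetic using only the salary figures above; the numbers are illustrative assumptions, and the low end of the Chinese range is roughly where the 600-800 figure comes from:

```python
# Very rough payroll comparison based on the salary figures quoted above.
# Assumed, illustrative numbers only - not AMD's actual staffing costs.
na_avg_salary = 130_000            # ~$130k/yr average in North America (Indeed)
cn_monthly_range = (2_250, 5_500)  # Chinese salaries in USD/month (Glassdoor RMB / ~6.67)
cn_annual_range = tuple(12 * m for m in cn_monthly_range)   # ~$27k - $66k/yr

cn_headcount = 3000                # the Shanghai graphics headcount quoted above
payroll_range = tuple(cn_headcount * s for s in cn_annual_range)

# How many North American engineers the same payroll would cover:
na_equivalent = tuple(round(p / na_avg_salary) for p in payroll_range)
print(f"Chinese annual salary range: ${cn_annual_range[0]:,} - ${cn_annual_range[1]:,}")
print(f"3000-person payroll:         ${payroll_range[0]/1e6:.0f}M - ${payroll_range[1]/1e6:.0f}M")
print(f"Equivalent NA headcount:     {na_equivalent[0]} - {na_equivalent[1]}")
```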

As we can see from how the Ryzen 2 came out, it was the correct decision.

One thing I will say is that at equal price-to-performance, I will buy Nvidia over AMD because of this. While their graphics were developed in North America, I would gladly have given AMD my money over Nvidia because they needed it more. With the change to China, I would rather my money support a well-paying North American job than an underpaying overseas one, especially with AMD generally pricing their products like Nvidia as of late.

North America needs to keep well paying, high skills jobs here.

AMD needs to embrace the value angle for me to consider them, because when you use cheap labor, I expect cheaper prices at equal performance, especially when you combine this with losing out on heat, noise, and efficiency.

One more thing to note is that AMD attacked Nvidia's weakest chip with Polaris, and that is GX106.

GX106 has been one of Nvidia's weakest parts for showing off their architectures because of the awkward compromise between shaders and bandwidth. You will often see parts with half the shaders of the chip above it, or an underperforming 128-bit bus. You will rarely see a healthy amount of both shaders and bandwidth on this part compared to the next step up, GX104.
 
Last edited:
  • Like
Reactions: Muhammed and Elfear

beginner99

Diamond Member
Jun 2, 2009
5,318
1,763
136
As you can see, the savings are enormous; because of this disparity in salaries, AMD can afford 3,000 people working on graphics. This isn't without repercussions, and there has been a decrease in product quality since AMD Shanghai started developing graphics.

However, without this change, staff working on the CPU side would have had to go, or graphics would be down to a much smaller team of 600-800 people.

This is the classic mythical man-month issue. Adding more people will not always make a project go faster, and at some point it actually makes it slower because the communication overhead gets too big. That's why small teams are often so much more efficient. Having more workers doesn't result in more or better products.

It also ignores the fact, especially for R&D, that if you simply don't have at least one person with exceptional skill and ideas, no number of workers can make up for the lack of good design and ideas.
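For what it's worth, the coordination cost Brooks describes grows roughly quadratically with team size; here is a quick illustration using the headcounts mentioned in this thread (the smaller sizes are just baselines):

```python
# Brooks's law in numbers: possible pairwise communication channels grow as
# n * (n - 1) / 2, which is why coordination overhead eats the gains of
# simply adding more people.
def channels(n: int) -> int:
    return n * (n - 1) // 2

for team_size in (10, 100, 800, 3000):
    print(f"{team_size:>5} people -> {channels(team_size):>9,} possible pairwise channels")
```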

EDIT:

And yes, AMD needs to go chiplets on GPUs as well. They had something hinting at that on roadmaps years ago; AFAIK it was the one after Navi.
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
22,931
13,014
136
Now the VII is not making AMD much if anything, but is better than the trash can.

Hey, at least it's selling quite well relative to launch availability. It's only recently that they seem to have met market demand for the product, and they're still only selling one per customer from amd.com.

I'm happy to have a Rad VII.

Ditto. I totally did not expect to be able to buy such a card for "only" $700. Vega FE was $999 for crying out loud.
 
  • Like
Reactions: DarthKyrie

SpaceBeer

Senior member
Apr 2, 2016
307
100
116
It was more about the margins - the Instinct margins are way better than even Epyc. Not to mention they badly needed a competitive DP part if they were serious about competing with nVidia in the datacenter.

High margins don't mean anything if you have low sales. And I'm not sure how popular Radeon Instinct is in that market, where there are more competitors than in the dGPU market (only one).
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Most people thought it would not be successful because of the form factor limiting CPU/GPU power, not necessarily because it was nVidia (although previous Tegra chips were junk). And they were right, to the extent that graphics were significantly hampered by the form factor.

But most Nintendo gamers don't expect top-tier graphics. Gameplay and game quality are far more important.

Should revisit that thread. No one was expecting much from the console. But it wasn't until the NV announcement that the thread essentially died.

It still trails the PS4 in overall global sales month over month. It has had the highest sales in some months in the USA, but it's also the newest console of the bunch. And as far as I know it's not a big profit machine for Nvidia.

The Switch is matching PS4's numbers in some regions and outpacing them in others over the same time span. The Switch has already outpaced the Xbone. Just look at the US market: PS4 at 17.7M to the Switch's 17.4M last year, and the PS4 is routinely cheaper than the Switch, especially during the holidays. Will it outsell the PS4 over its lifetime? /shrug, dunno, but that wasn't what I was saying. Hardware doesn't really factor into console buyers' decisions. When you consider that a good chunk of the population still thinks of consoles as toys, more people are likely to buy a Switch for their kids than an Xbox/PS4, and the sales numbers reflect this. The Wii was essentially a riced-out GameCube, but it ate the PS3 and Xbox 360 for lunch.


I don't see the console contracts being threatened this coming generation. But I wouldn't bet everything on things not changing. I brought up the Switch only because, in this forum, even Nintendo's own statements about focusing on ARM didn't stop the usual suspects from basically PROMISING that the Switch would use an AMD CPU+GPU design. With how heavily MSFT is hemorrhaging on both the initial Xbone and the Xbone X, them aiming for "top tier" beyond the PC's current "mainstream" is something I'd be cautious about. Sony is now essentially ruining their own positive momentum with their stupid localization rules, and Nintendo, oddly enough, now has the most diverse lineup of games. Things are just so crazy on the console side right now. I think some of the Switch's growing success is actually Sony's own doing.

TL;DR: I think AMD's success in the console segment isn't going to be directly affected by NV/Intel in the near future; I see MSFT/Sony themselves creating more issues for AMD. From my own experience, AMD's public perception/mindshare is mostly out of their control. They've just had a bad string of partnerships crapping on the community/products, e.g. Wolfenstein 2: a game that, up until its marketing and release, was set to be received well, only to basically flounder and spoil all that AMD marketing.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Use the physical BIOS switch, change it from OC to Silent.

I wanted to respond to this while I remembered. Not sure the switch did anything. The fans still ramp up to 85-90%. I'm even taking measures to limit things: I set my FPS limit to 75 (my monitor is 100 Hz), I raised the temp limit to 85C, and with the switch set to Silent mode it no longer boosts beyond 1380 MHz, yet this thing is still taking off into the atmosphere. With only one card's worth of experience, I'm guessing it's the cooler, not the chip, though I'm sure the chip factors in. This little *card* [Note: not sure why my post got edited, EDIT #2, HAHAHA seriously? Haha, I guess the more you learn. Googling the word revealed a definition I've never once heard that word used for in my 30+ years of conscious existence haha.] is just insanely loud (again, I'm aware I'm coming from water, but I was not expecting this card to get this loud). With no word from Zotac on my replacement, I might return this card and get something with a beefier cooler or just less heat. I need a backup card anyway, and I'd want to keep one that doesn't surprise me every time it ramps up haha.

I might record a video; perhaps my card is defective (possibly poor contact between the heatsink and chip). Looking at some reviews, they don't list the card as loud. I've had some loud cards by my own definition. I remember my disappointment with the newly reworked cooler on the GTX 1080 Ti Founders Edition; that thing was also loud. But I can understand that kind of noise from that kind of card, and I went into the RX 580 expecting something at least quieter than that. You can hear my Red Devil from another room! My wife constantly asks, "playing a game?" Since we're both on water now, I guess we just aren't used to fan noise anymore haha.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
Wow, I had no idea how bad off RTG is: https://wccftech.com/nvidia-amd-discrete-gpu-market-share-q4-2018-report/

AMD - 18.8% market share in dGPU last quarter! Now I see why some are so skeptical about AMD's ability to invest in new big-die Radeon GPUs going forward. AMD really can't afford to take money out of CPU development, as Intel, despite their problems, isn't standing still. It is hard for me to see how AMD releases any competitive dGPUs after this year's releases.

In light of this, I wonder why AMD didn’t spend a bit more time refining Vega20 - unless they really couldn’t refine the implementation much more and are counting on process improvements going forward.

[please pardon my ignorance, I haven’t been following GPU technology much the past ~5 years; have been spending hours reading the B3D forums the past week].


PS. What happened to the semiaccurate forums?
 

Yotsugi

Golden Member
Oct 16, 2017
1,029
487
106
AMD - 18.8% market share in dGPU last quarter!
They're just selling off the remains of their inventory.
It is hard for me to see how AMD releases any competitive dGPUs after this year's releases.
You're in for a revelation.
In light of this, I wonder why AMD didn’t spend a bit more time refining Vega20 - unless they really couldn’t refine the implementation much more and are counting on process improvements going forward.
It's a N7 pipecleaner, why bother?
Very limited volume part.
PS. What happened to the semiaccurate forums?
Charlie said the DB went kaput.
 

jpiniero

Lifer
Oct 1, 2010
16,823
7,264
136
Well, if the full Navi turns out to be Vega 56-ish performance for $249 and, let's say, 150 W, that would be good.
 

Yotsugi

Golden Member
Oct 16, 2017
1,029
487
106
Chooo chooo! I'm ready for some competition. NV prices are ridiculous! WTB AMD/Intel/Anyone!
You're going to get even higher prices going forward.
Someone's gotta pay for that $1.5B 3nm tapeout, and that someone is you.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
You're going to get even higher prices going forward.
Someone's gotta pay for that $1.5B 3nm tapeout, and that someone is you.

That's fine, prices aren't an issue for me, options are. I'd rather have options, or at least the illusion of options. You got my train ticket, let's see how far this train goes!
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
Well, if the full Navi turns out to be Vega 56-ish performance for $249 and, let's say, 150 W, that would be good.

Nvidia's GTX 1660 Ti has slightly lower performance than Vega 56 (roughly 5-10% less) and adheres to a 120W TDP. And that's on 12nm (which is actually a refined 16nm). If AMD can't get better-than-Vega-56 performance at 150W with a full node advantage over Nvidia, then I don't know what to say...
 

jpiniero

Lifer
Oct 1, 2010
16,823
7,264
136
Nvidia's GTX 1660 Ti has slightly lower performance than Vega 56 (roughly 5-10% less) and adheres to a 120W TDP. And that's on 12nm (which is actually a refined 16nm). If AMD can't get better-than-Vega-56 performance at 150W with a full node advantage over Nvidia, then I don't know what to say...

Thing is, nVidia's obviously not going to release the 1660 Ti's replacement any time soon.
 

Ajay

Lifer
Jan 8, 2001
16,094
8,114
136
You're going to get even higher prices going forward.
Someone's gotta pay for that $1.5B 3nm tapeout, and that someone is you.
From the article, it seems like that was for a complex large-die ASIC (like Volta on 12FFN). The range given was $500M - $1500M. Wouldn't AMD likely be shooting for smaller die sizes?
 

prtskg

Senior member
Oct 26, 2015
261
94
101
Nvidia's GTX 1660 Ti has slightly lower performance than Vega 56 (roughly 5-10% less) and adheres to a 120W TDP. And that's on 12nm (which is actually a refined 16nm). If AMD can't get better-than-Vega-56 performance at 150W with a full node advantage over Nvidia, then I don't know what to say...
Easy. Nvidia's arch is much better compared to GCN for most things.
I do hope Intel and AMD bring some good GPUs. We really need more competition in GPUs.
 
  • Like
Reactions: kawi6rr