Question Speculation: RDNA3 + CDNA2 Architectures Thread



biostud

Lifer
Feb 27, 2003
18,701
5,432
136
After the big drop on the client PC side, I think AMD has too much capacity on all of the nodes.
You are probably right. But it makes you wonder about TSMC's total capacity and over what time horizon those wafers have to be delivered. With both Intel and Nvidia moving orders to TSMC, I doubt TSMC has empty production queues in the foreseeable future.
 
  • Like
Reactions: Leeea

moinmoin

Diamond Member
Jun 1, 2017
5,064
8,032
136
The biggest hurdle is going to be CUDA, for which AMD has no answer.
I disagree. Windows PC gamers seem to love Nvidia no matter what. The same can't be said about supercomputer users, so there is a broad effort to move away from the current Nvidia-controlled ecosystem, with AMD just one company in that effort. So the latter is both a "smaller" hurdle and a higher-margin business.
 
  • Like
Reactions: Tlh97 and Leeea

GodisanAtheist

Diamond Member
Nov 16, 2006
7,166
7,666
136
If I recall correctly, ATI did get to 50% market share with the 3870/4870/5870 cards, but it took them three generations of aggressively priced cards that performed the same as Nvidia's to achieve it. It is a tough battle, but it is not impossible; it will take time and consistently delivering an equivalent product at better prices or a better product at the same prices. The biggest hurdle is going to be CUDA, for which AMD has no answer.

-AMD has toyed with 50% market share, but the market settled into a 60/40 split for a long time after the 5000 series and has collapsed all the way back to an 80/20 split as of 2022.

AMD ain't hunting market share, it's too brutal to gain and way too easy to lose.

Here is a nice 18-year market-share recap video, short and easy to watch:

 

biostud

Lifer
Feb 27, 2003
18,701
5,432
136
I disagree. Windows PC gamers seem to love Nvidia no matter what. The same can't be said about supercomputer users, so there is a broad effort to move away from the current Nvidia-controlled ecosystem, with AMD just one company in that effort. So the latter is both a "smaller" hurdle and a higher-margin business.
I've had both AMD/ATI and Nvidia cards, but for the last couple of generations the price/performance has been equal or close, and with Nvidia you got RTX and DLSS. Sure, you got some extra memory on AMD, but so far it hasn't been that important. So for most people Nvidia was the better product.
Now that an RX 6900 is cheaper than a 3080 12GB and FSR 2.0 exists, I think it is the better deal, unless you have a specific wish for ray tracing. But it is too little, too late. Hopefully they will fare better with RX7XXX.
 
  • Like
Reactions: Leeea

GodisanAtheist

Diamond Member
Nov 16, 2006
7,166
7,666
136
AMD has put a lot of shoulder into improving their DX11/OGL performance and reaching feature parity with NV, and that cannot be overstated. You can see the effect additional margins and income have had on their efforts: they can put more money toward the "nice to haves", which in turn makes them more competitive with NV, which in turn gets them more market share.

Selling cheap-to-manufacture cards at firesale prices that can only game and do nothing else doesn't get you real, lasting market share; making money to make good products does, and that's why margins matter so much.
 

Aapje

Golden Member
Mar 21, 2022
1,515
2,065
106
This best-seller list seems to be incorrect. Why would the 4090 at $2999 be #2 on the top-seller list?
This just reflects short-term sales, which makes the list rather pointless. Of course a new GPU released just now is going to dominate recent sales. But the 4090 is not going to be the 2nd most-sold GPU.
 
  • Like
Reactions: Leeea

Vattila

Senior member
Oct 22, 2004
809
1,412
136
The biggest hurdle is going to be CUDA, for which AMD has no answer.

AMD has a clear alternative to CUDA: HIP — a "CUDA dialect" serving as a programming model for their open-source ROCm framework, while supporting CUDA as an alternative backend. Intel and the wider industry have an alternative to CUDA as well: Khronos SYCL — a programming model based on ISO C++, aiming to be subsumed by the official ISO C++ standard at some point.

SYCL is conceptually pretty close to CUDA/HIP, except for accelerator code being written in pure C++ (no proprietary language extensions). A SYCL implementation already exists using HIP as a backend: hipSYCL. Also, Intel's C++ dialect — Data Parallel C++ (DPC++) — is based on SYCL and serves as a programming model for their open-source oneAPI framework. US national laboratories have contracted Codeplay (now a subsidiary of Intel) to add support for AMD and Nvidia hardware in DPC++, thereby allowing SYCL/DPC++ to be used to write portable code across all these platforms.

SYCL is seeing rapid adoption in the supercomputing space, with AMD having delivered the first exascale computer (Frontier) this year and being on track to deliver an even bigger one next year (El Capitan). These machines have a heavy emphasis on GPGPU and AI compute and will be programmed with no CUDA in sight (SYCL, HIP, OpenMP, etc.). AMD also won the biggest supercomputer in Europe (LUMI in Finland), which is based on the same hardware technologies and programming models. Intel will bring another exascale supercomputer online next year as well (Aurora), which will also use SYCL (DPC++) as one of the primary programming models.

More about SYCL, HIP and CUDA here:


PS. I see a lot of discussion about the CUDA software moat on forums, without any mention of HIP, SYCL and DPC++ (oneAPI) whatsoever, which tells me there is a lot of ignorance when it comes to the industry's preferred direction (open standards) and the CUDA alternatives already gaining ground (SYCL and HIP). That said, AMD's lack of support for ROCm on Windows is a sore point, especially due to missing (or poor) acceleration support in some application areas (AI and content creation, in particular).
 
Last edited:

insertcarehere

Senior member
Jan 17, 2013
639
607
136
It's not that simple. If AMD prices RDNA3 very aggressively against Lovelace, and they have volume, and in 6 months they gain considerable mindshare and market share for the Radeon brand/products, I think their investors would be very happy. Seeing something you invested in start being very competitive in a segment where they previously were meh (consumer/client GPU), and technically increasing revenue potential, is the next best thing long-term investors would like to see, behind immediate ROI of course.

AMD GPUs, by and large, need to compete with AMD CPUs for TSMC 5nm wafers.

Navi 31 and 32 each take up 200+ mm^2 of N5 die space and are 5-7-chiplet package designs. From a packaging perspective this puts them closer to Zen 4 EPYC than anything else AMD sells. From that perspective it is very difficult for AMD and its investors to justify selling RDNA 3 for cheap when the opportunity cost here is fewer high-margin Zen 4 workstations.
 

Vattila

Senior member
Oct 22, 2004
809
1,412
136
AMD GPUs, by and large, need to compete with AMD CPUs for TSMC 5nm wafers. [...] From that perspective it is very difficult for AMD and its investors to justify selling RDNA 3 for cheap when the [opportunity] cost here is fewer high-margin Zen 4 workstations.

Good point, although AMD SVP Forrest Norrod has recently stated that package substrate capacity has been (and still is) the limiting factor for accelerating EPYC sales. Wafer supply is not a limiting factor, he claimed.

"The principal gate for us is not wafers. Particularly for these Epyc chips, it’s advanced substrates. And there’s just a long lead time to build up the factories and increase capacity for those substrates. We have made major investments and I think we are ramping that capacity at a very steep but prudent rate."

The Steady Hand Guiding AMD’s “Prudently Expanding” Datacenter Business (nextplatform.com)
 
Last edited:

Timorous

Golden Member
Oct 27, 2008
1,748
3,240
136
AMD GPUs, by and large, need to compete with AMD CPUs for TSMC 5nm wafers.

Navi 31 and 32 each take up 200+ mm^2 of N5 die space and are 5-7-chiplet package designs. From a packaging perspective this puts them closer to Zen 4 EPYC than anything else AMD sells. From that perspective it is very difficult for AMD and its investors to justify selling RDNA 3 for cheap when the opportunity cost here is fewer high-margin Zen 4 workstations.

As Vattila points out above, the EPYC bottleneck is not wafers but substrates.

In addition, that kind of optimisation works fine when you can sell everything to the high-margin crowd, but when you have Zen 4 sitting on shelves, are maxed out on substrate capacity for EPYC, and still have wafers to spare, you have pricing options.

It does not need to be an either/or scenario like it was with RDNA2 and Zen 3, since AMD now has the capacity to supply both.
 

Aapje

Golden Member
Mar 21, 2022
1,515
2,065
106
From that perspective it is very difficult for AMD and its investors to justify selling RDNA 3 for cheap when the opportunity cost here is fewer high-margin Zen 4 workstations.

This assumes that the demand for Zen 4 workstations doesn't crash...

I wouldn't be surprised if we see the same scenario as with the 6000 series, where prices started high, but went down quite a lot.
 
  • Like
Reactions: Leeea

TESKATLIPOKA

Platinum Member
May 1, 2020
2,522
3,037
136
I would also like AMD to price this gen considerably cheaper than Nvidia. However IMO most of the reasoning here seems a bit wishful.

I wouldn't mind being wrong though.
Hard to tell. Nvidia set the prices very high, so there is certainly space for considerably lower prices, but the question is if AMD is willing to do that. If the raster and RT performance are competitive, then I don't expect they will be much cheaper. If RT is considerably behind, then they will lower the price.
 
  • Like
Reactions: Tlh97 and Kaluan

insertcarehere

Senior member
Jan 17, 2013
639
607
136
This assumes that the demand for Zen 4 workstations doesn't crash...

I wouldn't be surprised if we see the same scenario as with the 6000 series, where prices started high, but went down quite a lot.

That's out of scope for discussion here, but it should be reasonable to expect the primarily consumer-driven demand for gaming GPUs to be more volatile and more affected by macroeconomics than the primarily corporate-driven demand for workstations and servers.

As Vattila points out above, the EPYC bottleneck is not wafers but substrates.

In addition, that kind of optimisation works fine when you can sell everything to the high-margin crowd, but when you have Zen 4 sitting on shelves, are maxed out on substrate capacity for EPYC, and still have wafers to spare, you have pricing options.

It does not need to be an either/or scenario like it was with RDNA2 and Zen 3, since AMD now has the capacity to supply both.

Is there any evidence that RDNA 3 wouldn't take substrate capacity? If recent earnings calls are an indication, AMD has been trying to address capacity issues on both the CPU and GPU sides.

AMD themselves have been on the record about pivoting more to the workstation/server crowd vs desktop/client simply due to the margins in the former. This likely applies doubly so for GPUs, given how many more packaging steps and middlemen are involved for GPUs vs CPUs.

Just as an illustrative example, compare what's needed for a 7950X to say, a Navi 32 graphics card.

7950X needs:
-2x 5nm 70 mm^2 chiplets + 6nm I/O die
-substrate
-CPU socket pins

Navi 32 graphics card needs:
-1x 5nm ~200 mm^2 chiplet + 4+ 6nm MCDs
-substrates
-16+ GB of fast GDDR6 memory
-VRMs and capacitors
-A beefy cooler
-PCIe connector, power connectors, DisplayPort/HDMI ports, etc.

It's plain to see that the BOM for the former is significantly less than for the latter. So an AMD that can move the former at ~$600 will be very unwilling to sell the latter at remotely similar pricing unless it can be complemented by higher-margin workstation GPU products à la Nvidia, and AMD just doesn't have the penetration on that front.
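To make the opportunity-cost argument concrete, here is a toy calculation. Only the N5 die areas come from the post above; the prices and BOM costs are made-up placeholders, not AMD figures:

```python
# Toy comparison of gross margin earned per mm^2 of N5 silicon.
# Die areas are from the post above; prices and BOM costs are
# hypothetical placeholders chosen purely for illustration.
n5_area_mm2 = {
    "7950X": 2 * 70,   # two ~70 mm^2 5nm CCDs (6nm I/O die excluded)
    "Navi 32": 200,    # one ~200 mm^2 5nm chiplet (6nm MCDs excluded)
}
assumed_price = {"7950X": 600, "Navi 32": 600}  # ASP in USD, hypothetical
assumed_bom = {"7950X": 150, "Navi 32": 350}    # BOM in USD, hypothetical

for product in n5_area_mm2:
    margin = assumed_price[product] - assumed_bom[product]
    per_mm2 = margin / n5_area_mm2[product]
    print(f"{product}: ~${per_mm2:.2f} gross margin per mm^2 of N5")
```

Under these made-up numbers the CPU returns roughly 2.5x the margin per mm^2 of scarce N5 silicon; the exact ratio depends entirely on the assumed prices, but that is the shape of the argument.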
 
  • Like
Reactions: Tlh97 and Gideon

Timorous

Golden Member
Oct 27, 2008
1,748
3,240
136
That's out of scope for discussion here, but it should be reasonable to expect the primarily consumer-driven demand for gaming GPUs to be more volatile and more affected by macroeconomics than the primarily corporate-driven demand for workstations and servers.



Is there any evidence that RDNA 3 wouldn't take substrate capacity? If recent earnings calls are an indication, AMD has been trying to address capacity issues on both the CPU and GPU sides.

AMD themselves have been on the record about pivoting more to the workstation/server crowd vs desktop/client simply due to the margins in the former. This likely applies doubly so for GPUs, given how many more packaging steps and middlemen are involved for GPUs vs CPUs.

Just as an illustrative example, compare what's needed for a 7950X to say, a Navi 32 graphics card.

7950X needs:
-2x 5nm 70 mm^2 chiplets + 6nm I/O die
-substrate
-CPU socket pins

Navi 32 graphics card needs:
-1x 5nm ~200 mm^2 chiplet + 4+ 6nm MCDs
-substrates
-16+ GB of fast GDDR6 memory
-VRMs and capacitors
-A beefy cooler
-PCIe connector, power connectors, DisplayPort/HDMI ports, etc.

It's plain to see that the BOM for the former is significantly less than for the latter. So an AMD that can move the former at ~$600 will be very unwilling to sell the latter at remotely similar pricing unless it can be complemented by higher-margin workstation GPU products à la Nvidia, and AMD just doesn't have the penetration on that front.

That is the point though. In a situation where they sell every 7950X they make and don't have any more capacity, you are correct. However, when they can't sell every one they make and have additional capacity, selling other products, even at lower margins, is more profitable than not selling anything.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,522
3,037
136
That is the point though. In a situation where they sell every 7950X they make and don't have any more capacity, you are correct. However, when they can't sell every one they make and have additional capacity, selling other products, even at lower margins, is more profitable than not selling anything.
If Zen 4 sales are, or end up, under expectations, then they can lower prices or shift some wafer allocation to server models or GPUs.
The question is whether RDNA3 will be so good, or Nvidia so overpriced, that the wafers already allocated to RDNA3 won't be enough to meet demand.
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
15,223
5,768
136
If Zen 4 sales are, or end up, under expectations, then they can lower prices or shift some wafer allocation to server models or GPUs.
The question is whether RDNA3 will be so good, or Nvidia so overpriced, that the wafers already allocated to RDNA3 won't be enough to meet demand.

Don't forget about Sienna... and they could also make Threadripper (Pros) that work with that socket too
 

eek2121

Diamond Member
Aug 2, 2005
3,100
4,398
136
I am pretty skeptical about N21 gaining 40-50% more performance by simply moving to N5. Even with unrealistic linear scaling, it would already mean 3234-3465 MHz, and you would also need 40-50% higher-clocked memory to feed it (24 Gbps from Samsung).
Let's say it could be clocked that high; TBP wouldn't stay the same either, but would be a lot higher because of those clocks and the needed voltage.
AMD is using a custom spin of N5. They also have the option of using faster memory. Look at the 7950X: 33% higher base clock, 14% higher single-core boost, and >24% higher (typical) multicore boost.

They are doing more regardless.
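As a quick sanity check, the 3234-3465 MHz figures in the quote follow from straight linear scaling off a ~2310 MHz reference clock (an assumption inferred from the quoted numbers, roughly the RX 6900 XT boost spec):

```python
# Linear-scaling sanity check for the quoted 3234-3465 MHz figures.
# Reference clock is an assumption: ~2310 MHz (approx. RX 6900 XT boost).
ref_clock_mhz = 2310

for uplift in (0.40, 0.50):
    scaled = ref_clock_mhz * (1 + uplift)
    print(f"+{uplift:.0%} -> {scaled:.0f} MHz")
```

This reproduces the quoted range exactly, which is the point being made: linear clock scaling of that magnitude from a node shrink alone is unrealistic.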

Ancient history. That AMD has very little in common with Lisa Su's AMD. She maximizes margin at every opportunity.

She maximizes margin through market leadership.

AMD isn’t using chiplets for a pure margin play. If the card isn’t competitive it won’t sell. If it doesn’t sell, there will be zero margin. Also remember all these designs have R&D and marketing overhead. Selling only a few thousand cards will mean AMD has lost money.

I have absolute faith in AMD. I believe the 6900 XT was evolutionary in the same way Zen 2 was. Good enough, but not the best. I suspect next-gen will be a knockout. Hopefully, it will be AMD's "Zen 3" moment, but for GPUs.
 

linkgoron

Platinum Member
Mar 9, 2005
2,409
979
136
AMD is in a unique position to cripple nVidia's 30XX and 40XX sales: all they have to do is price their next-generation cards "much more consumer friendly", and nVidia will lose millions.

While such a move would mean lower margins for AMD too, it would also have the effect of increasing their sales across the board.

In the current global economic situation, the company that sells cheaper is likely to have the bigger sales, but it's a double-edged sword since AMD would lose A LOT OF MONEY too, though nowhere near nVidia's losses because of their market share.

It would be a gamble, for sure ... but if it works ... AMD would see their market share improve tremendously while forcing nVidia to lose millions at the same time, and not only from the market share loss.

Heads AND jackets might roll ...
It's a double-edged sword, as in the next generation they might want to increase prices and improve their margins, and then they receive backlash for not being the ultra-value brand. This happened with Ryzen 5000 prices, I believe.

Just look at the current situation: the 6000 series has way better price/perf, as well as better or competitive perf/watt, and they still probably sold a fraction of what Nvidia did.
 

Aapje

Golden Member
Mar 21, 2022
1,515
2,065
106
While such a move would mean lower margins for AMD too, it would also have the effect of increasing their sales across the board.

In the current global economic situation, the company that sells cheaper is likely to have the bigger sales, but it's a double-edged sword since AMD would lose A LOT OF MONEY too, though nowhere near nVidia's losses because of their market share.

AMD seems to have significantly lower BOM costs, so AMD is in a perfect position for a price war: able to lower prices to a point where profits are still decent, while the competition cannot sell at that price point without making a loss or no profit.

And AMD does have a significantly worse reputation, so they will need to provide very good value if they are to win market share.
 
  • Like
Reactions: Tlh97 and Leeea

Leeea

Diamond Member
Apr 3, 2020
3,795
5,549
136
They WOULD make money: just NOWHERE NEAR AS MUCH as they would in normal circumstances, hence the "lose A LOT of money" I had stated. Add to that "much more consumer friendly" prices for their new generation ... I'm thinking something along the lines of 25% off all cards ON TOP of the "planned price drops" due to the new generation of cards launching, and about ... say ... 15% less for the new generation of cards.

Because of the current economic situation worldwide, MANY will start looking for the "more budget friendly for X performance" option when purchasing a new card. This would FORCE nVidia to also drop their prices in order to compete but, due to the monolithic nature of their chips, odds are that nVidia wouldn't be able to lower the price of their cards AS MUCH as AMD, meaning they would be A LOT MORE expensive, so they would likely not sell as well.

ASSUMING AMD can meet demand, this would result in nVidia losing A LOT of market share, thus making them lose millions, and making them lose EVEN MORE MONEY due to price reductions. At the same time, and despite selling for much lower prices, AMD would see a HUGE INCREASE in market share, improve their GPU-manufacturer image CONSIDERABLY, but earn "little money" doing so, WHEN COMPARED TO "normal circumstances".
That is not going to happen.

What we saw with Ryzen was as soon as AMD was able to, they charged Intel prices. The GPUs will be no different.

AMD will figure out the point on the supply/demand chart where they are likely to make the most money now, and shoot for that.
 

Kaluan

Senior member
Jan 4, 2022
504
1,074
106
It's a double-edged sword, as in the next generation they might want to increase prices and improve their margins, and then they receive backlash for not being the ultra-value brand. This happened with Ryzen 5000 prices, I believe.

Just look at the current situation: the 6000 series has way better price/perf, as well as better or competitive perf/watt, and they still probably sold a fraction of what Nvidia did.
They didn't have an absolute lead over nVidia technologically, did they? They were competitive bottom to top, for the first time in many years, but they didn't outright beat nVidia in anything.

It's all about brand build-up IMHO. RDNA4 may very well be AMD's greed moment, if RDNA3 succeeds in denting nVidia/gaining mindshare for the Radeon brand, that is. Right now I don't think they can afford to "maximize margins". I think some still think that we're in the same world as 2-3 years ago.

We all know Intel chose to shrink their margins with 13th gen, but in the current economic climate, that may pay off. AMD may be smart to do so as well, they already derped a bit with the Ryzen 7000 launch prices. Maybe they won't make the same mistake twice.
AMD is in business to make money, not to make some other company lose money or fulfill the fanciful whims of forum posters.

We already saw what they would do with their prices when they have the best product when Zen 3 had Intel pretty squarely beat.
It is, if nVidia's losses mean less investment in the next gen or less trust in the brand. But this is obviously a gambit.

RDNA2 isn't Radeon's Zen3 moment, now is it? But RDNA3 could be. IDK
 

maddie

Diamond Member
Jul 18, 2010
4,881
4,951
136
It's WCCFTech. If true, expect worsening conditions for all the majors. Should know more this week.

"Based on an internal AMD report, we have managed to learn that the company is planning to lower its Ryzen 7000 "Zen 4" CPU production plan."
 
  • Like
Reactions: Leeea and Glo.