Speculation: RDNA2 + CDNA Architectures thread


uzzi38

Platinum Member
Oct 16, 2019
All die sizes are within 5mm^2. The poster here has been right on some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have backed up. Even so, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

Kenmitch

Diamond Member
Oct 10, 1999
So I would say there was/is not enough capacity to make Navi 2x. I think we'll see a paper launch on the 28th

Nothing more than demand exceeding the sales predictions on certain SKUs. When this happens you can't just whip up more; I believe it takes a few months from start to finish.

AMD estimates sales and produces what they think they'll need. Best to wait and see what happens. I believe AMD said Big Navi comes before the consoles. I'd guess availability should be close to the announcement, as there really isn't much time between it and the PS5's release date. Availability? Most likely in stock longer than the 3080s. The good old supply-and-demand arguments will be fought at that time.
 

blckgrffn

Diamond Member
May 1, 2003
www.teamjuchems.com
This supply/demand issue is what has me sketchy on “dumping” my 5700 XT. I don’t want to feel like I am under pressure to find something at MSRP+ just because my current rig feels like it is running with the anchors down on a really old limp-along GPU.

My PS5 acquisition dreams are rapidly pivoting to when they release some sort of bundle in 2021 (Ratchet & Clank?) rather than worrying about it now. My next-gen GPU shopping might be the same. What is the opportunity cost of keeping my card past launch? 🤔

The good news is I didn’t pay $1200 for a 2080 Ti, so I can maybe have the same percentage loss but the dollar amount will be so much less 😂
 

eek2121

Diamond Member
Aug 2, 2005
From Slimbook

So I would say there was/is not enough capacity to make Navi 2x. I think we'll see a paper launch on the 28th

Renoir was unexpectedly well received. In addition, AMD was in the middle of a nearly year-long manufacturing run of console SoCs, and there was strong demand for both Ryzen and EPYC. The first (and likely the biggest) run of consoles is wrapping up right about now; that is why AMD timed the announcements the way they did. You will note that despite incredible discipline by AMD, some of the lower-margin PC parts still sold out.

The Ryzen 5000 series and the Radeon 6000 series will likely have a bit of an issue at initial launch (pent-up demand, as always), but I strongly suspect availability will be good as we move forward. AMD hasn’t been in the habit of paper launches.
 

eek2121

Diamond Member
Aug 2, 2005
This supply/demand issue is what has me sketchy on “dumping” my 5700 XT. I don’t want to feel like I am under pressure to find something at MSRP+ just because my current rig feels like it is running with the anchors down on a really old limp-along GPU.

My PS5 acquisition dreams are rapidly pivoting to when they release some sort of bundle in 2021 (Ratchet & Clank?) rather than worrying about it now. My next-gen GPU shopping might be the same. What is the opportunity cost of keeping my card past launch? 🤔

The good news is I didn’t pay $1200 for a 2080 Ti, so I can maybe have the same percentage loss but the dollar amount will be so much less 😂

Do you really consider it a loss? I paid $800 for my 1080 Ti and I hope to retire it this year, but it won’t be a loss at all for me. I got my money’s worth out of it, and I can’t even sell it (it is going in another PC).
 

TESKATLIPOKA

Platinum Member
May 1, 2020
It is really easy actually. To compete with the 2080 Ti, AMD would have needed to produce a 500mm^2 RDNA GPU. This would not have had ray tracing or DLSS but would have competed at native resolution. The most they could sell such a card for would be about $1000, and they would have to sell the chips to board partners for far less than that so the partners could make a profit after adding in the PCB and memory.
You don't need a 500mm^2 RDNA GPU to compete with the 2080 Ti.
A 60 CU, 3840 SP, 96 ROP, 384-bit bus RDNA GPU would be only ~361mm^2 at the same density and shouldn't be too far behind the 2080 Ti.
With 80 CUs instead of 60 it's ~418mm^2, and this config should be a bit faster than the 2080 Ti even with 15% lower clocks (1.6GHz) than the RX 5700 XT, unless CU scaling is horrible at higher counts.
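For anyone who wants to sanity-check those figures, here's a rough linear area model in Python. The coefficients below are my own assumptions, fitted so the model lands on Navi10's ~251mm^2 plus the two estimates above; treat it as a back-of-envelope sketch, not real floorplan data.

```python
# Back-of-envelope RDNA1 die-area model. All three coefficients are
# assumed values fitted to Navi10 (~251mm^2, 40 CUs, 256-bit) and the
# estimates above -- not official floorplan numbers.
BASE_MM2 = 31.0          # assumed fixed area: display, media, PCIe, misc I/O
PER_CU_MM2 = 2.85        # assumed area per CU (shader array share included)
PER_64B_CHAN_MM2 = 26.5  # assumed area per 64-bit GDDR6 channel + ROP/L2 share

def die_area_mm2(cus: int, bus_width_bits: int) -> float:
    """Estimate die area in mm^2 from CU count and memory bus width."""
    channels = bus_width_bits // 64
    return BASE_MM2 + cus * PER_CU_MM2 + channels * PER_64B_CHAN_MM2

for name, cus, bus in [("Navi10 (RX 5700 XT)", 40, 256),
                       ("hypothetical 60 CU", 60, 384),
                       ("hypothetical 80 CU", 80, 384)]:
    print(f"{name}: ~{die_area_mm2(cus, bus):.0f} mm^2")
# prints ~251, ~361 and ~418 mm^2, matching the estimates above
```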
 

TESKATLIPOKA

Platinum Member
May 1, 2020
Quite simply, RDNA1 was limited by thermals. 1.8GHz at 220W means that to have a card that could beat the 2080 or 2080 Super they had to go over 300W, and even then it would have traded blows with the 2080 Ti, which offered better features at lower power consumption. So they would have sold none, and lost money developing a big chip that they could not even have used for the professional market, as they had already decided to use CDNA for that purpose. So a "Big Navi" on RDNA1 would have been a big mistake. Now they are claiming +50% perf/W, the competition went for power-hungry cards, and they added ray tracing, so they can build a chip that in theory can compete.
They wouldn't need 300W TBP to combat the 2080 Ti; 250W TBP is more than possible. They just need to set the clockspeed lower and power consumption goes down quite a bit.
Example:
RX 5700 XT -> 1887MHz and 219W on average
RX 5700 -> 1672MHz and 166W on average
11% lower clockspeed and 4 fewer CUs gain you 24% lower power consumption, which is not bad considering the memory consumes the same.
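To put some math behind that: dynamic power scales roughly with f·V^2, and voltage falls almost linearly with frequency near the top of the DVFS curve, so a modest clock cut pays off roughly as the cube. A minimal sketch, assuming V ∝ f (my simplification) and ignoring static leakage and memory power:

```python
# Naive DVFS scaling sketch: P_dyn ~ f * V^2, with the assumed
# simplification that V scales ~linearly with f near the top of the
# voltage/frequency curve, giving P_dyn ~ f^3.
XT_CLOCK_MHZ, XT_POWER_W = 1887, 219  # RX 5700 XT averages quoted above

def scaled_power_w(target_mhz: float) -> float:
    """Estimate average board power at a lower core clock (ignores
    static leakage and memory power, which don't scale with clock)."""
    return XT_POWER_W * (target_mhz / XT_CLOCK_MHZ) ** 3

print(f"~{scaled_power_w(1672):.0f} W")
# prints ~152 W at RX 5700 clocks, vs the measured 166 W -- the gap is
# mostly memory and static power, which don't drop with core clock
# (and the 5700 even has 4 fewer CUs working in its favour).
```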
 

TESKATLIPOKA

Platinum Member
May 1, 2020
The answer to ALL of the above is very simple.

AMD is on a yearly cadence. Designing a big GPU on an architecture that is going to be retired just 12 months later is uneconomical from a design-cost perspective. If you are going to retire your architecture just 12 months later, it's better to milk the cheapest possible designs as much as possible.

Which means: the smallest possible designs.

I hope this ends this ridiculous discussion.
RDNA 1 release -> July 7, 2019
RDNA 2 release -> October 28, 2020
This is actually 15 months, and we don't even know if RDNA2 will replace everything.
I rather think it's because they wanted more wafers for Zen.
 

uzzi38

Platinum Member
Oct 16, 2019
He did clearly state that was a speculation piece on his part and not tied to any of his sources. It’s a little bit like us debating the patent at the top of this page. In this particular case, the post above clearly says where he got his pricing sources.
He still thinks it's true, claiming that certain "partners" are testing it for a future product.
 

leoneazzurro

Golden Member
Jul 26, 2016
They wouldn't need 300W TBP to combat the 2080 Ti; 250W TBP is more than possible. They just need to set the clockspeed lower and power consumption goes down quite a bit.
Example:
RX 5700 XT -> 1887MHz and 219W on average
RX 5700 -> 1672MHz and 166W on average
11% lower clockspeed and 4 fewer CUs gain you 24% lower power consumption, which is not bad considering the memory consumes the same.

True to an extent, but you'd still probably need more than 250W (which is the 2080 Ti's TBP), and you are still missing the added features (ray tracing, DLSS). An enthusiast would have chosen the better-featured card anyway (not to mention that Nvidia could have pushed clocks and thermals a bit further).
 

TESKATLIPOKA

Platinum Member
May 1, 2020
TBP can be within 250W; the clockspeed and voltage are the key.
If no game supports those features then they're pointless, and the 2080 Ti was very expensive, so I don't see why AMD wouldn't have been able to compete even without them. More games support DLSS and RT now, but Ampere is also way better at both of these features than Turing.
 

Hans Gruber

Platinum Member
Dec 23, 2006
They wouldn't need 300W TBP to combat the 2080 Ti; 250W TBP is more than possible. They just need to set the clockspeed lower and power consumption goes down quite a bit.
Example:
RX 5700 XT -> 1887MHz and 219W on average
RX 5700 -> 1672MHz and 166W on average
11% lower clockspeed and 4 fewer CUs gain you 24% lower power consumption, which is not bad considering the memory consumes the same.
When AMD released the 5700 XT, it was a test card for the 7nm die shrink. The end result was "wow, look what we did without even trying." Big Navi is supposed to be the real deal and a major attempt to dethrone Nvidia. We still have yet to see the 3090, and I am convinced they could push beyond 400W. So those performance numbers from Nvidia come with a massive power bill that has not been seen even from a Vega 64. Even the pro reviews say "wow, nice performance" but cringe at the power consumption levels of the 3080. Others have questioned why the performance does not scale better given the 8nm process and all the additional cores and better memory.

Meanwhile, AMD looks stupid for not having Zen 3 out yet, considering Intel processors crush Zen 2 processors in gaming by close to 20% in some cases. I know I sound negative, but all of this frustrates me. I can't believe 1080 Ti owners think their cards are outdated. What is AMD to say about Big Navi? Why is it late? Why is Zen 3 not out yet? They say AMD wants to tie the Zen 3 and Big Navi releases in with the new consoles. PC gamers are not console gamers and do not care about new consoles.

It looks as if AMD will easily win the power efficiency award even if Big Navi doesn't outperform Nvidia.
 

leoneazzurro

Golden Member
Jul 26, 2016
Ampere being better does not mean Turing is crap at that, especially the highest-end card. In any case, if I had to buy an enthusiast card, I would like it to last as long as possible, with good aging. And in that case an RDNA1 high-end card would have been less competitive than in the mainstream segment, where upgrading happens more often.
 

Kuiva maa

Member
May 1, 2014
The reason why AMD didn't create a bigger RDNA1 chip is quite obvious if we look at their post-Bulldozer practices. The last time Radeon came up with a full GPU range for a new architecture was back in 2012 and 1st-gen GCN. They had entry level (Cape Verde, 7750/7770), mainstream (Pitcairn, 7850/7870), and high end (Tahiti, 7950/7970), corresponding to the full array of Nvidia Kepler chips. They also gave us the 7790, and even GK110 got competition, with some delay, in the form of Hawaii. That has never happened since. From that point onwards, AMD has only been bringing one or two GPUs per generation, tops. Against Maxwell they just rebranded their existing range (e.g. Grenada etc.), dished out a card from Apple's Tonga rejects (R9 285), and eventually released Fiji (their last enthusiast-level chip). Against Pascal they only offered two Polaris GPUs for the low-end and mainstream parts of the market, and near the end of that generation they also brought Vega 10 at the high end, which, in a shrunk-to-7nm version, served as a bridge against Turing. Then, 10+ months after Turing arrived, they brought two RDNA1 chips.
So the reason we saw no big RDNA1 chip was simply that until a couple of years ago AMD was cash-starved and couldn't afford the resources to design one. They would gladly have made one if they could afford it, but they have been alternating between segments throughout the 2010s while Nvidia was happily bringing several GPUs to cover the whole market almost every year.
With their resurgence on the CPU front and their economic situation being rather good, I would assume that from now on we will be seeing a full range of products with every new generation once again.
 

Glo.

Diamond Member
Apr 25, 2015
RDNA 1 release -> July 7, 2019
RDNA 2 release -> October 28, 2020
This is actually 15 months, and we don't even know if RDNA2 will replace everything.
I rather think it's because they wanted more wafers for Zen.
It's a top-to-bottom release with 4 dies in the RDNA2 lineup.
 

moinmoin

Diamond Member
Jun 1, 2017
Nothing more than demand exceeding the sales predictions on certain SKUs. When this happens you can't just whip up more; I believe it takes a few months from start to finish.

AMD estimates sales and produces what they think they'll need. Best to wait and see what happens. I believe AMD said Big Navi comes before the consoles. I'd guess availability should be close to the announcement, as there really isn't much time between it and the PS5's release date. Availability? Most likely in stock longer than the 3080s. The good old supply-and-demand arguments will be fought at that time.
If AMD botched the orders, the lead time to correct that at TSMC is reportedly up to half a year.

This may have gotten better by now, but AMD's estimates have to be spot on (historically they tend to err on the low side out of financial necessity) and the capacity at TSMC has to be there. In the foundry business there is no turnaround fast enough to react directly to supply shortages as they happen.

My expectation is that with the console chip orders AMD reserved plenty of capacity at TSMC, which it'll opt to retain for its own chips once the console chip orders are fulfilled.
 

blckgrffn

Diamond Member
May 1, 2003
www.teamjuchems.com
AMD themselves stated they didn't expect the take rate to be so high given the $500 MSRP before tax.

Haha, which is sort of hilarious.

Because that means that type of SKU (in terms of competitive performance and pricing) didn't exist for them before, so their forecasting models were terrible.

"Wow, there is this much potential at this price point? Intel you've had this spot for the last decade?!? We can actually make money selling CPUs!"
 

moinmoin

Diamond Member
Jun 1, 2017
Haha, which is sort of hilarious.

Because that means that type of SKU (in terms of competitive performance and pricing) didn't exist for them before, so their forecasting models were terrible.

"Wow, there is this much potential at this price point? Intel, you've had this spot for the last decade?!? We can actually make money selling CPUs!"
More like a sad reflection of AMD's business the decade before. It really had no worthwhile mass market consumer CPU product at that price point for that long (or even longer?).
 

Timorous

Golden Member
Oct 27, 2008
If AMD botched the orders, the lead time to correct that at TSMC is reportedly up to half a year.

This may have gotten better by now, but AMD's estimates have to be spot on (historically they tend to err on the low side out of financial necessity) and the capacity at TSMC has to be there. In the foundry business there is no turnaround fast enough to react directly to supply shortages as they happen.

My expectation is that with the console chip orders AMD reserved plenty of capacity at TSMC, which it'll opt to retain for its own chips once the console chip orders are fulfilled.

Isn't that the lead time to get more wafers, rather than the lead time to change the product mix on wafers you have ordered but that have not been manufactured yet?
 

blckgrffn

Diamond Member
May 1, 2003
www.teamjuchems.com
More like a sad reflection of AMD's business the decade before. It really had no worthwhile mass market consumer CPU product at that price point for that long (or even longer?).

Ha, right?

I can laugh, you can cry, but that's exactly it.

Even the 8xxx Bulldozer chips were relegated to "super value, sub-i5" pricing basically from the jump.

Man, I bet there were a lot of very unfun, tense meetings at AMD over the last decade that I am very glad I was not a part of.
 

moinmoin

Diamond Member
Jun 1, 2017
Isn't that the lead time to get more wafers, rather than the lead time to change the product mix on wafers you have ordered but that have not been manufactured yet?
That's what I think too, which is why I wrote that I expect AMD to retain the capacity at TSMC for its own chips after the console chip orders for which it was originally reserved are fulfilled. External obligations, like those console chip orders, always come first though.
 

A///

Diamond Member
Feb 24, 2017
Haha, which is sort of hilarious.

Because that means that type of SKU (in terms of competitive performance and pricing) didn't exist for them before, so their forecasting models were terrible.

"Wow, there is this much potential at this price point? Intel, you've had this spot for the last decade?!? We can actually make money selling CPUs!"
That's what I don't get. I forget if it was Forrest who said that or Papermaster. It amounted to: "Because of how the competition prices similar hardware and what we estimated they move each month, we did not expect such an enthusiastic take rate."

What was the ASP for a 12-core Intel at the time? $1,200?
 