
Question Speculation: RDNA2 + CDNA Architectures thread

Page 140

PhoBoChai

Member
Oct 10, 2017
119
389
106
I think 505mm^2 N21 is nonsense.

That would give you 83 good dies per wafer. Even if they were all 6900XT and the 6900XT is 3080+5% it means AMD can look to charge around $700 per card. That is revenue of $58,100.

OTOH a wafer of Zen3 dies is 782 good CCDs. Even if all are used in 5600X's that would be a revenue of $230k.

A 500+ mm^2 GPU is far far far too costly for AMD when they can generate 4x the revenue by making Zen3 parts instead.

Edit To Add: The other option is that availability is like the 3080/3090: non-existent.
That's the downside of any monolithic design vs chiplets, and in general GPU revenue is weaker than CPU revenue per unit of silicon die area.

EPYC is ramping like crazy lately, for anyone who has noticed: VMs, datacenter and supercomputers are all aboard. The beauty of Zen 2 or Zen 3 dies here is that, per wafer, their revenue is MASSIVELY more than anything AMD can get with gaming GPUs.

When your wafer is limited, you want to devote most of them to the higher revenue & profit products.
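The per-wafer arithmetic above can be sketched with the standard die-per-wafer estimate plus a simple Poisson yield model. The die sizes and the defect density below are assumptions for illustration, not official TSMC or AMD figures:

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """First-order die-per-wafer estimate: wafer area / die area,
    minus an edge-loss correction term."""
    r = wafer_diameter_mm / 2
    return math.floor(
        math.pi * r**2 / die_area_mm2
        - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    )

def good_dies(die_area_mm2, defect_density_per_cm2):
    """Poisson yield model: Y = exp(-D0 * A)."""
    area_cm2 = die_area_mm2 / 100
    yield_frac = math.exp(-defect_density_per_cm2 * area_cm2)
    return math.floor(gross_dies_per_wafer(die_area_mm2) * yield_frac)

# Assumed sizes: ~505 mm² for N21 (the rumour under discussion),
# ~81 mm² for a Zen 3 CCD; D0 = 0.06/cm² is a guess, not a fab figure.
n21 = good_dies(505, 0.06)
ccd = good_dies(81, 0.06)
print(n21, ccd)  # lands in the same ballpark as the 83 and 782 quoted above
```

With these guessed inputs the model roughly reproduces the 83 vs 782 good dies in the post; the exact counts are very sensitive to the assumed defect density.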
 

PhoBoChai

Member
Oct 10, 2017
119
389
106
And yet, the same team just without that special "clog" is now performing and behaving so differently, it's like, just like, his departure made a difference.
More like they actually have funding since Zen 1's success in 2017, right after his departure. AMD has been on a hiring spree, and Lisa Su also moved Zen engineers to help the graphics division. IDK if anyone here has ever been in charge of R&D, but manpower + money makes a hell of a difference.
 

Glo.

Diamond Member
Apr 25, 2015
4,806
3,424
136

More info from Paul. According to his source, the full board power for the reference RX 6900 XT is 280W, and this GPU competes very well with the RTX 3090: in some games it's faster, in some it's slower.

His source firmly suggests that the reference design is under 300W of power, despite what Igor suggested yesterday.
 

Edgy

Senior member
Sep 21, 2000
366
20
81
Well... Keller probably drew the shortest straw if one considers funding/resources, or the lack thereof (especially compared to Intel), but nevertheless Ryzen was/is considered a roaring success.

What I see is that Keller left AMD much better than when he was hired - no drama, no excuses.

Raja certainly didn't do that. If what I read is true, that Raja complained about half his team being pulled into Navi development before he left for Intel, this leads to 2 significant extrapolations:

1. He made enough effort at excuses for his failures that news like this got out to the public.
2. He might have been aware of Navi tech, as a person in a high enough position within AMD, but he was not the driving force behind it.

Let's hope RDNA2 succeeds, and that even Intel finds success with their graphics cards for a more competitive market, but Raja would not be getting credit for either success in my book even if they happen.
 

Tup3x

Senior member
Dec 31, 2016
526
392
136

More info from Paul. According to his source, the full board power for the reference RX 6900 XT is 280W, and this GPU competes very well with the RTX 3090: in some games it's faster, in some it's slower.

His source firmly suggests that the reference design is under 300W of power, despite what Igor suggested yesterday.
The BIOS he had wasn't from reference board.
 

Kuiva maa

Member
May 1, 2014
169
203
116
I think 505mm^2 N21 is nonsense.

That would give you 83 good dies per wafer. Even if they were all 6900XT and the 6900XT is 3080+5% it means AMD can look to charge around $700 per card. That is revenue of $58,100.

OTOH a wafer of Zen3 dies is 782 good CCDs. Even if all are used in 5600X's that would be a revenue of $230k.

A 500+ mm^2 GPU is far far far too costly for AMD when they can generate 4x the revenue by making Zen3 parts instead.

Edit To Add: The other option is that availability is like the 3080/3090: non-existent.
If die allocation according to margins was the only factor driving AMD business, they would only be making Epyc. However it doesn't work like that, they require volume and that means they need to be present in all x86 markets. Also if they want to have variance in their portfolio (they do) and to have access to semicustom deals, they absolutely need graphics. And in order to be competitive in the GPU market you need to sell in all market segments.
 

Timorous

Senior member
Oct 27, 2008
625
673
136
If die allocation according to margins was the only factor driving AMD business, they would only be making Epyc. However it doesn't work like that, they require volume and that means they need to be present in all x86 markets. Also if they want to have variance in their portfolio (they do) and to have access to semicustom deals, they absolutely need graphics. And in order to be competitive in the GPU market you need to sell in all market segments.
I never said it was the only factor. The point is, if you want a better supply of N21 GPUs, you had better hope it is smaller: the larger it is, the fewer they will make due to that disparity. That disparity also gives AMD a lot more pricing flexibility to try and get OEMs to offer better products with AMD parts in them.

Another good note: if the TBP is 280W like Paul suggests, then that is less than the Radeon VII, and considering the Radeon VII used HBM, it means a lower % of the TBP is for the GPU die itself, so a 330mm² or so N21 will not have an excessive W/mm².
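The W/mm² comparison is easy to sanity-check. The non-GPU power budgets below (memory, VRM losses, fan) are pure guesses, so this only shows how the arithmetic works, not what the real split is:

```python
# Back-of-envelope power-density check. All figures are assumptions,
# not measurements: Radeon VII had a 300 W TBP and a ~331 mm² die.

def die_power_density(tbp_w, non_gpu_w, die_area_mm2):
    """W/mm² attributable to the GPU die, after subtracting an
    assumed budget for memory, VRM losses and fans."""
    return (tbp_w - non_gpu_w) / die_area_mm2

r7  = die_power_density(300, 90, 331)  # guess: ~90 W for HBM2 + board
n21 = die_power_density(280, 60, 330)  # guess: ~60 W for GDDR6 + board
print(f"R7 ~ {r7:.2f} W/mm², hypothetical 330 mm² N21 ~ {n21:.2f} W/mm²")
```

Whether a 330mm² N21 ends up above or below the Radeon VII on this metric depends entirely on the assumed non-GPU budgets, which is why the argument hinges on the HBM-vs-GDDR6 split.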
 

beginner99

Diamond Member
Jun 2, 2009
4,818
1,205
136
Fiat for crypto is so much easier. I'll let you do the hassle, I had my turn too briefly too late to be awesome but early enough for actual returns.
Yeah, mining is just annoying, especially nowadays. Hardly any profit, and you still have to deal with the heat. I tried it a long time ago which, in hindsight, was extremely profitable. Still, I stopped and just bought some ETH with cash.

on-topic:

Yeah, if the 6900XT competes with a 3090, $999 is probably the minimum price to expect. As I have said and others have repeated, AMD has no incentive to lower prices to gain market share. Every GPU wafer is a waste of money compared to a Zen2/3 wafer, and supply is limited. NV was the only hope for sane prices and plenty of supply, since no one else uses Samsung. But well...
 

Kuiva maa

Member
May 1, 2014
169
203
116
I don't think AMD can charge $1.2k just like that for the top model. It needs something to compete against the 3080 too; lest we forget, the 3090 is not that much faster. If the second-best card loses to the 3080, a very expensive 6000 series flagship would only cement the latter's position as the actual realistic flagship. Now if the second-best competes favorably against the 3080, things change.
 

kurosaki

Senior member
Feb 7, 2019
257
247
86
I don't think AMD can charge $1.2k just like that for the top model. It needs something to compete against the 3080 too; lest we forget, the 3090 is not that much faster. If the second-best card loses to the 3080, a very expensive 6000 series flagship would only cement the latter's position as the actual realistic flagship. Now if the second-best competes favorably against the 3080, things change.
Kind of sad that you can't get a reasonable top GPU at the $300 price point any more. What happened?
How come $500 is the new $300? Voting with the wallet has no effect if everyone else doesn't care. Even the 3090s seem to have sold out...
 

Kuiva maa

Member
May 1, 2014
169
203
116
I never said it was the only factor. The point is, if you want a better supply of N21 GPUs, you had better hope it is smaller: the larger it is, the fewer they will make due to that disparity. That disparity also gives AMD a lot more pricing flexibility to try and get OEMs to offer better products with AMD parts in them.

Another good note: if the TBP is 280W like Paul suggests, then that is less than the Radeon VII, and considering the Radeon VII used HBM, it means a lower % of the TBP is for the GPU die itself, so a 330mm² or so N21 will not have an excessive W/mm².
That's the problem with that reasoning right there. A 330mm² N21 can't possibly compete with the 3080, can it now? You may flood the shelves with them from day 1, but I, and others like me, are simply not in the market for a product in that segment. I mean, if AMD could magically pull this off, more power to them and I would gladly buy the product, but realistically speaking, they need a bigger die if they want to address the market segment the 3080 targets.
 

sandorski

No Lifer
Oct 10, 1999
68,366
3,486
126
Not gonna lie, I'm really excited to see RDNA2. The potential of being the first single corporation to hold CPU and GPU gaming mastery simultaneously is a significant milestone. They might fall short of that, but they are so close.
 

leoneazzurro

Senior member
Jul 26, 2016
367
487
136
According to Patrick Schur (I cannot post the Twitter link here), the Radeon 6900XT scores "more than 10000 points" in 3DMark Fire Strike Extreme. This means it is approximately on the same level as the 3080. It would be interesting to know if we really have an XTX above that...
 

Glo.

Diamond Member
Apr 25, 2015
4,806
3,424
136
According to Patrick Schur (I cannot post the Twitter link here), the Radeon 6900XT scores "more than 10000 points" in 3DMark Fire Strike Extreme. This means it is approximately on the same level as the 3080. It would be interesting to know if we really have an XTX above that...
He said it's the XT that scores over 10k in 3DMark.

6900XT is XTX.

6800XT is XT.

:)
 

moinmoin

Platinum Member
Jun 1, 2017
2,556
3,275
136
I'm genuinely surprised that we are not discussing one of the features that appears to be confirmed, at this point, to be featured and working for the RDNA2 architecture, and for RDNA1.

I wonder if anyone knows what I'm talking about; it landed recently in an upcoming driver release.
Yeah, I wonder why we are not discussing features that nobody is talking about. 🤔
 
