Question Speculation: RDNA2 + CDNA Architectures thread


uzzi38

Platinum Member
Oct 16, 2019
All die sizes are within 5mm^2. The poster here has been right about some things in the past AFAIK and, to his credit, was the first to say 505mm^2 for Navi21, which other people have since backed up. Even still, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

Glo.

Diamond Member
Apr 25, 2015
If what I've seen is correct, AMD have been using the XT moniker on GPU code names for years to signify the fully enabled chip (I found one example in the last ten years where it wasn't). Why would they change it now? XTX has been used a few times recently (Navi 10, Navi 14, and less recently in 2013...) to signify a more highly clocked, fully enabled chip. They might change it, but IMO it's unlikely.

What we've seen from AMD is a card around 10% slower than the 3080, and people here are talking about beating the 3090 by 10-15%. I'll be happy if AMD can produce a 5% winner vs the 3080. People here are setting themselves up for disappointment. People should go re-read the Vega thread. The same user, using the same tactics, overhyping yet another AMD launch.





I will leave you with this spoiler from the Vega thread:
To end this stupid discussion, I will ask you one question.

Where in this thread do I hype up Navi GPUs?

Point it out. Don't use something from the past. Use something from this very thread.

Show where I have hyped up Navi GPUs with claims that in reality turned out to be false or incorrect.

If you are accusing me of hyping up Navi GPUs, you should be able to point it out easily.
 

insertcarehere

Senior member
Jan 17, 2013
To end this stupid discussion, I will ask you one question.

Where in this thread do I hype up Navi GPUs?

Point it out. Don't use something from the past. Use something from this very thread.

Show where I have hyped up Navi GPUs with claims that in reality turned out to be false or incorrect.

If you are accusing me of hyping up Navi GPUs, you should be able to point it out easily.

Post history is a powerful, powerful thing...

Yeah.

Navi 23 will be just behind RTX 3070, and Navi 22 will be just behind RTX 3080. That is correct.

Guys, use your logic. Look at the RX 5700 XT and its performance per ALU/CU compared to Turing, add the redesigned caches, which increase internal bandwidth massively, add 25% core clocks (2.3 GHz) and 10% IPC.

Then apply those equations to a 50% bigger GPU with 60 CUs, and then throw a 2x bigger GPU into the mix.
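For reference, the naive linear scaling being invoked here works out roughly as in the following back-of-the-envelope Python sketch. The 25% clock and 10% IPC multipliers are the claims quoted above, not measured data, and the linear CU scaling is an optimistic assumption (real GPUs scale sub-linearly with CU count):

# Naive RDNA2 scaling estimate relative to an RX 5700 XT (= 1.0x).
BASELINE_CUS = 40      # RX 5700 XT (RDNA1), the reference point
CLOCK_GAIN = 1.25      # claimed ~25% higher core clocks (~2.3 GHz)
IPC_GAIN = 1.10        # claimed ~10% IPC uplift over RDNA1

def relative_performance(cus):
    # CU ratio x clock gain x IPC gain, assuming perfect scaling.
    return (cus / BASELINE_CUS) * CLOCK_GAIN * IPC_GAIN

for cus in (40, 60, 80):
    print(f"{cus} CUs: ~{relative_performance(cus):.2f}x RX 5700 XT")

# Prints roughly 1.38x, 2.06x and 2.75x for 40, 60 and 80 CUs;
# treat these as upper bounds, not predictions.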

I'm sometimes baffled that even when people have these things laid out plainly in front of them, they fail to see them.

And I'm not going to post where the info about bus widths and memory sizes comes from.

If a 40 CU RDNA1 GPU was able to trade blows with a 40 SM Turing GPU, why would a 40 CU RDNA2 GPU with increased IPC lose to a 44 SM GPU that LOST IPC (Ampere)? Especially with massively increased clock speeds on the RDNA2 GPU?

P.S. I think we should stop calling the ALUs in GPUs cores, and call SMs, CUs and EUs cores instead. That would be factually correct, and it would make it easier for laymen to understand the performance of GPUs, instead of the marketing gibberish that all of these companies are trying to sell us.

Incorrect.

I forgot to add to that post that N21 has 24 GB VRAM, N22 has 16 GB, N23 will have 12 GB and N24 will have 8 GB.

Navi 23 - RTX 2080 Super performance +10%, @ 150 W TBP.
Navi 22 - 10-20% above RTX 2080 Ti, @ 225 W TBP.
Navi 21 - 40-50% above RTX 2080 Ti, @ 275 W TBP.

Those were the performance targets for RDNA2 that I got a hint of a few weeks ago.
 

Konan

Senior member
Jul 28, 2017
I'm excited to see what comes, but history with AMD at this precise point in the cycle shows it's better to remain balanced and objective. This whole hype train being spun up by Z-list digital buskers chasing attention, clicks (and donations) is getting embarrassing.

I heard AIBs may have just two N21 SKU variants for launch this year.

TBP could still be close to 300 W.

Everything else looks pretty fantastic so far, but I, for one, am keeping things in check.
 

PhoBoChai

Member
Oct 10, 2017
I would assume basic logic. AMD needs a very small die for OEMs and laptops. It would still be far more powerful than any iGPU.

Indeed. They need something that maxes out at ~75 W or so for OEM desktops, and that can power down for notebook markets in the 35-80 W range.

N24 could be entry level and N23 mainstream, with N22 midrange and N21 high-end.

It's been a long time since AMD even had an entire stack for one uarch generation. The profits from the CPU side are really making a difference: the GPU division is finally able to design and bring up four dies, and for many markets.
 
May 17, 2020
I'm excited to see what comes, but history with AMD at this precise point in the cycle shows it's better to remain balanced and objective. This whole hype train being spun up by Z-list digital buskers chasing attention, clicks (and donations) is getting embarrassing.

I heard AIBs may have just two N21 SKU variants for launch this year.

TBP could still be close to 300 W.

Everything else looks pretty fantastic so far, but I, for one, am keeping things in check.
TBP or TGP? :D
 

Vope45

Member
Oct 4, 2020
Can't say that I'm surprised at this. It's not really moving the goalposts; it's more that a 2.4 GHz game clock at only 255 W was just too good to be true.

What I think can be concluded, though, is that with a high enough power limit and proper cooling, the max boost clock of 2.4 GHz is attainable.


This reinforces my thinking as to why AMD won't let the AIBs know anything about the XTX. The XTX might be liquid-cooled like the Fury X?
 

Timorous

Golden Member
Oct 27, 2008
Can't say that I'm surprised at this. It's not really moving the goalposts; it's more that a 2.4 GHz game clock at only 255 W was just too good to be true.

What I think can be concluded, though, is that with a high enough power limit and proper cooling, the max boost clock of 2.4 GHz is attainable.


This just tells me that nobody outside of specific people at AMD and the AIBs knows a damn thing.

Edit to add: this reminds me of Eyefinity, where AMD gave partners different code names for the project to see who leaked info early.

It seems entirely possible AMD has sent different AIBs different BIOSes with different clocks so they can see which AIBs are leaky and which run a tight ship.
 

NTMBK

Lifer
Nov 14, 2011
So... what are the odds that after Nvidia released the power-guzzling 3000 series, AMD decided to clock the nuts off Navi 2 and blow up the power consumption? If 2.4 GHz is true, then they are surely using more than 250 W.
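As a rough sketch of why that follows: dynamic power scales approximately with frequency times voltage squared, and voltage typically has to rise to hold clocks at the top of the V/f curve. The baseline figures below are illustrative assumptions, not leaked numbers:

# Rough dynamic-power scaling: P ~ f * V^2 (static leakage ignored).
P0 = 225.0            # assumed baseline board power in watts (illustrative)
f0, f1 = 1.9, 2.4     # assumed baseline clock vs. the rumored 2.4 GHz
V_BUMP = 1.10         # assume ~10% more voltage to sustain 2.4 GHz

P1 = P0 * (f1 / f0) * V_BUMP ** 2
print(f"~{P1:.0f} W")  # ~344 W, so "more than 250 W" looks like a safe bet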
 

zinfamous

No Lifer
Jul 12, 2006
Shoot the message, not the messenger, again.

...but it is the message. The same message, every single time, every single release based on speculation and random tweets, with speculators never willing to accept that the numbers do not translate directly across architectures, only ever assuming the very best scaling, never leaving any room for doubt.

So yeah, it happens to be the same messenger most of the time, but it's still the same message. I've been biting my tongue over the last week or so of this thread, but it really is the exact same thing that went on in the Vega thread. :D
 

zinfamous

No Lifer
Jul 12, 2006
To end this stupid discussion, I will ask you one question.

Where in this thread do I hype up Navi GPUs?

Point it out. Don't use something from the past. Use something from this very thread.

Show where I have hyped up Navi GPUs with claims that in reality turned out to be false or incorrect.

If you are accusing me of hyping up Navi GPUs, you should be able to point it out easily.

Not an honest claim at all, because it cannot be measured against an actual released product. So it can only be assumed to be hype anyway.

...I still don't know why constant tweets from randos about "leaks" are instantly assumed to be real.
 

Glo.

Diamond Member
Apr 25, 2015
Post history is a powerful, powerful thing...
First of all: as far as I understand it, "hype train" means blowing things out of proportion.

Now tell me specifically where I blew things out of proportion, especially considering that I toned down the expectations of some people in this very thread.

If you are so eager to use stuff that fits your narrative, why don't you be OBJECTIVE on the topic and also show the posts where I specifically toned down people's expectations, huh?

It's disrespectful of you.

P.S. As I specified later, the correct performance targets were: 40 CU GPU - 10% above RTX 2080 Super; 60 CU GPU - 10-20% above RTX 2080 Ti; 80 CU GPU - 40% above RTX 2080 Ti.
...but it is the message. The same message, every single time, every single release based on speculation and random tweets, with speculators never willing to accept that the numbers do not translate directly across architectures, only ever assuming the very best scaling, never leaving any room for doubt.

So yeah, it happens to be the same messenger most of the time, but it's still the same message. I've been biting my tongue over the last week or so of this thread, but it really is the exact same thing that went on in the Vega thread. :D
No. For the specific reason I posted above.

Maybe you should go and re-read the thread? You are basing your opinion on the past, not on this thread and what I have been posting here. READ IT WITH UNDERSTANDING.
Not an honest claim at all, because it cannot be measured against an actual released product. So it can only be assumed to be hype anyway.

...I still don't know why constant tweets from randos about "leaks" are instantly assumed to be real.
Right. So now you guys have made a mess of logic.

First: this is a speculation thread. Secondly, if you knew anything about Rogame, Kopite, Patrick Schur, and the few other people whose tweets land here, you would not post stuff like this.

Thirdly: we have seen the performance and efficiency of the Xbox Series X, and the conclusion about the RDNA2 architecture based on that performance is simple. It's RDNA1, but with ray tracing. IN THE WORST-CASE SCENARIO.

Now, based on your own personal fear of being disappointed, you guys accuse me of something that I do not do.

A hype train means, blatantly, that things are being blown out of proportion. And so far, everything that leaks out about RDNA2 EXCEEDS those expectations. So why am I blamed for this, when my personal expectation was a 2.1 GHz maximum clock speed on the large 80 CU die, and it turns out it's actually 2.4 GHz?

No guys, I do not deserve the treatment you give me in this thread.
 

Glo.

Diamond Member
Apr 25, 2015
Indeed. They need something that maxes out at ~75 W or so for OEM desktops, and that can power down for notebook markets in the 35-80 W range.

N24 could be entry level and N23 mainstream, with N22 midrange and N21 high-end.

It's been a long time since AMD even had an entire stack for one uarch generation. The profits from the CPU side are really making a difference: the GPU division is finally able to design and bring up four dies, and for many markets.
If N23 is slated for x500 and x600 SKUs, they do not need anything else.

APUs are for the entry-level market, aren't they?
 

Timorous

Golden Member
Oct 27, 2008
If N23 is slated for x500 and x600 SKUs, they do not need anything else.

APUs are for the entry-level market, aren't they?

If N21 is for x900 and x800, and N22 is for x700 and x600, that leaves N23 for x500.

That leaves the question of what N24 is for, but it could be an HBM version for Apple, like N12 was.
 

linkgoron

Platinum Member
Mar 9, 2005
Shoot the message, not the messenger, again.
Regarding shooting the message, here's an example of doing it correctly:
[attached screenshot]
Mockingbird was essentially 100% correct, and yet you just totally dismissed him - really showed him why he's wrong.

Anyway, you keep using the same message every time, in every AMD launch. For the record, I really hope that AMD can produce a 3090 competitor. It's just unlikely, and there's no reason to overhype; IMO a 3080 plus a few percent would be a win. RDNA was (IMO) a great success (I have a 5700 XT), and I'm hopeful that RDNA2 will be a success as well. The consoles are looking good, and I'd be delighted if the 6900 XT or whatever ends up as a 3090 competitor; it's just unlikely. Obviously, without an RDNA2 card we can't tell whether you're correct regarding RDNA2, but here's something from just a year ago, from the 5500 XT launch thread (I've only put in two quotes, but there were tons of them in that thread):
[attached screenshots of the quoted posts]

Results:

[attached benchmark results]

I will give you this, though: once it was totally clear that the 5500 XT was a dud, you admitted it:

[attached screenshot]

The only issue? People were constantly telling you that your numbers didn't add up, that they didn't align with what we knew. AMD themselves were comparing the card with the 1650. You wouldn't listen, and stated again and again that AMD was just underselling its product, or using old drivers, or some other excuse.

Nvidia fanboys still harping about Vega smfh.
You're more than welcome to look at my posting history and then call me an Nvidia fanboy.
 

Glo.

Diamond Member
Apr 25, 2015
Regarding shooting the message, here's an example of doing it correctly:
[attached screenshot]
Mockingbird was essentially 100% correct, and yet you just totally dismissed him - really showed him why he's wrong.

Anyway, you keep using the same message every time, in every AMD launch. For the record, I really hope that AMD can produce a 3090 competitor. It's just unlikely, and there's no reason to overhype; IMO a 3080 plus a few percent would be a win. RDNA was (IMO) a great success (I have a 5700 XT), and I'm hopeful that RDNA2 will be a success as well. The consoles are looking good, and I'd be delighted if the 6900 XT or whatever ends up as a 3090 competitor; it's just unlikely. Obviously, without an RDNA2 card we can't tell whether you're correct regarding RDNA2, but here's something from just a year ago, from the 5500 XT launch thread (I've only put in two quotes, but there were tons of them in that thread):

I will give you this, though: once it was totally clear that the 5500 XT was a dud, you admitted it:

[attached screenshot]

The only issue? People were constantly telling you that your numbers didn't add up, that they didn't align with what we knew. AMD themselves were comparing the card with the 1650. You wouldn't listen, and stated again and again that AMD was just underselling its product, or using old drivers, or some other excuse.
You do realize that you are committing a logical fallacy?

I am asking you to show stuff from THIS THREAD about Navi 2 that I am overhyping, and you go into previous threads from the past.

If you have something to say about what I am doing right now, based on current facts - sure, I will give you credit.

If you were being objective, you would also find posts in this thread in which I tone down people's expectations.

P.S. If you don't have RDNA2 GPUs on hand, how the hell can you accuse me of overhyping the stuff?

That is the pinnacle of logical fallacy.
 

linkgoron

Platinum Member
Mar 9, 2005
You do realize that you are committing a logical fallacy?

I am asking you to show stuff from THIS THREAD about Navi 2 that I am overhyping, and you go into previous threads from the past.
This will be my last post on this subject; I've derailed this thread enough as it is, and I assume that I'm bordering on getting a personal-attack warning from a mod at this stage. As I stated in my previous post, "without a RDNA2 card we can't see if you're correct regarding RDNA2". However, your comments in this thread are exactly the same as those you put in the 5500 XT thread and the Vega thread - both turned out completely wrong. It could be that RDNA2 will end up at 2.5 GHz and will obliterate the 3090 while consuming 290 watts. However, it's unlikely.

If you were being objective, you would also find posts in this thread in which I tone down people's expectations.
This is from yesterday!

Oh crap, even here I misread that.

I guess that 2.4 GHz figure still clouded my ability to read with understanding.

Dear lord, it makes it even more mind-blowing.
If it's an 80 CU die with a 2.4 GHz GAME CLOCK at 255 W TGP, then this is freaking the most efficient GPU architecture that has ever been made by any company.
If it was just for a demo, they could've used ANY configuration of the die, with ANY clock speeds.

They could've used the Navi 21 die with a special "demo" BIOS specifically designed to blatantly sandbag its performance.

I'm not saying this is what happened. But it's possible, after all.
Which SKU in particular are you saying is on par with the RTX 3080?

Because maybe you missed it, but there is no way in hell the top SKU is only on par with the 3080. No way in hell.

Yes, guys. Kepler_L2 is correct.

Historically, XT and XTX are full dies. They just differ by clock speeds and power targets.

So it's not 72 CUs. It's the full 80 CUs, with a 2.4 GHz game clock, at 255 W of power drawn (hence the TGP).

I'm lost. It's GG. It's beyond EVERYTHING we thought it could possibly be.

Definitely toning down people's expectations. IIRC there was also earlier talk about 3 GHz clocks...
 

Glo.

Diamond Member
Apr 25, 2015
This will be my last post on this subject; I've derailed this thread enough as it is, and I assume that I'm bordering on getting a personal-attack warning from a mod at this stage. As I stated in my previous post, "without a RDNA2 card we can't see if you're correct regarding RDNA2". However, your comments in this thread are exactly the same as those you put in the 5500 XT thread and the Vega thread - both turned out completely wrong. It could be that RDNA2 will end up at 2.5 GHz and will obliterate the 3090 while consuming 290 watts. However, it's unlikely.


This is from yesterday!






Definitely toning down people's expectations.
The fact that you conflate speculation based on what has been leaking recently with hype comes from your flawed logic. It's your problem, not mine.

Secondly: so you are only looking for what fits your expectations?

From yesterday:



Ok. Can somebody stop this train? We're approaching Earth's orbit right now!
Why did I post this? Because the info is unbelievable.
Next, about the specs from the power tables leaked from macOS:
Which could be stand-in data due to incomplete/missing specs.
Here the user says that the data in the macOS power tables is incomplete, to which I responded:
Yeah. Missing 48 CUs, and missing 800 MHz on the GPU clocks of the Big Navi die...

No. It doesn't work that way, as has been specified multiple times in this very thread.
I assume that you are intelligent enough to spot the irony, but I will tell you anyway in case you missed it: it was sarcasm.

Next:
The 40 CU part has a 400-600 MHz higher clock speed than the 5700 XT. It will be closer to 30% faster, which interestingly enough places it close to where this card performs. Each die will likely have 3 different SKUs, with the biggest and best card being nearly 3x as fast as a 5700 XT.
To which I replied:
And people call me a hype train conductor...
Which meant: "Yeah, no. It won't be."

If it isn't clear by now, for those who are unable to read with understanding: on RDNA2 I'm pretty neutral. I won't say that it will beat Nvidia, which is not my expectation at all, but I also will not let myself, or anyone else, believe that AMD will not be able to compete with Nvidia.
 

Mopetar

Diamond Member
Jan 31, 2011
Yeah I don't know what whycry's problem is. It's obvious that IF these specs are true then it's more than just his "beats the 3070".
Well if it beats a 3080 then it's still true that it beats a 3070. Maybe they're just seeing if they can sandbag harder than AMD is alleged to be doing.

When the banks of Hype River really start to overflow, a little sandbagging is necessary. Gotta keep the fanboys and girls from getting soaked even if they're already a little moist.