Question Speculation: RDNA2 + CDNA Architectures thread

Page 91 - AnandTech Forums

uzzi38

Platinum Member
Oct 16, 2019
2,705
6,427
146
All die sizes are within 5mm^2. The poster here has been right on some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have backed up. Even still though, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,523
3,037
136
Well yes, but Nvidia has had the better architecture for quite some time now. All it takes is a better engineering process and it's probably back to the status quo. AMD has generally had a good chip, but they seem to always add too much voltage. They also preferred HD libraries vs high performance ones. This is why AMD's chips are much smaller in size compared to Nvidia's.
All it takes is for some genius at AMD to say let's build a big chip this time, screw density, and let's have at it.
So either the stupidity is now gone, or Nvidia foooked up badly hoping Samsung cheapness will keep their stockholders happy
For which chip did they prefer HD libraries instead of high performance ones? It's not RDNA1; Renoir has almost 50% better transistor density. Polaris and Vega on 14nm have comparable transistor density, and Nvidia's Pascal had just ~10% worse because it was built on a 16nm process.
If you compare RDNA1 vs Turing, then it's because they are built on 7nm vs 12nm processes, and Ampere on the 8nm process has higher density than RDNA1.
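The density comparison above can be sanity-checked with publicly reported transistor counts and die sizes. A quick sketch in Python (the transistor and die-size figures are my own assumption from public die-shot reports, not from this thread):

```python
# Rough sanity check of the Navi10 vs Renoir density claim.
# Assumed figures (not from this thread): Navi10 ~10.3B transistors on
# ~251 mm^2, Renoir ~9.8B transistors on ~156 mm^2, both TSMC N7.

def density_mtr_per_mm2(transistors_billion: float, die_mm2: float) -> float:
    """Transistor density in millions of transistors per mm^2."""
    return transistors_billion * 1000 / die_mm2

navi10 = density_mtr_per_mm2(10.3, 251)   # ~41 MTr/mm^2
renoir = density_mtr_per_mm2(9.8, 156)    # ~63 MTr/mm^2

print(f"Navi10: {navi10:.1f} MTr/mm^2")
print(f"Renoir: {renoir:.1f} MTr/mm^2")
print(f"Renoir advantage: {renoir / navi10 - 1:.0%}")
```

With those assumed figures the gap works out to roughly +50%, consistent with the "almost 50% better" claim.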
 

beginner99

Diamond Member
Jun 2, 2009
5,233
1,610
136
Some expectations here are totally unrealistic IMO.

Yeah. Always the same. 80CU at 2.2 GHz seems somewhat realistic at a corresponding power consumption of around 300W. But now we are suddenly at 2.7 GHz with 250W. People here are setting the launch up to be a failure, just due to the overhyped expectations, if the top SKU lands at 3080 level with just lower power use.
 

Tup3x

Golden Member
Dec 31, 2016
1,086
1,085
136
I guess the final clock will depend on how nuts they go with the power consumption. I'm guessing that the top 80CU version will use 300W ±25W, depending on how far above 2 GHz it clocks.
 

misuspita

Senior member
Jul 15, 2006
498
592
136
All AMD's recent GPU launches were plagued by unrealistic fanboy expectations which were shot down by reality.

I'd say it's stuuupid to jump on another hype train. Especially in a forum full of smart people, with enough information not to fall for this gossip.

The reality will hurt and go against AMD. If they do realise the impossible and release a 3 GHz 96CU 300W RDNA2, yepedeedoo! But my expectations are at 2.2 GHz, 80CU, maybe 250W. And they will not move, regardless, till launch day. If they do, it will have to be rock-hard proof, not forum ramblings.

/rant :)
 

uzzi38

Platinum Member
Oct 16, 2019
2,705
6,427
146
All AMD's recent GPU launches were plagued by unrealistic fanboy expectations which were shot down by reality.

I'd say it's stuuupid to jump on another hype train. Especially in a forum full of smart people, with enough information not to fall for this gossip.

The reality will hurt and go against AMD. If they do realise the impossible and release a 3 GHz 96CU 300W RDNA2, yepedeedoo! But my expectations are at 2.2 GHz, 80CU, maybe 250W. And they will not move, regardless, till launch day. If they do, it will have to be rock-hard proof, not forum ramblings.

/rant :)
After all that you ended up with an expectation way beyond my own lol.

Not to mention an expectation people would have called you completely insane for making just two days ago.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,741
2,717
146
Yeah. Always the same. 80CU at 2.2 GHz seems somewhat realistic at a corresponding power consumption of around 300W. But now we are suddenly at 2.7 GHz with 250W. People here are setting the launch up to be a failure, just due to the overhyped expectations, if the top SKU lands at 3080 level with just lower power use.
I would be perfectly happy with a high end card at around 3080 levels with less power usage. I mean, the 3080 is already using more than 300W. If I get more than expected, well, I will be happy and take it, but I am not expecting miracles. I hope there is at least sufficient stock though.
 

beginner99

Diamond Member
Jun 2, 2009
5,233
1,610
136
I would be perfectly happy with a high end card at around 3080 levels with less power usage. I mean, the 3080 is already using more than 300W. If I get more than expected, well, I will be happy and take it, but I am not expecting miracles. I hope there is at least sufficient stock though.

Agree, but people will be disappointed and call it a fail because they expected 250W RTX3090 performance for $699.

Stock for sure will be an issue. Zen3, consoles and RDNA2. A little much all at once, I fear.
 

misuspita

Senior member
Jul 15, 2006
498
592
136
After all that you ended up with an expectation way beyond my own lol.

Not to mention an expectation people would have called you completely insane for making just two days ago.
Yes, it's a new upper limit for me :)))))

A few weeks ago I wouldn't dare go that high, and you're right, it's stupidly high even at that. But that's what I set as the absolute highest I could see them go this generation. If the real product lands lower, it won't bother me, because I have no intention of buying (Eastern Europe, money is tight in this Covid period), although it will be disappointing, as Nvidia will reign free again. But I hope they do it, because competition drives progress, and a good halo product may make for good lower-stack products as well.
 

linkgoron

Platinum Member
Mar 9, 2005
2,409
979
136
Lol. Hype train has reached moon orbit.
It's clear that we are finally reaching the mythical "Raja's GPU" we have been waiting for.
Yeah. Always the same. [...] People here are setting the launch up to be a failure, just due to the overhyped expectations, if the top SKU lands at 3080 level with just lower power use.
What's disappointing is that these are the same people that overhyped Vega. Posts implying that Vega would have better perf/watt and performance, would beat the 1080ti, would reach 50% market share, would have "tremendous perf/$" and kick Nvidia's ass, or that Nvidia released the 1080ti because of Vega's awesome performance.

Navi was a really good step forward, and AMD definitely have their chance to actually compete after Nvidia's somewhat weaker launch (maybe launches, if we include Turing), but there's no need to overhype so much.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
My prediction for the two top cards with today's info.

RX6900XT with 16GB HBM2E ram / ~RTX3090 raster performance at $1299
RX6800XT with 16GB HBM2E ram / ~RTX3080 raster performance at $699
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
My prediction for the two top cards with today's info.

RX6900XT with 16GB HBM2E ram / ~RTX3090 raster performance at $1299
RX6800XT with 16GB HBM2E ram / ~RTX3080 raster performance at $699
Can you give me some details on the launch, availability and embargo dates for reviews on the AMD RX 5300?
 

CakeMonster

Golden Member
Nov 22, 2012
1,502
659
136
NV has left such a gap in memory and price between the 3080 and 3090 that if AMD can compete, they will dictate the price. If not, expect NV to price the 3080 20GB models in a way that will make you look at the 3090 again, or angrily buy the 3080 10GB that you could have gotten earlier if you had just known. I really hope AMD messes up the pricing while still staying profitable.
 

Glo.

Diamond Member
Apr 25, 2015
5,803
4,777
136
A few things to add about the macOS power tables.

IF those power tables are for Apple GPUs, those are the specs within which the Apple GPUs have to operate. The TDP is for the WHOLE GPU BOARDS that are connected to Macs, or rather, the Radeon Pro versions.

Otherwise, it's just what DisEnchantment explained: another power table that came from AMD.

If we find BIOSes for those GPUs in macOS, then we are home. From those BIOSes we can get everything related to clock gating, voltages, power gates, etc.

For example, the Radeon Pro D700 had a 129W TDP for the whole board, at 1.025V, clocked at 850 MHz. It was possible to extract the exact details from macOS back in the day. I don't know if it's still possible.
 

Glo.

Diamond Member
Apr 25, 2015
5,803
4,777
136
Yeah. Always the same. 80CU 2.2ghz seems to be somewhat realistic at according power consumption around 300W. But now we suddenly are at 2.7 ghz with 250W. People here are setting the launch up to be a failure just due to the over hyped expectations if the top sku lands at 3080 level with just lower power use.
Who, exactly, is talking about 2.7 GHz at 250W for an 80CU GPU?

Or is that just your own projection?
 

Asterox

Golden Member
May 15, 2012
1,039
1,823
136

Yes, very legit, all models have the same 1500 MHz GPU clock, very legit. :grinning: I am not drunk, so I will not play the same guessing clickbait game.

We have the PS5 as an example, and that is a gaming console, not a wood stove.

RDNA2, 36CU up to 2.2 GHz

RDNA2 desktop GPU, as rumored 40CU, but now only a 1500 MHz GPU clock
 

BlitzWulf

Member
Mar 3, 2016
165
73
101
Yes, very legit, all models have the same 1500 MHz GPU clock, very legit. :grinning: I am not drunk, so I will not play the same guessing clickbait game.

We have the PS5 as an example, and that is a gaming console, not a wood stove.

RDNA2, 36CU up to 2.2 GHz

RDNA2 desktop GPU, as rumored 40CU, but now only a 1500 MHz GPU clock



Base clock, not boost. Clocks might not be fully finalized yet and 1500 may be a placeholder, or maybe it is fake, we'll see!
 
Last edited:

TESKATLIPOKA

Platinum Member
May 1, 2020
2,523
3,037
136
These specs at Newegg are not the actual specs; where would they even get them?

I was thinking about how there is no mention of anything higher than 256-bit GDDR6, only HBM2.
My speculation is that the 60-64CU N21 will have only 256-bit GDDR6.
A 40CU N22 at 2.5 GHz with 192-bit 16 Gbps GDDR6 would have 12.8 TFLOPs and 384 GB/s.
A 60-64CU N21 at 2.1 GHz with 256-bit 16 Gbps GDDR6 would have 16.13-17.2 TFLOPs and 512 GB/s.
The increase in TFLOPs and bandwidth is comparable, so this is highly possible in my opinion.
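The scaling math above can be written out explicitly. A small Python sketch of the standard formulas (the CU counts and clocks are the post's speculation, not confirmed specs): FP32 TFLOPs = CUs × 64 lanes × 2 ops/clock × GHz, and bandwidth = (bus width / 8) × per-pin data rate.

```python
# Worked version of the TFLOPs and bandwidth figures in the post above.
# All configurations are speculative, not confirmed AMD specs.

def tflops(cus: int, ghz: float) -> float:
    """FP32 throughput: CUs x 64 lanes x 2 ops/clock (FMA) x clock."""
    return cus * 64 * 2 * ghz / 1000

def gddr6_bandwidth(bus_bits: int, gbps_per_pin: float) -> float:
    """Memory bandwidth in GB/s: bus width in bytes times data rate."""
    return bus_bits / 8 * gbps_per_pin

# Speculated N22: 40CU @ 2.5 GHz, 192-bit 16 Gbps GDDR6
print(tflops(40, 2.5), gddr6_bandwidth(192, 16))   # 12.8 TFLOPs, 384 GB/s

# Speculated cut-down N21: 60-64CU @ 2.1 GHz, 256-bit 16 Gbps GDDR6
print(tflops(60, 2.1), tflops(64, 2.1))            # ~16.13 to ~17.2 TFLOPs
print(gddr6_bandwidth(256, 16))                    # 512 GB/s
```

Both compute and bandwidth grow by roughly the same factor between the two configurations, which is the "comparable increase" the post is pointing at.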

Now what about the full 80CU chip? I think N21 could have both GDDR6 and HBM2 interfaces on the same die; combined, they probably won't use up more die space than a 384-bit GDDR6 interface, so there is no problem. They would save some watts which could be used for a higher GPU clockspeed, and although HBM2 will be costlier than GDDR6, they can increase the price if it performs better than the RTX3080. Even if they had minimal margins on it, it's still great advertisement for AMD that they are back.

With this in mind, the amount of VRAM could be:
N23 - 8GB 128-bit GDDR6
N22 - 12GB 192-bit GDDR6
Cut-down N21 - 16GB 256-bit GDDR6
Full N21 - 16GB HBM2 -> 2 stacks of 8GB 2.4 Gbps 2048-bit HBM2E would mean 614 GB/s, or 20% more than 256-bit 16 Gbps GDDR6. If that's still not enough, there is still HBM Flashbolt with 33% higher bandwidth (3.2 Gbps) than HBM Aquabolt, but that will be even costlier. At worst I am wrong and it will have 384-bit GDDR6.
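The HBM2E-vs-GDDR6 comparison above uses the same bandwidth formula. A quick sketch (stack counts and per-pin rates are the post's speculation):

```python
# Bandwidth comparison for the speculated full-N21 memory options.
# bandwidth (GB/s) = total bus width in bits / 8 * per-pin data rate in Gbps

def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

gddr6_256   = bandwidth_gb_s(256, 16)     # 256-bit 16 Gbps GDDR6
hbm2e_2x8gb = bandwidth_gb_s(2048, 2.4)   # 2 stacks x 1024-bit @ 2.4 Gbps
hbm2e_fast  = bandwidth_gb_s(2048, 3.2)   # Flashbolt-class 3.2 Gbps pins

print(gddr6_256, hbm2e_2x8gb, hbm2e_fast)
print(f"HBM2E over GDDR6: {hbm2e_2x8gb / gddr6_256 - 1:.0%}")
```

This reproduces the post's numbers: 512 vs ~614 GB/s, i.e. the quoted 20% advantage, with the faster stacks landing around 819 GB/s.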
 
Last edited:

Asterox

Golden Member
May 15, 2012
1,039
1,823
136
Base clock, not boost. Clocks might not be fully finalized yet and 1500 may be a placeholder, or maybe it is fake, we'll see!

Placeholder for what? If you go by the AMD slide, it is very easy to put out solid clickbait blah. So if you go by the AMD slide, where is the increased clock speed if the stock GPU clock is only 1500 MHz? :grinning: So the GPU base clock is lower; OK, then probably IPC is also lower. In short, RDNA2 is garbage, buy Nvidia and no problem.

RX 5700XT, base GPU clock 1600 MHz


RX 6700XT, base clock 1500 MHz

[Attached image: 2020-09-27_151333.jpg]