Speculation: RDNA2 + CDNA Architectures thread


uzzi38

Platinum Member
Oct 16, 2019
2,565
5,575
146
All die sizes are within 5mm^2. The poster here has been right about some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have since backed up. Even still, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
No, I apologize as I wasn't clear. You cannot use CSM instead of native UEFI and expect SAM to work. So there goes 95% of the machines out there.

Even then, most machines that people would be adding these GPUs to aren't running CSM. My current X570 motherboard doesn't even have the option for CSM; it's UEFI only. I know there are boards that still support CSM, but it's typically off by default.

So yes, some people may have to reinstall if they for some reason chose CSM/Legacy over UEFI. But most people putting a high-end GPU into their machine won't be running an old system.

But I can certainly see how people that did choose CSM/Legacy would be... annoyed.
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
About CSM: it comes enabled on almost every motherboard I know. I know this very well because my network boot system runs on legacy PXE, and I haven't had the time to move it to UEFI as that requires deep changes.

I know of SOME boards where it comes disabled by default, but those are a minority; maybe now this will change. I also know of some high-end boards without CSM, again a tiny minority.

BUT CSM being enabled means nothing: all boards give priority to UEFI, so even if you installed the OS with CSM enabled, the OS is UEFI UNLESS the user changed the boot order or manually selected the non-UEFI boot drive.

So most users should be fine.
 
  • Like
Reactions: Leeea

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I just checked every PC I have using System Information, and all of them have the BIOS mode set to UEFI; none of them are Legacy. And this is across both home-built and pre-built Dell systems.
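
For anyone who wants to script the same check, here's a minimal sketch in Python (assuming Windows 8 or newer for the GetFirmwareType call; on Linux it falls back to the standard /sys/firmware/efi test):

import ctypes
import os
import platform

def firmware_mode() -> str:
    """Best-effort check of whether the OS booted via UEFI or legacy BIOS/CSM."""
    if platform.system() == "Windows":
        # GetFirmwareType (kernel32, Windows 8+): 1 = legacy BIOS, 2 = UEFI
        fw_type = ctypes.c_uint(0)
        if ctypes.windll.kernel32.GetFirmwareType(ctypes.byref(fw_type)):
            return {1: "Legacy BIOS", 2: "UEFI"}.get(fw_type.value, "Unknown")
        return "Unknown"
    # On Linux, /sys/firmware/efi only exists when the kernel was booted via UEFI
    return "UEFI" if os.path.isdir("/sys/firmware/efi") else "Legacy BIOS"

print(firmware_mode())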
 
  • Like
Reactions: lightmanek

gdansk

Golden Member
Feb 8, 2011
1,978
2,354
136
I use CSM/Legacy for an old network card too. It didn't occur to me that reBAR wouldn't work, but it isn't altogether surprising. Perhaps it's an impetus to figure out why it won't boot in UEFI mode.

Edit: That was an easy change, everything seems to work. But still need to test network boot.
 
Last edited:

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
I just checked every PC I have using System Information, and all of them have the BIOS mode set to UEFI; none of them are Legacy. And this is across both home-built and pre-built Dell systems.

Yes, this is what I meant when I said that having CSM enabled tends to do nothing unless the user does something to install the OS under legacy mode.
 
  • Like
Reactions: Stuka87

Panino Manino

Senior member
Jan 28, 2017
813
1,010
136
I'm "concerned" about RDNA2 cards.
AMD really performed a miracle (and they say that the performance per watt with undervolt is unbelievable), but a lot of consumers will keep demanding more and more RT and Image Reconstruction. These cards can deliver both, but contrary to Nvidia's cards AMD's cards will have to give up rasterization resources to do this. It's just too much, and the same consumers will point fingers, "see, AMD cards are bad!".
 

itsmydamnation

Platinum Member
Feb 6, 2011
2,743
3,075
136
I'm "concerned" about RDNA2 cards.
AMD really performed a miracle (and they say that the performance per watt with undervolt is unbelievable), but a lot of consumers will keep demanding more and more RT and Image Reconstruction. These cards can deliver both, but contrary to Nvidia's cards AMD's cards will have to give up rasterization resources to do this. It's just too much, and the same consumers will point fingers, "see, AMD cards are bad!".
What are you talking about...?

Your understanding of both NV and AMD architectures seems to be wrong.
 

Glo.

Diamond Member
Apr 25, 2015
5,658
4,418
136
Paul from RedGamingTech claims that the Navi 22-based 6700 XT is 20% faster than the RX 5700 XT and costs $350, while a cut-down version will cost $299.

Basically the same info Moore's Law is Dead posted a few days ago.
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
Paul from RedGamingTech claims that the Navi 22-based 6700 XT is 20% faster than the RX 5700 XT and costs $350, while a cut-down version will cost $299.

Basically the same info Moore's Law is Dead posted a few days ago.

At what clock? $350 leaves a HUGE price gap there.
 

Mopetar

Diamond Member
Jan 31, 2011
7,797
5,899
136
Paul from RedGamingTech claims that the Navi 22-based 6700 XT is 20% faster than the RX 5700 XT and costs $350, while a cut-down version will cost $299.

Basically the same info Moore's Law is Dead posted a few days ago.
Doesn't that clash with some earlier rumors that were expecting it to be clocked a lot higher? It should have the same 40 CUs at a much higher clock speed, along with the benefits of Infinity Cache. Unless they've decided to use more conservative clock speeds and power consumption, 20% comes across as underwhelming. If it's just using the Navi 21 clock speeds that both of the XT cards use, that alone is almost 20% above the boost clock that the 5700 XT got. The memory bus is smaller, but faster memory and the Infinity Cache should compensate for that.

A 20% performance boost still puts AMD squarely in line with Nvidia and 3060 Ti performance at 1080p, but it would open a bit of a gap as the resolution increases. A $50 price difference would help them sell a lot of cards under normal circumstances, but I don't think it makes a lot of sense when there are this many supply issues.
 

Glo.

Diamond Member
Apr 25, 2015
5,658
4,418
136
If the rumored performance is true, and the clock is true (2.5 GHz @ 220W TBP), then it's straight up bad. Something went wrong with this GPU.

Because it suggests AMD somehow lost performance per CU versus the previous generation, and even versus Navi 21.
 

moinmoin

Diamond Member
Jun 1, 2017
4,933
7,619
136
If the rumored performance is true, and the clock is true (2.5 GHz @ 220W TBP), then it's straight up bad. Something went wrong with this GPU.

Because it suggests AMD somehow lost performance per CU versus the previous generation, and even versus Navi 21.
This feels like déjà vu.
 

Saylick

Diamond Member
Sep 10, 2012
3,084
6,184
136
The 6800 has 60 CUs and the full 128 MB of Infinity Cache, yet it only performs roughly 50% faster than the 5700 XT at 1440p according to Computerbase. KitGuru came to a similar conclusion, and TechSpot has similar numbers for the 6800 vs the 5700 XT. According to KitGuru, the 6800 averaged 2200 MHz and the 5700 XT averaged 1781 MHz over a 30-minute run of 3DMark Time Spy.

Therefore, the 6800 has 50% more CUs than the 5700 XT, >20% higher clocks, and the full 128 MB of Infinity Cache, yet it only performed 50% better, which seems like poor scaling. This suggests that if the 6700 XT (Navi 22) clocks to roughly 2200 MHz, it will be on par with a 5700 XT. If it can reliably clock above 2500 MHz, then we could potentially be looking at 5700 XT +20%.
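
To put rough numbers on that scaling argument, here's a quick back-of-the-envelope sketch in Python using the KitGuru average clocks quoted above and the ~50% uplift from the reviews (the exact uplift varies by review and game selection):

# Rough shader-throughput scaling estimate: RX 6800 vs RX 5700 XT
cu_5700xt, clk_5700xt = 40, 1781   # CUs, average MHz (KitGuru)
cu_6800,   clk_6800   = 60, 2200   # CUs, average MHz (KitGuru)

theoretical = (cu_6800 * clk_6800) / (cu_5700xt * clk_5700xt)
measured = 1.50                    # ~50% faster at 1440p per the reviews above

print(f"Theoretical throughput ratio: {theoretical:.2f}x")            # ~1.85x
print(f"Measured ratio:               {measured:.2f}x")
print(f"Realized scaling efficiency:  {measured / theoretical:.0%}")  # ~81%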
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
The 6800 has 60 CUs and the full 128 MB of Infinity Cache, yet it only performs roughly 50% faster than the 5700 XT at 1440p according to Computerbase. KitGuru came to a similar conclusion, and TechSpot has similar numbers for the 6800 vs the 5700 XT. According to KitGuru, the 6800 averaged 2200 MHz and the 5700 XT averaged 1781 MHz over a 30-minute run of 3DMark Time Spy.

Therefore, the 6800 has 50% more CUs than the 5700 XT, >20% higher clocks, and the full 128 MB of Infinity Cache, yet it only performed 50% better, which seems like poor scaling. This suggests that if the 6700 XT (Navi 22) clocks to roughly 2200 MHz, it will be on par with a 5700 XT. If it can reliably clock above 2500 MHz, then we could potentially be looking at 5700 XT +20%.

Are you saying the 6700XT, which has the same CU count as the 5700XT, but with a much faster clock, is only going to match it? That makes no sense at all.
 

Saylick

Diamond Member
Sep 10, 2012
3,084
6,184
136
Are you saying the 6700XT, which has the same CU count as the 5700XT, but with a much faster clock, is only going to match it? That makes no sense at all.
I personally don't think it makes any sense either, but the reviews I linked seem to suggest that there is poor scaling; how else would 60 CUs with decently higher clocks and the full 128 MB of IC perform only 50% better than a 5700 XT? It's got better specs in every department.

I must be interpreting the data from the reviews incorrectly or something.
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
Well, it only has a 192-bit bus, right? The IC helps, but I don't think it works miracles; that could explain why it would only match it even with a higher clock.
 

Saylick

Diamond Member
Sep 10, 2012
3,084
6,184
136
Well, it only has a 192-bit bus, right? The IC helps, but I don't think it works miracles; that could explain why it would only match it even with a higher clock.
Allegedly it does, but the IC should make up for it, no? Otherwise, if a 192-bit bus + 64+ MB of IC is still worse than a traditional 256-bit bus, what was the whole point of going with a 256-bit bus and 128 MB of IC instead of a traditional 384-bit bus for Navi 21, which has twice the CU count of Navi 22?
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
Allegedly it does, but the IC should make up for it, no? Otherwise, if a 192-bit bus + 64+ MB of IC is still worse than a traditional 256-bit bus, what was the whole point of going with a 256-bit bus and 128 MB of IC instead of a traditional 384-bit bus for Navi 21, which has twice the CU count of Navi 22?

It's probably cheaper to design, produce, and implement. It makes sense on Navi 21, where going over 256 bits would have been really expensive, but on Navi 22 and 23 it looks like just cost saving to me.
 
Last edited:

Glo.

Diamond Member
Apr 25, 2015
5,658
4,418
136
Allegedly it does, but the IC should make up for it, no? Otherwise, if a 192-bit bus + 64+ MB of IC is still worse than a traditional 256-bit bus, what was the whole point of going with a 256-bit bus and 128 MB of IC instead of a traditional 384-bit bus for Navi 21, which has twice the CU count of Navi 22?
Paul from RGT says that N22 has 96 MB of Infinity Cache.

By that logic we should see 64 MB on Navi 23 (16 MB per memory channel).
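
That logic is just scaling the cache with bus width; a tiny sketch of the arithmetic (note the 96 MB and 64 MB figures for Navi 22/23 are rumours extrapolated from Navi 21, not confirmed specs):

# Infinity Cache scaled at 16 MB per 32-bit memory channel (Navi 21 baseline)
MB_PER_CHANNEL = 128 // (256 // 32)   # 128 MB over eight 32-bit channels = 16 MB

for name, bus_width in [("Navi 21", 256), ("Navi 22", 192), ("Navi 23", 128)]:
    channels = bus_width // 32
    print(f"{name}: {bus_width}-bit bus -> {channels * MB_PER_CHANNEL} MB Infinity Cache")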
 

menhera

Junior Member
Dec 10, 2020
21
66
61
The 6800 has 60 CUs and the full 128 MB of Infinity Cache, yet it only performs roughly 50% faster than the 5700 XT at 1440p according to Computerbase. KitGuru came to a similar conclusion, and TechSpot has similar numbers for the 6800 vs the 5700 XT. According to KitGuru, the 6800 averaged 2200 MHz and the 5700 XT averaged 1781 MHz over a 30-minute run of 3DMark Time Spy.
The 6800 has one entire Shader Engine disabled, hence 3 rasterizers/3 prim units/192 Z/Stencil ROPs, while the 5700 XT enjoys 4/4/256. Also 16Gbps vs 14Gbps on 5700 XT despite 50% higher CU count. IC just mitigates lower bandwidth per CU.

3DMark doesn't matter. My 6800 hardly ever boosts to 2200MHz in actual games, and according to the Anandtech 5700 XT review, its average boost clock is 1823MHz in 9 games.

All things considered, scaling doesn't seem to be poor to me.
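
The bandwidth-per-CU point is easy to quantify. A small sketch using the stock memory specs (14 Gbps GDDR6 on a 256-bit bus for the 5700 XT, 16 Gbps on 256-bit for the 6800), ignoring whatever the Infinity Cache recovers:

# Raw GDDR6 bandwidth per CU, ignoring Infinity Cache hits
def bandwidth_gb_s(data_rate_gbps: float, bus_bits: int) -> float:
    # Total DRAM bandwidth in GB/s: per-pin data rate times bus width in bytes
    return data_rate_gbps * bus_bits / 8

for name, rate, bus, cus in [("RX 5700 XT", 14, 256, 40), ("RX 6800", 16, 256, 60)]:
    bw = bandwidth_gb_s(rate, bus)
    print(f"{name}: {bw:.0f} GB/s total, {bw / cus:.1f} GB/s per CU")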
 

Kuiva maa

Member
May 1, 2014
181
232
116
I'm "concerned" about RDNA2 cards.
AMD really performed a miracle (and they say that the performance per watt with undervolt is unbelievable), but a lot of consumers will keep demanding more and more RT and Image Reconstruction. These cards can deliver both, but contrary to Nvidia's cards AMD's cards will have to give up rasterization resources to do this. It's just too much, and the same consumers will point fingers, "see, AMD cards are bad!".

Not sure what you mean. Nvidia has to allocate a very large portion of its RTX GPU die to dedicated ray tracing hardware, area that would otherwise go to shaders that can execute traditional rasterization workloads. Of course they give up rasterization performance; there is no free lunch for anyone.
 

exquisitechar

Senior member
Apr 18, 2017
655
862
136
The 6800 has one entire Shader Engine disabled, hence 3 rasterizers/3 prim units/192 Z/Stencil ROPs, while the 5700 XT enjoys 4/4/256. Also 16Gbps vs 14Gbps on 5700 XT despite 50% higher CU count. IC just mitigates lower bandwidth per CU.

3DMark doesn't matter. My 6800 hardly ever boosts to 2200MHz in actual games, and according to the Anandtech 5700 XT review, its average boost clock is 1823MHz in 9 games.

All things considered, scaling doesn't seem to be poor to me.
All true, the 6700 XT will be a much more "balanced" design in every way. It will be much closer to the 6800 than simply looking at the number of CUs would make one think.

The only way I can see it being only 20% faster than the 5700 XT is if they are conservative with clock speeds, which isn't going to happen. I don't think it will be $350 either, for sure.
 
  • Like
Reactions: Tlh97 and Mopetar

Panino Manino

Senior member
Jan 28, 2017
813
1,010
136
Not sure what you mean. Nvidia has to allocate a very large portion of its RTX GPU die to dedicated ray tracing hardware, area that would otherwise go to shaders that can execute traditional rasterization workloads. Of course they give up rasterization performance; there is no free lunch for anyone.

That's the point.
Enable RT and rasterization performance drops.
Enable Image Reconstruction and rasterization performance drops.
It'll just "show how inferior AMD is"; they will not like seeing the bars drop more and more (proportionally) in the benchmarks.
 
  • Like
Reactions: Tlh97