Speculation: RDNA3 + CDNA2 Architectures Thread

Page 108

uzzi38

Platinum Member
Oct 16, 2019
2,632
5,959
146

Unreal123

Senior member
Jul 27, 2016
223
71
101
Just pulled the trigger on a Zotac RTX 4090 AMP Extreme AIRO. I waited to see what AMD would offer with RDNA3, but unfortunately it doesn't align with my gaming requirements for my new build.

AMD needs to step up and take ray tracing seriously, because that's the future of gaming graphics.
Do not buy it if you're planning to use it with a 6900K. The RTX 4090 is at times bottlenecked even by a 12600K at 4K.


Better to buy an RTX 3090 Ti at a lower price.
 
  • Like
Reactions: Tlh97 and Carfax83

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
You didn't even wait for third-party reviews; I guess you were just looking for an excuse.

Why bother waiting for reviews when AMD themselves have positioned RDNA3 beneath the RTX 3090 Ti in terms of pricing?

That tells me all I need to know. RDNA3 will likely be much faster than Ampere in terms of pure rasterization, but when you factor in DLSS and ray tracing it will probably be slightly ahead or behind depending on the game.

The RTX 4090 is in a whole other league. I'm not happy about that either, because Nvidia definitely needs a stronger competitor; they've been dominant for too long.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,356
2,848
106
....
So yeah, N33 is looking to be about a 6800 at 1080p and maybe a 6750 XT at 4K, something like that. Still a good uplift vs the 6650 XT; not as great as hoped, but not terrible, especially if it manages that at 120W or so.
You can forget about 120W TBP for full N33.
7nm N23 has a 160-180W TBP, and the cut-down version only 132W.
N33 is on 6nm, and the 5nm 7900 XT, with only 84 CUs and only a 2 GHz game clock, has the same 300W TBP as the higher-clocked 6900 XT.
For a 120W TBP, they would need to lower clocks significantly and sacrifice performance.
Even at the same TBP, it looks like the shader clock will regress; only the front end will be clocked similarly, and even that is not certain.
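For a rough sense of how much clock a 120W budget would cost, a minimal sketch, assuming dynamic power scales with V²·f and voltage scales roughly linearly with clock (so power goes roughly with f³). The ~180W / ~2.6GHz full-N23 reference point is an approximation, and static power and the 7nm-to-6nm change are ignored:

```python
# A rough sketch of the clock/power trade-off argued above. Assumes
# dynamic power ~ V^2 * f with voltage roughly linear in clock, so
# power ~ f^3. The ~180 W / ~2.6 GHz full-N23 reference point is an
# approximation; static power and the 7nm -> 6nm node change are
# ignored, so treat the output as a rough bound, not a prediction.

def clock_at_power(p_target_w, p_ref_w, f_ref_ghz, exponent=3.0):
    """Clock reachable at p_target_w, given a (p_ref_w, f_ref_ghz) point."""
    return f_ref_ghz * (p_target_w / p_ref_w) ** (1.0 / exponent)

print(f"{clock_at_power(120, 180, 2.6):.2f} GHz")  # ~2.27 GHz at 120 W
print(f"{clock_at_power(150, 180, 2.6):.2f} GHz")  # ~2.45 GHz at 150 W
```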
 
  • Like
Reactions: Tlh97 and RnR_au

A///

Diamond Member
Feb 24, 2017
4,352
3,154
136
I found my new graphics card. The 4090 is too expensive and useless for my needs, and the 4080 16GB is useless compared to AMD's offering unless you give a damn about ray tracing, DLSS, and all that other useless eye candy that eats frames. I'm stunned at the price AMD have chosen; my earlier prediction on pricing was way off base. Back to the good land I go.
 
  • Like
Reactions: Zepp and Kaluan

RnR_au

Golden Member
Jun 6, 2021
1,709
4,158
106
You can forget about 120W TBP for full N33.
7nm N23 has a 160-180W TBP, and the cut-down version only 132W.
N33 is on 6nm, and the 5nm 7900 XT, with only 84 CUs and only a 2 GHz game clock, has the same 300W TBP as the higher-clocked 6900 XT.
For a 120W TBP, they would need to lower clocks significantly and sacrifice performance.
Even at the same TBP, it looks like the shader clock will regress; only the front end will be clocked similarly, and even that is not certain.
Some talk on the GCD <-> MCD power costs...

So maybe some decent savings for going monolithic. And I hadn't heard of N33 being as low as 120W. Thought it was rumoured to be 150-200W.
 

Unreal123

Senior member
Jul 27, 2016
223
71
101
It does seem a bit off, but even AD103 is 46B transistors without wasting any on interconnect SerDes.
It has a bottleneck.

There is no way this was planned for three years.

For AMD, as I said, the issue is funds and workforce. Fanboys of any company are delusional if they think capital, R&D, and workforce don't matter. They do, and that is why Nvidia has been ahead of AMD in hardware ever since Turing.

Nvidia's software has stayed largely the same since 2007, but Nvidia invested heavily in hardware, and it is paying off now.

Many rumours in this forum claimed:

1) RDNA3 clock speeds would touch 4 GHz; another disappointment.

2) Ray tracing would be 2x over RDNA2, when the uplift is only 0.5x at most.

3) Raster performance would be faster than the RTX 4090; it is now all but confirmed to be at least 10-15% slower even in raster.

AMD tries too hard to take shots at the competition, but AMD forgets that the Nvidia ecosystem user who buys an RTX 4090 does not care about value; they only care that they have the best card on the market.

For example, the Steam survey and EU sales figures show that the RTX 3080 alone sold more than AMD's entire RDNA2 lineup.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,356
2,848
106
I found my new graphics card. The 4090 is too expensive and useless for my needs, and the 4080 16GB is useless compared to AMD's offering unless you give a damn about ray tracing, DLSS, and all that other useless eye candy that eats frames. I'm stunned at the price AMD have chosen; my earlier prediction on pricing was way off base. Back to the good land I go.
Ray tracing is useless eye candy? Because it eats up performance?
Will you set everything to low quality, because higher settings also eat up performance?
It looks like even that "useless" 4080 16GB will have higher RT performance than the 7900 XTX.
[Chart: Cyberpunk 2077 RT, 3840×2160]


You are stunned about the price? It's not like they could have set a much higher price with such weak RT performance. In heavy RT games, it will be on par with Ampere at best.
Weak RT performance was forgivable at the RDNA2 launch, but not now.
I wonder how long some of you will ignore RT just because AMD is weak at it.
I am not so sure RDNA3 is such a great deal.

2) Ray tracing would be 2x over RDNA2, when the uplift is only 0.5x at most.
It's 1.5x per CU. So the RX 7900 XTX should be ~80% better at best.
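The arithmetic behind that ~80% figure, assuming the commonly cited CU counts (96 for the 7900 XTX vs 80 for the 6950 XT) and equal clocks:

```python
# The ~80% figure: 1.5x RT throughput per CU, times the CU count
# increase. The CU counts (96 vs 80) are the commonly cited figures;
# clocks are assumed equal for simplicity.
per_cu_uplift = 1.5
cu_ratio = 96 / 80                          # 1.2x more CUs
print(f"{per_cu_uplift * cu_ratio:.2f}x")   # 1.80x -> ~80% better at best
```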
 

RnR_au

Golden Member
Jun 6, 2021
1,709
4,158
106
3) Raster performance would be faster than the RTX 4090; it is now all but confirmed to be at least 10-15% slower even in raster.
Just on this...


60% price premium on a ~6% raster premium?

But bench for waitmarks.

Some people are saying ray tracing is useless, damn. Just wait for next-gen graphics engines, which will have built-in ray tracing.
So the consoles will be useless on the next-gen graphics engines?
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,356
2,848
106
Just on this...


60% price premium on a ~6% raster premium?

But bench for waitmarks.
So you will buy a $999 card and then disable RT on it, so you can happily say what a great deal it was compared to the competition?
If I wanted to pay so much for a card just to play games, then I certainly want the best eye candy possible, not to deactivate RT because my card performs like the previous generation from the competition.

So the consoles will be useless on the next-gen graphics engines?
They will heavily cut down on RT effects for console versions. No one will care, because consoles are meant for budget gaming.
 

Gideon

Golden Member
Nov 27, 2007
1,641
3,678
136
For some reason, AMD omitted to mention the 2 AI cores per CU.

Because there are no AI cores; there are just new instructions (there's a longer discussion about that on the Beyond3D forums). It's quite deceptive marketing, unfortunately.

And this slide is actually somewhat worse than I thought
[Slide 43: AMD RDNA 3 Tech Day press deck]


AnandTech has the full slide deck, and it's actually running in FSR Performance mode (FSR 1.0 in the case of CP2077):

[Slide 75: AMD RDNA 3 Tech Day press deck]

And guess what? This is from an early Tom's Hardware review of CP2077 from 2020. Let's not forget AMD is running FSR 1.0, so with the upcoming FSR 2.0 support AMD will probably end up about where the RTX 3090 (and RTX 3080 Ti) are.

[Chart: CP2077 ray tracing benchmarks, Tom's Hardware, 2020]


Here one can see the performance difference between FSR 1.0 and FSR 2.0 from the community patch. It's small in quality mode, but very substantial in performance mode:

All in all it's quite clear that in demanding RT titles it won't even reach top Ampere performance.
 

Timorous

Golden Member
Oct 27, 2008
1,615
2,772
136
So you will buy a $999 card and then disable RT on It, so you can be happily say what a great deal It was compared to the competition?
If I wanted to pay so much for a card just to play some games, then I certainly want to have the best eye candy possible and not deactivate RT because my card performs as the previous generation from the competition.


They will heavily cutdown on RT effects for console versions. No one will care, because consoles are meant for budget gaming.

If you actually slow down and crunch the numbers, the perf/$ for RT is looking similar between NV and AMD.

Yes, NV has the absolute best RT performance with the 4090, but the 4080 16GB comes with a 20% price premium over the XTX and will maybe offer around 20% more RT performance, with a lot less raster performance.

Who knows what NV will do with AD104 now. A 12GB 4070 Ti at $800 might offer slightly better RT than the 7900 XT, but the raster will be far worse. Full AD104 also has less than half the RT cores of the 4090, so there is no guarantee AD103 and below will show the same relative RT performance as the 4090, given how cut down they are.
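Putting rough numbers on that perf/$ point, a sketch using the two MSRPs and the ~20% RT gap assumed above (thread estimates, not benchmark results):

```python
# Rough RT perf-per-dollar comparison, using this thread's assumptions:
# the 7900 XTX as the 1.0 baseline and the 4080 16GB ~20% ahead in RT.
cards = {
    "RX 7900 XTX":   (999,  1.00),
    "RTX 4080 16GB": (1199, 1.20),
}
for name, (price, rt_perf) in cards.items():
    print(f"{name}: {rt_perf / price * 1000:.2f} RT perf per $1000")
# Both land at ~1.00, which is the 'similar perf/$' point being made.
```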
 
  • Like
Reactions: Elfear and Joe NYC

insertcarehere

Senior member
Jan 17, 2013
639
607
136
Just on this...


60% price premium on a ~6% raster premium?

But bench for waitmarks.


So the consoles will be useless on the next gen graphics engines?

It's clearly enough of a gap that AMD didn't want to invite the comparison with the RTX 4090, unlike the RDNA2 launch, where they directly compared to the RTX 3080/3090.
[Slides: AMD RDNA2 launch deck comparing the RX 6800 series to the RTX 3080/3090]
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,356
2,848
106
If you actually slow down and crunch the numbers, the perf/$ for RT is looking similar between NV and AMD.

Yes, NV has the absolute best RT performance with the 4090, but the 4080 16GB comes with a 20% price premium over the XTX and will maybe offer around 20% more RT performance, with a lot less raster performance.
That is actually even worse for AMD.
Both the RTX 4080 16GB and the RTX 4090 are criticized for their very high cost and Nvidia being greedy, and now it turns out Nvidia wasn't so greedy with these cards, because they have similar perf/$ in RT-heavy games to the top AMD product.
 
  • Like
Reactions: Carfax83

A///

Diamond Member
Feb 24, 2017
4,352
3,154
136
Ray tracing is useless eye candy? Because it eats up performance?
Will you set everything to low quality, because higher settings also eat up performance?
It looks like even that "useless" 4080 16GB will have higher RT performance than the 7900 XTX.
[Chart: Cyberpunk 2077 RT, 3840×2160]


You are stunned about the price? It's not like they could have set a much higher price with such weak RT performance. In heavy RT games, it will be on par with Ampere at best.
Weak RT performance was forgivable at the RDNA2 launch, but not now.
I wonder how long some of you will ignore RT just because AMD is weak at it.
I am not so sure RDNA3 is such a great deal.


It's 1.5x per CU. So the RX 7900 XTX should be ~80% better at best.
For older people with bad eyesight and corrective lenses, it doesn't make a huge difference. My eyes aren't the same as a 20-year-old's, or even a 30- or 40-year-old's. I can make out major details in RT scenes; anything beyond that my eyes won't pick up on. I want good graphics with good frames. I'll leave stellar graphics and fewer frames to the young.
 

RnR_au

Golden Member
Jun 6, 2021
1,709
4,158
106
That is actually even worse for AMD.
Both the RTX 4080 16GB and the RTX 4090 are criticized for their very high cost and Nvidia being greedy, and now it turns out Nvidia wasn't so greedy with these cards, because they have similar perf/$ in RT-heavy games to the top AMD product.
I have zero interest in RT. It holds as much interest to me as fake frames.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
It looks to me like we will have the following:

RX 7900 XTX vs RTX 4080 16GB

Price (MSRP): $999 vs $1199 = 17% lower
vRAM: 24GB vs 16GB = 50% more
Raster performance (4K): 25-30% higher
Ray tracing performance (4K): 25-30% slower
TDP: 355W vs 320W

Which one would you choose?
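As a quick check, the two ratio claims in that list follow directly from the listed MSRPs and memory sizes:

```python
# Quick check of the ratio claims above, from the listed MSRPs and vRAM.
price_xtx, price_4080 = 999, 1199
vram_xtx, vram_4080 = 24, 16
print(f"price: {(1 - price_xtx / price_4080) * 100:.0f}% lower")  # ~17% lower
print(f"vRAM: {(vram_xtx / vram_4080 - 1) * 100:.0f}% more")      # 50% more
```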
 

leoneazzurro

Senior member
Jul 26, 2016
930
1,461
136
Idk man, it looks like a fine card, but compared to what people in this thread and elsewhere have been saying as recently as last week, this is absolutely a letdown...

Eh, I was reporting the leaker's tweets about the frequencies, not some personal guesses or information.

My own guesses came from what AMD declared about the perf/W increase, with the idea that they could sandbag like in the RDNA2/Zen 3/Zen 4 cases. RDNA2 was also declared to be 50% and ended up at 64% in AMD's own numbers. Since we also had a process node improvement this time (which we did not have with the RDNA -> RDNA2 change), I went with 70%, which would have put raster performance at 2x the 6950 XT at 375W. It turned out that this time they weren't sandbagging much.
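The arithmetic behind that 2x guess, as a sketch: performance scales as perf/W times power, so the perf/W uplift and the TBP increase multiply. The 6950 XT's 335W TBP is my reference assumption here:

```python
# Sandbagging math: performance = perf/W * power, so a perf/W uplift
# and a TBP increase multiply. The 6950 XT's 335 W TBP is an assumed
# reference point for illustration.
print(f"{1.70 * 375 / 335:.2f}x")  # guessed 70% perf/W at a rumoured 375 W -> ~1.90x
print(f"{1.54 * 355 / 335:.2f}x")  # AMD's claimed up-to-54% at the actual 355 W -> ~1.63x
```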
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,356
2,848
106
For older people with bad eyesight and corrective lenses, it doesn't make a huge difference. My eyes aren't the same as a 20-year-old's, or even a 30- or 40-year-old's. I can make out major details in RT scenes; anything beyond that my eyes won't pick up on. I want good graphics with good frames. I'll leave stellar graphics and fewer frames to the young.
Fair enough. Maybe in your case I would wait for N32, but who knows when they will release it and at what price.

I have zero interest in RT. It holds as much interest to me as fake frames.
Just because you have no interest in RT doesn't mean RDNA3 is actually a fantastic product. Maybe for some it is, but as a complete product it is not.
BTW, those fake frames are coming to RDNA3, too. :p