Question Speculation: RDNA3 + CDNA2 Architectures Thread


uzzi38

Platinum Member
Oct 16, 2019
2,624
5,889
146

Timorous

Golden Member
Oct 27, 2008
1,608
2,753
136
So that slide has the 7900XTX 54% faster than the 6950XT in raster which is what I was using when comparing for guessing perf/$ and where it stacks up.

That chart also has the 7900XT as 30% ahead of the 6950XT.

For $900 the 7900XT is overpriced relative to the performance loss. However, vs the 4080 it should be pretty much a match in raster give or take a few %. So for $300 less you get the same raster but worse RT.

I also suspect the 7800XT will come in about 15% ahead of the 6950XT maybe even a little more and AMD may phase out the 7900XT entirely.
 

Kaluan

Senior member
Jan 4, 2022
500
1,071
96
So that slide has the 7900XTX 54% faster than the 6950XT in raster which is what I was using when comparing for guessing perf/$ and where it stacks up.

That chart also has the 7900XT as 30% ahead of the 6950XT.

For $900 the 7900XT is overpriced relative to the performance loss. However, vs the 4080 it should be pretty much a match in raster give or take a few %. So for $300 less you get the same raster but worse RT.

I also suspect the 7800XT will come in about 15% ahead of the 6950XT maybe even a little more and AMD may phase out the 7900XT entirely.
Did the math (based on the limited set of first party data tho), and 7900XTX is faster than 7900XT by:

19.3% in 4K raster
15% in 4K RT/hybrid
17.2% in 4K mixed

Yup, looks like AMD is definitely trying to upsell the 7900XTX via the XT. I also expect a much lower volume of 7900 XT cards on the market, a-la RX 6800. They probably have really good yields and BOM costs on N31 designs. May just EOL the XT by the time RDNA3 refreshes come in.

On the flipside, the XTX looks really good efficiency- and TFLOP-wise relative to TBP: near-linear performance scaling with the TBP and TFLOP increases (300 to 355 W, 52 to 61 TFLOPs).
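A quick sanity check of that "near linear" claim, using the figures as quoted (treating 300/355 as watts and 52/61 as TFLOPs; a rough sketch, not official specs):

```python
# Near-linear scaling check: TBP vs TFLOP increase, XT -> XTX,
# using the figures quoted above (300 -> 355 W, 52 -> 61 TFLOPs).
tbp_ratio = 355 / 300
tflop_ratio = 61 / 52
print(f"TBP x{tbp_ratio:.3f}, TFLOPs x{tflop_ratio:.3f}")  # ~1.183 vs ~1.173
```

The two ratios land within a percent of each other, which is what "near linear" means here.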

Guess this could mean we can pretty much predict N32 performance if we know the TFLOPs (tho it might not be as good at 4K as N31, the claims of only 3 SE complicate things IMHO).
I kinda expect 7800XT to be closer to 7900XT in performance than 7900XT to 7900XTX.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,117
136
I didn’t see this one posted here. Found on Reddit. It has the 7900XT benchmarks.

- Really doesn't look like there is a whole lot of room for the 7800XT/7800 at this point. Those ought to come in right at the 6950XT level +/- a few %.

Which then makes me wonder if we're going to see an N33 closer to 6800(non-XT) or 6750XT performance levels rather than the rumored 6900XT levels.
 
  • Like
Reactions: Tlh97 and maddie

Timorous

Golden Member
Oct 27, 2008
1,608
2,753
136
- Really doesn't look like there is a whole lot of room for the 7800XT/7800 at this point. Those ought to come in right at the 6950XT level +/- a few %.

Which then makes me wonder if we're going to see an N33 closer to 6800(non-XT) or 6750XT performance levels rather than the rumored 6900XT levels.

Not sure that is true.

A 7800XT that is +20% vs the 6950XT might seem daft, but the BOM cost difference is pretty large, so AMD could offer that at $650ish and then make the 7900XT super rare. Literally only make them from failed/defective dies or from XTXs where an MCD bond failed, and use it as a gap-filler product where the broken XTXs get sent.

AMD did this with the 5600XT, which was rather close to 5700-tier performance in a lot of games, but that was using an even more cut-down N10.

Then a cut-down N32 in the 7700XT could come in around 6950XT performance +/- a bit. I do think N33 will be around 6800 levels now though.
 

DiogoDX

Senior member
Oct 11, 2012
746
277
136
Well, reference with its ~366W of usable sustained power certainly won't go too far, but customs with up to ~510W may well surprise. I hope we have both reference and custom on review/launch day. All custom AIB listings have [redacted] in the clock specifications. Not sure if this is common behavior in pre-launch listings.


The "up to 1.8-1.84x" claim is weird, since even going by their own claims/slides, we can clearly see a "2x" case in Dying Light 2:
View attachment 71150
View attachment 71151
View attachment 71149

At least in this title ("heavy RT"?), RX 7900 may be quite a bit faster than your 3090.
The lower than expected clocks and the weak RT improvement really killed the performance; I was expecting this to be a good upgrade over my 6800XT. A 7800XT with N32 will be like the 6950XT +10%, and that is far from a worthwhile upgrade.

Looks like the only cards that make sense are the 7900XTX and 4090, but both are above my $700-$800 limit.

The $1200 4080 will be a joke but will outsell all AMD 7000 cards combined, so I don't think Nvidia will care much about what the reviews say.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,355
2,848
106
Performance of N32 or N33 will depend on actual clockspeed.

RX 6650 XT: 10,793 GFLOPs
RX 6950 XT: 23,654 GFLOPs

Comparable GFLOPs to RDNA2 could (or should) look like this:

N31(2.5GHz): 96CU*2FLOP*128SP/2*1.174Arch*2.5GHz = 36,065 GFLOPs or 52.5% more than RX 6950XT
N31(+30% over N21): 96CU*2FLOP*128SP/2*1.174Arch*3GHz = 43,278 GFLOPs or 83% over 6950XT

N32(3GHz): 60CU*2FLOP*128SP/2*1.174Arch*3GHz = 27,050 GFLOPs or 14% more than RX 6950XT
N32(+30% over N22): 60CU*2FLOP*128SP/2*1.174Arch*3.380GHz = 30,475 GFLOPs or 28.8% over 6950XT

N33(3GHz): 32CU*2FLOP*128SP/2*1.174Arch*3GHz = 14,426 GFLOPs or 33.7% more than RX 6650XT
N33(+30% over N23): 32CU*2FLOP*128SP/2*1.174Arch*3.425GHz = 16,470 GFLOPs or 52.6% over 6650XT
Keep in mind that N33 has a weaker WGP than N31(N32) so the performance gain will be smaller than what I calculated.
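The formula above as a quick sketch (the /2 for the dual-issue FP32 doubling and the flat 1.174 architectural factor are the assumptions from my calculations, not official figures):

```python
# RDNA2-comparable GFLOPs estimate, per the formula above.
# Assumptions: /2 discounts the dual-issue FP32 doubling, and the
# ~17.4% architectural (IPC) gain is folded in as a flat 1.174 factor.
def effective_gflops(cu: int, clock_ghz: float, arch: float = 1.174) -> float:
    return cu * 2 * 128 / 2 * arch * clock_ghz

for name, cu, ghz, baseline in [
    ("N31 @2.5GHz", 96, 2.5, 23654),   # baseline: RX 6950 XT GFLOPs
    ("N32 @3.0GHz", 60, 3.0, 23654),
    ("N33 @3.0GHz", 32, 3.0, 10793),   # baseline: RX 6650 XT GFLOPs
]:
    g = effective_gflops(cu, ghz)
    print(f"{name}: {g:,.0f} GFLOPs, +{(g / baseline - 1) * 100:.1f}% vs baseline")
```

Plugging in the other clock targets from the list reproduces the same numbers, give or take rounding.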
 
Last edited:

Saylick

Diamond Member
Sep 10, 2012
3,127
6,302
136
The lower than expected clocks and the weak RT improvement really killed the performance; I was expecting this to be a good upgrade over my 6800XT. A 7800XT with N32 will be like the 6950XT +10%, and that is far from a worthwhile upgrade.

Looks like the only cards that make sense are the 7900XTX and 4090, but both are above my $700-$800 limit.

The $1200 4080 will be a joke but will outsell all AMD 7000 cards combined, so I don't think Nvidia will care much about what the reviews say.
FWIW, even if the performance is a little lackluster relative to the rumors, if AMD were able to clock the 7900XTX and 7900XT another 10% higher at the same TDP, they would have likely charged 10% more as well, so it's not like we'd be getting more bang/$ if it performed better out of the gate.
 
  • Like
Reactions: Tlh97 and Yosar

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,117
136
Performance of N32 or N33 will depend on actual clockspeed.

RX 6650 XT: 10,793 GFLOPs
RX 6950 XT: 23,654 GFLOPs

Comparable GFLOPs to RDNA2 could (or should) look like this:

N31(2.5GHz): 96CU*2FLOP*128SP/2*1.174Arch*2.5GHz = 36,065 GFLOPs or 52.5% more than RX 6950XT
N31(+30% over N21): 96CU*2FLOP*128SP/2*1.174Arch*3GHz = 43,278 GFLOPs or 83% over 6950XT

N32(3GHz): 60CU*2FLOP*128SP/2*1.174Arch*3GHz = 27,050 GFLOPs or 14% more than RX 6950XT
N32(+30% over N22): 60CU*2FLOP*128SP/2*1.174Arch*3.380GHz = 30,475 GFLOPs or 28.8% over 6950XT

N33(3GHz): 32CU*2FLOP*128SP/2*1.174Arch*3GHz = 14,426 GFLOPs or 33.7% more than RX 6650XT
N33(+30% over N23): 32CU*2FLOP*128SP/2*1.174Arch*3.425GHz = 16,470 GFLOPs or 52.6% over 6650XT
Keep in mind that N33 has a weaker WGP than N31(N32) so the performance gain will be smaller than what I calculated.

- So basically if you get rid of the goofy dual-pumped FP32 units but add in the IPC gain, the FLOPs are basically perfectly in line with the performance increase for N31 vs N21 (52% increase in FLOPs for 54% increase in performance)?
 
  • Like
Reactions: Tlh97 and Leeea

TESKATLIPOKA

Platinum Member
May 1, 2020
2,355
2,848
106
- So basically if you get rid of the goofy dual-pumped FP32 units but add in the IPC gain, the FLOPs are basically perfectly in line with the performance increase for N31 vs N21 (52% increase in FLOPs for 54% increase in performance)?
Maybe. It looks pretty close to it, but I wouldn't bet on it, especially with higher clocks where the scaling won't be linear. If nothing else, it looks good on paper. :)
 

desrever

Member
Nov 6, 2021
110
267
106
Did the math (based on the limited set of first party data tho), and 7900XTX is faster than 7900XT by:

19.3% in 4K raster
15% in 4K RT/hybrid
17.2% in 4K mixed

Yup, looks like AMD is definitely trying to upsell the 7900XTX via the XT. I also expect a much lower volume of 7900 XT cards on the market, a-la RX 6800. They probably have really good yields and BOM costs on N31 designs. May just EOL the XT by the time RDNA3 refreshes come in.
I think the 7900XT will OC close to the 7900XTX. The clocks seem artificially low on the 7900XT to give spacing to the XTX.

This looks very similar to the 7970 vs 7950 situation from way back: the paper difference is big, but both hit about the same clocks with an OC, and the higher-end card's advantage wasn't significant when OC'd.
 

Kaluan

Senior member
Jan 4, 2022
500
1,071
96
The +30% frequency slide seems to indicate that that happens when you give up any efficiency gain. So if you want the +54% performance/watt increase you have to back off on frequency.

Just FYI, as there seems to be a lot of confusion about how AMD measured that claim (or worse, no one asking):
AMD has clarified the 1.5x/54% better performance/watt claim comes from measuring performance (in unspecified titles) on a Windows 10, 5900X system. 7900XTX vs 6900XT, both cards set to a limit of 300W TBP.

So while you may say that the 7900XTX @300W TBP clocks lower, so would the 6950XT. The question is by how much? We already know Navi21 "XTX" doesn't gain much past 300-335W; the scaling gets terrible.

I think the 7900XT will OC close to the 7900XTX. The clocks seem artificially low on the 7900XT to give spacing to the XTX.

This looks very similar to the 7970 vs 7950 situation from way back: the paper difference is big, but both hit about the same clocks with an OC, and the higher-end card's advantage wasn't significant when OC'd.
A 17%-ish performance advantage is kinda hard to OC through, but you never know.

Boost-wise the clock difference doesn't look that big (2.4 to 2.5GHz), but the gap between them in base/game clock as well as in-game clocks should be much bigger.
 
Last edited:

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,400
2,437
146
So do we know yet if there is actually a bug with higher clock speeds (like 3 GHz), or did AMD just back off the clocks for power saving reasons?
 

tajoh111

Senior member
Mar 28, 2005
298
312
136
One more variable for performance improvement is the actual game clock of the 6950xt being used as a reference point.

AMD specifies the game clock of the 6950xt to be 2.1GHz, but actual testing shows the average clock to be about 2.4GHz on the reference design across 23 games (using TechPowerUp's review).

If AMD is downclocking the 6950xt to an actual game clock of 2.1GHz, which is quite possible with the 300W testing environment vs the actual 340W, you might have to subtract about 10% performance from the performance claims if we want to use current reviews to extrapolate performance.

If we do that, we actually arrive at AMD's claimed 1.54x performance increase.

This is 2.3GHz / 2.1GHz = 1.095
1.095 x 1.175 (IPC increase) x 1.2 (20 percent more CUs) = 1.544, which is basically what AMD said the performance increase is.
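That arithmetic as a quick sketch (the 2.1GHz throttled baseline, the 17.5% IPC gain and the 20% CU increase are the assumptions laid out above, not confirmed figures):

```python
# Gen-on-gen scaling estimate from the factors laid out above.
clock_gain = 2.3 / 2.1   # assumed game clock vs downclocked 6950xt baseline
ipc_gain = 1.175         # assumed IPC increase
cu_gain = 96 / 80        # 20% more CUs

print(round(clock_gain * ipc_gain * cu_gain, 3))  # ~1.544, AMD's claimed uplift
```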

However, if we consider this, it means the card AMD used for the gen-on-gen comparison is clocked 14% lower than in reviews, which should lead to about a 10% loss in performance, putting it about where the 6900xt sits.

If we are actually getting around a 1.54x increase in performance vs a 6900xt, I could see why AMD wants to compare this to the RTX 4080 16GB. Using computerbase.de results we could see the 7900XTX losing to the RTX 4090 by 25%, or by 22% using TechPowerUp's results corrected with their updated CPU game comparisons using faster processors. This is not a small amount, and AMD would not want to post graphs like this even if the price difference is $600.

If the RTX 4080 ends up about 20% faster than an RTX 3090 Ti, this leads to the 7900XTX being about 7 to 15 percent faster, whether we use TechPowerUp's or computerbase.de's review. Add the ray tracing advantage of Nvidia cards and all of a sudden the pricing of the 7900XTX makes sense, along with why AMD is targeting the RTX 4080.
 
  • Like
Reactions: Leeea and Joe NYC

uzzi38

Platinum Member
Oct 16, 2019
2,624
5,889
146
From where did you get these slides? Are they legit?

The frontend in N31 runs at 2.5GHz, but shaders are only at 2.3GHz.
Officially, the game clock is 2.3GHz, and considering they mention +18% for game clock in that last slide, I think the slide below about shader and frontend frequency was talking about game clock and not boost clock.
Gd6Y6vy8KK7AJbwW.jpg

Yes, if they were talking about +15% for boost, then against the RX 6950XT at least the frontend should be clocked at ~2655MHz.
Boost clock for shaders is likely ~2.5GHz, and then 61.6 TFLOPs is correct.

If I applied +30% to the existing RDNA2 models, then it would look like this.
RX 6950 XT: 2310 MHz -> 3003 MHz
RX 6750 XT: 2600 MHz -> 3380 MHz
RX 6650 XT: 2635 MHz -> 3426 MHz
RX 6500 XT: 2815 MHz -> 3660 MHz
N31 is nowhere near 3GHz even if I look only at the frontend frequency.
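That +30% is just a flat multiplier on the RDNA2 reference game clocks; as a trivial sketch:

```python
# +30% applied to RDNA2 reference game clocks (MHz), per the list above.
rdna2_game_clocks_mhz = {
    "RX 6950 XT": 2310,
    "RX 6750 XT": 2600,
    "RX 6650 XT": 2635,
    "RX 6500 XT": 2815,
}
for card, mhz in rdna2_game_clocks_mhz.items():
    print(f"{card}: {mhz} MHz -> {mhz * 1.3:.0f} MHz")
```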
Let's wait and see what Phoenix and N32 do :)
 
  • Like
Reactions: Joe NYC

Timorous

Golden Member
Oct 27, 2008
1,608
2,753
136
One more variable for performance improvement is the actual game clock of the 6950xt being used as a reference point.

AMD specifies the game clock of the 6950xt to be 2.1GHz, but actual testing shows the average clock to be about 2.4GHz on the reference design across 23 games (using TechPowerUp's review).

If AMD is downclocking the 6950xt to an actual game clock of 2.1GHz, which is quite possible with the 300W testing environment vs the actual 340W, you might have to subtract about 10% performance from the performance claims if we want to use current reviews to extrapolate performance.

If we do that, we actually arrive at AMD's claimed 1.54x performance increase.

This is 2.3GHz / 2.1GHz = 1.095
1.095 x 1.175 (IPC increase) x 1.2 (20 percent more CUs) = 1.544, which is basically what AMD said the performance increase is.

However, if we consider this, it means the card AMD used for the gen-on-gen comparison is clocked 14% lower than in reviews, which should lead to about a 10% loss in performance, putting it about where the 6900xt sits.

If we are actually getting around a 1.54x increase in performance vs a 6900xt, I could see why AMD wants to compare this to the RTX 4080 16GB. Using computerbase.de results we could see the 7900XTX losing to the RTX 4090 by 25%, or by 22% using TechPowerUp's results corrected with their updated CPU game comparisons using faster processors. This is not a small amount, and AMD would not want to post graphs like this even if the price difference is $600.

If the RTX 4080 ends up about 20% faster than an RTX 3090 Ti, this leads to the 7900XTX being about 7 to 15 percent faster, whether we use TechPowerUp's or computerbase.de's review. Add the ray tracing advantage of Nvidia cards and all of a sudden the pricing of the 7900XTX makes sense, along with why AMD is targeting the RTX 4080.

The 300W environment was with a 6900XT not a 6950XT.
 
  • Like
Reactions: Tlh97 and Kaluan

Timorous

Golden Member
Oct 27, 2008
1,608
2,753
136
I think the 7900XT will OCs close to the 7900XTX. The clocks seem artificially low on the 7900XT to give spacing to the XTX.

This looks very similar to the 7970 vs 7950 situation from way back, the paper difference is big but both hit about the same clocks with OC and the advantage of the higher end card wasn't significant when OC'ed.

I think AMD will lock that down in the BIOS / drivers, so you will need something like MPT to increase the power limits enough to get good OCs.

I do expect a small number of 7900XTs will actually be XTX capable dies but the MCD bond failed. If you can get one of those then in theory you could run at XTX clocks or even higher since it will have fewer shaders active. I don't see AMD ignoring that possibility.

EDIT: On further reflection perhaps that is the plan. The XTX is for those who want guaranteed performance stock, the 7900XT is for those who want to play the lottery and see if they can get a banger of an OC card for a bit less than the XTX. For those who do get the XTX binned die with a failed MCD bond they might be able to get to 95% of XTX performance at 90% of the price so the value proposition is there for a subset of users.

For the people who only look at stock performance, the 7900XTX and the 7800XT will probably be better value purchases, but the 7900XT will have a market, albeit a much smaller one, which is precisely what AMD probably wants. If the yields of XTX-binned dies were lower, or the MCD bonding was less successful, or binning was harder, then I expect the 7900XT would be priced lower due to a larger supply of non-XTX-viable dies.
 
Last edited:

eek2121

Platinum Member
Aug 2, 2005
2,930
4,026
136
I think AMD will lock that down in the BIOS / drivers, so you will need something like MPT to increase the power limits enough to get good OCs.

I do expect a small number of 7900XTs will actually be XTX capable dies but the MCD bond failed. If you can get one of those then in theory you could run at XTX clocks or even higher since it will have fewer shaders active. I don't see AMD ignoring that possibility.

EDIT: On further reflection perhaps that is the plan. The XTX is for those who want guaranteed performance stock, the 7900XT is for those who want to play the lottery and see if they can get a banger of an OC card for a bit less than the XTX. For those who do get the XTX binned die with a failed MCD bond they might be able to get to 95% of XTX performance at 90% of the price so the value proposition is there for a subset of users.

For the people who only look at stock performance, the 7900XTX and the 7800XT will probably be better value purchases, but the 7900XT will have a market, albeit a much smaller one, which is precisely what AMD probably wants. If the yields of XTX-binned dies were lower, or the MCD bonding was less successful, or binning was harder, then I expect the 7900XT would be priced lower due to a larger supply of non-XTX-viable dies.

Or the much simpler answer: the 7900XT was supposed to compete against the “unlaunched” 4080 model.
 
  • Like
Reactions: Tlh97 and Joe NYC

Timorous

Golden Member
Oct 27, 2008
1,608
2,753
136
Or the much simpler answer: the 7900XT was supposed to compete against the “unlaunched” 4080 model.

In Raster it looks as though it will compete against the 4080.

In any event, though, the performance delta does not line up with the price delta. I guess AMD may have decided to charge more last minute due to the 4080 12G unlaunch. Perhaps the original plan was $800, which would have put the XTX 25% more expensive for about that level of performance increase, but without a $900 4080 12G on the market AMD didn't see the need to do this at launch.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
Just to get a rough idea on RT performance of the RX7900XTX

I'll use the AMD slide and KitGuru's RTX 4080 review, since its RX 6950XT fps are similar to AMD's own slides.

Unfortunately the review only has Cyberpunk 2077 and Resident Evil Village.


Cyberpunk 2077, 4K ray tracing (no DLSS/FSR):

4080 = 28.1fps
7900XTX = 21fps
6950XT = 13fps

Resident Evil Village, 4K ray tracing (no DLSS/FSR):

4080 = 138.4fps
7900XTX = 135fps
6950XT = 92fps

RADEON-RX-7900-2.jpg


Cyber-DXR3-768x768.png


REVDXR3-768x768.png