Question Speculation: RDNA3 + CDNA2 Architectures Thread


uzzi38

Platinum Member
Oct 16, 2019
2,705
6,427
146

Timorous

Golden Member
Oct 27, 2008
1,748
3,240
136
Gaming frequency is not even better than mobile N22. Link
Boost frequency for the RX 7600M XT is ~2612 MHz, based on its single-precision TFLOPS.

7600M XT vs 6650M XT, both at 120W.

Game clock sees an increase of 6.4% and boost clock one of 7.3%.

Not great for N33. Even with linear scaling and a 17% IPC gain (which has not been independently confirmed, so this is a true best case), the 7600M XT should be at most ~26% faster than the 6650M XT @ 120W. Reality is probably lower than that, so maybe a 20% increase.

Comparing the 7700S to the 6800S shows an 11.4% increase in game clock and an 11.2% increase in boost clock at up to 100W, which suggests the V/F curve on N33 is steeper than N23's. That is not ideal for a desktop part, so I don't expect desktop N33 for a long while.
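The back-of-envelope scaling here can be sketched in a few lines; the 17% IPC uplift is AMD's claimed figure rather than a confirmed one, and linear clock scaling is a best-case assumption.

```python
# Best-case uplift estimate for the 7600M XT over the 6650M XT at 120 W.
# Assumes performance scales linearly with clock, and uses AMD's claimed
# (unconfirmed) 17% IPC gain, so this is an upper bound, not a prediction.

boost_clock_gain = 1.073  # ~7.3% higher boost clock
claimed_ipc_gain = 1.17   # AMD's RDNA3 IPC claim, best case

best_case = boost_clock_gain * claimed_ipc_gain
print(f"best-case uplift: {(best_case - 1) * 100:.1f}%")  # prints 25.5%
```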
 

Timorous

Golden Member
Oct 27, 2008
1,748
3,240
136
One performance slide is missing, or at least I couldn't find it yet. It's about the 7600M XT vs the RTX 3060M 6GB (RM-103).

The "best" comparison is slide RM-102, because they use a Zen 3 5600X and 16GB DDR4-3600 for both systems, but of course they had to compare against the 6600M, which has only 28 CUs, a 2177 MHz game frequency, and up to 100W.

I made a comparison to real laptops with an RX 6800M or RX 6850M XT. I had to put scores for both Ultra and High settings, because there is some discrepancy with AMD's scores; I think in some games AMD didn't really set MAX settings for the tested GPUs.

Sources:
- Lenovo Legion 5: R5 5600H + RX 6600M (Link)
- Asus ROG Strix G15: 5800HX + RX 6800M (Link), or Corsair Voyager a1600: 6900HS + RX 6800M (Link)
- Lenovo Legion 7: 6900HX + RX 6850M XT (Link)
- AMD slide RM-102: R5 5600X + RX 6600M
- AMD slide RM-102: R5 5600X + RX 7600M XT

| Game | RX 6600M laptop | RX 6800M laptop | RX 6850M XT laptop | RM-102 RX 6600M | RM-102 RX 7600M XT |
|---|---|---|---|---|---|
| Control | High: 69.7 FPS | — | — | 64 FPS | 89 FPS |
| Cyberpunk 2077 | Ultra: 60.7 FPS (High: 75.1) | — | Ultra: 87.2 FPS (High: 106) | 67 FPS | 87 FPS |
| Horizon Zero Dawn | — | — | Ultra: 123 FPS (High: 142) | 92 FPS | 117 FPS |
| Resident Evil Village | Ultra: 146 FPS (High: 169) (CPU: R7 5800H) | Ultra: 91.9 FPS (High: 134) | — | 143 FPS | 176 FPS |
| Shadow of the Tomb Raider | Ultra: 105 FPS (High: 114) | Ultra: 106 FPS (High: 113) | — | 120 FPS | 142 FPS |
| Sniper Elite 5 | — | — | — | 73 FPS | 89 FPS |
| The Witcher 3 | Ultra: 81.3 FPS (High: 141) | Ultra: 110 FPS (High: 199) | Ultra: 118 FPS (High: 208) | 105 FPS | 136 FPS |
| Tiny Tina's Wonderlands | Ultra: 59.8 FPS (High: 77.8) (CPU: R7 5800H) | Ultra: 80.5 FPS (High: 105) | — | 79 FPS | 95 FPS |

Based on the 7600M XT vs 6600M numbers, assuming linear clock-speed and CU gains, I get an IPC average of 4.35%.

That would mean the 7600M XT should perform on average around (32/32) * (2300/2162) * 1.0435 times the 6650M XT, or 1 * 1.064 * 1.0435 = ~11% faster at the same 120W TDP.

If the desktop part matches that, it would mean at best 3060 Ti-level performance from the 7600 XT. Unless it comes in at around $300, that would be a very poor replacement, so the rumour that desktop N33 won't be out for a while seems fairly likely to me.
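The arithmetic works out like this (a sketch; the 4.35% IPC figure is an average inferred from AMD's own slides, so treat the result as a rough estimate, not a measurement):

```python
# Estimated 7600M XT uplift over the 6650M XT at the same 120 W TDP.
# The 4.35% IPC average is inferred from AMD's slide numbers, so the
# result is a rough estimate rather than a measured gain.

cu_ratio = 32 / 32         # identical CU counts
clock_ratio = 2300 / 2162  # game-clock ratio, ~6.4% higher
ipc_gain = 1.0435          # ~4.35% average IPC gain

uplift = cu_ratio * clock_ratio * ipc_gain
print(f"estimated uplift: {(uplift - 1) * 100:.1f}%")  # prints 11.0%
```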
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,523
3,038
136
N33 is more like a replacement for N23: smaller, a bit higher performance, a bit better efficiency.
Don't have high expectations for the desktop part.
I really have to wonder if this will be faster than the RTX 4060.

TimeSpy Graphics
RX 6600M: 8,234 points
RX 6800S: 9,380 points
RX 7600M XT: 8,234*(1.25 or 1.3) = 10,293-10,704 points.
RX 6850M XT: 11,762 points.
RTX 3050 Ti: 6,122 points
RTX 4050: 6,122 / 1695 MHz * 2370 MHz = ~8,560? points (linear scaling; in reality less)
RTX 4060: 8,560 * (1.15 or 1.2) = 9,844-10,272 points

The RTX 4060M could end up relatively close to the RX 7600M XT.
I am really interested in prices.
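A small script reproducing the extrapolations above; the 1.25-1.3x and 1.15-1.2x factors are guesses, and clock-linear scaling for the 4050 is optimistic, so real scores should land lower.

```python
# Reproduces the TimeSpy Graphics extrapolations above. The scaling
# factors are guesses, and clock-linear scaling is optimistic, so
# real results should come in below these numbers.

rx_6600m = 8234     # measured TimeSpy Graphics score
rtx_3050ti = 6122   # measured TimeSpy Graphics score

# RX 7600M XT: 6600M score scaled by an assumed 25-30% uplift
rx_7600m_xt = [round(rx_6600m * f) for f in (1.25, 1.30)]

# RTX 4050: 3050 Ti scaled linearly by boost clock (1695 -> 2370 MHz)
rtx_4050 = round(rtx_3050ti / 1695 * 2370)   # ~8,560

# RTX 4060: assumed 15-20% on top of the 4050 estimate
rtx_4060 = [round(rtx_4050 * f) for f in (1.15, 1.20)]

print(rx_7600m_xt, rtx_4050, rtx_4060)
```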
 
  • Like
Reactions: Lodix

leoneazzurro

Golden Member
Jul 26, 2016
1,052
1,716
136
Why the RX 7700S when it's weaker than the 7600M XT, and why 4 different models?
You really only need the two RX 7600M (XT) models: 28 CU + 16 Gbps (50-90W) and 32 CU + 18 Gbps (75-120W). The S models are totally pointless. Congratulations, AMD marketing department, you are utter mor*ns.

At least you'll know what you are buying by looking at the name of the GPU. With Nvidia you have the same GPU name and:

- vastly different TDPs
- vastly different RAM bus widths and sizes

Not to mention that the mobile 4090 is a cut-down version of the desktop 4080's chip, but they did not even add an "M" at the end.
 
Last edited:
  • Like
Reactions: Lodix and Kaluan

TESKATLIPOKA

Platinum Member
May 1, 2020
2,523
3,038
136
Yes, I am sure, and let's agree to disagree. 3060 8GB vs 3060 12GB is a mess far worse than any 7600 XT/7700S confusion, and it's not guaranteed we will not see something like that again. Not to mention that in the table you link, the same name is given to GPUs from 35W to 115W TDP. Yes, we know it's up to the OEM, but seriously? Yes, AMD also gives a TDP range in its specs, but a smaller one.
What's so bad about the 3060 8GB vs 3060 12GB? It's the same GA106 chip with a 128-bit bus and 8GB instead of a 192-bit bus and 12GB. You can tell what's different from the name.

On the other hand, you have the N33-based RX 7600S 8GB, 7600M 8GB, 7600M XT 8GB and 7700S 8GB. What's the difference? No one can tell from the name, and you would expect the 7700S to be the fastest one, yet it's not.

Nvidia could release new Ada models later with a Ti suffix, but who says AMD won't make more models based on N33? They still haven't used 7650M or 7650M XT. :cool:
Adding Ti to the name as a way to show it's a faster GPU is okay in my opinion, although I would prefer a higher number, even if RTX 4055 would look a bit funny. :)

Nvidia has a much bigger spread in TGP than AMD, true, but why do you need a different name for it when the TGP should be listed in that laptop's specs?
Would you then give a new name to every model with a different TGP?

TGPs could be these: 35W, 45W, 55W, 65W, 75W, 85W, 95W, 105W and 115W, that's 9 different TGPs in total.

| | 35W | 45W | 55W | 65W | 75W | 85W | 95W | 105W | 115W |
|---|---|---|---|---|---|---|---|---|---|
| 2560 CUDA (AD107) | RTX 4050 | RTX 4051 | RTX 4052 | RTX 4053 | RTX 4054 | RTX 4055 | RTX 4056 | RTX 4057 | RTX 4058 |
| 3072 CUDA (AD107) | RTX 4060 | RTX 4061 | RTX 4062 | RTX 4063 | RTX 4064 | RTX 4065 | RTX 4066 | RTX 4067 | RTX 4068 |
| 4608 CUDA (AD106) | RTX 4070 | RTX 4071 | RTX 4072 | RTX 4073 | RTX 4074 | RTX 4075 | RTX 4076 | RTX 4077 | RTX 4078 |

This is simply atrocious. Mentioning the TGP value is much better.
 

leoneazzurro

Golden Member
Jul 26, 2016
1,052
1,716
136
The bad thing about the 3060 8GB vs 3060 12GB (and a 6GB exists as well) is that with the same name you sell GPUs with a big performance difference, as if they were the same. Not even speaking about the eight 3050 variants out there. At least the "S" moniker tells you that the 7700S is a different beast from a 7700M XT.
The problem with wider TDP brackets is that performance under the same name will vary more. What performance experience will a laptop user have with a 4080 at 50W? And you can be sure those laptops will be sold as having a "4080" on board, with a price set accordingly.
Finally, if one has to look at the specs of the laptop he wants to buy anyway, what is the difference?
Why is AMD "confusing" while Nvidia is "good"? Both publish their specs.
Why are 4 models based on the same GPU bad, when every single model is identified by name in the specs, while in Nvidia's case you can get the same name with different specs or different chips, and that is somehow good?
To me, both are misleading, but Nvidia is misleading more, especially with those power brackets set so wide.
 
Last edited:

insertcarehere

Senior member
Jan 17, 2013
639
607
136
Based on the 7600M XT vs 6600M numbers, assuming linear clock-speed and CU gains, I get an IPC average of 4.35%.

That would mean the 7600M XT should perform on average around (32/32) * (2300/2162) * 1.0435 times the 6650M XT, or 1 * 1.064 * 1.0435 = ~11% faster at the same 120W TDP.

If the desktop part matches that, it would mean at best 3060 Ti-level performance from the 7600 XT. Unless it comes in at around $300, that would be a very poor replacement, so the rumour that desktop N33 won't be out for a while seems fairly likely to me.

N33 is around 200mm² on N6, so expecting a performance world-beater was always unrealistic, especially in light of how N31 performed.

Back in August, Angstronomics, who first leaked accurate RDNA3 GPU specs, had this to say:
As an aside, Navi33 outperforms Intel’s top end Alchemist GPU while being less than half the cost to make and pulling less power.

Instead of being realistic about what the above meant, some people (both on this forum and elsewhere) went wild and interpreted it as N33 matching N21 or something, which is funny because N33 could be a big disappointment and still prove his prediction accurate.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Why the RX 7700S when it's weaker than the 7600M XT, and why 4 different models?
You really only need the two RX 7600M (XT) models: 28 CU + 16 Gbps (50-90W) and 32 CU + 18 Gbps (75-120W). The S models are totally pointless. Congratulations, AMD marketing department, you are utter mor*ns.

Mobile GPUs tend to be dictated by what OEMs want. AMD doesn't sell mobile GPUs to end users; OEMs choose the GPU they want for specific machines based on many different factors (cost, speed, TDP, etc.).

At least AMD has an 'M' in the model number, unlike Nvidia, who are calling their top mobile chip the RTX 4090 even though it uses an entirely different chip than the desktop 4090.
 
  • Like
Reactions: Kaluan

leoneazzurro

Golden Member
Jul 26, 2016
1,052
1,716
136
Based on the 7600M XT vs 6600M numbers, assuming linear clock-speed and CU gains, I get an IPC average of 4.35%.

That would mean the 7600M XT should perform on average around (32/32) * (2300/2162) * 1.0435 times the 6650M XT, or 1 * 1.064 * 1.0435 = ~11% faster at the same 120W TDP.

If the desktop part matches that, it would mean at best 3060 Ti-level performance from the 7600 XT. Unless it comes in at around $300, that would be a very poor replacement, so the rumour that desktop N33 won't be out for a while seems fairly likely to me.

If you check the notes, the 3060 used in the AMD tests was the desktop version, while the 7600M XT was probably run at its maximum specified TGP (120W), which may not be far from the desktop version but is probably lower. On the N33 side there is also the fact that it was tested with a 5600X as the CPU versus a 7600X on the NV testbed. This is all assuming these numbers and the footnotes can be trusted, as this is probably the worst presentation AMD has ever put out, looking at the errors and typos all around.
 
  • Like
Reactions: Lodix

TESKATLIPOKA

Platinum Member
May 1, 2020
2,523
3,038
136
The bad thing about the 3060 8GB vs 3060 12GB (and a 6GB exists as well) is that with the same name you sell GPUs with a big performance difference, as if they were the same. Not even speaking about the eight 3050 variants out there. At least the "S" moniker tells you that the 7700S is a different beast from a 7700XT.
The problem with wider TDP brackets is that performance under the same name will vary more. What performance experience will a laptop user have with a 4080 at 50W? And you can be sure those laptops will be sold as having a "4080" on board, with a price set accordingly.
Finally, if one has to look at the specs of the laptop he wants to buy anyway, what is the difference?
Why is AMD "confusing" while Nvidia is "good"? Both publish their specs.
Why are 4 models based on the same GPU bad, when every single model is identified by name in the specs, while in Nvidia's case you can get the same name with different specs or different chips, and that is somehow good?
To me, both are misleading, but Nvidia is misleading more, especially with those power brackets set so wide.
I just found out that not every shop automatically shows the amount of VRAM and the TGP for Nvidia laptops. My shop luckily shows both, so I don't have to check anything more, and this shaped my stance. This doesn't mean I wouldn't check out reviews, but first I can choose a few laptops I like and then buy based on tests.

The 3060 8GB and 3060 12GB are desktop models. If there is no mention of VRAM but the name is the same, then a customer can make a mistake, and he loses 15% of performance in this case.
If you think about it, this is a much smaller problem than what could happen with laptops.
If they don't mention TGP, then you can buy an RTX 3050 at 35W while expecting the performance of an RTX 3050 at 80W; this way the customer can lose up to 39% of performance.
The RTX 4050-4070 span 35-115W, so it's likely even worse.

The 7700S vs 7600M XT should be different, and the 7600S vs 7600M should be different.
Are they? They are actually not; they are exactly the same: chip, memory speed, amount of memory. There isn't even a difference in TGP, because the "S" ranges are already contained within the M models'. AMD should have released only the 7600M and 7600M XT; the S variants have no reason to exist in this case.

I think Ada's laptop naming is better than Ampere's was, but I would probably bring back the M (mobile), and the TGP spread is a problem.
If you don't check the actual TGP, you will lose more performance with Nvidia than with AMD.
I have no idea how to include TGP in the name, but I am not against different TGPs, even if there are a lot of them.

I don't like N33's names. If there "need" to be 4 models, then I would change them to the following, along with TGP ranges, so that a lower-numbered model never outperforms a higher-numbered one.
RX 7600S -> RX 7600M 50-65W
RX 7600M -> RX 7600M XT 70-85W
RX 7700S -> RX 7650M 90-105W
RX 7600M XT -> RX 7650M XT 110-125W
M - mobile
**00 - cut die
**50 - uncut die
XT - higher TGP

I think this reply should be ok with you, right?
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,523
3,038
136
N33 and Phoenix's iGPU seem to be better than N31, especially when you take into account that the former is using N6, but with AMD's disastrous presentation and their lies from the N31 presentation, I'm going to wait and see until they're in reviewers' hands.
Phoenix's IGP boosts to 3GHz at a 45W TDP. This is a very nice surprise.
The RX 7600M XT is only 2.6GHz at 120W. I think N33 is mediocre at best; a smaller process would have been better.
 

exquisitechar

Senior member
Apr 18, 2017
684
942
136
The RX 7600M XT is only 2.6GHz at 120W. I think N33 is mediocre at best; a smaller process would have been better.
It's nothing mind-blowing, but pretty good, and using N6 lowers the cost significantly. If AMD isn't fudging the numbers, then based on the comparison with the desktop 3060, it's approximately 6650 XT (180W) performance at 120W, with a smaller die than N23 on the same node.
 
  • Like
Reactions: Lodix and Kaluan

leoneazzurro

Golden Member
Jul 26, 2016
1,052
1,716
136
I just found out that not every shop automatically shows the amount of VRAM and the TGP for Nvidia laptops. My shop luckily shows both, so I don't have to check anything more, and this shaped my stance. This doesn't mean I wouldn't check out reviews, but first I can choose a few laptops I like and then buy based on tests.

The 3060 8GB and 3060 12GB are desktop models. If there is no mention of VRAM but the name is the same, then a customer can make a mistake, and he loses 15% of performance in this case.
If you think about it, this is a much smaller problem than what could happen with laptops.
If they don't mention TGP, then you can buy an RTX 3050 at 35W while expecting the performance of an RTX 3050 at 80W; this way the customer can lose up to 39% of performance.
The RTX 4050-4070 span 35-115W, so it's likely even worse.

The 7700S vs 7600M XT should be different, and the 7600S vs 7600M should be different.
Are they? They are actually not; they are exactly the same: chip, memory speed, amount of memory. There isn't even a difference in TGP, because the "S" ranges are already contained within the M models'. AMD should have released only the 7600M and 7600M XT; the S variants have no reason to exist in this case.

I think Ada's laptop naming is better than Ampere's was, but I would probably bring back the M (mobile), and the TGP spread is a problem.
If you don't check the actual TGP, you will lose more performance with Nvidia than with AMD.
I have no idea how to include TGP in the name, but I am not against different TGPs, even if there are a lot of them.

I don't like N33's names. If there "need" to be 4 models, then I would change them to the following, along with TGP ranges, so that a lower-numbered model never outperforms a higher-numbered one.
RX 7600S -> RX 7600M 50-65W
RX 7600M -> RX 7600M XT 70-85W
RX 7700S -> RX 7650M 90-105W
RX 7600M XT -> RX 7650M XT 110-125W
M - mobile
**00 - cut die
**50 - uncut die
XT - higher TGP

I think this reply should be ok with you, right?

Propose this to AMD, lol. There is a difference in TDP range AND clock speed, though (game clocks differ between the 7600M XT and the 7700S, and between the 7600M and the 7600S). So while similar in specs, they are not the same. Also, in AMD's case one can get a rough estimate of the specs (exact TDP apart) from the name alone.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,167
7,666
136
At this point it feels like AMD's biggest issue, the thing that seems to ALWAYS bite them in the ass when they have anything good or EVEN MEDIOCRE going for them, is capacity.

AMD's CPU division has been historically ****ed over not only by some dirty dealings by Intel, but also because they have never been able to supply anything close to the capacity that OEMs want.

This is twice the problem for AMD's GPU division, since it gets a small fraction of the allocation that AMD's already smaller-than-the-competition CPU division gets.

With N33 it looks like AMD is going for OK performance + volume, thanks to N6 being a mature, depreciated node. At the end of the day, anything between the 6600 XT and 6700 XT is still very solid performance for mainstream gamers, and it behooves AMD to actually have parts to sell in that segment, in the form factor buyers want (OEM laptops).
 

Saylick

Diamond Member
Sep 10, 2012
3,532
7,859
136
At this point it feels like AMD's biggest issue, the thing that seems to ALWAYS bite them in the ass when they have anything good or EVEN MEDIOCRE going for them, is capacity.

AMD's CPU division has been historically ****ed over not only by some dirty dealings by Intel, but also because they have never been able to supply anything close to the capacity that OEMs want.

This is twice the problem for AMD's GPU division, since it gets a small fraction of the allocation that AMD's already smaller-than-the-competition CPU division gets.
It's a chicken-or-egg scenario: if a company isn't successful, how can it ship a ton of product, and if it cannot ship a ton of product, how can it be successful? As a result, organic growth tends to be slow. It doesn't help that Lisa Su is a very conservative CEO who is deliberate and measured in her projections, which directly drives their wafer orders.

Economies of scale provide Intel and Nvidia with a lot of benefits that AMD does not enjoy. It's the reason AMD cannot develop proprietary GPU technologies, even though everyone says they should if they want to sway consumers away from Nvidia: no developer would spend the time and effort to adopt them, given AMD's vastly smaller market share.

I mentioned before that the market is a zero-sum game which rewards winners and punishes losers. It's a positive feedback loop that promotes natural monopolies. For all intents and purposes, Nvidia is pretty much a monopoly, because AMD doesn't really have influence on what they do or how they price. The only reason they don't bury AMD by dropping prices to force them out of the market is that the FTC would be up their butt, and price wars never benefit anyone. For this reason, I typically roll my eyes whenever someone says that AMD's lack of competitiveness is the result of Nvidia's price gouging. If AMD didn't exist or left the GPU market, Nvidia would price gouge regardless. Doing so would be like me blaming Research In Motion for how expensive iPhones are, rather than blaming Apple itself for raising prices.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,523
3,038
136
With N33 it looks like AMD is going for OK performance + volume, thanks to N6 being a mature, depreciated node. At the end of the day, anything between the 6600 XT and 6700 XT is still very solid performance for mainstream gamers, and it behooves AMD to actually have parts to sell in that segment, in the form factor buyers want (OEM laptops).
Nvidia uses N5 (4N) for every Ada chip (laptop + desktop), so there shouldn't be issues with silicon production capacity, especially for AMD, which doesn't need such volume for its GPUs.
N33 being on N6 is most likely down to production cost.
A chiplet N33's GCD wouldn't be half the size of the 204mm² monolithic die, and you would also need 2 MCDs.
I get ~125-130mm² for an N5 GCD + 2× 37.5mm² MCDs.

Chiplet N33 on the N5 process:
485-506 GCD dies per wafer (good and bad), at $15,000 per wafer = $29.6-30.9
1,776 MCD dies per wafer (good and bad), at $7,000 per wafer = $3.95
In total: $29.6-30.9 + 2 × $3.95 = $37.5-38.8

N33 on the N6 process:
300 dies per wafer (good and bad), at $7,000 per wafer = ~$23

With packaging, let's say it's $35 vs $55, so the difference is ~$20.
Keep in mind that this is just an example!

I have to wonder if it's worth saving $20 in production cost instead of selling it for at least $20 more. Thanks to the better process, you should be able to clock it higher within the same TGP, so a faster product could be sold for more.
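The die-cost arithmetic above can be sketched in a few lines; all wafer prices and dies-per-wafer counts are illustrative round numbers, not real foundry figures.

```python
# Illustrative die-cost comparison: a hypothetical chiplet N33 on N5
# vs the actual monolithic N6 die. Wafer prices and dies-per-wafer
# are assumed example values, not real foundry figures.

def die_cost(wafer_price: float, dies_per_wafer: int) -> float:
    """Naive cost per die, ignoring yield differences and binning."""
    return wafer_price / dies_per_wafer

# Hypothetical N5 chiplet layout: one GCD plus two MCDs
gcd_cost = (die_cost(15_000, 506), die_cost(15_000, 485))  # ~$29.6-30.9
mcd_cost = die_cost(7_000, 1776)                           # ~$3.95

chiplet = tuple(g + 2 * mcd_cost for g in gcd_cost)        # ~$37.5-38.8

# Actual monolithic N6 die
monolithic = die_cost(7_000, 300)                          # ~$23.3

print(f"chiplet: ${chiplet[0]:.1f}-${chiplet[1]:.1f}, "
      f"monolithic: ${monolithic:.1f}")
```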
 

uzzi38

Platinum Member
Oct 16, 2019
2,705
6,427
146
3GHz clock for RDNA3 mobile APU going unnoticed?
There's nothing to talk about without knowing how much power it needs for that and under what workloads it's achievable.

In the right workloads you can see the 7900 XTX briefly touch 4GHz, after all. At the end of the day, it doesn't matter when it comes to real-world performance.
 

Mopetar

Diamond Member
Jan 31, 2011
8,114
6,770
136
It would lend a lot of credence to the claims about hardware issues in N31 preventing it from reaching the target 3 GHz clock speed in gaming workloads.
 

jpiniero

Lifer
Oct 1, 2010
15,223
5,768
136
Phoenix's IGP and N32 supposedly shouldn't have as many issues as N31 and N33.

But that might be because they are going to do a respin of N32, which would line up with the idea that it will be 6+ months from now before you see it in any fashion.

If the 4070 Ti doesn't sell, then I think they aren't going to be very excited about releasing desktop N32 in any case.