
Speculation: RDNA3 + CDNA2 Architectures Thread

Page 191 - AnandTech Forums
Seems likely that if they don't release Navi 32, they'll move faster to RDNA 4.
I'm sure AMD knew that it had fallen short of expectations with RDNA3 products at least a year before they came to market. In that case, they would have had time to grease the skids a bit on RDNA4 products. The question is how much could they do so - maybe AMD was able to move the launch up by a quarter? Two quarters would be pretty hard, they still need the process they targeted to be in HVM.
 
I don't think we would be talking about moving up the timeframe... but perhaps they would bring out the "mid-range" ($600-$700) part first.
 
That doesn't make sense. A mid-range (N42??) card will beat the 7900 XT for sure, and maybe the 7900 XTX if AMD's graphics group hits it out of the park.
 

Too early to talk about absolute performance, but the idea is that RDNA4 would mainly focus on fixing RDNA3's bad cost structure. Especially since AMD clearly can't go that high on pricing.
 
Who is postulating this idea (a principal focus on cost structure)? With a sufficient gain in performance, AMD could match NV in pricing (well, at least up to the $1,200 US mark). They do seem to need to take a bit of a haircut relative to NV to hit sales targets. Obviously, getting BOM costs down will help AMD and the AIBs.
 
I think AMD will announce N32 and N33 at Computex and have 4-5 new SKUs out by early July. They could really shake things up if they nail pricing, but given their recent history with launch MSRPs, I'm not holding my breath.

Here is what I'd like to see:

7800 XT $550 (N31 with the W7800 cut & 16 GB VRAM): between 4070 Ti and 4070 performance
7700 XT $450 (full N32 die w/ 16 GB VRAM): 4070 performance
7700 $400 (cut-down N32 w/ 16 GB VRAM): between 4070 and 4060 Ti performance
7600 XT $300/$350 (full N33 die w/ 8 GB and 16 GB VRAM options): 4060 Ti performance
7600 $250/$300 (cut-down N33 w/ 8 GB and 16 GB VRAM options): 4060 performance
 
Too early to talk about absolute performance, but the idea is that RDNA4 would mainly focus on fixing RDNA3's bad cost structure. Especially since AMD clearly can't go that high on pricing.
What is this bad cost structure? Do you know details? What is a good cost structure for them? What do you imagine the 7600XT should sell for, to have margins suiting your thinking?

Finally, I have come to realize that your style of writing appears to be definitive, in the sense that you state feelings and opinions as facts.
 
What do you imagine the 7600XT should sell for, to have margins suiting your thinking?

Depends on whether they are getting any discounts on N6 because of the falling utilization. But otherwise (at least) the same $379 that the 6600 XT's MSRP was. So far TSMC hasn't admitted to cutting N6 prices.

If you're saying that's too high given current RDNA 2 prices... I'd agree. Which is why RDNA 2 has to go first.
 
RDNA4 is probably a year away at this point. Shipping 6 months late is better than not shipping anything for 18.

Pushing up the launch of RDNA4 is likely to result in the same kinds of issues that have plagued RDNA3. If anything I'd expect AMD to want to spend more time to ensure they've got everything right because these kinds of mistakes are costly.
 
Why wouldn't that be the expectation? The 6700 XT launched at $480, the 5700 XT at $400.
You just gave the answer. The 6700 XT was more expensive than the 5700 XT. Why do you expect the 7700 XT to be suddenly cheaper than the 6700 XT?
A cut-down N31 launched at $899, and the performance difference between it and full N32 isn't large enough to warrant a 2x difference in price.
 
Because it isn't 2020 anymore. The unprecedented market conditions during the 6700 XT launch were completely different from what they are now. Even if you raise those prices I listed by $50 (which were listed as what I would LIKE to see, BTW), they are still within reason IMO. At least for the 16 GB cards.
 
And also charge less while increasing VRAM.
 
I find the ongoing discussion a bit funny. Why would AMD respin anything when the supposedly broken current RDNA3 parts easily beat their RDNA2 counterparts on both performance and power? Whatever else is coming will just be more of the same.

RDNA4 to come sooner than expected? Why would they try to speed up the development of that when the prudent thing would be to spend some extra time making sure mistakes are not repeated?
 
Because a hypothetical got out of hand...
 
A cut-down N31 launched at $899, and the performance difference between it and full N32 isn't large enough to warrant a 2x difference in price.
The cut-down N31 was universally panned for that $900 price by reviewers. It was also widely rejected by consumers, which is why it can be had for $770 currently. The MSRP on that card should have been $700 IMO, $750 at most. The 7900 XT is a terrible example to use in a pricing argument. Its MSRP was a complete and total sales & marketing blunder.

And also charge less while increasing VRAM.
The consumer GPU market seems to dictate that AMD cards need to cost something like 15-20% less than Nvidia at the same tier of performance to sell well. The market appears to be rejecting the 4070 at its $600 MSRP despite its software feature and mindshare advantage over AMD. A major factor in this is obviously price/VRAM. I'd argue AMD doesn't have any choice but to offer 16 GB in the mid price tiers. How much do you think an AMD 16 GB 4070 competitor should cost to sell in high volume? I think it's pretty clear that it can't be more than $500 to succeed. While I'd like to see it at $450, it's probably more realistic to hope for $500.
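The 15-20% rule of thumb works out to a narrow price band. A minimal sketch of that arithmetic, using the post's own figures (the $600 4070 MSRP and the 15-20% discount range; the function name is just for illustration):

```python
def competitive_price_band(nvidia_msrp: float,
                           discount_low: float = 0.15,
                           discount_high: float = 0.20) -> tuple[float, float]:
    """Price band an AMD card would need to hit, assuming it must
    undercut the equivalent Nvidia card by 15-20% to sell well."""
    return (nvidia_msrp * (1 - discount_high),  # deepest discount
            nvidia_msrp * (1 - discount_low))   # shallowest discount

# Against a $600 RTX 4070, the band comes out to $480-$510:
low, high = competitive_price_band(600)
print(f"${low:.0f}-${high:.0f}")
```

Which is roughly where the $450-$500 asks in this thread land.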
 
They have the choice to just stay out, which is what they're doing so far, and that is generating speculation that such choices are squeezing profits too much.

https://www.igorslab.de/en/nvidia-s...-amd-zeigt-zur-computex-wohl-kleinere-karten/

What I can confirm, however, is that individual (!) AMD board partners will show a Radeon RX 7600 as a finished product at Computex, while other partners, who use both AMD and NVIDIA chips, are taking a wait-and-see approach, to put it politely. The sentiment that they don't want to produce anything just to keep production lines busy where they see no chance of profit stings a bit. There is no price basis for a Radeon RX 7700 XT at the moment, because it could even slip into the loss zone as the target group shifts and the price would no longer fit.
 
If you can't make a full N32 16 GB 7700 XT profitable at $450-500, then the mid- and low-end discrete GPU market is finished. They may as well just make high-end $1,000+ gaming dGPUs and focus on SoCs for the rest of the gaming market.
 
The cut-down N31 was universally panned for that $900 price by reviewers. It was also widely rejected by consumers, which is why it can be had for $770 currently. The MSRP on that card should have been $700 IMO, $750 at most. The 7900 XT is a terrible example to use in a pricing argument. Its MSRP was a complete and total sales & marketing blunder.

That's why AMD needs scalable GCDs so a product like the 7900 XT wouldn't need to exist.
 
Maybe low and mid don't need 16 GB, which is going to add significantly to the BOM.
 
I personally think it's more the number of chips than the capacity. Perhaps Nvidia's decisions with the bus size make more sense if they were expecting the Ada Refresh to get 3 or 4 GB per chip.
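The chip-count point follows from how GDDR6 is attached: each memory chip sits on a 32-bit channel, so bus width fixes the chip count, and chip count times per-chip density fixes capacity. A rough sketch (assuming no clamshell mode; the 2 GB and 3 GB densities are the ones discussed above):

```python
def vram_config(bus_width_bits: int, gb_per_chip: int) -> tuple[int, int]:
    """Return (chip_count, total_gb) for a GDDR6/GDDR6X card,
    assuming one 32-bit channel per memory chip (no clamshell)."""
    chips = bus_width_bits // 32
    return chips, chips * gb_per_chip

# The 4070's 192-bit bus with today's 2 GB chips vs. hypothetical 3 GB chips:
print(vram_config(192, 2))  # 6 chips, 12 GB: the shipping config
print(vram_config(192, 3))  # 6 chips, 18 GB on the same bus with denser chips
```

So a narrower bus caps capacity only as long as chip density stays at 2 GB, which is why a denser-chip refresh would change the calculus.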
 