
Discussion RDNA4 + CDNA3 Architectures Thread


DisEnchantment

Golden Member
With the GFX940 patches in full swing since the first week of March, it looks like MI300 is not far off!
Usually AMD takes around three quarters to get the support into LLVM and amdgpu. Lately, since RDNA2, the window in which they push support for new devices has been much reduced, to prevent leaks.
But looking at the flurry of code in LLVM, it is a lot of commits. Maybe the US Govt is starting to prepare the SW environment for El Capitan early (perhaps to avoid a slow bring-up situation like Frontier's, for example).

See here for the GFX940 specific commits
Or Phoronix

There is a lot more if you know whom to follow in the LLVM review chains (before things get merged to GitHub), but I am not going to link AMD employees.
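If you want to eyeball the commit flurry yourself rather than rely on the Phoronix roundups, here's a minimal sketch (assuming a local llvm-project clone; the path and start date below are illustrative, not from the post):

```python
# Count recent commits in a local llvm-project checkout whose message
# mentions gfx940. Assumes git is on PATH and that REPO points at
# your own clone of https://github.com/llvm/llvm-project.
import os
import subprocess

REPO = os.path.expanduser("~/src/llvm-project")  # adjust to your clone

def gfx940_commits(since="2022-03-01"):
    """Return one-line summaries of commits mentioning gfx940."""
    out = subprocess.run(
        ["git", "-C", REPO, "log", "--oneline", "-i",
         "--grep=gfx940", f"--since={since}"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.splitlines()

commits = gfx940_commits()
print(f"{len(commits)} commits mentioning gfx940 since 2022-03-01")
print("\n".join(commits[:10]))
```

This only catches commits that name the target in the message, so it undercounts relative to following the review chains directly.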

I am starting to think MI300 will launch around the same time as Hopper, probably only a couple of months later!
Although I believe Hopper had the problem of there being no host CPU capable of PCIe 5.0 in the very near future, so it might have gotten pushed back a bit until SPR and Genoa arrive later in 2022.
If PVC slips again, I believe MI300 could launch before it :grimacing:

This is nuts; the MI100/200/300 cadence is impressive.


Previous thread on CDNA2 and RDNA3 here

 
Would AMD have a higher graphics card market share if the RX 9070 XT was $500?
Could they?
IMO, they probably could have, although I think that the biggest mistake this gen was not having a 9080 XT. Whoever decided to cut the bigger die killed the product. It's already difficult to beat Nvidia, but without a halo product yet again, they really made it impossible. I do agree with the sentiment that people want AMD for cheaper Nvidia, but I do think that people would go for AMD if they actually had better performance.

Not sure it matters now, as everything appears to be going to DC and AI anyway. It's quite clear that gaming GPUs are not really important to any of these companies.
 
Whoever decided to cut the bigger die killed the product
Oh it wasn't a die, but a wonderful packaging moonshot.
Some of it even stuck elsewhere (hemlo MI400).
It's already difficult to beat Nvidia, but without a halo product yet again, they really made it impossible. I do agree with the sentiment that people want AMD for cheaper Nvidia,
No one's gonna pay $3k for a Radeon.
but I do think that people would go for AMD if they actually had better performance.
They would have to win 3 gens in a row to start moving units above $1k.
Might as well just light R&D opex on fire and pray to whatever deity you fancy.
 
They would have to win 3 gens in a row to start moving units above $1k.
This is the crux of the problem really.

AMD was able to achieve this in the CPU market for a few reasons:
  • Intel dropping the ball, hard
  • CPUs use less bleeding edge silicon so good margins are easier to attain
  • Shared dies between client and DC
The GPU market is just harder as AMD has none of these advantages, and they'll just end up burning cash for 6 years to have a chance of beating NVIDIA for 3 gens running. But NVIDIA, despite its apathy towards the gaming market in 2026, doesn't like to lose and can pretty easily turn on the financial horsepower taps and drown any Radeon attempt to win, or make it completely unprofitable.

You can't make generalised dies for graphics and compute to split costs anymore as the workloads have diverged too much and you'll blow transistor budgets. Ironically AMD probably had a good chance to do this successfully back in the GCN 1 - 3 era, and blew it.

Chiplet designs are just uneconomical at the scale needed for GPUs.

Pretty sure Radeon is done. Arc never stood a chance, either.

Our only hope is NVIDIA doing something so utterly stupid it breaks their own stranglehold on GPU.
 
But NVIDIA, despite its apathy towards the gaming market in 2026, doesn't like to lose and can pretty easily turn on the financial horsepower taps and drown any Radeon attempt to win, or make it completely unprofitable.
It's not like it takes them a lot of money to make the imaginary MSRP a bit lower to counter anything AMD may do.

Jack will, by his own words, focus on cards that target the main market. This means they're not trying for the top 10% of the market, the highest-end huge chips. Anything aiming at the market below that can be made irrelevant by a simple price cut.
 
AMD was able to achieve this in the CPU market for a few reasons:
  • Intel dropping the ball, hard
  • CPUs use less bleeding edge silicon so good margins are easier to attain
  • Shared dies between client and DC
It's just server scraps.
Intel spent the bulk of Zen's lifetime being pretty competitive in client, especially mobile.
The GPU market is just harder as AMD has none of these advantages, and they'll just end up burning cash for 6 years to have a chance of beating NVIDIA for 3 gens running
It's not really a chance. They can whack them around at will, but it's just a lot of expensive and manpower-intensive R&D for a total market of maybe $5B a quarter (best case).
Ironically AMD probably had a good chance to do this successfully back in the GCN 1 - 3 era, and blew it.
Maxwell sent GCN2/3 back to the stone age.
You can't make generalised dies for graphics and compute to split costs anymore as the workloads have diverged too much and you'll blow transistor budgets.
Now they're making dies tied to other markets and serving the throwaway bins into DT AIC. Not very different from Ryzen (makes sense given who's in charge now).
Jack will, by his own words, focus on cards that target the main market. This means they're not trying for the top 10% of the market, the highest-end huge chips.
They do not focus on anything in DT AIC. They're just gonna serve throwaway bins of whatever they have at hand.
RDNA5 spans from the chungus down to the 45W mobile babydie. But RDNA6 can end up with just mobile babydies if customer cycles up top do not align.
 
I like to call RDNA an "embedded launch pad" (임베디드 발사대, a Korean meme; I don't know if "embedded" includes consoles)
 
AMD does not and should not care about discrete graphics very much at this point.

The GPU market uses huge dies relative to the rest of the chip industry on the newest processes, and what the market, and even this forum, shows is that gamers are cheap outside of the high end.

Even the smallest GPU from AMD, the 9600 XT, is relatively large at 200mm². The cost for this card? $299 for the 8GB model and $349 for the 16GB model. That means AMD's partners are likely paying less than $200 (maybe even $150) for the chip, once you account for the cost to manufacture the rest of the card, the partner's profit, and the reseller's profit.

That is really low relative to the rest of AMD's product stack. Look at the 9950X in comparison, which also uses 4nm: it consists of 2 x 70mm² CCDs plus a 6nm 60mm² IO die and sells for $650, nearly all of it margin for AMD, since there is no partner to share with and the rest of the product cost is close to negligible.

The 9070 XT at $500? Why bother, when you can sell a Threadripper 9975WX, which consists of 4 x 70mm² 4nm dies plus a 6nm IO die, for $4,100? Remember the 9070 XT is a monolithic 361mm² 4nm chip that needs a bunch of other components to become a sellable product, while still leaving room for partners to make money.

AT0 as a $2,000 product with a 700mm² 3nm die? Why bother, when a 12 x 70mm² 4nm Threadripper chip sells for close to $12,000?
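To put rough numbers on the comparison, here's a quick sketch of the revenue-per-mm² math (die sizes and prices are the ones quoted in this post; the 50% share of card price going to the chip maker, and reusing the 60mm² IO die figure for Threadripper, are placeholder assumptions of mine):

```python
# Rough revenue-per-mm^2 comparison using the figures quoted above.
# GPU rows assume AMD captures ~50% of the card's list price for the
# die (CHIP_SHARE is a guess); CPUs are sold whole, so share = 1.0.
# The 60mm^2 IO die is reused for Threadripper as a simplification.

CHIP_SHARE = 0.5

products = [
    # name,              die area (mm^2), list price ($), AMD's share
    ("RX 9600 XT 8GB",   200,             299,   CHIP_SHARE),
    ("RX 9070 XT",       361,             500,   CHIP_SHARE),
    ("AT0 (rumored)",    700,             2000,  CHIP_SHARE),
    ("Ryzen 9 9950X",    2 * 70 + 60,     650,   1.0),
    ("TR 9975WX",        4 * 70 + 60,     4100,  1.0),
    ("TR 12-CCD part",   12 * 70 + 60,    12000, 1.0),
]

for name, area, price, share in products:
    print(f"{name:16s} {area:4d} mm^2 -> ${price * share / area:6.2f}/mm^2")
```

Even with a generous share assumption on the GPU side, the CPU parts come out an order of magnitude ahead per mm² of leading-edge silicon, which is exactly the point.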

So why would AMD even bother, when the margins are this bad, the R&D expenditure is high (so many chips are needed to address the entire market), and the software burden is high too, with drivers and new features constantly needing development?

Discrete graphics was always going to turn into a monopoly at some point, because the cost to design and manufacture chips got too high and the barrier to entry is enormous, so we get no new players (look at Intel as an example). The cost to research and manufacture these chips has far outpaced the total addressable market.

AMD has a viable and highly attractive alternative to spend its limited resources on as a result of the AI market. AMD cares even less about gamers than Nvidia, and it should. Nvidia has been producing fewer gaming cards as a result of the high demand for AI products, but they are still supplying a huge quantity when you consider they are servicing the laptop market and system builders first, which is 70% of the market and entirely theirs. AMD probably doesn't mind getting 20-25% of the DIY market, which is all that is left to them, and which is where this 6% figure comes from. It is a small market with a highly competitive rival that pours money into it like crazy compared to AMD.
 
AMD does not and should not care about discrete graphics very much at this point.

What if they got a good deal at Samsung for GPUs? I would say Intel as well, but for that to happen they would likely have to drop Arc. Possible. But even then, I'm pretty sure I've read 18A has no HD variant.
 
Samsung plays are for volume parts and Radeon got none.
I'd rate Nvidia as more likely to do that again than AMD. Excluding whatever Exynos is doing.

Sucks to be AMD then. TSMC capacity is better used elsewhere, and as much as some of us would love more Radeon market share, it just isn't there.
 
AMD does not and should not care about discrete graphics very much at this point.
This is a nice summary of the situation. Essentially, the discrete desktop GPU market doesn't have the margins to justify AMD pouring billions of new dollars to be competitive with Nvidia. Nvidia has every conceivable advantage thus far: software ecosystem, proprietary features, volume, and a high-margin server business that effectively subsidizes the R&D costs of consumer GPUs.

For these reasons, I bet Nvidia wishes AMD would try to compete with them in earnest in the consumer desktop space, just because it would require AMD to spend money very ineffectively. AMD deciding not to compete, and instead focusing its efforts elsewhere, is simply the smarter business decision.

Personally, I kind of wish AMD would drop out of the discrete desktop GPU market and let Nvidia have it all for a few generations. It would take away the stereotypical argument that it's AMD's fault the market is what it is. When the only player left is directly responsible for shafting the consumer, it eventually becomes silly to blame a scapegoat that no longer exists. It would be like blaming the cops for letting a robber steal from my house, rather than blaming the robber directly, when the police department was defunded years ago.
 
Essentially, the discrete desktop GPU market doesn't have the margins to justify AMD pouring billions of new dollars to be competitive with Nvidia
They're dumping piles of cash into gfx IP roadmap anyway.
It's specifically a DT AIC issue, where they need to win at the top for mindshare, and program-wise that's just tricky and expensive at their volumes and current market positioning.
 
They're dumping piles of cash into gfx IP roadmap anyway.
It's specifically a DT AIC issue, where they need to win at the top for mindshare, and program-wise that's just tricky and expensive at their volumes and current market positioning.
That's why I emphasized "new" dollars. I ain't saying AMD should ditch GPU R&D; it's just that the desktop discrete market is essentially cornered by Nvidia and it would take a disproportionate amount of additional funds to crack it. For every $1 Nvidia invests in the desktop discrete market, it would require... Idk, probably like $5 of AMD's money to counter it. Plus, given that the TAM for consumer desktop GPUs pales in comparison to server GPUs, the calculus quickly concludes that this market is simply not worth pursuing.

If I were AMD, I'd let Nvidia take the desktop discrete market. Let consumers get used to the idea that Nvidia is the only player in the game; if consumers have gripes about the situation they are in, they can blame the only player left.

In the meantime, focus on developing a GPU architecture that is shared between server and gaming so that the R&D funds used for the server market can immediately benefit the consumer side. The intent is to limit the "additional" spending needed for a consumer product; consumer products become pure derivatives of highly lucrative server parts. Continue to develop ROCm so that the software is up to snuff, and lobby Microsoft to come up with a decent vendor-agnostic upscaling technology to level the playing field, because AMD developing their own proprietary version isn't going to win against the King of Proprietary Technology. If ROCm isn't ready, or MS doesn't have a vendor-agnostic upscaler, then don't release a desktop discrete GPU at all.
 
it's just that the desktop discrete market is essentially cornered by Nvidia and it would take a disproportionate amount of additional funds to crack it.
It's not horribly expensive, but it's pure gambling.
For every $1 Nvidia invests in the desktop discrete market, it would require... Idk, probably like $5 of AMD's money to counter it
Oh no lol.
Again, it's just gambling. It's the gambling part that Lisa hates (roadmap gambling almost killed AMD!).
If I were AMD, I'd let Nvidia take the desktop discrete market.
They already did.
In the meantime, focus on developing a GPU architecture that is shared between server and gaming so that the R&D funds used for the server market can immediately benefit the consumer side
CDNA shader cores are on a separate track forever.
because AMD developing their own proprietary version isn't going to win against the King of Proprietary Technology
This is a bit funny, given AMD is second only to Intel in terms of proprietary EDA tools/sims/etc., lmao.
 