
Discussion RDNA4 + CDNA3 Architectures Thread

Page 483

DisEnchantment

Golden Member

With the GFX940 patches in full swing since the first week of March, it looks like MI300 is not far off!
Usually AMD takes around three quarters to get the support into LLVM and amdgpu. Since RDNA2, the window in which they push support for new devices has been much shorter, to prevent leaks.
But looking at the flurry of code in LLVM, that is a lot of commits. Maybe the US Govt is starting to prepare the SW environment for El Capitan (perhaps to avoid a slow bring-up situation like Frontier, for example).

See here for the GFX940 specific commits
Or Phoronix

There is a lot more if you know whom to follow in the LLVM review chains (before the patches get merged to GitHub), but I am not going to link AMD employees.
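If you want to track this yourself, the filtering involved is basically grepping commit subjects for the target string. Here's a minimal sketch; the commit subjects below are made up for illustration, and in practice you'd feed in `git log --oneline` output from a local llvm-project checkout.

```python
import re

def gfx940_commits(log_lines):
    """Filter commit subjects that mention a GFX94x target (case-insensitive)."""
    pat = re.compile(r"gfx94\d", re.IGNORECASE)
    return [line for line in log_lines if pat.search(line)]

# Illustrative subjects only -- not real LLVM commits:
log = [
    "[AMDGPU] Add gfx940 target definitions",
    "[X86] Fix scheduling model typo",
    "[AMDGPU] gfx940: MAI instruction encodings",
]
print(gfx940_commits(log))
```

Same idea works directly in git with `git log --oneline -i --grep='gfx940'` scoped to `llvm/lib/Target/AMDGPU`.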

I am starting to think MI300 will launch around the same time as Hopper, probably only a couple of months later!
Although I believe Hopper had the problem of not having a host CPU capable of PCIe 5 in the very near future, so it might have been pushed back a bit until SPR and Genoa arrive later in 2022.
If PVC slips again, I believe MI300 could launch before it :grimacing:

This is nuts, MI100/200/300 cadence is impressive.


Previous thread on CDNA2 and RDNA3 here

 
Would AMD have a higher graphics card market share if the RX 9070 XT was $500?
Could they?
IMO, they probably could have - although I think that the biggest mistake this gen was not having a 9080 XT. Whoever decided to cut the bigger die IMO killed the product. It's already difficult to beat Nvidia, but without a halo product yet again, they really made it impossible. I do agree with the sentiment that people want AMD for cheaper Nvidia, but I do think that people would go for AMD if they actually had better performance.

Not sure if it matters now, as everything appears to be going to DC and AI anyway. It's quite clear that gaming GPUs are not really important for any of these companies.
 
Whoever decided to cut the bigger die IMO killed the product
Oh it wasn't a die, but a wonderful packaging moonshot.
Some of it stuck elsewhere, even (hemlo MI400).
It's already difficult to beat Nvidia, but without a halo product yet again, they really made it impossible. I do agree with the sentiment that people want AMD for cheaper Nvidia,
No one's gonna pay $3k for a Radeon.
but I do think that people would go for AMD if they actually had better performance.
They would have to win 3 gens in a row to start moving units above $1k.
Might as well just light R&D opex on fire and pray to whatever deity you fancy.
 
They would have to win 3 gens in a row to start moving units above $1k.
This is the crux of the problem really.

AMD was able to achieve this in the CPU market for a few reasons:
  • Intel dropping the ball, hard
  • CPUs use less bleeding edge silicon so good margins are easier to attain
  • Shared dies between client and DC
GPU market is just harder as AMD has none of these advantages and they'll just end up burning cash for 6 years to have a chance to beat NVIDIA for 3 gens running. But NVIDIA, despite its apathy towards the gaming market in 2026, doesn't like to lose and can pretty easily turn on the financial horsepower taps and drown any Radeon attempts to win, or make them completely unprofitable.

You can't make generalised dies for graphics and compute to split costs anymore as the workloads have diverged too much and you'll blow transistor budgets. Ironically AMD probably had a good chance to do this successfully back in the GCN 1 - 3 era, and blew it.

Chiplet designs are just uneconomical at the scale needed for GPUs.

Pretty sure Radeon is done. Arc never stood a chance, either.

Our only hope is NVIDIA doing something so utterly stupid it breaks their own stranglehold on GPU.
 
But NVIDIA, despite its apathy towards the gaming market in 2026, doesn't like to lose and can pretty easily turn on the financial horsepower taps and drown any Radeon attempts to win, or make them completely unprofitable.
It's not like it takes them a lot of money to possibly make the imaginary MSRP a bit lower to counter anything AMD may do.

Jack will - by his own words - focus on cards that target the main market. This means they're not trying for the top 10% of the market, the highest-end huge chips. Anything AMD aims below that can be made irrelevant by a simple price cut.
 
AMD was able to achieve this in the CPU market for a few reasons:
  • Intel dropping the ball, hard
  • CPUs use less bleeding edge silicon so good margins are easier to attain
  • Shared dies between client and DC
It's just server scraps.
Intel spent the bulk of Zen lifetime being pretty competitive in client, especially mobile.
GPU market is just harder as AMD has none of these advantages and they'll just end up burning cash for 6 years to have a chance to beat NVIDIA for 3 gens running
It's not really a chance. They can whack them around at will, but it's just a lot of expensive and manpower-intensive R&D for a total market of maybe $5B a quarter (best case).
Ironically AMD probably had a good chance to do this successfully back in the GCN 1 - 3 era, and blew it.
Maxwell sent GCN2/3 back to the stone age.
You can't make generalised dies for graphics and compute to split costs anymore as the workloads have diverged too much and you'll blow transistor budgets.
Now they're making dies tied in other markets and serving the throwaway bins into DT AIC. Not very different from Ryzen (makes sense given who's in charge now).
Jack will - by his own words - focus on cards that target the main market. This means they're not trying for the 10% of the market, the highest end huge chips.
They do not focus on anything in DT AIC. They're just gonna serve throwaway bins of w/ever they have at hand.
RDNA5 spans from the chungus down to the 45W mobile babydie. But RDNA6 can have just mobile babydies if customer cycles up top do not align.
 