Discussion RDNA4 + CDNA3 Architectures Thread

Page 471

DisEnchantment

Golden Member
Mar 3, 2017

With the GFX940 patches in full swing since the first week of March, it looks like MI300 is not far off!
AMD usually takes around three quarters to get support into LLVM and amdgpu. Since RDNA2, the window in which they push support for new devices has been much reduced, to prevent leaks.
But looking at the flurry of code in LLVM, it is a lot of commits. Maybe it's because the US Govt is starting to prepare the SW environment for El Capitan (perhaps to avoid a slow bring-up situation like Frontier's, for example).

See here for the GFX940 specific commits
Or Phoronix

There is a lot more if you know whom to follow in the LLVM review chains (before the changes get merged to GitHub), but I am not going to link AMD employees.
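
If you want to poke at this yourself once the patches trickle into a ROCm release, here's a minimal HIP (C++) sketch that just prints the GFX target string each device reports. That MI300-class parts will report "gfx940" is my assumption based on these patches; an MI200 reports gfx90a today.

```cpp
// arch_check.cpp — minimal HIP sketch: print the GFX target each GPU reports.
// Assumes a ROCm/HIP install; build with: hipcc arch_check.cpp -o arch_check
#include <hip/hip_runtime.h>
#include <cstdio>

int main() {
    int count = 0;
    if (hipGetDeviceCount(&count) != hipSuccess || count == 0) {
        std::printf("No HIP devices found.\n");
        return 1;
    }
    for (int i = 0; i < count; ++i) {
        hipDeviceProp_t prop{};
        if (hipGetDeviceProperties(&prop, i) == hipSuccess) {
            // gcnArchName carries the LLVM target string (possibly with
            // feature suffixes), e.g. "gfx90a" on MI200; presumably
            // "gfx940" on MI300-class parts once support lands.
            std::printf("Device %d: %s (%s)\n", i, prop.name, prop.gcnArchName);
        }
    }
    return 0;
}
```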

I am starting to think MI300 will launch around the same time as Hopper, probably only a couple of months later!
Although I believe Hopper had the problem of not having a host CPU capable of PCIe 5 in the very near future, so it might have gotten pushed back a bit until SPR and Genoa arrive later in 2022.
If PVC slips again, I believe MI300 could launch before it :grimacing:

This is nuts; the MI100/200/300 cadence is impressive.

Previous thread on CDNA2 and RDNA3 here

 

inquiss

Senior member
Oct 13, 2010
AMD is quite competitive without X3D, with a smaller core.
You don't count LLC when measuring core size. Not even L2.
They have the biggest stick. Biggest stick doesn't mean biggest die size, and consumers don't care about die size anyway. They care about what's fast and what the price is. And if it's the fastest, the price is sort of whatever you get to set.
 

SolidQ

Golden Member
Jul 13, 2023
Best-selling GPUs on Newegg.
People love ASRock?
[screenshot: Newegg best-selling GPU list]
 

Vikv1918

Member
Mar 12, 2025
'competitive' is for suckers.
You gotta win.
Here's the problem: Nvidia will then just release some new proprietary feature, like proprietary neural shaders or something, and it will run like crap on AMD. This new feature will suddenly become the most important thing ever, and the narrative will be "sure, the 10090 XT beat the 6090 in outdated games, but in the new hotness the 6090 is 20% faster," and we're back to square one.
 

ToTTenTranz

Senior member
Feb 4, 2021
Here's the problem: Nvidia will then just release some new proprietary feature, like proprietary neural shaders or something, and it will run like crap on AMD. This new feature will suddenly become the most important thing ever, and the narrative will be "sure, the 10090 XT beat the 6090 in outdated games, but in the new hotness the 6090 is 20% faster," and we're back to square one.
It's exactly this.
If Nvidia doesn't win at the current batch of benchmarks, they'll just push all reviewers to focus exhaustively on this new, suddenly-super-important feature that their newest architecture excels at.

They have enough leverage over reviewers and developers to never let a GeForce FX situation happen again.
 

Thunder 57

Diamond Member
Aug 19, 2007
That the 3060 is still selling for $300, five years after release, is mind-boggling.
Using old and slow GDDR6 and a chip made on Samsung 8nm, Nvidia's (and OEMs') margins on this must be sky-high.

It's probably just the cheapest 9060 XT 16GB around.

And chumps say that "AMD needs to compete on price". Well, you can get a 6600 for cheaper than a 3060. Hell, a 6600 and a 3050 used to be priced the same. Guess which one sold by the boatload?
 

adroc_thurston

Diamond Member
Jul 2, 2023
Here's the problem: Nvidia will then just release some new proprietary feature, like proprietary neural shaders or something, and it will run like crap on AMD. This new feature will suddenly become the most important thing ever, and the narrative will be "sure, the 10090 XT beat the 6090 in outdated games, but in the new hotness the 6090 is 20% faster," and we're back to square one.
None of that matters when AMD can ship 2x the shader core count of NV.
If Nvidia doesn't win at the current batch of benchmarks, they'll just push all reviewers to focus exhaustively on this new, suddenly-super-important feature that their newest architecture excels at.
This only works if AMD plays fair.
They don't have to!