Discussion RDNA4 + CDNA3 Architectures Thread


DisEnchantment

Golden Member
Mar 3, 2017

With the GFX940 patches in full swing since the first week of March, it is looking like MI300 is not far off!
Usually AMD takes around three quarters to get support into LLVM and amdgpu. Lately, since RDNA2, the window in which they push support for new devices has been much reduced, to prevent leaks.
But looking at the flurry of code in LLVM, it is a lot of commits. Maybe the US Govt is starting to prepare the SW environment for El Capitan early (perhaps to avoid a slow bring-up situation like Frontier's, for example).

See here for the GFX940 specific commits
Or Phoronix

There is a lot more if you know whom to follow in the LLVM review chains (before things get merged to GitHub), but I am not going to link AMD employees.
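For anyone wondering what "GFX940 support in LLVM" means in practice: once a target lands, its name becomes a valid processor/offload architecture for the toolchain. A minimal sketch, assuming a ROCm/LLVM build recent enough to carry those patches; the kernel and file name below are purely illustrative, not taken from the linked commits:

```cpp
// Illustrative only: a trivial HIP kernel that a sufficiently new toolchain
// could build for the new target, e.g.
//   hipcc --offload-arch=gfx940 -c saxpy_demo.hip
// or, at the LLVM level, by feeding IR to
//   llc -mtriple=amdgcn-amd-amdhsa -mcpu=gfx940 ...
#include <hip/hip_runtime.h>

__global__ void saxpy_demo(int n, float a, const float* x, float* y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // one element per thread
    if (i < n) y[i] = a * x[i] + y[i];
}
```

If the toolchain predates the patches, the same compile line simply rejects gfx940 as an unknown processor, which is why this kind of commit flurry is a decent bring-up signal.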

I am starting to think MI300 will launch around the same time as Hopper, probably only a couple of months later!
Although I believe Hopper had the problem of not having a host CPU capable of PCIe 5.0 in the very near future, so it might have gotten pushed back a bit until SPR and Genoa arrive later in 2022.
If PVC slips again, I believe MI300 could launch before it :grimacing:

This is nuts; the MI100/200/300 cadence is impressive.


Previous thread on CDNA2 and RDNA3 here

 

Mahboi

Senior member
Apr 4, 2024
GP are stupid and someone with an eternal chip on his shoulder sure knows it. RDNA3 has actually performed okay in spite of being meh in every way.
I'm not going to entertain this back and forth forever, but I'll at least entertain this.

RDNA 3 comes after RDNA 2 and I believe there is a post-generation echo effect. Lots of nerds will pay attention to parts and more or less directly shill them to friends and online, but a ton of non-nerds don't buy or don't pay that much attention right away. However, the time to actually buy parts eventually comes for them, 2, 3, maybe 4 years later.
I assume that one of the reasons for RDNA 3's mild success is that RDNA 2 got really solid press overall. It's not just LTT and some others shilling AMD; it has been a general movement of positivity. RDNA 1 also mildly participated in this.

The other main reason is that, well, duh. Jensen knew there was nothing to fear from RDNA 3 and didn't want to sell Lovelace. He wanted market retention, not market servicing. We got very cut-down cards like the 4070 sold for $600. A 4070 Ti had the gall to sell for $800 while offering as much VRAM as a $350 previous-gen 6700 XT. Even for Nvidia's extreme gimping, this is pushing it.
I have a complete NV drone friend who can't stop shilling them, and even he was disgusted when the $1200 and $1600 prices were announced for the 4080/90.

RDNA 3 has enjoyed a very mild success because Nvidia literally didn't want to compete. That's what really happened here. Had RDNA 3 not been power wrecked, Jensen would've lowered all prices by 20%, released a 4090 Ti (which admittedly may have had serious power problems because of the cable, tee hee) with 10% more CUs active, and that would have been this gen.
NV is behind in what they can cram into a single package; cramming more into a single device has been the #1 focus of AMD R&D ever since Su and Papermaster took over.
I will concede on that.
I feel like people are blowing the "muh chipletzzzzzz" side of AMD out of proportion: if AMD can do it at TSMC, then Intel, Nvidia, Qualcomm, anyone can do it at TSMC. But it is undeniable that they are ahead by at least a few years, and the scalability of it will not be matched by any size of monolithic die.
I think NV can probably, if they hit a brick wall with monolithic, go to chiplets pretty quickly, but they'll definitely be lagging behind in the same way that AMD is lagging behind them in, say, raytracing.

The only thing I find pretty absurd with this claim is that people act as if "Nvidia cannot into chipletzzzz lawl". Friends, it's Nvidia. They can. They're not led by a donkey. And they have way more money than they need to hire away the right people. If Nvidia hasn't started extensively investing in chipletisation yet, it's because they think it's not worth the investment, not because they can't.
Which leads me to believe that the whole "MOAR CHIPLETS MOAAAAR" idea is probably not necessary for NV's plans, or is seen as a wild goose chase that they can afford to wait on. Please remember that several techs are just rolling out in this space. I've heard more than a bit about how UCIe, for example, is just not worth the money Intel has poured into it, while AMD was content with sticking to CXL. However, UCIe will evolve into something better and possibly kick out CXL. That could very, very well be the thinking at Nvidia; I'd even bet some money on it. I think deep in there, between the Grand Priests of DLSS yapping like mad and the chants to the God Emperor Jensen, there is a 5-year-long timetable with the start of chiplets around RDNA 5 to counter whatever AMD is doing.

(oh hey I've finally found again THAT Forrest Norrod interview that I can never find when I need it)
It is a GDDR DC accelerator part and a halo client part, so any risk of flopping has been covered. Things in semicon take time; remember, a year ago the market tanked badly and anybody who invested hard took a big hit. Patience is a virtue, and I have zero doubt that the greater wisdom is not on the side of a guy who cannot help but be envious of what isn't his.
I don't know.
Nvidia took over the top spot when their software/drivers proved to have far more investment behind them than ATI's.
Not that much has changed. Yes, the parts can be killer. But I wouldn't be shocked at all if there is a general worry about the software side. Likely not so strongly for enterprise, I suppose; Uzzi already mentioned that they're quite solid there, but I have some healthy skepticism. They are years behind CUDA in all the fine little details, and they can't catch up by RDNA 5. It'll be the hammer vs the chisel. And the chisel may yet prove more interesting for a lot of tasks. I honestly don't expect AMD to be at full software parity, or near parity, until 2028 or beyond.
In summary, one company wants to take what people do and do it better than anyone else, and the other wants people to do what it wants them to do so nobody else can.
It really is the fundamental difference in psychology between the CEOs. My beef will always be with JHH for being a flawed human being; his company has always been an outward reflection of him, and so I support anybody who wants to bring them back down to earth. That is where I stand.
Yes, a lot of people hate Nvidia because it is the mirror of Jensen, and Jensen loves to be the Great Dictator.
His zealots are much the same: losers who so desperately want to be on the winning team that they will sacrifice all moral and ethical concerns to do so, because that totally never backfires, right?
As someone who understands marketing and mass psychology quite well, I find the way Nvidia creates FOMO and weaponizes their shills into attacking anyone who threatens Nvidia's narrative sickening.
I'm also in that camp.
I just don't rationalise that AMD is somehow doing it right because of it. Jensen is good. Lisa is good too, but I doubt that she'll be as good as him when it comes to biting down hard and taking the fight head-on.
 
Aug 4, 2023
Well yes, the argument we have at the end of the day is whether AMD's approach is the right one.
Opinions get found out eventually; what really matters is the knowledge people bring to the debate.
NV is frankly a very predictable, boring company to analyse, yet so many analysts try to find more depth in them than is really there. A lot like Apple, hehe.
Sure, they have a huge amount of software to look at, and I'm sure some of that will end up being more important than I give it credit for, but the overall picture is quite simple.
The funny part to me is that there are few NV fanboys who have a proper grasp of what AMD is doing, which is frustrating to say the least.
I wish I could say something about Intel, but... I dunno, they are kinda all over the place; outside of their volumes and long-term fab plans (go pure-play already), they are kinda just there for now.

I think we can get back on topic now.
 

Tup3x

Senior member
Dec 31, 2016
I doubt the Navi33 on N6 was supposed to reach much higher clocks than its RDNA2 predecessors on N7, considering the massive jump in clocks those had already brought compared to RDNA1.
That the RX 7600 XT on N6 averages >2.7GHz in games, almost +1GHz over the RX 5700 XT on N7, sounds super impressive to me. According to TSMC, N6 vs N7 should provide only ~7% higher clocks at ISO power.

The big problem is that the N7->N5 transition should have brought at least +15% clocks, and yet N31/N32 gained ~0% over N21/N22.

Now, if RDNA4 solves the clock problems RDNA3 had, plus if it's using N4X, then we should be looking at +15% compounded twice (+15%^2) between RDNA2 and RDNA4.
So if RDNA2 on N7 could average ~2.4GHz with ease, then the RDNA4 chips on N4X should average (not boost) around 3.1GHz.
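Spelling that compounding out, a back-of-the-envelope sketch of just the numbers quoted above (nothing here is a measured or leaked figure):

```cpp
// Back-of-the-envelope: compound the quoted ~+15% per-node-step clock gain twice.
#include <cstdio>

int main() {
    const double rdna2_avg_ghz = 2.4;   // assumed RDNA2-on-N7 average game clock, as quoted above
    const double per_step_gain = 1.15;  // the ~+15% per full node step quoted above
    // Two compounded steps between RDNA2 and an N4X-class RDNA4, i.e. the "+15%^2" above
    const double rdna4_avg_ghz = rdna2_avg_ghz * per_step_gain * per_step_gain;
    std::printf("Projected RDNA4 average game clock: %.2f GHz\n", rdna4_avg_ghz);  // ~3.17 GHz
    return 0;
}
```

Strictly by that arithmetic the projection lands around 3.1-3.2 GHz average, assuming RDNA4 actually recovers the clock scaling RDNA3 left on the table.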




AMD wasn't a quitter in 2008 when they held back on a top-end product and instead launched the cost-focused RV770 cards, which wouldn't compete at the high end against the massive Tesla GT200 with its 512-bit bus. On the contrary, it was a period when AMD gained a lot of marketshare by releasing a product with much better price/performance than the competition.

They weren't quitters when they launched the Polaris family in 2016, which was actually pretty well received and also never had anything to compete at the high end.

And they weren't quitters again when they launched the RDNA1 family in 2019 with their highest-end card at $400, which had nothing to compete with the $1200 RTX 2080 Ti.

Interestingly, save for the R300 / Radeon 9700 era and the crypto-craze anomalies, it's whenever AMD focuses on fewer SKUs without competing at the high end that they've been able to recoup marketshare.



Of course, not competing at the high end brings problems for brand value, product ASPs and raw margins. However, AMD desperately needs to increase dGPU marketshare at the moment. Sitting at 10-15% marketshare probably makes it very hard to be profitable while spending all the money needed to put new chips out there.
The thing is... NVIDIA has reached the point where "graphics card" equals GeForce, and even a "random" Japanese idol knows that. It's hard to compete when the other company has that kind of status. Producing decent cards will not help; they need another R300, and even that is probably just good enough for them to maintain their current market share.
 

SolidQ

Senior member
Jul 13, 2023
So they need to ship 150+ WGPs in halo
That's only one side. On the other side, a lot of people are complaining about:
- the CUDA analog: ROCm needs more improvement.
- the DLSS analog: FSR AI, plus DLDSR/DLAA equivalents.
- fast RT/PT
- the encoder

No, R300 was just bigger.
I remember it; fast shaders were also a reason to get R3x0.
 

adroc_thurston

Platinum Member
Jul 2, 2023
All you need is many WGPs.
That's it.
The shill points are only worth something until AMD bribes the shills.
Yeah, but at launch it was genuinely the biggest-boy GPU ever made, period.
That's just more MALL. Not very useful in itself.
 

gdansk

Platinum Member
Feb 8, 2011
I'd say the bigger question is whether they can get people to buy an expensive AMD GPU. Gotta start somewhere tho.
Pretty sure people bought the 6900XT. And that was (IMO) unreasonably expensive at the time.

If AMD can make something faster in raster, then I think they can get away with more unreasonable prices, even without a shortage.
 

Ajay

Lifer
Jan 8, 2001
I'd say the bigger question is whether they can get people to buy an expensive AMD GPU. Gotta start somewhere tho.
Given Nvidia's pricing, that shouldn't be a problem. They'd need lots of marketing dollars (and a good strategy), best-of-breed hardware kits for all the influencers, and really good day-zero drivers. Interviews with top Radeon brass to pump up the excitement would be nice.
 

linkgoron

Platinum Member
Mar 9, 2005
You're justifying AMD's state on circumstances. I'm looking at causes.
The corpo that takes hard chances in a growing market is winning. The corpo that waits for trends and follows is N°2. That's just how an economy works.
AMD has hardware excellence but is holding back on risky investments. They were always like that, and outside of trying some really fly things on the technical side, they never really went hard on big ventures. It's just not in their culture.
Tons of companies were second or third to a trend and still won big: Facebook vs Myspace, Chrome vs everything, Zoom (created much later than WebEx or Skype), Android vs Windows Mobile, etc.

Let's be real for a second: if Intel hadn't fumbled the ball so hard that they somehow shot it down their own throat, if they hadn't been choking on bloated cores and bloated internal designs for 5+ years, the game wouldn't have changed one bit. Zen would be a cheaper, decent, reasonable and cost-effective arch, and Radeon would be the cheaper GPUs vs GeForce. That's it. Nothing has really changed in their attitude since 2010. They got a huge break with Zen/Zen 2 and onwards, and it is currently their breadwinner in every possible way outside of MI300/AI. Everything else is doing pretty decently, nothing explosive.
AMD's Zen development has been rock solid and great. Much closer to what Nvidia is doing in GPUs than to what AMD is doing in GPUs. Developing a rock-solid foundation for their stack top to bottom. Solid roadmap, solid execution, and nice niche things like Threadripper and interesting experiments like 3D cache. They also had great support and longevity with AM4, and hopefully will have longevity with AM5 as well. They single-handedly broke Intel's HEDT by giving 8 and 16 strong cores to the masses after more than a decade of quad cores.

AMD essentially had the lucky break of Intel getting bloated and careless, and pounced. Since then, everything has been going great for them. But they're not market leader in their heads yet. And certainly not in GPUs.
You're basically describing any small company winning against a big company.
They mostly play it safe, take few risks, and hardly ever invest really heavily.
They gambled on chiplets and TSMC and it worked great; they also created Threadripper and 3D cache, things that didn't really exist before.

You can pinpoint the exact moment the mindshare machine won.
Good old Maxwell: it was certainly impressive, but even it didn't quite deserve the success it got.
See, when AMD got the console market monopoly, the whole PCMR movement began... Coincidence? I really don't think so; NV certainly had a hand in it.
Basically a mountain of compounding factors at this time absolutely massacred AMD beyond that.
Maxwell was really, really great, and could easily have been 15-20% more performant if Nvidia had needed it to be. It destroyed AMD in both performance and especially perf/watt, as everything was clocked low because AMD just didn't have anything in response, so Maxwell also had tons of OC headroom. Some 980 Tis could gain 30% from OC vs the "default" 980 Ti, and the 980 was basically better than anything AMD had for a very long while. AMD couldn't outright beat the 980 Ti until Vega 56/64, which launched two years later and still had roughly on-par power consumption despite a much better node (14nm vs 28nm) and HBM2.
 

jpiniero

Lifer
Oct 1, 2010
AMD's Zen development has been rock solid and great. Much closer to what Nvidia is doing in GPUs than to what AMD is doing in GPUs. Developing a rock-solid foundation for their stack top to bottom. Solid roadmap, solid execution, and nice niche things like Threadripper and interesting experiments like 3D cache. They also had great support and longevity with AM4, and hopefully will have longevity with AM5 as well. They single-handedly broke Intel's HEDT by giving 8 and 16 strong cores to the masses after more than a decade of quad cores.

And yet the client business is barely profitable.