Discussion RDNA4 + CDNA3 Architectures Thread

Page 36 - AnandTech Forums

DisEnchantment

Golden Member
Mar 3, 2017
1,599
5,762
136
[attached screenshots]

With the GFX940 patches in full swing since the first week of March, it looks like MI300 is not far off!
AMD usually takes around three quarters to get support into LLVM and amdgpu. Since RDNA2, the window in which they push support for new devices has been much reduced, to prevent leaks.
But looking at the flurry of code in LLVM, that is a lot of commits. Maybe the US Government is starting to prepare the software environment for El Capitan (perhaps to avoid a slow bring-up situation like Frontier's).

See here for the GFX940 specific commits
Or Phoronix

There is a lot more if you know whom to follow in LLVM review chains (before getting merged to github), but I am not going to link AMD employees.

I am starting to think MI300 will launch around the same time as Hopper, probably only a couple of months later!
Although I believe Hopper had the problem of no host CPU capable of PCIe 5 in the very near future, so it might have been pushed back a bit until SPR and Genoa arrive later in 2022.
If PVC slips again, I believe MI300 could launch before it :grimacing:

This is nuts, MI100/200/300 cadence is impressive.

[attached screenshot]

Previous thread on CDNA2 and RDNA3 here

 

Tuna-Fish

Golden Member
Mar 4, 2011
1,345
1,524
136
Aren't we stuck in sort of a classic chicken and egg scenario?

Devs are hesitant to go all in on RTRT because the hardware isn't there, and the hardware side is hesitant to go all in on RTRT thanks to the massive backlog of raster games, so devs just paint RT over their raster effects.

Hardware would have to move first IMO: dedicate a larger proportion of die space to RTRT and shrink the share of raster-only resources aggressively from gen to gen.

Full RTRT GI won't be done (other than maybe a few individual games) until there are consoles that are capable of it. And as soon as there are, that will be the new minimum requirement, first for console ports and then for all games.

Based on the recent FTC leak, MS intends the next Xbox (not the pro one, but the next full revision) to do that, in about 5 years. Until then, RT is used only for fancy "add-on" effects that you can turn off; devs must also ship a working non-RT version.

And the big reason RTRT GI matters is not that it's prettier, but that while you can do pretty things with static lighting, doing so takes more dev work. Once you don't need any of that anymore, you can ship a prettier game with the same dev resources. But you can't rely on it until you can assume the low end has it too, which means consoles must have it.

AMD put the Infinity Cache on the MCDs, which use N6, so they aren't paying leading-edge wafer cost for SRAM that barely scales anyway. I don't know if it's necessary, but using V-Cache on those seems like a good way to increase the cache size and give users a reason to drop extra $$$ on a model with extra bells and whistles.

The design I still really want to see is a base N6 die with cache, memory controllers and other low-speed stuff with a GPU die stacked on top. Might require backside power for the top die, so not until N2P, I think?
 

SolidQ

Senior member
Jul 13, 2023
273
293
96
TVs are 4k bro.
It doesn't matter if you're sitting 2+ meters away; you won't notice the difference between 1440p and 4K. Even a lot of my friends say 4K is a waste and not noticeable if you're sitting 2+ meters away.

In raster? Hell yea it can.
~40-45 fps can?
[attached benchmark screenshots]
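The "2+ meters" claim above can be sanity-checked with simple geometry. A rough sketch, assuming a 55" 16:9 panel and the common ~60 pixels-per-degree figure for 20/20 acuity (both numbers are assumptions, not measurements):

```python
import math

def pixels_per_degree(h_pixels, diagonal_in, distance_m, aspect=16 / 9):
    """Angular pixel density of a flat panel viewed head-on."""
    # Panel width from its diagonal for the given aspect ratio.
    width_m = diagonal_in * 0.0254 * aspect / math.hypot(aspect, 1.0)
    # Horizontal field of view the panel subtends at this distance.
    fov_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
    return h_pixels / fov_deg

# 55" TV viewed from 2 m:
ppd_4k = pixels_per_degree(3840, 55, 2.0)    # ~113 PPD
ppd_1440 = pixels_per_degree(2560, 55, 2.0)  # ~76 PPD
print(f"4K: {ppd_4k:.0f} PPD, 1440p: {ppd_1440:.0f} PPD")
```

Under those assumptions both resolutions land above the ~60 PPD acuity threshold at 2 m, which is consistent with the claim that the difference is hard to see from that far back.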
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
It's not anecdotal, it's real world. If you have a 4K TV at home (avg TV 50-55 inch), you can test it in real time. No need for pixel hunting like DF with 4x zoom etc.
While I agree that people, especially with large TVs, tend to sit too far from them, I can for sure tell the difference between 1080p movies and 4K ones on a 55" TV sitting approx. 3 meters away. But that is a bigger difference than what you were talking about.

In contrast, I sit far closer than 2 meters from my PC monitor; it's more like 60 cm or less. I'm still on 1080p and debating whether to go 4K for text clarity and longevity, or 1440p and save a ton on the GPU side.
 

Tuna-Fish

Golden Member
Mar 4, 2011
1,345
1,524
136
Way too expensive for 500 buck boxes.

It will happen.

That's a pretty bad target in the age of 4k TVs.

1440p isn't the res you'd want to target in 2027.

Nobody gives a damn about resolution, when the tradeoff is getting full RTRT GI. The future is RT lighting at lower resolution and enough FSR/DLSS to make it look good.

To expand: The current consoles could run games at 4k native, if they scaled back visuals to a level where the hardware was capable of it. They don't do that, for the simple reason that pixel quality matters a lot more than pixel quantity.

RT GI is just one further step on that path, with the twist that the primary thing being sought is not just pixel quality, but pixel quality per invested dev resource.
 
Mar 11, 2004
23,050
5,514
146
It will happen.



Nobody gives a damn about resolution, when the tradeoff is getting full RTRT GI. The future is RT lighting at lower resolution and enough FSR/DLSS to make it look good.

To expand: The current consoles could run games at 4k native, if they scaled back visuals to a level where the hardware was capable of it. They don't do that, for the simple reason that pixel quality matters a lot more than pixel quantity.

RT GI is just one further step on that path, with the twist that the primary thing being sought is not just pixel quality, but pixel quality per invested dev resource.

I doubt it happens before game streaming starts eating the market alive. Heck, I bet they switch to just using AI generation even before that happens.

That's not even remotely close to being true. It's the future Nvidia wants, but they keep having to develop more and more cheats to make even the current half-baked RT approach remotely viable, and they're getting more and more pushback because they keep relegating those features to their newest models while pricing them higher and higher. Many games offer options for the gamer to decide, even on consoles now: higher quality at lower frame rates, or higher frame rates at lower quality. Most games have been focusing on smoothness more than either pixel quality or quantity, because that's often what gamers want most.

Yes, people like you have been saying that for years, literally since RTX/RT was foisted upon us. Years later it still hasn't been achieved, and in fact all the extra work needed to compensate for the massive performance loss means more dev work, not less. More and more games are shipping with more and more issues, and a lot of it is because devs are being asked to do more and more, despite the insistence that enabling this stuff is allegedly just checking a box. And if they don't do it, people gripe about supposed anti-competitive practices, while ignoring how it leads to other issues like performance and/or image-quality problems on lower-VRAM cards (because optimizing for lower VRAM takes extra time and effort).
 

Aapje

Golden Member
Mar 21, 2022
1,355
1,821
106
Nobody gives a damn about resolution, when the tradeoff is getting full RTRT GI. The future is RT lighting at lower resolution and enough FSR/DLSS to make it look good.
No, because upscaling works better if you have a higher resolution. What makes most sense is for consoles to aim for at least 4K/60Hz with upscaling. I just checked my local tech site and 49/50 of the most popular TVs have 4K screens.

With FSR 3 or DLSS 3, 60 Hz is still only an input latency of 30 Hz.
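One way to read that claim, as a sketch with assumed numbers: 2x frame generation (FSR 3 / DLSS 3) doubles the frames shown on screen, but the game still samples input once per *rendered* frame, so a generated 60 fps stream keeps roughly the input cadence of native 30 fps:

```python
rendered_fps = 30                 # frames the game actually simulates
displayed_fps = rendered_fps * 2  # after 2x frame generation

# Input is sampled once per rendered frame, so the responsiveness
# floor is the rendered-frame interval, not the displayed one:
input_interval_ms = 1000 / rendered_fps    # ~33.3 ms
display_interval_ms = 1000 / displayed_fps # ~16.7 ms on screen

print(f"{displayed_fps} fps on screen, but input advances every "
      f"{input_interval_ms:.1f} ms (native-{rendered_fps}fps feel)")
```

The exact overhead depends on the interpolator's queueing, so this is a lower bound, not a measurement.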
 

Timorous

Golden Member
Oct 27, 2008
1,595
2,707
136
No, because upscaling works better if you have a higher resolution. What makes most sense is for consoles to aim for at least 4K/60Hz with upscaling. I just checked my local tech site and 49/50 of the most popular TVs have 4K screens.

With FSR 3 or DLSS 3, 60 Hz is still only an input latency of 30 Hz.

And with a TV's frame interpolation you can get that sweet 4K/120 experience....
 

Ajay

Lifer
Jan 8, 2001
15,382
7,821
136
No, because upscaling works better if you have a higher resolution. What makes most sense is for consoles to aim for at least 4K/60Hz with upscaling. I just checked my local tech site and 49/50 of the most popular TVs have 4K screens.

With FSR 3 or DLSS 3, 60 Hz is still only an input latency of 30 Hz.
Do you mean 30 milliseconds or 1/30 seconds?
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
Nice interview. I really want to see what MI300 can actually do, and AI is definitely the way things will evolve in IT and in tech generally.


 

Frenetic Pony

Senior member
May 1, 2012
218
179
116
It doesn't matter if you're sitting 2+ meters away; you won't notice the difference between 1440p and 4K. Even a lot of my friends say 4K is a waste and not noticeable if you're sitting 2+ meters away.

That's just from the fact that most content, video and games, isn't designed for 4k yet.

Human vision is insanely detailed at the center; we'd need 12K screens to match it. But "content" takes so long to catch up to hardware that such a screen would be 10x more pointless than 4K is today. A lot of movies still aren't actually in 4K (there's an entire pipeline of "intermediates" that all needs to be 4K, blah blah blah), and rendering games at 4K + TAA comes appreciably close to matching movie image quality at 1080p. So sure, you'd need a 4090 to run high-quality 1080p today.
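The "12K" figure roughly checks out under common, assumed round numbers: peak foveal acuity is often quoted near 60 cycles/degree, i.e. about 120 pixels/degree after the Nyquist factor of two, and a large screen viewed up close fills on the order of 100 degrees horizontally:

```python
# Assumed round numbers, not measurements:
peak_acuity_ppd = 120     # ~60 cycles/deg foveal limit, x2 for Nyquist
horizontal_fov_deg = 100  # a big screen filling most of your view

required_h_pixels = peak_acuity_ppd * horizontal_fov_deg
print(required_h_pixels)  # 12000, i.e. roughly a "12K" panel
```

This assumes peak acuity everywhere on the screen; outside the fovea the eye resolves far less, which is exactly what foveated rendering exploits.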

Maybe on the PS6 in docked mode 4k will look pretty good, if "docked mode" can run 90w. Or maybe the Xbox 1080 X will run double a 4090 at 400w or something.