Discussion: RDNA 5 / UDNA (CDNA Next) speculation


Win2012R2

Golden Member
Dec 5, 2024
Depends on your definition of crap but should be about 60 fps.
60 FPS at 4K (even native) is crap on a 4090, and that's with a 9800X3D, so no CPU bottleneck. How could they get to that level of FPS inside a mine, which isn't even open world? Amazing stuff.

And I'm not even talking about the stuttering etc. that's typical of UE5 games, and this isn't an RT/PT scenario, which I accept runs like crap on any hardware right now.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
Y'all seen this? The Rubin GPU arch is supposedly getting delayed for a "reworking" due to expected competition from AMD's MI400/450 series. Sounds like rubbish to me, but maybe there's something to it.


- What kind of "reworking" is going to happen in a few months?

If it is in fact delayed, I have to assume something is wrong with the arch and it isn't performing like it's supposed to: a respin with bug fixes, not some sort of mythical pressure from AMD.

Edit: Article says "debunked" so nvm on all that.
 

Magras00

Junior Member
Aug 9, 2025
yeah it is lmao.
Unless you have some frametime to spare.
I wasn't talking about making the implementation heavier; additional training = fine wine. Look at DLSS 2 -> DLSS 3.7: 4.5 years of improvements. FSR4's hybrid CNN + ViT and NVIDIA's DLSS TF (likely also CNN + ViT) are still very new and will improve over time.

More noise and more blur? sure why not.

I always wanted 'real-time' lighting implementations to have 240+ frames of accumulation lag.

There are techniques underway to address noise and accumulation lag; look for work coming out of I3D, SIGGRAPH and Eurographics. Better sampling and resampling estimators (ReSTIR derivatives) tackle the grainy visuals quickly, without ray denoisers, at the same ms cost. The Intel technique from the June blog makes standard ReSTIR look like crap.
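
If anyone wants the intuition: the core of ReSTIR is weighted reservoir sampling, where each pixel keeps a single light candidate and probabilistically swaps it as new candidates stream in, so you get the quality of many candidates at the cost of storing one. A toy single-pixel sketch (made-up lights and pdfs, nothing from a real renderer):

```python
import random

class Reservoir:
    """Single-sample weighted reservoir (the core of ReSTIR-style resampling)."""
    def __init__(self):
        self.sample = None  # the surviving candidate
        self.w_sum = 0.0    # running sum of resampling weights
        self.count = 0      # number of candidates seen

    def update(self, candidate, weight):
        self.w_sum += weight
        self.count += 1
        # Keep the new candidate with probability weight / w_sum.
        if self.w_sum > 0.0 and random.random() < weight / self.w_sum:
            self.sample = candidate

def target_pdf(intensity, distance):
    # Toy unshadowed contribution: intensity with inverse-square falloff.
    return intensity / (distance * distance)

def resampled_importance_sample(lights, num_candidates=32):
    """Stream uniform candidate picks through the reservoir; the survivor is
    distributed roughly according to target_pdf, far cheaper than shading
    every candidate."""
    r = Reservoir()
    for _ in range(num_candidates):
        light = random.choice(lights)         # source pdf: 1 / len(lights)
        w = target_pdf(*light) * len(lights)  # target_pdf / source_pdf
        r.update(light, w)
    # A full RIS estimator also carries w_sum / (count * target_pdf(sample))
    # as an unbiasing factor; omitted here for brevity.
    return r.sample

# 99 dim lights and 1 bright one: the survivor is almost always the bright light.
lights = [(0.1, 5.0)] * 99 + [(100.0, 5.0)]
print(resampled_importance_sample(lights))
```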

Who said anything about blur? Last time I checked, DLSS 4 upscaling was applauded by basically everyone in the fuckTAA subreddit and elsewhere as a "TAA deblur filter".

No reason any non NR gfx software techniques can't be employed on PS5/Pro too.

The OG PS5 is limited by RDNA 2's far inferior matrix/ML ops handling relative to RDNA 4 in the PS5 Pro, so any NR techniques employed are likely to be confined to the latter console, increasing its attractiveness prior to the PS6's release, so long as game devs actually take advantage of the hardware feature set.

I was referencing how things will be post-crossgen, with the PS6 as the baseline, but sure, a two-tiered approach is very likely, as in previous crossgen periods.

Maybe a tad too pessimistic regarding most games. If AMD, NVIDIA and Sony want to showcase their features next gen and beyond, then perhaps enhanced features from PC ports can carry over to the PS6 versions of games, so even crossgen-era titles give us a taste of what the PS6 can do, beyond the few first-party games in that period.
 

Win2012R2

Golden Member
Dec 5, 2024
It's laggy as hell lmao, and looks generally pretty blurry on low albedo materials.
Why won't you do a better job and show everyone how it should be done then?

The direction of travel is pretty clear - it's RT, there is nothing else viable.

Indiana looked fine to me on 4090 at 4k, I think I did not even bother upscaling it.

we have like a billion of conetracing schemes for goodnuff GI.

Can RT hardware be used for that too?
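
For context, my rough mental model of those cone tracing schemes: march a widening cone through prefiltered voxel mips, reading coarser levels as the cone gets fatter. A toy sketch, with `sample_mip` as a made-up stand-in for the real 3D texture lookup:

```python
import math

def cone_trace(sample_mip, origin, direction, aperture, max_dist):
    """March one cone through a prefiltered voxel grid, front to back.
    sample_mip(pos, mip) -> (radiance, opacity) stands in for a trilinear
    lookup into a mipmapped 3D texture."""
    radiance, occlusion = 0.0, 0.0
    t = 0.1  # start slightly off the surface to avoid self-occlusion
    while t < max_dist and occlusion < 0.99:
        diameter = 2.0 * t * math.tan(aperture / 2.0)
        mip = max(0.0, math.log2(diameter))  # wider cone -> coarser mip level
        pos = tuple(o + d * t for o, d in zip(origin, direction))
        r, a = sample_mip(pos, mip)
        radiance += (1.0 - occlusion) * a * r   # front-to-back compositing
        occlusion += (1.0 - occlusion) * a
        t += diameter * 0.5  # step size grows with cone width
    return radiance

# Dummy field (uniform faint fog), just to show the call shape:
print(cone_trace(lambda pos, mip: (1.0, 0.05), (0, 0, 0), (0, 0, 1), 0.6, 10.0))
```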
 

adroc_thurston

Diamond Member
Jul 2, 2023
Why won't you do a better job and show everyone how it should be done then?
The trick is not doing it.
The direction of travel is pretty clear - it's RT, there is nothing else viable.
IHVs gotta sell new hardware you know.
Doesn't mean it's a good idea.
Indiana looked fine to me on 4090 at 4k
I think we have different baseline expectations.
Can RT hardware be used for that too?
Nnnnnope.
 

gdansk

Diamond Member
Feb 8, 2011
Yeah, I wholeheartedly agree - you should try what you preach though, like try not being an arsehole for a change? You probably hear it a lot.
In this case he has a point: why force a technology before the hardware is ready? Consider early 3D graphics; I find many of those games painful compared to late 2D games. It was a step back in many ways. Today you really have to like the gameplay to get over it.

I'm not sure if that's the case for garbage RTGI, but if it lags behind the simulated world it could be distracting and effectively no more realistic, because in real life light doesn't lag; it's pretty fast.
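
To put a rough number on that lag: a common accumulation scheme blends each new frame exponentially into a history buffer, and the catch-up time after a lighting change falls straight out of the blend factor. A toy sketch (illustrative only, not any particular game's denoiser):

```python
def frames_to_converge(alpha, threshold=0.95):
    """Frames until an exponential accumulator reaches `threshold` of a step
    change, i.e. how long the lighting takes to catch up after a change."""
    accum, frames = 0.0, 0
    while accum < threshold:
        accum = (1 - alpha) * accum + alpha * 1.0  # light just switched 0 -> 1
        frames += 1
    return frames

for alpha in (0.2, 0.05, 0.0125):
    print(f"alpha={alpha}: ~{frames_to_converge(alpha)} frames to reach 95%")
# alpha=0.2 -> ~14 frames, alpha=0.05 -> ~59, alpha=0.0125 -> ~239
```

So the "240+ frames of accumulation lag" complaint quoted above corresponds to a blend factor around 0.0125, which is roughly four seconds of catch-up at 60 fps.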
 

Saylick

Diamond Member
Sep 10, 2012
You have to start somewhere.
Especially in Nvidia's case, where the only way for them to sustain revenue and market share is to introduce new technologies relatively early in the adoption cycle, because they know it takes a long time to go from introduction to mass adoption. If they didn't start early, they would see a hiccup in sales during the transition period because consumers would have less desire to upgrade, which is a big no-no for them.