Discussion RDNA 5 / UDNA (CDNA Next) speculation


adroc_thurston

Diamond Member
Jul 2, 2023
7,812
10,530
106

RnR_au

Platinum Member
Jun 6, 2021
2,767
6,264
136
And it could get even worse...
NVIDIA is rumored to be changing its approach by supplying only the raw silicon to its AIC partners, rather than bundling its usual GPU and memory kit. Typically, GPU manufacturers like NVIDIA, AMD, and Intel provide their GPU dies with GDDR memory to AIC partners as a kit. These partners then solder the components onto their custom PCBs, often modifying the layout and cooling systems. However, according to Golden Pig Upgrade, NVIDIA might halt this practice and supply only the bare silicon dies due to memory shortages affecting the company's ability to meet orders.
From https://www.techpowerup.com/343363/nvidia-may-stop-bundling-memory-with-gpu-kits-amid-gddr-shortage

Small players might not have a lot of pull to get VRAM. They might exit or skip a gen. And even if they could get VRAM, it would be very, very expensive.

Buy now and hold for a few years.
 

soresu

Diamond Member
Dec 19, 2014
4,187
3,662
136
It's not a question of it being about anything so much as just talking.

Channels like RGT, Gamer Meld, MLID etc. are all guilty of the same subject-retreading thing for content production's sake.

More videos = more engagement = more ad clicks.

If they don't have actual new stuff to talk about, they just spitball some speculative subject matter that allows them to go back over everything they already talked about.
 

marees

Platinum Member
Apr 28, 2024
2,016
2,641
96
Computex 2027 — RGT

RDNA 5 could be launched at Computex 2027 — RGT rumor

Computex 2027 is possibly the new date for the RDNA 5 launch. Computex is an annual tech conference and one of the biggest of the calendar year; at Computex 2025, AMD announced the RX 9060 XT, launching it a couple of weeks later. We're accustomed to seeing brand-new generations launch earlier in the year, often at CES, but memory delays may well push RDNA 5 back. The dates for Computex 2027 have yet to be announced, but it usually takes place in late May or early June.

 

marees

Platinum Member
Apr 28, 2024
2,016
2,641
96

RDNA 5 could be launched at Computex 2027 — RGT rumor

Computex 2027 is possibly the new date for the RDNA 5 launch. Computex is an annual tech conference and one of the biggest of the calendar year; at Computex 2025, AMD announced the RX 9060 XT, launching it a couple of weeks later. We're accustomed to seeing brand-new generations launch earlier in the year, often at CES, but memory delays may well push RDNA 5 back. The dates for Computex 2027 have yet to be announced, but it usually takes place in late May or early June.

Although, going by precedent, AMD waits for Nvidia to go first. RGT thinks Nvidia won't be ready until H2 2027 (or even Q1 2028).
 

marees

Platinum Member
Apr 28, 2024
2,016
2,641
96
Although, going by precedent, AMD waits for Nvidia to go first. RGT thinks Nvidia won't be ready until H2 2027 (or even Q1 2028).
If AMD went by common sense, they would launch the LPDDR5X/LPDDR6-based AT3 (60 XT) & AT4 (50 XT) first, before the GDDR7-based AT2 (70 XT).
 

Tigerick

Senior member
Apr 1, 2022
911
829
106
Updated RDNA5 Lineup Speculation

View attachment 134031

  • As explained in the SWV thread, RDNA5 will get double the SPs per CU. Thus AT2 with a max of 70 CUs would be 140 CUs in the old format. That explains the ~20% faster performance than the RTX 4080. It also means the AT2 GPU is severely bound by memory bandwidth.
  • Therefore, AMD does not need to clock as high as RDNA4; I am expecting 2 GHz+, not ~3 GHz. It also means AT2 has headroom to grow. That's why I suspect AMD is reserving an XTX model with the full 70 CU die for future 40 Gbps, 4 GB GDDR7 dies to appear. That explains the cancellation of AT1, because AT2 XTX is good enough to compete with the upcoming Rubin-70Ti with 24 GB on a 256-bit memory bus.
  • There are leaks saying RDNA5 dGPUs will be released in Q2 next year. I am actually expecting an early announcement in Q1. Thus, the cancellation of the RTX 50 Super makes sense, because it would be a bloodbath for NV :p: no amount of overclocking will save the RTX 50 Super series. NV needs to speed up the release of Rubin dGPUs. If Rubin dGPUs are indeed fabbed on 3N (variant of 3X), then the earliest release date would be Q3 next year. That gives AMD an early head start in the next-gen dGPU war.
  • That's why I am predicting AMD will set a higher price point for the AT2-70XT and AT2-70. AMD will keep selling the RX 9070 XT until NV is able to launch Rubin-60.
  • AMD will most likely keep selling N48 in the form of a 9070 GRE by then. And no, AT3 and AT4 are NOT for the cheap dGPU lineup, period. Now that we know Medusa will have XDNA3, where do you think the NPU will reside in AT3, huh? ;)
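A quick sketch of the back-of-envelope math behind the doubled-SP rumor above. All inputs are speculative: 128 SPs per CU (double the usual 64), one FMA (2 FLOPs) per SP per clock, and a hypothetical ~2.5 GHz clock sitting between the 2 GHz+ and ~3 GHz figures discussed here.

```python
# Peak FP32 throughput under the rumored doubled-SP-per-CU config.
# Assumptions (NOT confirmed specs): 128 SPs/CU, 2 FLOPs per SP per
# clock (one fused multiply-add), ~2.5 GHz hypothetical game clock.

def fp32_tflops(cus: int, sp_per_cu: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS = CUs x SPs/CU x 2 FLOPs (FMA) x clock (GHz)."""
    return cus * sp_per_cu * 2 * clock_ghz / 1000

# Full 70 CU AT2 die at the doubled SP count:
print(fp32_tflops(70, 128, 2.5))  # 44.8 (TFLOPS)
```

Swap in your own clock guess; the point is just that 70 "new" CUs at doubled SPs lands in the same ballpark as 140 legacy CUs.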

Hoho, so much headroom for the RDNA5 lineup: SK Hynix has presented a 48 Gbps GDDR7 memory die. :cool: My assumption of 40 Gbps for the AT2-XTX is already far behind. Anyone want to calculate FP32 with the updated memory BW? That explains why AMD doesn't use the full AT2 die as the XT: AMD knows the GDDR7 roadmap ahead of us.

And I hope you guys know about the N3P improvement over N4P. AMD is not retreating from the high-end GPU market with the RDNA4 generation. AMD knows that in order to feed all that memory bandwidth, N4P would require much bigger die area and power/TDP. NV's meh Blackwell performance proves it. That's why AMD is waiting for the N3P node and next-gen GDDR7 to be ready...

As for AT3:
AT2-GRE (48 CU) - 160-bit GDDR7 (36 Gbps - 720 GB/s) vs
AT3 (48 CU) - 384-bit LPDDR6 (14.4 Gbps - 546 GB/s) :p

I am using the fastest and widest LPDDR6 versus the slowest and narrowest GDDR7; it is up to you to decide which one is more economical and faster. I am done explaining; if you still don't get it, then wait for the official announcement, which might happen sooner than you think. ;)
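For anyone checking the comparison above, the standard formula is bus width (bits) x per-pin data rate (Gbps) / 8. The configurations below are the rumored ones from this post, not confirmed specs; note that simple pin-count math gives ~691 GB/s for the LPDDR6 case, so the 546 GB/s figure presumably reflects a lower effective rate or narrower usable width (LPDDR6 is organized in 24-bit channels, so the naive formula is only a rough guide).

```python
# Raw memory bandwidth: bus width in bits x per-pin rate in Gbps,
# divided by 8 bits per byte, gives GB/s. Configs below are rumors.

def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s for a given bus width and pin speed."""
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gbs(160, 36.0))  # 720.0 -> matches the AT2-GRE figure
print(bandwidth_gbs(384, 14.4))  # 691.2 raw; post quotes 546 GB/s effective
```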
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
16,989
7,395
136
I was thinking about this... if the PS6 is really using 160-bit, it's most likely 10x2 GB clamshell... and will switch to 5x4 GB chips when they become available. I have serious doubts that anything higher than 4 GB will end up being available for GDDR7.
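The capacity math behind that clamshell guess, as a minimal sketch. Assumptions: each GDDR7 device has a 32-bit interface, and clamshell mode hangs two devices off one 32-bit channel (16 bits each), doubling device count without widening the bus.

```python
# VRAM capacity from bus width and chip density. Assumes 32-bit GDDR7
# devices; clamshell doubles the device count on the same bus width.

def vram_gb(bus_bits: int, chip_gb: int, clamshell: bool = False) -> int:
    """Total VRAM in GB for a given bus, chip density, and mode."""
    chips = bus_bits // 32 * (2 if clamshell else 1)
    return chips * chip_gb

print(vram_gb(160, 2, clamshell=True))  # 20 -> 10x2 GB clamshell
print(vram_gb(160, 4))                  # 20 -> 5x4 GB single-sided
```

Either route lands at 20 GB on a 160-bit bus, which is why the 4 GB chips would be a drop-in capacity swap.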
 
  • Like
Reactions: marees

CakeMonster

Golden Member
Nov 22, 2012
1,644
821
136
30 GB would be nice. Imagine 24 GB at the tail end of the next gen, around 2035; that would probably be a miserable experience. Even if you can do AI texture magic, you will probably also have lots of ML/AI models that need to be loaded in memory.

(And I know we've been through this before in this thread, but I guess I'm not super optimistic that we'll be able to migrate to ideal VRAM lightweight AI powered engines and routines so quickly)
 
Last edited: