fastandfurious6
Senior member
- Jun 1, 2024
> RDNA5 must be a really good architecture
This sentence awakens a sudden clarity in me... third eye wide open 👁️ 👁️ 👁️
> RDNA5 must be a really good architecture

How the hell do I ship this much shit with a 30% shrink?
> Really 288 SMs? That's 1.53x the SM count on the RTX 6000 Pro

I suspect Nvidia will be cutting L2 cache for their dies in a similar way to AMD; with 36 Gbps GDDR7 there is little need for as much L2 cache as on Lovelace & Blackwell. I also feel RTX 60 is going to be on Samsung's SF2X (AFAIK on par with TSMC's N3 nodes, which are around 1.6x the logic density of N5), given rumours of Nvidia talking to them:
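For what it's worth, the 1.53x figure is easy to sanity-check. A quick sketch, assuming the commonly cited 188-SM count for the RTX 6000 Pro (a number the post itself doesn't state):

```python
# Sanity-check the quoted ratio. Both figures are rumour/spec-sheet
# values: 288 SMs is the rumoured top-die count, and 188 SMs is the
# commonly cited RTX 6000 Pro (Blackwell) count (my assumption).
rumoured_top_die_sms = 288
rtx_6000_pro_sms = 188

ratio = rumoured_top_die_sms / rtx_6000_pro_sms
print(f"{ratio:.2f}x")  # -> 1.53x
```

So the "1.53x" in the quote is consistent with 288 SMs against a 188-SM Blackwell flagship.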
> AT0 XL is a gaming card. If we're to believe the list that first named the AT0 codename, then we should believe what the list says.

I think the list ain't final; AMD could do different SKUs with AT0, AT2, etc. I mean, the RTX 5090 spec was going to be 160 SMs on a 448-bit bus until Nvidia boosted that up to 170 SMs & a 512-bit bus.
> I mean I feel RTX 60 is going to be on Samsung's SF2X (AFAIK on par with TSMC's N3 nodes, which are around 1.6x the logic density of N5), given rumours of Nvidia talking to them:
> [News] Samsung Reportedly in Talks with NVIDIA, Broadcom to Ship Custom HBM4 in 1H26 | TrendForce (www.trendforce.com)
> I think if nVidia does go back to Samsung, it would be for a dual-source strategy where the low-end parts are fabbed at Samsung.

No one's going back to Samsung since Samsung can't yield.
> No one's going back to Samsung since Samsung can't yield.

Maybe if it's small enough?
> Maybe if it's small enough?

When's the last time a small enough GPU was relevant?
> Maybe if it's small enough?

Why bother engaging a different PDK to do a tiny ass irrelevant part?
> Why bother engaging a different PDK to do a tiny ass irrelevant part?

They sell a boat load of those on mobile.
> They sell a boat load of those on mobile.

Just as fine on a TSMC node. Better, even.
> Why bother engaging a different PDK to do a tiny ass irrelevant part?

Did you tell Nvidia they made a mistake in engaging with Samsung to make the tiny ass Pascal GPUs while the bigger ones were on TSMC?
> Did you tell Nvidia they made a mistake in engaging with Samsung to make the tiny ass Pascal GPUs while the bigger ones were on TSMC?

That was way back, when design costs were cheaper and cycles shorter.
> If they are truly targeting the RTX6090 with a 154CU 384b 380w part

How likely is this SKU to release?
Ask Lisa
> Ask Lisa

So AT0 does really exist? That said, why is there no AT1 documented? Were AT0 & AT2 chosen as examples of what they're developing? And really, what Nvidia may do for their top die.
> I may write out a diagram of what I think RDNA 5's ATx dies & various client dies

Put an AMD logo on it and email it to MLID to see if he makes another video confirming his leak...
So I may write out a diagram of what I think RDNA 5's ATx dies & various client dies may look like, to explain it to as many people as possible, then scan it and upload it here and elsewhere. Just so it gives a speculative but educated view of how these ATx dies & various client dies may be combined (think a many-to-many relationship as seen in databases; yes, that's my old uni computer-science database course being of some use lol).
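The many-to-many idea the post mentions maps onto a small data structure quite directly. A minimal Python sketch, where every die name and pairing below is invented for illustration and not taken from any leak:

```python
# Hypothetical many-to-many mapping between RDNA 5 "ATx" dies and
# client products, like a junction table in a relational database.
# All pairings here are made-up examples, not leaked configurations.
product_to_dies = {
    "halo desktop card": ["AT0"],
    "midrange desktop card": ["AT2"],
    "premium laptop": ["AT2"],
    "mainstream laptop": ["AT3"],
}

# Invert the relation: one die can serve several products, and (in a
# chiplet world) one product could in principle combine several dies.
die_to_products = {}
for product, dies in product_to_dies.items():
    for die in dies:
        die_to_products.setdefault(die, []).append(product)

print(die_to_products["AT2"])  # -> ['midrange desktop card', 'premium laptop']
```

The point of the diagram would be exactly this second mapping: reading off, for each die, every product it might end up in.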
> So
> 48 CU AT3 xt 12gb = 16gb n48 xtx == $450?
> 40? CU AT3 xl 12gb = 16gb n48 xt == $400??
> 32?? CU AT3 le 9gb (or 16gb??) = 12gb n48 xl == $350???
> Does that mean the n44 xt 16gb (9060xt) will have a very long run at a price around $300 ????
> (& n33 below at $250 ? )

IDK, I don't think AMD will do more than 2 SKUs for AT3 on desktop dGPU. Like 48 and maybe 36-40 CUs. And maybe instead of N3P they'll fab it on N3C to keep the costs down, so 48 CU, 12GB = $300 WW (US tariffs make everything harder to estimate) and 40-36 CU, 9GB = $225 WW.
> Put AMD logo on it and email to MLID to see if he makes another video confirming his leak...

Lol, he could have at least put the effort in by recreating a clean chart himself rather than just pushing out that low-res image with his stupid banners all over it.
I know this may be against forum niceness etc., but sometimes it feels like an extra reaction icon (for example, "stupidity") underneath a post wouldn't be inappropriate. Excuse my off-topic (and this doesn't refer to the post right above).