Question Speculation: RDNA2 + CDNA Architectures thread


uzzi38

Platinum Member
Oct 16, 2019
2,565
5,575
146
All die sizes are within 5mm^2. The poster here has been right on some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have backed up. Even still though, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

uzzi38

Platinum Member
Oct 16, 2019
2,565
5,575
146
Hard to believe that. XSX can do 380 billion intersection tests per second. Assuming 20 intersections per ray (this is being very generous and it is very scene dependent), that is already 2X the quoted value of Turing's 10 Gigarays/s.
And we don't even know how Turing's 10 Gigarays was calculated.
At least the values for XSX are more meaningful.
A proverbial Big Navi could be a different beast compared to the XSX. However, AMD did say the gains diminish as the number of shader resources increases.

PS:
I am a noob with computer graphics. I am mainly a Linux compute guy... but I have started reading up on this a bit. In times of a dearth of new info and COVID-19, it is just natural progression :)
The 380 billion intersections figure is a simple one: 4 intersections per CU, 52 CUs, and the clock speed all multiplied together (with the clock converted to Hz).

It's not something that depends on the scene or anything; it's literally the maximum possible throughput and assumes no memory stalls, which is... uh... yeah, I don't even really need to say any more.
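
To put actual numbers on it, here's the back-of-the-envelope maths as a few lines of Python. The only figure not already in this thread is the 1.825GHz Series X GPU clock, which is the publicly quoted spec - and again, this is peak theoretical throughput only:

```python
# Sanity check of the figures above, using the thread's own numbers:
# 4 intersection tests per CU per clock, 52 CUs, and the publicly quoted
# 1.825 GHz Series X GPU clock. This is peak theoretical throughput - it
# assumes every CU issues a test every single clock with no memory stalls.
intersections_per_cu_per_clock = 4
cu_count = 52
gpu_clock_hz = 1.825e9

peak_tests_per_second = intersections_per_cu_per_clock * cu_count * gpu_clock_hz
print(f"{peak_tests_per_second / 1e9:.0f} billion intersection tests/s")  # ~380

# With the (generous, very scene-dependent) guess of ~20 tests per ray from
# earlier in the thread, that peak works out to roughly:
tests_per_ray = 20
print(f"{peak_tests_per_second / tests_per_ray / 1e9:.0f} Gigarays/s "
      f"vs Turing's quoted 10")  # ~19
```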
 

soresu

Platinum Member
Dec 19, 2014
2,617
1,812
136
I won't go in any direction on the topic here. All I can do is to report what has been told to me. There are factors in which it might be... plausible, like "RDNA2's RT capability is 1.5 Times of Turing, per SM". But I'm not technical enough to even speculate on the topic, so I will let more educated people, like you, speculate on what it could mean.
I thought it was closer to 3+ times RT perf per TFLOP going by what was said after the XSX spec reveal?
 

uzzi38

Platinum Member
Oct 16, 2019
2,565
5,575
146
I don't know. As I have said, I am not technical enough to fathom what that number 1.5 times means.
When you first told me I assumed it meant DXR1.0 performance, because realistically that's what people would be testing with right now. I'd heard the Minecraft demo on the Series X actually runs closer to 60FPS but dips below, which would put it close to 2080Ti in Minecraft RTX, and I guess if you scaled that up to 68CUs (same as the 2080Ti) you end up with roughly 1.5x perf?
 
  • Like
Reactions: Tlh97 and Glo.

moinmoin

Diamond Member
Jun 1, 2017
4,934
7,619
136
I'd imagine that there is a lot more invested in the game engines ported to Switch that support NVN than in any nVidia-specific toolchains related to NVN itself - id Tech, UE4, Unity and RED Engine, to name a few.

If those engines are built well for abstraction it should not be such a great chore to switch to another API (heheh pun), perhaps even a Vulkan derivative now that there is an open low level API to build a custom API from.
It's not about the engines and the low level API, it's about the toolchain and the development tools for it. Before the Switch, Nintendo's toolchains and development tools were a mess, especially compared to what Microsoft and Sony are offering.

Nintendo always relied on external companies to deliver development tools, which meant there was close to no continuity and consistency. Nvidia is essentially delivering ready-made hardware (the Tegra X1 in the Switch is unchanged from what was already on offer) as well as the whole toolchain, including development tool integration, all from a single source. I'm honestly not seeing any company that can rival Nvidia on that right now.
 

soresu

Platinum Member
Dec 19, 2014
2,617
1,812
136
It's not about the engines and the low level API, it's about the toolchain and the development tools for it. Before the Switch, Nintendo's toolchains and development tools were a mess, especially compared to what Microsoft and Sony are offering.

Nintendo always relied on external companies to deliver development tools, which meant there was close to no continuity and consistency. Nvidia is essentially delivering ready-made hardware (the Tegra X1 in the Switch is unchanged from what was already on offer) as well as the whole toolchain, including development tool integration, all from a single source. I'm honestly not seeing any company that can rival Nvidia on that right now.
Is this based on actual dev talk, or just your own assertion of what you would expect nVidia to do given their past behaviour with CUDA?

Because Nintendo as a company don't strike me as foolish enough to get into bed with such a walled-garden, lock-in type vendor like nVidia without some sort of agreements and reassurances in place for exactly this sort of problem.

The problems that Microsoft and Sony have had with nVidia over their consoles in the past were well documented, with no small amount of bad blood, and Nintendo have already had their own problems with them over security issues with the original TX1 hardware - it stands to reason that Nintendo's lawyers and higher-ups have an exit strategy in place, unless they took leave of their senses when the contract was being drawn up.

Also, as far as toolchains go, the state of some modern engines like Unreal and Unity is such that they are toolchains (and also pseudo drivers from a Vulkan/D3D12 perspective) unto themselves - so as long as they support the platform, it is already good to go for a good many developers unwilling or unable to develop a game engine from scratch themselves.

The WiiU may have been underpowered compared to the XB1 and PS4, but it was not that big a difference - and the Switch itself is a relatively meagre change from the WiiU in spec anyway. The PPC-based CPU was basically a faster tri-core Wii chip - so any well-experienced Nintendo dev could work on it, and the AMD GPU was practically off the shelf, so again not a huge stretch toolchain-wise to develop for.

The sudden difference in the kind of games found on the Switch vs the WiiU has more to do with a change in policy from Nintendo than toolchains and hardware - they are well aware that their home console business could go the way of Sega if they don't diversify from the mostly first-party, casual gaming model of previous generations, as the dire sales of the WiiU show.
 
Last edited:

moinmoin

Diamond Member
Jun 1, 2017
4,934
7,619
136
Is this based on actual dev talk
It's based on actual dev talk.

I'm no fan of Nvidia (as a Linux user I abhor their way of making every single thing proprietary), and it's indeed completely unlike Nintendo to let themselves into such a dependency (given history like pulling out of the joint venture with Sony that led to the Playstation because the contracts were too much in favor of Sony, etc.). But the partnership is what it is.

I don't think there is such a difference between games on Switch and Wii U for that matter; the former is just massively more popular and as such ends up getting plenty of "ultimate" versions of games from Nintendo, aside from being swamped by good indie games.
 
Mar 11, 2004
23,031
5,495
146
Is this based on actual dev talk, or just your own assertion of what you would expect nVidia to do given their past behaviour with CUDA?

Because Nintendo as a company don't strike me as foolish enough to get into bed with such a walled-garden, lock-in type vendor like nVidia without some sort of agreements and reassurances in place for exactly this sort of problem.

The problems that Microsoft and Sony have had with nVidia over their consoles in the past were well documented, with no small amount of bad blood, and Nintendo have already had their own problems with them over security issues with the original TX1 hardware - it stands to reason that Nintendo's lawyers and higher-ups have an exit strategy in place, unless they took leave of their senses when the contract was being drawn up.

Also, as far as toolchains go, the state of some modern engines like Unreal and Unity is such that they are toolchains (and also pseudo drivers from a Vulkan/D3D12 perspective) unto themselves - so as long as they support the platform, it is already good to go for a good many developers unwilling or unable to develop a game engine from scratch themselves.

The WiiU may have been underpowered compared to the XB1 and PS4, but it was not that big a difference - and the Switch itself is a relatively meagre change from the WiiU in spec anyway. The PPC-based CPU was basically a faster tri-core Wii chip - so any well-experienced Nintendo dev could work on it, and the AMD GPU was practically off the shelf, so again not a huge stretch toolchain-wise to develop for.

The sudden difference in the kind of games found on the Switch vs the WiiU has more to do with a change in policy from Nintendo than toolchains and hardware - they are well aware that their home console business could go the way of Sega if they don't diversify from the mostly first-party, casual gaming model of previous generations, as the dire sales of the WiiU show.

I don't think you realize how woeful Nintendo's previous stuff was; moving to Nvidia is a HUGE improvement for developers. It's been a constant issue, and was likely the major reason why the Wii was basically a doubled-up Gamecube (to maintain consistency and make it easy for developers), and then the Wii U built off of that somewhat, and somewhat off the CPU stuff that the 360/PS3 had, plus a newer GPU. Even so, the Wii U dev situation was a complete mess and was a big reason for the dearth of games. Eurogamer had an article detailing what a mess the situation was (supposedly Nintendo didn't even have a finalized API spec or something when the Wii U launched, so devs had to try and figure things out on their own).

I also don't think it's nearly as much of a walled garden as you think. In some ways it actually opens things up for Nintendo, as games made for the Switch should meet modern API standards, and so they should be able to run on other hardware that also meets those API specs. It's also why there have been so many ports to the Switch: games that were developed for those API specs could run on it (just much reduced). And it instantly meant their hardware had compatibility with the popular game engines. What this means is that Nintendo could move to another ARM SoC that supports similar APIs, which I think a good amount of them do these days. And AMD could potentially get back in by doing what Nvidia did, pairing a small version of their GPU with some ARM cores.

The problems Microsoft and Sony had were mostly due to pricing and contracts. Sony's was their own fault (they stupidly thought they'd be able to use Cell to run graphics, and I think even considered just putting in two Cell chips, but GPUs were much better for graphics, so they had to rush to find a GPU and then had to pay Nvidia a bunch); Microsoft's was because Nvidia controlled the IP, which limited Microsoft's ability to do things like die shrinks to lower costs. Plus Nvidia couldn't offer a CPU solution, whereas AMD could offer both a CPU and GPU solution. It's true that Nintendo had the security issue, but from what I've read they seem to be quite happy otherwise. And Nvidia did very little to really woo them, though I think Nvidia would be willing to make a custom Tegra chip, or at least a more modern one, for them. And I think a lot of that is due to the massive improvement to the development situation.

I've wondered if cloud gaming might have also been in play. I know Nvidia touted those ray-tracing render boxes, and I feel like Nintendo has to be seeing companies moving to subscription models. Plus it would let Nintendo keep the console costs low, while getting more control over the games. Granted, I don't know how much of an advantage, if any, the cloud would have been for Nvidia, as I believe AMD is in Microsoft's (and therefore Sony's as well, since they worked a deal) and Google's streaming. I guess maybe Apple's too, although I don't know what hardware Apple is using. And I'm not sure if Nintendo would be that high on it, considering their bizarre online service behavior in general. But it would provide them an avenue to match or even possibly exceed the other consoles in graphics, while making the buy-in cost much lower since they could offer cheaper consoles (but make more money over time).
 
Last edited:
Mar 11, 2004
23,031
5,495
146
It's not about the engines and the low level API, it's about the toolchain and the development tools for it. Before the Switch, Nintendo's toolchains and development tools were a mess, especially compared to what Microsoft and Sony are offering.

Nintendo always relied on external companies to deliver development tools, which meant there was close to no continuity and consistency. Nvidia is essentially delivering ready-made hardware (the Tegra X1 in the Switch is unchanged from what was already on offer) as well as the whole toolchain, including development tool integration, all from a single source. I'm honestly not seeing any company that can rival Nvidia on that right now.

I'd guess Microsoft would, but that's mostly because they full-on control the API and also the hardware in the Xbox (although I think that translates to PC as well). But Nvidia is certainly up there, and much ahead of other ARM SoCs as far as gaming API support goes. And they're ahead of AMD on the PC side in that regard.
 

soresu

Platinum Member
Dec 19, 2014
2,617
1,812
136
I'd guess Microsoft would, but that's mostly because they full-on control the API and also the hardware in the Xbox (although I think that translates to PC as well).

Only for Surface can they fully control both SW and HW - it's a bit late for them to go the full Apple route across all Windows installations.

Likewise Pixel is Google's Surface - this generation they seem to be asserting much more control over the HW spec than just adding a security chip.
 

DXDiag

Member
Nov 12, 2017
165
121
116
I'd heard the Minecraft demo on the Series X actually runs closer to 60FPS but dips below, which would put it close to 2080Ti in Minecraft RTX,
It's much, much lower than a 2080Ti; it ran anywhere from 30fps to 60fps @1080p, depending on the scene, with much simpler scenes than those tested on Turing. Digital Foundry saw it live at Microsoft HQ.


Here is Turing running Native beta code @1080p from the map "Imagination Island", which is a huge map filled with complex objects:


https://www.youtube.com/watch?v=puPjj4zSLGc
 

Glo.

Diamond Member
Apr 25, 2015
5,661
4,419
136
Can anyone point me to an article that dumbs the topic of Ray Tracing down to my level, and would help me understand which part of what's displayed comes from what on the hardware?

Or should I just assume that it's Witchcraft and not be bothered by this topic at all?
 

DisEnchantment

Golden Member
Mar 3, 2017
1,590
5,722
136
AMD's GPUOpen got a new website and bits of new content, with new effects added to FidelityFX (in addition to CAS): https://gpuopen.com

The interesting bit to me at the moment is SSSR (Stochastic Screen Space Reflections).

Link to github.

High-fidelity reflections in your scene, without costing the earth. SSSR uses your rendered frame to create brilliant reflections. Follow this link below and play with the slider to see the differences
https://gpuopen.com/fidelityfx-sssr
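
For anyone wondering what it actually does under the hood: the core screen-space reflection trick is to march the reflected ray through the depth buffer of the frame you've already rendered and reuse whatever colour you land on. Below is a toy numpy sketch of that idea - purely illustrative, with made-up buffers and ray directions, and nothing to do with AMD's actual FidelityFX implementation:

```python
# Toy sketch of the screen-space reflection idea behind SSSR (NOT AMD's code):
# march a reflected ray through the depth buffer of an already-rendered frame,
# and when the ray passes behind the stored depth, reuse that pixel's colour.
import numpy as np

H, W = 64, 64
depth = np.ones((H, W), dtype=np.float32)       # 1.0 = far plane
color = np.zeros((H, W, 3), dtype=np.float32)

depth[10:16, 20:44] = 0.3                       # a red "object" close to the camera
color[10:16, 20:44] = (1.0, 0.0, 0.0)
depth[40:, :] = 0.5                             # a reflective "floor" further back
color[40:, :] = (0.1, 0.1, 0.1)

def ssr_trace(px, py, step, max_steps=64):
    """March from pixel (px, py) along step = (dx, dy, dz) in screen space.
    Return the colour of the first sample that lands strictly behind the
    depth buffer, or None if the ray leaves the screen without a hit."""
    x, y, z = float(px), float(py), float(depth[py, px])
    for _ in range(max_steps):
        x, y, z = x + step[0], y + step[1], z + step[2]
        ix, iy = int(round(x)), int(round(y))
        if not (0 <= ix < W and 0 <= iy < H):
            return None                         # ray fell off screen: SSR has no data here
        if z > depth[iy, ix]:                   # strictly behind the rendered scene = hit
            return color[iy, ix]                # reuse the already-rendered colour
    return None

# A floor pixel "looking" straight up in screen space finds the red object:
print(ssr_trace(32, 45, step=(0.0, -1.0, 0.0)))  # -> [1. 0. 0.]
```

The "stochastic" part is jittering that reflected ray per pixel according to surface roughness and then denoising the noisy result, which is where most of the cleverness (and cost) in the real effect lives.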

Other new effects are included below.
 

eek2121

Platinum Member
Aug 2, 2005
2,904
3,906
136
It's much, much lower than a 2080Ti; it ran anywhere from 30fps to 60fps @1080p, depending on the scene, with much simpler scenes than those tested on Turing. Digital Foundry saw it live at Microsoft HQ.


Here is Turing running Native beta code @1080p from the map "Imagination Island", which is a huge map filled with complex objects:



Minecraft on the Xbox Series X is targeting 4K60, not 1080p60, FYI. The RTX 2080ti doesn't hold a candle to the GPU found in the new console.

What is more interesting is that people don’t currently realize how much the PS5 is gimped compared to the Xbox for ray tracing. Wider RDNA2 parts will inevitably have higher RT performance than higher clocked ones.
 
  • Like
Reactions: Tlh97

uzzi38

Platinum Member
Oct 16, 2019
2,565
5,575
146
Minecraft on the Xbox Series X is targeting 4K60, not 1080p60, FYI. The RTX 2080ti doesn't hold a candle to the GPU found in the new console.

What is more interesting is that people don’t currently realize how much the PS5 is gimped compared to the Xbox for ray tracing. Wider RDNA2 parts will inevitably have higher RT performance than higher clocked ones.
The fully path traced demo is 1080p60. And I can only tell you what I heard - and that is that it was at 60FPS for the vast majority of the demo, only momentarily dipping to 30FPS in certain spots.

There are supposedly 4K ray-traced demos, but not fully path traced, which I think Minecraft RTX is, right?
 

DXDiag

Member
Nov 12, 2017
165
121
116
Minecraft on the Xbox Series X is targeting 4K60
Nope, it's not even on the map as a scheduled release, it's only an experiment.

The fully path traced demo is 1080p60. And I can only tell you what I heard - and that is that it was at 60FPS for the vast majority of the demo,
Nope to that too, it's between 30 and 60, it was never locked to 60fps, see DF video above.
 

uzzi38

Platinum Member
Oct 16, 2019
2,565
5,575
146
Nope to that too, it's between 30 and 60, it was never locked to 60fps, see DF video above.
Alright, fair enough there. Still very impressive stuff for one dev's work over the course of a month with an unfinished build of the RTX version of the game that eventually released, because that's still ~2080-level performance compared to the finalised RTX version. Which, you know, had additional dev time and dev effort.
 

DXDiag

Member
Nov 12, 2017
165
121
116
Alright, fair enough there. Still very impressive stuff for one dev's work over the course of a month with an unfinished build of the RTX version of the game that eventually released, because that's still ~2080-level performance compared to the finalised RTX version. Which, you know, had additional dev time and dev effort.
That is still way below the 2080: the 2080 achieves a 56fps average in crowded, detailed maps, while the Series X presumably achieved way less than that, in simpler maps.
 

DisEnchantment

Golden Member
Mar 3, 2017
1,590
5,722
136

AMD Radeon PRO VII

Not exactly CDNA (Instinct) or RDNA (RX), but it is a workstation card like the W5700. This segment lags so far behind all the others - is it the certification process?

On the other hand, this looks like a downgraded MI50 which AMD probably could not sell?
I would not be surprised if it goes out of stock really quickly. Preparing for CDNA, perhaps.

 
Last edited:
  • Like
Reactions: uzzi38 and Tlh97
Mar 11, 2004
23,031
5,495
146
Only for Surface can they fully control both SW and HW - it's a bit late for them to go the full Apple route across all Windows installations.

Likewise Pixel is Google's Surface - this generation they seem to be asserting much more control over the HW spec than just adding a security chip.

I don't think you understood what I meant, which is that the tools they offer for development on Xbox translate to the PC side, making ports very easy. Plus they have more ability to dictate APIs on PC than probably anyone.

Not even going to bother with the rest, since none of that is pertinent to the discussion. None of those are gaming platforms, and frankly none of them are being dictated much. Not even Apple, as they're still using others' CPU designs, and while they can dictate some of that, they can't (or perhaps just don't) dictate to the level of console makers.
 

DisEnchantment

Golden Member
Mar 3, 2017
1,590
5,722
136
I would imagine so - don't they use ECC memory? In which case it would definitely take a bit longer.
HBM2 has ECC built in, although it is OK to ignore the ECC checks.

Here is the official release from AMD.

In the ProRender release, HBCC is still a big differentiator; I suppose it is one of the reasons they want to hang on to it.
 
  • Like
Reactions: Tlh97