Question Speculation: RDNA2 + CDNA Architectures thread

Page 93

uzzi38

Platinum Member
Oct 16, 2019
2,635
5,983
146
All die sizes are within 5mm^2. The poster here has been right on some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have backed up. Even still, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
It's not about being magical or Nvidia convincing the public that it needs it, it's about LOGIC. Think about it: I have a 4K monitor, and the extra resolution helps me work, but when I'm gaming... I can't play at 4K with an RX570. So in order to avoid changing resolution or running games fullscreen, I had to settle for 1440p as my desktop resolution, and yes, I'm playing at 1440p with an RX570 in borderless mode; for 1080p I'm forced to go fullscreen. In an ideal world all games would have different UI and render resolutions, but this is not an ideal world.

If the RX570 had DLSS I would be able to use 4K desktop resolution and play at 1080p upscaled. This also helps people wanting to play at very high FPS on monitors with resolutions higher than 1080p.
It is still an issue because games have to support it, but it's something.

What's important here is: since AMD is in the consoles, whatever system they implement will probably end up being the widely used option.

AMD has had image upscaling for years. Why not use it in this case? They even have RIS (Radeon Image Sharpening) to clean up the upscaled image, and it objectively looks better than DLSS in some cases, but without the performance hit that DLSS has. Your situation is the exact use case for using upscaling.
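Roughly, the upscale-then-sharpen idea looks like this. This is a toy sketch only: nearest-neighbour upscaling and a box-blur unsharp mask stand in for the real scaler and for RIS (which is reportedly based on contrast-adaptive sharpening); none of the functions or numbers come from AMD.

```python
import numpy as np

def upscale_nearest(img, factor):
    """Nearest-neighbour upscale: repeat each pixel `factor` times per axis."""
    return np.repeat(np.repeat(img, factor, axis=0), factor, axis=1)

def unsharp_mask(img, amount=0.5):
    """Sharpen by adding back the detail lost to a 3x3 box blur."""
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    blurred = sum(padded[y:y + h, x:x + w]
                  for y in range(3) for x in range(3)) / 9.0
    return np.clip(img + amount * (img - blurred), 0.0, 1.0)

# Render low, upscale to the display resolution, then sharpen the result.
low_res = np.random.default_rng(0).random((4, 4))
display = unsharp_mask(upscale_nearest(low_res, 2))
```

The point of the sharpening pass is only to boost local contrast lost in the upscale; it cannot recover detail that was never rendered.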
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
You are wrong in two respects. First, DLSS can improve IQ - there are lots of examples published by reviewers. Second, none of the upscaling solutions offered by AMD comes anywhere close to DLSS (2.0) as far as IQ is concerned.
I would not even consider buying an AMD solution unless they have an upscaling technology comparable to DLSS - it is just that much of a game changer in my opinion.

You can never improve original art. Sure, you can change it. And some people may like the new look. But that doesn't mean it is better.

If somebody 'reworked' the Mona Lisa, there would be a minority of people that would claim up and down that the new version was better. But that doesn't mean it would be.
 

Konan

Senior member
Jul 28, 2017
360
291
106
IMO

AMD will need a comprehensive solution to compete against NV in RT games. The jury is still out until the reviewers test against it, which is going to happen. DX12U will be on XSX but not PS5.
Additionally, a DLSS competitor will also be needed. What AI training is there? Will there be a competing solution? Hopefully...

“I think ray tracing is an important technology, and it’s something we’re working on as well, both from a hardware and software standpoint,” Su said
Later, Su expanded on her thought. “I don’t think we should say that we are ‘waiting,’” Su said, in response to this reporter’s question. “We are deep in development, and that development is concurrent between hardware and software.”
“The consumer doesn’t see a lot of benefit today because the other parts of the ecosystem are not ready,” Su added.
 

Zstream

Diamond Member
Oct 24, 2005
3,396
277
136
I'm not sure what you gents are complaining about. Radeon Image Sharpening is a pretty dang good tool, just like DLSS is. Anyone saying DLSS is bad is just picking a side and arguing for the sake of it.

I'm sure it would be easy to implement RIS 2.0 (whatever that may be). https://www.techspot.com/article/1873-radeon-image-sharpening-vs-nvidia-dlss/#allcomments

It's understood that purists are amongst us, but something needs to change. I just wish PhysX were usable on the GPU front for AMD, and impacted gameplay. That to me is a better use and function versus an RT core, or dedicated hardware set aside to do RT.
 
  • Like
Reactions: beginner99

Geranium

Member
Apr 22, 2020
83
101
61
You can never improve original art. Sure, you can change it. And some people may like the new look. But that doesn't mean it is better.

If somebody 'reworked' the Mona Lisa, there would be a minority of people that would claim up and down that the new version was better. But that doesn't mean it would be.
When you compare something with TAA, that something is always better than TAA. Here that something is DLSS.
If I am not wrong, the games that support DLSS only have TAA or a poor implementation of some other AA. In that way DLSS will automatically look better than native resolution.
 

nurturedhate

Golden Member
Aug 27, 2011
1,743
677
136
If somebody 'reworked' the Mona Lisa, there would be a minority of people that would claim up and down that the new version was better. But that doesn't mean it would be.

No, it's more like someone took a picture of the Mona Lisa with an old camera phone at low res, used a Snapchat filter to add cat ears and glitter, printed it out, hung it next to the original, and then started screaming at everyone who didn't believe it was better than the original...
 
Last edited:

stebler

Member
Sep 10, 2020
25
75
61
Somebody asked and you answered this already on Reddit, but I'd like to second the request to read this data from earlier macOS versions to compare how much has changed, if at all.

Update:

I checked on macOS 10.15.1, which was released in October 2019.

It contains discovery firmware for Navi 14, Navi 12, Navi 21, Navi 21 Lite and Van Gogh.

For Navi 14 and Navi 12, they are identical to the ones from macOS 11 beta 6.

Navi 21 and Van Gogh use an older format (the same as the Navi 1x family) for the gc_info structure, but otherwise the values are the same.

Navi 21 Lite (Xbox Series X) has 56 CUs and 20 tccs as expected. I will add it to the table in the Reddit post.

I also want to reiterate that the powerplay tables are identical between macOS and Linux.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
I'm not sure what you gents are complaining about. Radeon Image Sharpening is a pretty dang good tool, just like DLSS is. Anyone saying DLSS is bad is just picking a side and arguing for the sake of it.

I'm sure it would be easy to implement RIS 2.0 (whatever that may be). https://www.techspot.com/article/1873-radeon-image-sharpening-vs-nvidia-dlss/#allcomments

For some values of 'easy', that is. If you look at what DLSS is doing, there's a huge amount going on.

Most of the underlying deep learning research is public domain of course, but it's probably not trivial in practice to apply it right. It took NV a fair while to get it right, and they're doing lots of DL these days.

I'm not sure if AMD could easily replicate the back-end training work themselves. Microsoft could definitely have done it, you'd think.

The other potential barrier is that it simply doesn't make sense from a performance point of view to use DLSS unless you have the tensor cores in your GPU architecture. There's a non-trivial cost to including those, of course.

NV has them in to sell their cards for various deep learning markets, so in some ways DLSS is them compensating for that. No point in a console, of course, and that's where RDNA2 is coming from.

It's understood that purist are amongst us, but something needs to change. I just wish Physx was usable on the GPU front for AMD, and impacted gameplay. That to me is a better use and function verse a RT core, or dedicated hardware set to do RT.

The RT stuff could get somewhat annoying in the next few years. The consoles' RT performance is going to set the basic level of effects most games see, and you have to suspect those will probably be mostly superficial.

Oh well :)
 

ModEl4

Member
Oct 14, 2019
71
33
61
I think anyone saying that an AI-enhanced lower-res image can't be perceived as being as good as a higher-resolution image is fighting the inevitable. AI-assisted image improvement is the way forward, and Nvidia's is just the first tiny step in a marathon. In less than 4 years from now, we will probably have the first examples of a 4K AI-assisted image with unquestionably better perceivable image quality than an 8K version in certain crucial key aspects, for the majority of people comparing them.

Regarding DLSS, maybe it is not perfect, but it is certainly years ahead of what the competition is offering with Navi1. Already, if you check an implementation like the one in Control, you can see that QHD DLSS (4K quality) can have certain advantages in relation to 4K; check the Digital Foundry video. For me, the clarity and the preservation of detail in the faces, for example, heavily outweighs the faint ringing artifact. (Of course it is a matter of taste, and depends on the TAA implementation, etc.) Read the article at Coreteks; although a little anti-Nvidia ('Raja will save us all', etc.), it is a very interesting read.

EDIT : spelling corrections
 
Last edited:

CakeMonster

Golden Member
Nov 22, 2012
1,392
501
136
I played Control in 4K with DLSS (changed monitors around) and going back to 1600p and native was a huge relief.

I'm very sensitive to sharpening and it really bothered me. If sharpening can't be disabled with DLSS in the future, I'm out; there's no way I'll be using it, for as long as I can hold off.

Also, DLSS messed up small text (signs etc.), where the resolution would fool you into thinking it would be discernible, but it's just a sharpened mess. It's minor, but not pretty.

Thirdly, on high-contrast edges DLSS approaches an MSAA-like effect both with regard to resolution and lack of staircasing. However, with low contrast it's still not very aliased, but it's a low-resolution blur and does not look 4K. The huge grey column in the Foundation DLC against a grey background has very blurred edges compared to native.

The film grain effect (for those who like that) is pretty much ruined with DLSS too.

Edit: A good experiment that I recommend everyone do (while standing still in one place) is to turn DLSS resolution way down to study what it does, then turn it gradually back up toward native and compare the effects, then lastly turn it off at native.
 
Last edited:

DJinPrime

Member
Sep 9, 2020
87
89
51
No, it more like someone took a picture of the Mona Lisa w/ an old camera phone at low res, used a snapchat filter to add cat ears and glitter, printed it out, hung it next to the original and then started screaming at everyone who didn't believe the it was better than the original...
Your analogy is so far off base though; DLSS is nowhere near that bad in either changing the quality or changing what the image is. And Stuka87, taking your analogy of the Mona Lisa: the reason people are excited about DLSS is that it allows them to enjoy higher quality settings and framerates (the Mona Lisa) without having the hardware (money; there's only one Mona Lisa), with a minor image quality trade-off (buying a print of the Mona Lisa and hanging it in your home). Everyone knows it's a print, but it still looks really nice and can be enjoyed at home. And I wonder what monitors you guys run, because if you don't have an HDR1000-capable one, you have no idea what visual qualities you're missing out on.
 
  • Haha
Reactions: pcp7

insertcarehere

Senior member
Jan 17, 2013
639
607
136
No, it's more like someone took a picture of the Mona Lisa with an old camera phone at low res, used a Snapchat filter to add cat ears and glitter, printed it out, hung it next to the original, and then started screaming at everyone who didn't believe it was better than the original...

I take it you believe the only worthwhile AA is SSAA? Because just about every other AA method (especially post-processing AA) also "reworks" images in a somewhat similar fashion, one that arguably does not represent the native render in a "perfect" manner.
 

nurturedhate

Golden Member
Aug 27, 2011
1,743
677
136
Your analogy is so far off base though; DLSS is nowhere near that bad in either changing the quality or changing what the image is. And Stuka87, taking your analogy of the Mona Lisa: the reason people are excited about DLSS is that it allows them to enjoy higher quality settings and framerates (the Mona Lisa) without having the hardware (money; there's only one Mona Lisa), with a minor image quality trade-off (buying a print of the Mona Lisa and hanging it in your home). Everyone knows it's a print, but it still looks really nice and can be enjoyed at home. And I wonder what monitors you guys run, because if you don't have an HDR1000-capable one, you have no idea what visual qualities you're missing out on.
I take it you believe the only worthwhile AA is SSAA? Because just about every other AA method (especially post-processing AA) also "reworks" images in a somewhat similar fashion, one that arguably does not represent the native render in a "perfect" manner.
My issue is with the presentation, not the tool. As a tool, an option, it's perfectly viable. People saying it produces something more accurate than the source, though...
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
My issue is with the presentation, not the tool. As a tool, an option, it's perfectly viable. People saying it produces something more accurate than the source, though...

Well, pedantically it can, because the source images they train on are very high resolution.

Obviously, most often they mean better than standard anti-aliasing.
 

Shivansps

Diamond Member
Sep 11, 2013
3,855
1,518
136
AMD has had image upscaling for years. Why not use it in this case? They even have RIS (Radeon Image Sharpening) to clean up the upscaled image, and it objectively looks better than DLSS in some cases, but without the performance hit that DLSS has. Your situation is the exact use case for using upscaling.

Radeon Image Sharpening does not work like that; it reduces performance, and it only makes sense to use it in a game that allows you to reduce the rendering resolution. That's the only thing it does: try to improve visual quality at the cost of FPS.

BUT, AMD also has GPU upscaling. What it does is, if a game is outputting 1080p fullscreen, it upscales it to the native res, and you can use that along with RIS.
But don't even try to use 1080p native and upscale it to 4K with that; it is a huge no-no. Not to mention that games running at 1440p will look worse and run slower, as you can't set GPU upscaling per game. On top of that you need to use fullscreen in every game.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Radeon Image Sharpening does not work like that; it reduces performance, and it only makes sense to use it in a game that allows you to reduce the rendering resolution. That's the only thing it does: try to improve visual quality at the cost of FPS.

BUT, AMD also has GPU upscaling. What it does is, if a game is outputting 1080p fullscreen, it upscales it to the native res, and you can use that along with RIS.
But don't even try to use 1080p native and upscale it to 4K with that; it is a huge no-no. Not to mention that games running at 1440p will look worse and run slower, as you can't set GPU upscaling per game. On top of that you need to use fullscreen in every game.

DLSS causes a 15-20% performance hit (over just upscaling). RIS is within the margin of error according to most tests. In some cases it hits 5%, depending on the scene and the amount of upscaling.
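For concreteness, here is the frame-time arithmetic behind those percentages, under the assumption (the quoted figures don't specify) that an "X% performance hit" means X% added to each frame's render cost, with a hypothetical 120 FPS baseline:

```python
def fps_after_hit(base_fps, hit_fraction):
    """FPS once a post-process adds `hit_fraction` of extra frame cost."""
    frame_ms = 1000.0 / base_fps
    return 1000.0 / (frame_ms * (1.0 + hit_fraction))

base = 120.0  # hypothetical FPS with plain upscaling, no sharpening pass
print(fps_after_hit(base, 0.20))  # 20% DLSS-style hit -> 100.0 FPS
print(fps_after_hit(base, 0.05))  # 5% RIS worst case  -> ~114.3 FPS
```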
 

Mopetar

Diamond Member
Jan 31, 2011
7,848
6,009
136
DLSS causes a 15-20% performance hit (over just upscaling). RIS is within the margin of error according to most tests. In some cases it hits 5%, depending on the scene and the amount of upscaling.

To be fair, DLSS does more than just upscaling an image. I have no doubt that if you compared it to a simple upscale it would look a lot better, but the idea that you can start with a lower-resolution image and use magic to upscale it into something that looks better than a native image is just laughable.

I'm pretty sure we need to update our children's stories or something. I've got a great idea for one called The Emperor's New Super Sampling Algorithm. I'm just looking for a good illustrator. Or even just a bad one, since I can apparently save a lot of money by getting some low-res mockups instead of springing for anything with finer detail.
 

Tup3x

Senior member
Dec 31, 2016
965
951
136
AMD has had image upscaling for years. Why not use it in this case? They even have RIS (Radeon Image Sharpening) to clean up the upscaled image, and it objectively looks better than DLSS in some cases, but without the performance hit that DLSS has. Your situation is the exact use case for using upscaling.
It really doesn't look even remotely close to DLSS 2.0, but if you want to go that route, NVIDIA has that as an option too.

Performance is all about compromises and the more choices you have the better. Pick your poison.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
It really doesn't look even remotely close to DLSS 2.0, but if you want to go that route, NVIDIA has that as an option too.

Performance is all about compromises and the more choices you have the better. Pick your poison.

Yup, and when DLSS first came out, nVidia's own upscale sharpening looked a lot better. Now obviously DLSS 2 is better than 1, but the upscaling is still superior in some cases.
 

Shivansps

Diamond Member
Sep 11, 2013
3,855
1,518
136
DLSS causes a 15-20% performance hit (over just upscaling). RIS is within the margin of error according to most tests. In some cases it hits 5%, depends on the scene and the amount of upscaling.

AMD's GPU upscaling + RIS does not do exactly what you think it does. The GPU upscaling works at the video output; it is mainly intended for playing old fullscreen games, especially ones without 16:9 or high-res support, so the monitor can keep working at native resolution. This means Radeon Image Sharpening is applied to the game-resolution image and then upscaled, not applied to the upscaled image. I actually use the GPU upscaling because my monitor tends to make a weird noise when working at 1080p, and it also tends to create issues when alt-tabbing.

Where RIS works in a similar way to DLSS is with a game that supports a different render resolution. In this way you can keep using the native monitor resolution for the UI, and image sharpening is applied to the upscaled image; results will depend mostly on the game's renderer and upscaler.

But there is no way it compares. I saw in the 3090 reviews 4K native vs 8K DLSS (upscaled from 1440p) vs 8K native, and the 8K DLSS shots are generally way closer to 8K native, and better than 4K native. OK, this is an extreme example, but I also saw 540p to 1080p with DLSS 2.0 results looking close to 1080p native... so either everyone is getting paid by Nvidia, or they have something good there. The main problem, as always, is that it needs game support.
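The ordering described above, sharpening at game resolution before the output scaler versus upscaling first so sharpening sees the full-resolution image, can be sketched with stand-in operations (nearest-neighbour upscale and a box-blur unsharp mask; neither is AMD's actual scaler or CAS):

```python
import numpy as np

def upscale(img, f):
    """Nearest-neighbour stand-in for the GPU output scaler."""
    return np.repeat(np.repeat(img, f, axis=0), f, axis=1)

def sharpen(img, amount=0.5):
    """Box-blur unsharp mask as a stand-in for RIS."""
    h, w = img.shape
    padded = np.pad(img, 1, mode="edge")
    blur = sum(padded[y:y + h, x:x + w]
               for y in range(3) for x in range(3)) / 9.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

frame = np.random.default_rng(1).random((4, 4))

# Driver-level GPU scaling: sharpen at game resolution, then scale at output.
driver_path = upscale(sharpen(frame), 2)

# In-game resolution scale: the game upscales first, so sharpening
# runs on the full-resolution image.
in_game_path = sharpen(upscale(frame, 2))
```

Same input, same operations, but the two orderings generally produce different images, which is why where sharpening sits in the pipeline matters.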
 
Last edited:

Glo.

Diamond Member
Apr 25, 2015
5,711
4,559
136
Let me end this discussion about DLSS, and the claim that AMD requires this killer feature in their GPUs in order for them to sell, in one sentence.

If AMD has a faster GPU in rasterization than the direct counterparts from Nvidia, while also being cheaper and more efficient, only complete morons or Nvidia fanboys will complain that RDNA2 GPUs do not have a DLSS competitor.

The end.
 

A///

Diamond Member
Feb 24, 2017
4,352
3,154
136
Because DLSS isn't the first time NVidia has offered something like this. DSR came out in 2013-2014 and it was a flop. I'd forgotten about it until someone mentioned it in a YouTube comment. You only win, IMO, if you can offer native 8K gameplay at 80 FPS or greater in a large number of titles.

I suppose Minecraft is a neat game, but DLSS in Minecraft at 4K or 8K is... I'll refrain from using a word that would surely insult a portion of the global populace, and isn't something I'd normally blurt out either.

AMD has had image upscaling for years. Why not use it in this case? They even have RIS (Radeon Image Sharpening) to clean up the upscaled image, and it objectively looks better than DLSS in some cases, but without the performance hit that DLSS has. Your situation is the exact use case for using upscaling.
Don't take my word for it, but I was told these aren't very good approaches.
 
  • Like
Reactions: Tlh97 and blckgrffn

Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
Because DLSS isn't the first time NVidia has offered something like it. DSR came out in 2013-2014 and it was a flop.

DSR and DLSS are nearly opposites.

DLSS takes an internal lower resolution and scales it up to display resolution, DSR takes an internal higher resolution and scales it down to display resolution.

DLSS results in significantly higher FPS at display resolution.
DSR results in significantly lower FPS at display resolution.

DSR is ridiculously expensive to run and craters your FPS, so it's no surprise it didn't catch on.
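In pixel terms (hypothetical but typical resolutions, not measured data), the asymmetry is just counting shaded pixels versus displayed pixels:

```python
def megapixels(w, h):
    return w * h / 1e6

# DLSS-style: render ~1440p internally, upscale to a 4K display.
dlss_internal = megapixels(2560, 1440)   # ~3.69 MP shaded
dlss_display  = megapixels(3840, 2160)   # ~8.29 MP shown

# DSR-style: render 4K internally, downscale to a 1440p display.
dsr_internal = megapixels(3840, 2160)    # ~8.29 MP shaded
dsr_display  = megapixels(2560, 1440)    # ~3.69 MP shown

print(f"DLSS shades {dlss_internal / dlss_display:.2%} of displayed pixels")
print(f"DSR shades {dsr_internal / dsr_display:.0%} of displayed pixels")
```

Shading roughly 44% of the displayed pixels is where DLSS's FPS gain comes from; shading 225% is why DSR craters it.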