Speculation: RDNA2 + CDNA Architectures thread


uzzi38

Platinum Member
Oct 16, 2019
2,705
6,427
146
All die sizes are within 5mm^2. The poster here has been right about some things in the past afaik, and to his credit was the first to say 505mm^2 for Navi21, which other people have backed up. Even still, take the following with a pinch of salt.

Navi21 - 505mm^2

Navi22 - 340mm^2

Navi23 - 240mm^2

Source is the following post: https://www.ptt.cc/bbs/PC_Shopping/M.1588075782.A.C1E.html
 

Shivansps

Diamond Member
Sep 11, 2013
3,875
1,530
136
AMD has had image upscaling for years. Why not use it in this case? They even have RIS (Radeon Image Sharpening) to clean up the upscaled image, and it objectively looks better than DLSS in some cases, but without the performance hit that DLSS has. Your situation is the exact use case for using upscaling.

Radeon RIS does not work like that; it reduces performance, and it only makes sense to use it in a game that allows you to reduce the rendering resolution. That's the only thing it does: try to improve visual quality at the cost of FPS.

BUT, AMD also has GPU upscaling. What it does is, if a game is outputting 1080p fullscreen, it upscales it to the native res; you can use that along with RIS.
But don't even try to use 1080p native and upscale it to 4K with that; it is a huge no-no. Not to mention that games running at 1440p will look worse and be slower, as you can't set GPU upscaling per game. On top of that you need to use fullscreen in every game.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Radeon RIS does not work like that; it reduces performance, and it only makes sense to use it in a game that allows you to reduce the rendering resolution. That's the only thing it does: try to improve visual quality at the cost of FPS.

BUT, AMD also has GPU upscaling. What it does is, if a game is outputting 1080p fullscreen, it upscales it to the native res; you can use that along with RIS.
But don't even try to use 1080p native and upscale it to 4K with that; it is a huge no-no. Not to mention that games running at 1440p will look worse and be slower, as you can't set GPU upscaling per game. On top of that you need to use fullscreen in every game.

DLSS causes a 15-20% performance hit (over just upscaling). RIS is within the margin of error according to most tests. In some cases it hits 5%, depending on the scene and the amount of upscaling.
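As a back-of-the-envelope illustration of what those percentages mean in frame rates (a minimal Python sketch; the 160 FPS baseline is a made-up round number, not a benchmark result):

Code:
def fps_after_hit(base_fps: float, hit: float) -> float:
    # FPS after a post-process pass costing `hit` as a fraction of performance.
    return base_fps * (1.0 - hit)

# Hypothetical FPS when rendering at the lower internal resolution.
internal_fps = 160.0

print(fps_after_hit(internal_fps, 0.15))  # DLSS-style pass, 15% hit -> 136.0 FPS
print(fps_after_hit(internal_fps, 0.20))  # DLSS-style pass, 20% hit -> 128.0 FPS
print(fps_after_hit(internal_fps, 0.05))  # RIS-style sharpen, ~5% hit -> 152.0 FPS

Even with the larger hit, both options stay well above whatever the native-resolution frame rate was; the quoted percentages only give back part of the win from rendering fewer pixels.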
 

Mopetar

Diamond Member
Jan 31, 2011
8,114
6,770
136
DLSS causes a 15-20% performance hit (over just upscaling). RIS is within the margin of error according to most tests. In some cases it hits 5%, depending on the scene and the amount of upscaling.

To be fair, DLSS does more than just upscale an image. I have no doubt that if you compared it to a simple upscale it would look a lot better, but the idea that you can start with a lower-resolution image and use magic to upscale it to something that looks better than a native image is just laughable.

I'm pretty sure we need to update our children's stories or something. I've got a great idea for one called The Emperor's New Super Sampling Algorithm. I'm just looking for a good illustrator. Or even just a bad one since I can apparently save a lot of money by getting some low res mockups instead of springing for anything with finer detail.
 

Tup3x

Golden Member
Dec 31, 2016
1,086
1,085
136
AMD has had image upscaling for years. Why not use it in this case? They even have RIS (Radeon Image Sharpening) to clean up the upscaled image, and it objectively looks better than DLSS in some cases, but without the performance hit that DLSS has. Your situation is the exact use case for using upscaling.
It really doesn't look even remotely close to DLSS 2.0, but if you want to go that route NVIDIA has that as an option too.

Performance is all about compromises and the more choices you have the better. Pick your poison.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
It really doesn't look even remotely close to DLSS 2.0, but if you want to go that route NVIDIA has that as an option too.

Performance is all about compromises and the more choices you have the better. Pick your poison.

Yup, and when DLSS first came out, nVidia's own upscaling + sharpening looked a lot better. Now obviously DLSS 2 is better than 1, but the upscaling is still superior in some cases.
 

Shivansps

Diamond Member
Sep 11, 2013
3,875
1,530
136
DLSS causes a 15-20% performance hit (over just upscaling). RIS is within the margin of error according to most tests. In some cases it hits 5%, depending on the scene and the amount of upscaling.

The AMD GPU upscaling + RIS combination does not do exactly what you think it does. The GPU upscaling works at the video output; it is mainly intended for playing old fullscreen games, especially ones without 16:9 or high-res support, so the monitor can keep working at native resolution. This means Radeon Image Sharpening is applied to the game-resolution image, which is then upscaled; it is not applied to the upscaled image. I actually use the GPU upscaling because my monitor tends to make a weird noise when working at 1080p, and it also tends to create issues when alt-tabbing.

Where RIS works in a similar way to DLSS is with a game that supports a different render resolution. That way you can keep using the native monitor resolution for the UI, image sharpening is applied to the upscaled image, and the results will depend mostly on the game's renderer and upscaler.
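To make that ordering difference concrete, here is a minimal sketch (the function names are hypothetical stand-ins for the driver and game stages, not real driver code):

Code:
# Two processing orders for sharpening + upscaling, as described above.

def render(res):            # game renders at its internal resolution
    return f"frame@{res}"

def sharpen(frame):         # RIS-style contrast-adaptive sharpening
    return f"sharpen({frame})"

def upscale(frame, out):    # plain spatial upscale
    return f"upscale({frame} -> {out})"

# Driver-level GPU scaling: sharpening sees the game-resolution image,
# then the video output upscales the already-sharpened frame.
print(upscale(sharpen(render("1080p")), "1440p"))

# In-game resolution scaling: the game's own upscaler runs first, so
# sharpening is applied to the upscaled image (closer to how DLSS slots in).
print(sharpen(upscale(render("1080p"), "1440p")))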

But there is no way around it: in the 3090 reviews I saw 4K native vs. 8K DLSS (upscaled from 1440p) vs. 8K native, and the 8K DLSS results are generally way closer to 8K native, and better than 4K native. OK, this is an extreme example, but I also saw 540p-to-1080p DLSS 2.0 results looking close to native 1080p... so either everyone is getting paid by Nvidia, or they have something good there. The main problem, as always, is that it needs game support.
 

Glo.

Diamond Member
Apr 25, 2015
5,803
4,777
136
Let me end this discussion about DLSS, and the requirement for AMD to have this killer feature in their GPUs in order for them to sell, in one sentence.

If AMD has faster GPUs in rasterization than their direct counterparts from Nvidia, while also being cheaper and more efficient, only complete morons or Nvidia fanboys will complain that RDNA2 GPUs do not have a DLSS competitor.

The end.
 

A///

Diamond Member
Feb 24, 2017
4,351
3,158
136
Because DLSS isn't the first time NVidia has offered something like it. DSR came out in 2014 and it was a flop. I'd forgotten about it until someone mentioned it in a YouTube comment. You only win, IMO, if you can offer native 8K gameplay at 80 FPS or greater in a large number of titles.

I suppose Minecraft is a neat game but DLSS in Minecraft at 4K or 8K is.... I'll refrain from using a word that would surely insult a portion of the global populace and isn't something I'd normally blurt out either.

AMD has had image upscaling for years. Why not use it in this case? They even have RIS (Radeon Image Sharpening) to clean up the upscaled image, and it objectively looks better than DLSS in some cases, but without the performance hit that DLSS has. Your situation is the exact use case for using upscaling.
Don't take my word for it, but I was told these aren't very good approaches.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,340
5,464
136
Because DLSS isn't the first time NVidia has offered something like it. DSR came out in 2014 and it was a flop.

DSR and DLSS are nearly opposites.

DLSS takes an internal lower resolution and scales it up to display resolution; DSR takes an internal higher resolution and scales it down to display resolution.

DLSS results in significantly higher FPS at display resolution.
DSR results in significantly lower FPS at display resolution.

DSR is ridiculously expensive to run and craters your FPS, so it's no surprise it didn't catch on.
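Put as a pixel-count sketch (illustrative resolutions only, assuming a 1.5x-per-axis internal scale for DLSS and a 4x DSR factor):

Code:
# Render resolution is what you pay for; display resolution is what you see.

def pixels(w, h):
    return w * h

display = pixels(2560, 1440)        # 1440p display

dlss_render = pixels(1707, 960)     # DLSS renders BELOW display res (assumed factor)
dsr_render = pixels(5120, 2880)     # 4x DSR renders ABOVE display res

print(f"DLSS renders {dlss_render / display:.2f}x the display pixels")  # ~0.44x
print(f"DSR renders {dsr_render / display:.2f}x the display pixels")    # 4.00x

Roughly 0.44x the pixels per frame versus 4x the pixels per frame is the whole story of why one raises FPS and the other craters it.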
 

kurosaki

Senior member
Feb 7, 2019
258
250
86
If only they marketed DLSS as the function it is: upscaling plus AA.
I would totally buy the concept if Nvidia marketed 1440p DLSS to 4K as "1440p with Nvidia special-sauce AAx16" and 1080p DLSS to 4K as "1080p with Nvidia special-sauce AAx4", or something like that. Heck, old drivers have included sepia filters, inverted-colour filters and whatnot as extra post-process effects. But it isn't comparable to native resolution, and if Nvidia succeeds in tricking enough people that 1440p with Nvidia AA is indeed 4K, we all lose. The only thing that would happen if DLSS is accepted on the terms it is presented today is that a 4K resolution from 2018 becomes equal to a 1440p resolution today. Nvidia is lowering the bar of quality with buzzwords. Not even the heaviest fanboys should be pleased with this. Demand more from your favourite GPU manufacturer! Don't sell their redacted scam to others for free on forums like these! Nvidia only sees a chance to gain 25% for free. That's all there is.

Profanity isn't allowed in the tech areas.

AT Mod Usandthem
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
This entire "better than native" thing is a terrible meme, due to native being blurred by TAA. Use TSSAA 8x like in the idTech engine and there's zero chance for DLSS 2 to even look as good as native.

That's the thing: as long as you cannot freely choose the AA algorithm and you are stuck with TAA - TAA is essentially what you get when choosing native 4K. DLSS replaces TAA while upscaling - so you are in a win-win situation, image-quality and framerate wise.
I agree that DLSS is not totally free of artifacts - but hey, it is an option for a reason.
 

soresu

Diamond Member
Dec 19, 2014
3,230
2,515
136
I suppose Minecraft is a neat game but DLSS in Minecraft at 4K or 8K is.... I'll refrain from using a word that would surely insult a portion of the global populace and isn't something I'd normally blurt out either.
The thought did cross my mind more than once when seeing Minecraft, or even Quake 2 RTX demos.

If they had churned out FEAR RTX I would have been a bit more impressed.
 

kurosaki

Senior member
Feb 7, 2019
258
250
86
That's the thing: as long as you cannot freely choose the AA algorithm and you are stuck with TAA - TAA is essentially what you get when choosing native 4K. DLSS replaces TAA while upscaling - so you are in a win-win situation, image-quality and framerate wise.
I agree that DLSS is not totally free of artifacts - but hey, it is an option for a reason.
Then why not call it what it is? 1440p upscaled to 4K with DLSS should just be named "1440p Nvidia AAx4", or something like that. Why call it another res? (8K gaming cards and whatnot.) They could market-hype the redacted out of it, or be honest about a really great AA method. There lies all the difference in the world.

Profanity is not allowed in the tech areas.

AT Mod Usandthem
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
Then why not call it what it is? 1440p upscaled to 4K with DLSS should just be named "1440p Nvidia AAx4", or something like that. Why call it another res? (8K gaming cards and whatnot.) They could market-hype the redacted out of it, or be honest about a really great AA method. There lies all the difference in the world.

Personally I do not much care what NVidia calls it; I am just interested in the result. Why are you so concerned about the naming?

Purely from a technical perspective it is not just AA, it is reconstruction based on prior knowledge. This knowledge is gained by training on high-resolution images. So DLSS is adding information which, mathematically, is not present in the lower-resolution version. So there is nothing much wrong with calling it 4K, I believe.
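A toy version of that reconstruction-from-prior-knowledge idea, for the curious (a deliberately tiny sketch: a least-squares fit stands in for the neural network, and the "images" are synthetic 1D signals; this is not the actual DLSS algorithm):

Code:
import numpy as np

rng = np.random.default_rng(0)
N_HI, FACTOR = 32, 4                 # high-res length, downscale factor
t_hi = np.linspace(0, 1, N_HI, endpoint=False)
t_lo = t_hi[::FACTOR]

def sample_signal():
    # The "prior": every signal comes from a known smooth family.
    f = rng.uniform(0.5, 3.0)
    phase = rng.uniform(0, 2 * np.pi)
    return np.sin(2 * np.pi * f * t_hi + phase)

# "Training": fit one linear map from low-res samples to the high-res signal.
train = np.stack([sample_signal() for _ in range(2000)])
W, *_ = np.linalg.lstsq(train[:, ::FACTOR], train, rcond=None)

# "Inference" on unseen signals, versus plain linear interpolation.
test = np.stack([sample_signal() for _ in range(500)])
test_lo = test[:, ::FACTOR]
learned = test_lo @ W
interp = np.stack([np.interp(t_hi, t_lo, s) for s in test_lo])

print("linear interp MSE:", np.mean((interp - test) ** 2))
print("learned map MSE:  ", np.mean((learned - test) ** 2))

The learned map wins because it exploits knowledge of the signal family picked up during training, which is exactly the sense in which the output contains information that is not present in the low-res input alone.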
 

Nox51

Senior member
Jul 4, 2009
376
20
81
Personally I do not much care what NVidia calls it; I am just interested in the result. Why are you so concerned about the naming?

Purely from a technical perspective it is not just AA, it is reconstruction based on prior knowledge. This knowledge is gained by training on high-resolution images. So DLSS is adding information which, mathematically, is not present in the lower-resolution version. So there is nothing much wrong with calling it 4K, I believe.


Other than it not being, you know, rendered at 4K natively as it should be.
 

Shivansps

Diamond Member
Sep 11, 2013
3,875
1,530
136
DLSS is not an AA method, for god's sake; it's a good upscaler that uses AI to create an image of higher resolution than the original using data that the game provides. It's not a simple upscaler either, as no upscaler in the world (that we know of) is able to produce a 1080p image from a 540p one that looks better than native 720p and is very close to native 1080p. That is just amazing in my book; not sure why so much hate. If this were to become a standard it would be a true game changer, but I'm not even sure that Nvidia wants that, as it would extend the life of GPUs.

Radeon Image Sharpening is not DLSS, but it is a base to start from if AMD were to make a DLSS alternative. They would need to combine it with their own upscaler, as right now it depends on the game having its own upscaler, and it would need to do more than the plain upscaling it does today.

The problem with DLSS, as with everything, is that it is complicated to implement and proprietary. If something like DLSS were to succeed outside games sponsored by Nvidia, it would need to be implemented at the DX12 level; this is why FreeSync became the standard over G-Sync, and why Mantle died and DX12 became the standard. The catch is that we are now talking about something that would make GPUs last longer, and that's a problem.

EDIT: I forgot to add that DLSS adds data to the image (and artifacts); the result it provides does not come only from the original image.
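For a sense of what "data that the game provides" typically means for a DLSS-style temporal upscaler, here is a hypothetical sketch of the per-frame inputs (the field names are illustrative, not from any real SDK):

Code:
from dataclasses import dataclass
import numpy as np

@dataclass
class UpscalerFrameInputs:
    color: np.ndarray            # low-res color buffer (H x W x 3)
    depth: np.ndarray            # low-res depth buffer (H x W)
    motion_vectors: np.ndarray   # per-pixel screen-space motion (H x W x 2)
    jitter: tuple                # sub-pixel camera jitter for this frame
    exposure: float              # scene exposure, for stable accumulation

# A real implementation would reproject the previous output using the motion
# vectors, reject stale history using depth, and blend in the jittered
# low-res color to accumulate detail over time.

Wiring buffers like these out of an engine, per game, is exactly the integration burden described above; a DX12-level standard would mean exposing them once rather than per vendor.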
 

jamescox

Senior member
Nov 11, 2009
644
1,105
136
I know it makes very little actual sense but there's little arguing with demonstrated reality. There's a dedicated section of the market who buy these things. Don't ask me why.

The 8k is just some sort of silly marketing attempt to differentiate the 3090 from the 3080. The huge 3090 die will sell very well indeed for deep learning workstations etc, even if it has to be as a Pro card, so NV are happily covered.



This is where people start getting a bit ahead of themselves. Zen resulted from a huge, multi-year investment while Intel ran into endless process trouble. Here, they've had one year's work since the 5700/XT, only mild process improvements, and they've had to jam ray tracing into the cards while they're at it - as Turing showed, that isn't free.

Vs a half-decent die shrink & new architecture, you would, a priori, have expected them to drop back from last year's position.

Perhaps they've done well enough to hold the line, perhaps a bit better, perhaps slightly worse. Honestly that isn't critical here.

What's critical is that they do a fast, organised refresh of their entire GPU stack. Ideally competitive mobile GPUs into the bargain. That'll show that they've got, or are putting in, the resources to take it fully seriously. Frankly it's been far too long since they've been able to do this.

If they're doing that then, yes, we'll have a rather competitive market again.

The 8K marketing BS is very suspicious to me. The number of people with 8K displays of any kind is very small, and most of them are not gamers. The 3090 8K gaming stuff makes me think Nvidia knows they are going to be outperformed by a cheaper card at 4K. They can make some claim of superiority at 8K, even though it is completely irrelevant. This happens all the time in the GPU market: people read reviews of the top-end cards and use them in a purchasing decision, even though they are buying a low-to-mid-range card.

Zen was a huge investment, but RDNA almost certainly has a bit of Zen DNA in it. I wouldn't be surprised if it shares some design bits, perhaps caches and such. They are going to be using "infinity architecture" across both CPUs and GPUs. AMD has more than one team working on iterations of Zen processors, so each Zen release isn't just work done since the previous release. This stuff has long lead times, so RDNA2 work would probably have started several years ago, just like work is probably being done on Zen 5 right now. This is why a bad decision several years ago can have repercussions now, and a bad decision back then may not be fixable for quite some time. If Nvidia went in the wrong direction with the 30 series, then it isn't going to be fixed anytime soon. Also, they needed ray tracing for the consoles, so it was partially paid for by Microsoft and Sony. It wasn't just shoehorned in at the last minute. I wouldn't be surprised if ray tracing was in development before Vega was even released.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
7,166
7,666
136
DSR and DLSS are nearly opposites.

DLSS takes an internal lower resolution and scales it up to display resolution; DSR takes an internal higher resolution and scales it down to display resolution.

DLSS results in significantly higher FPS at display resolution.
DSR results in significantly lower FPS at display resolution.

DSR is ridiculously expensive to run and craters your FPS, so it's no surprise it didn't catch on.

- I agree with basically everything you said, but I'm not sure about the "DSR never caught on" comment. There is nothing to "catch on". It's an SSAA injector built into the driver. If you are playing older games and have a buttload of performance sitting around "wasted", you might as well downsample 4K to 1080p and get buttery delicious image quality on some normally craptastic-looking old game.

You know all those super-nice-looking marketing shots your game never looks like when you run it? That's a supersampled image (a much larger image crammed into a smaller viewing area).

I use DSR all the time when I play older games.

AMD has the same thing, only they call it VSR.

If you're playing an older game and your GPU is sitting at 50% utilization or something, crank up that SSAA and at least get a much better-looking game for the same FPS. Of course, if you're playing some newfangled hotness and struggling to maintain even 60 FPS, then DSR/VSR is about the last thing you want to use.
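The core of DSR/VSR is simple to sketch (a minimal box-filter downsample; the real driver filters are fancier, e.g. DSR's adjustable Gaussian smoothness, so this is only the idea):

Code:
import numpy as np

def downsample_2x(frame: np.ndarray) -> np.ndarray:
    # Average each 2x2 block of the high-res render into one display pixel.
    h, w, c = frame.shape
    assert h % 2 == 0 and w % 2 == 0
    return frame.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))

hi_res = np.random.rand(2160, 3840, 3)   # stand-in for a 4K render target
lo_res = downsample_2x(hi_res)           # 1080p output, 4 samples per pixel
print(lo_res.shape)                      # (1080, 1920, 3)

Four rendered samples averaged into each display pixel is the SSAA part, and also why it costs roughly 4x the native render work.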
 

Kuiva maa

Member
May 1, 2014
182
235
116
Digital Foundry looking at Series X. Look at the Hitman 2 walking-around-the-estate benchmark. It seems to be hitting 2080S performance, provided the 4K console setting is comparable to maxed 4K PC settings. Definitely going to be matching the 2080 as a minimum.

They did mention MS confirmed that the XSX runs XOX games in compatibility mode, without activating the newer architectural enhancements. I wouldn't be surprised if the XSX isn't running max clocks in this scenario either; it just targets numbers that allow it to run older games without compatibility issues. Either way, it wasn't the full power of the console on display there.
 

Timorous

Golden Member
Oct 27, 2008
1,748
3,240
136
They did mention MS confirmed that the XSX runs XOX games in compatibility mode, without activating the newer architectural enhancements. I wouldn't be surprised if the XSX isn't running max clocks in this scenario either; it just targets numbers that allow it to run older games without compatibility issues. Either way, it wasn't the full power of the console on display there.

Indeed, and the settings are not 100% comparable, and I have no idea where TPU actually performs their test. It is just an initial indicator that the Series X is showing good numbers even in this sub-optimal state.