Question DLSS 2.0


Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
There is no thread on this, and it's worth one because it seems to finally be realising its potential. There are many articles discussing it elsewhere - it's used by Wolfenstein: Youngblood, Control and a few other games, and it works really well. Allegedly it's easy to add - no per-game training required. It gives a good performance increase and looks really sharp.

Nvidia article.
wccf article.
eurogamer article

The above articles have some good comparison screenshots that really demonstrate what it can do.

Discuss...
 
  • Like
Reactions: DXDiag

Gideon

Golden Member
Nov 27, 2007
1,608
3,570
136
Tensor cores are not mandatory for doing Machine Learning. DX-12 DirectML can take advantage of any GPU hardware, including Tensor Cores.
In the future most games will use DirectML and not DLSS.
Yes, they don't need tensor cores for that (though these could improve performance).


1. DLSS 1.9 ran just fine on shaders (though obviously dedicated hardware is better)
2. A smaller rendering resolution reduces the shader workload by a lot more than the upscale pass adds (see point 1)
3. RDNA 2 shader cores can run packed 8-bit and 4-bit integer calculations. Not quite the same as tensor cores, but still hugely more efficient than RDNA 1.
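To put point 2 in numbers, here is a quick back-of-the-envelope in Python, assuming shading cost scales roughly linearly with pixel count (which is only an approximation):

```python
# Pixel counts for native 4K vs a DLSS-style "quality" mode that
# renders internally at 1440p before upscaling to 4K.
native_4k = 3840 * 2160        # 8,294,400 pixels shaded per frame
internal_1440p = 2560 * 1440   # 3,686,400 pixels shaded per frame

saving = 1 - internal_1440p / native_4k
print(f"Shading work saved: {saving:.0%}")  # prints "Shading work saved: 56%"
```

The upscale pass only has to cost a fraction of that saving for the whole thing to come out ahead, which is why even a shader-only version like DLSS 1.9 was viable.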
 

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136
Yes, they don't need tensor cores for that (though these could improve performance).


1. DLSS 1.9 ran just fine on shaders (though obviously dedicated hardware is better)
2. A smaller rendering resolution reduces the shader workload by a lot more than the upscale pass adds (see point 1)
3. RDNA 2 shader cores can run packed 8-bit and 4-bit integer calculations. Not quite the same as tensor cores, but still hugely more efficient than RDNA 1.

RDNA 1 also supports Rapid Packed Math (FP16) as well as packed 8-bit and 4-bit integer operations.
 

Guru

Senior member
May 5, 2017
830
361
106
Sorry, that's your own bias/perception; the sources I quoted agree with the consensus that DLSS 2 looks as good as or better than native resolution + TAA, especially during motion, where TAA blurs everything out.



SURE!
Native 1440p: https://static.techspot.com/articles-info/1992/images/F-31.jpg
DLSS 2.0 1440p: https://static.techspot.com/articles-info/1992/images/F-32.jpg

Look at the yellow wall. Is the DLSS wall darker? Does the DLSS wall have less detail?

Look at the wall board that has paper on it. Does the DLSS one look like crap? Does the DLSS color look monotone and bland? Does the text look blurry as hell and unreadable in the DLSS one?

Focus on the wooden box now: does it look monotone, bland and blurry compared to native 1440p?

Yeah thought so!

Okay, you say DLSS 2.0 is CRAP rendering at 1440p and it's only good for 4K; fine, let's see the 4K images!

Okay, let's take these images. Native 4K: https://static.techspot.com/articles-info/1992/images/F-19.jpg
DLSS 4K: https://static.techspot.com/articles-info/1992/images/F-20.jpg

Let's focus on the yellow wall again. Is the wall darker compared to native 4K? Look at the shadows on the wall and the floor. Are the DLSS shadows more monotone and bland? At native 4K the shadows are crisp, with a big array of colors blending in; DLSS, on the other hand, is monotone, has blocking, has lower detail, etc...

How about the whiteboard? Again, the white color is now darker in the DLSS pic, the images on the whiteboard are extremely blurry and monotone, and ALL of the colors in the whole image are worse with DLSS 2.0; they all seem bland and washed out.

Let's focus on the hair now. With DLSS, does it look plasticky, darker and more monotone?

How about the metal object there? Does the DLSS one have less detail? Does it look blocky on the edges?

Your image, according to Eurogamer, uses low textures. Why play at 4K and use low textures? Who does that? It's a paid comparison; they disclose it in their YouTube video.
 

DXDiag

Member
Nov 12, 2017
165
121
116
SURE!
Native 1440p: https://static.techspot.com/articles-info/1992/images/F-31.jpg
DLSS 2.0 1440p: https://static.techspot.com/articles-info/1992/images/F-32.jpg

Look at the yellow wall. Is the DLSS wall darker? Does the DLSS wall have less detail?
All of the images you are using are not from DLSS 2; they are DLSS 1.9. DLSS 2 is VASTLY better. See the Digital Foundry comparison I provided.
Your image, according to Eurogamer, uses low textures. Why play at 4K and use low textures? Who does that? It's a paid comparison; they disclose it in their YouTube video.
DF doesn't use low textures; that's illogical. They use Ultra details in every scene.
 

DXDiag

Member
Nov 12, 2017
165
121
116
The article specifically states that they are testing DLSS 2.0...
The article was released Feb 26th 2020; Control received DLSS 2 on March 26th 2020, i.e. a month after the article.

Directly quoting the article:

DLSS 1.9 was a one-off and will only be used for Control. But we think it's still important to talk about what Nvidia has done in Control, both to see how DLSS has evolved and also to see what is possible with a shader core image processing algorithm, so let's dive into it.

The article began by examining the DLSS 1.9 implementation in Control, then switched to the DLSS 2 implementation in Wolfenstein: Youngblood.

After analyzing DLSS in Youngblood, there's no doubt that the technology works. The first version of DLSS was unimpressive, but it’s almost the opposite with DLSS 2.0: the upscaling power of this new AI-driven algorithm is remarkable and gives Nvidia a real weapon for improving performance with virtually no impact to visuals.


DLSS now works with all RTX GPUs, at all resolutions and quality settings, and delivers effectively native image quality when upscaling, while actually rendering at a lower resolution. It's mind blowing. It's also exactly what Nvidia promised at launch. We're just glad we’re finally getting to see that now.
 
Last edited:

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
Okay you say, DLSS 2.0 is CRAP rendering 1440p, its only good for 4k, fine lets see the 4k images!

Okay lets take these images: native 4k https://static.techspot.com/articles-info/1992/images/F-19.jpg
dlss 4k: https://static.techspot.com/articles-info/1992/images/F-20.jpg

While I tend not to like NV and their proprietary vendor lock-in stuff, I'm always a bit puzzled by these comparisons. The 1440p one is clear due to the blurred, barely readable text. But here? I see minor differences, but I could not say which one I like more after staring at still images for a couple of minutes.

I've never bothered much with image quality. But does it matter at all "in action"? It probably depends on the games you play. In shooters I don't see how this matters more than faster fps. And then the real difference is having an anti-blur (strobing) display; moving fast without that, you can't read the text anyway.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
There is no such thing as "DLSS 1.9" :rolleyes:
If you look at the eurogamer article I linked in the first post, you'll see it has DLSS 1.9 in the comparisons for Control. It is significantly worse than DLSS 2.0 in those comparisons. Guru's comparison pics are from 1.9, not 2.0 as he claimed (the comparison pics are from February; DLSS 2 came out for Control at the end of March).
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
DLSS 2.0 is now in Death Stranding, and it seems it is actually considered superior not just to FidelityFX + TAA but to the native image too, while still giving a huge performance boost (allowing even a 2060 to run the game at 4K at 60 fps).
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
DLSS 2.0 is now in Death Stranding, and it seems it is actually considered superior not just to FidelityFX + TAA but to the native image too, while still giving a huge performance boost (allowing even a 2060 to run the game at 4K at 60 fps).

Either AMD comes up with a similar solution soon, or it is game over for them in high-end graphics. Sure, they are used to being second and providing a GPU alternative so people can buy Nvidia cards cheaper, but that only worked when DLSS was a "performance" option. Now that it provides the best visuals, it is really game changing. And the clock is ticking: Metro/Death Stranding are niche games, but a certain game from Poland will deliver the coup de grace in the autumn...

There will always be people buying AMD cards in the low/mid range, but these DLSS advances are shutting down the >$400 market completely. Even if AMD is cheaper at the higher end, Nvidia is the much better choice.
 
  • Like
Reactions: xpea

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
DLSS 2.0 is now in Death Stranding, and it seems it is actually considered superior not just to FidelityFX + TAA but to the native image too, while still giving a huge performance boost (allowing even a 2060 to run the game at 4K at 60 fps).

The issue is that TAA is simply pretty bad, so I wonder how the comparisons would look against plain native 4K without TAA.
 

DXDiag

Member
Nov 12, 2017
165
121
116
The issue is that TAA simply is pretty bad so I wonder how the comparions would look without TAA just 4k native.
TAA is tightly coupled to graphics engine pipelines now; you can't simply switch it off, as SSAO, AA, shadows and SSR rely on it to function properly in modern engines.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,715
7,004
136
Wonder if we'll ever get a DLSS 3.0 "injector" like DSR at the driver level, where any game that supports TAA gets its TAA algorithm replaced by a DLSS algorithm and the associated benefits.
 
Mar 11, 2004
23,030
5,495
146
Wonder if we'll ever get a DLSS 3.0 "injector" like DSR at the driver level, where any game that supports TAA gets its TAA algorithm replaced by a DLSS algorithm and the associated benefits.

I think the next-gen consoles are leading to quite a reworking of things, so hopefully some of this (like the TAA situation) can be remedied without needing to apply other stuff to compensate (which to me is senseless and causes a twofold performance issue). Honestly it feels like there's a deliberate move toward lower resolution and the like in order to make certain other features (path tracing) viable, or even just to win back performance lost to other features that in a lot of ways don't seem worth the hit.

Which is to say, DLSS 2.0 is looking pretty good, although I still can't shake the feeling that we're compensating for going backwards in order to go forwards; but when it can offer comparable image quality at higher framerates, it's tough to complain. It's kind of like foveated rendering: if we want stuff like VR to take off, we're going to need to do that stuff.

I also wonder whether a more "pure" rendering pipeline might make graphics processing more scalable (so SLI/Crossfire would be more viable, as well as possibly chiplets or mGPU in the datacenter, where it could be brute-forced more easily). I also wonder whether you couldn't get the same, close-enough, or perhaps better results via a dedicated processing chip (by that I mean something like a dedicated upscaler).

I personally am not really into the artistic visual look of Death Stranding (while I can see objective reasons why one image or another in the links shown can look better, it frankly still looks unappealing to me; I'd be much more interested in what it could do for, say, a game like... the Assassin's Creed that took place in Paris and had loads of NPCs on screen). I'd take Horizon Zero Dawn over Death Stranding visually.

I give Nvidia credit: they've stuck with it, and it is showing improvements.
 

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136
Either AMD comes up with similar solution soon, or it is game over to them in high end graphics. Sure they are used to being second and providing GPU alternative so people can buy Nvidia cards cheaper, but that only worked when DLSS was "performance" option. Now that it provides best visuals, it is really game changing. And the time is ticking, Metro/Death Stranding are niche games, but certain game from Poland will deliver coup de grace in the autumn...

There will always be people buying AMD cards in low/mid range, but these DLSS advances are shutting down >400$ market completely. Even if AMD is cheaper in higher end, Nvidia is much better choice.

DLSS is the same technology as Microsoft's DirectML, which every DX12 graphics card can use.
DirectML will be used both in consoles (Xbox Series X) and on desktop, from both AMD and NVIDIA.
I will not be surprised if NV stops using DLSS in favor of DirectML in the future, since DirectML will see wider application in most future games for both consoles and PC.
 
  • Like
Reactions: USER8000 and Tlh97

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
DLSS is the same technology as Microsoft's DirectML, which every DX12 graphics card can use.

That is like saying "deferred rendering is the same technology as Vulkan".

No, they are not the same. DLSS-like techniques can be built with DirectML, but *cough* someone needs to come up with middleware that takes motion vectors, previous frames and whatever else DLSS 2.0 needs to function. That is know-how, and that is what I meant AMD needs to come up with.
Maybe Microsoft will help, maybe game engine makers will step up. But as things stand now, it is easier for game makers to use DLSS 2.0, as long as they accept its NV-only status (and restrictions on resolutions).
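For a feel of what that middleware has to do, here's a toy numpy sketch of the temporal part: reproject last frame's accumulated history using motion vectors, then blend in the new low-res sample. The function and the fixed blend weight are made up for illustration; real DLSS 2.0 uses a trained network to decide how to weight history per pixel:

```python
import numpy as np

def temporal_upscale_step(curr_lowres, prev_accum, motion, alpha=0.1):
    """One step of a DLSS-style temporal accumulator (toy version).

    curr_lowres : (h, w) current frame rendered at low resolution
    prev_accum  : (H, W) previously accumulated high-res history
    motion      : (H, W, 2) per-pixel motion vectors in high-res pixels
    alpha       : blend weight given to the new sample
    """
    H, W = prev_accum.shape
    # 1. Naive upsample of the current low-res frame (nearest neighbour).
    ys = np.arange(H) * curr_lowres.shape[0] // H
    xs = np.arange(W) * curr_lowres.shape[1] // W
    upsampled = curr_lowres[np.ix_(ys, xs)]
    # 2. Reproject history: fetch where each pixel was last frame.
    yy, xx = np.mgrid[0:H, 0:W]
    py = np.clip(yy - motion[..., 1].astype(int), 0, H - 1)
    px = np.clip(xx - motion[..., 0].astype(int), 0, W - 1)
    history = prev_accum[py, px]
    # 3. Exponential blend: mostly history, a little new sample.
    return (1 - alpha) * history + alpha * upsampled
```

The hard parts this glosses over (rejecting stale history on disocclusion, handling jittered sample positions, avoiding ghosting) are exactly the know-how being discussed.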
 
  • Like
Reactions: Heartbreaker

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136
That is like saying "deferred rendering is the same technology as Vulkan".

No, they are not the same. DLSS-like techniques can be built with DirectML, but *cough* someone needs to come up with middleware that takes motion vectors, previous frames and whatever else DLSS 2.0 needs to function. That is know-how, and that is what I meant AMD needs to come up with.
Maybe Microsoft will help, maybe game engine makers will step up. But as things stand now, it is easier for game makers to use DLSS 2.0, as long as they accept its NV-only status (and restrictions on resolutions).

AMD only has to provide the Inference model and expose its hardware specific architecture to the Microsoft DirectX-12 Metacommands and drivers. The rest is up to the DirectML and the game developers.

Also, DirectML is already part of Windows 10 (since version 1903), and we already know that the Unity game engine will use it.
Since all Xbox Series X games and every DX12-capable PC graphics card from both AMD and NVIDIA will be able to use DirectML, it is not hard to see that the vast majority of future games will use DirectML and not NVIDIA's proprietary DLSS.

PS: take a look at the video below for more (technical) information about DirectML.
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
Again, sure, DirectML can be used to do the calculations, just like you could invent some magic shader with 1 fps performance that does the same without DirectML.

With DLSS, Nvidia provides both the specialized hardware and the middleware to implement it.

Also, DirectML is already part of Windows 10 (since version 1903), and we already know that the Unity game engine will use it.

And? DirectML is supported on Kepler; that won't make "DLSS" work there.

There needs to be both ML hardware support (hardware that does not win performance from rendering at a lower resolution only to take it away again by consuming the same shader computational resources as everything else) and middleware implemented. Right now NV has both and AMD has neither.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
DLSS 2.0 is now in Death Stranding, and it seems it is actually considered superior not just to FidelityFX + TAA but to the native image too, while still giving a huge performance boost (allowing even a 2060 to run the game at 4K at 60 fps).
FidelityFX CAS/Radeon Image Sharpening is a nice feature to have in games with TAA, mitigating the softening effect while still cleaning up jagged edges. The two in tandem provide better image quality than the raw image while being virtually free compared to traditional multisampling. I appreciate it being available at the driver level, or as an injector in ReShade for games like Deus Ex: Mankind Divided where the in-game sharpening is pretty bad, and it's nice to have in-game support for CAS to avoid oversharpening the UI elements.
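For reference, the core idea of CAS is simple enough to sketch: measure local contrast from the 3x3 neighbourhood min/max and sharpen less where contrast is already high. A simplified numpy illustration (my own toy version, not AMD's actual FidelityFX shader):

```python
import numpy as np

def cas_like_sharpen(img, strength=0.2):
    """Simplified contrast-adaptive sharpening, inspired by FidelityFX CAS
    (not AMD's actual shader). img: 2D float array with values in [0, 1]."""
    h, w = img.shape
    p = np.pad(img, 1, mode="edge")
    # Min/max over each pixel's 3x3 neighbourhood.
    shifts = [p[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)]
    lo = np.minimum.reduce(shifts)
    hi = np.maximum.reduce(shifts)
    # Adapt: sharpen less where local contrast is already high, so
    # saturated edges get no extra overshoot.
    amount = strength * np.minimum(lo, 1.0 - hi)
    # Cross-shaped unsharp kernel, weighted per pixel by `amount`.
    cross = p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
    return np.clip(img * (1 + 4 * amount) - amount * cross, 0.0, 1.0)
```

Note how flat regions pass through unchanged: the cross term exactly cancels the boosted centre, which is why CAS-style filters don't amplify noise the way a plain unsharp mask does.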

That said, yeah, DLSS 2.0 does appear to be the optimal solution, and AMD needs a better response to it than CAS. Can DirectML be used to make a vendor-agnostic alternative to DLSS? That remains to be seen, but AMD's ace in the hole here is the Xbox Series X/PS5. If such a solution is possible on RDNA 2, you can bet console developers will find it. They won't pass up a free performance and image quality boost.