Question DLSS 2.0

Dribble

Platinum Member
Aug 9, 2005
There is no thread on this, and it's worth one because it seems to finally be realising its potential. There are many articles discussing it elsewhere - it's used by Wolfenstein: Youngblood, Control and a few other games, and it works really well. Allegedly it's easy to add - no per-game training required. Gives a good performance increase and looks really sharp.

Nvidia article.
WCCFTech article.
Eurogamer article.

The above articles have some good comparison screenshots that really demonstrate what it can do.

Discuss...
 
  • Like
Reactions: DXDiag

AtenRa

Lifer
Feb 2, 2009
People believe that DLSS 2.0 image quality is better than Native simply because everyone is comparing it against Native + TAA.
And this is because, from what I understand, DLSS 2.0 replaces the TAA in the game engine. So we cannot have DLSS 2.0 in a game that doesn't have TAA built in.

In most games that support TAA natively, the game developers are opting for smoother images, but that creates blurriness.
So DLSS 2.0 at its Quality setting does look much better against Native Resolution + TAA.

What I would like to see is DLSS 2.0 against Native Resolution without TAA.
 

DisarmedDespot

Senior member
Jun 2, 2016
Nvidia has done an incredible job with the RTX generation. A lot of people hated on the RTX 2000 series but I think it will be remembered as one of the best ever.
I really doubt that. The non-Super cards launched with big price hikes and with only one game (Battlefield 5) using ray tracing at launch. They had to sell it on DLSS and a good-but-not-amazing boost over the last generation, with the $1200 2080 Ti being ~30% better in non-RT titles compared to the 1080 Ti. DLSS was a blurry mess, and the teased DLSS 2 (or whatever it was named) that would have a performance hit but improve image quality at native resolution never materialized. The Super cards were better, but more 'this is what it should've been in the first place' than anything worth singing praises for.

If DLSS finally works now, great. It just took 17 months to get into a working state.
 
  • Like
Reactions: Tlh97

AtenRa

Lifer
Feb 2, 2009
That is like saying "Deferred rendering is the same technology as Vulkan".

No, they are not the same. DLSS-like techniques can be made with DirectML, but *cough* someone needs to come up with middleware that takes motion vectors, previous frames and whatever else DLSS 2.0 needs to function. That is know-how, and that is what I meant AMD needs to come up with.
Maybe Microsoft will help, maybe game engine makers will step up. But as things stand now, it is easier for game makers to use DLSS 2.0, as long as they accept its NV-only status (and restrictions on resolutions).

AMD only has to provide the inference model and expose its hardware-specific architecture to the Microsoft DirectX 12 metacommands and drivers. The rest is up to DirectML and the game developers.

Also, DirectML is already part of Windows 10 (since version 1903), and we already know that the Unity game engine will use it.
Since all Xbox Series X games will be able to use DirectML, as well as every DX12-capable PC graphics card from both AMD and NVIDIA, it is not hard to see that the vast majority of games will use DirectML in the future and not NVIDIA's proprietary DLSS.
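To make the data flow concrete, here is a rough, hypothetical sketch (plain Python/NumPy rather than DirectML or any real SDK) of the interface such middleware would have to expose: it takes the low-resolution colour frame, the per-pixel motion vectors and the previous upscaled output, and returns a new upscaled frame. The function name and the 50/50 blend are illustrative placeholders for the trained network a vendor would actually supply.

```python
# Hypothetical sketch of a DLSS-style temporal upscaler interface.
# NumPy stands in for the real inference runtime (e.g. DirectML); the
# "network" in step 3 is just a placeholder blend, not an actual model.
import numpy as np

def temporal_upscale(lowres_rgb, motion_vectors, prev_output, scale=2):
    """lowres_rgb:     (h, w, 3) current frame rendered below target res
       motion_vectors: (h, w, 2) screen-space motion in output pixels
       prev_output:    (H, W, 3) previous upscaled frame, H = h * scale
       returns:        (H, W, 3) new upscaled frame"""
    h, w, _ = lowres_rgb.shape
    H, W = h * scale, w * scale

    # 1. Naive spatial upsample of the current frame (nearest neighbour).
    up = np.repeat(np.repeat(lowres_rgb, scale, axis=0), scale, axis=1)

    # 2. Reproject the previous output along the motion vectors so the
    #    accumulated history lines up with where objects are now.
    ys, xs = np.indices((H, W))
    mv = np.repeat(np.repeat(motion_vectors, scale, axis=0), scale, axis=1)
    src_y = np.clip((ys - mv[..., 1]).round().astype(int), 0, H - 1)
    src_x = np.clip((xs - mv[..., 0]).round().astype(int), 0, W - 1)
    history = prev_output[src_y, src_x]

    # 3. Placeholder for the learned part: the real network decides per
    #    pixel how much history to trust; here we simply blend 50/50.
    return 0.5 * up + 0.5 * history
```

The hard part is step 3: someone has to train the model, ship it and wire it into engines, whether that is NVIDIA through DLSS or a vendor-neutral DirectML equivalent.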

PS: take a look at the video below for more technical information about DirectML.
 

Stuka87

Diamond Member
Dec 10, 2010
No dedicated thread, but it's already been discussed in a few other long-running threads.

The "No per game training required" however is wrong. The game maker still has to submit the game to nVidia, and nVidia has to create a profile for it.

The performance increase is just the result of running the game at a lower resolution. So say, rendering at 1600p on a 4K monitor.
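To put rough numbers on that, here is a quick calculation using the commonly quoted DLSS 2.0 per-axis scale factors (Quality ~2/3, Balanced ~0.58, Performance 1/2); treat the exact values as approximate:

```python
# Rough pixel-count comparison against a 3840x2160 output target.
# The per-axis scale factors are the commonly quoted DLSS 2.0 modes;
# treat them as approximate, not official figures.
TARGET_W, TARGET_H = 3840, 2160
MODES = {
    "Native 4K":   1.0,
    "Quality":     2 / 3,   # ~2560x1440 internal render
    "Balanced":    0.58,
    "Performance": 0.5,     # ~1920x1080 internal render
}

native_pixels = TARGET_W * TARGET_H
for name, s in MODES.items():
    w, h = int(TARGET_W * s), int(TARGET_H * s)
    share = 100 * (w * h) / native_pixels
    print(f"{name:12s} {w}x{h:<5} -> {share:5.1f}% of native pixels")
```

Quality mode at a 4K target shades roughly 44% of the pixels of native 4K each frame, which is where the bulk of the speed-up comes from, minus the fixed cost of the upscale pass itself.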

This is what DLSS 1.0 probably should have been, as it would have been reviewed better. It still won't match native resolution, and it's arguable whether it's better than upscaling with image sharpening. It would need to be put in games besides Control, which IMO is not a good-looking game.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
It's unfortunate that DLSS 2.0 isn't seamless. For a guy who holds on to graphics cards for a long time, it sounds like a good way to stretch a card another year or two past its expiry date.

What it doesn't sound like is something I'd want to have to activate right after buying a card; that just means I didn't buy enough card.
 
  • Like
Reactions: Tlh97 and maddie

DXDiag

Member
Nov 12, 2017
The "No per game training required" however is wrong. The game maker still has to submit the game to nVidia, and nVidia has to create a profile for it.
No they don't. Game makers just have to create a DLSS path in their engine. That's it. NVIDIA already trained DLSS 2 to be a generic solution using thousands of game images.

It still won't match native resolution, and it's arguable whether it's better than upscaling with image sharpening.
It's now either on equal terms with native resolution + TAA or even slightly better. Every analysis of DLSS 2 has come to that conclusion. Both native res + TAA and DLSS 2 have advantages and disadvantages, but DLSS 2's advantages are simply far greater.

then just resolution upscaling them and using an image sharpener is the simplest and easiest way to do it,
No, it's not; DLSS 2 provides better IQ than even native + TAA, so upscaling will look even worse.
then find the optimal settings that keep most of the visual fidelity, but gain a ton of performance
DLSS 2 typically provides you with about 40% to 70% more performance at the same IQ; to gain that amount of fps otherwise you would have to use Very Low or Medium settings, with huge cuts to image quality as a result.
It would need to be put in games besides Control, which IMO is not a good looking game.
It's already in 4 more games:
Wolfenstein: Youngblood
Deliver Us The Moon
MechWarrior 5
Bright Memory
 

DXDiag

Member
Nov 12, 2017
Source? Isn't DLSS 2 basically only available for Control? And I don't even know if there are actual performance tests done yet on it, I've only seen quality comparison ones.

Read everything here; as I said, at least 4 games are available with DLSS 2 right now:

As for performance, watch DF analysis:

Also see here: 50% to 90% more performance going from native 4K to DLSS Quality/1440p, at basically the same image quality level, or better.

[Attached: benchmark screenshots]
 

DXDiag

Member
Nov 12, 2017
If you compare 1440p upscaled to 4K and apply a sharpen filter, it's competing with DLSS 2 in quality and performance.
No, it's not; HardwareUnboxed did that comparison and found DLSS to be vastly better. Once more, DLSS = native resolution, so your method of lower resolution plus a sharpen filter is vastly inferior.
If you want to play 4k, buy a gpu that can handle that, otherwise dlss or resolution upscaling or whatever is just a gimmick, it can't beat native quality.
There are no cards that can handle 4K at max settings, and there won't be any if you use RT.

If you want higher performance then just use optimal settings, much simpler and easier.
Or just use DLSS, since it delivers the same quality, and greater performance.
it can't beat native quality.
It can, and it actually does so consistently, according to the myriad quality comparisons.
And if for some reason you want to upscale then again resolution scaling+sharpen filter is the easiest and most consistent way to do it for ALL games, at ALL times, hassle free!
That's not hassle-free at all; that's the definition of hassle itself. DLSS 2 is hassle-free, since it's just a single switch.
 

Dribble

Platinum Member
Aug 9, 2005
It's gotta be the way forward. The video shows that you can run at 4K using 1080p + DLSS 2 and get image quality that's near identical - slightly worse in some areas due to a little haloing, but it does a better job of resolving detail in motion than native 4K, so overall similar. That's huge; it literally more than doubles your fps for no image quality loss.
I know a lot of people have to hate it because it's Nvidia, but it really is amazing what it can achieve. I would not even consider buying a card now that did not do DLSS.
 
  • Like
Reactions: Tlh97 and DXDiag

DXDiag

Member
Nov 12, 2017
And this is because, from what I understand, DLSS 2.0 replaces the TAA in the game engine. So we cannot have DLSS 2.0 in a game that doesn't have TAA built in.
Almost all engines have TAA built in.

So DLSS 2.0 at its Quality setting does look much better against Native Resolution + TAA.
Yes, the comparison is Native + TAA vs DLSS 2.
What I would like to see is DLSS 2.0 against Native Resolution without TAA.
Most engines now have TAA baked in; they even rely on it for their AO, transparencies, reflections, etc. You can't turn TAA off without breaking those effects.
 
  • Haha
Reactions: Guru

DXDiag

Member
Nov 12, 2017
And again, where is this "better quality" nonsense coming from? It's literally ONE game tested across all sources I've seen.
Nope, sources have tested:
-Control
-MechWarrior 5
-Wolfenstein: Youngblood


Going through each techspot pic and you can CLEARLY, CLEARLY See the loss of quality.
You can see whatever you like, but all sources have concluded that DLSS 2 is equal to or better than native res + TAA, whether Digital Foundry (in their Control and Wolfenstein analyses), TechSpot, Overclock3D, or the dozens of other sources on YouTube and tech sites. It's also my personal observation. You are free to think whatever you want, but that's probably your tainted view, nothing more, and it goes against what testers have experienced.
 

mikegg

Golden Member
Jan 30, 2010
People believe that DLSS 2.0 image quality is better than Native simply because everyone is comparing it against Native + TAA.
And this is because, from what I understand, DLSS 2.0 replaces the TAA in the game engine. So we cannot have DLSS 2.0 in a game that doesn't have TAA built in.

In most games that support TAA natively, the game developers are opting for smoother images, but that creates blurriness.
So DLSS 2.0 at its Quality setting does look much better against Native Resolution + TAA.

What I would like to see is DLSS 2.0 against Native Resolution without TAA.
Regardless, it doesn't matter.

DLSS 2.0 with 1080p upscaled to 4K + ultra settings + maxed ray tracing is playable on an RTX 2070.

That's incredible.

And it'll look way better than native 4K with settings turned down and without maxed ray tracing.

Nvidia's 7nm GPUs combined with DLSS 2.0 will allow mid-range cards to play at "4K" resolution with full ray tracing.

Nvidia has done an incredible job with the RTX generation. A lot of people hated on the RTX 2000 series but I think it will be remembered as one of the best ever.
 
  • Like
Reactions: DXDiag

GodisanAtheist

Diamond Member
Nov 16, 2006
I guess time will tell how easy it is to implement based on the number of games that ultimately support it.

Not to "both sides" this thing, as each camp has their own idiosyncrasies, but one of NV's hallmarks has been cool proprietary tech that sees minimal adoption only to be eventually abandoned (GPU PhysX and Gsync immediately leap to mind).

Hell, even many of the RTX titles in the initial announcement of the technology have dropped the RT part of the RTX feature set.
 

tamz_msc

Diamond Member
Jan 5, 2017
I guess time will tell how easy it is to implement based on the number of games that ultimately support it.

Not to "both sides" this thing, as each camp has their own idiosyncrasies, but one of NV's hallmarks has been cool proprietary tech that sees minimal adoption only to be eventually abandoned (GPU PhysX and Gsync immediately leap to mind).

Hell, even many of the RTX titles in the initial announcement of the technology have dropped the RT part of the RTX feature set.

In what world has G-Sync been abandoned? Nvidia has expanded G-Sync to G-Sync compatible and monitor manufacturers are adopting that label left and right.
 
  • Like
Reactions: DXDiag

GodisanAtheist

Diamond Member
Nov 16, 2006

In what world has G-Sync been abandoned? Nvidia has expanded G-Sync to G-Sync compatible and monitor manufacturers are adopting that label left and right.

- NV has expanded Gsync to be their version of Freesync (basically a branded open standard), but the underlying proprietary tech that required a special monitor and module, etc., has been largely abandoned.
 

tamz_msc

Diamond Member
Jan 5, 2017
- NV has expanded Gsync to be their version of Freesync (basically a branded open standard), but the underlying proprietary tech that required a special monitor and module, etc., has been largely abandoned.
That's incorrect. G-Sync and G-Sync Ultimate monitors still come with the module. It hasn't been abandoned. If anything, manufacturers are abandoning FreeSync and are instead opting for G-Sync compatible branding.
 
  • Like
Reactions: DXDiag

Dribble

Platinum Member
Aug 9, 2005
Freesync exists because of gsync. Gsync came out and then AMD started to think about freesync, which took several years to get working properly - and it basically started with monitor makers copying a gsync display to make a freesync one. Out of gsync came variable-rate displays for all. Hardly forgotten. I expect DLSS will be the same: Nvidia invents a new tech, markets it and makes money from it, and the rest of the market eventually catches up and it becomes ubiquitous.
 
  • Like
Reactions: DXDiag

BFG10K

Lifer
Aug 14, 2000
Freesync exists because of gsync.
Not true, VRR has been an implemented part of the VESA spec since 2009, far before gsync: https://vesa.org/featured-articles/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

Also gsync requires DP1.2 minimum, which again isn't a coincidence.

Gsync came out and then AMD started to think about freesync, which took several years to get working properly - and it basically started with monitor makers copying a gsync display to make a freesync one. Out of gsync came variable-rate displays for all. Hardly forgotten.
How was gsync "copied"? The VRR spec pre-dates gsync by years and specifically avoids proprietary hardware or licensing fees. More like nVidia copied an open standard and slapped their tax on it.

After freesync was implemented more and more people realized the utter lunacy of locking a monitor purchase to a graphics card and also paying a monitor tax, which forced nVidia's hand.

Exactly the same thing happened with SLI. Initially it was locked to nForce chipsets until Intel forced nVidia's hand.

I expect DLSS will be the same: Nvidia invents a new tech, markets it and makes money from it, and the rest of the market eventually catches up and it becomes ubiquitous.
Historically a feature locked to a single vendor almost never becomes ubiquitous. I mean who here claims hardware PhysX, TXAA or 3DVision are "ubiquitous"? These technologies are dead as a dodo despite being repeatedly hyped as the second coming on this very forum.

As for DLSS 2.0, it's certainly better than the fraudulent monstrosity that is 1.0, but it still needs per-game support and is locked to a single vendor. Unlike upscaling + sharpening, which works virtually everywhere.
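For anyone wondering what "upscaling + sharpening" amounts to in practice, here is a minimal sketch using Pillow: a plain bilinear resize followed by an unsharp mask. The filename and filter parameters are arbitrary examples, and GPU drivers implement this with sharpening shaders rather than Pillow, but the principle is the same.

```python
# Minimal sketch of the "upscale + sharpen" alternative: take a frame rendered
# at a lower resolution, resize it to the display resolution, then apply an
# unsharp mask. Filter parameters are arbitrary examples, not tuned values.
from PIL import Image, ImageFilter

def upscale_and_sharpen(path, target=(3840, 2160),
                        radius=1.5, percent=80, threshold=2):
    img = Image.open(path)                    # e.g. a 2560x1440 frame grab
    up = img.resize(target, Image.BILINEAR)   # spatial-only upscale
    return up.filter(ImageFilter.UnsharpMask(radius=radius,
                                             percent=percent,
                                             threshold=threshold))

upscale_and_sharpen("frame_1440p.png").save("frame_4k_sharpened.png")
```

Unlike a temporal solution, this adds no information that wasn't already in the frame, which is exactly why it works on every game and every GPU.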
 

Dribble

Platinum Member
Aug 9, 2005
Not true, VRR has been an implemented part of the VESA spec since 2009, far before gsync: https://vesa.org/featured-articles/vesa-adds-adaptive-sync-to-popular-displayport-video-standard/

Also gsync requires DP1.2 minimum, which again isn't a coincidence.

How was gsync "copied"? The VRR spec pre-dates gsync by years and specifically avoids proprietary hardware or licensing fees. More like nVidia copied an open standard and slapped their tax on it.
VRR in the VESA spec is not freesync. Like you said, it existed in 2009, but it was many years after the first gsync display that a properly working freesync one arrived. Would it have happened without gsync? Perhaps eventually, but the monitor makers are lazy. They wouldn't have put much effort into developing it without gsync. What gsync did was make them build compatible displays that could handle VRR, including variable overdrive, to the very strict gsync spec, and showed them it working. That was key to developing displays that could do freesync well.
Historically a feature locked to a single vendor almost never becomes ubiquitous. I mean who here claims hardware PhysX, TXAA or 3DVision are "ubiquitous"? These technologies are dead as a dodo despite being repeatedly hyped as the second coming on this very forum.
PhysX is the most common physics engine in games today, I think; the hardware bit not so much, as CPUs have improved, but that tech is still in use all over the place. 3D Vision has now died as VR became a thing, but it is the thing that pushed super-low-blur, high-refresh-rate monitors. Before that there were none, and for years the only ones that existed were for 3D Vision. Now it's standard; even if you never wanted 3D, you can thank 3D Vision for your high-refresh-rate, low-blur monitor.
As for DLSS 2.0, it's certainly better than the fraudulent monstrosity that is 1.0, but it still needs per-game support and is locked to a single vendor. Unlike upscaling + sharpening, which works virtually everywhere.
Your hatred of DLSS around here still burns bright, I see. DLSS 2 is just better upscaling + sharpening because it uses AI hardware to enable vastly more complex algorithms. That's the same hardware DLSS 1 used, but they've improved the software, and I'm sure they will continue to improve it. What's key is the hardware; with only shaders you are more limited, as you don't have the same image-processing compute power.
 
  • Like
Reactions: sxr7171 and DXDiag

GodisanAtheist

Diamond Member
Nov 16, 2006
DX and Dribble, you make good points, but ultimately your points support the underlying argument "DLSS 2.0 is good stuff, but if it's proprietary it's not likely to stick around".

I don't think there is much of an argument (at least not one I'm making) against "proprietary technology can lead to industry-wide support of an open standard", but there is an argument against proprietary tech introduced by either vendor hanging around for more than a couple of generations of cards.

PhysX - GPGPU PhysX is effectively dead at this point. PhysX as a CPU-driven competitor to Havok is nice, but not really unique or relevant to the conversation at large about proprietary tech, since it went from "only NV GPUs" to "runs on all CPUs".

Gsync - NV's proprietary implementation was definitely nice and adhered to much stricter tolerances than the open VRR standard or Freesync that followed, but with NV essentially conceding the race by opening their last two gens of cards to "Gsync Compatible", aka Freesync, monitors, you're going to see NV-only monitors dwindle and finally disappear (not that there were many to begin with).

I just used those examples, but you can also look at proprietary renderers like Glide or, more recently, Mantle, or ATI TruForm tessellation. Those techs were eventually scrapped and folded into "open" (not that DX is open) standards that everyone can use.
 

Hitman928

Diamond Member
Apr 15, 2012
VRR in the VESA spec is not freesync. Like you said, it existed in 2009, but it was many years after the first gsync display that a properly working freesync one arrived. Would it have happened without gsync? Perhaps eventually, but the monitor makers are lazy. They wouldn't have put much effort into developing it without gsync. What gsync did was make them build compatible displays that could handle VRR, including variable overdrive, to the very strict gsync spec, and showed them it working. That was key to developing displays that could do freesync well.

PhysX is the most common physics engine in games today, I think; the hardware bit not so much, as CPUs have improved, but that tech is still in use all over the place. 3D Vision has now died as VR became a thing, but it is the thing that pushed super-low-blur, high-refresh-rate monitors. Before that there were none, and for years the only ones that existed were for 3D Vision. Now it's standard; even if you never wanted 3D, you can thank 3D Vision for your high-refresh-rate, low-blur monitor.

Your hatred of DLSS around here still burns bright, I see. DLSS 2 is just better upscaling + sharpening because it uses AI hardware to enable vastly more complex algorithms. That's the same hardware DLSS 1 used, but they've improved the software, and I'm sure they will continue to improve it. What's key is the hardware; with only shaders you are more limited, as you don't have the same image-processing compute power.

The first Gsync monitors came out in early 2014. The first Freesync monitors came out in early 2015. How is that many years?
 
  • Like
Reactions: Tlh97