Question DLSS 2.0

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
There is no thread on this, and it's worth one because DLSS finally seems to be realising its potential. There are many articles discussing it elsewhere - it's used by Wolfenstein Youngblood, Control and a few other games, and it works really well. Allegedly it's easy to add - no per-game training required. It gives a good performance increase and looks really sharp.

Nvidia article.
wccf article.
eurogamer article

The above articles have some good comparison screen shots that really demonstrate what it can do.

Discuss...
 
  • Like
Reactions: DXDiag

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
No dedicated thread, but it's already been talked about in a few other long-running threads.

The "no per-game training required" part, however, is wrong. The game maker still has to submit the game to nVidia, and nVidia has to create a profile for it.

The performance increase is just the result of running the game at a lower resolution - say, rendering at 1600p on a 4K monitor.
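
As a rough back-of-the-envelope check on where that speed-up comes from (a sketch using the 1600p example above, not a benchmark - fixed per-frame costs are ignored), the pixel counts alone tell most of the story:

```python
# Rough pixel-count comparison: internal render resolution vs. native 4K.
# This ignores fixed per-frame costs (CPU, post-processing, the upscaler
# itself), so it is an upper bound on shading work saved, not a measured
# fps gain.
native_4k = 3840 * 2160        # 8,294,400 pixels
internal_1600p = 2560 * 1600   # 4,096,000 pixels (the "1600p" example above)

ratio = native_4k / internal_1600p
print(f"Native 4K shades {ratio:.2f}x as many pixels as 1600p")  # ~2.02x
```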

This is what DLSS 1.0 probably should have been, as it would have been reviewed better. It still won't match native resolution, and it's arguable whether it's better than upscaling with image sharpening. It would need to be put in games besides Control, which IMO is not a good-looking game.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,115
136
It's unfortunate that DLSS 2.0 isn't seamless. For a guy who holds on to graphics cards for a long time, it sounds like a good way to stretch a card another year or two past its expiry date.

What it doesn't sound like is something I'd want to have to activate right after buying a card - that just means I didn't buy enough card.
 
  • Like
Reactions: Tlh97 and maddie

Guru

Senior member
May 5, 2017
830
361
106
To me DLSS is just not valuable enough. If I bought a lower-end video card, I'd just lower certain settings and improve performance that way, with a small loss to image quality. There are a ton of YouTubers who test image quality settings and provide the optimal settings at which the game looks almost as good as with everything maxed but performs much better - in some cases 25% to 35% better performance for virtually no loss of visual quality.

So to me, if I bought a lower-end GPU, I'd just look to play at the optimal settings; otherwise, as GodisanAtheist said, I just didn't buy the right GPU for the resolution I want to play at. Why would I buy, say, a GTX 1660 Super if I want to play at 4K resolution? It doesn't make sense. And why would I want to play at lower than 1080p but upscaled, when I can play at 1080p and just use the optimal settings?

To me, I just don't see the point of DLSS at all. Just give me better cards, Nvidia, not the overpriced TURDS that the Turing series was. Thank god AMD came out with the RX 5700 series to force Nvidia to come out with the Super cards, which are what the Turing cards were supposed to be from the start, not the overpriced turds that they were.

Nvidia released cards as close to garbage as possible at the start, with insane pricing, and gave us DLSS instead. Hey, here is an overpriced $350 2060 that should have been $250 from the start, or should have released as the 2060 Super from the start, but you can now play at 640x480 resolution upscaled to gain more performance, because our garbage card can't even handle 1080p in 2019!

So to me the whole technology is pointless. If you want to upscale games for whatever reason, then just resolution upscaling them and using an image sharpener is the simplest and easiest way to do it, and it gives the most consistent visual quality across ALL games. If you want to get more frames from a game, then find the optimal settings that keep most of the visual fidelity but gain a ton of performance (there are sites and YouTube channels that test these), or just buy a card SUITED to what you need and want!
 

DXDiag

Member
Nov 12, 2017
165
121
116
The "no per-game training required" part, however, is wrong. The game maker still has to submit the game to nVidia, and nVidia has to create a profile for it.
No they don't. Game makers just have to create a DLSS path in their engine. That's it. NVIDIA already trained DLSS 2 to be a generic solution using thousands of game images.
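
To give a rough idea of what "a DLSS path in the engine" means in practice, here's a simplified, hypothetical sketch - the class and function names below are illustrative placeholders, not NVIDIA's actual NGX/DLSS SDK. The engine renders at a reduced resolution with a per-frame sub-pixel jitter and hands the upscaler its color, depth and motion-vector buffers every frame:

```python
# Hypothetical per-frame integration sketch. TemporalUpscaler, render_scene,
# halton_jitter, post_process, etc. are made-up placeholder names, NOT the
# real NGX/DLSS API - this only illustrates which inputs the engine supplies.

class TemporalUpscaler:
    """Stand-in for the vendor-provided, generically pre-trained upscaler."""
    def evaluate(self, color, depth, motion_vectors, jitter, output_size):
        # History reprojection + network inference happen inside the vendor
        # library; the engine's job is just to provide these inputs.
        raise NotImplementedError

def render_frame(engine, upscaler, frame_index,
                 native=(3840, 2160), scale=2 / 3):
    # Internal resolution, e.g. 2560x1440 for 4K "Quality" mode.
    internal = (int(native[0] * scale), int(native[1] * scale))

    # 1. Offset the projection by a different sub-pixel jitter each frame,
    #    so successive frames sample different positions within a pixel.
    jitter = engine.halton_jitter(frame_index)

    # 2. Render the scene at the reduced internal resolution.
    color, depth, motion_vectors = engine.render_scene(internal, jitter)

    # 3. The upscaler reconstructs a native-resolution image from the
    #    current low-res frame plus its accumulated history.
    output = upscaler.evaluate(color, depth, motion_vectors, jitter, native)

    # 4. UI and post-processing run at native resolution.
    return engine.post_process(output)
```

The model itself is generic, so the per-game work is wiring up those buffers rather than running a new training pass.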

It still won't match native resolution, and it's arguable whether it's better than upscaling with image sharpening.
It's now either on equal terms with native resolution + TAA, or even slightly better. Every analysis of DLSS 2 has come to that conclusion. Both native res + TAA and DLSS 2 have advantages and disadvantages, but DLSS 2's advantages are simply far greater.

then just resolution upscaling them and using an image sharpener is the simplest and easiest way to do it,
No it's not. DLSS 2 provides better IQ than even native + TAA, so plain upscaling will be even worse.
then find the optimal settings that keep most of the visual fidelity but gain a ton of performance
DLSS 2 typically provides about 40% to 70% more performance at the same IQ; you would have to use very low or Medium settings to gain that amount of fps, with huge cuts to image quality as a result.
It would need to be put in games besides Control, which IMO is not a good-looking game.
It's already in 4 more games:
Wolfenstein Youngblood
Deliver Us The Moon
MechWarrior 5
Bright Memory
 

Guru

Senior member
May 5, 2017
830
361
106
DLSS 2 typically provides about 40% to 70% more performance at the same IQ; you would have to use very low or Medium settings to gain that amount of fps, with huge cuts to image quality as a result.
Source? Isn't DLSS 2 basically only available for Control? And I don't even know if there are actual performance tests done on it yet; I've only seen image quality comparisons.
 

DXDiag

Member
Nov 12, 2017
165
121
116
Source? Isn't DLSS 2 basically only available for Control? And I don't even know if there are actual performance tests done on it yet; I've only seen image quality comparisons.

Read everything here, as I said at least 4 games are available with DLSS 2 right now:

As for performance, watch DF analysis:

Also see here: 50% to 90% more performance going from native 4K to DLSS Quality/1440p, at basically the same image quality level, or better.
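
Those percentages line up with simple frame-time arithmetic (a toy model with assumed numbers, not measurements): DLSS Quality renders 2560x1440 internally, which is 2.25x fewer pixels than 3840x2160, but part of each frame (CPU work, post-processing, the upscaler itself) doesn't shrink with resolution, so the realised gain lands below the raw pixel saving:

```python
# Toy frame-time model. The split between resolution-dependent work and
# fixed per-frame cost is an assumption for illustration, not measured data.
pixels_native = 3840 * 2160
pixels_quality = 2560 * 1440
pixel_ratio = pixels_native / pixels_quality   # 2.25x fewer pixels to shade

def fps_gain(fixed_fraction, frame_time_native=1.0):
    """fixed_fraction: share of the native frame time that does NOT scale
    with resolution (CPU, post-processing, upscaler overhead)."""
    fixed = frame_time_native * fixed_fraction
    scaled = (frame_time_native - fixed) / pixel_ratio
    return frame_time_native / (fixed + scaled) - 1.0

for f in (0.1, 0.25, 0.4):
    print(f"fixed cost {f:.0%}: ~{fps_gain(f):.0%} more fps")
# fixed cost 10%: ~100% more fps
# fixed cost 25%: ~71% more fps
# fixed cost 40%: ~50% more fps
```

Depending on how GPU-bound a given scene is, that puts you right in the 50% to 90% ballpark quoted above.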

[Attached: two performance comparison charts]
 

Guru

Senior member
May 5, 2017
830
361
106
Read everything here, as I said at least 4 games are available with DLSS 2 right now:

As for performance, watch DF analysis:

Also see here: 50% to 90% more performance going from native 4K to DLSS Quality/1440p, at basically the same image quality level, or better.

[Quoted attachments: performance comparison charts]
I don't see anything that would make it valuable over what I've already said. No, native 4K high-quality textures are better than DLSS-upscaled ones; you can see this in games that do have native 4K textures - most games actually don't.

If you compare 1440p upscaled to 4K and apply a sharpening filter, it's competing with DLSS 2 in quality and performance.

Again, I don't see how this is good for anything other than marketing bullshit when they don't provide good graphics cards. If you want to play at 4K, buy a GPU that can handle that; otherwise DLSS or resolution upscaling or whatever is just a gimmick - it can't beat native quality.

If you want higher performance, then just use optimal settings - much simpler and easier. And if for some reason you want to upscale, then again resolution scaling + a sharpening filter is the easiest and most consistent way to do it for ALL games, at ALL times, hassle-free!
 

DXDiag

Member
Nov 12, 2017
165
121
116
If you compare 1440p upscaled to 4K and apply a sharpening filter, it's competing with DLSS 2 in quality and performance.
No it's not. HardwareUnboxed did that comparison and found DLSS to be vastly better. Once more, DLSS = native resolution, so your method of lower resolution plus a sharpening filter is vastly inferior.
If you want to play at 4K, buy a GPU that can handle that; otherwise DLSS or resolution upscaling or whatever is just a gimmick - it can't beat native quality.
There are no cards that can handle 4K max settings. And there won't be if you use RT.

If you want higher performance, then just use optimal settings - much simpler and easier.
Or just use DLSS, since it delivers the same quality, and greater performance.
it can't beat native quality.
It can, and it consistently does, according to the myriad quality comparisons out there.
And if for some reason you want to upscale, then again resolution scaling + a sharpening filter is the easiest and most consistent way to do it for ALL games, at ALL times, hassle-free!
That's not hassle-free at all - that's the definition of hassle itself. DLSS 2 is hassle-free, since it's just a single switch.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
It's gotta be the way forward. The video shows that you can run at 4K using 1080p + DLSS 2 and get image quality that's near identical - slightly worse in some areas due to a little haloing, but it does a better job of resolving detail in motion than native 4K, so overall similar. That's huge - it literally more than doubles your fps for no image quality loss.
I know a lot of people have to hate it because it's Nvidia, but it really is amazing what it can achieve. I would not even consider buying a card now that did not do DLSS.
 
  • Like
Reactions: Tlh97 and DXDiag

DXDiag

Member
Nov 12, 2017
165
121
116
It's gotta be the way forward. The video shows that you can run at 4K using 1080p + DLSS 2 and get image quality that's near identical - slightly worse in some areas due to a little haloing, but it does a better job of resolving detail in motion than native 4K, so overall similar. That's huge - it literally more than doubles your fps for no image quality loss.
I know a lot of people have to hate it because it's Nvidia, but it really is amazing what it can achieve. I would not even consider buying a card now that did not do DLSS.
If you upscale from 1440p (aka DLSS 2 Quality mode), you get even more quality, surpassing native resolution by a decent margin.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
People believe that DLSS 2.0 image quality is better than native simply because everyone is comparing it against native + TAA.
And this is because, from what I understand, DLSS 2.0 replaces the TAA in the game engine. So we cannot have DLSS 2.0 in a game that doesn't have TAA built in.

In most games that support TAA natively, the developers are opting for smoother images, but that creates blurriness.
So DLSS 2.0 at the Quality setting does look much better than native resolution + TAA.

What I would like to see is DLSS 2.0 against native resolution without TAA.
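
For anyone wondering where TAA's blurriness comes from, here's a minimal sketch of the general technique (grayscale, nearest-neighbour reprojection, no history clamping - not any particular engine's implementation): each frame is blended into a reprojected history buffer, and that exponential blend is what smooths edges but also softens detail. DLSS 2.0 takes over this accumulation step, which is why the comparisons are against native + TAA rather than against no AA at all:

```python
import numpy as np

# Minimal TAA accumulation sketch on a single-channel image. Real TAA uses
# bilinear history sampling and neighbourhood clamping to limit ghosting;
# this only shows where the temporal smoothing (and the blur) comes from.

def taa_accumulate(history, current, motion_vectors, alpha=0.1):
    """Blend the reprojected history with the current jittered frame.

    alpha is the weight of the new frame: a small alpha means strong
    temporal smoothing - less aliasing, but more softness in motion.
    """
    h, w = history.shape
    ys, xs = np.mgrid[0:h, 0:w]

    # Reproject: fetch each pixel's history from where it was last frame,
    # using the motion vectors (in pixels, x/y).
    src_y = np.clip(ys - motion_vectors[..., 1].round().astype(int), 0, h - 1)
    src_x = np.clip(xs - motion_vectors[..., 0].round().astype(int), 0, w - 1)
    reprojected = history[src_y, src_x]

    # Exponential moving average: this is the step that smooths jittered
    # edges over time and, if alpha is too small, averages fine detail away.
    return alpha * current + (1.0 - alpha) * reprojected
```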
 

DXDiag

Member
Nov 12, 2017
165
121
116
And this is because, from what I understand, DLSS 2.0 replaces the TAA in the game engine. So we cannot have DLSS 2.0 in a game that doesn't have TAA built in.
Almost all engines have TAA built in.

So DLSS 2.0 at the Quality setting does look much better than native resolution + TAA.
Yes, the comparison is native + TAA vs DLSS 2.
What I would like to see is DLSS 2.0 against native resolution without TAA.
Most engines now have TAA baked in; they even rely on it to resolve their AO, transparencies, reflections, etc. You can't turn TAA off without breaking those effects.
 
  • Haha
Reactions: Guru

Guru

Senior member
May 5, 2017
830
361
106
No it's not. HardwareUnboxed did that comparison and found DLSS to be vastly better. Once more, DLSS = native resolution, so your method of lower resolution plus a sharpening filter is vastly inferior.

There are no cards that can handle 4K max settings. And there won't be if you use RT.


Or just use DLSS, since it delivers the same quality, and greater performance.

It can, and it consistently does, according to the myriad quality comparisons out there.

That's not hassle-free at all - that's the definition of hassle itself. DLSS 2 is hassle-free, since it's just a single switch.
What the hell are you talking about? It's literally ONE game being tested. If you aren't blind and actually look at the pics that Techspot showed, you can clearly see the loss of detail and the darkening of textures that are a bit further away. So it's using darkening of textures to hide the loss of quality. Plus TAA is NOT the end-all be-all; ultimately no AA filter is perfect, they all have downsides, and I've found you don't really even need AA at 4K, or even at lower resolutions, because you don't notice the jagged edges as much in motion and it looks crispier without AA.

There also seems to be a sharpening filter applied in DLSS 2 - you can clearly see it in the text comparisons, where it's jagging the text badly; in fact, text looks worse in DLSS 2 than in DLSS 1.9. There is a heavy sharpening filter applied, and yeah, it does make the picture look crispier, but you can still see the loss of detail if you actually look.

And again, where is this "better quality" nonsense coming from? It's literally ONE game tested across all the sources I've seen - Techspot, Digital Foundry - one game, and if you are not blind you can CLEARLY see the loss of detail. Plus, as I've already said, not all games have 4K textures, so 4K is not going to look better than 1440p - maybe very subtly better, because it's just more pixels - and oftentimes 4K monitors are simply better than lower-res ones because they are more expensive. There are many exceptions obviously, but since 4K monitors are more expensive in general, the higher-end panels are more often reserved for 4K.

I mean Jesus, admit when you are wrong! Going through each Techspot pic, you can CLEARLY, CLEARLY see the loss of quality. Take the zoomed-in crop from the first set of pictures: you can clearly see it's blurrier, with a rather BIG loss of detail. The images on the board look blurrier and have lost detail, the metallic presentation objects look blurrier and have a very noticeable loss of detail, the shadow below has a freaking loss of detail and looks more like one uniform color; in native it looks more natural and not so dark and blurry, etc...

In the second set of zoomed pictures, even if you were blind you could see that it darkens the colors: the yellow wall is much darker, hiding the loss of detail by darkening it, but it's still easily perceivable - the color is more uniform, has no variation, has so much less detail. It's like a beginner artist vs an experienced top-level artist; the DLSS color is so much simpler, just uniform and dark.
The cardboard is a lot blurrier - again the monotone color, the darkening, just a massive loss of detail!

STOP treating us like idiots. Even if you were BLIND, DLSS is NOT better quality than 4K, is NOT the same quality as 4K - it is much lower quality at 4K, at 1440p, at 1080p, EASILY PERCEIVABLE!

ALL of the pictures CLEARLY show a very noticeable loss of quality! Even Nvidia's OWN MARKETING DEPARTMENT doesn't claim it's the same quality! Even Nvidia admits it's lower quality, traded for a higher framerate. Even they are not so brazen as to claim such nonsense!




Dial it back. Insults aren't allowed in tech, and you posted multiple insults.


esquared
Anandtech Forum Director
 
Last edited by a moderator:

DXDiag

Member
Nov 12, 2017
165
121
116
And again, where is this "better quality" nonsense coming from? It's literally ONE game tested across all the sources I've seen.
Nope, sources have tested:
-Control
-MechWarrior 5
-Wolfenstein Youngblood


Going through each Techspot pic, you can CLEARLY, CLEARLY see the loss of quality.
You can see whatever you like, but all sources have concluded that DLSS 2 is equal to or better than native res + TAA, whether Digital Foundry (in their Control and Wolfenstein analyses), TechSpot, Overclock3D, or the dozens of other sources on YouTube and tech sites. It's also my personal observation. You are free to think whatever you want, but that's probably your tainted view, nothing more, and it goes against what testers have experienced.
 

mikegg

Golden Member
Jan 30, 2010
1,755
411
136
People believe that DLSS 2.0 image quality is better than native simply because everyone is comparing it against native + TAA.
And this is because, from what I understand, DLSS 2.0 replaces the TAA in the game engine. So we cannot have DLSS 2.0 in a game that doesn't have TAA built in.

In most games that support TAA natively, the developers are opting for smoother images, but that creates blurriness.
So DLSS 2.0 at the Quality setting does look much better than native resolution + TAA.

What I would like to see is DLSS 2.0 against native resolution without TAA.
Regardless, it doesn't matter.

DLSS 2.0 with 1080p upscaled to 4k + ultra settings + maxed ray tracing on is playable on an RTX 2070.

That's incredible.

And it'll look way better than native 4k with settings turned down and no maxed ray tracing.

Nvidia's 7nm GPUs combined with DLSS 2.0 will allow mid-range cards to play in "4k" resolution with full ray tracing.

Nvidia has done an incredible job with the RTX generation. A lot of people hated on the RTX 2000 series but I think it will be remembered as one of the best ever.
 
  • Like
Reactions: DXDiag

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,115
136
I guess time will tell how easy it is to implement based on the number of games that ultimately support it.

Not to "both sides" this thing, as each camp has their own idiosyncrasies, but one of NV's hallmarks has been cool proprietary tech that sees minimal adoption only to be eventually abandoned (GPU PhysX and Gsync immediately leap to mind).

Hell, even many of the RTX titles in the initial announcement of the technology have dropped the RT part of the RTX feature set.
 

DisarmedDespot

Senior member
Jun 2, 2016
587
588
136
Nvidia has done an incredible job with the RTX generation. A lot of people hated on the RTX 2000 series but I think it will be remembered as one of the best ever.
I really doubt that. The non-Super cards launched with big price hikes and with only one game (Battlefield 5) using ray tracing at launch. They had to sell it on DLSS and a good-but-not-amazing boost over the last generation, with the $1200 2080 Ti being ~30% better in non-RT titles compared to the 1080 Ti. DLSS was a blurry mess, and the teased DLSS 2 (or whatever it was named) that would have a performance hit but improve image quality at native resolution never materialized. The Super cards were better, but more 'this is what it should've been in the first place' than anything worth singing praises for.

If DLSS finally works now, great. It just took 17 months to get into a working state.
 
  • Like
Reactions: Tlh97

mikegg

Golden Member
Jan 30, 2010
1,755
411
136
I really doubt that. The non-Super cards launched with big price hikes and with only one game (Battlefield 5) using ray tracing at launch. They had to sell it on DLSS and a good-but-not-amazing boost over the last generation, with the $1200 2080 Ti being ~30% better in non-RT titles compared to the 1080 Ti. DLSS was a blurry mess, and the teased DLSS 2 (or whatever it was named) that would have a performance hit but improve image quality at native resolution never materialized. The Super cards were better, but more 'this is what it should've been in the first place' than anything worth singing praises for.

If DLSS finally works now, great. It just took 17 months to get into a working state.
Nvidia is a business, not a charity. It optimizes prices based on competition, value, etc. It's not Nvidia's fault that AMD has nothing to compete with RTX on performance, ray tracing, DLSS, or driver stability. Nor is it Nvidia's fault that consumers are willing to pay those prices.

RTX brought two game-changing technologies to the mainstream: ray tracing and DLSS 2.0. Most generations don't bring any. That's why I think RTX will be remembered fondly.

Ray tracing had to start somewhere, even if it was just one game. Don't knock on Nvidia for being innovative.

PS. I own both AMD and Nvidia stocks.
 
Last edited:
  • Like
Reactions: xpea

tamz_msc

Diamond Member
Jan 5, 2017
3,770
3,590
136
I guess time will tell how easy it is to implement based on the number of games that ultimately support it.

Not to "both sides" this thing, as each camp has their own idiosyncrasies, but one of NV's hallmarks has been cool proprietary tech that sees minimal adoption only to be eventually abandoned (GPU PhysX and Gsync immediately leap to mind).

Hell, even many of the RTX titles in the initial announcement of the technology have dropped the RT part of the RTX feature set.

In what world has G-Sync been abandoned? Nvidia has expanded G-Sync to G-Sync compatible and monitor manufacturers are adopting that label left and right.
 
  • Like
Reactions: DXDiag

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,115
136

In what world has G-Sync been abandoned? Nvidia has expanded G-Sync to G-Sync compatible and monitor manufacturers are adopting that label left and right.

- NV has expanded G-Sync to be their version of FreeSync (basically a branded open standard), but the underlying proprietary tech that required a special monitor and module, etc. has been largely abandoned.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,770
3,590
136
- NV has expanded G-Sync to be their version of FreeSync (basically a branded open standard), but the underlying proprietary tech that required a special monitor and module, etc. has been largely abandoned.
That's incorrect. G-Sync and G-Sync Ultimate monitors still come with the module. It hasn't been abandoned. If anything manufacturers are abandoning FreeSync and are instead opting for G-Sync compatible branding.
 
  • Like
Reactions: DXDiag