My nVidia DLDSR Testing

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
Introduction
I thought I’d do a quick writeup of the new DLDSR just released in driver 511.23. DSR has been around for years, but this new version uses Tensor cores to downscale instead of the shaders used in the old version.

I used a 2070 with the 2.25x setting on a 1440p native display (4K effective), default 33% smoothing.

Performance
There’s a picture from nVidia showing Prey (2017) at DLDSR 2.25x having the same performance as native, so let’s start there. I picked four older games that are still quite GPU-limited at 1440p:

DLDSR.png


nVidia’s claim is clearly nonsense, and it’s obvious they cherry-picked a CPU-limited and/or frame-capped scene for their screenshot. If you’re rendering 4K worth of pixels, you’re going to take the performance hit of 4K no matter how you downsample later.
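For reference, the resolutions behind these factors can be worked out directly: the DSR/DLDSR factor multiplies the total pixel count, so each axis scales by its square root. A quick sketch in Python (the 1440p base resolution matches my test setup):

```python
# DSR/DLDSR factors scale total pixel count, so each axis scales by sqrt(factor).
native = (2560, 1440)

def render_res(base, factor):
    scale = factor ** 0.5
    return (round(base[0] * scale), round(base[1] * scale))

for factor in (2.25, 4.0):
    w, h = render_res(native, factor)
    print(f"{factor}x -> {w}x{h} ({w * h / (native[0] * native[1]):.2f}x the pixels of native)")
```

So 2.25x on a 1440p display renders a full 3840x2160, which is why a GPU-limited game can't match native performance.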

What’s even more interesting is the new version is actually slower than the legacy version.

Image Quality
A quick subjective check in about half a dozen games shows DL is much sharper than legacy, because nVidia are employing their old trick of applying massive oversharpening to the Tensor core output, à la DLSS. While it does look sharper, thin edges like ropes, cables and power lines have more pronounced stair-stepping with DL compared to legacy.

nVidia’s claim that 2.25x DL image quality matches 4x legacy is also false. 4x has better image stability than 2.25x DL because it’s rendering far more pixels, and also because it downscales at an integer ratio.
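The integer-scaling point is why 4x is special: the per-axis factor is exactly 2, so every display pixel is the plain average of a 2x2 block of rendered pixels, with no resampling kernel straddling pixel centres. A minimal NumPy sketch of that box filter (illustrative only, not nVidia's actual filter chain):

```python
import numpy as np

def box_downscale_2x(img):
    # At 4x DSR the per-axis factor is exactly 2: each output pixel is the
    # unweighted mean of a 2x2 block of rendered pixels (img is H x W x C).
    h, w, c = img.shape
    return img.reshape(h // 2, 2, w // 2, 2, c).mean(axis=(1, 3))
```

Non-integer factors like 2.25x can't do this; every output pixel blends fractional source pixels, which is where the "smoothness" filtering comes in.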

Conclusion
DLDSR has exactly the same problems as regular DSR. The biggest one is that the display drops to 60Hz in a lot of games, a problem that has been around for years, going all the way back to Windows 7. I’m amazed nVidia still haven’t fixed this.

Other problems are that some games’ UIs are too small, and it also messes with display scaling and/or the position of background windows.

In my opinion DLDSR is a complete waste of time.
 

Tup3x

Senior member
Dec 31, 2016
959
942
136
They don't claim the same performance, and obviously a higher resolution is going to be slower - anyone should realise that. Some older DX11 games are just so CPU-bound that using it causes only a small hit. I tested 1,78x DLDSR in Skyrim and it looked good there.

1,78x DLDSR:
Screenshot (2724).jpg

1,78x DSR:
Screenshot (2725).jpg

Performance was identical, but DLDSR looked better in my opinion. Kinda funny to do that with DLSS enabled, but it does show that tensor processing was not a bottleneck here.

Same performance, slightly different image output. More options for the user to pick from.
 

Leon

Platinum Member
Nov 14, 1999
2,215
4
81
Made a big difference in Witcher 2 and Witcher 3 (compared to native 4K). Cleaned up foliage nicely, and looks a bit sharper too. I fixed the 60Hz issue by temporarily setting the desktop resolution to 5760x3240 @ 144Hz from the Nvidia panel.
Old DSR was too heavy at 4x, and useless at any other ratio, as it required applying the "smoothness" aka blur filter.
 

Tup3x

Did some image quality testing with AA off. The results are quite interesting. Previously I was surprised by how well it removes aliasing, and it definitely does something more than simple downscaling with a sharpness control.

These are 300% zooms without interpolation (you should view them without any scaling, or the results will be misleading). The rendering resolution was 3413x1920. The small differences in colours are from the dynamic day/night cycle and clouds.
lanczos.png
This was resized to 2560x1440 using the Lanczos resampler (in XnView).

dldsr.png
This is the DLDSR (50% sharpness) image. It does a much better job of removing aliasing. By the way, GeForce Experience captures the downsampled and processed image.

Lanczos:
lanczos2.png

DLDSR:
dldsr2.png
I guess it applies some form of FXAA (likely a further-developed version) and then a sharpness filter.
 

BFG10K

The awful sharpening kills it, along with having the same problems as regular DSR. Switching desktop resolutions is not a solution, especially since it's a per-game issue.

And I won't ever install the excrement that is GFE. I specifically strip it out of my driver package before installation.
 

Tup3x

Sharpening is a non-issue since there's a slider for that. Saying that "awful sharpening kills it" makes no sense because it's untrue. If one is too lazy to set one slider to max then I don't know what to say.
 

BFG10K

Sharpening is a non-issue since there's a slider for that. Saying that "awful sharpening kills it" makes no sense because it's untrue. If one is too lazy to set one slider to max then I don't know what to say.
Nonsense. It needs an option to disable the sharpening at the source, not to apply a 100% blur filter over the top of it.
 

BFG10K

Watch the DF video. It isn't a blur filter.
I did watch it. It's the only setting that affects the sharpening, but it's after-the-fact. There's no setting that disables the sharpening at the source. I tested 0%, 33%, 50% and 100% long before that video, and none were satisfactory in comparison.

Their findings confirm DL is slower than regular DSR, which is exactly what I posted in the OP.

Their findings also confirm 4x with 0% is better with regard to image stability and sub-pixel aliasing, which is my comment in the OP about very thin lines.
 

Tup3x

Nonsense. It needs an option to disable the sharpening at the source, not to apply a 100% blur filter over the top of it.
Right, you clearly have no idea what you are talking about, or you rant for the sake of ranting. That slider works totally differently when using DLDSR and doesn't do what the name suggests. Setting it to 100% means no sharpening, and setting it to 0% means truckloads of sharpening. I personally think the sweet spot is 85% to 100% (it will depend on the game, since some may have super blurry TAA). It doesn't add a blur filter.
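One way to picture the inverted slider both sides are describing - purely an illustrative model, not NVIDIA's actual pipeline - is a variable-strength unsharp mask whose strength runs opposite to the slider value:

```python
import numpy as np

def apply_smoothness(img, blurred, smoothness):
    # Illustrative model of the inverted DLDSR "smoothness" slider:
    # smoothness 100 -> amount 0 (no sharpening),
    # smoothness 0   -> amount 1 (maximum sharpening).
    amount = (100 - smoothness) / 100.0
    return img + amount * (img - blurred)  # classic unsharp mask
```

Under this model, 100% smoothness passes the image through untouched rather than blurring it, which matches the "it doesn't add a blur filter" reading.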

To me it sounds like you haven't even used it or looked properly. I've done quite a bit of pixel peeping, since I've actually captured screenshots and flipped between them so I don't have to rely on memory.

If you use the same downsampling resolution for both DSR and DLDSR, then DLDSR comes out ahead with a minuscule difference in performance. That alone is a great thing. If you are really getting worse performance with DLDSR 2,25x than DSR 4x, then there's some kind of user error happening. When I tested the performance difference in Shadow of the Tomb Raider, the performance was essentially identical. (Alternatively, Turing takes a much higher hit than Ampere.)

Their findings also confirm 4x with 0% is better with regard to image stability and sub-pixel aliasing, which is my comment in the OP about very thin lines.
Not always true. DLDSR reduces sub-pixel shimmering and handles edge aliasing better. Thin lines are obviously not going to look better, because the underlying data just isn't there.
 

BFG10K

That slider works totally differently when using DLDSR and it doesn't do what the name suggests. Setting it to 100% means no sharpening and setting it to 0% means truckloads of sharpening.
I misspoke when describing the slider in that case, but the results are the same. Even 100% is unnaturally sharper than it should be.

Also 1440p native + FXAA provides smoother edges than 2.25x + 100%, especially during movement.

I've done quite a bit of pixel peeping since I've actually captured screenshot and flipped between them so I don't have to rely on memory.
Pixel peeping is useless for testing image stability. I played half a dozen different games over the course of several hours - legacy games using basic rasterization, which don't use modern post-filtering to mask these problems.

...And that steaming pile of a 60Hz problem was in every one of them.

If you are really getting worse performance with DLDSR 2,25x than DSR 4x then there's some kind of user error happening.
I never said that. I said DL 2.25x is slower than legacy 2.25x. I also said 4x with 0% has superior IQ to any 2.25x DL slider value.
 

Tup3x

Let's agree to disagree then. I don't think there's any oversharpening involved when smoothness is set to 100%. On top of that, I can happily use 2,25x at 165Hz and the performance penalty is almost within the margin of error.

I'd personally never use 2,25x alone to smooth out edges (or even 4x DSR). The point is, it does a better job than 2,25x DSR - at least in my opinion. I'd always use at least driver FXAA (which is better than any in-game implementation) combined with DSR/DLDSR.

In any case, there are now more options and it's not a bad thing.
 

CakeMonster

Golden Member
Nov 22, 2012
1,389
496
136
Sharpening is 100% unacceptable to me. I hate its artifacts and the general look.

I'll test this as soon as I can, but I already hear reports of major games where sharpening can't be disabled for now. That's quite disappointing, since I'm pretty sure upscaling is the future as monitor resolutions increase faster than processing power.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,226
5,228
136
Let's agree to disagree then. I don't think there's any oversharpening involved when smoothness is set to 100%. On top of that, I can happily use 2,25x at 165Hz and the performance penalty is almost within the margin of error.

I'd personally never use 2,25x alone to smooth out edges (or even 4x DSR). The point is, it does a better job than 2,25x DSR - at least in my opinion. I'd always use at least driver FXAA (which is better than any in-game implementation) combined with DSR/DLDSR.

In any case, there are now more options and it's not a bad thing.

Finally got a modern GPU, and I have been playing with this feature in a couple of games that have no AA; using 2.25x gives very decent AA.

In Witcher 3, I've been combining DLSS Quality with 2.25X DLDSR, and IMO it's pretty great.

So many settings to tweak and play with now.

Zero regrets paying a bit more for NVidia and the extra features.