[ Digital Trends ] Radeon Image Sharpening competes with DLSS


IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
[DLSS is] getting much better and is a welcome addition for games that use some form of RT. I know that for future games like Cyberpunk 2077, I'll happily enable DLSS if I can use RT and get OK performance.

Why not just run at 1800p and get the same performance?

You don't even need upscaling. Just make sure your games run at a certain resolution. I think it was Hardware Unboxed that did a good review on it and concluded 1800p = 4K DLSS.

By next year, we should see next-gen Nvidia cards. Say they boost performance by 25% normally and 40% with ray tracing on.

Also, you should play around with RT settings. You don't need Ultra with its massive performance loss when High/Medium, or even Low, might look just as good.
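For reference, the pixel-count arithmetic behind the 1800p comparison is easy to check (a quick Python sketch, just the math):

```python
# "1800p" is 3200x1800; native 4K (UHD) is 3840x2160.
pixels_1800p = 3200 * 1800        # 5,760,000 pixels
pixels_4k = 3840 * 2160           # 8,294,400 pixels

ratio = pixels_1800p / pixels_4k  # fraction of 4K's pixel load at 1800p
print(f"{ratio:.1%}")             # ~69.4%
```

So 1800p pushes roughly 69% of the pixels of native 4K, which is where the performance comparison comes from.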
 

FiendishMind

Member
Aug 9, 2013
60
14
81
Why not just run at 1800p and get the same performance?

You don't even need upscaling. Just make sure your games run at a certain resolution. I think it was Hardware Unboxed that did a good review on it and concluded 1800p = 4K DLSS.

By next year, we should see next-gen Nvidia cards. Say they boost performance by 25% normally and 40% with ray tracing on.

Also, you should play around with RT settings. You don't need Ultra with its massive performance loss when High/Medium, or even Low, might look just as good.
It's going to depend on the engine. Most game engines aren't really set up properly for ideal NN training yet.


This is probably why the quality of DLSS varies so much between games so far. This, along with the lack of DirectML support, seems to have significantly gimped DLSS on the quality side overall. A big part of Nvidia's recent RTX partnership efforts with the big game engines seems to have included better engine-level optimizations for NN training, so it will be interesting to see what DLSS looks like in upcoming titles when it's more ideally supported.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
You'll never get the full quality because it's working from data that doesn't exist.

The biggest problem with DLSS is not that the quality is sub-par, but that the performance loss roughly matches the quality loss of a simple resolution change. As Hardware Unboxed put it: you get 1800p quality and 1800p performance.

So what have they gained by using Tensor Cores and burning data-center compute on DLSS? Nothing.

Ahh, what am I doing? The OP is basically trolling with a post that has little to do with the title.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
You'll never get the full quality because it's working from data that doesn't exist.

The biggest problem with DLSS is not that the quality is sub-par, but that the performance loss roughly matches the quality loss of a simple resolution change. As Hardware Unboxed put it: you get 1800p quality and 1800p performance.

So what have they gained by using Tensor Cores and burning data-center compute on DLSS? Nothing.

Ahh, what am I doing? The OP is basically trolling with a post that has little to do with the title.
But it should be better; that's what conservation of information suggests. A simple upscaler has to use a generic algorithm that knows nothing about what's been drawn. DLSS should be like a highly compressed instruction manual on how to upscale a particular game, not a dumb one-size-fits-all filter. I'm not saying it's there yet, but that has to be the end goal.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
You'll never get the full quality because it's working from data that doesn't exist.

The biggest problem with DLSS is not that the quality is sub-par, but that the performance loss roughly matches the quality loss of a simple resolution change. As Hardware Unboxed put it: you get 1800p quality and 1800p performance.

So what have they gained by using Tensor Cores and burning data-center compute on DLSS? Nothing.

Ahh, what am I doing? The OP is basically trolling with a post that has little to do with the title.

Simple upscaling almost always uses bandlimited interpolation, so the higher frequencies are simply not there. The idea behind DLSS is to at least guess those higher frequencies. That's as far as the theory goes.
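The bandlimited point can be demonstrated numerically. This is a minimal NumPy sketch (my own illustration, nothing to do with either vendor's code): upsample a 1-D "scanline" with ideal bandlimited (FFT zero-padding) interpolation, then check that the new high-frequency bins carry essentially no energy.

```python
import numpy as np

rng = np.random.default_rng(0)
lo = rng.standard_normal(64)          # low-res signal; its Nyquist bin is 32

# 2x bandlimited upsampling: zero-pad the spectrum, inverse transform.
spec = np.fft.rfft(lo)                          # 33 bins (0..32)
padded = np.concatenate([spec, np.zeros(32)])   # 65 bins, for length-128 output
hi = np.fft.irfft(padded, n=128) * 2            # *2 preserves amplitude

# All bins above the source Nyquist are numerically zero: there is nothing
# there for any post-filter (sharpening included) to "recover".
hi_spec = np.fft.rfft(hi)
above_nyquist = np.abs(hi_spec[33:]).max()
print(above_nyquist)
```

The missing detail has to be hallucinated from somewhere, which is exactly the gap a trained network is supposed to fill.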
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
These are intriguing technologies. They could turn out to be bad / never supported, but let's hope for the best.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126

Confirms RIS is superior overall to DLSS in practice.

But it's extremely stupid of AMD not to support DX11, especially since this should be an API-agnostic postfilter.
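For context, RIS is built on AMD's FidelityFX CAS (Contrast Adaptive Sharpening) shader, which is why it should in principle be API-agnostic. Below is a loose CPU-side sketch of the idea; the cross-shaped neighborhood, weighting formula, and constants are my own simplification for illustration, not AMD's actual shader:

```python
import numpy as np

def cas_sharpen(img, strength=0.2):
    """Simplified contrast-adaptive sharpen on a 2-D float image in [0, 1]."""
    p = np.pad(img, 1, mode="edge")
    n = p[:-2, 1:-1]   # north neighbor
    s = p[2:, 1:-1]    # south
    w = p[1:-1, :-2]   # west
    e = p[1:-1, 2:]    # east
    c = img            # center
    mn = np.minimum.reduce([n, s, w, e, c])
    mx = np.maximum.reduce([n, s, w, e, c])
    # Adaptive amount: large where the local range leaves headroom toward
    # 0 and 1, zero where the neighborhood already spans the full range.
    amt = np.sqrt(np.clip(np.minimum(mn, 1.0 - mx) / np.maximum(mx, 1e-6), 0.0, 1.0))
    wgt = -amt * strength                          # negative lobe on the cross taps
    out = (c + wgt * (n + s + w + e)) / (1.0 + 4.0 * wgt)
    return np.clip(out, 0.0, 1.0)
```

The key trait is that the sharpening weight adapts to local contrast: flat regions pass through unchanged and pixels in already-high-contrast neighborhoods get little extra sharpening, which is what keeps this family of filters from ringing the way a fixed unsharp mask does.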
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,783
7,117
136

Confirms RIS is superior overall to DLSS in practice.

But it's extremely stupid of AMD not to support DX11, especially since this should be an API-agnostic postfilter.

-AMD's GPU division motto over the last several launches has been "snatching defeat from the jaws of victory" so it should really not surprise anyone...
 