
Question DLSS interaction with Global Illumination

coercitiv

Diamond Member
Jan 24, 2014
4,293
5,466
136
There's a new RTX showcase demo from Nvidia which allows turning RTX and DLSS features on and off in real time. A video showcase can be seen here.

Since DLSS can be switched on in real time, one can easily see the effect it has on the scene illumination. I'll include a sample below:
DLSS-Illumination-Interaction.gif

As you can see, enabling DLSS has a strong effect on perceived illumination for both highlights and shadows. I'd like to stress that it doesn't matter which of the two screenshots one likes best: if this were a real game scene, the designer would probably tweak the lighting to get closer to one presentation or the other, depending on artistic intent.

The big question is why DLSS would be tuned to affect perceived illumination in a way that, at best, makes it look like native rendering with RTX Global Illumination is not accurate enough.

PS: It would be great if the discussion does not immediately degenerate into DLSS BAD vs DLSS GOOD from the first page.
 
  • Like
Reactions: Elfear and Gideon

Dribble

Golden Member
Aug 9, 2005
1,896
423
136
There's an extra light, or the sun is brighter or more to the right, in the non-DLSS version from the look of it. Look at the sofa: there's a bunch of bright highlights. The right-hand side of pretty much everything in the scene looks more lit up. I can't see why or how DLSS would be doing that, particularly the big highlights on the sofa.
 
Last edited:

Mopetar

Diamond Member
Jan 31, 2011
5,747
2,509
136
Is it a case of the scene having different lighting across the day, with the DLSS training biased toward a certain time? If, for example, you had a game world with both day and night and only trained on evening scenes, it's going to be off at both midday and midnight.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,052
176
106
It looks like the same artifacts you'd get from "radiance clamping", as seen in many path tracers ...

This could stop DLSS from being used on games with ray-traced effects like dynamic global illumination or glossy reflections altogether ...
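The radiance-clamping idea above can be sketched in a few lines. This is a hedged illustration, not anything from the demo itself: per-sample radiance is capped before averaging, which kills "fireflies" but also dims legitimate bright highlights, matching the kind of clipped highlights being discussed. The `max_radiance` value is arbitrary.

```python
# Minimal sketch of radiance clamping as used in many path tracers.
# Per-sample radiance is clamped to a maximum before being averaged,
# which suppresses "fireflies" (rare, very bright samples) at the cost
# of losing energy in highlights, so they come out dimmer than ground truth.

def clamp_radiance(sample, max_radiance=10.0):
    return min(sample, max_radiance)

def pixel_estimate(samples, max_radiance=10.0):
    clamped = [clamp_radiance(s, max_radiance) for s in samples]
    return sum(clamped) / len(clamped)

# One very bright sample (a firefly) among mostly dim ones:
samples = [0.5, 0.4, 0.6, 500.0, 0.5]
unclamped = sum(samples) / len(samples)  # ~100.4, dominated by the firefly
clamped = pixel_estimate(samples)        # 2.4, highlight energy is lost
```

The trade-off is exactly what a clamped highlight on the sofa would look like: less noise, but bright spots that never reach their true intensity.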

 

GodisanAtheist

Platinum Member
Nov 16, 2006
2,975
1,458
136
At the end of the day, lighting is really a hugely, bigly interpretive thing.

What looks good for one person doesn't always look good for someone else etc.

DLSS is ultimately just an AI's guess at what an upscaled scene should look like. The irony here is that while it is most useful for scenes with heavy RT, I doubt the algorithm was ever actually trained on RT images. It's essentially guessing how the scene should look based on pre-baked non-RT data.

Maybe we'll eventually get a DLSS 3.0 that's heavily trained on ray-traced images, and perhaps even takes ray vector data as input, so it handles traced scenes better.
 

Olikan

Platinum Member
Sep 23, 2011
2,007
239
106
Maybe at a lower resolution the engine shoots fewer rays... fewer bounces, less illumination.
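That point is easy to quantify. Assuming roughly one primary ray per pixel and DLSS Ultra Performance's 3x-per-axis upscale (720p internal for a 4K output), the internal resolution cuts the primary ray count by a factor of nine. The resolutions below are illustrative, not measured from the demo:

```python
# Rough arithmetic: with ~1 primary ray per pixel, the internal render
# resolution directly sets how many primary rays are traced per frame.
# 1280x720 -> 3840x2160 assumes the Ultra Performance preset's 3x-per-axis scale.

native = 3840 * 2160       # rays traced at native 4K, 1 ray per pixel
internal = 1280 * 720      # rays traced at the DLSS internal resolution
ratio = native / internal  # 9.0: native traces 9x the primary rays
```

With 9x fewer rays (and correspondingly fewer bounces), the GI estimate feeding the upscaler is genuinely noisier and dimmer in places, which could plausibly contribute to the differences in the comparison.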
 

JoeRambo

Golden Member
Jun 13, 2013
1,078
897
136
Having actually played with the demo on a 3090 before this post, I was left in awe at just how amazing the DLSS performance gain was for such a small loss of quality. I think it uses the "ultra performance" DLSS preset.
(I guess Ampere cards are for mining only now, since no one even commented on the quite amazing spectacle in the demo.)

On the topic of lighting changes, I don't think it's fair to discuss them much yet:

1) Especially in the actual demo, turning DLSS on and off seems to recalculate and reseed the lighting values. You can see the effect from 31s to 33s in: the light patches on the sofa get bright and then get "iterated" over hundreds of frames to tone them down.
2) Ultra Performance DLSS works with too few pixels; when toying with the demo I was already wishing for DLSS quality controls.
3) We are in the early days of PBR techniques; as other posters here noted, DLSS will need to adapt from being optimized for the raster-hack era.
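The "iterated over hundreds of frames" behavior in point 1 is consistent with temporal accumulation, where each frame's noisy GI estimate is blended into a running average. A minimal sketch, with an illustrative blend weight (the real demo's accumulation scheme is not public):

```python
# Sketch of temporal accumulation: a reseeded (too-bright) lighting value
# converges toward the true value gradually, because each frame only
# blends in a small fraction of the new estimate. `alpha` is a typical
# illustrative blend weight, not a value from the demo.

def accumulate(history, new_sample, alpha=0.05):
    # Exponential moving average: keep most of the history,
    # blend in a small fraction of the new frame's estimate.
    return (1.0 - alpha) * history + alpha * new_sample

# A pixel reseeded far too bright (5.0) converging toward its true value (1.0):
value = 5.0
for _ in range(100):
    value = accumulate(value, 1.0)
# after 100 frames: value = 1.0 + 4.0 * 0.95**100, roughly 1.02
```

This is why toggling DLSS can make patches flare up and then slowly settle: the history buffer gets reseeded and needs many frames to converge again.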
 

Ajay

Diamond Member
Jan 8, 2001
8,426
3,264
136
Looks kind of like a contrast filter to me. Although, the highlights on the couch seem to have been clamped as @ThatBuzzkiller suggested. Also looks a bit like what happens when a scene is auto-leveled. Curious behavior - but I do prefer the scene after DLSS is activated.
 

coercitiv

Diamond Member
Jan 24, 2014
4,293
5,466
136
On the topic of lighting changes, I don't think it's fair to discuss them much yet:
I think it's fair game, but then again I did not start the thread with the intention of pointing out a flaw in DLSS, but rather to understand what's happening in the demo.

Looks kind of like a contrast filter to me. Although, the highlights on the couch seem to have been clamped as @ThatBuzzkiller suggested. Also looks a bit like what happens when a scene is auto-leveled. Curious behavior - but I do prefer the scene after DLSS is activated.
This is what I thought of first too: a filter of sorts. Some changes, like the removal of small reflections on the couch, do require another explanation though, hence the thread.

I too prefer the resulting scene with DLSS activated, although this type of result should be possible in the native presentation as well.
 

JoeRambo

Golden Member
Jun 13, 2013
1,078
897
136
Curious behavior - but I do prefer the scene after DLSS is activated.
You are not the only one. While DLSS certainly does some damage to image clarity (probably due to the ultra performance mode), I prefer it over the image without it.

But that is definitely subjective; we were held hostage by raster hacks for so long that we can't tell what is what at this point :)
 


JoeRambo

Golden Member
Jun 13, 2013
1,078
897
136
I think we are getting a proper test of DLSS in RTGI sooner than expected:


There is a section in the video about RT limitations and bugs; very interesting.
 
  • Like
Reactions: NTMBK

Gideon

Golden Member
Nov 27, 2007
1,248
2,348
136
Slightly off-topic, but the devs surprisingly did mention AMD's FSR in their FAQ, and what they say is not nice:



I hope this is the reason why AMD went back to the drawing board with FSR. Based on this quote alone:

1. FSR won't work with all the rendering techniques that work with DLSS (that alone is nearly a blocker).
2. In its current form it doesn't look better than Metro Exodus' or Unreal Engine's TAA upsampling.

I wouldn't really mind the last point that much if it offered temporal stability similar to DLSS, used higher-resolution mipmaps for textures, and worked on a wide variety of games.

Now I'm beginning to worry. No wonder AMD doesn't want to show it off yet (hopefully they'll improve it).
 
  • Like
Reactions: GodisanAtheist

Leeea

Senior member
Apr 3, 2020
599
683
96
Slightly off-topic, but the devs surprisingly did mention AMD's FSR in their FAQ, and what they say is not nice:



I hope this is the reason why AMD went back to the drawing board with FSR. Based on this quote alone:

1. FSR won't work with all the rendering techniques that work with DLSS (that alone is nearly a blocker).
2. In its current form it doesn't look better than Metro Exodus' or Unreal Engine's TAA upsampling.

I wouldn't really mind the last point that much if it offered temporal stability similar to DLSS, used higher-resolution mipmaps for textures, and worked on a wide variety of games.

Now I'm beginning to worry. No wonder AMD doesn't want to show it off yet (hopefully they'll improve it).
I suspect that developer is making this up. 4A has always had a particular affinity for proprietary Nvidia tech. With Metro Exodus, 4A also embraced HairWorks and PhysX. I believe that makes it the last game released to support Nvidia's hardware-based PhysX, for all you people still running a second video card for PhysX ...
 
Last edited:

GodisanAtheist

Platinum Member
Nov 16, 2006
2,975
1,458
136
I suspect that developer is making this up. 4A has always had a particular affinity for proprietary Nvidia tech. With Metro Exodus, 4A also embraced HairWorks and PhysX. I believe that makes it the last game released to support Nvidia's hardware-based PhysX, for all you people still running a second video card for PhysX ...
- I doubt the developer is lying.

Yes, they're definitely a pro-NV dev house, but they have to work with and cater to AMD hardware as well if they're going to do anything with the consoles. Additionally, there are many more elegant ways to put this in "corporate speak" to decline support for the AMD feature without burning the bridge at the same time.

In all likelihood, they had access to some prototype of what AMD is working on for their Enhanced Edition release, to test performance limits on consoles, and just didn't like what they saw. The fact that AMD is so mum on this feature likely means they've gotten similar feedback from other dev houses privately and have to go back to the drawing board.

Consoles have used any number of tricks in the past to keep performance at acceptable levels, and I think AMD's first FSR techniques will mirror those (dynamic resolution and checkerboard rendering) to buy them some time.
 
  • Like
Reactions: Gideon

Gideon

Golden Member
Nov 27, 2007
1,248
2,348
136
Consoles have used any number of tricks in the past to keep performance at acceptable levels, and I think AMD's first FSR techniques will mirror those (dynamic resolution and checkerboard rendering) to buy them some time.
Speaking of which, Returnal has an interesting solution with checkerboarding and temporal upscaling. Looks quite good considering the base resolution:
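For anyone unfamiliar with the technique, here's a toy sketch of checkerboard rendering, not Returnal's actual implementation: each frame shades only half the pixels in a checker pattern, alternating which half, and fills the gaps from the previous frame.

```python
# Toy checkerboard renderer: each frame shades the pixels where
# (x + y) % 2 matches the frame's parity, and reuses the other half
# from the previous frame. Real implementations also reproject the
# reused pixels with motion vectors; that part is omitted here.

def render_checkerboard(shade, width, height, prev, parity):
    frame = [[0.0] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if (x + y) % 2 == parity:
                frame[y][x] = shade(x, y)  # freshly shaded this frame
            elif prev is not None:
                frame[y][x] = prev[y][x]   # reused from the last frame
    return frame

# Over two frames with alternating parity, every pixel gets shaded once,
# at half the per-frame shading cost of a full render.
shade = lambda x, y: float(x + y)
f0 = render_checkerboard(shade, 4, 4, None, 0)
f1 = render_checkerboard(shade, 4, 4, f0, 1)
```

The appeal for consoles is obvious: half the shading work per frame, with artifacts mostly confined to fast motion where the reused half is stale.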

 

Gideon

Golden Member
Nov 27, 2007
1,248
2,348
136
OK, I might have made a mountain out of a molehill. 4A released a corrected statement:

4A Games has not evaluated the AMD FidelityFX Super Resolution feature for Metro Exodus at this time. In our FAQ, we were referring to the AMD FidelityFX open source image quality toolkit which targets traditional rendering techniques that our new RT only render does not use, and noting that we have our own Temporal Reconstruction tech implemented natively which provides great quality benefits for all hardware, so do not currently plan to utilize any other toolkits. 4A Games is always motivated to innovate, evaluate, and use the newest technologies that will benefit our fans across all platforms and hardware.
So they have not tested FSR yet, just the current FidelityFX toolkit, which right now offers very little for an RT-only renderer.
 

JoeRambo

Golden Member
Jun 13, 2013
1,078
897
136
What is really scary in this whole FSR/FidelityFX debacle is that a premier AAA developer, with the first-ever game with a full RT pipeline out, has not even heard from AMD about a feature that is absolutely critical to RT performance.
One has to wonder how far it is from release. A year until the first games show up? Two years? Releasing right after Primitive Shaders for Vega?

Really strange priorities from AMD. You'd think the company with weaker RT hardware would rush to implement a DLSS-like technique that can up to double performance for a manageable loss of quality. Nope, they take their sweet time.
 

Mopetar

Diamond Member
Jan 31, 2011
5,747
2,509
136
Better that they take their time. The original DLSS from Nvidia was awful and soured a lot of opinions on the technology as a whole. First impressions matter a lot and I suspect that even after a year that AMD is going to unveil something that's similarly cruddy.
 

Dribble

Golden Member
Aug 9, 2005
1,896
423
136
What is really scary in this whole FSR/FidelityFX debacle is that a premier AAA developer, with the first-ever game with a full RT pipeline out, has not even heard from AMD about a feature that is absolutely critical to RT performance.
One has to wonder how far it is from release. A year until the first games show up? Two years? Releasing right after Primitive Shaders for Vega?

Really strange priorities from AMD. You'd think the company with weaker RT hardware would rush to implement a DLSS-like technique that can up to double performance for a manageable loss of quality. Nope, they take their sweet time.
DLSS takes custom hardware, and the AMD cards don't have it; beyond that, it takes a supercomputer to train the network that runs on that hardware, which AMD probably doesn't have either. It must also have taken Nvidia about four years of work on the software to get where they are now. Hence it's not surprising AMD is having trouble replicating DLSS.
 
  • Like
Reactions: GodisanAtheist

Ajay

Diamond Member
Jan 8, 2001
8,426
3,264
136
beyond that, it takes a supercomputer to train the network that runs on that hardware, which AMD probably doesn't have either.
They don't need a top supercomputer running in the petaflop range, and I'm sure they can acquire the talent needed to do the programming.
So, no big deal.
 

Stuka87

Diamond Member
Dec 10, 2010
5,366
1,158
136
DLSS takes custom hardware, and the AMD cards don't have it; beyond that, it takes a supercomputer to train the network that runs on that hardware, which AMD probably doesn't have either. It must also have taken Nvidia about four years of work on the software to get where they are now. Hence it's not surprising AMD is having trouble replicating DLSS.
This is simply false. Nvidia developed Tensor Cores for datacenter use. When building cards for consumer use, they decided to add a small number of these cores, which have very little use in a consumer workload. They decided to have some parts of DLSS use them, as otherwise they would sit there doing nothing at all, taking up die space and consuming power.

Nvidia's thought process behind dedicated hardware for both RT and DLSS is vendor lock-in, which has been their goal for pretty much every single piece of proprietary tech they have developed in the last 20 years.
 
