Metro Exodus Update Shows Huge Improvements With NVIDIA DLSS


RaV666

Member
Jan 26, 2004
76
34
91
The problem with DLSS is not that it doesn't work.
The problem is that it doesn't work as advertised at launch.
Let's get back to what they were claiming.
They were showing off screens of a MASSIVE improvement in IQ and 2x the performance!
There was talk of about 20 LAUNCH titles.
It's two-thirds of a year later. We have 3 games, a 30% perf improvement and a big IQ decrease.
There are various artifacts, there are objects missing or blurred in the distance, the latest Metro update has visible sharpening artifacts (I guess that's how they "fixed" it), and there are weird restrictions on when and on what you can use it.
Remember that?
https://inthegame.nl/wp-content/uploads/DLSS.png
Well, this isn't it.
They were selling it as a groundbreaking new AA mode.
What we got is a mediocre new upscaling method, with restrictions and very few games.
 


Arkaign

Lifer
Oct 27, 2006
20,736
1,379
126
The problem with DLSS is not that it doesn't work.
The problem is that it doesn't work as advertised at launch.
Let's get back to what they were claiming.
They were showing off screens of a MASSIVE improvement in IQ and 2x the performance!
There was talk of about 20 LAUNCH titles.
It's two-thirds of a year later. We have 3 games, a 30% perf improvement and a big IQ decrease.
There are various artifacts, there are objects missing or blurred in the distance, the latest Metro update has visible sharpening artifacts (I guess that's how they "fixed" it), and there are weird restrictions on when and on what you can use it.
Remember that?
https://inthegame.nl/wp-content/uploads/DLSS.png
Well, this isn't it.
They were selling it as a groundbreaking new AA mode.
What we got is a mediocre new upscaling method, with restrictions and very few games.

This is ultimately the basic truth of it. What's also unavoidable is that the tensor cores take up a certain percentage of the die and design, and outside of DLSS they have no other use in gaming thus far (not even rumored uses).

A basically identical die size, but with more standard cores and no tensor cores, might have upwards of 40-50% higher performance, more than enough to offset the DLSS feature, with 100% of titles and all resolutions benefiting from the uplift. E.g., instead of DLSS giving you ~1440p performance at 4K in some games, albeit slightly to hugely blurred, you'd get roughly 40-50% more performance across the board, probably enough to make native 4K viable in most of them.

I realize this is hypothetical. It goes back to an examination of the Turing die diagram, counting the percentage of the die used by each component: standard components made up roughly half of the die, then RT and tensor cores made up about a quarter each. You can run rough math from there to figure out the substitution for each item and the probable results. One limiting factor might be bandwidth: if you increased the standard cores by 50%, perhaps the actual performance increase would be more like 20-30% instead of a full 45-50%. I believe even that would be more desirable than what we got.
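
To put rough numbers on that substitution, here is a quick back-of-the-envelope sketch in Python. The area splits are the eyeballed figures above, not measurements, and the bandwidth efficiencies are illustrative guesses:

```python
# Back-of-the-envelope die-area substitution using the rough splits
# quoted above (eyeballed percentages, not measurements).
standard_area = 0.50   # shader cores, schedulers, cache, etc.
rt_area       = 0.25   # RT cores (kept as-is in this scenario)
tensor_area   = 0.25   # tensor cores

# Hypothetical: respend the tensor budget on more standard cores.
new_standard = standard_area + tensor_area
ideal_uplift = new_standard / standard_area - 1.0   # +50% shader area

# If memory bandwidth caps scaling (a guess, not a measurement),
# only a fraction of the extra shader area turns into frames.
for bw_efficiency in (1.0, 0.6, 0.4):
    realized = ideal_uplift * bw_efficiency
    print(f"bandwidth efficiency {bw_efficiency:.0%}: ~{realized:.0%} faster")
```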

Another way to look at it: if tensor cores were not on RTX cards at all, and the entire transistor/die budget were instead used to double the RT cores, it seems like that would have a dramatic effect on ray tracing. The 2060 would have the RT capability of the 2080 Ti, and the 2080 Ti would have 2080 Ti SLI RT performance.

That is, if they really are separate components that don't need each other to exist.

These are still early days though; maybe something really compelling will arise that harnesses the RT and/or tensor cores in a way we haven't even considered yet, or a mode where they can boost other areas.
 

NTMBK

Lifer
Nov 14, 2011
10,458
5,844
136
This is ultimately the basic truth of it. What's also unavoidable is that the tensor cores take up a certain percentage of the die and design, and outside of DLSS they have no other use in gaming thus far (not even rumored uses).

A basically identical die size, but with more standard cores and no tensor cores, might have upwards of 40-50% higher performance, more than enough to offset the DLSS feature, with 100% of titles and all resolutions benefiting from the uplift. E.g., instead of DLSS giving you ~1440p performance at 4K in some games, albeit slightly to hugely blurred, you'd get roughly 40-50% more performance across the board, probably enough to make native 4K viable in most of them.

I realize this is hypothetical. It goes back to an examination of the Turing die diagram, counting the percentage of the die used by each component: standard components made up roughly half of the die, then RT and tensor cores made up about a quarter each. You can run rough math from there to figure out the substitution for each item and the probable results. One limiting factor might be bandwidth: if you increased the standard cores by 50%, perhaps the actual performance increase would be more like 20-30% instead of a full 45-50%. I believe even that would be more desirable than what we got.

Another way to look at it: if tensor cores were not on RTX cards at all, and the entire transistor/die budget were instead used to double the RT cores, it seems like that would have a dramatic effect on ray tracing. The 2060 would have the RT capability of the 2080 Ti, and the 2080 Ti would have 2080 Ti SLI RT performance.

That is, if they really are separate components that don't need each other to exist.

These are still early days though; maybe something really compelling will arise that harnesses the RT and/or tensor cores in a way we haven't even considered yet, or a mode where they can boost other areas.

There's no way in hell the tensor cores take up that much die area. The same scheduler, register file, cache hierarchy, and shared memory/local data store are shared between the regular cores and the tensor cores. And we've seen with the 1660 Ti that if they leave out the tensor cores, they have to add dedicated FP16 cores to replace them, since the tensor hardware is what runs FP16 ops.

It's definitely a nice saving, but you're not going to get 40-50% higher performance!
 

DrMrLordX

Lifer
Apr 27, 2000
22,953
13,043
136
It's nice that Metro Exodus got an improvement. Seems a little late to the launch party, though.

Is there any sign that NV will do the same for other titles in the future? How long will it take them to "fix" blurry DLSS implementations?
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
I suppose the hope is that the power they have available to train DLSS will significantly increase over time, so...

That will take years, though.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Are we finally in the era of "cloud gaming"? Anyone remember the original PS3 and its Cell Technology!? My wording, but the gist of it: "even when your PS3 is OFF, it's still ON, providing a fraction of its processing power for background operations. A cluster of PS3s providing resources for other PS3s. The future is interconnected with Cell Technology!"

Fast forward to MSFT's Xbone cloud sales slogan: "backgrounds will be generated by powerful servers while the client side renders just the player objects. This allows for much more detailed objects, since the cloud will be rendering the backdrops," or something like that.

To quote the famous Sonic the Hedgehog: "I'm waaaaiiittttiiinnnggggg"
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
I suppose the hope is that the power they have available to train DLSS will significantly increase over time so.....

Years though.

Exactly. Nvidia's fundamental issue with DLSS is that they are running up against Nyquist: they have a low sampling rate for high-frequency data. It's a noob move, but marketing has their back.
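
For anyone who wants to see the Nyquist point concretely, here is a minimal sketch (NumPy only, using a 1-D sinusoid as a stand-in for fine image detail): sample below twice the detail frequency and it aliases to a lower frequency that no post-processing can recover.

```python
import numpy as np

# 1-D stand-in for fine image detail: a sinusoid at spatial
# frequency f_detail (cycles per unit length).
f_detail = 9.0

# Nyquist: you need more than 2 * f_detail samples per unit to capture it.
for sample_rate in (24.0, 12.0):        # samples per unit length
    n = int(sample_rate)
    x = np.arange(n) / sample_rate
    samples = np.sin(2 * np.pi * f_detail * x)
    # Which frequency actually dominates the sampled data?
    spectrum = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(n, d=1 / sample_rate)
    print(f"{sample_rate:>4} samples/unit: detail lands at "
          f"{freqs[spectrum.argmax()]:.0f} cycles/unit")

# 24 samples/unit (above Nyquist) recovers the 9-cycle detail;
# 12 samples/unit aliases it down to 3 cycles/unit, i.e. the detail
# is gone and no post-filter can bring it back.
```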
 

Hitman928

Diamond Member
Apr 15, 2012
6,710
12,402
136
Techspot/Hardware Unboxed just released an article detailing and comparing the updated DLSS in Metro.

https://www.techspot.com/article/1801-nvidia-dlss-metro-exodus/

tl;dr: with the update, DLSS is basically equivalent to traditional upscaling in both performance and image quality. Some things look better with one and some with the other. They also noticed visual artifacts with the updated DLSS of the kind that typically come from sharpening filters, and speculate that a significant portion of the image quality improvement may have come from applying a sharpening filter in post-process rather than from increased training time for the NN.
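
For illustration, the kind of post-process sharpening the article speculates about can be reproduced with a stock unsharp mask. This is a minimal Pillow sketch, not the actual DLSS pipeline; "frame.png" is a placeholder filename:

```python
from PIL import Image, ImageFilter

# Minimal illustration of the post-process sharpening being speculated
# about. This is NOT the actual DLSS pipeline; "frame.png" is a
# placeholder for any upscaled screenshot.
frame = Image.open("frame.png")

# Unsharp mask: subtract a blurred copy to exaggerate edges. Pushing
# radius/percent too far produces the halo-style artifacts reviewers
# describe.
sharpened = frame.filter(
    ImageFilter.UnsharpMask(radius=2, percent=150, threshold=3)
)
sharpened.save("frame_sharpened.png")
```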
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
They don't compare DLSS against 1800p with ray tracing. I wonder if they didn't do it because TAA works a lot worse with ray tracing...
 

Hitman928

Diamond Member
Apr 15, 2012
6,710
12,402
136
They don't compare DLSS against 1800p with ray tracing. I wonder if they didn't do it because TAA works a lot worse with ray tracing...

They do at 1440p, same conclusion.

 
Mar 11, 2004
23,444
5,852
146
Techspot/Hardware Unboxed just released an article detailing and comparing the updated DLSS in Metro.

https://www.techspot.com/article/1801-nvidia-dlss-metro-exodus/

tl;dr is that with the update DLSS is basically equivalent to traditional upscaling in both performance and image quality. Some things look better using one and some things look better using the other. They also noticed some visual artifacts with the updated DLSS which typically occur from sharpening filters and speculate that it's possible a significant portion of the image quality improvement came from applying a sharpening filter in post process for DLSS and that it wasn't all from increased learning time for the NN.

Considering that the "major performance improvement" that came to BFV was really just scaling back the actual ray tracing and using traditional raster tricks, I have a hunch that a decent chunk of DLSS updates will actually be traditional upscaling, since that seems to both look better and perform better, and they'll simply claim it's DLSS. Hopefully devs keep being honest (like the Metro devs saying DXR can be done without RTX, although that was only a bombshell for people who ignored Microsoft outright saying so already, and DICE admitting their improved performance came from scaling ray tracing back and using normal tricks) so we can suss out the reality, since Nvidia will just blow shadowboxed smoke and roses up our butts if we let them.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Is it just me, or does DLSS seem to adjust the lighting from its original state?
This goes back to when Jensen was on stage and his "best engineer" made his "best effort" to produce a rasterized scene that looked suspiciously dark and devoid of lighting. It's funny how, until now, we never noticed games had a "dark" problem, until the savior that is RTX/DLSS arrived.

As was pointed out above, I wouldn't be at all surprised if DLSS now uses upscaling and/or sharpening filters fraudulently passed off as deeeeep learning.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
DLSS works best (quality and performance) at 4K with Raytracing.
Reasons to use upscaling instead of DLSS:
  • Works on any graphics card.
  • Works at any resolution.
  • Works in DX11.
  • Doesn't need RTX to be enabled.
Can you list a single reason to use DLSS instead of upscaling? Thanks.
 

coercitiv

Diamond Member
Jan 24, 2014
7,387
17,520
136
Reasons to use upscaling instead of DLSS:
You should also add that it works for high-refresh gaming. Reportedly, DLSS gains past a certain FPS threshold are subject to diminishing returns, as the latency introduced by the specialized hardware grows in importance relative to the average frame time, meaning you gain less and less FPS as you climb towards 100Hz+.
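
A back-of-the-envelope sketch of that diminishing-returns effect, assuming (purely for illustration) a fixed per-frame DLSS cost and a fixed fractional render-time saving:

```python
# Frame-time arithmetic behind the diminishing-returns point.
# Both numbers are assumed purely for illustration.
render_saving = 0.35    # fraction of render time DLSS saves (assumed)
dlss_cost_ms = 1.5      # fixed per-frame overhead in ms (assumed)

for base_fps in (30, 60, 120):
    base_ms = 1000.0 / base_fps
    dlss_ms = base_ms * (1 - render_saving) + dlss_cost_ms
    gain = (1000.0 / dlss_ms) / base_fps - 1.0
    print(f"{base_fps:>3} fps native -> {1000.0 / dlss_ms:5.1f} fps (+{gain:.0%})")

# The fixed overhead matters more as frame times shrink, so the
# percentage gain falls as the base frame rate climbs.
```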

Personally, I can think of one reason DLSS could end up besting traditional upscaling on RTX cards: with RT enabled, reflections are usually rendered at lower than native resolution. In theory DLSS could improve reflection quality, since they should be able to train the network at a high RT resolution (which begs the question: do they need to train for each RT quality setting as well?).

I still believe the biggest problem DLSS has is being tied to RT. As long as the feature is not a first-class citizen from Nvidia's point of view, it will remain a mere squire in the shadow of the holy RTX knight, and that means riding the donkey instead of the tall white horse.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
Reasons to use upscaling instead of DLSS:
  • Works on any graphics card.
  • Works at any resolution.
  • Works in DX11.
  • Doesn't need RTX to be enabled.
Can you list a single reason to use DLSS instead of upscaling? Thanks.

So neither image quality nor performance is a reason? Basically, using scaling only makes sense when you just want to hate on nVidia?

Here is a simple comparison between 4K at a 0.5 shading rate and 4K/DLSS in a worst-case scenario in Metro:

And DLSS works in DX11, too. Final Fantasy, anyone?!
 

nurturedhate

Golden Member
Aug 27, 2011
1,767
773
136
So neither image quality nor performance is a reason? Basically, using scaling only makes sense when you just want to hate on nVidia?

Here is a simple comparison between 4K at a 0.5 shading rate and 4K/DLSS in a worst-case scenario in Metro:

And DLSS works in DX11, too. Final Fantasy, anyone?!
This does not help your argument.
 

Hitman928

Diamond Member
Apr 15, 2012
6,710
12,402
136
Unless you like the blurred 4K/0.5 picture, it does. :D

0.5 scaling would give you better-than-DLSS performance with worse IQ. The right comparison is 0.7-0.8 scaling, which gives the same performance and IQ as DLSS.
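
For reference, the pixel math behind those scale factors; the ~1440p internal resolution for 4K DLSS is the commonly reported figure, and 2716x1528 is the number PC Gamer quotes later in this thread:

```python
# Pixel math behind the scale-factor comparison. The ~1440p internal
# resolution for 4K DLSS is the commonly reported figure; 2716x1528 is
# the resolution PC Gamer quotes later in this thread.
native = (3840, 2160)
native_px = native[0] * native[1]

candidates = {
    "0.5 linear scale": (1920, 1080),
    "0.7 linear scale": (2688, 1512),
    "0.8 linear scale": (3072, 1728),
    "DLSS internal, reported": (2560, 1440),
    "PC Gamer's bicubic figure": (2716, 1528),
}

for name, (w, h) in candidates.items():
    print(f"{name:26s} {w}x{h}: {w / native[0]:.2f} linear, "
          f"{w * h / native_px:.0%} of 4K pixels")
```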
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
0.5 scaling would give you better than DLSS performance for worse IQ. The comparison is 0.7-0.8 scaling which gives the same performance and IQ as DLSS.

No, it doesn't with ray tracing, and especially not in worst-case scenarios where 4K runs at around 30 FPS.
 

Hitman928

Diamond Member
Apr 15, 2012
6,710
12,402
136
No, it doesnt with Raytracing. And especially not in worst case scenarios where 4K is around 30FPS.

Every reviewer I've seen who has tested it says it does, and multiple forum users who have posted their own experiences agree. Yet you say it doesn't, and post a pic with an unnecessarily low scaling ratio as "proof". I'll trust the reviewers and my own eyes.
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
None of them have shown you the reality in Metro. Hardware Unboxed has not tested 4K with ray tracing. So how can you conclude that 1800p with ray tracing performs and looks like 4K/DLSS with ray tracing?
 

Hitman928

Diamond Member
Apr 15, 2012
6,710
12,402
136
None of them have shown you the reality in Metro. Hardware Unboxed has not tested 4K with ray tracing. So how can you conclude that 1800p with ray tracing performs and looks like 4K/DLSS with ray tracing?

The reality is that the best way to view Metro would be 4K + ray tracing + HDR. Unfortunately, DLSS doesn't work with HDR right now, so your only choice is low FPS or upscaling.

As for non-HDR content, every application of DLSS looks and performs the same as or worse than upscaling, with multiple sources saying the same thing across both Metro and Battlefield 5. You say that changes at 4K with ray tracing (with zero proof) and that everyone else missed it. Let's see what pcgamer.com had to say about DLSS post-patch (and yes, they test with ray tracing):

DLSS looks a lot like doing bicubic upscaling from 2716x1528 to 4k and then applying a soft blur filter in Photoshop. But don't worry, things could improve in the future!
https://www.pcgamer.com/metro-exodus-settings-performance-ray-tracing-dlss/

So another example of a reviewer saying DLSS looks worse than upscaling, even at 4K with ray tracing on. There is a significant amount of proof against what you are claiming. What proof do you bring to support your position?