Considering how Nvidia fans kept denigrating people for citing AMD's own statement that a future driver update would fully enable NGG on Vega (the "magic driver," as those fans kept calling it), it's somewhat amusing how we keep hearing that it'll just take time to develop the algorithms that will make DLSS amazing. So how long are we supposed to wait for the "magic algorithm"?
As near as I can tell, DLSS is just some mix of image processing (AA like FXAA/SMAA, sharpening, etc.), with a supercomputer deciding what's "optimum" for a given game or scene, even though optimum is highly subjective. You could do that without tensor cores. In fact, I have a hunch there are dedicated image processors that would do a better job of it than tensor cores, and that would be useful for video processing as well; they'd keep you from tying up the brunt of the GPU, and integrated on current processes they'd probably add a negligible transistor count and use less power. Or they could go on the board as a separate, cheaper chip. I'm fairly sure they could also be made programmable to some extent, so the processing could be tailored per game, or better yet, so consumers could tweak settings for how they want things to look. Yet even this half-baked hybrid version of ray tracing needs to run at lower resolutions and upscale. My argument is less that RTX and DLSS are singularly crap, and more that we're nowhere close to doing real-time ray tracing in a worthwhile way: raster tricks could likely match the image quality, at higher resolution and higher performance (or, rendered lower and upscaled, with much better performance still), with no need for specialized bits in the hardware taking up transistor space that would've boosted raster performance even more, and costing a lot in die size, heat, and power, requiring better cooling and sucking more electricity.
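To make concrete the kind of "upscale plus image processing" pipeline I'm speculating DLSS amounts to, here's a minimal sketch: nearest-neighbour upscale of a low-res render, then an unsharp-mask sharpen. This is purely illustrative (pure Python, grayscale image as a 2D list of floats) and the function names are my own, not anything from Nvidia's actual implementation:

```python
# Hypothetical sketch of an "upscale then sharpen" post-process.
# Grayscale image = 2D list of floats. Illustrative only.

def upscale_nearest(img, factor):
    """Nearest-neighbour upscale of a 2D grayscale image."""
    return [[img[y // factor][x // factor]
             for x in range(len(img[0]) * factor)]
            for y in range(len(img) * factor)]

def unsharp_mask(img, amount=0.5):
    """Sharpen by adding back the difference from a 3x3 box blur."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for dy in (-1, 0, 1):          # 3x3 box blur,
                for dx in (-1, 0, 1):      # clamping at the edges
                    yy = min(max(y + dy, 0), h - 1)
                    xx = min(max(x + dx, 0), w - 1)
                    acc += img[yy][xx]
            blur = acc / 9.0
            out[y][x] = img[y][x] + amount * (img[y][x] - blur)
    return out

# "Render" at half res, then upscale and sharpen.
low_res = [[0.0, 1.0],
           [1.0, 0.0]]
result = unsharp_mask(upscale_nearest(low_res, 2))
```

The point being: none of this needs tensor cores; the per-game "training" would just be picking the filter parameters.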
Ray tracing makes sense in cloud rendering, where you can throw the proper resources at it to do it well and at high enough framerates. But since both companies have added so much stuff for non-gaming markets into their GPUs and are selling those GPUs to consumers, they have to justify it by trying to make those bits useful to consumers. And I just strongly disagree that that's the right call. The costs alone are a big enough issue, but throw in how iffy the performance and quality are, and this is looking terrible. And again, it's not just Nvidia; I think the Radeon VII (and Vega 56/64) are similarly affected (and I have a hunch AMD's compute bits could do what RTX/DLSS is doing).
Oh, and OP, have you looked into this stuff in the new Metro game? Ars had an article on it, and the author was raving about how good it looked, yet their screenshots showed something wonky: shadow levels between the RTX and non-RTX versions seem exaggerated (maybe to make RTX stand out more?). So I wonder whether the difference in perceived quality is even down to ray tracing rather than manipulated shadow levels, since you can certainly achieve similar shadow levels without it. It was so jarring it reminded me of turning shadows from full to low or even off in games like Doom 3; in one scene the ray-traced version was excessively dark while the non-ray-traced one looked too bright and lacking in shadows, so they both looked wrong. And given that the higher ray-tracing levels in Battlefield V had their own unique textures (which could account for a significant amount of the perceived image-quality improvement), I'm really wondering if this isn't intentional manipulation to make RTX seem more worth it, kind of like some of the antics Nvidia pulled with GameWorks (which is exactly what RTX is reminding me of). The author also said native 4K and RTX+DLSS were "neck and neck" in performance, even though native 4K was 10 to 20 percent faster in average FPS (while about 10% slower in the 1% lows); I'm guessing he was going on subjective feel, since even the faster option wasn't sustaining 60 FPS.
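For anyone unclear on how a setting can win the average but lose the 1% lows, here's how the two numbers are typically computed from per-frame times. The frame-time values below are made up for illustration, not the Ars benchmark data:

```python
# Illustrative only: "average FPS" vs "1% low FPS" from frame times.
# The numbers are hypothetical, not the article's measurements.

def avg_fps(frame_times_ms):
    """Average framerate over the whole run."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def one_percent_low_fps(frame_times_ms):
    """Average framerate over only the slowest 1% of frames."""
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    return 1000.0 * n / sum(worst[:n])

# Mostly-fast run with occasional big stutters ("native 4K" stand-in)
native = [16.0] * 198 + [50.0] * 2
# Slightly slower but perfectly steady run ("RTX+DLSS" stand-in)
dlss = [18.0] * 200

assert avg_fps(native) > avg_fps(dlss)                          # wins on average
assert one_percent_low_fps(native) < one_percent_low_fps(dlss)  # loses the lows
```

So "neck and neck" could only be a subjective call about smoothness; the averages themselves clearly weren't equal.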