TPU: many combinations of GPU/resolution/RTX/DLSS not allowed


coercitiv

Diamond Member
Jan 24, 2014
6,211
11,940
136
To emulate what RTX does with dedicated hardware takes a LOT, and if Nvidia felt the only way to do even cursory raytracing was to develop specialized hardware that's way more efficient at it, it's just not going to be possible on the smaller APUs with much, much less die space. They will be fighting to get enough juice to make 4K HDR native gaming possible, let alone dialing back so far that you could give something like 75% of the GPU a job it would still be worse at than a 2060.
Consoles will always be cutting back on a few things; it's in their nature to do so.

Up until now, the best feature of RTX has been the least performance-intensive one, and likely the one able to degrade gracefully with aggressive optimization and sparse hardware resources: global illumination. I'm inclined to believe this will also be the first feature to be enabled on AMD GPUs, with or without RT-specific hardware.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126


1685p/80% runs just as fast as DLSS but looks far better. You can also switch off DX12 and RTX and simply upscale from DX11, getting far higher performance on any graphics card. You can't do that with DLSS because of vendor lock-in at the hardware and software level.
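For context, the rough pixel budgets behind that comparison, as a quick sketch. I'm assuming "1685p/80%" means an ~80% linear render scale of a 2160p output, and the 1440p DLSS input figure is the commonly reported one for 4K output, not something measured here:

```python
# Back-of-envelope pixel counts for the render scales discussed above.
# Assumptions (mine): "1685p/80%" = ~80% linear render scale at a 2160p
# output, and 1440p is the widely reported DLSS input resolution at 4K.

def pixel_count(height, aspect=16 / 9):
    """Total pixels for a given vertical resolution and aspect ratio."""
    width = round(height * aspect)
    return width * height

native_4k = pixel_count(2160)
scaled    = pixel_count(1685)   # ~78-80% linear scale of 2160p
dlss_in   = pixel_count(1440)   # assumed DLSS input at a 4K output

print(f"native 2160p : {native_4k:>10,} px")
print(f"1685p (~80%) : {scaled:>10,} px  ({scaled / native_4k:.0%} of native)")
print(f"1440p input  : {dlss_in:>10,} px  ({dlss_in / native_4k:.0%} of native)")
```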
  • It just works: lie.
  • Provides equivalent image quality of 64xSSAA: lie.
  • Provides equivalent of 4K quality at lower resolutions: lie.
  • Will make RTX performance more usable: lie.
On top of this, it's basically Russian roulette as to when you can even use it. No doubt there's some serious Tensor core bottleneck that nVidia is desperately hiding, as witnessed by that PR piece they released a few days ago.

So six months after launch, this is the result of the partnership in the launch title that was backed by massive nVidia engineering resources and R&D.

DLSS is a fraud, folks. There's just no other way to describe it.
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
21,637
10,855
136
Of all the RTX owners we have here, how many actually use DLSS? I'm curious about their experiences. Seems like we have more than a few . . .
 
Mar 11, 2004
23,077
5,558
146
Considering how Nvidia fans kept denigrating people for citing AMD straight up saying they'd release a future driver update that was supposed to fully enable NGG on Vega (the "magic driver", the Nvidia fans kept calling it), it's somewhat amusing how we keep hearing that it will take time to develop the algorithms that will make DLSS amazing. So how long are we supposed to wait for the "magic algorithm"?

Near as I can tell, DLSS is just applying some mix of image processing (AA like FXAA/SMAA, sharpening, etc.), with a supercomputer deciding what is "optimum" for a given game/scene (even though optimum is highly subjective). You could do that without tensor cores. In fact, I have a hunch there are image processors that would do a better job of it than the tensor cores, would be useful for video processing as well, and wouldn't tie up the brunt of the GPU; integrated on current processes they would probably be a fairly negligible transistor count and use less power. They could also go on the board as a cheaper separate chip, and I'm fairly sure they could be made programmable to some extent so the processing could be tailored per game, or better yet expose some settings consumers could tweak for how they want things to look. On top of that, even this half-baked hybrid version of ray-tracing needs to run at lower resolutions and upscale. My argument is less that RTX and DLSS are singularly crap, and more that we're nowhere close to being able to do real-time ray-tracing in a worthwhile way: raster tricks could likely match the IQ at higher resolution and higher performance (or, if rendered lower and upscaled, offer much better performance), with no need for specialized bits in the hardware taking up transistor space that would have boosted raster performance even more, and without the costs in die size, heat, and power that demand better cooling and more electricity.
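To illustrate, the kind of "upscale + sharpen" post-processing I mean is trivial to express without any tensor hardware. A rough sketch with OpenCV, purely as an illustration of that style of processing, not a claim about what DLSS actually does; the filter choices and strengths are made up:

```python
import cv2
import numpy as np

def naive_upscale(frame_lowres: np.ndarray, out_w: int, out_h: int,
                  sharpen_amount: float = 0.6) -> np.ndarray:
    """Bicubic upscale followed by an unsharp mask.

    This is the plain 'upscale + sharpen' style of post-processing the
    post speculates about; it runs on any GPU/CPU, no tensor cores needed.
    """
    upscaled = cv2.resize(frame_lowres, (out_w, out_h),
                          interpolation=cv2.INTER_CUBIC)
    blurred = cv2.GaussianBlur(upscaled, (0, 0), sigmaX=2.0)
    # Unsharp mask: boost the difference between the image and its blur.
    sharpened = cv2.addWeighted(upscaled, 1.0 + sharpen_amount,
                                blurred, -sharpen_amount, 0)
    return sharpened

# Example: a random stand-in 1440p "frame" pushed to 2160p.
frame = (np.random.rand(1440, 2560, 3) * 255).astype(np.uint8)
output = naive_upscale(frame, 3840, 2160)
print(output.shape)  # (2160, 3840, 3)
```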

Ray-tracing makes sense in cloud rendering, where you can throw the proper resources at it to do it well and at high enough framerates. But since both companies have added so much stuff for non-gaming markets into their GPUs and are selling those GPUs to consumers, they have to try to justify that by making use of those bits for consumers, and I just strongly disagree that that's the right call. The costs alone are a big enough issue, but throw in how iffy the performance and quality are, and this is looking terrible. And again, it's not just Nvidia; I think Radeon VII (and Vega 56/64) is similarly affected (and I have a hunch that AMD's compute bits could do what RTX/DLSS is doing).

Oh, and OP, have you looked into this stuff in the new Metro game? Ars had an article on it and the author was raving about how good it looked, yet their screenshots showed something wonky: shadow levels between the RTX and non-RTX versions seem exaggerated (maybe to make RTX stand out more?), so I wonder whether the difference in perceived quality is even down to ray-tracing rather than manipulated shadow levels, since you can certainly achieve similar shadow levels without it. It was so jarring it reminded me of games like Doom 3 where you'd turn shadows from full to low or even off; there was one scene where the ray-traced version was excessively dark and the non-ray-traced one looked too bright and lacking in shadows, so they both looked wrong. And given that the higher levels of ray tracing in Battlefield V had their own unique textures (which could account for a significant amount of the perceived image-quality improvement), I'm really wondering if this isn't some intentional manipulation to make RTX seem more worth it, kind of like some of the antics Nvidia pulled with GameWorks, which is what RTX is reminding me of. The author also said native 4K and RTX+DLSS were "neck and neck" in performance even though native 4K was 10 to 20 percent faster in average FPS (though about 10% slower in the lowest 1%); I'm guessing he was basing that on subjective feel, since even the faster one wasn't sustaining 60 FPS.
 
  • Like
Reactions: Ranulf and badb0y

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
It seems unlikely that it's supposed to look so blurred, though.

I have to think something is wrong.

No one is going to want to use it if it looks like that.

If that's what it's supposed to look like, then NV has a lot of explaining to do.
 

NTMBK

Lifer
Nov 14, 2011
10,240
5,026
136
We've seen with Metro Exodus that only certain combinations of DLSS, resolution, and other features like RTX will be allowed. The logic behind it is that the tensor cores might end up being a bottleneck otherwise: they can only manage a certain amount of throughput, and might not be able to push frames out fast enough to keep up with the rest of the GPU.
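A toy model of that bottleneck argument (all numbers are invented purely for illustration; the point is only the structure, i.e. the slower stage sets the frame rate):

```python
# Toy model: if DLSS post-processing has a roughly fixed per-frame cost on
# the tensor cores, it becomes the limiter once the rest of the GPU can
# render frames faster than that - which would explain restricting the
# allowed resolution/feature combinations.

def effective_fps(raster_ms: float, dlss_ms: float) -> float:
    """Frame rate when the slower of the two stages sets the pace."""
    return 1000.0 / max(raster_ms, dlss_ms)

dlss_ms_per_frame = 6.0  # hypothetical fixed tensor-core cost per frame

for raster_ms in (16.0, 10.0, 5.0):   # heavy, medium, light raster load
    fps = effective_fps(raster_ms, dlss_ms_per_frame)
    limiter = "raster" if raster_ms >= dlss_ms_per_frame else "tensor/DLSS"
    print(f"raster {raster_ms:4.1f} ms -> {fps:5.1f} fps (limited by {limiter})")
```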

This seriously reminds me of pre-unified shaders. In theory dedicated hardware for vertex shaders and pixel shaders can be more efficient, as the hardware for each task can be more tightly optimised. But in practice it turned out that getting the balance of hardware resources "just right" was almost impossible, and you would end up with one part of the pipeline bottlenecking another... And just having a big array of unified shaders that can handle anything worked out better.

Do you think we'll see that happen again? Will this period fade away? I think we'll see a return to big arrays of unified shaders (perhaps with some new instructions or capabilities to optimise workloads like BVH traversal and DNN inference). This trend for increasing hardware partitioning and complexity just feels like it's going to be difficult for developers to make the most of.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
As an RTX owner I can confirm that DLSS is a blurry Vaseline mess. I was more interested in the DLSS 2X mode that uses native resolution as the base and DL to get the SS benefits than in what is currently available; not sure when or if we will see that. Ray tracing performance has improved a lot since the cards came out, though. It's actually usable in BF5 now (at least in single player) and I think it looks amazing in Metro Exodus. I would like to see it in non-FPS games soon, where I think it can have some neat benefits as well. At the end of the day I bought my cards for the +30% over the 1080 Ti, so I am satisfied regardless (with the performance, not the price!), but it would be nice if the uptake of RTX increases. Hopefully the 2060 can spur that on.
 
  • Like
Reactions: lightmanek

nurturedhate

Golden Member
Aug 27, 2011
1,743
677
136
What worries me most is driver support for features like this. It seems it needs to be calculated per resolution/core-count scenario and on a per-game basis. That is a massive undertaking. You'd think the perfect time for something like DLSS would be towards the end of a card's life, when trying to maximize fps in new games. Is Nvidia really going to dump resources into setups for a 2060 in 2022 (roughly the same gap as between the 1060 and the 2060)? Is Nvidia going to say something along the lines of "we'll do the calculations for DLSS, but you have to give us something"?
 

nurturedhate

Golden Member
Aug 27, 2011
1,743
677
136
As an RTX owner I can confirm that DLSS is a blurry Vaseline mess. I was more interested in the DLSS 2X mode that uses native resolution as the base and DL to get the SS benefits than in what is currently available; not sure when or if we will see that. Ray tracing performance has improved a lot since the cards came out, though. It's actually usable in BF5 now (at least in single player) and I think it looks amazing in Metro Exodus. I would like to see it in non-FPS games soon, where I think it can have some neat benefits as well. At the end of the day I bought my cards for the +30% over the 1080 Ti, so I am satisfied regardless (with the performance, not the price!), but it would be nice if the uptake of RTX increases. Hopefully the 2060 can spur that on.
You're right. It's been 6 months and nothing on DLSS 2X mode. Starting to turn into NGG or BitBoys at this point.
 

EXCellR8

Diamond Member
Sep 1, 2010
3,982
839
136
I'm sure in another 6-8 months things will start to turn around... with RTX 3000, maybe. Probably not.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Near as I can tell, DLSS is just applying some mix of image processing (AA like FXAA/SMAA, sharpening, etc.), with a supercomputer deciding what is "optimum" for a given game/scene (even though optimum is highly subjective). You could do that without tensor cores. In fact, I have a hunch there are image processors that would do a better job of it than the tensor cores, would be useful for video processing as well, and wouldn't tie up the brunt of the GPU; integrated on current processes they would probably be a fairly negligible transistor count and use less power.

No, it's much fancier than that. They generate the pixels for the image at the lower resolution and train a neural network on massively detailed, higher-than-output-resolution assets from the game to 'guess' how to fill in the gaps and produce an anti-aliased, higher-resolution image. Training that network is a massive amount of computational work, and the results might legitimately improve with more time devoted to the training for a given game.

They then use the tensor cores to run the resulting trained neural network. Nothing else can do that remotely as well.
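Roughly, the offline/runtime split looks something like this. A toy super-resolution training loop in PyTorch, purely to illustrate the train-offline / infer-at-runtime idea; it is not NVIDIA's actual DLSS model, and the shapes and network are made up:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy sketch: train a small network offline to map low-res frames to
# high-res "ground truth" frames, then ship only the trained weights.

class TinySR(nn.Module):
    def __init__(self, scale: int = 2):
        super().__init__()
        self.scale = scale
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, x):
        # Upsample first, then let the conv stack "fill in" detail.
        x = F.interpolate(x, scale_factor=self.scale, mode="bilinear",
                          align_corners=False)
        return x + self.body(x)  # residual: predict the missing detail

model = TinySR()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)

# Stand-in training data: random low-res frames and matching high-res targets.
low_res  = torch.rand(4, 3, 270, 480)   # quarter the pixels of the target
high_res = torch.rand(4, 3, 540, 960)   # matching high-res "ground truth"

for step in range(100):                 # the real offline training run is enormous
    loss = F.mse_loss(model(low_res), high_res)
    optim.zero_grad()
    loss.backward()
    optim.step()

# At runtime only the forward pass runs - the part that reduced-precision
# matrix hardware like Turing's tensor cores is built to execute quickly.
with torch.no_grad():
    upscaled = model(low_res[:1])
print(upscaled.shape)  # torch.Size([1, 3, 540, 960])
```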

The idea is definitely logical and attractive; whether it works in practice or not, we'll have to see. The tensor cores aren't going anywhere anyway - there's a huge market for this sort of deep learning stuff outside of games, and they're very useful for it.
 
  • Like
Reactions: ozzy702

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Not with the tensor cores, I think? Those are a huge gain when using these chips to execute pretrained neural nets, and having smaller chips that can execute pretrained networks really matters for the compute markets, which are very big now.

RTX cores, maybe I suppose.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
they shoulda just used the die space from the tensor cores on more RT cores then we wouldn't need DLSS
 

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
2080, haven't used any of the RTX features and probably won't until the bugs are worked out. Time will tell if RTX features end up working as promised or if they pull an AMD and never deliver.
 

maddie

Diamond Member
Jul 18, 2010
4,746
4,687
136
Not with the tensor cores, I think? Those are a huge gain when using these chips to execute pretrained neural nets, and having smaller chips that can execute pretrained networks really matters for the compute markets, which are very big now.

RTX cores, maybe I suppose.
This is why both main GPU companies are bifurcating their lines going forward. The gaming market will probably settle on general compute cores with added capabilities and instructions, leaving specialized high-performance circuitry as an addition for compute workloads.

What I expect is the integration of these specialized units via IF in AMD's case (Nvidia has NVLink), rather than incorporating them in the general die area. Integrating on die leaves you at the slow end of a rapidly evolving tech, due to the lead time for a GPU.

Nvidia is operating on the bet that they can dominate in all areas, but it's too wide now for any one company to be competitive in all areas at once.
 

pauldun170

Diamond Member
Sep 26, 2011
9,133
5,072
136
Of all the RTX owners we have here, how many actually use DLSS? I'm curious about their experiences. Seems like we have more than a few . . .

RTX 2060 owner here.
In the other thread I noted that I've been playing with everything set to ultra at 1080p.
When the DLSS patch came out, I tried it but turned it off due to the blur it introduced. The game had been running fine, ray tracing and all, prior to that, so I didn't need it.
Then I got to Tirailleur...
Let's just say I turned DLSS back on. On the RTX 2060, Tirailleur on ultra settings at 1080p was a stutter fest. DLSS helped things out.
 

Det0x

Golden Member
Sep 11, 2014
1,031
2,963
136
they shoulda just used the die space from the tensor cores on more RT cores then we wouldn't need DLSS

Well.. I'm not surprised.

Am I the only one who gets the feeling that Nvidia is trying to sell hardware intended for deep learning/ray tracing (etc.), with data centers and professional enterprises as the target audience, to gamers, to recoup some of the cost of developing these server SKUs?

I don't think Turing was made with gaming in mind. But since AMD is MIA they can do what they want, and make us "gamers" pay for it.
I'm surprised it didn't also include lots of FP64, but I guess they have Volta for that.

Monopoly is a great thing ;(

This is probably the intended market with Turing.

That's a new 250B market they did not have access to before Turing, with its ray-tracing engines.

Ray-tracing in games with gimpworks is likely an afterthought.

And considering what we know today, in relation to these two threads:
"Tensor cores" and "RT cores" reminds me of pre-unified shaders
Metro Dev: Ray Tracing Is Doable via Compute Even on Next-Gen Consoles, RT Cores Aren’t the Only Way

I would argue that Turing is not worth it for gamers in 2019, and we'll be better served waiting for the 3000 series/Navi
 

maddie

Diamond Member
Jul 18, 2010
4,746
4,687
136
It appears that Nvidia tried to leverage the first-mover advantage for ray tracing, trying to replicate their CUDA supremacy in gaming. They probably had to do this now, as the new consoles will almost certainly have RT abilities, and possibly Navi as well.

I don't blame them for trying but the consumers that are slaves to the marketing, that's a whole other story.

With DLSS, we had a few very vocal posters touting DLSS as almost the 2nd coming, based solely on the announcement marketing. They have gone silent as the truth reveals itself. The myth of Nvidia's invincibility has taken a hit. Time will tell if they will repair it or continue to make mistakes.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
On consoles you would rather increase the number of RTX cores at the cost of removing tensor cores, instead of putting in enough general-purpose shaders to make realtime raytracing viable within the given transistor budget.
In other words, on a gaming-oriented platform RTX cores are much more useful, because they help to solve problems like GI, which you have in essentially every game.
 

DrMrLordX

Lifer
Apr 27, 2000
21,637
10,855
136
RTX 2060 owner here.
In the other thread I noted that I've been playing with everything set to ultra at 1080p.
When the DLSS patch came out, I tried it but turned it off due to the blur it introduced. The game had been running fine, ray tracing and all, prior to that, so I didn't need it.
Then I got to Tirailleur...
Let's just say I turned DLSS back on. On the RTX 2060, Tirailleur on ultra settings at 1080p was a stutter fest. DLSS helped things out.

Did you try turning off ray tracing as opposed to turning on DLSS?

I will say, I guess it is good to have an option to "turn on the blur" as a way to speed up fps when you dip into a stutterfest. If you can tolerate the image quality loss.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
just turn off dlss and use resolution scale to get performance back at better image quality
 

amenx

Diamond Member
Dec 17, 2004
3,910
2,133
136
5 months after RTX's release and DLSS is a fail? Sure looks like it. But is 5 months enough to evaluate it and reach a definitive conclusion from a couple of titles? Let's see reviews discussing it a year from now.
 

maddie

Diamond Member
Jul 18, 2010
4,746
4,687
136
On consoles you would rather increase the number of RTX cores at the cost of removing tensor cores, instead of putting in enough general-purpose shaders to make realtime raytracing viable within the given transistor budget.
In other words, on a gaming-oriented platform RTX cores are much more useful, because they help to solve problems like GI, which you have in essentially every game.
The critical question is whether RTX cores are indispensable. The answer, from an industry source, is no. After all, in the end, it's all math.

We are going along with Nvidia's choice to make certain assumptions which appear not to be true, namely that the only way is through specialized RT computing units.
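To make the "it's all math" point concrete: the core of ray tracing is intersection arithmetic that general-purpose shaders (or even a CPU) can run; dedicated RT cores accelerate that work plus BVH traversal, they don't make it possible. A minimal ray/sphere intersection, purely as an illustration:

```python
import math

# The innermost operation of a ray tracer: solve a quadratic to find where
# (or whether) a ray hits a sphere. Any general-purpose compute unit can do
# this; specialized RT hardware just does it (and BVH traversal) faster.

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest hit, or None."""
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                       # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None

# A ray fired down -Z from the origin toward a unit sphere at z = -5:
print(ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0))  # 4.0
```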
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
I think it's going to be very hard to get a definitive conclusion. Judgements about IQ can be comically biased by what you're expecting to see - there's loads of research about this, including random hi-fi stuff etc., of course - and plenty of temptation for clickbait articles taking extreme positions.

I refuse to believe that it won't get better than a fixed hardware upscaling algorithm - there really isn't any objective reason why it wouldn't - but it won't ever be perfect either.
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
nV said they were going to work on it, right? I don't get this style of thread; it reads like a clickbait YouTube video.

At this point I think everyone, especially on this forum, knows nV overstated and overpriced this generation. AMD dropped the ball with VII too, in my opinion.

I think there is a larger discussion that needs to be had about the future of the dGPU and the lack of titles worth buying a shiny new GPU for. I think we consumers, pro users, enthusiasts, etc. need to step back and re-evaluate what kind of card WE want to buy, rather than being told by AMD/nV that new tech justifies charging twice as much as last gen.

And willingly paying for it.