TPU: many combinations of GPU/resolution/RTX/DLSS not allowed


DrMrLordX

Lifer
Apr 27, 2000
22,939
13,024
136
nV said they were going to work on it, right? I don't get this style of thread; it reads like a clickbait YouTube video.

How long do they get before people declare it a failure? Honest question. Keep in mind that AMD fans are constantly saying that their GCN dGPUs get "better with time". I won't defend or criticize the claim, but the basic idea is that driver updates improve the product, sometimes a year or more after release. Is it reasonable for people to buy a product with the expectation that it'll take a year or longer (or in the case of NGG, never) to squeeze optimal performance from the product? Are we going to start saying that about nVidia's cards too?

At this point I think everyone, especially on this forum, knows nV overstated and overpriced this generation. AMD dropped the ball with VII too, in my opinion.

I don't think AMD overstated anything. Their product is clearly not a consumer-oriented card anyway. I would agree that they underdelivered overall, unless all you want is cheap FP64 performance.

I think there is a larger discussion that needs to be had about the future of the dGPU and the lack of titles worth buying a shiny new GPU for. I think we consumers, pro-users, enthusiasts, etc. need to step back and re-evaluate what kind of card WE want to buy rather than being told by AMD/nV that new tech justifies charging twice as much as last gen.

And willingly paying for it

We are hostages. Both AMD and nVidia have tasted the sweet nectar of profits from pro graphics applications. It's quite lucrative. Try explaining why they should sell us dGPUs at anything but the console level anymore. It doesn't make any sense for their wallets. If the enthusiast community refuses to buy their overpriced halo cards for $700 or $999 or $1200 or more (ugh) then, in the long run, we're only signaling to them that it's not worth it to serve the consumer market.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Of all the RTX owners we have here, how many actually use DLSS? I'm curious about their experiences. Seems like we have more than a few . . .

My RTX card is in the shop, so I haven't touched DLSS, but from what I've seen I wouldn't use it outside of specific situations (i.e. at distances where you can't see the blur).
 

pauldun170

Diamond Member
Sep 26, 2011
9,493
5,708
136
Did you try turning off ray tracing as opposed to turning on DLSS?

I will say, I guess it is good to have the option to "turn on the blur" as a way to speed up fps when you dip into a stutterfest, if you can tolerate the image quality loss.

I haven't messed around with the settings other than switching DLSS back on so I can finish the mission. On the section I'm on now, even with DLSS it still looks great, and the blur is not really that big of a deal at the moment.
When I wrap up this section I'll replay it with different settings to see how it looks and report back.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
EDIT: Error on quote, fixed.

Oh, and OP, have you looked into this stuff in the new Metro game? Ars had an article on it, and the author was raving about how good it looked, yet their screenshots showed something wonky: shadow levels between the RTX and non-RTX versions seem exaggerated (maybe to make RTX stand out more?). So I wonder whether the difference in perceived quality even comes down to ray tracing rather than to manipulated shadow levels, since you can certainly achieve similar shadow levels without it. It was so jarring that it reminded me of games like Doom 3, where you'd turn shadows from full to low or even off. There was one scene where the ray-traced version was excessively dark and the non-ray-traced one looked too bright and lacking in shadows, so they both looked wrong.

I pretty much expect this to happen. We saw it with "fog" back in Batman: AA. I wouldn't be surprised if NV threw their weight around for these kinds of things going forward. As it is, I won't find out until next year because screw Epic Games.
 

4K_shmoorK

Senior member
Jul 1, 2015
464
43
91
How long do they get before people declare it a failure? Honest question. Keep in mind that AMD fans are constantly saying that their GCN dGPUs get "better with time". I won't defend or criticize the claim, but the basic idea is that driver updates improve the product, sometimes a year or more after release. Is it reasonable for people to buy a product with the expectation that it'll take a year or longer (or in the case of NGG, never) to squeeze optimal performance from the product? Are we going to start saying that about nVidia's cards too?


I think there are only two titles that support DLSS, right? I'm just saying it's a small population of GPU owners that have the right combination (RTX card, own the game, care enough to play with the feature on). Not saying that the implementation shouldn't be criticized. It should be! On the other hand, does that mean this generation is a 'fail'? I don't know. I don't know because I think there should be a larger discussion about why we are buying cards at inflated prices. There is a slide I saw floating around that correlated Nvidia GPU prices directly to large year-over-year increases in Nvidia's revenue. These cards make the companies money; otherwise they wouldn't make them.

Not sure we owe these companies loyalty or our hard earned income when they view us dGPU consumers as little more than a revenue stream.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,574
252
126
I pretty much expect this to happen. We saw it with "fog" back in Batman: AA. I wouldn't be surprised if NV threw their weight around for these kinds of things going forward. As it is, I won't find out until next year because screw Epic Games.
:confused2: Don't remember posting that? Misquote?
 

Chesebert

Golden Member
Oct 16, 2001
1,013
15
81
ML-based upscaling certainly has potential. Final Fantasy 7's pre-rendered backgrounds have recently been upscaled from their original resolution to HD quality. "Using state of the art AI neural networks, this upscaling tries to emulate the detail the original renders would have had," writes the mod's author.

I just installed the HD backgrounds and they look amazing. If we can do this in real time that would be amazing. DLSS as currently implemented is not there yet (far from it).
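For anyone who wants to try this kind of offline (non-real-time) ML upscaling themselves, here is a minimal sketch using OpenCV's contrib dnn_superres module with a pretrained super-resolution network. The module, model name and filenames are my own example, not what the FF7 mod's author actually used.

```python
# Minimal offline ML-upscaling sketch (assumes opencv-contrib-python is
# installed and that the pretrained ESPCN_x4.pb model has been downloaded
# separately). Illustrative only; NOT the pipeline used by the FF7 mod.
import cv2

sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("ESPCN_x4.pb")   # path to the pretrained super-resolution network
sr.setModel("espcn", 4)       # algorithm name and scale must match the model file

img = cv2.imread("background.png")   # a low-res pre-rendered background
upscaled = sr.upsample(img)          # runs the network: seconds per image, not per frame
cv2.imwrite("background_4x.png", upscaled)
```

The gap Chesebert points at is exactly the runtime budget: this kind of offline pass takes seconds per image, while DLSS has to do its reconstruction in a couple of milliseconds per frame.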
 

firewolfsm

Golden Member
Oct 16, 2005
1,848
29
91
As one poster said, tensor cores are invaluable, but RTX cores are not as necessary for their job (about twice as good as shaders). Vega can already perform raytracing fairly well, and that's because the unified shaders are optimized for compute in some ways. The performance uplift tensor cores provide for neural net computation is an order of magnitude though.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,600
6,084
136
I don't use either the RTX or DLSS features on my 2080 Ti.

Completely agree with others that RTX is not worth the performance hit and DLSS is (currently, at least) a blurry mess. On the handful of games that support either feature, anyways.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
It is my belief that Tensor and RT cores are conceptually more similar to hardware-accelerated T&L than to separate vertex and pixel shaders. Tensor and RT cores by design are "fixed function" units, just like hardware T&L, rasterizers, blending units or texture samplers: there's no way to program them; they only offer a bunch of 'states' that can be switched within a limited window of functions ... :)

In the future we might be able to get rid of blending units, and then maybe the rasterizers altogether, but I don't think for even a moment that we'll ever get rid of texture samplers, since they're a massive win in performance. The same argument can probably be made for RT cores too, since a TU106 (RTX 2060) die doesn't compare all that favourably to a GP104 (GTX 1080) die: the former needs an extra ~40% die area to reach similar performance to the latter in the vast majority of benchmarks ...
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
So games currently require ray tracing to be on to enable DLSS.

1660 Ti has Tensor cores but not RT cores. Seems like if they allow DLSS-only on the 1660 Ti they'll be able to claim improved perf/$.

Side note: it's been seen throughout history that companies achieve their success through innovation, but as they get bigger they end up run by bean counters, and from that point on all technological development is done for the sake of maximizing profits.

Machine Learning has also just passed the tip of the hype cycle, and that hype will die soon.
 

DrMrLordX

Lifer
Apr 27, 2000
22,939
13,024
136
My RTX card is in the shop, so I haven't touched DLSS, but from what I've seen I wouldn't use it outside of specific situations (i.e. at distances where you can't see the blur).

Seems like a lot of people either aren't using it at all, or are only doing so for certain situations.

I haven't messed around with the settings other than switching DLSS back on so I can finish the mission. On the section I'm on now, even with DLSS it still looks great, and the blur is not really that big of a deal at the moment.
When I wrap up this section I'll replay it with different settings to see how it looks and report back.

That would be informative, thanks.

Not sure we owe these companies loyalty or our hard earned income when they view us dGPU consumers as little more than a revenue stream.

Sadly, most customers of many products today are seen as a revenue stream. Take a look at how the "recording industry" (if you can call it that anymore; it's more like a pack of lawyers these days) treats its customers. nVidia and AMD haven't gotten that bad.

I don't know that we owe them anything other than what we pay for their products. The simple reality is that as dGPU companies begin to develop revenue streams in other segments, the importance of consumer dGPU buyers diminishes. If we don't pony up vast sums of money for their cards then they have the option - however undesirable it may be - to serve other markets while leaving us behind.

Machine Learning has also just passed the tip of the hype cycle, and that hype will die soon.

Eh? What?
 
May 11, 2008
22,557
1,471
126
I would not be surprised if in the future we get a dedicated AI chip used as a neural net for predicting user behavior, enabling more advanced and realistic AI in 3D scenes, such as the behavior of computer-controlled enemies, or of computer-controlled adversaries when playing soccer.
Or use the net to predict the user's behavior and then use its fast recognition capabilities in reverse to simulate worldly behavior and get truly more random behavior:
whatever the net predicts will happen, don't use that prediction, but instead go for another variant to create random effects.

A PC or game console with a GPU, a CPU and an AIPU (Artificial Intelligence Processing Unit).
Make the computer predict user behavior for normal use, and for more advanced and lifelike communication and conversations.
Great for games, and later great for robotics. Have a central PC in the house do the crunching, and have a power-efficient robot in the house be the arms and legs of the PC.
Since the always-powered PC does the heavy lifting, the battery-powered robot can carry a massive number of sensors and a wireless connection to the PC while still being extremely power efficient.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Eh? What?

AAAS: Machine learning 'causing science crisis'
https://www.bbc.co.uk/news/science-environment-47267081

There are some concerns over accuracy of ML being raised recently, with concerns comes a natural drop in hype.
 

pauldun170

Diamond Member
Sep 26, 2011
9,493
5,708
136
That would be informative, thanks.

Note: my processor is an i7-2700K at stock speed. Most of the game has been fine with all settings set to ultra and no DLSS. It's just this one section near the beginning of the game that seems to hit the system noticeably hard.
I went back and replayed the section where I originally had frequent stuttering. Looks like it was a one-time thing, and I probably had some sort of process running in the background.
With DLSS off and everything on ultra, there was occasional stutter (very little impact on the gaming experience) and the frame rate was definitely on the low side. During the beginning section, where it really is resource intensive, it's playable but not exactly ideal. I'm guessing the FPS averaged in the high 20s to low 30s at times.
On high, the FPS picked up a bit. I really didn't notice much difference in visual quality between ultra and high for the section I tested in. The lower the RTX setting, the faster it got.

IMO, the DLSS blur wasn't as noticeable in this particular section.
I'll end up using it on an "as needed" basis.
 

coercitiv

Diamond Member
Jan 24, 2014
7,378
17,486
136
There are some concerns over accuracy of ML being raised recently, with concerns comes a natural drop in hype.
One simple but powerful example here is the use of adversarial attacks to fool ML models into complete misinterpretation of images.


This doesn't mean the models cannot be adjusted to increase their resilience, but it does show the data actually used to interpret images is nowhere near what ordinary people would expect. This "blind spot" of the AI may be an eye opener for some.
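For readers who haven't seen one of these attacks, here is a minimal sketch of the idea (my example, assuming PyTorch/torchvision and a stock pretrained ImageNet classifier; the filename and epsilon value are arbitrary): a single gradient-sign (FGSM) step that is typically invisible to a human yet can flip the model's prediction.

```python
# A minimal FGSM-style adversarial attack sketch. Assumes PyTorch + torchvision
# are installed and some local test image "cat.jpg" exists; names are
# illustrative, not from the thread.
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

model = models.resnet18(pretrained=True).eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

x = preprocess(Image.open("cat.jpg").convert("RGB")).unsqueeze(0)
x.requires_grad_(True)

logits = model(x)
label = logits.argmax(dim=1)          # the model's own (clean) prediction

# Single FGSM step: nudge every pixel in the direction that increases the loss.
loss = F.cross_entropy(logits, label)
loss.backward()
epsilon = 0.03                        # perturbation size, in normalized-pixel units
x_adv = x + epsilon * x.grad.sign()   # a real attack would also clamp back to the valid image range

adv_label = model(x_adv).argmax(dim=1)
print("clean label:", label.item(), "adversarial label:", adv_label.item())
```

The perturbation is driven by whatever features the network actually relies on, which, as coercitiv says, are often nothing like what a human would consider the salient parts of the image.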
 

NTMBK

Lifer
Nov 14, 2011
10,452
5,839
136
Not with tensor cores, I think? Those are a huge gain when using these chips to execute pretrained neural nets, and having smaller chips for that really matters for the compute markets, which are very big now.

RTX cores, maybe I suppose.

Tensor cores are great for deep learning, but do those customers really need the baggage of a gaming GPU with expensive GDDR6, fully featured compute cores, texture units and rasterizer hardware? And does the one application of tensor cores to games (DLSS) really justify those tensor cores for gaming customers? The needs of DL and the needs of video games are going to continue to diverge, not converge. Google's in house TPU doesn't look much like a GPU.
 

Despoiler

Golden Member
Nov 10, 2007
1,968
773
136
Back in the day Tensor and RT cores would have just been referred to as math co-processors. Now, now we have all this marketing.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
RTX 2060 owner here.
In the other thread I noted that I've been playing with everything set to ultra at 1080p.
When the DLSS patch came out, I tried it but turned it off due to the blur it introduced. The game had been running fine, ray tracing and all, up to that point, so I didn't need it.
Then I got to Tirailleur...
Let's just say I turned DLSS back on. On the RTX 2060, Tirailleur on ultra settings at 1080p was a stutterfest. DLSS helped things out.

You would be better off setting the resolution scale down to 75 or 80%. The image will be sharper and the FPS will be the same as or better than with DLSS on.
 

pauldun170

Diamond Member
Sep 26, 2011
9,493
5,708
136
You would be better off setting the resolution scale down to 75 or 80%. The image will be sharper and the FPS will be the same as or better than with DLSS on.
I'll keep that in mind. Fortunately, except for that short section of the game I don't really need to bother.
 

NTMBK

Lifer
Nov 14, 2011
10,452
5,839
136
Part of the problem with CNNs is that it's nigh on impossible to verify what the heck it is "actually doing", and debug it. In a traditionally coded scaling algorithm, you can debug through a problematic case, and try to figure out why the algorithm is giving sketchy results. With a CNN what can you do, apart from add more data to the training set, retrain and hope for the best?
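To make the contrast concrete, here is a sketch of the "traditionally coded" side of that comparison (my illustration, not any shipping scaler): a plain bilinear upscaler in which every output pixel can be traced in a debugger back to exactly four input samples and their weights, something that has no equivalent inside a trained CNN.

```python
import numpy as np

def bilinear_upscale(img: np.ndarray, scale: int) -> np.ndarray:
    """Upscale an HxW grayscale image by an integer factor with bilinear
    interpolation. Every output pixel is a weighted sum of four input samples,
    so a "wrong" pixel can be traced back to exactly which samples and weights
    produced it, unlike a CNN's millions of learned parameters."""
    h, w = img.shape
    out = np.empty((h * scale, w * scale), dtype=np.float32)
    for y in range(h * scale):
        for x in range(w * scale):
            # Map the output pixel back into source coordinates
            src_y, src_x = y / scale, x / scale
            y0, x0 = int(src_y), int(src_x)
            y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
            fy, fx = src_y - y0, src_x - x0
            top = img[y0, x0] * (1 - fx) + img[y0, x1] * fx
            bot = img[y1, x0] * (1 - fx) + img[y1, x1] * fx
            out[y, x] = top * (1 - fy) + bot * fy
    return out

# e.g. upscale a tiny 4x4 gradient by 2x and inspect any output pixel directly
tiny = np.arange(16, dtype=np.float32).reshape(4, 4)
print(bilinear_upscale(tiny, 2))
```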
 

pj-

Senior member
May 5, 2015
501
278
136
I've been waiting for comparisons of DLSS 4k to rendering at sub-native res.

It is a shocking failure how bad DLSS looks in comparison.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Part of the problem with CNNs is that it's nigh on impossible to verify what the heck it is "actually doing", and debug it. In a traditionally coded scaling algorithm, you can debug through a problematic case, and try to figure out why the algorithm is giving sketchy results. With a CNN what can you do, apart from add more data to the training set, retrain and hope for the best?

Yeah, it's a lot of: make some changes, have it retrain itself, and wait for the results. If it's not how you wanted it, go back to step one and try again. It's very tedious.

And maybe nVidia will get this good enough to be usable. But to me it seems like this was mostly a marketing thing. They get to talk about how their amazing deep learning tech was able to do this awesome thing. Except the awesome thing is only awesome on paper.

I do find it interesting that the quality is *SO* much worse than basic upscaling. They claim DLSS at 4K is rendering at 1440p, but it almost seems like it's actually rendering way lower than that. Or it's rendering less data than what a GPU would typically render at that resolution.
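For reference, the raw pixel arithmetic behind that claimed 1440p internal resolution (my numbers; this only checks the math, not what the driver actually renders):

```python
# Quick pixel-count check on the "DLSS 4K renders at 1440p" claim.
native_4k = 3840 * 2160      # 8,294,400 pixels
internal  = 2560 * 1440      # 3,686,400 pixels
print(f"1440p is {internal / native_4k:.0%} of the 4K pixel count")  # -> 44%
# i.e. roughly 56% of the final 4K pixels have to be reconstructed rather than rendered
```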
 

amenx

Diamond Member
Dec 17, 2004
4,525
2,862
136
I'm glad Nvidia is getting a lot of heat over this. It should teach them to be very careful with what they promise next, and not to deliver half-baked new features or techniques that don't deliver adequately on day 1.