TPU: many combinations of GPU/resolution/RTX/DLSS not allowed


maddie

Diamond Member
Jul 18, 2010
4,738
4,667
136
I cannot remember ever having made such a claim. Can you please share a quote?



Future nodes? You do not seem to realize that Moore's Law is slowing down and that heterogeneous computing is becoming more important in the future.

In addition, for gaming you always need to solve global illumination and other problems for which currently (without raytracing) only very crude approximation methods are available. So it's not like the RT units are sitting idle in gaming workloads.
I meant games that exist right now with no RT usage: a GPU whose general-purpose cores can also do RT will have an advantage in such games over a GPU that reserves part of its area for specialized hardware, simply because it can fit more cores. Tensor cores make it even worse. That's all.

By the way, even if node advancement is slowing, it still exists: 7nm brings a 100% improvement, and 5nm and 3nm are coming in the next few years, bringing at least another 100%. Not trivial in my opinion.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
I meant games that exist right now with no RT usage: a GPU whose general-purpose cores can also do RT will have an advantage in such games over a GPU that reserves part of its area for specialized hardware, simply because it can fit more cores. Tensor cores make it even worse. That's all.

I was talking about future games only, not about legacy stuff. You cannot have both faster raytracing performance and lower prices without RT cores. In particular, I made the reference to consoles, where everything is about balanced pricing.
Tensor cores are pretty much useless in a gaming-oriented product - agreed.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,959
126
Well, no Tensor cores for the 1660 Ti. I'm not surprised, given there was no way they could offer DLSS without cannibalizing the 2060. The 1660 Ti is better off without them, anyway.
 

CP5670

Diamond Member
Jun 24, 2004
5,510
588
126
DLSS is a good concept but seems poorly implemented. Even if they fix the blurriness, the fact that it needs to be trained on each game and resolution separately means it will remain a niche feature and not widely supported across all games.
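
To make the per-game, per-resolution point concrete: the support matrix effectively reduces to a lookup table, which is why so many combinations end up "not allowed". The sketch below is purely illustrative; the titles and the `dlss_available` helper are made up, not any real NVIDIA API.

```python
# Purely illustrative: DLSS-style upscaling can only be enabled where a network
# was actually trained, so support ends up gated per (game, resolution) pair.
# The titles and this helper are hypothetical, not any real NVIDIA API.
TRAINED_MODELS = {
    ("GameA", (3840, 2160)),
    ("GameA", (2560, 1440)),
    ("GameB", (3840, 2160)),
}

def dlss_available(game, resolution):
    """The option only appears if a model exists for this exact combination."""
    return (game, resolution) in TRAINED_MODELS

print(dlss_available("GameB", (2560, 1440)))  # False -> combination not offered
```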

I think there is a larger discussion that needs to be had about the future of the dGPU and the lack of titles worth buying a shiny new GPU for. I think we consumers, pro-users, enthusiasts, etc. need to step back and re-evaluate what kind of card WE want to buy, rather than being told by AMD/nV that new tech justifies charging twice as much as last gen.

And willingly paying for it.

I agree; I don't see much incentive to keep buying new video cards at this point. You can get a slightly higher resolution or refresh rate, but the actual game graphics have not improved much for years now. It seems like any significant improvement from here would make games prohibitively expensive to develop and won't be happening, especially with the big AAA companies already struggling to come up with good business models. For most existing games, the 1080 Ti is generally good enough even at 4K, even if I have to turn down a few settings that make no visual difference.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
DLSS is a good concept but seems poorly implemented. Even if they fix the blurriness, the fact that it needs to be trained on each game and resolution separately means it will remain a niche feature and not widely supported across all games.

That's how it had to be, though? Otherwise you've just got a fixed hardware upscaler. NV do the training on their computers etc., so it's a really cheap thing for developers to support.

NV clearly do need to get quite a bit better at actually supporting it/doing the learning, though :)
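
For context, the offline side of the deal is basically standard supervised super-resolution: the vendor renders paired low-res/high-res frames from the game and fits an upscaler on its own hardware, which is why it costs the studio so little. A toy sketch of that recipe (not NVIDIA's actual pipeline; PyTorch with fake data, names hypothetical):

```python
# Toy sketch of the offline training side, not NVIDIA's actual pipeline:
# fit an upscaling network to paired low-res / high-res "ground truth" frames.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    """Stand-in for a learned 2x upscaler; real models are far larger."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 32, 3, padding=1)
        self.conv2 = nn.Conv2d(32, 3, 3, padding=1)

    def forward(self, low_res):
        x = F.interpolate(low_res, scale_factor=2, mode="bilinear",
                          align_corners=False)
        return self.conv2(F.relu(self.conv1(x)))

# Fake "captured frames": low-res inputs downsampled from high-res targets.
high_res = torch.rand(8, 3, 64, 64)               # pretend ground-truth crops
low_res = F.avg_pool2d(high_res, kernel_size=2)   # pretend rendered-low crops

model = TinyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):     # real training would use far more data and time
    opt.zero_grad()
    loss = F.mse_loss(model(low_res), high_res)
    loss.backward()
    opt.step()
```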
 

NTMBK

Lifer
Nov 14, 2011
10,232
5,013
136
That's how it had to be, though? Otherwise you've just got a fixed hardware upscaler. NV do the training on their computers etc., so it's a really cheap thing for developers to support.

NV clearly do need to get quite a bit better at actually supporting it/doing the learning, though :)

Given how hot machine learning is right now, do you really think Nvidia is going to be just giving away training time on their supercomputer?
 

DrMrLordX

Lifer
Apr 27, 2000
21,622
10,830
136
Given how hot machine learning is right now, do you really think Nvidia is going to be just giving away training time on their supercomputer?

Can AI training be run across a DC grid? Maybe NV needs to build training software into their driver stack and let buyers do the training, with an opt-in.
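
For what it's worth, that opt-in idea would look roughly like federated averaging: each consenting user trains locally on their own GPU and only weight updates go back to the server. A naive sketch of the concept (nothing NVIDIA actually ships; all names hypothetical, and it assumes a float-only model like the toy upscaler above):

```python
# Naive federated-averaging sketch of the "opt-in, train on the players' cards"
# idea. Nothing NVIDIA ships; purely illustrative.
import copy
import torch
import torch.nn.functional as F

def local_update(global_model, frames_lo, frames_hi, lr=1e-3, steps=10):
    """One volunteer's contribution: a few local training steps, weights returned."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        F.mse_loss(model(frames_lo), frames_hi).backward()
        opt.step()
    return model.state_dict()

def federated_average(global_model, client_states):
    """Server side: average the weights sent back by the opted-in clients."""
    averaged = {
        key: torch.stack([state[key] for state in client_states]).mean(dim=0)
        for key in client_states[0]
    }
    global_model.load_state_dict(averaged)
    return global_model

# Usage sketch (assuming some 'model' and per-client frame pairs exist):
# states = [local_update(model, lo, hi) for lo, hi in volunteer_batches]
# model = federated_average(model, states)
```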
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Given that NV already have the training hardware, and it's one of the features they're using to try and push RTX?

They might plausibly be giving the time away, yes.