RTX continues to seriously disappoint me


mopardude87

Diamond Member
Oct 22, 2018
3,348
1,576
96
1080p 60 fps for a xx60 part, the eighth fastest ray tracing GPU of the first generation of hardware and people are mocking it..... Understanding of technology is rather lacking in this discussion.

Yeah, I played some BF5 at 1080p on my 2070 with RTX on, and FPS-wise, for the limited time I played, it ran a solid 60 fps. Not sure if it's a driver issue or a DX12 issue, but despite the 60+ fps it did not feel smooth at all. It felt stuttery, though it did look very nice. Well, as nice as 1080p can look on a 28" 4K monitor.

It's just that at 4K I'm not going to enjoy 4K RT smoothly for maybe two more generations? By then no one will be playing BF5, not that I am now. I think it's a trash game.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
You younger, Hitman? Not an insult in any way, but you could wake me from a drunken stupor and I could tell you what the Q2 bench shows; it's been the same for decades. There are a lot of other settings you can tweak in Q2, stuff that goes way beyond the original version. Running medium GI with water caustics off looks better overall and plays faster than GI low with water caustics on.

When is the 2060 going to play any new title at 4K 60 FPS with everything maxed? In Metro, playing with RTX mode on is faster on the 2060 than playing at the Ultra setting with RTX off. You still aren't getting close to 60 Hz at 4K, maybe not even with the lowest settings.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
When is the 2060 going to play any new title at 4K 60 FPS with everything maxed? In Metro, playing with RTX mode on is faster on the 2060 than playing at the Ultra setting with RTX off. You still aren't getting close to 60 Hz at 4K, maybe not even with the lowest settings.
Then you know why it is still a gimmick. And will be for quite some time.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The Radeon 5700 XT can't play Metro Exodus at max settings at 4K 60 Hz either, so by your standard that card is a gimmick.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
Then you know why it is still a gimmick. And will be for quite some time.

Then maybe 4k 60Hz is also a gimmick because there are many GPUs out there, including the 5700XT, which do not achieve this?

Ray tracing is no different from any other IQ-enhancing technology, like screen resolution, for instance: you cannot necessarily enable everything at the same time on a mid-range card. That does not mean that ray tracing, screen resolution, or anything else is a gimmick.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Decisions, decisions.
Also keep in mind that you're not just seeing Quake 2 RTX, you're seeing Quake 2 RTX + HD content. The textures and gun model (for example) have nothing to do with RTX. nVidia's lying by omission, much like Intel "forgot" to tell us their 5GHz Xeon doesn't actually exist.

So... sub-60 FPS at 1080p on $1,200 hardware released in 2019, in a 22-year-old game.

Also, ray tracing is supposedly so "good", yet Quake 2 RTX's visuals have to be propped up by HD content that's been around for over a decade.

Epic fail.

A single average number isn't worth much when you have no idea what scene they're testing or how they're testing it. Actual video benchmarks consistently show under 60 FPS, even with GI turned to low:
Yep, real in-game performance from actual Quake 2 RTX gameplay slideshows even a $1,200 2080 Ti. Or you can simply run Berserker, which looks better overall than Quake 2 RTX and runs far faster. It also works on any DX9 video card and doesn't require horrifically overpriced Turing cards.
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
22,709
3,007
126
Then maybe 4k 60Hz is also a gimmick because there are many GPUs out there, including the 5700XT, which do not achieve this?
Many GPUs achieve it just fine in older games. I routinely run 5K @ 144Hz (many with 4xMSAA) in a lot of 5+ year old games on my 1070, including a Quake 2 source-port which has nice OpenGL 3.2 dynamic lighting.

Meanwhile, ray tracing turns a 22-year-old game into a slideshow at 1080p with no AA on 2019 hardware. So no, 4K is not even close to being in the same boat as ray tracing.

Ray Tracing is more like hardware PhysX. Utter garbage performance relative to the visual gain, plus vendor lock-in and pricing, while existing techniques run far faster and already look really good.
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
23,196
13,279
136
Doom eternal

Someone else already responded to that. Aaaaaand not released yet.

Cyberpunk

Also not released yet. I did some poking around and

https://www.digitaltrends.com/computing/games-support-nvidia-ray-tracing/

found it to be slim pickings for RT support.

all the major next gen console titles revealed so far

That's gonna be a major test not only for AMD's hardware, but for DXR in general. How many of these titles will go from 4K 30 fps or 1440p 60 fps to 1080p 30fps because of RT?

The entire industry is on board outside of a small handful of people rabidly anti progress.

Nobody's "anti-progress"; some people just realize that:

a) Nvidia's implementation of RT has thus far been underwhelming, and
b) not as many people care about RT as you might think.

The fact that you've donned the mantle of "progress" is quite telling.

When can we enjoy Ray Tracing in 4K on RTX 2060 at 60 FPS? ;)

And I mean Enjoy. With everything cranked up to the limits.

Um, never? I'm skeptical as to whether a 2060 is good for 4K 60 fps without RT. Unless you make some IQ sacrifices. It's more of a 1440p card.

The Radeon 5700 XT can't play Metro Exodus at max settings at 4K 60 Hz either, so by your standard that card is a gimmick.

Pssh, stop being silly. It's a 1440p card too. You might get somewhere if you'd stop slinging balderdash. Maybe.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Many GPUs achieve it just fine in older games. I routinely run 5K @ 144Hz (many with 4xMSAA) in a lot of 5+ year old games on my 1070, including a Quake 2 source-port which has nice OpenGL 3.2 dynamic lighting.

Meanwhile, ray tracing turns a 22-year-old game into a slideshow at 1080p with no AA on 2019 hardware. So no, 4K is not even close to being in the same boat as ray tracing.

Ray Tracing is more like hardware PhysX. Utter garbage performance relative to the visual gain, plus vendor lock-in and pricing, while existing techniques run far faster and already look really good.

Perhaps give RT five years to hit those same performance targets, then; crapping on it from a great height right now is a bit like junking electric cars because they can't go 500 miles on a single charge.

I see plenty of people hating on new things for banal reasons like "my preferred brand doesn't have it" or "it's not as good as x, y or z". We techies seem to be so utterly entrenched in the turf war that it blinds us, on occasion, to the flaws in our own camp while we attack the other. "Anti-progress" aka anti Nvidia/RTX/Proprietary opinionated posters will do a 180 when AMD joins the pre-party but you just know some will deny hating on RT now when that happens.

FWIW, I too think the 2060/2060S, maybe even the 2070, are 1080p 144 Hz / 1440p cards. 4K is still not fully covered by the 2080 Ti, so expecting a 2060 to do it is a bit much. The consoles in the next gen will be RT Lite; the refresh or the generation after will be when RT is either a success or stillborn like 3D.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
I see plenty of people hating on new things for banal reasons like "my preferred brand doesn't have it" or "it's not as good as x, y or z". We techies seem to be so utterly entrenched in the turf war that it blinds us, on occasion, to the flaws in our own camp while we attack the other. "Anti-progress" aka anti Nvidia/RTX/Proprietary opinionated posters will do a 180 when AMD joins the pre-party but you just know some will deny hating on RT now when that happens.

FWIW, I too think the 2060/2060S, maybe even the 2070, are 1080p 144 Hz / 1440p cards. 4K is still not fully covered by the 2080 Ti, so expecting a 2060 to do it is a bit much. The consoles in the next gen will be RT Lite; the refresh or the generation after will be when RT is either a success or stillborn like 3D.
Everybody here who is sceptical about ray tracing at this point in time would gladly call out AMD for the same reasons they call out Nvidia. Any GPU that has lower RT and rasterization performance than the RTX 2080 Ti makes no sense at this point.

The other side of this coin is whether AMD can offer RTX 2080 Ti-level RT performance on a GPU that has lower rasterization performance than the RTX 2080 Ti.

This also makes me laugh. So far in this thread people constantly discuss AMD vs Nvidia, forgetting the main goal of this thread, which is discussion of Nvidia's implementation of ray tracing: RTX. There is one specific user here who wants to spin the discussion towards an Nvidia vs AMD war. Most people here didn't even bother to go down that route.

Getting back on topic: RTX and ray tracing right now are a gimmick when you need to pay $1,200 to enjoy all of it in the biggest glory possible.

The reason I talked about the RTX 2060 being capable of running RTX in 4K at Ultra settings is very simple. Only when GPUs at this price point offer ray tracing and rasterization performance that can handle RT at 4K Ultra settings will RT stop being a gimmick.

Somebody said a few weeks ago that he thinks the first game requiring RT-capable hardware will arrive in 2023. By then we will have had two more GPU generations, most likely on a 5 nm node, which is a very logical assumption. When that happens, ray tracing will not be a gimmick.
 
  • Like
Reactions: Ranulf and kawi6rr

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Just so we're clear Glo, are you saying every single AMD GPU is a pointless gimmick? I'm trying to read your post as anything but insane, and the only way I can do that reads like you think AMD needs to shut down their GPU division.

For the people claiming not to be anti-progress, how are you rationalizing that in your head? Saying RTX isn't for you, or that you don't like the exact parts on offer now, that I can get. But the utter insanity of comments like "Quake2XP looks better" (which is why I jumped into this conversation in the first place) is comically inaccurate.

1080p monitors are roughly forty times more common amongst gamers than 4K (that's not a guesstimate), yet the people in this thread mock the overwhelming majority and say it doesn't matter, that the less than 2% of people who own 4K monitors are what matters.

This is the reality of today: the overwhelming majority of people game on a 1080p display, so how do you improve their visual experience? Ray tracing is a very, very good answer to that.

But no, instead let's hate on progress and trash a first gen technology that is genuinely usable and gaining support faster than any major tech I can ever remember.

I asked about upcoming AAA games ray tracing support for a reason, the overwhelming majority of the entire industry is on board. If you don't like it, just run low quality settings and be happy.
 

Glo.

Diamond Member
Apr 25, 2015
5,930
4,991
136
So in your head RTX is worth the price premium? Where you always have to sacrifice something? Be it performance, image quality or money?

That's not even mentioning that, currently, ray tracing is not even impressive from an image quality/fidelity point of view, as is typical for first iterations of new tech on limited hardware.

Currently, to even enjoy ray tracing at full tilt, with everything on Ultra even at 1080p, you have to pay $1,200 for an RTX 2080 Ti. I'm sure that in your head that is good value and a good deal, but in reality it's a completely and utterly stupid idea. We've talked about it here, even without you and your AMD vs Nvidia agenda, for months. There were people like you here claiming that progress is good and that ray tracing adds a lot of fidelity. At the same time, they said that to enjoy that fidelity they have to sacrifice... fidelity, because they cannot run RT at 1080p on an RTX 2060 at Ultra details with decent enough framerates.

But you know what? Nobody is claiming that progress is bad. It's just that RTX is not impressive, and it effectively always results in some sort of disappointment, AT THIS VERY MOMENT, because the tech is still not ready for prime time. Neither the hardware nor the software is. Which is the bloody point of this thread. For months everybody has been saying that RT is the future. But right now it is a gimmick. Deal with it, bro, because that is the reality we are facing.

RTX is a disappointment at this point. From a fidelity point of view, because it is not that impressive at all; it will become impressive, and a must-have, in a few years, but right now it is still disappointing.
It is a disappointment from a performance point of view, because it sacrifices a lot of performance to make RT possible.
And if you want both performance and fidelity that is arguably not disappointing, you have to sacrifice money.

All in all, this whole thread can be summed up by one very simple question: is RTX worth the price you pay for it, be it in terms of fidelity, price or performance? Objectively? No. Not yet. In a few years it will be, because the engine tech will be much more mature, the hardware will be much more mature, and RT-capable hardware will span far more price points. This is what AMD actually said about ray tracing, and it is 100% factually correct: RT will not become mainstream until mainstream GPUs can handle it.

And this was the whole point of this thread: ray tracing being a disappointment because of the prices you have to pay for it right now. This thread's point was not to turn it into an AMD vs Nvidia flame war, like the agenda you have to push, because you mistake the point. RTX is Nvidia's implementation of ray tracing. Replace the title "RTX continues to seriously disappoint me" with "Ray tracing continues to seriously disappoint me" and we have a completely different meaning.

All you do here is try to defend Nvidia's brand, not discuss the point of this thread.
 
  • Like
Reactions: OTG

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
My first post was a video by digital foundry talking about how astonishingly good ray tracing looked in the game this thread chose to use to trash RTX. Then we had a bunch of people who think from what they've heard it isn't that great.

Fine, let's use another game
Metro Exodus
So this game has a flat out comparison using higher resolution shaders versus ray tracing(DXR vs extreme) and ray tracing is *faster* while looking significantly better. This is exactly what the anti RTX crowd has been saying, except the opposite.

I made nVidia vs AMD an issue because of the flat out [Redacted] about ray tracing being proprietary and not an industry standard that was being promoted by team red. DXR is an industry standard AMD is not supporting for their customers right now.

So, Metro Exodus, higher resolution shaders vs ray tracing- ray tracing is both faster and looks way better.

I know this isn't a fair discussion, I actually know what I'm talking about while the anti crowd is based on how what they heard made them feel. This thread is going to make some people look really, really foolish in the not too distant future.




Profanity in the technical forums is not allowed.


esquared
Anandtech Forum Director
 
Last edited by a moderator:
  • Like
Reactions: Muhammed

nurturedhate

Golden Member
Aug 27, 2011
1,767
774
136
My first post was a video by digital foundry talking about how astonishingly good ray tracing looked in the game this thread chose to use to trash RTX. Then we had a bunch of people who think from what they've heard it isn't that great.

Fine, let's use another game
Metro Exodus
So this game has a flat out comparison using higher resolution shaders versus ray tracing(DXR vs extreme) and ray tracing is *faster* while looking significantly better. This is exactly what the anti RTX crowd has been saying, except the opposite.

I made nVidia vs AMD an issue because of the flat out bull $h!t about ray tracing being proprietary and not an industry standard that was being promoted by team red. DXR is an industry standard AMD is not supporting for their customers right now.

So, Metro Exodus, higher resolution shaders vs ray tracing- ray tracing is both faster and looks way better.

I know this isn't a fair discussion, I actually know what I'm talking about while the anti crowd is based on how what they heard made them feel. This thread is going to make some people look really, really foolish in the not too distant future.
This is the problem. You openly state you had to defend Nvidia. You openly state you went off topic. You openly state you turned a thread about RTX into an Nvidia vs AMD flame-bait thread. You openly swore, against forum rules; you know that, which is why you purposely misspelled the word. You state you are the only one who knows what they are talking about and that everyone else is stupid and foolish. You twist words around and shift goalposts. It's been weeks of this. This is trolling.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Interesting read. I was defending the truth: DXR is an industry standard. That is not proprietary. That is the truth.

People were saying Q2RTX looked worse than Q2XP; I provided a link with analysis blowing that out of the water.

People were saying higher resolution shading looks better and/or is faster than ray tracing; I provided a link with evidence showing this isn't always the case.

The counterpoints? People's feelings based on what they've heard. You say *I'm* trolling? I'd have to say you don't understand what that word means.

I self-censored the 'swear', you know, the same thing they do on network TV?

I also never said I'm the only one who knows what they're talking about; I said the people I'm debating don't, which is a massive difference. I have seen zero analysis beyond "if you crank everything to the highest settings at 4K it isn't playable."

Trolling? Please. Get some people here with actual counterpoints or some credible third-party analysis; in other words, get the actual trolls out of the thread and offer some reasonable counterpoints.
 
  • Like
Reactions: Muhammed

DrMrLordX

Lifer
Apr 27, 2000
23,196
13,279
136
"Anti-progress" aka anti Nvidia/RTX/Proprietary opinionated posters will do a 180 when AMD joins the pre-party but you just know some will deny hating on RT now when that happens.

If it stinks as badly as RTX stunk on those first-gen NV cards then AMD deserves to be slapped around for it. AMD darn well knows they can't do it right now. They need to focus on making their cards faster compared to NV overall. They haven't even overcome the performance gap between . . . let's say Radeon VII and 2080Ti. Do that and then MAYBE they can think about RT. And even then, when you consider how much silicon NV threw at RT when they could have worked on a big brother to the 1660Ti instead, you have to wonder . . . was it worth it? We still don't have 4k 60fps for the masses and 4k 144 fps for the high-end. And NV is worried about RT instead? Cmon guys.

Just so we're clear Glo, are you saying every single AMD GPU is a pointless gimmick?

Logical fallacies like this are why people accuse you of trolling. Maybe you think you're being clever. You aren't. Sorry.

yet the people in this thread mock the overwhelming majority

Another logical fallacy. Nobody's mocking people for running 1080p. A video game from 22 years ago should be running at around 500 fps, give or take. Why is it struggling to hit 60 fps at 1080p with RT turned on?

Because it sucks. That. Much.
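For scale, here's a back-of-envelope sketch (the ray counts are my own assumptions, purely illustrative, not measured from Quake 2 RTX): path-tracing cost scales with pixels and rays rather than with the age of the game's assets, and even modest assumptions put 1080p at 60 fps well over half a billion rays per second before denoising.

```python
# Rough ray-budget estimate for path-traced 1080p @ 60 fps.
# Bounce and shadow-ray counts are assumptions for illustration,
# not measurements from Quake 2 RTX.
width, height, fps = 1920, 1080, 60
primary_rays_per_sec = width * height * fps          # ~124 million/s
bounces_per_primary = 2                               # assumed
shadow_rays_per_bounce = 1                            # assumed
rays_per_pixel = 1 + bounces_per_primary * (1 + shadow_rays_per_bounce)
total_rays_per_sec = primary_rays_per_sec * rays_per_pixel

print(f"primary rays/s: {primary_rays_per_sec / 1e6:.0f} M")
print(f"total rays/s:   {total_rays_per_sec / 1e9:.2f} B")  # ~0.62 billion
```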

If you don't like it, just run low quality settings and be happy.

Or I can just run Ultra and turn that off (like god rays . . . who even uses those anymore?). Remember god rays? Were those progress too?
 
Last edited:

Hitman928

Diamond Member
Apr 15, 2012
6,753
12,492
136
So this game has a flat out comparison using higher resolution shaders versus ray tracing(DXR vs extreme) and ray tracing is *faster* while looking significantly better.

The Extreme performance graph is showing the performance of the Extreme quality preset, not just higher resolution shaders. According to the article, doubling the shader rate has a 25-30% performance hit, while RTX has a 30-40% performance hit going from High to Ultra, so while RTX does look better, it also has a bigger hit. Additionally, by doubling the shader rate of the rasterized graphics you are taking multiple samples per pixel, essentially supersampling the shading, whereas even at Ultra, RTX is doing less than a sample per pixel and then denoising the result. While RTX still looks better, it's pretty disingenuous to say it performs faster.
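To put those two ranges side by side, a minimal sketch (the 60 fps baseline is an assumption for illustration; the percentage ranges are the ones quoted above):

```python
# Apply the quoted performance hits to a common, assumed baseline.
baseline_fps = 60.0  # assumed for illustration, not from the article

def fps_after_hit(fps, hit_pct):
    """Frame rate remaining after losing hit_pct percent of performance."""
    return fps * (1 - hit_pct / 100)

for label, low, high in [("2x shading rate", 25, 30),
                         ("RTX high -> ultra", 30, 40)]:
    worst = fps_after_hit(baseline_fps, high)
    best = fps_after_hit(baseline_fps, low)
    print(f"{label}: {worst:.0f}-{best:.0f} fps from a {baseline_fps:.0f} fps baseline")
```

From the same starting point, the doubled shading rate comes out a few fps ahead of RTX at both ends of the quoted ranges, which is the point about the bigger hit.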
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
The Extreme preset only visibly changes shader resolution; everything else stays the same in the settings panel. I don't know if they spelled that out in the article or not; I'm going on the fact that I actually tested it myself. The 2060 is faster in DXR mode at 1440p than Extreme mode without ray tracing at 1080p. That is the reality of what many consider to be the best-looking game out.

That I consider more of a canary: increasing traditional shading long ago passed the crest of diminishing returns, and this is only going to become more evident as we move forward.

DXR mode in Metro is faster than Extreme, markedly so. Thank you for reading the article, by the way; it's nice to have an actual conversation instead of people claiming you're wrong for no apparent reason.

Glo said anything slower than a 2080 Ti is a gimmick. He said that, and you say it's trolling if I call that out? Seriously? Wow.

For the record, on a 2060 Quake 2 runs at roughly 1500 FPS, not 500. That's your side of the debate, but accuracy is the important element here. Also, it's a 22-year-old game with the most advanced lighting engine ever shipped in a game to date.

The claims in this thread were that AMD would support ray tracing when there was an open standard; DXR is already here. DXR is an industry standard, not nVidia-proprietary. Arguing that DXR shouldn't be supported because RTX hardware is proprietary would be akin to arguing developers shouldn't support Navi because it's proprietary. It's an argument so stupid that I try to read in an implied alternative that wouldn't be that dumb; I'm attempting to give the benefit of the doubt.
 
Last edited:

Hitman928

Diamond Member
Apr 15, 2012
6,753
12,492
136
The Extreme preset only visibly changes shader resolution; everything else stays the same in the settings panel. I don't know if they spelled that out in the article or not; I'm going on the fact that I actually tested it myself.

They do spell it out, as best they can:

"Extreme quality is basically like ultra, only even more demanding. The shading rate is 200, and while there's no specific information on what the Quality setting does, it appears to increase texture resolution, shadow quality, and a few other bits."

I guess you'd have to ask the developer exactly what it changes, but according to pcgamer, it's more than just the shader resolution.

The 2060 is faster in DXR mode at 1440p than Extreme mode without ray tracing at 1080p. That is the reality of what many consider to be the best-looking game out.

Again, Extreme mode is changing things in a way that's like running at a higher resolution, i.e. supersampling. It may even be applying SSAA or some such, so comparing ray tracing on versus Extreme mode is disingenuous.
 

DrMrLordX

Lifer
Apr 27, 2000
23,196
13,279
136
Glo said anything slower than a 2080 Ti is a gimmick

No he didn't. He said:

Any GPU that has lower RT and rasterization performance than the RTX 2080 Ti makes no sense at this point.

Which basically means, "if you don't have a 2080Ti, don't turn on RT".

For the record, on a 2060 Quake 2 runs at roughly 1500 FPS, not 500.

Wow thanks for making my argument for me.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Fine, let's use another game
Metro Exodus
So this game has a flat out comparison using higher resolution shaders versus ray tracing(DXR vs extreme) and ray tracing is *faster* while looking significantly better. This is exactly what the anti RTX crowd has been saying, except the opposite.

That's a poor comparison, because the RTX version is being rendered at a far lower resolution and results in blurry textures (the reviewer straight up says that DLSS looks poor and that manually setting a lower resolution looks better). If you want to compare performance, you have to compare them at THE SAME RESOLUTION and at the same settings, with RTX being the only difference.

It's been proven time and time again that DLSS is a sham and that basic upscaling looks superior while offering the same performance. When you run the game at the same resolution with RTX on and off, RTX runs significantly slower, as the benchmarks in this article show.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Look, this sort of anti-hyperbole is just as annoying as the other way round.

DLSS has plenty of problems, especially the need to train the neural nets an awful lot to use it, and under specific conditions too. We do, however, know that neural nets are good at this sort of interpolation. We also know that NV knows an awful lot about deep learning on images and has the hardware, so the training will be working OK (plus tensor cores to run it).

Claiming that a basic upscaling algorithm will work better than DLSS in terms of IQ is simply profoundly implausible.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Alright, ran the benches myself.

Ultra: 56.43
Ultra + 200% shading rate: 42.19
Extreme: 31.93
RTX, no DLSS: 47.82
RTX: 54.71

Throw away the DLSS numbers; I just ran them for comparison because that's now the default RTX mode. RTX is faster than Ultra with a 200% shading rate (and nothing else changed) and significantly faster than Extreme. There was obviously a performance patch for Metro in the last few months, heh.
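For reference, the same runs as simple percentage math against plain Ultra (a quick sketch; I'm treating the numbers above as average fps):

```python
# Express the runs above as percentage deltas relative to plain Ultra.
# Values are the figures from the post, treated as average fps.
results = {
    "Ultra": 56.43,
    "Ultra + 200% shading rate": 42.19,
    "Extreme": 31.93,
    "RTX, no DLSS": 47.82,
    "RTX (DLSS on, now the default)": 54.71,
}
baseline = results["Ultra"]
for name, fps in results.items():
    delta = (fps - baseline) / baseline * 100
    print(f"{name:30s} {fps:6.2f} fps ({delta:+5.1f}% vs Ultra)")
```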
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Look, this sort of anti-hyperbole is just as annoying as the other way round.

DLSS has plenty of problems, especially the need to train the neural nets an awful lot to use it, and under specific conditions too. We do, however, know that neural nets are good at this sort of interpolation. We also know that NV knows an awful lot about deep learning on images and has the hardware, so the training will be working OK (plus tensor cores to run it).

Claiming that a basic upscaling algorithm will work better than DLSS in terms of IQ is simply profoundly implausible.

It's been proven in every game that has DLSS support that basic upscaling provides better image quality and nearly identical performance. Several places have run these tests. AMD's new Radeon Image Sharpening also provides superior image quality to DLSS, and works with any DX9/DX10/DX12 game (DX11 support is coming). Now, obviously AMD doesn't have RT support yet, so it's only handy for people wanting to boost performance in regular rasterized games. But it shows that nVidia's tech behind DLSS is inferior at this time.