RTX continues to seriously disappoint me


BenSkywalker

Diamond Member
Oct 9, 1999
Microsoft would have known about RTX by 2016 at the absolute latest, though I would assume a year or two earlier is far more likely; that's somewhere between four and six years prior to their console launch. And then yes, they and Sony, who I'm sure was briefed on the technology, forced AMD to support the industry-standard technology, at least for the consoles.

There is an industry standard; nVidia is supporting it, and AMD has to date refused to for their consumer parts.

As far as agreeing with the OP: his follow-up post said the XP version had superior color lighting. Seriously, that's so absurdly wrong it needed to be addressed. The XP lighting is hot trash, absolute garbage in comparison (for its time it was great, but we aren't talking about for its time).

Q2RTX has, by far, the best top-to-bottom lighting I've seen in any game, ever; Metro Exodus is the only title I'd even consider remotely comparable.
 

nurturedhate

Golden Member
Aug 27, 2011
I think that was probably true - no one believed ray tracing was possible till the RTX cards came out. MS, having made the DXR standard, are clearly interested in ray tracing; they will have seen the cards (probably quite a bit before release) and demanded that AMD add something to their chips to support it. AMD is now doing a last-minute add of that support. We know it's last minute because we have Navi V1 and it's missing - if you look at AMD press releases, they go along the lines of "what ray tracing... err", then "we'll do it on the server", and now "it's in Navi V2 and next-gen consoles".
It's a definite possibility. I'm more of the mindset that MS wanted something to sell the new Xbox and it propagated from there. We got both Nvidia's and MS's RT announcements at GDC 2018. AMD being slow to respond and late is typical RTG. Nvidia's sloppy reveal, performance, demos, games that support the feature, DLSS, etc. were unlike anything we have seen from Nvidia in a decade. This seems more like shoehorning HPC/ML/etc. cards into consumer graphics cards because they saw an opening to be the standard-setter. RT is RTX. It's not DXR. It's not whatever AMD will bring.
 

BenSkywalker

Diamond Member
Oct 9, 1999
People keep talking about shoehorning ML/AI hardware into consumer parts; this is simply not reality. The dedicated RT cores on the RTX line are useless for anything other than intersection tests. The tensor cores, which account for quite a small portion of the die, are very useful, but if Nvidia were after ML/AI purposes they could have quadrupled the tensor cores and used regular shader hardware for intersection tests, getting four times the denoising.

That wouldn't have worked as well, not even close, but it would still be way faster than AMD and would've given them far better compute performance for AI/ML.
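For anyone wondering what an "intersection test" actually is, here's a minimal CPU-side sketch of the standard ray/triangle test (Moller-Trumbore) that RT cores accelerate in fixed function; the function names and epsilon are illustrative, not anything from NVIDIA's hardware:

#include <array>
#include <cmath>
#include <optional>

using Vec3 = std::array<float, 3>;

static Vec3 sub(const Vec3& a, const Vec3& b) {
    return {a[0] - b[0], a[1] - b[1], a[2] - b[2]};
}
static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]};
}
static float dot(const Vec3& a, const Vec3& b) {
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
}

// Moller-Trumbore ray/triangle test: returns the hit distance along the ray,
// or nothing on a miss. An RT core's job is running (the hardware equivalent
// of) this test, plus BVH traversal, in fixed function instead of on the
// shader ALUs.
std::optional<float> intersect(const Vec3& orig, const Vec3& dir,
                               const Vec3& v0, const Vec3& v1, const Vec3& v2) {
    const float kEps = 1e-7f;
    const Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    const Vec3 p = cross(dir, e2);
    const float det = dot(e1, p);
    if (std::fabs(det) < kEps) return std::nullopt;  // ray parallel to triangle
    const float inv = 1.0f / det;
    const Vec3 s = sub(orig, v0);
    const float u = dot(s, p) * inv;
    if (u < 0.0f || u > 1.0f) return std::nullopt;   // outside barycentric range
    const Vec3 q = cross(s, e1);
    const float v = dot(dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;
    const float t = dot(e2, q) * inv;                // distance along the ray
    return t > kEps ? std::optional<float>(t) : std::nullopt;
}

Running millions of these tests (and the BVH walk that feeds them) per frame on the shader ALUs is exactly the cost the dedicated units avoid, which is the trade-off being argued about here.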
 

njdevilsfan87

Platinum Member
Apr 19, 2007
I think Nvidia realized they couldn't just throw more FP32 units onto the cards forever. 7nm, 5nm, and then what? They'll hit a wall in performance, and we've been hearing about carbon nano-whatever CPUs for like two decades now.

No pun intended, but what Nvidia is doing is smart. RTX done well provides that next level of graphical immersion, and DLSS, matured, could give us 4-8K VR at high frame rates. The former uses the RT cores and the latter the tensor cores, so they've both got their place in gaming. Neither could be done with FP32 units alone for a very long time.
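Rough back-of-the-envelope arithmetic on why the DLSS approach pays off, assuming shading cost scales with pixels rendered (my numbers, picked only for illustration):

#include <cstdio>

// Back-of-the-envelope only: assume shading cost is proportional to pixels shaded.
int main() {
    const long native4k = 3840L * 2160L;  // ~8.3M pixels per frame at 4K
    const long internal = 2560L * 1440L;  // ~3.7M pixels at a 1440p internal res
    std::printf("4K native:      %ld pixels\n", native4k);
    std::printf("1440p internal: %ld pixels (~%.0f%% of the shading work)\n",
                internal, 100.0 * internal / native4k);
    return 0;
}

Under that assumption, rendering internally at 1440p and upscaling to 4K cuts the shading workload by more than half, with the reconstruction cost moved onto the tensor cores.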
 

nurturedhate

Golden Member
Aug 27, 2011
The dedicated RT cores on the RTX line are useless for anything other than intersection tests

Not true, but close enough for conversation. Even limiting it to expressly that, there are still plenty of professional cases where that acceleration is needed. There's a large difference between saying something wasn't developed at all (not what I'm saying) and saying its initial use case wasn't consumer cards (what I am saying).
 

Stuka87

Diamond Member
Dec 10, 2010
I think that was probably true - no one believed ray tracing was possible till the RTX cards came out. MS, having made the DXR standard, are clearly interested in ray tracing; they will have seen the cards (probably quite a bit before release) and demanded that AMD add something to their chips to support it. AMD is now doing a last-minute add of that support. We know it's last minute because we have Navi V1 and it's missing - if you look at AMD press releases, they go along the lines of "what ray tracing... err", then "we'll do it on the server", and now "it's in Navi V2 and next-gen consoles".

There is no such thing as a "last-minute add" of hardware to a chip architecture that has been in development for years. AMD/nVidia/Intel are all involved in everything that goes into DirectX. The idea that MS invented this standard on their own in secret and then suddenly asked AMD to add something after they were finished is laughable.
 

BFG10K

Lifer
Aug 14, 2000
Has anyone else reproduced the observed effect? That's what I'm asking.
Why would you expect her copy of BF5 to be unique? Any other RTX video quickly confirms it:


Watch the exact same lighting pulse from 7:02 repeat like clockwork every ten seconds, regardless of what's happening on the screen.
 

FiendishMind

Member
Aug 9, 2013
Why would you expect her copy of BF5 to be unique? Any other RTX video quickly confirms it:


Watch the exact same lighting pulse from 7:02 repeat like clockwork every ten seconds, regardless of what's happening on the screen.
Reproducing it confirms it's an actual issue, which it clearly is, or at least was back then.

Since we've seen false narratives about BFV's RTX implementation pushed to drum up outrage before, don't you think we should be prudent and get solid confirmation about these things before going completely ham?
 

Glo.

Diamond Member
Apr 25, 2015
I think that was probably true - no one believed ray tracing was possible till the RTX cards came out. MS, having made the DXR standard, are clearly interested in ray tracing; they will have seen the cards (probably quite a bit before release) and demanded that AMD add something to their chips to support it. AMD is now doing a last-minute add of that support. We know it's last minute because we have Navi V1 and it's missing - if you look at AMD press releases, they go along the lines of "what ray tracing... err", then "we'll do it on the server", and now "it's in Navi V2 and next-gen consoles".
Next-gen consoles have been in development for quite some time. Do you guys think that AMD, Microsoft, and Sony, knowing what Nvidia released, added Ray Tracing features (potentially) in response to it? Or were they hard at work developing this tech, and Nvidia beat everyone to releasing it, knowing perfectly well that it is not ready for prime time yet, and that they could sell more GPUs to the uneducated this way?
 

JasonLD

Senior member
Aug 22, 2017
Next-gen consoles have been in development for quite some time. Do you guys think that AMD, Microsoft, and Sony, knowing what Nvidia released, added Ray Tracing features (potentially) in response to it? Or were they hard at work developing this tech, and Nvidia beat everyone to releasing it, knowing perfectly well that it is not ready for prime time yet, and that they could sell more GPUs to the uneducated this way?

Game developers are not going to implement technology if there isn't any consumer GPU product out there that supports it. Nvidia jumped in early to get developers to start implementing ray tracing ASAP, so that when ray tracing performance is finally ready for prime time, developers will be ready to fully embrace it.
 

BenSkywalker

Diamond Member
Oct 9, 1999
How is ray tracing not ready for prime time? We witnessed a year-long splooge fest over async compute that gave us what?

Compare that to a video made by the people the OP linked, who called Q2RTX amazing, truly next-level, breathtaking, and several other high-praise adjectives.

It's been years now; link up any remotely respectable journalist using those adjectives for async compute.

"It'll be years before there is any support" is the normal refrain, so show me these mountains of examples, or, you know, one, of async compute accomplishing what ray tracing already has. Ray tracing is months old in hardware, games are out, it's an industry standard - where is AMD with their industry-standard driver support for this feature?

RT units are viable for the Quadro line, not Tesla. Saying something has professional uses is very different from saying it has HPC/AI/ML uses. On that front, intersection-test hardware would've been usable on pro hardware for decades; consumer parts are what got it done. Tensor cores were built for HPC customers and were made to work for consumers, sure; RT cores were consumer-driven and benefit the pro crowd.
 

joesiv

Member
Mar 21, 2019
Without async compute, ray tracing on Nvidia cards would have even worse frametimes, as it leverages it... It's a useful technology.
 

Glo.

Diamond Member
Apr 25, 2015
How is ray tracing not ready for prime time? We witnessed a year-long splooge fest over async compute that gave us what?

Compare that to a video made by the people the OP linked, who called Q2RTX amazing, truly next-level, breathtaking, and several other high-praise adjectives.

It's been years now; link up any remotely respectable journalist using those adjectives for async compute.

"It'll be years before there is any support" is the normal refrain, so show me these mountains of examples, or, you know, one, of async compute accomplishing what ray tracing already has. Ray tracing is months old in hardware, games are out, it's an industry standard - where is AMD with their industry-standard driver support for this feature?

RT units are viable for the Quadro line, not Tesla. Saying something has professional uses is very different from saying it has HPC/AI/ML uses. On that front, intersection-test hardware would've been usable on pro hardware for decades; consumer parts are what got it done. Tensor cores were built for HPC customers and were made to work for consumers, sure; RT cores were consumer-driven and benefit the pro crowd.
Can you run Ray Tracing on ALL of the new segments, including those that cost $150, and have a decent experience?

That is why it is not ready for prime time yet.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Useful and game-changing aren't even close to the same thing, and many of the people bashing ray tracing are the same people who spent a very long time singing the glorious benefits of async compute.

I claimed, and provided a link backing the assertion, that ray tracing is a major technology that has an incredible impact and visually embarrasses legacy technology. This is the tech under fire in this thread. Ray tracing is months old; I'm asking for ONE example of async compute being in that realm. I know it's not possible; it's a trivial technology that at best very mildly improves throughput under certain situations. But again, some of the people trashing ray tracing (the OP is not part of this group) were talking about how game-changing it was.

$150 isn't prime time, it's poor time. Can you get a PS4 for $150? Because it's sold over 100 million units faster than any console in history. I could say Ryzen isn't ready for prime time because you can't get one for $0.13; it'd just be inane drivel.

My launch PS4 ran $50 more than it costs to get ray tracing hardware right now.
 

Glo.

Diamond Member
Apr 25, 2015
Useful and game-changing aren't even close to the same thing, and many of the people bashing ray tracing are the same people who spent a very long time singing the glorious benefits of async compute.

I claimed, and provided a link backing the assertion, that ray tracing is a major technology that has an incredible impact and visually embarrasses legacy technology. This is the tech under fire in this thread. Ray tracing is months old; I'm asking for ONE example of async compute being in that realm. I know it's not possible; it's a trivial technology that at best very mildly improves throughput under certain situations. But again, some of the people trashing ray tracing (the OP is not part of this group) were talking about how game-changing it was.

$150 isn't prime time, it's poor time. Can you get a PS4 for $150? Because it's sold over 100 million units faster than any console in history. I could say Ryzen isn't ready for prime time because you can't get one for $0.13; it'd just be inane drivel.

My launch PS4 ran $50 more than it costs to get ray tracing hardware right now.
You do realize that Async Compute is something completely different from Ray Tracing, and so is the impact it has on graphics?

To shorten this debate: Async Compute can allow Ray Tracing. See the difference?

What you are doing is shifting the goalposts to fit your narrative. Maybe Ray Tracing is good. Maybe it is important. But the current implementations are completely and utterly rubbish, and completely and utterly useless for consumers.

And this is my last post on this topic here. It's completely pointless to debate it when the tech is not ready for Ray Tracing.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Async compute doesn't 'allow' ray tracing; whoever told you that is an utter moron. It's a concurrency feature for data processing, something that is beneficial for performance under certain conditions, and it gives you nothing new in terms of end results.
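To illustrate the concurrency point with a toy CPU-side analogy (GPUs overlap work across hardware queues, not OS threads, and the 8 ms/5 ms workloads here are made up):

#include <chrono>
#include <cstdio>
#include <future>
#include <thread>

// Stand-in workloads. On a GPU these would be a graphics pass and a compute
// pass submitted to separate queues so that idle units can pick up other work.
static void raster_pass()  { std::this_thread::sleep_for(std::chrono::milliseconds(8)); }
static void compute_pass() { std::this_thread::sleep_for(std::chrono::milliseconds(5)); }

int main() {
    using Clock = std::chrono::steady_clock;
    using std::chrono::duration_cast;
    using std::chrono::milliseconds;

    auto t0 = Clock::now();
    raster_pass();                 // serialized: ~8 ms + ~5 ms
    compute_pass();
    auto serial = Clock::now() - t0;

    t0 = Clock::now();
    auto job = std::async(std::launch::async, compute_pass);  // overlapped
    raster_pass();
    job.wait();
    auto overlapped = Clock::now() - t0;

    std::printf("serial: ~%lld ms, overlapped: ~%lld ms\n",
                (long long)duration_cast<milliseconds>(serial).count(),
                (long long)duration_cast<milliseconds>(overlapped).count());
    return 0;
}

Same output either way; the overlapped version just finishes in roughly the time of the longer job, which is all async compute ever buys you.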

People in this forum talked about how game-changing async compute was, for a very long time and at absurd length. Those same people are now trashing ray tracing, which is actually game-changing.

I already linked the video by DF showing how the tech humiliates traditional rendering and is very playable even on the lowest-end RTX part. Proof is already up.

Link up some counter-proof outside of feels.
 

DeathReborn

Platinum Member
Oct 11, 2005
How is ray tracing not ready for prime time? We witnessed a year-long splooge fest over async compute that gave us what?

Compare that to a video made by the people the OP linked, who called Q2RTX amazing, truly next-level, breathtaking, and several other high-praise adjectives.

It's been years now; link up any remotely respectable journalist using those adjectives for async compute.

"It'll be years before there is any support" is the normal refrain, so show me these mountains of examples, or, you know, one, of async compute accomplishing what ray tracing already has. Ray tracing is months old in hardware, games are out, it's an industry standard - where is AMD with their industry-standard driver support for this feature?

RT units are viable for the Quadro line, not Tesla. Saying something has professional uses is very different from saying it has HPC/AI/ML uses. On that front, intersection-test hardware would've been usable on pro hardware for decades; consumer parts are what got it done. Tensor cores were built for HPC customers and were made to work for consumers, sure; RT cores were consumer-driven and benefit the pro crowd.

DX12 might be a better analogy: still not showing any real benefits over DX11, and putting the extra work on already-overwhelmed devs has pretty much been proven to be a bad move. They already had crappy optimisations, poor ports, and general bugfests before having to put extra effort in via DX12.
 

BenSkywalker

Diamond Member
Oct 9, 1999
DX12 has gone pretty much exactly how I thought it would; assuming some normal coder would be able to make graphics code perform better than the driver developers could always seemed sketchy to me. Outside of the Carmack/Sweeney-level devs it's just not realistic to expect, not to mention it's more work for already overworked people.

I think there are some valid examples of DX12 offering marked improvements over DX11 in a few edge cases; more than async compute managed, anyway.

I'm not disagreeing with you, btw; low-level APIs were also absurdly overhyped.
 

maddie

Diamond Member
Jul 18, 2010
Async compute doesn't 'allow' ray tracing; whoever told you that is an utter moron. It's a concurrency feature for data processing, something that is beneficial for performance under certain conditions, and it gives you nothing new in terms of end results.

People in this forum talked about how game-changing async compute was, for a very long time and at absurd length. Those same people are now trashing ray tracing, which is actually game-changing.

I already linked the video by DF showing how the tech humiliates traditional rendering and is very playable even on the lowest-end RTX part. Proof is already up.

Link up some counter-proof outside of feels.
What some are trying to say, and you keep ignoring, is that using the term "game-changing" in the context of a technology presently usable by only 1% of consumers is delusional. In a few years, yes. Not now. So I guess it's correct to state "it will be game-changing".

Edit:
The Learjet wasn't game-changing, but the B737 was.
 

BenSkywalker

Diamond Member
Oct 9, 1999
Is this a semantics issue? OK, your context and tenses are off. If in four years' time ray tracing is dominant, it will have changed the game; the process of going from no ray tracing to ray tracing being dominant is the process of changing the game.

So if you believe that ray tracing will end up dominant, then you acknowledge it is game-changing technology. When ray tracing is dominant, it is no longer game-changing and is instead the status quo.

If you think it won't end up dominant, then it is valid to say it isn't game-changing from your perspective.

If you think it will, then it is game-changing, and the discussion is how long the change will take or how much progress it is making, not whether it is game-changing.
 

CakeMonster

Golden Member
Nov 22, 2012
I have no interest in the RTX part of my 2080 Ti, nor will I have any in the 2020 or 2021 generation.

But I want RT support in the coming years, and I hope both NV and AMD support it and quietly keep adding cores to each new chip they put out until it's actually usable in a couple of years. It would be a shame if it's dropped and we have to wait for it to be reinvented in 5-10 years.
 

BenSkywalker

Diamond Member
Oct 9, 1999
CakeMonster, do you own Quake 2? RTX is a free download and certainly worth checking out. Your card can cruise over 100 FPS at 1080p with moderate settings.

Imagination Tech, I want to be clear, in no way whatsoever am I bashing their efforts, but they were making ray tracing hardware for mobile. Their part hit about 500 MRays if I recall correctly, extremely impressive for a sub-10-watt 28nm chip for sure, but one tenth of what the 2060 can do and, very importantly, without tensor cores.

If you have a decent NV card and Quake 2, enable RTX and then disable denoising (if you have epilepsy, don't actually do that). You'll see that they are actually doing sparse sampling and then using AI to clean up the image. There are edge cases where this fails (shadows from fans cast on water are rough), but overall you get an idea of how much heavy lifting the 'AI only' portion of the chip is taking off the rest of it.
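A toy illustration of the sparse-sampling-plus-cleanup idea (Q2RTX's real denoiser is a far more sophisticated spatiotemporal filter; this box-filter reconstruction is mine, just to show why undersampled output needs cleanup):

#include <vector>

// Toy stand-in: an image where only some pixels received a ray sample,
// reconstructed with a 3x3 box filter over the sampled neighbors. Real
// denoisers are smarter (spatiotemporal, edge-aware), but the principle is
// the same: turn sparse, noisy samples into a full image.
std::vector<float> reconstruct(const std::vector<float>& img,
                               const std::vector<bool>& sampled,
                               int w, int h) {
    std::vector<float> out(img.size(), 0.0f);
    for (int y = 0; y < h; ++y) {
        for (int x = 0; x < w; ++x) {
            float sum = 0.0f;
            int   n   = 0;
            for (int dy = -1; dy <= 1; ++dy) {
                for (int dx = -1; dx <= 1; ++dx) {
                    const int nx = x + dx, ny = y + dy;
                    if (nx < 0 || ny < 0 || nx >= w || ny >= h) continue;
                    if (!sampled[ny * w + nx]) continue;  // only trust real samples
                    sum += img[ny * w + nx];
                    ++n;
                }
            }
            out[y * w + x] = n ? sum / n : 0.0f;  // no samples nearby: stays black
        }
    }
    return out;
}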

PowerVR did some great work, but their offering wasn't close to fast enough for real time, unfortunately.
 

Hitman928

Diamond Member
Apr 15, 2012
CakeMonster, do you own Quake 2? RTX is a free download and certainly worth checking out. Your card can cruise over 100 FPS at 1080p with moderate settings.

Imagination Tech, I want to be clear, in no way whatsoever am I bashing their efforts, but they were making ray tracing hardware for mobile. Their part hit about 500 MRays if I recall correctly, extremely impressive for a sub-10-watt 28nm chip for sure, but one tenth of what the 2060 can do and, very importantly, without tensor cores.

If you have a decent NV card and Quake 2, enable RTX and then disable denoising (if you have epilepsy, don't actually do that). You'll see that they are actually doing sparse sampling and then using AI to clean up the image. There are edge cases where this fails (shadows from fans cast on water are rough), but overall you get an idea of how much heavy lifting the 'AI only' portion of the chip is taking off the rest of it.

PowerVR did some great work, but their offering wasn't close to fast enough for real time, unfortunately.

Battlefield V doesn't use tensor cores for denoising, and according to its creator, neither does Quake 2 RTX. I'm not sure about Tomb Raider or Metro Exodus, but my guess is they don't either.
 

CakeMonster

Golden Member
Nov 22, 2012
I actually tried Q2 RTX for like 5 minutes but was bored and underwhelmed. It was quite choppy at 1600p, too. If I'm missing out, I guess I could try it some more, but it doesn't really change my point about not seeing the use for another 2-3 years.