Ray tracing the new "physX"?

biostud

Lifer
Feb 27, 2003
18,248
4,760
136
Obviously they are different things, but they also share some similarities.

The adoption of ray tracing depends on widespread hardware support, significant visual improvements in the games that support it, and lots of games that support it.

With the launch prices, and the pace of hardware adoption in laptops and mid-range gaming machines, it will take years before Turing technology is widespread, so developers will have to weigh whether the extra development cost is worth it.

Will ray tracing even make it to the mid-range cards, and if so, will it be powerful enough to enable in games?

Will there be different levels of ray tracing, just like PhysX?

Will it be worth it visually vs. the fps drop?

Will AMD support ray tracing (or Intel, in the future)?

Or is this simply too big to fail for nvidia?

What do you say, O mighty forum?
 

CuriousMike

Diamond Member
Feb 22, 2001
3,044
543
136
If the visual fidelity quickly blows current rasterization techniques out of the water, it'll have an easier time gaining traction.

It just feels like VR did a couple of years ago: lots of excitement, but also the knowledge that it was expensive and imperfect.

Right now, it's too expensive for mass adoption, and AAA game development studios likely won't dedicate serious resources to its implementation, particularly since it's missing on Sony/MSFT/Nintendo platforms and likely won't arrive on said platforms until after whatever the next gen is.

Codemasters took the hit in terms of investment and put VR in Dirt Rally, and the results were highly regarded.
They didn't take that hit when creating their later title, Dirt 4.
My suspicion is the bean counters saw that there simply wasn't enough consumer support for VR and decided the investment wasn't worth it.

That's my guess for RT in the immediate future - we'll have some studios support it, but not all ... and unless there is mass adoption of the tech by consumers, it'll be a slower grind for actual implementation.
 

philosofool

Senior member
Nov 3, 2008
281
19
81
I'm pretty sure that ray tracing will eventually become a part of video game rendering. It's the holy grail of rendering: the simulation of how real-world visual impressions reach our real-world eyes, not just a neat concept to add realism, as physics simulation was. It's why five- and even ten-year-old Pixar films look better in many respects than games made yesterday. There's a limit to what you can do with rasterization and shaders.

However, I think there are serious questions about the time frame: how soon will real-time ray tracing be a component of many games? Depending on many variables that I can't even list, much less predict the precise effect of, ray tracing may be a cool setting in games released in the next year, or it may be a decade before we see it make its way into AAA titles. Arrival on consoles will be a turning point, but it's hard to say what PC gamers will see in the next few years or how soon console hardware will support it.
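The "simulation of how light reaches our eyes" can be made concrete with a toy example. This is a minimal sketch (my own illustration, not any engine's code) of the core operation a ray tracer performs billions of times per frame: testing a ray against a surface, here a sphere.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    a quadratic in t; 'direction' is assumed to be normalized.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # the quadratic's 'a' is 1 for a unit direction
    if disc < 0.0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

# A primary ray fired down the -z axis toward a unit sphere at z = -5:
t = ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
print(t)  # 4.0: the ray hits the near surface of the sphere
```

A real renderer repeats this per pixel against millions of triangles (via an acceleration structure), then recurses for shadows, reflections, and refraction, which is exactly the work the RT cores are built to accelerate.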
 

crisium

Platinum Member
Aug 19, 2001
2,643
615
136
I believe ray tracing support is more like a new DX or SM level, but adoption will be even slower. New DX and SM support pretty much hits a whole generation at once (aside from low-end rebrands), and even then it took years before old-gen cards lacking support were truly unable to play a game.

With the die size requirements, we are not expected to see RTX on the 2060 and below. It may be many years before we see it on 1050 Ti- or 1030-class GPUs. Combine this with AMD and consoles both being questionable on future support, and you can be assured that games will be built around rasterization first and foremost for many years to come; ray tracing will simply be stapled onto Ultra settings. That being said, I look forward to ray tracing one day being the standard.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
I view it as one of those forward-looking things that people ridicule AMD over. I guess we'll have to wait and see how it plays out in the end. It does seem like the visual effects might be beneficial at times, as long as the performance hit doesn't kill the fps too badly.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,810
7,166
136
Developers have gotten so good at faking good lighting in most games using traditional raster methods that the gap between the fake and real lighting models really isn't large enough (at the moment) to warrant the mountain of silicon slapped on the new RTX series.

Ray tracing, to these eyes at least, is a last-5% kind of problem, where the amount of benefit from getting to that last 5% of visual fidelity has trouble passing cost/benefit muster.

It also faces the somewhat counter-intuitive problem PhysX faces when it is implemented well (in the BFV demo, for example): it's the kind of thing you don't notice when it's on, because it's so natural, and don't really miss when it's off, because that's the way it has always been done.

As always, however, I will applaud NV for really coming out of left field and dedicating silicon to such a specialized task as ray tracing (Physx always felt like a little value add off to the side of the CUDA/Compute business NV was building at the time). NV took a similar approach when Tessellation was the buzzword of the day when DX11 became a thing.
 

Det0x

Golden Member
Sep 11, 2014
1,028
2,953
136
Will AMD support raytracing (or intel in the future)?

Meme removed

There's been a lot of chatter on Twitter from devs about Nvidia's 10 Gigarays/s figure in the past few days, with some of them wondering how/when/where this was achieved, given that there are a redacted ton of ways to "ray trace", and WTH those RT Cores are all about exactly. Sebbi just blessed us with the following:

"Claybook ray-traces at 4.88 Gigarays/s on AMD Vega 64. Primary RT pass. Shadow rays are slightly slower. 1 GB volumetric scene. 4K runs at 60 fps. Runs even faster on my Titan X. And with temporal upsampling even mid tier cards render 4K at almost native quality."

https://twitter.com/SebAaltonen
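To put those Gigarays/s figures in context, here's a quick back-of-envelope calculation (my own arithmetic, not from the tweet): roughly how many rays per pixel per frame a given throughput buys at 4K 60 fps.

```python
# Back-of-envelope: rays per pixel per frame at 4K 60 fps for a given
# Gigarays/s figure. This ignores shading cost, ray divergence, and the
# fact that shadow/secondary rays trace at different rates than primary rays.
width, height, fps = 3840, 2160, 60
pixels_per_second = width * height * fps  # ~498 million pixels drawn per second

for gigarays in (4.88, 10.0):  # Claybook's Vega 64 figure, Nvidia's RTX claim
    rays_per_pixel = gigarays * 1e9 / pixels_per_second
    print(f"{gigarays} Gigarays/s -> {rays_per_pixel:.1f} rays/pixel/frame")
```

So even 10 Gigarays/s leaves a budget of only around 20 rays per pixel per frame at 4K 60, which is why hybrid rendering and denoising/temporal upsampling are part of every current implementation.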

Profanity and memes are not allowed
in tech areas / discussions.

AT Mod Usandthem
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Obviously they are different things, but they also share some similarities.

The adoption of ray tracing depends on widespread hardware support, significant visual improvements in the games that support it, and lots of games that support it.

With the launch prices, and the pace of hardware adoption in laptops and mid-range gaming machines, it will take years before Turing technology is widespread, so developers will have to weigh whether the extra development cost is worth it.

Will ray tracing even make it to the mid-range cards, and if so, will it be powerful enough to enable in games?

Will there be different levels of ray tracing, just like PhysX?

Will it be worth it visually vs. the fps drop?

Will AMD support ray tracing (or Intel, in the future)?

Or is this simply too big to fail for nvidia?

What do you say, O mighty forum?


Not like PhysX at all. This is being incorporated into DirectX. NVidia is much more serious about this than it ever was about PhysX.

It's almost like NVidia is betting the company on Ray Tracing, compared to the minimal amount of resources devoted to PhysX.

I expect a lot more developer pickup as well, since RT is actually an easier way to do high-quality lighting/shadows/reflections/refraction. Adoption in AAA games will be swift, because devs like new tech that can boost visuals for showcasing, even if not everyone can run it.

A year from now I would expect more RT adoption in AAA games than PhysX ever saw in its best year.
 

CuriousMike

Diamond Member
Feb 22, 2001
3,044
543
136
Adoption in AAA games will be swift, because devs like new tech that can boost visuals for showcasing, even if not everyone can run it.

Devs might like the new tech, but it's the bean-counters that have to agree the effort and time will be profitable.
For a few games, that might be the case.
I still suggest that adoption won't be swift until consoles pick it up.
RT will add time and complexity to the dev cycle, including testing a whole 'nother rendering pipeline.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
1,120
260
136
It's definitely a possibility; we take for granted how many technologies failed to catch on even when they're standardized with industry-wide supported APIs like DX!

Not many games today use the supposed "geometry shaders" (Apple doesn't even try to support them in their recent Metal gfx API) or the "stream out" feature from DX10, and "deferred contexts" in DX11 ended up being a total disaster ...

The D3D12 header files once hinted at support for ASTC, but not many IHVs liked it, especially after AMD dropped the feature sometime during the Vega microarchitecture's development (as seen in their ISA docs), so Microsoft decided against standardizing it in DX in the end ...

Considering that there are only 11 or so games confirmed to support ray tracing, and Nvidia has yet to confirm that their entire new lineup will support HW-accelerated ray tracing, it will need to target far more configurations than are currently available to reach even remotely semi-mainstream status ...

Nvidia could seriously risk opening up unintended opportunities for their competitors as well, since Nvidia has gotten "adventurous" with experimenting on large portions of their silicon, so their competitors could see fit to strike at any moment ...
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
I'm expecting deja vu over the use of hardware PhysX in games.

For years hardware PhysX got you minimal additions of minor eye candy like blowing leaves or nicer acid weapon splashes in Borderlands. And in some cases it felt like the game developers removed eye candy from the software path just to make nvidia happy and get co-marketing dollars or other support.

Will ray tracing be different this time and give us significant visual improvements with this first generation of RTX cards? Maybe, but cynical me is deeply skeptical.

I can easily afford a 2080 ti and I wouldn't mind replacing my 980 ti, but I'm not opening my wallet without a lot more convincing.
 

Fir

Senior member
Jan 15, 2010
484
194
116
I have RT demos that are over 10 years old. They look great, but even on a current CPU they don't run fluidly enough to be enjoyable.

Remember in 1999 when the original GeForce 256 came out and they were touting hardware transform and lighting? If you run those original demos (which looked great at the time, nearly two decades ago!) on current GPUs, they run super smooth. I'd say give it a few generations before we have games running at full UHD at 60+ FPS.
 

Guru

Senior member
May 5, 2017
830
361
106
It's a one-off for Nvidia. I believe their 7nm products will return to the GTX naming and not feature the RT cores. Right now Nvidia knows that AMD doesn't have anything until 2019, and they can take advantage of that with a stopgap at 12nm with some new features and test the waters.

I don't see how half-baked, poor implementations of ray tracing can add any actual benefit when you also consider the cost in fps. I feel like we are going to see these "ray-traced" things (the occasional glass, reflection, window, maybe a metallic surface) in like 1% of the game, improving graphics by maybe 10% to 20% over the norm at the cost of, say, 50% of the performance. I don't see it as a real feature.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
It's a one-off for Nvidia. I believe their 7nm products will return to the GTX naming and not feature the RT cores. Right now Nvidia knows that AMD doesn't have anything until 2019, and they can take advantage of that with a stopgap at 12nm with some new features and test the waters.

That is utterly preposterous, though it is starting to look like a pattern of your pronouncements.

NVidia won't be walking this back; this is a goal they have been working towards for MANY years.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
It's worse than PhysX. You could run PhysX on most Nvidia cards that were available at the time, and if you had an old card, you could use it as a PhysX processor. This ray tracing stuff seems to have a huge performance hit and Nvidia wants about 5 people on the planet to be able to afford it. It may be the future, but I don't think it will be Nvidia proprietary or RTX. I give it 10 years before it becomes any kind of actual thing. Seriously.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,746
741
136
I am expecting the next Xbox to fully support DXR (but not RTX); Microsoft will see it as a way to outdo Sony. I am not expecting an Nvidia GPU in the next Xbox, though.
 

JeffMD

Platinum Member
Feb 15, 2002
2,026
19
81
Developers have gotten so good at faking good lighting in most games using traditional raster methods that the gap between the fake and real lighting models really isn't large enough (at the moment) to warrant the mountain of silicon slapped on the new RTX series.


No... no they have not. Reflection shaders have grown stagnant and pretty much always revolve around screen-space reflections, which give the same old-hat results. SSR never shows what has been culled, and if there is anything 3D engines strive to do today, it is to cull every damn thing the user does not see for the sake of performance. SSR results are also, by default, not subject to normal AA; they need AA of their own, if supported.
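The limitation described above can be sketched in a few lines. This toy 1D screen-space march (illustrative only, not any real engine's implementation; all names are mine) shows why SSR returns nothing the moment a reflected ray leaves the screen: the depth buffer simply contains no data for off-screen or culled geometry.

```python
def ssr_march(depth_buffer, start_x, step, ray_depth, depth_step, max_steps=64):
    """March a reflected ray across the screen; return the hit pixel or None."""
    x, d = start_x, ray_depth
    for _ in range(max_steps):
        x += step
        d += depth_step
        if not 0 <= x < len(depth_buffer):
            return None  # ray left the screen: no stored data, SSR gives up
        if depth_buffer[int(x)] <= d:
            return int(x)  # ray passed behind a stored surface: reflect that pixel
    return None

# A 1D "depth buffer": far wall at depth 5.0, a near object at the right edge.
screen = [5.0] * 8 + [2.0, 2.0]
print(ssr_march(screen, start_x=5, step=1, ray_depth=1.0, depth_step=0.3))  # hits pixel 9
print(ssr_march(screen, start_x=8, step=1, ray_depth=1.0, depth_step=0.1))  # exits screen: None
```

A ray-traced reflection, by contrast, queries the full scene geometry rather than a screen-sized buffer, so culled and off-screen objects still show up in reflections.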
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,810
7,166
136
No... no they have not. Reflection shaders have grown stagnant and pretty much always revolve around screen-space reflections, which give the same old-hat results. SSR never shows what has been culled, and if there is anything 3D engines strive to do today, it is to cull every damn thing the user does not see for the sake of performance. SSR results are also, by default, not subject to normal AA; they need AA of their own, if supported.

That's fair.

Nevertheless, in a fluid, fast-paced game environment, I'm not sure I'd really notice. It was not always obvious what exactly RTX was doing in many of the game demos that NV displayed at their event. Many times the lighting merely looked different, not necessarily objectively better (even if the RTX lighting was objectively more true to life).

I wonder if, instead of merely accepting all comers, NV should curate the launch titles they work with to really emphasize the difference between rasterization and ray tracing, rather than just going for quantity of titles. Maybe even focus on slower-paced games, where the player has more time to appreciate the flair this early stage of ray tracing has to offer.
 

DigDog

Lifer
Jun 3, 2011
13,493
2,120
126
Y'all here need to go watch some ray-tracing videos. It's nothing like traditional rendering; it's very near photorealism.
 

Borealis7

Platinum Member
Oct 19, 2006
2,914
205
106
I'd say it's more like tessellation than PhysX: a totally optional graphical effect which does not affect gameplay and could potentially reduce frame rates dramatically while providing a marginal visual improvement. (Crysis 2, anyone?)
PhysX, at least, *could have been* used for something more than just visual effects.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Y'all here need to go watch some ray-tracing videos. It's nothing like traditional rendering; it's very near photorealism.

No doubt it can be. But realistically, it is currently only useful for non-real-time rendering. For real-time rendering it will not be fast enough, and it will be implemented in gimmicky, piecemeal fashion.

I want something faster than a 1080 Ti, but really this generation is a test generation. It's not a real die shrink; it's just a source of funding for the real next-gen GPU, which might take another year to come out.

The only reason this will sell is that we are so hungry for something new, having been starved for 2+ years. I blame AMD for not making something competitive for half a decade.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
That's fair.

Nevertheless, in a fluid, fast-paced game environment, I'm not sure I'd really notice. It was not always obvious what exactly RTX was doing in many of the game demos that NV displayed at their event. Many times the lighting merely looked different, not necessarily objectively better (even if the RTX lighting was objectively more true to life).

I wonder if, instead of merely accepting all comers, NV should curate the launch titles they work with to really emphasize the difference between rasterization and ray tracing, rather than just going for quantity of titles. Maybe even focus on slower-paced games, where the player has more time to appreciate the flair this early stage of ray tracing has to offer.


Gaming graphics is an art form. I mean, yes, realism is a noble goal. But ultimately, unless true full photorealism is the intended result, it will be a form of art.

As an example, Bioshock IMHO is a better-looking game than most of these high-tech games with performance-killing silly features like PCSS or whatever. Creating a sense of immersion does not depend so much on these piecemeal baby steps towards realism as it does on the art style and direction.

Either this needs to go all the way to photorealism, or we just need to admit that this is art. Furthermore, why do many people choose to watch animated content over filmed content? And why is so much filmed content graphically "enhanced" anyway? It's a noble goal, no doubt, but is it really what we want? If so, at what cost?
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Developers have gotten so good at faking good lighting in most games using traditional raster methods that the gap between the fake and real lighting models really isn't large enough (at the moment) to warrant the mountain of silicon slapped on the new RTX series.

Ray tracing, to these eyes at least, is a last-5% kind of problem, where the amount of benefit from getting to that last 5% of visual fidelity has trouble passing cost/benefit muster.

It also faces the somewhat counter-intuitive problem PhysX faces when it is implemented well (in the BFV demo, for example): it's the kind of thing you don't notice when it's on, because it's so natural, and don't really miss when it's off, because that's the way it has always been done.

As always, however, I will applaud NV for really coming out of left field and dedicating silicon to such a specialized task as ray tracing (Physx always felt like a little value add off to the side of the CUDA/Compute business NV was building at the time). NV took a similar approach when Tessellation was the buzzword of the day when DX11 became a thing.


Totally agree. Maybe AMD could stop chasing these last-5% features and just focus on pushing 4K 120fps. That's what I would buy. Get into a different niche. Let Nvidia spend resources on developing this stuff. AMD needs to go bread-and-butter to save itself.
 

DigDog

Lifer
Jun 3, 2011
13,493
2,120
126
Bear with me through this convoluted reasoning:

Back in the early 2000s I was playing mostly Neverwinter Nights. Graphically it was all right, but more importantly, you could load custom scenarios into your game. These were made with a toolkit, so it was mostly just the dialogue that changed between scenarios, and obviously the various assets were moved around, but that's it: no custom models of any sort.

Now, the results can only be described as "the product of a diseased mind".

They were brilliant tho. The same applied to custom scenarios people designed when playing p&p D&D, as opposed to the official scenarios.

When given creative freedom, people write things which are INSANE compared to what paid professionals do, no matter the industry.


This is because, no matter what, the spectre of PG-13 hovers over any production with a sensible budget. Even games like God of War or The Witcher that got titties & gore are too afraid of having writing that is "in bad taste".

And modern graphics are expensive. Always.

So I have learned to associate good graphics with bad writing, and generally this holds true. If you want a game that breaks the rules of conformism (be it in writing or in design), you need to look to the indie market. Big "Hollywood" productions just cannot risk dropping millions on art and then doing badly, or even not getting carried by a national retailer or two.

As for ray tracing, I do love the way it looks. You get less artistic control as a tradeoff for mo...FAR more realism. And who knows what people will be able to achieve using it. Photorealism isn't the peak.

Remember that every technology... more or less... sucks at first. Fuel injection was inferior to carburetors. Disposable blades would rip open your whole face. NON-SLICED BREAD!!
You just need patience.

*cough* early adopters *cough*
 
  • Like
Reactions: sxr7171

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
That's fair.
Nevertheless, in a fluid, fast-paced game environment, I'm not sure I'd really notice. It was not always obvious what exactly RTX was doing in many of the game demos that NV displayed at their event. Many times the lighting merely looked different, not necessarily objectively better (even if the RTX lighting was objectively more true to life).

It's quite early days. The BFV presentation made it clearest, showing RTX on/off in many shots and explaining the limitations of the old techniques and how RT "just works".
https://www.youtube.com/watch?v=WoQr0k2IA9A

At some point in the next couple of years, I expect RT reflections and shadows will commonly be part of "Ultra" settings, and if you don't have an RT-capable card, you won't get "Ultra". I also expect developers will start doing less work on faking high-quality shadows/reflections at the lower quality tiers, because the old way is more work for less quality.

Developers are all going to want RT to succeed, because it is higher quality for less dev work.

RT is inevitable.
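That "no RT card, no Ultra" gating could look something like this sketch (all tier and technique names here are hypothetical, purely to illustrate the idea):

```python
# Hypothetical settings-tier gate of the kind described above: "Ultra"
# shadows require RT-capable hardware; otherwise the game quietly serves
# the best raster technique instead. All names are illustrative.
def shadow_technique(tier, device_supports_rt):
    if tier == "ultra":
        if device_supports_rt:
            return "ray_traced_shadows"
        tier = "high"  # no RT hardware: demote from Ultra to the raster path
    return {
        "high": "cascaded_shadow_maps",
        "medium": "pcf_shadow_maps",
        "low": "blob_shadows",
    }[tier]

print(shadow_technique("ultra", True))   # ray_traced_shadows
print(shadow_technique("ultra", False))  # cascaded_shadow_maps
```

The second call is the point: the same "Ultra" request silently becomes the raster fallback on non-RT hardware, which is how the tiering described above would play out in practice.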