Ray tracing the new "physX"?


ZeroRift

Member
Apr 13, 2005
Though you won't see an abandonment of "traditional" rendering pathways soon, I suspect we will quickly see them given less polish, in favor of using RT pathways to showcase a game's best visual presentation.

Yep - While I still expect the occasional feature or optimization to be released in the next 5-10 years, rasterization techniques will gradually begin to stagnate. RT will become a focal point for gaming benchmarks very shortly after the hardware becomes available, which will push game companies to shift attention away from rasterization.
 

jpiniero

Lifer
Oct 1, 2010
However, I also don't predict any sort of abandonment of "traditional" rendering pathways happening until every sub ~$300 GPU has reasonable RT performance.

The consoles are the bigger factor IMO. I don't know if Sony or MS will push to include RT acceleration in their next-gen consoles.
 

ZeroRift

Member
Apr 13, 2005
The consoles are the bigger factor IMO. I don't know if Sony or MS will push to include RT acceleration in their next-gen consoles.
I'm not sure that consoles have ever been a driver for enhancing rasterization / pushing new features. Instead, devs seem to prefer using low level coding and specialized paths to produce acceptable results on anemic hardware.

By "abandonment" I was thinking more along the lines of complete cessation of development rather than total deprecation.

I think you're completely right about consoles being the limiting factor for the complete removal of rasterization code paths, at least using history to predict the future. But I'm also not certain that total removal of rasterization will ever occur. After all, 2D sprite-based renders are still popular to this day.

IMO, rasterization will continue to exist in some form for the foreseeable future, even if hardware support is completely dropped (is that even possible?) and it's relegated purely to software emulation.
 

PeterScott

Platinum Member
Jul 7, 2017
RT incurs way too much of a performance hit even on the $1,200 2080TI. When the top tier GPU can push 100fps+ @ 1440p or better with RT it will be a viable technology and one that would get me to pay $$$ for a new cutting edge GPU. As is, even the 2080TI can't effectively implement RT so it's a non-factor in my purchasing decisions. 1st gen 7nm may make RT something that matters or it may take 2nd gen 7nm GPUs to get there, only time will tell.

It's really too soon to call the performance hit. DICE was working on Volta cards, and only got the RTX cards with real RT HW, 2 weeks before the show demo. There is a lot of room for optimization.

I also think 60 FPS will be fine for the best uses of RT, which won't be competitive FPS titles, but more in line with Adventure/RPG games where you have time to stop and admire the scenery.
 

crisium

Platinum Member
Aug 19, 2001
Without console support, it will take some time for RT to be more than a stapled-on PC Ultra feature like Gameworks HBAO+ is.

I'm not saying RT won't have a greater impact than HBAO+, but that's what it will be in the near future. It'll be its own row in the settings menu, or it will be a part of "Ultra" shadows/lighting. Every AAA game will have full and complete rasterization lighting for many years to come, and many more if the next consoles don't have RT.

What will come sooner is games that still have a rasterization path in the options, but where RT becomes more than just a stapled-on Ultra setting. There will be a mode where all lighting is built around it, and all rasterization is disabled in that mode. This could still happen with RT-less consoles, if Nvidia is aggressive enough pushing it.
 

ZeroRift

Member
Apr 13, 2005
Which is exactly the problem. Devs aren't going to push RT that hard if it's not doable on the consoles.
That's a reasonable position.

However, tessellation was still used selectively in games shortly after its release even though the current-gen consoles did not support it. Considerable optimization was done on the latest-and-greatest PC platforms, which then made it easier for consoles to adopt it.

I don't think that one set of hardware necessarily precludes development of features on another set of hardware.

Is there less incentive for devs to focus on RT due to limited hardware support?
Certainly.

Will the industry refuse to adopt the technology because the hardware is not yet available?
I don't think so.

The reasons I think ray tracing will gain industry traction are manifold:
1) Low overall cost of implementation (the feature set already exists, it just doesn't have fixed-function hardware acceleration)
2) Superior image quality
3) It's in vogue, thanks in part to NVIDIA's marketing and in part to its reputation from offline-rendered images featured in prominent media (superhero movies, etc.)
4) It has widespread industry adoption for offline renders
5) In its initial (first decade?) implementation, RT will complement rasterization rather than replace it, which permits devs to still re-use the majority of their code on consoles.

Who knows, you might even see something wild like the old N64 Expansion Pak that could add RT hardware acceleration support to current-gen consoles. The future is a crazy place. ;)
 

Midwayman

Diamond Member
Jan 28, 2000
On the plus side of consoles supporting RT:
-Consoles are perfectly happy with low framerates. 30fps is considered 'enough'.
-Consoles often run games substantially below screen resolution. 1080p wouldn't be much of a handicap on a console.
-Consoles currently run low-medium settings. The jump to RT is an opportunity for a more substantial quality jump than a PC user would see.
-Consoles could just kill the raster path if they wanted and dedicate all silicon to RT.

Would an RT-only chip on 7nm be able to hit 30fps cheaply enough?
 

Thala

Golden Member
Nov 12, 2014
RT incurs way too much of a performance hit even on the $1,200 2080TI. When the top tier GPU can push 100fps+ @ 1440p or better with RT it will be a viable technology and one that would get me to pay $$$ for a new cutting edge GPU.

Who are you to decide what target framerate is viable? People are accepting sub-60fps minimum framerates in many cases when going to 4K, even with the 1080ti - still, people are choosing to run games at 4K today. And it is only a resolution bump without higher-quality pixels - you just get higher-resolution fake shadows, reflections and refractions.

And now we are getting a real image quality increase, and not just the seasonal resolution increases of the last few years - and people are complaining about only 60 fps.
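For perspective, the 4K comparison can be put in rough numbers. A minimal back-of-envelope sketch (the resolutions and refresh rates are the ones mentioned in this thread, not benchmark data):

```python
# Raw pixel throughput for the display modes discussed in this thread.
def pixels_per_second(width, height, fps):
    """Pixels a GPU must shade per second at a given resolution and framerate."""
    return width * height * fps

modes = {
    "1440p @ 100 fps": pixels_per_second(2560, 1440, 100),
    "4K @ 60 fps": pixels_per_second(3840, 2160, 60),
}
for name, pps in modes.items():
    print(f"{name}: {pps / 1e6:.0f} Mpixels/sec")

# 4K @ 60 pushes roughly 35% more pixels per second than 1440p @ 100,
# so accepting 60 fps at 4K already means accepting a heavier per-frame load.
```

In other words, by this crude measure, the 4K-at-60fps crowd is already tolerating a larger raw workload than a 1440p/100fps target implies.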
 

ozzy702

Golden Member
Nov 1, 2011
Who are you to decide what target framerate is viable? People are accepting sub-60fps minimum framerates in many cases when going to 4K, even with the 1080ti - still, people are choosing to run games at 4K today. And it is only a resolution bump without higher-quality pixels - you just get higher-resolution fake shadows, reflections and refractions.

And now we are getting a real image quality increase, and not just the seasonal resolution increases of the last few years - and people are complaining about only 60 fps.

Can't pull 60fps @1080p with a $1,200 card but you're free to pay the early adopter tax. The 2070 and 2080 will be even weaker in RT titles.

I'll have no problem dropping coin on hardware but I'll hold off on replacing my current 1080 until 7nm and playable frame rates at more than peasant resolutions.
 

Thala

Golden Member
Nov 12, 2014
Can't pull 60fps @1080p with a $1,200 card but you're free to pay the early adopter tax. The 2070 and 2080 will be even weaker in RT titles.

I'll have no problem dropping coin on hardware but I'll hold off on replacing my current 1080 until 7nm and playable frame rates at more than peasant resolutions.

I could not care less what you are doing; however, I was opposing your statement that anything below 1440p and 100fps is not viable.
In addition, there was a clear statement from the Battlefield developers that the sub-60 fps was a result of the short time they had with the card, and that they have clear improvements planned.
 

LTC8K6

Lifer
Mar 10, 2004
Can't pull 60fps @1080p with a $1,200 card but you're free to pay the early adopter tax. The 2070 and 2080 will be even weaker in RT titles.

I'll have no problem dropping coin on hardware but I'll hold off on replacing my current 1080 until 7nm and playable frame rates at more than peasant resolutions.
I predict that when reviews come out and drivers improve, you won't hold off until 7nm. :D
 

Konan

Senior member
Jul 28, 2017
Interesting stuff -
Both video games and film recently migrated from empirical models to physically-based shading [McAuley and Hill 2016], but the simplified light transport available in rasterization continues to push developers to consider ray tracing for accurate shadows and reflections, and multi-bounce global illumination. But current ray tracing performance is limited to around 200–300 Mrays/sec [Binder and Keller 2016; Wald et al. 2014], giving just a few rays per pixel at 1920×1080 and 30 Hz. This number is even lower for production usage with dynamic acceleration structures, large scenes, and variable CPU/GPU performance. Therefore, with multiple rays per path and the trends towards higher resolutions and refresh rates, practical performance is not likely to exceed one path per pixel for the foreseeable future. By developing a reconstruction filter that respects this constraint, we aim to make real-time path tracing a reality much sooner.

http://cg.ivd.kit.edu/publications/2017/svgf/svgf_preprint.pdf

Video: Spatiotemporal Variance-Guided Filtering: Real Time Reconstruction for Path-Traced Global Illumination
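The paper's "few rays per pixel" claim can be sanity-checked with simple arithmetic. A sketch (using 250 Mrays/sec, an assumed midpoint of the quoted 200–300 range):

```python
# Back-of-envelope ray budget from the quoted ~200-300 Mrays/sec throughput.
def rays_per_pixel(rays_per_sec, width, height, hz):
    """Rays available per pixel per frame at a given ray throughput."""
    return rays_per_sec / (width * height * hz)

# 1920x1080 @ 30 Hz with ~250 Mrays/sec: about 4 rays per pixel per frame,
# matching the paper's "just a few rays per pixel".
budget_1080p30 = rays_per_pixel(250e6, 1920, 1080, 30)
print(f"1080p @ 30 Hz: {budget_1080p30:.1f} rays/pixel")

# Higher resolutions and refresh rates shrink the budget further,
# which is why the paper targets roughly one path per pixel.
budget_4k60 = rays_per_pixel(250e6, 3840, 2160, 60)
print(f"4K @ 60 Hz:    {budget_4k60:.2f} rays/pixel")
```

With multiple rays needed per path, that 4K/60 budget of roughly half a ray per pixel is what motivates the reconstruction-filter approach in the linked paper.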
 

PeterScott

Platinum Member
Jul 7, 2017

That has been expected since the first leaks. It's almost the only way it could go.

RT and Tensor cores use a large amount of die space, and even on the ridiculously large 754mm2 2080Ti die, it just gives acceptable frame rates at 1080p.

How could this possibly scale to smaller dies, much less than half that size? It can't.

I wouldn't suggest anyone consider a 2070 for ray tracing, and even the 2080 might be questionable. If you are an early adopter genuinely interested in ray tracing, then the 2080Ti is really your only safe bet.

I expect we have essentially seen all the RTX 2000 series cards, and it won't be until the 3000 series that it migrates to the x60 series cards.
 

ozzy702

Golden Member
Nov 1, 2011
This pretty much seals it. Cutting it out of the x60 and lower tiers pretty much guarantees this will see little to no adoption. They need to make this viable on an x60 card to push it forward.

Agreed and that won't happen until at least first gen 7nm, if not 2nd gen 7nm. Really, the $250 and under price point is what's needed for technology to see widespread adoption. No console support plus no support at the $250 and under GPU price point means that RT will be fringe for several years to come. I look forward to it becoming standard.

Maybe NVIDIA can build a cheap RT standalone die to handle things?
 

DaveSimmons

Elite Member
Aug 12, 2001
Agreed and that won't happen until at least first gen 7nm, if not 2nd gen 7nm. Really, the $250 and under price point is what's needed for technology to see widespread adoption. No console support plus no support at the $250 and under GPU price point means that RT will be fringe for several years to come. I look forward to it becoming standard.

Maybe NVIDIA can build a cheap RT standalone die to handle things?

Hardware PhysX 2.0 confirmed, and the PhysX-only cards didn't sell. The few who cared about it mostly used their old GPU for it after upgrading to a newer one, but that won't work for RT.

Perhaps RT will be worth caring about in 2020.
 

Headfoot

Diamond Member
Feb 28, 2008
I'm not sure that consoles have ever been a driver for enhancing rasterization / pushing new features. Instead, devs seem to prefer using low level coding and specialized paths to produce acceptable results on anemic hardware.

That is not really correct. The consoles (that is, the game developers who ship games to consoles) have absolutely been a driver of enhanced rasterization techniques. What do you think low-level coding and specialized paths are? In some respects the consoles have led the PC here, in that their silicon is designed to meet the needs of the games (less so today). Back in the day, some game carts even had special hardware to run certain routines for the graphics pipeline. If that's not pushing new features, I don't know what is.
 

PeterScott

Platinum Member
Jul 7, 2017
Agreed and that won't happen until at least first gen 7nm, if not 2nd gen 7nm. Really, the $250 and under price point is what's needed for technology to see widespread adoption. No console support plus no support at the $250 and under GPU price point means that RT will be fringe for several years to come. I look forward to it becoming standard.

It's only common sense that generation 1 isn't going to take off and become widespread. But it will be enough to get software showcasing RT seeded. How many people do you think bought Voodoo 1 cards? How many 3D games used it during year one?

Maybe NVIDIA can build a cheap RT standalone die to handle things?

Not likely. Maybe in the far future, when MCMs are a common thing with full bandwidth compared to single-chip modules.

There definitely won't be an add-in card for RT. RT is too involved in the rendering pipeline, so you need full-bandwidth, low-latency access to the data processed on the main chip.
 

Kenmitch

Diamond Member
Oct 10, 1999
I don't think PhysX ever had 11 games at launch, with more on the way... Support is going to be far better this time because NVIDIA is heavily pushing it...

Support is one thing and performance is another. I'm looking forward to reviews showing the trade-offs of performance vs. visuals when using the feature to its full potential. Spending the $'s on a 2080Ti and having to fall back to 1080p to get respectable framerates doesn't sound like what end users really want.
 

DooKey

Golden Member
Nov 9, 2005
This push for ray tracing is all about mindshare. NVIDIA is capitalizing once again on the halo effect and it will gain further mindshare for them. Most gamers aren't whining forum denizens and will pay attention to the latest and greatest stuff they see in games. I believe the average gamer is going to say this is neat and they can't wait till they can afford a card that will run it for them. They never buy halo products, but they sure do like what they can do. JMHO.