When Will Ray Tracing Replace Rasterization?

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Good quick overview of the state of raytracing in games.
Quite a few hybrid rendering engines already exist, but they aren't targeted at games because GPUs aren't capable enough yet.

 

Scali

Banned
Dec 3, 2004
2,495
0
0
I don't think raytracing will ever replace rasterization.
A hybrid simply strikes a better balance between speed and quality than a pure raytracer ever will.
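Roughly, the hybrid idea looks something like the sketch below: rasterize primary visibility as you already do today, and spend rays only on the pixels where they actually buy image quality. Everything here (the G-buffer layout, the reflectivity threshold, the TraceReflection stub) is made up purely for illustration.

```cpp
// Minimal hybrid-rendering sketch: the rasterizer fills a G-buffer, and
// secondary rays are traced only for pixels that are shiny enough to benefit.
#include <vector>
#include <cstdio>

struct GBufferPixel {
    float reflectivity;   // written by the rasterization pass
    // ... position, normal, albedo would live here too
};

// Stand-in for a (much more expensive) recursive ray trace of one pixel.
float TraceReflection(const GBufferPixel& px) { return px.reflectivity * 0.5f; }

int main() {
    // Pretend the rasterizer already filled a tiny 4-pixel G-buffer.
    std::vector<GBufferPixel> gbuffer = { {0.0f}, {0.9f}, {0.1f}, {0.8f} };

    int raysSpent = 0;
    for (const auto& px : gbuffer) {
        // Only fairly shiny pixels get a secondary ray; everything else keeps
        // the cheap rasterized shading.
        if (px.reflectivity > 0.5f) {
            TraceReflection(px);
            ++raysSpent;
        }
    }
    std::printf("traced %d of %zu pixels\n", raysSpent, gbuffer.size());
}
```

The point is that the expensive rays become an opt-in, per-pixel cost on top of rasterization instead of the foundation of the whole renderer.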

At the very least we shouldn't expect raytracing in games until Hollywood abandons its rasterizers for full raytracing, which is probably still years off, if it ever happens at all.

Pixar/RenderMan has proven time and time again that you don't need raytracing for photo-realistic quality images. It's time people got a clue about it, and killed the raytracing myth.

Besides, the biggest obstacle for raytracing in realtime/game applications has yet to be solved: dynamic geometry.
With current raytracing approaches you simply cannot have efficient skinned meshes and similar mesh-based animations in realtime.
Realtime raytracing exists by the grace of pre-generated acceleration structures, which can't be used when your geometry is dynamic.
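To make that concrete, here is a very rough sketch of the problem; the "acceleration structure" is just a bounding box standing in for a real BVH or kd-tree, and the "skinning" is faked, purely for illustration.

```cpp
// Sketch: the acceleration structure is built from vertex positions, so any
// skinning/deformation makes last frame's structure stale and forces a rebuild
// before a single ray can be traced.
#include <vector>
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

struct Bounds {
    Vec3 min{ 1e30f,  1e30f,  1e30f};
    Vec3 max{-1e30f, -1e30f, -1e30f};
};

// Stand-in for building a BVH: a real tracer walks every triangle here and
// the cost is far larger than computing one bounding box.
Bounds BuildAcceleration(const std::vector<Vec3>& verts) {
    Bounds b;
    for (const Vec3& v : verts) {
        b.min = { std::min(b.min.x, v.x), std::min(b.min.y, v.y), std::min(b.min.z, v.z) };
        b.max = { std::max(b.max.x, v.x), std::max(b.max.y, v.y), std::max(b.max.z, v.z) };
    }
    return b;
}

int main() {
    std::vector<Vec3> mesh = { {0,0,0}, {1,0,0}, {0,1,0} };

    for (int frame = 0; frame < 3; ++frame) {
        // Skinned animation: vertex positions change every frame...
        for (Vec3& v : mesh) v.y += 0.1f * std::sin(0.5f * frame);

        // ...so the structure built last frame is invalid and must be rebuilt.
        Bounds b = BuildAcceleration(mesh);
        std::printf("frame %d: rebuilt bounds, max.y = %.2f\n", frame, b.max.y);
    }
}
```

A static scene can build that structure once and reuse it forever; a skinned character invalidates it every single frame, and the rebuild cost comes straight out of the frame budget.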
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
It would be nice if someone took the time to do a full article showing what ray tracing does on its own, without the help of radiosity and photon mapping, so people could start seeing the rather huge downsides of ray tracing versus rasterization. The staggering amount of aliasing, combined with hard shadows everywhere and extremely poor diffuse lighting on every surface, isn't something I consider a very good trade-off.
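To put a picture on the hard-shadow complaint, here is a tiny sketch of what classic Whitted-style shadowing does; the one-sphere scene and all the numbers are invented for illustration. Each shaded point fires a single shadow ray at a point light and gets a strict yes/no back, so neighbouring points flip between fully lit and fully dark with no penumbra unless you pay for area lights and many more rays.

```cpp
// Sketch: one shadow ray per shaded point against a point light gives a
// binary lit/unlit answer, i.e. hard shadows.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 operator-(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
float Dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Does the segment from p toward the light hit the occluding sphere?
bool ShadowRayBlocked(Vec3 p, Vec3 light, Vec3 center, float radius) {
    Vec3 d  = light - p;                // ray direction (unnormalized)
    Vec3 oc = p - center;
    float a = Dot(d, d);
    float b = 2.0f * Dot(oc, d);
    float c = Dot(oc, oc) - radius * radius;
    float disc = b*b - 4*a*c;
    if (disc < 0) return false;
    float t = (-b - std::sqrt(disc)) / (2*a);
    return t > 0.0f && t < 1.0f;        // hit between the point and the light
}

int main() {
    Vec3 light{0, 10, 0}, occluder{0, 5, 0};
    // Two nearby surface points: one ends up fully dark, the other fully lit,
    // with nothing in between.
    Vec3 pA{0.0f, 0, 0}, pB{2.5f, 0, 0};
    std::printf("point A lit: %d, point B lit: %d\n",
                !ShadowRayBlocked(pA, light, occluder, 1.0f),
                !ShadowRayBlocked(pB, light, occluder, 1.0f));
}
```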

Fire up an actual game and see how often you are looking at reflections. Is making those look a reasonable amount better really worth making everything else in the game look significantly worse and run slower? Even if performance were identical, which it isn't even remotely, I still don't see why ray tracing is worth getting excited over in a real-time setting. Until we can at least get some of the functionality of radiosity going alongside ray tracing, it seems like a plain bad choice.

Intel and nV both seem to be working to get their parts headed in this direction currently. I think it is a bad choice, and for nV in particular a direction contrary to what they have been pushing for the last couple of years. Huge increases in physics load are also going to create huge amounts of dynamic geometry, which is going to make any ray tracer fall down.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
Could Nvidia be looking at raytracing as a way of hedging their bets against Intel's Larrabee? From the few articles I have read on raytracing, it is not the be-all and end-all of graphics. There are a lot of issues it presents that may not be solvable.

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Originally posted by: Genx87
Could Nvidia be looking at raytracing as a way of hedging their bets against Intel's Larrabee?

I think of this as kind of a circular question, and a lot of it is conjecture on my part (not saying I'm right, but it's the impression I get from what info I can scrounge up).

Around the time that Intel and AMD were starting to push the idea that GPUs were just going to end up as functional units on CPUs, nVidia was starting to find the idea of GPGPU very appealing, as it could allow them a significant increase in revenue without having to enter the CPU market directly. They started having their R&D team work on GPGPU designs and on how to evolve their architectures in ways that wouldn't hurt them against ATi until they reached the level they wanted to be at (ATi was still a separate entity at this point).

With the G80 we started seeing this idea pay dividends, albeit for a very small market segment to start with, mainly folders and video encoders, with some moderate uses for HPC available. GT2x0 pushed this quite a bit further on the HPC side, adding close to complete IEEE double-precision support, which opened up a great deal of possibilities; only this happened right in time for a global economic meltdown. Because they still had themselves positioned to be competitive on the graphics side they obviously didn't pay the price in lost marketshare, but their margins did take a bit of a hit.

At some point during that evolutionary process Intel started to make noise about Larrabee. Intel clearly saw the potential issue with GPGPU: their highest-margin market is HPC, and it's one they aren't likely to want to surrender. To make matters worse, the possibility of nV getting enough general purpose computing power out of their GPUs could put Intel in the position of having their higher-end CPUs marginalized even for desktop usage. How warranted that concern is long term remains to be seen, but it is certainly the direction that nVidia is taking at the moment. How would the market look if a $19 Via CPU paired with a powerful GPGPU offered end users the same experience as a $500 Intel CPU? Obviously we are a long way off from that being a reality, but the parts we see shipping today are ones that hit the design phase years ago.

So Intel decides they need to come up with a part to try and counteract nV's push into their market. Realistically, nV's main strength is that they pretty much own the high-end graphics segment from the high-end gamer up (close to 70% marketshare), and Intel stands almost no chance of catching nV at their own game (just as nV has no chance, even if they secured a license, of catching Intel's performance on x86). So Intel comes up with the idea of making a GPGPU part flexible enough to enable a ray trace render engine, which would totally negate nV's extensive lead in R&D (not to mention IP) in rasterizing. Given the level of flexibility that nV has with its upcoming designs, it takes little effort for them to make a very competitive ray tracing solution of their own to combat Intel. I honestly wouldn't be shocked if nV offered significantly better ray tracing performance than Intel right out of the gate.

I don't see a lot of industry support for the new rendering technique either way, but it is possible that Intel will use its considerable financial backing to push developers into adopting it, so they can at least take over the portion of the market that nV doesn't already have. I can realistically see them having Abrash work up an engine and then license it for next to nothing (or even nothing) to help promote the new rendering technique, and then use pricing to battle nV.

nV's specialty is graphics, and they aren't dealing with the type of legacy architecture for general computing that Intel is, so I wouldn't be shocked in the least if they end up faster as a ray tracer than Larrabee from day one, and likely by a decent amount. On the flip side of that, I think Larrabee is going to prove more flexible as a GPGPU from day one, by a reasonable amount too, and it's likely to hold that advantage for quite some time. While for the people in this forum, myself included, gaming performance is going to be the most important factor in either of these new parts, I think that both nV and Intel are going to be far more concerned with the level of GPGPU performance they can extract from their parts, as long term that is likely to be the larger factor in success in this segment. Yes, gamers are going to be more interested in gaming performance, and that will not change at any point really, but the idea of nV being able to push at least a $150 part into almost every new PC sold is far too lucrative for them to ignore.

As for the other party in this race, it really seems to me that AMD stayed focused on their original idea of getting GPUs small enough to put on die with their CPUs. I can honestly see this working out well for them from a business perspective. It seems that if they can manage to pull this off, the notebook and netbook market segments would be very interested in making use of these parts, and that is a very rapidly growing segment. It may not be the highest-margin market, but it also would be a bit safer in terms of the amount of R&D required.

I could be wrong on all of this, but it seems to me to be where everyone is heading.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
I still want to see how they plan on doing AA in real-time with ray tracing without an exponential performance hit.
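For what it's worth, the usual answer is supersampling: fire several primary rays per pixel and average them, so the cost grows directly with the sample grid rather than staying flat. A back-of-the-envelope sketch, with the resolution and sample counts chosen purely for illustration:

```cpp
// Sketch: n x n supersampled AA multiplies the primary ray count per frame.
#include <cstdio>

int main() {
    const long long width = 1920, height = 1080;
    const int sampleGrids[] = {1, 2, 4};          // n x n samples per pixel
    for (int n : sampleGrids) {
        long long primaryRays = width * height * n * n;
        std::printf("%dx%d samples/pixel -> %lld primary rays per frame\n",
                    n, n, primaryRays);
    }
    // Every one of those primary rays can also spawn shadow/reflection rays,
    // so the multiplier applies to the whole recursive ray tree, not just the
    // first hit.
}
```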
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Ray-tracing will take off when we start dealing with micro-polygons (multiple polys smaller than a pixel) in terms of detail. Right now, bump-mapping and pixel shaders have pushed ray-tracing off almost indefinitely, so we no longer need obscenely high-poly scenes to achieve high detail. Aliasing would be horrible if we got to that level of detail with polygons though.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Originally posted by: Fox5
Ray-tracing will take off when we start dealing with micro-polygons (multiple polys smaller than a pixel) in terms of detail. Right now, bump-mapping and pixel shaders have pushed ray-tracing off almost indefinitely, so we no longer need obscenely high-poly scenes to achieve high detail. Aliasing would be horrible if we got to that level of detail with polygons though.

Pixar's RenderMan actually renders with micropolygons, and it's a REYES renderer, which is much closer to a rasterizer than a raytracer is.
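For anyone wondering what that means in practice, here is a loose sketch of the REYES dicing idea; the patch size, shading rate, and "shader" are all invented for illustration. The surface gets diced into micropolygons at roughly one per pixel, the grid vertices get shaded, and the tiny quads get sample-tested against the pixels they cover, with no ray-scene intersection anywhere.

```cpp
// Sketch of REYES-style dicing: split a patch into micropolygons no larger
// than about one pixel, then shade the grid vertices.
#include <cstdio>
#include <cmath>

int main() {
    // Suppose this patch covers about 40 x 25 pixels on screen.
    const float screenW = 40.0f, screenH = 25.0f;
    const float shadingRate = 1.0f;            // target: ~1 micropolygon per pixel

    // Dice: choose a grid resolution so each micropolygon is <= shadingRate pixels.
    int nu = (int)std::ceil(screenW / shadingRate);
    int nv = (int)std::ceil(screenH / shadingRate);
    std::printf("diced into %d x %d = %d micropolygons\n", nu, nv, nu * nv);

    // Shade each grid vertex (a stand-in for running the surface shader);
    // each tiny quad would then be sample-tested against the pixels it covers.
    double total = 0.0;
    for (int v = 0; v <= nv; ++v)
        for (int u = 0; u <= nu; ++u)
            total += 0.5 + 0.5 * std::sin(u * 0.3) * std::cos(v * 0.3);
    std::printf("shaded %d vertices\n", (nu + 1) * (nv + 1));
    (void)total;
}
```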
 

FalseChristian

Diamond Member
Jan 7, 2002
3,322
0
71
And people say I'm computer savvy but compared to you guys I'm a nothing! I didn't understand a word you all said!:eek: