Another Interesting Ray Tracing Article

IlllI

Diamond Member
Feb 12, 2002
4,927
11
81
Great. Now I just have to wait for those 16-core processors before it starts to become viable :p


 

cheier

Junior Member
Jan 16, 2008
4
0
0
Interesting to see the more widespread use of OpenRT for this kind of work. I remember when the Saarland boys were demonstrating OpenRT running on FPGAs at SIGGRAPH in 2005. I'd actually be quite curious how a GPU would handle this kind of ray tracing workload. I know that by default the GPU rasterizes under OpenGL and DirectX, but I would guess that if these guys used CUDA (or Brook+ for ATI cards), they could push additional ray tracing capacity onto the GPU as general-purpose computing. It would certainly be the cheaper route to use an extra 8800 GTX or 8800 Ultra for the computation, as opposed to investing in a dual Xeon 5365 workstation for it.
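To make the GPGPU idea concrete, here is a minimal sketch of what a primary-ray kernel might look like in CUDA: one thread per pixel, a single hard-coded sphere as the scene. The kernel and struct names (traceKernel, Sphere, Vec3) are made up for illustration and have nothing to do with OpenRT's API; the point is just that casting rays maps naturally onto general-purpose GPU threads.

```
#include <cuda_runtime.h>
#include <math.h>

struct Vec3 { float x, y, z; };

__device__ Vec3  sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
__device__ float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

struct Sphere { Vec3 center; float radius; };

// Ray-sphere intersection; returns hit distance along the ray, or -1.0f on a miss.
// Assumes dir is normalized.
__device__ float intersect(Vec3 orig, Vec3 dir, Sphere s)
{
    Vec3  oc   = sub(orig, s.center);
    float b    = dot(oc, dir);
    float c    = dot(oc, oc) - s.radius * s.radius;
    float disc = b * b - c;
    if (disc < 0.0f) return -1.0f;
    return -b - sqrtf(disc);
}

// One thread per pixel: build a primary ray through the pixel and shade it
// white on a hit, black on a miss.
__global__ void traceKernel(float* image, int w, int h, Sphere s)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;

    // Pinhole camera at the origin looking down -z.
    Vec3  orig   = {0.0f, 0.0f, 0.0f};
    Vec3  dir    = {(x - 0.5f * w) / h, (y - 0.5f * h) / h, -1.0f};
    float invLen = rsqrtf(dot(dir, dir));
    dir = {dir.x * invLen, dir.y * invLen, dir.z * invLen};

    float t = intersect(orig, dir, s);
    image[y * w + x] = (t > 0.0f) ? 1.0f : 0.0f;
}
```

A real renderer would obviously add an acceleration structure, secondary rays and proper shading on top of this, but the one-ray-per-thread mapping is the core of doing it through general-purpose GPU computing rather than the rasterization pipeline.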
 

heyheybooboo

Diamond Member
Jun 29, 2007
6,278
0
0
Without going head over heels into the whole logarithmic ray-tracing vs. rasterization debate, I sometimes feel this is a solution in search of a problem.

I had to *snicker* at the conclusion that we must go *ALL-IN* on his complete 100% ray-tracing approach. I bet his employer is 100% behind him on that ... :p

I will, however, somewhat agree with his premise of reduction - just not with how he goes about doing it ...

Rapidly generating a 'realistic' 2D image out of a given 3D scene will be best achieved by elimination - of triangles, textures, reflections, pixels, etc - while maintaining the quality of the 'illusion'.

Somewhere out there is a pimply-faced kid in junior or senior high school who will write the algorithms to do so :D especially if the algorithm introduces an element of motion blur.

The fallacy of his argument shows when he pops the "Boeing 777 model with 350 million triangles ... extremely highly detailed model that includes every screw of the plane" into the article and (weakly) attempts to explain away the gap between the lower precision needed for gameplay realism and a detailed technical model used in a manufacturing environment ...

Apples and oranges ...
 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
This whole raytracing hype is just Intel's way of promoting their products. Anyone who wants a good read on rasterization vs. raytracing should take a look at this article:

http://www.beyond3d.com/content/articles/94/1

I firmly believe games will continue to use rasterization for a long time to come. It's one thing to demonstrate a tech demo of how far CPUs have come, but it's a whole other issue to convince the entire game development industry to adopt raytracing when both HW and SW vendors have so many resources already invested in rasterization. Even if raytracing eventually catches up to rasterization in performance (and I don't believe we're anywhere close to that happening, if it ever happens), ask yourself: why should we make the switch? Game rendering has always been based on approximations and shortcuts that make the visuals look convincing enough to pass for real. With every generation of HW and SW, raster graphics get closer to a realistic appearance, so why should we care about raytracing? I don't see anything in raytracing that would bring a big enough improvement to games to make it a viable alternative to raster graphics.
 

Cogman

Lifer
Sep 19, 2000
10,284
138
106
Originally posted by: munky
This whole raytracing hype is just Intel's way of promoting their products. Anyone who wants a good read on rasterization vs. raytracing should take a look at this article:

http://www.beyond3d.com/content/articles/94/1

I firmly believe games will continue to use rasterization for a long time to come. It's one thing to demonstrate a tech demo of how far CPUs have come, but it's a whole other issue to convince the entire game development industry to adopt raytracing when both HW and SW vendors have so many resources already invested in rasterization. Even if raytracing eventually catches up to rasterization in performance (and I don't believe we're anywhere close to that happening, if it ever happens), ask yourself: why should we make the switch? Game rendering has always been based on approximations and shortcuts that make the visuals look convincing enough to pass for real. With every generation of HW and SW, raster graphics get closer to a realistic appearance, so why should we care about raytracing? I don't see anything in raytracing that would bring a big enough improvement to games to make it a viable alternative to raster graphics.

I somewhat agree with you; raytracing is just too new and too resource-demanding. Not only that, but OpenRT isn't exactly open... instead it's closed-source proprietary stuff. IMO, drastically different techs like this are almost guaranteed to fail when they go against the huge industry norm. Not only that, but hardware is not optimized for it. Until it can be run on a modern computer without a lot of hiccups, and look MUCH better than current rasterization techs, it is just not going to go anywhere.

The benefit of it, though, is that once a computer can run it at all, it can run some of the most complex scenes with no problem. Current tech slows down as the scene gets more complex; raytracing, on the other hand, takes a very small hit with very advanced scenes (water, shadows, dynamic lighting, high-poly objects, all thrown into one scene and looking almost real), because each ray only has to walk an acceleration structure rather than touch every triangle - see the sketch below.
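To make that scaling argument concrete, here is a rough, hypothetical sketch (the Node layout and names are made up, not taken from OpenRT or any shipping tracer) of a per-ray traversal over a bounding-volume hierarchy in CUDA. Every bounding box the ray misses prunes an entire subtree in a single test, so the number of nodes one ray visits grows roughly with the tree depth - the log of the primitive count - rather than with the primitive count itself.

```
#include <cuda_runtime.h>

struct Box  { float lo[3], hi[3]; };          // axis-aligned bounding box
struct Node { Box box; int left, right; };    // left < 0 marks a leaf

// Standard slab test: does the ray orig + t*dir (t >= 0) hit the box?
// invDir holds the per-axis reciprocals of the ray direction.
__device__ bool hitBox(const Box& b, const float orig[3], const float invDir[3])
{
    float tmin = 0.0f, tmax = 1e30f;
    for (int a = 0; a < 3; ++a) {
        float t0 = (b.lo[a] - orig[a]) * invDir[a];
        float t1 = (b.hi[a] - orig[a]) * invDir[a];
        if (t0 > t1) { float tmp = t0; t0 = t1; t1 = tmp; }
        tmin = fmaxf(tmin, t0);
        tmax = fminf(tmax, t1);
    }
    return tmin <= tmax;
}

// Single-ray demo kernel: count how many BVH nodes the ray actually visits.
// Every missed box skips its whole subtree with one test, which is why the
// per-ray cost tracks tree depth instead of total triangle count.
__global__ void countVisits(const Node* nodes, const float orig[3],
                            const float invDir[3], int* visited)
{
    int stack[64];                 // explicit stack instead of recursion
    int top = 0;
    stack[top++] = 0;              // start at the root node
    int count = 0;

    while (top > 0) {
        const Node& n = nodes[stack[--top]];
        ++count;
        if (!hitBox(n.box, orig, invDir)) continue;   // prune this subtree
        if (n.left >= 0) {                            // inner node: descend
            stack[top++] = n.left;
            stack[top++] = n.right;
        }
        // at a leaf, a real tracer would intersect its handful of triangles here
    }
    *visited = count;
}
```

Leaf triangle tests are omitted to keep it short; the point is simply that once the hierarchy exists, the traversal depth, not the total triangle count, dominates per-ray cost.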

I would love to see raytracing take off, but it is not likely for the next couple of years at least. If OpenRT really wanted to help, they would be, you know, open source. Until that happens and more people start playing with it, we won't see a lot of progress.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
...must...optimize...DX10...first...
...must...optimize...DX10...first...
...must...optimize...DX10...first...

*twirls shiny object in your faces*
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
Originally posted by: munky
This whole raytracing hype is just Intel's way of promoting their products. Anyone who wants a good read on rasterization vs. raytracing should take a look at this article:

http://www.beyond3d.com/content/articles/94/1

I firmly believe games will continue to use rasterization for a long time to come. It's one thing to demonstrate a tech demo of how far CPUs have come, but it's a whole other issue to convince the entire game development industry to adopt raytracing when both HW and SW vendors have so many resources already invested in rasterization. Even if raytracing eventually catches up to rasterization in performance (and I don't believe we're anywhere close to that happening, if it ever happens), ask yourself: why should we make the switch? Game rendering has always been based on approximations and shortcuts that make the visuals look convincing enough to pass for real. With every generation of HW and SW, raster graphics get closer to a realistic appearance, so why should we care about raytracing? I don't see anything in raytracing that would bring a big enough improvement to games to make it a viable alternative to raster graphics.

Why does it have to be 1) Either ... 2) Or

?

Does anyone see a problem with +adding+ RT to game programming as it becomes more efficient and more gamers migrate to quad-core?
:confused:

Surely the CPUs can handle the RT while the GPUs continue to do rasterization?

... of course intel is behind it :p
:roll:
 
Apr 17, 2005
13,465
3
81
Cool stuff... interesting to see how this all works out. I also think that some hybrid approach will be better and more feasible in a few years.
 

Dadofamunky

Platinum Member
Jan 4, 2005
2,184
0
0
Originally posted by: nullpointerus
...must...optimize...DX10...first...
...must...optimize...DX10...first...
...must...optimize...DX10...first...

*twirls shiny object in your faces*

Yeah, right, there's a lot of work left to do there, that's for darn sure...
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
The article continues to dodge the issue of AA and AF.

So we have 90 frames per second at a low resolution of 1280x720 with no AA and no AF, plus we don't even know if the rest of the game is running and it's not just the graphics.

And we still need eight cores to do it.

If Intel thinks the future is with ray-tracing, why are they backing Crossfire so much?
 

perzy

Junior Member
Dec 19, 2007
19
0
0
Well, since the CPU is DEAD - x86 processors have hit the heat wall, and that wall isn't moving tomorrow - I guess they are yanking any chain they can. Must make those 100-core, 3 GHz CPUs interesting somehow...
The future is in discrete CPUs; that's why AMD is a better company than Intel - they have ATI - and that's why I would buy Ageia stock today if I had money and brains, and not just brains ;-)
 

qbfx

Senior member
Dec 26, 2007
240
0
0
Originally posted by: munky

ask yourself: why should we make the switch?

Ask yourself: why shouldn't we?

That's how physics works: if you want a realistic game, that's the only way to render it. And don't worry about the SW and HW - they'll switch their design and optimization to RT very quickly once it becomes practically possible to render RT in real time. And why stick with approximations when you can have "the real thing"? That'll be another booster for manufacturers like AMD, Intel and nVidia to develop faster and more sophisticated products anyway, and it could shorten the distance between mainstream and high-end. I personally think it's a good direction to "aim" Moore's law at for graphics computation...

 

Munky

Diamond Member
Feb 5, 2005
9,372
0
76
Originally posted by: qbfx
Originally posted by: munky

ask yourself: why should we make the switch?

Ask yourself: why shouldn't we?

That's how physics works: if you want a realistic game, that's the only way to render it. And don't worry about the SW and HW - they'll switch their design and optimization to RT very quickly once it becomes practically possible to render RT in real time. And why stick with approximations when you can have "the real thing"? That'll be another booster for manufacturers like AMD, Intel and nVidia to develop faster and more sophisticated products anyway, and it could shorten the distance between mainstream and high-end. I personally think it's a good direction to "aim" Moore's law at for graphics computation...

That's exactly what's NOT going to happen. The game development industry is already slow as it is in adopting new features like DX10 and PhysX, and you think they will jump ship when asked to completely abandon all previous ideas and start from a blank page with RT? And from a consumer point of view, I'm sure Intel would love for all of us to spend $1000+ on their new 16-core CPUs and upgrade the whole rig every 6 months, instead of buying a $300-400 AMD or Nvidia video card when more demanding games are released, but I'd rather not go that route.