
Pathtraced Quake 2

monstercameron

Diamond Member
This is a from-scratch GPU-based pathtracer created specifically for Quake 2. It has several optimisations which are only viable due to the typical characteristics of Quake 2, such as support for parallelogram-shaped light sources, BSP ray traversal, and special handling of sky 'surfaces' (portals). It doesn't officially have a name, but you can call it Raylgun.
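The parallelogram optimisation likely works because such a light can be sampled uniformly with just two random numbers and no rejection step. A minimal sketch of the idea (the function name and vector layout are my own, not taken from the actual renderer):

```python
import random

def sample_parallelogram_light(origin, edge_u, edge_v):
    """Pick a uniformly distributed point on a parallelogram light.

    origin -- one corner of the light, as (x, y, z)
    edge_u -- vector along one edge of the parallelogram
    edge_v -- vector along the adjacent edge
    """
    u = random.random()
    v = random.random()
    # Any (u, v) in [0, 1)^2 maps to a unique point on the light,
    # so the distribution over its surface is exactly uniform.
    return tuple(origin[i] + u * edge_u[i] + v * edge_v[i] for i in range(3))
```

Sampling an arbitrarily shaped light usually needs tessellation or rejection sampling, so restricting lights to parallelograms keeps the per-sample cost constant.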

Supports all static and dynamic light types that were used in the original game, for which illumination was originally baked into static lightmaps.
All objects can cast shadows.
In terms of graphics API support, only OpenGL 3.3 is required.
Requires no offline pre-processing of input data, so any mod or level which works with Yamagi Quake 2 (which is a no-frills port) should work.
Still uses the rasterisation capabilities of the GPU anywhere that it makes sense.
All code modifications reside within the refresher (Quake 2's rendering subsystem), and the path-tracing code path can be bypassed entirely at runtime without any qualitative detriment to the original renderer's results.
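The BSP ray traversal mentioned above can be sketched as a front-to-back recursive walk that splits the ray segment at each node's plane. This toy version is my own simplification (leaves are plain strings rather than Quake 2's leaf structures), but it shows the core recursion:

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def trace(node, a, b, visit):
    """Walk a BSP tree front-to-back along the segment a -> b (3D points).

    Internal nodes are tuples (normal, dist, front_child, back_child)
    describing the plane dot(normal, p) = dist; leaves are strings.
    visit() is called on each leaf the segment crosses, in near-to-far order.
    """
    if isinstance(node, str):          # leaf: report it and stop descending
        visit(node)
        return
    normal, dist, front, back = node
    da = dot(a, normal) - dist         # signed distance of each endpoint
    db = dot(b, normal) - dist
    if da >= 0.0 and db >= 0.0:        # segment entirely in the front half-space
        trace(front, a, b, visit)
    elif da < 0.0 and db < 0.0:        # segment entirely behind the plane
        trace(back, a, b, visit)
    else:                              # straddles the plane: split at intersection
        t = da / (da - db)
        mid = tuple(a[i] + t * (b[i] - a[i]) for i in range(3))
        near, far = (front, back) if da >= 0.0 else (back, front)
        trace(near, a, mid, visit)     # near half first gives front-to-back order
        trace(far, mid, b, visit)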

VIA: http://raytracey.blogspot.com/2016/06/real-time-path-traced-quake-2.html
SOURCE: http://amietia.com/q2pt.html

[Screenshot: 8 bounces]

[Screenshot: 1 bounce]

[AMD 860k, 7970]

If anyone has a beast GPU and can record OpenGL, please give this a go.
PM me for the prebuilt Windows binary (can't directly link a file here).
 
I took a look at the screens and videos, but I can't figure out what is going on here. I'm just seeing what looks like a very grainy version of q2.
 
Interesting, a lot of path noise. I wonder if some clever method of interpolation or hybrid rendering could be used to project the path render onto a ray traced or rasterized rendered scene to give a much cleaner result while retaining the natural lighting.
 
Interesting, a lot of path noise. I wonder if some clever method of interpolation or hybrid rendering could be used to project the path render onto a ray traced or rasterized rendered scene to give a much cleaner result while retaining the natural lighting.
Frame blending might work well; the dev is still working on it.
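Frame blending here presumably means temporal accumulation: exponentially averaging each new noisy frame into a running history buffer. A minimal per-pixel sketch (parameter names are illustrative; a real implementation would also reproject the history when the camera moves):

```python
def blend_frames(history, current, alpha=0.1):
    """Exponential moving average over frames.

    Keeps (1 - alpha) of the accumulated history and mixes in alpha of
    the new noisy frame; with independent per-frame noise this reduces
    variance by roughly a factor of alpha / (2 - alpha) at steady state.
    """
    return [(1.0 - alpha) * h + alpha * c for h, c in zip(history, current)]
```

The catch, and likely why it is still being worked on, is ghosting: stale history has to be rejected or reprojected whenever the camera or the geometry moves, or the averaged image smears.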
 
The noise is really awful ...

There's got to be some analytical solution to the full volume rendering equation ...

I don't enjoy the idea of having to render scenes probabilistically when the general case demands nearly unbounded computational complexity ...

Analytical methods are ideal for achieving lower-error solutions, and while they share most of their performance pitfalls with a probabilistic renderer (computational cost is probably not independent of scene geometry or the types of material used), their computational complexity is far more predictable than that of random sampling ...
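The trade-off being argued here is Monte Carlo variance: a path tracer's standard error shrinks only as 1/√N in the sample count N, which is exactly the grain visible in the screenshots. A toy estimator (integrating x² over [0, 1]; nothing here is specific to this renderer) demonstrates the convergence rate:

```python
import math
import random

def mc_estimate(f, n, rng):
    """Monte Carlo estimate of the integral of f over [0, 1]."""
    return sum(f(rng.random()) for _ in range(n)) / n

def rmse(f, true_value, n, trials, rng):
    """Root-mean-square error of the estimator over repeated trials."""
    sq = [(mc_estimate(f, n, rng) - true_value) ** 2 for _ in range(trials)]
    return math.sqrt(sum(sq) / trials)

rng = random.Random(1)
f = lambda x: x * x              # exact integral over [0, 1] is 1/3
coarse = rmse(f, 1.0 / 3.0, 10, 200, rng)
fine = rmse(f, 1.0 / 3.0, 1000, 200, rng)
# 100x the samples buys only about 10x less error: 1/sqrt(N) convergence.
```

That slow convergence is why denoising and frame accumulation matter so much in real-time path tracing: brute-force sample counts alone can't remove the grain at playable frame rates.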
 
Hi,

@monstercameron: I'd love to try it but don't have a monster GPU (GeForce 760 GTX).

I'm writing here because I'm unable to PM you. According to the error message I'm getting, you belong to the admin group, and for some reason I need to post 25 times before being allowed to message the admin group or a member thereof.
 
While there is a lot of noise, if I look past it, there is something really cool happening there. The lighting and shadow quality are dramatically better.

This is an oversimplification, since I don't really understand what I'm seeing, but it seems there needs to be an equivalent of anisotropic filtering applied along the distance the engine traces the paths/rays/whatever. Objects in the distance appear to suffer worse from noise.
 
It's because the sample density decreases the further you get from the camera.

I would really like to see imported_bman's idea of integrating this with the normal rasterized version. It must cost practically nothing to render this game on modern hardware.
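The quoted fall-off is geometric: at a fixed samples-per-pixel budget, the world-space width of a pixel grows linearly with distance, so samples per unit of surface area drop as 1/d². A back-of-the-envelope check (the numbers are illustrative, not taken from the renderer):

```python
import math

def samples_per_unit_area(spp, fov_deg, res_x, distance):
    """Samples landing per unit of world-space area at a given distance,
    for a pinhole camera with horizontal field of view fov_deg and res_x
    pixels across (square pixels assumed)."""
    # World-space width of one pixel at this distance.
    pixel_width = 2.0 * distance * math.tan(math.radians(fov_deg) / 2.0) / res_x
    return spp / (pixel_width ** 2)

near = samples_per_unit_area(4, 90.0, 1024, 10.0)
far = samples_per_unit_area(4, 90.0, 1024, 100.0)
# Ten times the distance -> one hundred times fewer samples per unit area.
```

Which matches the observation above: distant (and obliquely viewed) surfaces get far fewer samples per unit area, so they look noisiest.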
 
How is it the future when Quake 2 is nearly two decades old? Probably should have some context.
The context is that a path tracer has been written as a shader hooked into the id Tech engine. The engine is ancient, but the realtime path tracer is where the excitement is.
 
It's because the sample density decreases the further you get from the camera.

That's what I was thinking. It's like a real-time LiDAR visualization simulator, where objects up close are more detailed, but objects far away or at oblique angles are harder to see detail. Cool experiment.
 