Mizuchi realtime rendering engine... Looks amazing :)

dogen1

Senior member
Oct 14, 2014
739
40
91
If I'm remembering the name right, I think some studios use this as a reference or something like that.
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
The lighting looks incredible. The demo didn't have any humans, so I wonder how their skin/face rendering is, but it certainly looks fantastic on objects.
 
May 11, 2008
22,566
1,472
126
The lighting looks incredible. The demo didn't have any humans, so I wonder how their skin/face rendering is, but it certainly looks fantastic on objects.

Their skin rendering also seems amazing:
This is not a photo, it is a rendered image by Mizuchi.

[attached image: purpose2.png]
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
I mean actual games with actual gameplay.

You could probably get the engine in question to run actual games with actual gameplay; however, they wouldn't really look as good as the video.

Part of the reason the video looks so good is that it uses the classic trick of a very shallow depth of field, which makes pretty much anything look more photorealistic. The problem, of course, is that you can't realistically apply such a heavy filter during actual gameplay, so the engine wouldn't look anywhere near as good in-game.
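For anyone curious, a rough sketch of how that per-pixel blur size is usually computed (the thin-lens circle-of-confusion model; all parameter values here are made up for illustration, not taken from Mizuchi):

```python
def circle_of_confusion(depth, focus_dist, focal_len, aperture):
    """Thin-lens circle-of-confusion diameter for a pixel at `depth`
    (all distances in meters; zero blur exactly at the focus distance)."""
    return abs(aperture * focal_len * (depth - focus_dist) /
               (depth * (focus_dist - focal_len)))

# A wide aperture and a close focus plane -> heavy background blur
background = circle_of_confusion(depth=5.0, focus_dist=1.5,
                                 focal_len=0.05, aperture=0.03)
in_focus = circle_of_confusion(depth=1.5, focus_dist=1.5,
                               focal_len=0.05, aperture=0.03)
```

The shallower the focus distance relative to the scene, the bigger the background blur discs get, which is exactly the "everything looks like a macro photo" effect in the demo.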

This shot of a trainyard scene is probably closer to what you should expect from the engine in actual gameplay.
 
May 11, 2008
22,566
1,472
126
You could probably get the engine in question to run actual games with actual gameplay; however, they wouldn't really look as good as the video.

Part of the reason the video looks so good is that it uses the classic trick of a very shallow depth of field, which makes pretty much anything look more photorealistic. The problem, of course, is that you can't realistically apply such a heavy filter during actual gameplay, so the engine wouldn't look anywhere near as good in-game.

This shot of a trainyard scene is probably closer to what you should expect from the engine in actual gameplay.

That is a good point about that trick. It reminds me of how, in DOOM (2016), the entire environment is static: explosions damage only the enemies, never the environment. It looks great, but it is a lot of smoke and mirrors. In the game's defense, though, it is supposed to play like the old Doom games, just with stunning visuals.

I guess we need to wait a few more years, and see how new ideas like primitive shaders pay off when it comes to performance.
 
May 11, 2008
22,566
1,472
126
I do have to say, if there is one thing I find annoying, it is the added background blur in games (with the exception of zooming through a rifle scope). I know that only the point my eyes focus on is sharp, and that the rest of my visual field is out of focus. But when the whole scene is blurred except for a small fraction, just to be able to increase detail there, I lose all desire to play the game. When I look around the screen, the entire image should be sharp, not blurred. My eyes (like everybody else's) already have this feature, because the retina has the highest density of cones in the center, decreasing outwards, and the lens of the eye focuses on that part of the retina.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
That is a good point about that trick. It reminds me of how, in DOOM (2016), the entire environment is static: explosions damage only the enemies, never the environment. It looks great, but it is a lot of smoke and mirrors. In the game's defense, though, it is supposed to play like the old Doom games, just with stunning visuals.

I guess we need to wait a few more years, and see how new ideas like primitive shaders pay off when it comes to performance.

One interesting thing is that heavy DoF filters like the ones seen here might actually become viable in actual gameplay once we get proper eye tracking in VR headsets. Combined with foveated rendering, that will probably allow for some pretty significant advances in quality, even taking into account the higher refresh rates mandated for VR.
 
May 11, 2008
22,566
1,472
126
One interesting thing is that heavy DoF filters like the ones seen here might actually become viable in actual gameplay once we get proper eye tracking in VR headsets. Combined with foveated rendering, that will probably allow for some pretty significant advances in quality, even taking into account the higher refresh rates mandated for VR.

For VR with proper eye tracking and foveated rendering, I agree. With the ability to set the size of the foveated region to match individual users' eyes and preferences, no one would notice the difference. Thinking about it some more, though, a blurring shader alone would not be the best solution. Actually decreasing the resolution outside the foveated region would give bigger performance benefits; a blur pass on top would then mask jagged edges, so less anti-aliasing is needed as a bonus. Yes, that would work fine.
Maybe those primitive shaders would help a lot in such a rendering scheme.
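A minimal sketch of that resolution falloff (just the zone logic; the radii and rates here are made-up tuning values, not anything a real headset uses):

```python
import math

def shading_rate(px, py, gaze_x, gaze_y, inner=0.15, outer=0.45):
    """Fraction of full resolution to shade at, based on the distance
    from the gaze point (all coordinates normalized to 0..1)."""
    d = math.hypot(px - gaze_x, py - gaze_y)
    if d <= inner:            # foveal zone: full resolution
        return 1.0
    if d >= outer:            # periphery: quarter resolution
        return 0.25
    t = (d - inner) / (outer - inner)   # linear blend between zones
    return 1.0 - 0.75 * t
```

Making `inner` user-adjustable is essentially the per-user tuning mentioned above.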
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
For VR with proper eye tracking and foveated rendering, I agree. With the ability to set the size of the foveated region to match individual users' eyes and preferences, no one would notice the difference. Thinking about it some more, though, a blurring shader alone would not be the best solution. Actually decreasing the resolution outside the foveated region would give bigger performance benefits; a blur pass on top would then mask jagged edges, so less anti-aliasing is needed as a bonus. Yes, that would work fine.

By foveated rendering I'm referring to the gradually decreasing resolution used outside of the central focus area. On top of this, you then often use a blurring shader to mask the transitions in resolution.

A DoF filter would be a replacement for the normal blurring shader, since it is essentially a blurring shader that takes depth into account. The DoF filter would also be applied within the full-resolution center region, since this region is generally big enough to contain objects at widely differing depths (even though the user would only be focusing on one of those objects).
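To illustrate the distinction: the eccentricity-based masking blur only kicks in outside the fovea, while the depth term applies everywhere, including the full-resolution center. A toy version (all constants are invented for illustration):

```python
def blur_radius(dist_from_gaze, pixel_depth, focus_depth,
                fovea=0.15, edge_rate=10.0, dof_rate=2.0):
    """Per-pixel blur radius combining two sources: masking blur that
    starts outside the foveal radius, and depth-of-field blur that
    grows with distance from the focus depth (active even in the fovea)."""
    masking = max(0.0, dist_from_gaze - fovea) * edge_rate
    dof = abs(pixel_depth - focus_depth) * dof_rate
    return max(masking, dof)
```

So an object sitting inside the foveal region but far behind the focus plane still gets blurred, which a pure resolution-masking blur would miss.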

Maybe those primitive shaders would help a lot in such a render example.

I imagine something like AMD's ID buffer might be useful here.

It would be as if it had reverse tessellation outside the foveated rendering field.

With foveated rendering the traditional approach is of course to simply reduce the resolution the further away you get from the center of the focus area, but I don't see any reason why you couldn't also reduce any number of other features, such as geometry detail (i.e. reverse tessellation), or jury-rig the normal LOD systems already present in game engines to work not just on Z distance, but also on X/Y distance to the focus area.
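That LOD idea could be as simple as adding an eccentricity term to the usual distance-based score. A sketch (the weights and level count are invented, not from any real engine):

```python
import math

def lod_level(obj_z, obj_x, obj_y, gaze_x, gaze_y,
              z_weight=0.1, ecc_weight=4.0, max_lod=4):
    """Pick a mesh LOD from both camera distance (the usual Z criterion)
    and screen-space distance to the gaze point (x/y normalized 0..1).
    0 = full detail, max_lod = coarsest mesh."""
    ecc = math.hypot(obj_x - gaze_x, obj_y - gaze_y)
    return min(max_lod, int(obj_z * z_weight + ecc * ecc_weight))
```

A nearby object sitting in the periphery would then drop to a coarse mesh just like a distant one does today.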

One problem with this, though, is that it would probably require a fairly significant rewrite of game engines, but hey, VR in general tends to require this already to get the best possible results.
 

Valantar

Golden Member
Aug 26, 2014
1,792
508
136
I do have to say, if there is one thing I find annoying, it is the added background blur in games (with the exception of zooming through a rifle scope). I know that only the point my eyes focus on is sharp, and that the rest of my visual field is out of focus. But when the whole scene is blurred except for a small fraction, just to be able to increase detail there, I lose all desire to play the game. When I look around the screen, the entire image should be sharp, not blurred. My eyes (like everybody else's) already have this feature, because the retina has the highest density of cones in the center, decreasing outwards, and the lens of the eye focuses on that part of the retina.
That is why we want/need/should have eye tracking and foveated rendering. Especially for VR, but really for all real-time 3D content that wants any semblance of realism.
A DoF filter would be a replacement for the normal blurring shader, since it is essentially a blurring shader that takes depth into account. The DoF filter would also be applied within the full-resolution center region, since this region is generally big enough to contain objects at widely differing depths (even though the user would only be focusing on one of those objects).
Yeah, a simple blurring shader would be no better than the (awful) "tilt-shift" photo filters you get on smartphones, which only look good in extremely specific circumstances. Not taking depth into account completely ruins the effect.

With foveated rendering the traditional approach is of course to simply reduce the resolution the further away you get from the center of the focus area, but I don't see any reason why you couldn't also reduce any number of other features, such as geometry detail (i.e. reverse tessellation), or jury-rig the normal LOD systems already present in game engines to work not just on Z distance, but also on X/Y distance to the focus area.

One problem with this, though, is that it would probably require a fairly significant rewrite of game engines, but hey, VR in general tends to require this already to get the best possible results.
Would it, though? Would this be dramatically more complex than implementing on-the-fly resolution scaling, which is already supported by many engines? I get that it would need a decent amount of work, but I wouldn't expect it to be radically more demanding than implementing stereoscopic 3D or other techniques that require multiple viewports and renders of the same scene.

This shot of a trainyard scene is probably closer to what you should expect from the engine in actual gameplay.
That looks rather depressing. I mean, that brick wall made me think of Half-Life 2, although that might be the rose-coloured glasses of nostalgia. But it definitely didn't look good.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Would it, though? Would this be dramatically more complex than implementing on-the-fly resolution scaling, which is already supported by many engines? I get that it would need a decent amount of work, but I wouldn't expect it to be radically more demanding than implementing stereoscopic 3D or other techniques that require multiple viewports and renders of the same scene.

Honestly, I'm not sure. As mentioned, one could possibly repurpose the LOD system to do this, which would probably be much more straightforward than writing a new implementation from scratch.

That looks rather depressing. I mean, that brick wall made me think of Half-Life 2, although that might be the rose-coloured glasses of nostalgia. But it definitely didn't look good.

To be fair, a lot of the quality also hinges on the quality of the assets used, not just the quality of the engine, so I would be careful about reading too much into a simple demo like that.