Your Thoughts on Features to Ease the Performance Impact of Ray Tracing

zliqdedo

Member
Dec 10, 2010
59
10
81
So far we've seen ray tracing demoed on a 2080 Ti at 1080p60. Yes, it's only a few games with pre-release software, but it's nevertheless clear that ray tracing will be quite computationally expensive.

It is my belief that, now more than ever, developers should include techniques that have been used on console for many years, often by the same developers, but this time to offset the cost of ray tracing rather than the relative weakness of console hardware: techniques like dynamic resolution scaling, rendering the HUD at native resolution, upscaling from atypical resolutions, and checkerboarding, to name a few.

Even one of these techniques could substantially improve the user experience. For example, dynamic resolution scaling by itself would help to keep the framerate up even in frantic situations, smoothing out gameplay and increasing responsiveness at a very low, and temporary, hit to image quality.

Combining these techniques would even allow for ray tracing on high-resolution monitors. Upscaling from 1280x1440, or 1720x1440 for a 21:9 monitor, rather than from 1920x1080 on a 1440p monitor should make for a sharper image; better still, upscale using checkerboarding. That way you're still rendering a pixel count similar to 1080p's ~2m pixels, instead of a 1440p monitor's ~4-5m, with only a small overhead. Dynamic scaling on top of this smartly reduced resolution should further improve things. The HUD can be permanently rendered at native resolution at little to no expense, avoiding fuzzy text and elements.
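To put rough numbers on the pixel counts mentioned above, here's a quick Python sketch (the resolutions are the ones from the post):

```python
# Pixel counts for the resolutions discussed above.
resolutions = {
    "1920x1080": 1920 * 1080,   # native 1080p
    "1280x1440": 1280 * 1440,   # half-width 1440p, for checkerboarding
    "1720x1440": 1720 * 1440,   # roughly half the width of 3440x1440 (21:9)
    "2560x1440": 2560 * 1440,   # native 1440p
    "3440x1440": 3440 * 1440,   # native 21:9 1440p
}
for name, px in resolutions.items():
    print(f"{name}: {px / 1e6:.2f}M pixels")
# 1280x1440 is ~1.84M pixels, actually cheaper than 1080p's ~2.07M,
# while native 1440p monitors need ~3.7-5.0M.
```

So rendering at 1280x1440 really is cheaper than 1080p while keeping full vertical resolution for the upscale.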

I could definitely see a 2080 Ti, or even a 2080, and hopefully future AMD hardware, powering a 1440p monitor with HDR and ray tracing at high settings and stable performance using the above-mentioned techniques. The end result would be vastly superior image quality to what's being shown without these already existing techniques, on the same hardware.

What do you think? Any other techniques you can think of? I've always thought these should've come to PC shortly after their introduction on console. There seems to be a prevailing assumption that PC is so powerful it can do anything, which is of course a bit silly. More importantly, PC gaming is a lot about playing the way you want, customizing, and fiddling, and the absence of already-developed techniques that let you do this better, to get more out of your hardware, is quite unfortunate in my mind, ray tracing or not.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
Rendering the whole game at lower res seems like a poor option.

I think it is more likely that devs will just reduce the Ray Tracing quality settings.

You can do things like reduce the number of bounces each projected ray gets. That reduces the "reflections of reflections" effect, but it should still enable good effects. Or trace a few fewer rays, and maybe reduce the detail in reflections.
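As a rough illustration of why the bounce count matters, here's a toy cost model (not any engine's actual numbers; it assumes a worst case where every ray hits a reflective surface):

```python
def ray_count(primary_rays, max_bounces, branching=1):
    """Worst-case total rays traced when every hit spawns `branching`
    secondary rays, recursing up to `max_bounces` levels deep."""
    total = spawned = primary_rays
    for _ in range(max_bounces):
        spawned *= branching
        total += spawned
    return total

pixels_1080p = 1920 * 1080  # one primary ray per pixel
print(ray_count(pixels_1080p, 1))  # 1 bounce:  ~4.1M rays
print(ray_count(pixels_1080p, 3))  # 3 bounces: ~8.3M rays
```

Each extra bounce adds up to another full screen's worth of rays, which is why capping bounces (or shooting fewer rays per pixel) is such an effective performance lever.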

In short, I would rather have reduced resolution of reflections than reduced resolution of the whole scene.
 

Guru

Senior member
May 5, 2017
784
283
106
It's so penalizing because the hardware isn't there yet, but Nvidia is pushing all this ray tracing NOW in order to feature it as an advertisement for their new GPUs. Realistically, the benefits of this partial and limited ray tracing are little to none, at a huge cost to performance!
 

zliqdedo

Member
Dec 10, 2010
59
10
81
Rendering the whole game at lower res seems like a poor option.
Why would it be a poor option when it works so well on console? These techniques can look very close to native and nothing like regular upscaling. I would encourage you to see examples in person for yourself; if you can't, video demonstrations could do too, and Digital Foundry is a great place for that sort of stuff. Try any first-party game on PS4 Pro on a decent UHD TV or monitor: none of them render at native 4K, and yet they all look so good, much better than running a game at 1440p from a PC on a 4K screen. Even more traditional upscaling from atypical resolutions can work great - try distinguishing between Far Cry 4 on Xbox One (1440x1080) and PS4 (1920x1080); it's harder than expected. Also remember that the HUD would always render at native res.

I really don't think you can have a firm opinion on this matter until you have seen good examples of this on console.
 

LTC8K6

Lifer
Mar 10, 2004
28,523
1,568
126
We don't really know enough about Nvidia's RT implementation yet. We have very little info on the subject. As far as games go, all we have seen are a few brief early examples running with RT.

Since Turing seems to have actual separate "RT cores" dedicated to RT, it seems like RT should not be the huge performance hit you'd think it would be.
 

crisium

Platinum Member
Aug 19, 2001
2,631
587
136
What zliqdedo is talking about:

Article: https://www.eurogamer.net/articles/digitalfoundry-2017-4k-gaming-what-can-pc-learn-from-ps4-pro
Video: https://www.youtube.com/watch?v=wSpHONwyBqg

PC examples they use: Watch Dogs 2 has checkerboarding, Titanfall 2 has dynamic resolution, Battlefield 1 has resolution scaling.

They also test using the Nvidia GPU to scale 3200x1800 to 3840x2160 in Witcher 3.

Very interesting stuff, and I'd like to see more of it in PC games going forward.

Though if RT turns out to be 1920x1080 60fps, there's no room to work with. They still need to get it to at least 2560x1440 60fps to make these scaling options viable, imo.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
If this scaling is such a great feature, why hasn't it shown up on PC already? Why would it take ray tracing to have it?
 

zliqdedo

Member
Dec 10, 2010
59
10
81
Though if RT turns out to be 1920x1080 60fps, there's no room to work with. They still need to get it to at least 2560x1440 60fps to make these scaling options viable, imo.
Not for a 4K monitor, but ~2m pixels, which is what 1920x1080 is, should be enough to drive a 2560x1440 or 3440x1440 monitor in that way.
 

Dribble

Golden Member
Aug 9, 2005
1,797
329
126
I suggest people wait and see first. All we've seen are the first-ever demos, which were made without even having RTX hardware. There are a ton of optimisations to be done and tricks to apply. Like everything else, there will probably be a quality slider which you can adjust to suit the performance you desire. Almost certainly the ray tracing slider won't sit at "ultra" for this gen of hardware, but it'll probably run fine at "normal", or whatever the middle of the bar is called, which will still look a lot better than "none".
 

zliqdedo

Member
Dec 10, 2010
59
10
81
If this scaling is such a great feature, why hasn't it shown up on PC already? Why would it take ray tracing to have it?
That's a question many have asked after seeing the benefits of these rendering techniques, and I guess skeptics have used it to cast doubt as well. I don't know the answer, but it's certainly not the first time, nor will it be the last, that big companies have made questionable decisions. Maybe it requires resources to implement that developers aren't willing to allocate, seeing as how PC gamers might be skeptical of it. The fact that there isn't public interest in a feature does not make it any less beneficial - just relatively unknown.

Ray tracing is completely unrelated, but it might spark interest, among developers and consumers, in these techniques since they have the potential to free up a significant amount of resources, with little visual impact, that can then be used for ray tracing, allowing smoother gameplay and/or higher resolutions.

I intend to post a few screenshot comparisons to help illustrate these benefits when I have more time. Again, Digital Foundry is a great place for anyone wanting to learn more about such stuff related to games.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
What zliqdedo is talking about:

Article: https://www.eurogamer.net/articles/digitalfoundry-2017-4k-gaming-what-can-pc-learn-from-ps4-pro
Video: https://www.youtube.com/watch?v=wSpHONwyBqg

PC examples they use: Watch Dogs 2 has checkerboarding, Titanfall 2 has dynamic resolution, Battlefield 1 has resolution scaling.

They also test using the Nvidia GPU to scale 3200x1800 to 3840x2160 in Witcher 3.

Very interesting stuff, and I'd like to see more of it in PC games going forward.

Though if RT turns out to be 1920x1080 60fps, there's no room to work with. They still need to get it to at least 2560x1440 60fps to make these scaling options viable, imo.
It's neither surprising nor impressive. Running at a lower resolution shows an obvious reduction in quality.

As I said before, if we are talking about cards that can easily do native resolution otherwise, I would much rather have nice sharp native objects/textures, with reduced quality on the RT effects.

IIRC in one of the videos Dice said there would be different quality/performance settings for RT effects.
 

zliqdedo

Member
Dec 10, 2010
59
10
81
I suggest people wait and see first. All we've seen are the first-ever demos, which were made without even having RTX hardware. There are a ton of optimisations to be done and tricks to apply. Like everything else, there will probably be a quality slider which you can adjust to suit the performance you desire. Almost certainly the ray tracing slider won't sit at "ultra" for this gen of hardware, but it'll probably run fine at "normal", or whatever the middle of the bar is called, which will still look a lot better than "none".
Sure, as I argued at the start of the thread. But what's clear is that RT will be an expensive feature, and a desired one, and it would be nice to have more options to play around with. I'm hoping it can act as a catalyst for the introduction of these already-existing techniques, since they would be very beneficial to all PC games, regardless of ray tracing.

For example, if we had the option of dynamic resolution scaling in PC games, I would always have it enabled, since it wouldn't do anything most of the time. But on the rare occasion the framerate were about to dip below 60, or whatever you're targeting, the resolution would instead shift down a little (maybe 10-15%) for a brief period, and gameplay would remain perfectly smooth. Given how unlikely you are to even notice such a temporary drop in the heat of the action, I think it vastly preferable to the stutter and loss of responsiveness you're sure to notice.
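A minimal sketch of how such a controller could work (hypothetical logic; real engines use smoothed GPU frame-time estimates rather than a single sample):

```python
def update_scale(scale, gpu_frame_ms, target_ms=16.7,
                 step=0.05, floor=0.85, ceiling=1.0):
    """One step of a dynamic resolution controller: drop the render
    scale a notch when the GPU runs over budget, and creep back
    toward native once frames come in comfortably under budget."""
    if gpu_frame_ms > target_ms:
        return max(floor, scale - step)    # e.g. 100% -> 95% of native
    if gpu_frame_ms < 0.85 * target_ms:
        return min(ceiling, scale + step)  # recover when there's headroom
    return scale

scale = 1.0
for ms in [15.0, 18.2, 19.0, 16.0, 13.5, 13.0]:  # simulated GPU times
    scale = update_scale(scale, ms)              # dips to 0.90, recovers to 1.0
```

The `floor` of 0.85 here mirrors the "maybe 10-15%" drop described above; a deadband between the two thresholds keeps the scale from oscillating every frame.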
 

crisium

Platinum Member
Aug 19, 2001
2,631
587
136
I agree, dynamic resolution is the most beneficial for 60fps V-Sync.

I played Far Cry 5 on my 4K TV maxed out but at 80% resolution scale (3072x1728) with 60Hz V-Sync. It looked quite good, and better than my 2560x1440 monitor. But sometimes it would dip below 60 in ways I could notice. I'd much prefer it temporarily dropped to around 1440p-1620p for frantic action scenes.

Their Titanfall 2 example was good. They found their card (GTX 1060, I think) could handle 60fps at 3200x1800 on Ultra, but there were some dips into the 50s. So they ran at the full 3840x2160 with dynamic resolution (TF2 on PC actually has this as an option), satisfied in knowing that most of the time they'd be around 1800p, while less demanding scenes get the full 4K and high-action scenes run a bit lower.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
I agree, dynamic resolution is the most beneficial for 60fps V-Sync.

I played Far Cry 5 on my 4K TV maxed out but at 80% resolution scale (3072x1728) with 60Hz V-Sync. It looked quite good, and better than my 2560x1440 monitor. But sometimes it would dip below 60 in ways I could notice. I'd much prefer it temporarily dropped to around 1440p-1620p for frantic action scenes.

Their Titanfall 2 example was good. They found their card (GTX 1060, I think) could handle 60fps at 3200x1800 on Ultra, but there were some dips into the 50s. So they ran at the full 3840x2160 with dynamic resolution (TF2 on PC actually has this as an option), satisfied in knowing that most of the time they'd be around 1800p, while less demanding scenes get the full 4K and high-action scenes run a bit lower.
Though I wonder: how dynamic is dynamic resolution scaling?

On a console, where everyone is running the same HW and settings, it is a simple matter to profile the game and set lower resolution shifts for the bottleneck areas.

But on PC, with a near-infinite combination of HW/settings, you really can't know ahead of time where the bottlenecks are for one user's specific HW, and it seems likely that trying to figure it out on the fly won't work that well. You find the bottleneck after you have dropped frames, then you lower the resolution, only to find you don't need to anymore. An automatic solution will lag.

IOW, a real dynamic solution likely isn't that practical, and a pre-profiled one isn't that viable on PC either.

So while dynamic resolution may be the best technique from consoles, it has questionable viability on PC.
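The lag concern can be shown with a toy simulation (illustrative numbers only): a purely reactive controller always eats at least one over-budget frame before it responds.

```python
def simulate(loads_ms, target_ms=16.7, step=0.1, floor=0.5):
    """Frame cost scales with resolution; the controller only adjusts
    AFTER a frame has already blown its budget."""
    scale, overruns = 1.0, 0
    for base in loads_ms:
        frame_ms = base * scale               # cost of this frame
        if frame_ms > target_ms:
            overruns += 1                     # budget already missed...
            scale = max(floor, scale - step)  # ...before we can react
    return overruns, round(scale, 2)

# A sudden load spike costs two over-budget frames here before the
# resolution has come down far enough:
print(simulate([15, 15, 20, 20, 20, 15]))  # -> (2, 0.8)
```

Whether those one or two late frames matter in practice is exactly the disagreement in this thread: with V-Sync buffering they may be absorbed, without it they show as stutter.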
 

crisium

Platinum Member
Aug 19, 2001
2,631
587
136
You'd have to ask Titanfall 2 players; I've never played it. DF, though, showed that the sub-60fps drops were gone once they turned on dynamic resolution.

It works dynamically, so I don't think they pre-profile it even on consoles, beyond first finding a good baseline resolution, framerate, and graphical settings (like one does on PC anyway). Especially with V-Sync, there are frames to buffer it in.
 

zliqdedo

Member
Dec 10, 2010
59
10
81
Okay, so... here are a few screenshots that might help illustrate the benefits of advanced scaling. Keep in mind that it's easier to pick out differences, especially artefacts, in static shots than in actual motion. Also, a zoomed-in section of a high-res shot obviously looks worse than seeing the whole scene. These comparisons, screenshot or video, can only serve as a rough example of what to expect; you really have to see it for yourself to make a judgement.


Here we have PC (2160p, 8m pixels), Xbox One X (1800p, 5.76m p), and PS4 Pro (1440p, 3.69m p), not necessarily in that order, all outputting a full 4K frame to the same 4K TV. Keep in mind that PS4 Pro is using lower-quality textures because it doesn't have enough memory to fit the better assets of the Xbox and PC versions, but that has nothing to do with resolution scaling; mind the difference between texture blur and resolution fuzziness.


Okay, this is a zoomed-in section of a 1080p shot. It's not pretty, I know, but it makes for a good comparison, albeit an exaggerated one, since it's not what you'd actually see on a 1080p screen, so keep that in mind. It's Far Cry 4 at 1920x1080 on PS4 and 1440x1080 (essentially a 720p pixel count) on Xbox One. You can clearly see which is which, but it's not nearly as bad as simply displaying 1280x720 on a 1080p screen on PC, and it's much less apparent in motion.

Note that both of those are examples of fairly standard, albeit smart, upscaling. Here's what checkerboarding looks like:

Even though the final output is 3840x2160, the original render has less than half the pixel count! You can see some artefacts due to the way the image has been reconstructed, but they're much less noticeable in motion, when actually playing the game.
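For anyone curious about the principle, here's a toy sketch of checkerboard reconstruction (heavily simplified; real implementations also reproject the held-over pixels along motion vectors and resolve artefacts, which this omits):

```python
def checkerboard_reconstruct(curr, prev, parity):
    """Shade only the pixels where (row + col) % 2 == parity this frame
    (half the pixels), and fill the rest from the previous
    reconstructed frame."""
    h, w = len(prev), len(prev[0])
    return [[curr[y][x] if (y + x) % 2 == parity else prev[y][x]
             for x in range(w)] for y in range(h)]

# Alternating parity over two frames rebuilds a full-detail image,
# with each frame shading only half the pixels:
scene = [[5, 6], [7, 8]]
frame = [[0, 0], [0, 0]]                           # start empty
frame = checkerboard_reconstruct(scene, frame, 0)  # [[5, 0], [0, 8]]
frame = checkerboard_reconstruct(scene, frame, 1)  # [[5, 6], [7, 8]]
```

This is why a static scene converges to near-native detail while costing roughly half the shading work per frame; the hard part, which the real techniques solve, is keeping the held-over half valid when the camera or objects move.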

Here's what a more standard upscale from 1800p looks like:


Don't forget that the HUD never has to be anything but super sharp, since it can be rendered independently at native res; in my experience, what gamers usually find so off-putting about sub-native res is the fuzziness of text, menus, etc.

Those are just some potentially unconvincing examples of the benefits of scaling techniques. Even if developers were to just add dynamic resolution scaling, it would be a win. As discussed above, I too am not aware of any pre-profiling on consoles; it's supposed to work dynamically. One way it could work is to simply give you an option to set when dynamic scaling kicks in, i.e. what framerate it should attempt to maintain, and perhaps how aggressively.

It's neither surprising nor impressive. Running at a lower resolution shows an obvious reduction in quality.

As I said before, if we are talking about cards that can easily do native resolution otherwise, I would much rather have nice sharp native objects/textures, with reduced quality on the RT effects.
I can't agree with that, because I've seen simple upscaling from 1440p to 4K, and I've seen advanced scaling techniques such as checkerboarding using a similar pixel count, and I can definitely see a marked improvement, to the point where in motion it might be more difficult to spot than expected. And it's not just me: friends agree, and serious journalists like the folks at Digital Foundry say the same; it's why developers are using it, after all.

That being said, I understand the desire for maximum sharpness with lowered settings, but how far are you willing to go? Would RT hardware be able to render 4m pixels at a practical framerate today at all, and what about 8m pixels tomorrow? Even if it's not acceptable to you, and again I really think you might change your mind when you see it in person, that doesn't mean it's not a worthwhile feature to implement on PC - it gives more flexibility, more freedom.

The reason I'm so adamant about this is that I was a firm supporter of native res at all costs on LCD for a very long time. But seeing those advanced scaling techniques on console left an impression.

Source: Digital Foundry (screenshots)

P.S. A good way for anyone to judge whether these techniques would work for them, without buying a console, or inviting a friend with a console over, would be to get the relevant lossless source videos from Digital Foundry and play them back on a 4K screen. In the case of God of War, you can see how a 1440p frame can be constructed into a convincing 4K image. You'd have to be a patron to get access to those, though.
 

dogen1

Senior member
Oct 14, 2014
739
40
91
If this scaling is such a great feature, why hasn't it shown up on PC already? Why would it take ray tracing to have it?
Unreal Engine supports temporal upsampling now. I believe it's equivalent, or nearly so, to the "temporal injection" that Insomniac uses in Ratchet & Clank and Spider-Man.

https://docs.unrealengine.com/en-us/Engine/Rendering/ScreenPercentage

Info and some comparison examples are there.

They don't support dynamic resolution on PC yet (they want a more accurate way to measure GPU frame rendering time, IIRC). The combination of the two features is what you really want.

You'd have to ask Titanfall 2 players; I've never played it. DF, though, showed that the sub-60fps drops were gone once they turned on dynamic resolution.

It works dynamically, so I don't think they pre-profile it even on consoles, beyond first finding a good baseline resolution, framerate, and graphical settings (like one does on PC anyway). Especially with V-Sync, there are frames to buffer it in.
It works pretty well in Titanfall 2, but you have to set the framerate target a little higher than what you actually want - probably because the frametime measurements aren't accurate enough, or their budgeting algorithm wasn't all that great yet. Also, I don't think their temporal anti-aliasing handled the resolution drops quite as well as the technique mentioned above.
 

dogen1

Senior member
Oct 14, 2014
739
40
91
Though I wonder: how dynamic is dynamic resolution scaling?

On a console, where everyone is running the same HW and settings, it is a simple matter to profile the game and set lower resolution shifts for the bottleneck areas.

IOW, a real dynamic solution likely isn't that practical, and a pre-profiled one isn't that viable on PC either.

So while dynamic resolution may be the best technique from consoles, it has questionable viability on PC.
Fully dynamic. I don't think any games profile and predefine resolutions for different areas of a game; I've never heard of it, at least.

DF made a pretty decent video on it.

https://www.youtube.com/watch?v=180nuQJccTA
 

maniacalpha1-1

Diamond Member
Feb 7, 2010
3,562
14
81
Every time it seems like well-priced GPUs are getting close to running good resolutions well, some new technology comes out that keeps the treadmill going. Is ray tracing worth getting back on the treadmill?
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
Okay, so... here are a few screenshots that might help illustrate the benefits of advanced scaling. [snip]
I read the story and looked at the comparison tool that was previously posted, and IMO the lower quality of the scaled game was VERY obvious. Using a smaller crop so hopefully it won't get scaled:





To me this is NOT remotely something I would want. Native or bust.

If you are going to make the argument that these differences wouldn't be visible on a 4K screen, then that really makes the case that you don't need a 4K screen in the first place, since it would obviously be hiding details. You would be better off just getting a 1440p screen; then you could comfortably max out everything and not have to jump through hoops trying to run faster at 4K.
 

zliqdedo

Member
Dec 10, 2010
59
10
81
I read the story and looked at the comparison tool that was previously posted, and IMO the lower quality of the scaled game was VERY obvious. Using a smaller crop so hopefully it won't get scaled:



Since you've read the story, either you're being misleading just to make your argument, or you haven't read it carefully, because the examples you've given are not examples of what we're talking about. They are examples that can be improved by the techniques we're discussing.

The shot from The Witcher is a bog-standard upscale, in this case from 1800p to 2160p; we've been discussing how bad that looks and how it can be improved. Consider the shot from Assassin's Creed I've linked - it too is an 1800p upscale to 2160p, but a more advanced one.

The shot from Watch Dogs is a very weird case - a bad implementation, in my opinion, that you wouldn't enable on PC in this way. What they're doing there is checkerboarding from a lower-than-1440p resolution to 1800p and then a standard upscale to 2160p. It goes without saying that you can't fill 8m pixels with 2.88m efficiently. It might be the "best" way to fill an 8m-pixel screen if you absolutely had to, but that doesn't make it a good solution, and nobody here has argued that. As crisium suggested, these techniques can work well up to about double the original pixel count, i.e. filling a 1440p monitor with a 1080p+ frame, or a 4K monitor with a 1440p+ frame, roughly speaking.

To me this is NOT remotely something I would want. Native or bust.
Yes, I understand how standard upscaling, or trying to scale from a very low pixel count relative to the target resolution, is not something you or probably any of us wants; that's why we're discussing better scaling techniques, and the sensible conclusion can't be native or bust.

It's easy to just jump through the galleries and find bad examples. I'm not saying you must read the numerous materials on DF, consider examples from many games, and see good implementations in person, but it's hard to form a knowledgeable opinion without doing so. Again, if you actually care to see how this works, it might be easiest to get your hands on those lossless source video files and play them back on your 4K screen; by the sound of it, you already have enough experience with native 4K on PC, so it should be that much easier to make a judgement.

Again, I'm pressing on here because I was just as skeptical of scaling as you are, but after research and personal experience I was surprised to find how well some techniques worked; and, yes, I admit it would've been hard for some dude on a forum to change my mind. I consider myself quite sensitive to resolution sharpness, framerate drops, and poor responsiveness. Forget about RT: say I can't quite max out a game at 2160p with the desired performance, but I can at 1800p. If I had access to good scaling tools, I would totally go down that path, despite my sensitivity - again, look at the shot from Assassin's Creed; it's an average example of good scaling, and on PC it could be even better.

If you are going to make the argument that these differences wouldn't be visible on a 4K screen
I'm not trying to do that at all.
 

dogen1

Senior member
Oct 14, 2014
739
40
91
I read the story and looked at the comparison tool that was previously posted, and IMO the lower quality of the scaled game was VERY obvious. Using a smaller crop so hopefully it won't get scaled:




To me this is NOT remotely something I would want. Native or bust.

If you are going to make the argument that these differences wouldn't be visible on a 4K screen, then that really makes the case that you don't need a 4K screen in the first place, since it would obviously be hiding details. You would be better off just getting a 1440p screen; then you could comfortably max out everything and not have to jump through hoops trying to run faster at 4K.
IIRC, Watch Dogs 2 was a particularly bad example of checkerboard rendering (that shot looks pretty bad, something is going wrong there lol), and the Witcher shot there is just normal upscaling. Check out something like Horizon Zero Dawn for a much better, custom implementation.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
IIRC, Watch Dogs 2 was a particularly bad example of checkerboard rendering (that shot looks pretty bad, something is going wrong there lol), and the Witcher shot there is just normal upscaling. Check out something like Horizon Zero Dawn for a much better, custom implementation.
Conveniently, the cases that you say look better don't have comparisons, so we have no idea what the native resolutions would look like.

Scaling is always softer than native. The very act of scaling softens images.
 

dogen1

Senior member
Oct 14, 2014
739
40
91
Conveniently, the cases that you say look better don't have comparisons, so we have no idea what the native resolutions would look like.

Scaling is always softer than native. The very act of scaling softens images.
It isn't scaling in that sense. Reconstruction using data generated over multiple frames is more accurate. There is a paper by Guerrilla Games detailing their checkerboard implementation. Unfortunately, there's still no comparison to plain 4K (they compare it to 4K with 16x supersampling, though), but you can see a significantly greater amount of detail compared to 1512p (half the pixels of 4K) with temporal AA, along with reduced aliasing.

http://advances.realtimerendering.com/s2017/DecimaSiggraph2017.pdf



You can kind of infer what a normal 4K shot would look like by comparing 1080p with 1512p.


Not much increase in detail with 2x the pixels, whereas checkerboarding seems to make at least as much of a difference, maybe more.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
It isn't scaling in that sense. Reconstruction using data generated over multiple frames is more accurate. There is a paper by Guerrilla Games detailing their checkerboard implementation. Unfortunately, there's still no comparison to plain 4K (they compare it to 4K with 16x supersampling, though), but you can see a significantly greater amount of detail compared to 1512p (half the pixels of 4K) with temporal AA, along with reduced aliasing.

http://advances.realtimerendering.com/s2017/DecimaSiggraph2017.pdf



You can kind of infer what a normal 4K shot would look like by comparing 1080p with 1512p.


Not much increase in detail with 2x the pixels, whereas checkerboarding seems to make at least as much of a difference, maybe more.
That PDF was a nice read (the AA and checkerboard parts), thanks.

They really went above and beyond to squeeze the most out of the PS4 Pro. But it is still a temporal technique combining resolution between frames: great for still shots, but it will tend to break down in motion. Some people think that doesn't matter; I am not so convinced.

It would definitely be an interesting option if this level of checkerboard rendering could be added at the video-card level, though it sounds like it may require intimate work within the games.
 
