Discussion AMD Gaming Super Resolution GSR

Timorous

Senior member
Oct 27, 2008
921
1,217
136
At this point I'm convinced he's doing this intentionally. I refuse to believe he's actually dumb enough to not notice these issues with his comparisons.
I think I'll Hanlon's-Razor it and assume he is just not as good as people claimed.

EDIT: I still think he went looking for images to back up his pre-test conclusion; he just wasn't very good at it.
 
  • Like
Reactions: Tarkin77

Tup3x

Senior member
Dec 31, 2016
672
535
136
- To your first point, I fully expect DLSS 3.0 to be able to run on shaders, with a possible locked "ultra quality/performance" preset for cards with Tensor cores (as the tensor cores supposedly would process whatever algos faster than shaders). It would be crazy for NV not to at this point, as they've gotten a 3 year competitor free return on the tech and they know they'll have to go after FSR hard and fast to stop it from taking too deep a root.

To your second point, I fully expect FSR 2.0 to look more like Unreal Engine's TAAU by incorporating temporal data in addition to what it currently does to output a higher quality image. That's really the one big gaping hole in the tech at the moment, and it would help a lot of engines level the playing field with Unreal Engine on this front.
As it stands right now, FSR and DLSS are two slightly different things. FSR is a spatial scaler, while DLSS is a temporal anti-aliasing solution that can also reconstruct a higher resolution from a lower one. FSR doesn't do anti-aliasing at all. A potential FSR 2.0 like that would be a total rewrite.
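That spatial-vs-temporal distinction can be sketched in a few lines. This is only a toy illustration: nearest-neighbour stands in for FSR's edge-adaptive kernel, and a plain exponential blend stands in for real history rectification; neither vendor's actual algorithm looks like this.

```python
import numpy as np

def spatial_upscale(frame, factor):
    """FSR-1.0-style: each output frame depends only on the current
    input frame (crude nearest-neighbour as a stand-in here)."""
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

def temporal_upscale(frame, history, factor, alpha=0.1):
    """TAAU/DLSS-style: blend the upscaled current frame with the
    accumulated history, so detail from past (jittered) frames
    contributes to the output -- and aliasing is averaged away."""
    up = spatial_upscale(frame, factor)
    return alpha * up + (1 - alpha) * history

low = np.random.rand(4, 4)      # low-res input frame
hist = np.zeros((8, 8))         # accumulated high-res history
out = temporal_upscale(low, hist, 2)
```

The key point the post makes falls out of the signatures: the spatial path has no `history` argument at all, which is why FSR 1.0 can't anti-alias, only rescale.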
 

GodisanAtheist

Diamond Member
Nov 16, 2006
4,314
3,103
136
They will not be stopping FSR: AMD powers both the Xbox and the PlayStation, and any multiplatform game is going to use it because of that. Nvidia will probably pay to have their tech used exclusively in some PC ports, but that won't stop the momentum. The most likely scenario is that it plays out like the adaptive-sync battle: the free solution dominates, but Nvidia supports it (or a similar solution), as you have pointed out, while keeping a better exclusive version with a price tag.
- To be fair, DLSS might be the holy grail for porting games to the wildly popular Switch (and I believe some games have already announced support for it on the Switch), so there is definitely plenty of incentive for devs to look at both solutions, especially if something like DLSS is integrated at the engine level like it is in UE5.

No reason FSR couldn't work there as well, but NV hasn't exactly been known for playing fair when you're playing in their walled garden...
 
  • Like
Reactions: uzzi38 and Makaveli

coercitiv

Diamond Member
Jan 24, 2014
5,160
8,265
136
Apparently Alex applied CAS on top of his "native" and TAAU comparison points, even in the updated review.
"simple scaling" :cool:

PS: I don't have a problem with him using Native+CAS to benchmark against FSR, but why not be open about it?
 
Last edited:

uzzi38

Platinum Member
Oct 16, 2019
2,328
4,827
116
"simple scaling" :cool:

PS: I don't have a problem with him using Native+CAS to benchmark against FSR, but why not be open about it?
Here's my issue with using Native+CAS comparisons: you're assuming the end user is competent enough to sit down and optimise the specific sharpening intensity they'd like to game with.

That's not indicative of what your average end user will do.

Not to mention, if you're going to do that, why not do the same for FSR using driver-level sharpening? Might as well fine-tune every implementation at that point. Unless Alex wasn't fine-tuning at all but guessing how much sharpening was applied, which is perhaps even worse, because then you have no idea whether you're providing a like-for-like comparison.
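Why the undisclosed strength matters can be shown with a toy sharpener. This is a plain unsharp mask standing in for CAS (the real filter adapts per-pixel to local contrast, but the user-facing knob is the same idea: a single strength value), so it's a sketch of the argument, not AMD's filter:

```python
import numpy as np

def sharpen(img, strength):
    """Toy unsharp mask: add back 'strength' times the high-frequency
    detail (original minus 3x3 box blur), clamped to [0, 1]."""
    p = np.pad(img, 1, mode="edge")          # clamp edges
    blur = sum(p[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0
    return np.clip(img + strength * (img - blur), 0.0, 1.0)

img = np.random.rand(16, 16)
mild, strong = sharpen(img, 0.2), sharpen(img, 0.8)
```

Two different strengths produce measurably different images from the same source, which is exactly why a "Native+CAS" reference with an unstated strength can't be checked for being like-for-like.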
 
  • Like
Reactions: Tlh97 and Elfear

uzzi38

Platinum Member
Oct 16, 2019
2,328
4,827
116
If nothing else it will clarify Nvidia's PR regarding the tech DLSS uses. So far DLSS is limited to RTX cards due to requiring Tensor Cores. Switch doesn't have Tensor Cores.
The updated one this year will reportedly have them.
 

Saylick

Golden Member
Sep 10, 2012
1,993
3,293
136
If nothing else it will clarify Nvidia's PR regarding the tech DLSS uses. So far DLSS is limited to RTX cards due to requiring Tensor Cores. Switch doesn't have Tensor Cores.
If what this guy says is true, DLSS 2.0 doesn't use deep learning to upsample the individual images themselves (that was DLSS 1.0); it uses a trained network to decide which sample frames feed the multi-frame upsampling.

Regardless, my understanding is that inferencing doesn't require tensor cores, but having a lot of TOPS does help, so a bank of FP units that can do a ton of half-precision or even quarter-precision operations may suffice. I don't know how deep the frame-selection network needs to be, but I imagine it can't be too deep. At the end of the day, deep learning is only used to pick the frames; the multi-frame upsampling portion of the pipeline still has to run as well. If the former took too long, it might wipe out any advantage it brings in accuracy/final image quality over just using all of the previous sample frames.
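The "inference doesn't require tensor cores" point is easy to illustrate: a small network's forward pass is just dense FP16 multiply-accumulate, which any general FP unit can execute. A tiny made-up layer (the sizes are invented for illustration, nothing to do with NVIDIA's actual network) run entirely in half precision:

```python
import numpy as np

# A tiny fully-connected layer evaluated entirely in FP16, as a
# stand-in for "inference on general shader/FP units": no tensor
# cores, just half-precision multiply-accumulate.
rng = np.random.default_rng(0)
x = rng.standard_normal((1, 32)).astype(np.float16)   # input features
w = rng.standard_normal((32, 16)).astype(np.float16)  # trained weights
b = np.zeros(16, dtype=np.float16)                    # bias

y = np.maximum(x @ w + b, 0)  # ReLU activation, still FP16
```

Tensor cores accelerate exactly this kind of matrix math, but nothing here is impossible without them; the trade-off is throughput, which is the poster's point about shallow networks being feasible on shaders.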

 

AtenRa

Lifer
Feb 2, 2009
13,817
3,023
136
OK, a few pics from me.


RX5500XT 8GB

Riftbreaker

Native 1440p


1440p + FSR Performance


1440p + FSR Balanced


1440p + FSR Quality


1440p + FSR Ultra Quality

Virtual Super Resolution to 5120x2880

Native VSR 5120x2880

Native VSR 5120x2880 + FSR Performance


Native VSR 5120x2880 + FSR Balanced


Native VSR 5120x2880 + FSR Quality


Native VSR 5120x2880 + FSR Ultra Quality



If you want the best image quality with acceptable fps, turn on VSR + FSR Performance. Just compare the images above: native 1440p versus 5120x2880 + FSR Performance.
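The arithmetic behind that recommendation is worth spelling out. Assuming FSR 1.0's published 2.0x per-axis Performance factor, the VSR + FSR Performance combo in the shots above renders at exactly the display's native pixel count:

```python
# VSR target and display resolution from the post above
vsr_w, vsr_h = 5120, 2880
display_w, display_h = 2560, 1440

# FSR 1.0 "Performance" scales each axis by 2.0x (AMD's published factor)
render_w, render_h = vsr_w // 2, vsr_h // 2   # internal render resolution

# So the GPU renders 2560x1440 pixels, FSR upscales that to 5120x2880,
# and VSR downsamples back to the 1440p display -- a supersample-like
# pass for roughly native-rendering cost plus FSR/VSR overhead.
```

That's why the combination looks better than plain native 1440p while keeping fps acceptable: the pixel count being shaded never exceeds native.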
 
Last edited:

Gideon

Golden Member
Nov 27, 2007
1,501
3,130
136
Played around with FSR in Dota. It looks gorgeous, particularly when using it at 99.5% scaling or with VSR.

I ended up playing with VSR 2880p at 75% scaling (4K actual rendering resolution), downscaled to 1440p.

I chose that because it lands my FPS closest to my monitor's refresh rate (165 Hz). Dota tends to have rough edges on some vegetation, clothing, etc. (like Zeus's robe in the image); this managed to totally eliminate them.

Unfortunately I can't share the PNG images as they are very slightly over 32 MB and no service lets me upload those, but this is how it looks as a JPG:


 
Last edited:

Krteq

Senior member
May 22, 2015
989
670
136
Regarding TAAU, it IS indeed breaking DoF... even in UE5.
Out of curiosity, will the new TAA upscaling behave well with depth of field? Currently when you set r.TemporalAA.Upsampling=1 , most of the DOF just disappears.
So when r.TemporalAA.Upsampling=1, it basically forces r.DOF.Recombine.Quality=0, which loses the slight DOF convolution, and that is due to DiaphragmDOF.cpp's bSupportsSlightOutOfFocus. There need to be some changes in the handling of the slight out-of-focus convolution (about 5 pixels and below) when doing temporal upsampling that I didn't have time to get to. And we were only using temporal upsampling on current-gen consoles. It wasn't a big deal back then, because if your frame needed to be temporally upsampled, that probably meant you didn't have the performance to run DOF's slight out-of-focus anyway... However, we ran into exactly this issue for our Lumen in the Land of Nanite demo running on PS5, but it is still a prototype and I'm not sure whether I'm going to have this finished for 4.26's release. But yeah, given how important temporal upsampling is going to become, it's definitely a fix very high on the priority list.
forums.unrealengine.com - GEN 5 TEMPORAL ANTI-ALIASING
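For anyone who wants to reproduce the behaviour the dev describes, the cvars from the quote can be set in a config file. This fragment is illustrative only: the cvar names come from the quoted post, while the file and section are the usual UE convention, not something the quote specifies.

```ini
; DefaultEngine.ini -- illustrative fragment
[SystemSettings]
; Enables TAAU (Temporal AA Upsampling). Per the quoted dev post, this
; currently has the side effect of forcing r.DOF.Recombine.Quality=0,
; which drops the slight out-of-focus DoF pass.
r.TemporalAA.Upsampling=1
```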
 
Last edited:

uzzi38

Platinum Member
Oct 16, 2019
2,328
4,827
116
Regarding TAAU, it IS indeed breaking DoF... even in UE5.


forums.unrealengine.com - GEN 5 TEMPORAL ANTI-ALIASING
If I'm reading that post correctly, it's probably fixed in UE5 now, but it was broken in UE4; Epic Games were aware of it, but because TAAU was primarily being used on consoles, where DoF generally wasn't applied, they just left it be.

At the end of the day, it's the same result: DoF was broken no matter how you cut it, and the supposed industry expert on image clarity didn't just miss it once; they missed it twice.

Laughable.
 

Dribble

Golden Member
Aug 9, 2005
1,979
497
136
New video from Hardware Unboxed (DOTA 2) with iGPU and older entry level dGPUs like the GTX750 Ti, RX550 and GT1030 DDR4

Not a great example, as you can run DOTA fine on anything. However, as someone said previously, the fact that FSR is more effective at 1440p probably says more about the quality of the assets in DOTA than it does about how good FSR is. DOTA is clearly built for 1080p; if it were built for 1440p (higher-quality assets), then FSR would show exactly the same increase in blurriness at 1440p as you currently see at 1080p - the percentage reduction FSR uses is identical, after all.
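That "identical percentage reduction" is easy to check against AMD's published FSR 1.0 per-axis scale factors: each quality mode renders the same fraction of the output resolution regardless of what that output resolution is.

```python
# FSR 1.0 per-axis scale factors (AMD's published values per mode)
MODES = {"Ultra Quality": 1.3, "Quality": 1.5,
         "Balanced": 1.7, "Performance": 2.0}

def render_res(out_w, out_h, mode):
    """Internal render resolution for a given output resolution and mode."""
    f = MODES[mode]
    return round(out_w / f), round(out_h / f)

# Same mode, same fraction of the output, at both 1080p and 1440p:
for mode in MODES:
    print(mode, render_res(1920, 1080, mode), render_res(2560, 1440, mode))
```

So Performance mode renders 960x540 for a 1080p output and 1280x720 for a 1440p output: the reduction is the same 2.0x either way, which is the poster's point that any quality difference between the targets comes from the assets, not the algorithm.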
 
  • Like
Reactions: Tlh97

Kedas

Senior member
Dec 6, 2018
354
339
106
If game developers don't add FSR to their games, it's because they don't want to. Even modders were able to add it to Grand Theft Auto V.
It's not perfect, but it indicates how little effort is needed to get it done.
 
