Discussion: AMD FSR vs Nvidia DLSS 2.0 vs Temporal Super Resolution (TSR)

Kedas

Senior member
Dec 6, 2018
"Edge of Eternity" is one of the first games that have FSR and DLSS I assume.
This is an interesting interview with the game developer about the difference between both.
It looks indeed that the difference between FSR and DLSS may not be worth it to implement both.


Cost to add it
I think FSR being very easy to integrate means more games are going to get it, DLSS is more complicated to integrate.

Quality
Quality-wise, when there is a lot of pixel info available for upscaling to 4K, both technologies give amazing results, and I have a hard time seeing differences between them. I’d even say that I slightly prefer the FSR for 4K resolution since it doesn’t introduce any artifacts/minor blurriness that DLSS can sometimes introduce. For lower resolutions like upscaling to 1080p or 720p, I think DLSS gives a better result since it can reconstruct parts of missing details due to the nature of the technique.

I didn't expect him to say this, since people now assume DLSS 2.0 is always better:
I slightly prefer the FSR for 4K resolution since it doesn’t introduce any artifacts/minor blurriness that DLSS can sometimes introduce

From another interview with AMD, it was clear they were already busy with FSR 2.0.
I wonder if it will have a temporal component, because temporal data introduces its own artifacts and may be harder to get right, certainly compared to a good FSR 1.0 implementation.
The likely verdict on FSR 2.0 will then probably be: looks a bit better than FSR 1.0, but sometimes shows extra artifacts.
 

pj-

Senior member
May 5, 2015
Not really interested in anyone's opinion of which looks better. Show me images and videos of them side by side.
 

BFG10K

Lifer
Aug 14, 2000
From the article:
Unfortunately, the Unity DX12 implementation on the built-in pipeline is not solid enough. We have worse performance in DX12, and they made all the cool features available only for HDRP. Switching to this pipeline would basically require us redoing the project from scratch, so it’s very unlikely to happen.
I still remember being told in this very forum that all indie developers would get "automatic" performance boosts just by "flipping a switch" to DX12 once engines supported it.

About a month ago I tried starting a new game of Shadow of the Tomb Raider under DX12 and got my first ever BSOD on Windows 10, with multiple drivers.

If the game's at fault, good luck with that, as there's virtually no chance of a patch coming. 99.99% of games don't get perpetually patched, including this massive AAA title. That's exactly why the driver should do the heavy lifting.

Aside from rare exceptions like Doom Eternal, low-level APIs are absolute garbage in practice, just like I was saying 18 months ago.
 

Kedas

Senior member
Dec 6, 2018
Not really interested in anyone's opinion of which looks better. Show me images and videos of them side by side.
Sure, that will come, but the opinion of someone who actually had to put energy into fine-tuning both during development, and who used and compared them, carries a lot of weight.

The difference may come down to user 'taste'.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
- I'm surprised by his honesty. His answer is a bit Pollyanna-ish (FSR is good here, DLSS is better there), but the fact that he gave Unity's DX12 pipeline a thumbs-down suggests he may be a straight shooter on this one.

Aside from rare exceptions like Doom Eternal, low-level APIs are absolute garbage in practice, just like I was saying 18 months ago.

Basically any id Tech 6 game works incredibly well with Vulkan. My relic of a rig plays new, great-looking id Tech 6 games faster than it runs worse-looking older games on DX11. Mantle was also a pretty solid low-level renderer in my experience.

So maybe it has more to do with DX12 being kinda shoddy, or with Shadow of the Tomb Raider being one of the earlier games with a hybrid renderer, and that's what resulted in the issues you experienced. Alternatively (and I know no one ever wants to hear this), it's possible your system is just manifesting instability in a narrow, specific workload associated with this game in DX12.
 
  • Like
Reactions: Tlh97 and Shmee

Mopetar

Diamond Member
Jan 31, 2011
The difference may come down to user 'taste'.

That's certainly a condensed way of saying that a person prefers "whichever brand of card the person has purchased, which is coincidentally the latest in a long line of purchases." :p

The worst part about these technologies is going to be the arguments over which one looks better, never mind the trolls who are going to have an easy time getting people riled up over something that can be argued endlessly due to how much subjectivity is involved.
 
  • Like
Reactions: blckgrffn

lobz

Platinum Member
Feb 10, 2017
I still remember being told in this very forum that all indie developers would get "automatic" performance boosts just by "flipping a switch" to DX12 once engines supported it.

About a month ago I tried starting a new game of Shadow of the Tomb Raider under DX12 and got my first ever BSOD on Windows 10, with multiple drivers.

If the game's at fault, good luck with that, as there's virtually no chance of a patch coming. 99.99% of games don't get perpetually patched, including this massive AAA title. That's exactly why the driver should do the heavy lifting.

Aside from rare exceptions like Doom Eternal, low-level APIs are absolute garbage in practice, just like I was saying 18 months ago.
Going by your own words, that is 100% the engine's fault, not the API's.
 

Mopetar

Diamond Member
Jan 31, 2011
Wasn't that really about the only game that actually did anything with Mantle (before it basically morphed into Vulkan) though? There may have been a few others that also used it, but BF4 was really the only one that I remember.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
Wasn't that really about the only game that actually did anything with Mantle (before it basically morphed into Vulkan) though? There may have been a few others that also used it, but BF4 was really the only one that I remember.

-Dragon Age: Inquisition is the Mantle game I played most. The performance boost on my aging Q9550 + HD 7950 system was shocking, to say the least: a solid 50% FPS boost. It was nuts.

I think EA was the biggest adopter back then; they incorporated a Mantle renderer into their Frostbite engine, so it applied to that entire generation of games.

-Edit: Correction, you're right: the list of games that were planned for Mantle was pretty long, but the list of games that actually ended up with Mantle is pretty short.

Looks like DX12 and Vulkan took the wind out from under Mantle's wings pretty hard and fast, so a lot of the titles in development dropped Mantle support.
 

DeathReborn

Platinum Member
Oct 11, 2005
-Dragon Age: Inquisition is the Mantle game I played most. The performance boost on my aging Q9550 + HD 7950 system was shocking, to say the least: a solid 50% FPS boost. It was nuts.

I think EA was the biggest adopter back then; they incorporated a Mantle renderer into their Frostbite engine, so it applied to that entire generation of games.

-Edit: Correction, you're right: the list of games that were planned for Mantle was pretty long, but the list of games that actually ended up with Mantle is pretty short.

Looks like DX12 and Vulkan took the wind out from under Mantle's wings pretty hard and fast, so a lot of the titles in development dropped Mantle support.

Wasn't the guy behind Frostbite the same guy who worked on Mantle itself?
 

blckgrffn

Diamond Member
May 1, 2003
That's certainly a condensed way of saying that a person prefers "whichever brand of card the person has purchased, which is coincidentally the latest in a long line of purchases." :p

The worst part about these technologies is going to be the arguments over which one looks better, never mind the trolls who are going to have an easy time getting people riled up over something that can be argued endlessly due to how much subjectivity is involved.

This comment is so true to my ears.

I remember super threads on AF quality, AA quality, all sorts of garbage like that over the last 20 years. Sometimes it was justified. More often it was just whacks at the hornets' nest. This latest round appears to be just another log for the ever-burning fire :D Honestly, that's the best possible outcome for all of us.
 
  • Like
Reactions: Tlh97 and Mopetar

Kuiva maa

Member
May 1, 2014
-Dragon Age: Inquisition is the Mantle game I played most. The performance boost on my aging Q9550 + HD 7950 system was shocking, to say the least: a solid 50% FPS boost. It was nuts.

I think EA was the biggest adopter back then; they incorporated a Mantle renderer into their Frostbite engine, so it applied to that entire generation of games.

-Edit: Correction, you're right: the list of games that were planned for Mantle was pretty long, but the list of games that actually ended up with Mantle is pretty short.

Looks like DX12 and Vulkan took the wind out from under Mantle's wings pretty hard and fast, so a lot of the titles in development dropped Mantle support.
Wasn't the guy behind Frostbite the same guy who worked on Mantle itself?

Indeed, Johan Andersson, then head honcho of Frostbite, was the driving force behind Mantle. He was extremely jaded with DX9/11, so he pitched the idea of a lower-level, more efficient API to Intel, Nvidia, and AMD. Only AMD reciprocated, and the rest is history. In a sense Mantle didn't vanish: Vulkan is its continuation, and DX12 started as a direct offshoot of Mantle, so its mission to revolutionize PC graphics APIs succeeded very quickly.

I still remember being told in this very forum that all indie developers would get "automatic" performance boosts just by "flipping a switch" to DX12 once engines supported it.

About a month ago I tried starting a new game of Shadow of the Tomb Raider under DX12 and got my first ever BSOD on Windows 10, with multiple drivers.

If the game's at fault, good luck with that, as there's virtually no chance of a patch coming. 99.99% of games don't get perpetually patched, including this massive AAA title. That's exactly why the driver should do the heavy lifting.

Aside from rare exceptions like Doom Eternal, low-level APIs are absolute garbage in practice, just like I was saying 18 months ago.

What I remember personally is everyone warning that lower-level APIs are much more work for the devs, precisely because they take abstraction layers away, giving more control but making the task harder for engineers. That's exactly what happened. DX12/Vulkan is now almost everywhere in AAA, a big success story (from RDR2 and World of Warcraft to CoD Warzone and AC Valhalla); a random BSOD in an old game doesn't say much. I recently loaded Sniper 2 and the DX9 path was so bugged with my 6800 XT that it was unplayable; DX11 was fine, though.
 

ThatBuzzkiller

Golden Member
Nov 14, 2014
D3D12 and Vulkan aren't always ideal abstractions in every case ...

D3D12 is never a good idea on mobile tile-based GPUs, because it's super common for games to do dozens of fullscreen passes, which kills most of the advantages of tile-based rendering. D3D12 is also a bad match for Intel GPUs, since their hardware doesn't truly support pointers (it's actually closer to bindless arrays), and the resource state API complicates resource compression as well, so it's harder to apply colour compression on Intel GPUs than it is to trigger DCC on AMD GPUs.

I don't know if this was an accident or not, but developers do "descriptor aliasing", which happens to be a memory/performance optimization on AMD; to other vendors it's probably another scary game bug that they have to work around with hacks in their drivers. Some games will also use CBV tables or bindless CBVs for the laughs, even though NV GPUs don't natively support that functionality, so there are driver hacks for this in some cases too. The Root Signature API is a disaster on Nvidia, and there are dumb pitfalls with the ExecuteIndirect API on AMD ...

Vulkan, by comparison, is not nearly as haphazardly designed as D3D12, but its window system integration sucks really hard compared to D3D12's. Vulkan can be described as a theoretical mixture of AMD and mobile hardware, which turns off most AAA developers, since they feel the renderpass/subpass API is unnecessary: those concepts only apply to mobile hardware. The only real tangible advantages it has going for it are the tight community feedback loop and the extension model ...
 

Borealis7

Platinum Member
Oct 19, 2006
Wow..."Mantle"...that's a name i haven't heard in YEARS. so many forum posts, so many arguments, and now who even remembers it?
and now we live in the post-Mantle/DX12 era, and some of us have high hopes for Vulkan (i play Path of Exile, Vulkan doesn't work very well there)
in some years we'll be in the post-vulkan/directX era and we'll argue about something else.
 

Mopetar

Diamond Member
Jan 31, 2011
AMD gave a lot of what they'd worked on with Mantle to the Khronos Group (the organization that oversees OpenGL, to which Vulkan is the successor), which used it to build Vulkan. It's not backwards compatible, but Mantle is pretty clearly part of the current era of graphics APIs. Frankly, it's good that AMD opened it up, because the last thing the industry needed was yet another proprietary API to develop for.
 

Krteq

Senior member
May 22, 2015
FSR implemented in Unity and UE4

  • Unity: AMD FidelityFX Super Resolution is available now in a special preview beta branch of Unity 2021.2 HDRP. You can learn more in the Unity section on our GPUOpen FSR page and see FSR in action in the Unity HDRP Spaceship demo video above showing that FSR delivers up to a 1.9x performance boost at 4K in “Performance” mode.

  • Unreal Engine 4: On GPUOpen we have already had patches that can be applied to various versions of Unreal Engine for a while, to add support for some of our other FidelityFX features. From today, we're excited to be able to add AMD FidelityFX Super Resolution to that list! Head over to the Unreal Engine patches page for more information. We've also updated the Unreal Engine Performance Guide to talk about UE4 upscaling as well, including the FSR upscaling we now provide.
AMD community - AMD FSR Updates: More Games, GPUOpen Release, Unity and Unreal Support
_________________________

More titles with FSR support
First up on July 16, AMD FidelityFX Super Resolution updates are coming for Arcadegeddon, and Necromunda: Hired Gun -- then next week, Edge Of Eternity and Resident Evil Village will be updated with FSR support.
 
  • Like
Reactions: Tlh97 and Mopetar

BFG10K

Lifer
Aug 14, 2000
Unity and Unreal Engine support is a big deal, as DLSS proponents keep harping on about exactly that. This time AMD has put extremely simple tech right in front of developers and done the heavy lifting for them, unlike their previous open-source gimmicks.

It'll be interesting to see how this pans out. I'm going to say with 60% confidence it'll displace DLSS and relegate it to a niche category.

Going by your own words, that is 100% the engine's fault, not the API's.
Who cares? The net result is the same: the game's unplayable under DX12. That makes DX12 a failure for that use case.

You can google scores of people who have the same problem, so it's not "isolated to my machine combo".
 
  • Like
Reactions: Tlh97

positivedoppler

Golden Member
Apr 30, 2012
Wow..."Mantle"...that's a name i haven't heard in YEARS. so many forum posts, so many arguments, and now who even remembers it?
and now we live in the post-Mantle/DX12 era, and some of us have high hopes for Vulkan (i play Path of Exile, Vulkan doesn't work very well there)
in some years we'll be in the post-vulkan/directX era and we'll argue about something else.
I think PoE needs a good CPU. Even my 5900X gets maxed out with a gazillion monsters on screen.
 

Dribble

Platinum Member
Aug 9, 2005
Unity and Unreal Engine support is a big deal, as DLSS proponents keep harping on about exactly that. This time AMD has put extremely simple tech right in front of developers and done the heavy lifting for them, unlike their previous open-source gimmicks.

It'll be interesting to see how this pans out. I'm going to say with 60% confidence it'll displace DLSS and relegate it to a niche category.


Who cares? The net result is the same: the game's unplayable under DX12. That makes DX12 a failure for that use case.

You can google scores of people who have the same problem, so it's not "isolated to my machine combo".
I think what's more likely is that now AMD have open-sourced it, they won't touch it again. It's not like FSR is new tech; it's just another attempt at an upscaler, standing on the shoulders of many others, that does some things well and some things not so well (e.g. it has no temporal component to reduce shimmer). Engines won't switch to FSR as their built-in upscaler, but because it's open source they will take a look and integrate any useful bits into their own upscaling algorithms. FSR as an AMD-maintained, shader-based upscaler will disappear.
DLSS will get more updates, although as the tech matures their number will drop.
At some point AMD will add AI hardware to their GPUs, and then we will get a true competitor to DLSS and the whole fanboy war will start again.
 

Justinus

Diamond Member
Oct 10, 2005
I think what's more likely is that now AMD have open-sourced it, they won't touch it again. It's not like FSR is new tech; it's just another attempt at an upscaler, standing on the shoulders of many others, that does some things well and some things not so well (e.g. it has no temporal component to reduce shimmer). Engines won't switch to FSR as their built-in upscaler, but because it's open source they will take a look and integrate any useful bits into their own upscaling algorithms. FSR as an AMD-maintained, shader-based upscaler will disappear.
DLSS will get more updates, although as the tech matures their number will drop.
At some point AMD will add AI hardware to their GPUs, and then we will get a true competitor to DLSS and the whole fanboy war will start again.

One recent FSR vs. DLSS comparison I saw said AMD was already working on FSR 2.0 and speculated that it might have a temporal element, a history buffer, motion vectors, or something along those lines.

Whether the claim that AMD is working on FSR 2.0 was itself speculation was not made clear.

If they are already working on a more advanced implementation, it makes sense that they won't do much, if any, maintenance on the currently released FSR. It seems to have enough knobs and switches for a developer to do a great implementation, like Necromunda, where it is on par with or occasionally better than DLSS, or a bad one, like Godfall, where the entire game looks like Vaseline was smeared on the camera.

It seems it all comes down to how much time the developer is willing to put into tuning it for their game. I'd wager it's still a fraction of the effort of implementing DLSS.
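
For a sense of what the main knob does: each FSR 1.0 quality mode is just a fixed per-axis scale factor (the factors below are the ones AMD publishes for FSR 1.0), so the render-resolution arithmetic is trivial. A quick Python sketch:

```python
# FSR 1.0 quality modes -> per-axis scale factors (AMD's published values).
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(out_w, out_h, mode):
    """Internal resolution the game renders at before FSR upscales to output."""
    s = FSR_MODES[mode]
    return round(out_w / s), round(out_h / s)

for mode in FSR_MODES:
    w, h = render_resolution(3840, 2160, mode)
    print(f"{mode:>13}: renders {w}x{h}, upscaled to 3840x2160")
# "Performance" renders 1920x1080, a quarter of the output pixels, which is
# where headline numbers like the ~1.9x 4K speedup in the Unity demo come from.
```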
 

Dribble

Platinum Member
Aug 9, 2005
One recent FSR vs. DLSS comparison I saw said AMD was already working on FSR 2.0 and speculated that it might have a temporal element, a history buffer, motion vectors, or something along those lines.

Whether the claim that AMD is working on FSR 2.0 was itself speculation was not made clear.

If they are already working on a more advanced implementation, it makes sense that they won't do much, if any, maintenance on the currently released FSR. It seems to have enough knobs and switches for a developer to do a great implementation, like Necromunda, where it is on par with or occasionally better than DLSS, or a bad one, like Godfall, where the entire game looks like Vaseline was smeared on the camera.

It seems it all comes down to how much time the developer is willing to put into tuning it for their game. I'd wager it's still a fraction of the effort of implementing DLSS.
It's a bit difficult to accurately compare FSR and DLSS, as right now FSR is two things:
1) an upscaler.
2) image sharpening (CAS).

Whereas DLSS is effectively only step 1): it doesn't do any sharpening. You can add sharpening to DLSS, but it's separate, and because it's separate you can tweak how much you want. If you want to compare them, you really need FSR without any sharpening applied.
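
To make the two-stage point concrete, here's a toy Python sketch of the FSR 1.0 shape: a plain spatial upscale followed by a contrast-adaptive sharpen. To be clear, this is not AMD's actual EASU/RCAS math, just an illustration of why the sharpening is a separate, tunable stage:

```python
import numpy as np

def upscale(img, scale):
    """Stage 1 stand-in: dumb nearest-neighbour upscale (FSR's real EASU pass
    is an edge-adaptive interpolation, far smarter than this)."""
    return np.repeat(np.repeat(img, scale, axis=0), scale, axis=1)

def adaptive_sharpen(img, strength=0.5):
    """Stage 2 stand-in: an unsharp mask whose strength backs off where local
    contrast is already high, which is the rough idea behind CAS."""
    pad = np.pad(img, 1, mode="edge")
    # 4-neighbour stack gives a cheap local min/max contrast estimate.
    n = np.stack([pad[:-2, 1:-1], pad[2:, 1:-1], pad[1:-1, :-2], pad[1:-1, 2:]])
    contrast = n.max(axis=0) - n.min(axis=0)
    w = strength * (1.0 - contrast)        # sharpen less on high-contrast pixels
    return np.clip(img + w * (img - n.mean(axis=0)), 0.0, 1.0)

frame = np.random.rand(540, 960)           # grayscale stand-in for a 540p render
out = adaptive_sharpen(upscale(frame, 2))  # ~1080p; `strength` is the exposed knob
```

The point of the toy is just that the sharpening strength lives entirely in stage 2, which is why you can run DLSS unsharpened while FSR always arrives with CAS applied on top.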

The alternative would be to compare after applying sharpening to DLSS; however, no one does that, as the devs know that non-sharpened DLSS looks more like the source image than it would with sharpening applied. FSR is like going into a TV store: all the TVs have the saturation and contrast turned up to 10 to stand out from each other, completely unrealistic, but it's how you sell TVs. FSR is similar in that it has CAS dialled right up to stand out, no matter how unrealistic it makes the image.

The other major difference, like you say, is that DLSS is temporal, which means that on top of having more information to work from, it can reduce shimmer and texture crawl in a way no non-temporal solution can; you just can't see that in still images. So in addition to the above, AMD would also really need a temporal version of the upscaler. I suspect that is far more work than AMD will want to do (it probably took the UE4 devs a lot of effort to build their latest temporal upscaler, for example), particularly as AMD's eventual aim has to be a DLSS equivalent, which is a completely different technology.
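
For anyone wondering what "temporal" buys you mechanically: the core loop is reprojecting last frame's accumulated image through the motion vectors and blending in a little of the new frame each time, so aliasing noise averages out across frames. A toy Python sketch (not DLSS or any shipping TAA; the names and the noise model are invented for illustration):

```python
import numpy as np

def reproject(history, motion):
    """Fetch last frame's colour from where each pixel came from.
    motion[y, x] = (dy, dx) offset in pixels, current -> previous frame."""
    h, w = history.shape
    ys, xs = np.indices((h, w))
    py = np.clip(ys + motion[..., 0], 0, h - 1)
    px = np.clip(xs + motion[..., 1], 0, w - 1)
    return history[py, px]

def temporal_accumulate(history, current, motion, alpha=0.1):
    """Exponential moving average: mostly reprojected history, a little new frame."""
    return (1.0 - alpha) * reproject(history, motion) + alpha * current

# Static scene (zero motion); per-frame noise stands in for shimmer/aliasing.
truth = np.linspace(0.0, 1.0, 480)[None, :].repeat(270, axis=0)
motion = np.zeros((270, 480, 2), dtype=int)
history = truth + 0.2 * np.random.randn(270, 480)
for _ in range(30):
    frame = truth + 0.2 * np.random.randn(270, 480)   # fresh noisy sample
    history = temporal_accumulate(history, frame, motion)
print("mean error:", float(np.abs(history - truth).mean()))  # ~0.04 vs ~0.16 raw
```

The flip side, as Kedas guessed up-thread, is that the same history that kills shimmer turns into ghosting the moment the motion vectors are wrong, which is exactly the artifact class a temporal FSR 2.0 would have to fight.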