Discussion: AMD Gaming Super Resolution (GSR)


DisEnchantment

Golden Member
Mar 3, 2017
1,590
5,722
136
A new patent came up today for AMD's FSR:

20210150669
GAMING SUPER RESOLUTION

Abstract
A processing device is provided which includes memory and a processor. The processor is configured to receive an input image having a first resolution, generate linear down-sampled versions of the input image by down-sampling the input image via a linear upscaling network and generate non-linear down-sampled versions of the input image by down-sampling the input image via a non-linear upscaling network. The processor is also configured to convert the down-sampled versions of the input image into pixels of an output image having a second resolution higher than the first resolution and provide the output image for display.


[0008] Conventional super-resolution techniques include a variety of conventional neural network architectures which perform super-resolution by upscaling images using linear functions. These linear functions do not, however, utilize the advantages of other types of information (e.g., non-linear information), which typically results in blurry and/or corrupted images. In addition, conventional neural network architectures are generalizable and trained to operate without significant knowledge of an immediate problem. Other conventional super-resolution techniques use deep learning approaches. The deep learning techniques do not, however, incorporate important aspects of the original image, resulting in lost color and lost detail information.

[0009] The present application provides devices and methods for efficiently super-resolving an image, which preserves the original information of the image while upscaling the image and improving fidelity. The devices and methods utilize linear and non-linear up-sampling in a wholly learned environment.

[0010] The devices and methods include a gaming super resolution (GSR) network architecture which efficiently super resolves images in a convolutional and generalizable manner. The GSR architecture employs image condensation and a combination of linear and nonlinear operations to accelerate the process to gaming viable levels. GSR renders images at a low quality scale to create high quality image approximations and achieve high framerates. High quality reference images are approximated by applying a specific configuration of convolutional layers and activation functions to a low quality reference image. The GSR network approximates more generalized problems more accurately and efficiently than conventional super resolution techniques by training the weights of the convolutional layers with a corpus of images.

[0011] A processing device is provided which includes memory and a processor. The processor is configured to receive an input image having a first resolution, generate linear down-sampled versions of the input image by down-sampling the input image via a linear upscaling network and generate non-linear down-sampled versions of the input image by down-sampling the input image via a non-linear upscaling network. The processor is also configured to convert the down-sampled versions of the input image into pixels of an output image having a second resolution higher than the first resolution and provide the output image for display.

[0012] A processing device is provided which includes memory and a processor configured to receive an input image having a first resolution. The processor is also configured to generate a plurality of non-linear down-sampled versions of the input image via a non-linear upscaling network and generate one or more linear down-sampled versions of the input image via a linear upscaling network. The processor is also configured to combine the non-linear down-sampled versions and the one or more linear down-sampled versions to provide a plurality of combined down-sampled versions. The processor is also configured to convert the combined down-sampled versions of the input image into pixels of an output image having a second resolution higher than the first resolution by assigning, to each of a plurality of pixel blocks of the output image, a co-located pixel in each of the combined down-sampled versions and provide the output image for display.

[0013] A super resolution processing method is provided which improves processing performance. The method includes receiving an input image having a first resolution, generating linear down-sampled versions of the input image by down-sampling the input image via a linear upscaling network and generating non-linear down-sampled versions of the input image by down-sampling the input image via a non-linear upscaling network. The method also includes converting the down-sampled versions of the input image into pixels of an output image having a second resolution higher than the first resolution and providing the output image for display.

It uses inference for upscaling. As with all ML models, how you assemble the layers, what kind of parameters you choose, which activation functions you choose, etc., matters a lot, and the difference can be night and day in accuracy, performance, and memory.
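Reading the claims literally, the pipeline combines a linear branch and a non-linear branch into r*r low-resolution channel maps, then assigns a co-located pixel from each map to every r x r block of the output (a depth-to-space / pixel-shuffle step). Below is a toy NumPy sketch of that idea; it is my reconstruction from the claim language, not AMD's actual network, and the filter sizes, random weights, and branch structure are all assumptions.

```python
import numpy as np

def depth_to_space(x, r):
    """Rearrange (r*r, H, W) channel maps into one (H*r, W*r) image:
    each r x r output block takes one co-located pixel from each map."""
    c, h, w = x.shape
    assert c == r * r
    return x.reshape(r, r, h, w).transpose(2, 0, 3, 1).reshape(h * r, w * r)

def filter3x3(x, k):
    """'Same' 3x3 filtering of a 2D image with zero padding."""
    p = np.pad(x, 1)
    out = np.zeros_like(x)
    for i in range(3):
        for j in range(3):
            out += k[i, j] * p[i:i + x.shape[0], j:j + x.shape[1]]
    return out

def gsr_like_upscale(img, r=2, rng=np.random.default_rng(0)):
    """Toy GSR-style upscale: a linear branch and a non-linear
    (filter + ReLU) branch each produce r*r feature maps at input
    resolution, which are summed and pixel-shuffled to r x the size."""
    n = r * r
    lin_k = rng.normal(size=(n, 3, 3)) * 0.1
    nl_k = rng.normal(size=(n, 3, 3)) * 0.1
    linear = np.stack([filter3x3(img, k) for k in lin_k])
    nonlin = np.stack([np.maximum(filter3x3(img, k), 0) for k in nl_k])
    return depth_to_space(linear + nonlin, r)

lowres = np.random.default_rng(1).random((4, 4))
highres = gsr_like_upscale(lowres, r=2)
print(highres.shape)  # (8, 8)
```

A real network would learn the filter weights from a corpus of images, as paragraph [0010] describes; here they are random, so the output only demonstrates shapes and data flow.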

 

Gideon

Golden Member
Nov 27, 2007
1,608
3,573
136
Ironic to think that this narrative was brought by people who considered the Far Cry RT remake and the bug-fest CP2077 the best form of artistic expression of RT tech, while belittling PS5/Insomniac's achievements in R&C
While I agree DF, or at least Alex, seems very Nvidia-favored and always glass-half-empty on the AMD side, how exactly did they belittle R&C? IMO they praised it quite a bit, and they only spoke well of CP2077 at its release, which was before there were decent RT games on PS5.

And regarding Cyberpunk: despite the bugs, and despite being a meh game compared to something like The Witcher 3, CP2077 did look very, very good at times. The city really clicks in a way no other game has for me (art-wise). So all in all: a bad game, a passable story, but artistically excellent at times.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,274
19,922
146
I can't agree more.
With the UE5 demo as well, what was hilarious to me was that Alex had a tough time acknowledging the 6800 XT is faster than NV's best in UE5 demos. They even went on to ridicule it by putting up some meme.
The other guy was Richard, whose flailing arms distracted me more than an expressive Italian waving his hands about trying to get his point across.
The rest of the guys are OK.
I enjoy John's retro stuff. Alex is part of the Nvidia Marketing dept. human caterpillar. Richard is all over the place, but he likes money.
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
Has anyone compared it to the hardware GPU upscaler that AMD GPUs have?

For example, with APUs: I saw FSR tests on APUs that let games run at 1080p with better fps, but I think that is just a bad idea; I have yet to find a tech site that has any idea of how to run games on an APU.
The way to do it: if you have a 1080p monitor, run the game at 900p fullscreen with GPU upscaling enabled in the Radeon settings. And guess what? This works for EVERY game, and even for the Windows desktop if you set it to a resolution below native.
This is how I managed to survive for a year with a 2200G, running games at 720p/900p but NEVER at 1080p, because you are always better off at 900p with higher quality settings than at native 1080p.

I also used this when I had an RX 570 and a 4K monitor... I ran games at 1080p and 2K (if the game could run OK) fullscreen with GPU upscaling, never an issue because the monitor was always getting a 4K input.

All this without artifacts or per-game support; the only catch was having to run fullscreen. I really want to see a comparison with FSR.
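The appeal of 900p here is mostly arithmetic: it pushes roughly 30% fewer pixels than 1080p, which is the headroom this trick buys. A quick sketch of the pixel counts:

```python
# Render workload for common 16:9 resolutions, relative to 1080p.
RES = {"720p": (1280, 720), "900p": (1600, 900), "1080p": (1920, 1080)}

base = RES["1080p"][0] * RES["1080p"][1]
for name, (w, h) in RES.items():
    share = w * h / base
    print(f"{name}: {w * h:>9,} px ({share:.0%} of 1080p)")
# 900p pushes ~69% of the pixels of 1080p; 720p pushes ~44%.
```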
 

coercitiv

Diamond Member
Jan 24, 2014
6,151
11,686
136
Has anyone compared it to the hardware GPU upscaler that AMD GPUs have?
Yes, Hardware Unboxed did the comparison: it beats GPU upscaling, it beats upscaling + real-time sharpening, and can only be partially matched through sharpening in Adobe tools (though FSR's sharpening is adaptive, while Adobe's static sharpening has issues in some areas).

I also used this for when i had a RX570 and a 4K monitor... i ran games at 1080p and 2K (if the game could run OK) at fullscreen with gpu upscaling, never a issue because the monitor was always getting a 4K input.
And now you would have the option to run FSR Performance to upscale 1080p to 4K, and FSR Quality to upscale 1440p.
 

coercitiv

Diamond Member
Jan 24, 2014
6,151
11,686
136
I give to you FSR "Uber" Quality mode:

[attached image: fsr-uber.jpg]

1. Enable Virtual Super Resolution in the AMD driver.
2. Set in-game resolution to 2X the native resolution.
3. Enable FSR Performance mode to render the game at half the resolution (effectively going back to native resolution).
4. Observe what FSR does with native resolution content.

The crop above is from a 1440p render. Native is to the left, VSR(5120x2880)+FSR Performance to the right.
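The round-trip in the steps above can be checked against the per-axis scale factors AMD published for FSR 1.0 (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x). A small sketch of how output and render resolutions relate (the helper function is mine):

```python
# FSR 1.0 per-axis scale factors (output resolution / render resolution).
FSR_SCALE = {"Ultra Quality": 1.3, "Quality": 1.5,
             "Balanced": 1.7, "Performance": 2.0}

def render_res(out_w, out_h, mode):
    """Render resolution FSR uses for a given output resolution and mode."""
    s = FSR_SCALE[mode]
    return round(out_w / s), round(out_h / s)

# Normal use: 4K output in Performance mode -> 1080p render.
print(render_res(3840, 2160, "Performance"))  # (1920, 1080)

# "Uber Quality": VSR doubles a 1440p display to 5120x2880, then FSR
# Performance renders at half of that -> back to native 2560x1440.
print(render_res(2560 * 2, 1440 * 2, "Performance"))  # (2560, 1440)
```

The VSR doubling and the FSR Performance halving cancel out, which is why the game still renders at native resolution and the cost is only the upscale pass itself.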
 

Kepler_L2

Senior member
Sep 6, 2020
308
977
106
I give to you FSR "Uber" Quality mode:


1. Enable Virtual Super Resolution in the AMD driver.
2. Set in-game resolution to 2X the native resolution.
3. Enable FSR Performance mode to render the game at half the resolution (effectively going back to native resolution).
4. Observe what FSR does with native resolution content.

The crop above is from a 1440p render. Native is to the left, VSR(5120x2880)+FSR Performance to the right.
Interesting, what's the performance loss from this?
 

Timorous

Golden Member
Oct 27, 2008
1,532
2,535
136
Interesting, what's the performance loss from this?

Someone on Reddit tried this with DSR and said the loss was minimal. They also tested other combinations and found there were ways to boost performance and get a better-than-native-1080p image at the same time. Link
 

coercitiv

Diamond Member
Jan 24, 2014
6,151
11,686
136
Interesting, what's the performance loss from this?
Typical of what you've seen in reviews; just by looking at the FPS counter, I'd say 10%.

They also tested other combinations and found there were ways to boost performance and get a better-than-native-1080p image at the same time.
That's definitely not happening. FSR has a cost, whether in performance or IQ.
 

biostud

Lifer
Feb 27, 2003
18,193
4,674
136

As for a fully fledged desktop, I'm glad to see FSR isn't just the complete turd many of us feared it would be, but I stick by the same statement I made regarding DLSS: this is great for extending the life of an aging video card, but I desperately hope it does not become a crutch used to justify poor gen-to-gen native-res performance increases going forward.

Neither of these features should be necessary on a new top-end card gaming at "common" monitor resolutions (up to 4K).

With this technology you can add features to a game a generation of video cards earlier than would be possible if it did not exist.
 

Timorous

Golden Member
Oct 27, 2008
1,532
2,535
136
Typical of what you've seen in reviews; just by looking at the FPS counter, I'd say 10%.


That's definitely not happening. FSR has a cost, whether in performance or IQ.

They combined FSR and DSR, but they did not use 2x DSR. FSR Balanced* + DSR 1.5x seemed to provide good results.

I did provide the link so check it out yourself. Link

EDIT: *I just double-checked, and it was FSR Quality and FSR Ultra Quality.
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
Yes, Hardware Unboxed did the comparison: it beats GPU upscaling, it beats upscaling + real-time sharpening, and can only be partially matched through sharpening in Adobe tools (though FSR's sharpening is adaptive, while Adobe's static sharpening has issues in some areas).


And now you would have the option to run FSR Performance to upscale 1080p to 4K, and FSR Quality to upscale 1440p.

I just saw it; the difference is smaller than I expected, at least in that one game. The good thing is that FSR is better than that, and it can work in borderless mode, while traditional GPU upscaling can be used in games that do not support FSR.
 

Gideon

Golden Member
Nov 27, 2007
1,608
3,573
136
I'll try the Riftbreaker demo out when I get the time (hopefully tonight) on my RX 6800, with ray tracing and everything maxed, at 2880p Balanced mode (downscaled to 1440p) compared to native 1440p.

Overall it seems very impressive as far as Quality and Ultra Quality are concerned (the fact that the performance allows running Ultra Quality vs DLSS Quality at similar FPS is really nice).

It is a bit of a disappointment that 1080p -> 4K is a blur-fest, while TSR and other temporal algorithms are still very good at it.

Still an excellent result overall for a first version. I still do hope we eventually get a temporal one as well, but even in its current implementation it can be very useful. It just needs wider game support.
 

TestKing123

Senior member
Sep 9, 2007
204
15
81
This is still a frame-by-frame upscaler and thus really won't be much better than standard upscalers that use single-frame analysis. Its weaknesses will also be hidden by games with varied contrast and lower-resolution assets, and so far the FSR-supported games aren't a good representation of this. Some reviews have noticed this; FSR will likely not make a good showing in some of the other AAA games with higher-quality assets, which is why we didn't see it enabled in those games, just basically Godfall, which is already an AMD-sponsored showcase. I still think AMD needs an AI-based solution to compete with DLSS 2.0+; this just seems like a stopgap and not really useful unless you run specifically at 4K in Ultra Quality mode. Lower resolutions and lower settings just make it worthless to turn on.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,274
19,922
146
This is still a frame-by-frame upscaler and thus really won't be much better than standard upscalers that use single-frame analysis. Its weaknesses will also be hidden by games with varied contrast and lower-resolution assets, and so far the FSR-supported games aren't a good representation of this. Some reviews have noticed this; FSR will likely not make a good showing in some of the other AAA games with higher-quality assets, which is why we didn't see it enabled in those games, just basically Godfall, which is already an AMD-sponsored showcase. I still think AMD needs an AI-based solution to compete with DLSS 2.0+; this just seems like a stopgap and not really useful unless you run specifically at 4K in Ultra Quality mode. Lower resolutions and lower settings just make it worthless to turn on.
Setting aside your speculation, I will share my thoughts on the bold text.

DLSS only works on RTX cards. FSR works on cards from both companies, going back as far as five years. That is not competition; competition is when it is your product's features vs. their product's features. This is a free tool for everyone; even Intel is going to work with it.

It is weird to me to read and watch any negativity about FSR, when from my perspective it shines a bright light on how greedy Nvidia is. G-Sync (until it became clear few were willing to pay the markup for it) and DLSS are reserved for those willing and able to pay for them. FreeSync and FSR are available to all. One company's offerings benefit the gaming community as a whole, even console players; the other attempts to stratify it.

Yes, it has some major limitations at the moment, but I appreciate what AMD is doing. These are steps in the right direction, i.e. inclusive, not exclusive.
 

Makaveli

Diamond Member
Feb 8, 2002
4,715
1,049
136
All upscalers (of which DLSS is also one) are stopgaps to actual native resolutions.

Yup, I'm old school and prefer native to DLSS or FSR.

And I will always choose native over upscalers. If that means buying new hardware to meet the resolution target, then I upgrade.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,274
19,922
146
Yup, I'm old school and prefer native to DLSS or FSR.

And I will always choose native over upscalers. If that means buying new hardware to meet the resolution target, then I upgrade.
That is what tech forums are now: us old-schoolers who are financially established with disposable income. Reddit is where everyone else is? And we do not represent the majority. DLSS is not that useful IMO, because it is reserved for what in most countries is expensive hardware, even before the perfect storm hit and put prices well out of reach for many.

Unfortunately, FSR doesn't do anything for those who need it most either. 1080p is where most still game, so until/unless FSR can give a good uplift at that res, it is too limited as well. The kicker to me is that it will be of great use to 4K TV couch gamers, which makes it seem obvious this is a "console first" tech.
 

Hitman928

Diamond Member
Apr 15, 2012
5,179
7,630
136
That is what tech forums are now: us old-schoolers who are financially established with disposable income. Reddit is where everyone else is? And we do not represent the majority. DLSS is not that useful IMO, because it is reserved for what in most countries is expensive hardware, even before the perfect storm hit and put prices well out of reach for many.

Unfortunately, FSR doesn't do anything for those who need it most either. 1080p is where most still game, so until/unless FSR can give a good uplift at that res, it is too limited as well. The kicker to me is that it will be of great use to 4K TV couch gamers, which makes it seem obvious this is a "console first" tech.

From Gamers Nexus, it seems FSR could still be useful for the low end by improving image quality for a small performance hit: set a 2x higher display resolution, then use FSR's Performance mode to bring the render resolution back down to the "original" resolution. The end result was a ~10% performance hit compared to native rendering at the "original" resolution, for better quality than native rendering could achieve. I do hope they can continue to improve it and make it even more valuable, especially at lower resolutions, but as others have said, it is a solid first step and will probably see a much faster adoption rate than DLSS.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,718
7,015
136
Setting aside your speculation, I will share my thoughts on the bold text.

DLSS only works on RTX cards. FSR works on cards from both companies, going back as far as five years. That is not competition; competition is when it is your product's features vs. their product's features. This is a free tool for everyone; even Intel is going to work with it.

...

Yes, it has some major limitations at the moment, but I appreciate what AMD is doing. These are steps in the right direction, i.e. inclusive, not exclusive.

- To your first point: I fully expect DLSS 3.0 to be able to run on shaders, with a possible locked "ultra quality/performance" preset for cards with Tensor cores (as the Tensor cores supposedly process whatever algorithms faster than shaders). It would be crazy for NV not to at this point; they've gotten a three-year competitor-free return on the tech, and they know they'll have to go after FSR hard and fast to stop it from taking too deep a root.

To your second point: I fully expect FSR 2.0 to look more like Unreal Engine's TAAU, incorporating temporal data in addition to what it currently does to output a higher-quality image. That's really the one big gaping hole in the tech at the moment, and fixing it would help a lot of engines level the playing field with Unreal Engine on this front.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,274
19,922
146
- To your first point: I fully expect DLSS 3.0 to be able to run on shaders, with a possible locked "ultra quality/performance" preset for cards with Tensor cores (as the Tensor cores supposedly process whatever algorithms faster than shaders). It would be crazy for NV not to at this point; they've gotten a three-year competitor-free return on the tech, and they know they'll have to go after FSR hard and fast to stop it from taking too deep a root.
They will not stop FSR. AMD powers both Xbox and PlayStation, and any multiplatform game is going to use it because of that. Nvidia will probably pay to have their tech used exclusively in some PC ports, but that won't stop the momentum. The most likely scenario is that it plays out like the adaptive-sync battle: the free solution dominates, but Nvidia supports it or a similar solution, as you have pointed out, while keeping a better exclusive version with a price tag.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,274
19,922
146
I do hope they can continue to improve it to make it even more valuable, especially at lower resolutions, but as others have said, it is a solid first step and will probably have a much faster adoption rate than DLSS.
This is my position too.
 

coercitiv

Diamond Member
Jan 24, 2014
6,151
11,686
136
Yup, I'm old school and prefer native to DLSS or FSR.

And I will always choose native over upscalers.
I think Nvidia managed to skew people's perception of upscalers, since they needed DLSS to be a crutch for RTX performance.

There's nothing stopping us from using native rendering in conjunction with an upscaler such as FSR to get better IQ. The performance hit is worth it.