Poll: Do you care about ray tracing / upscaling?


Do you care about ray tracing / upscaling?


  • Total voters: 237

DAPUNISHER
Super Moderator, CPU Forum Mod and Elite Member
Aug 22, 2001
So do you think one should use XeSS in all cases, or does it depend?
Game dependent. The most annoying thing I see with the DP4a version of XeSS is ghosting. With FSR, image stability/sizzle/shimmering and ghosting can be annoying as hell at lower resolutions. In most of the games where I have tried both, e.g. the Spidey games, Cyberpunk, Hogwarts, AC: Mirage, XeSS IQ issues are less distracting. In the Spidey games Insomniac's IGTI is usually better than both for overall IQ. All my personal preferences, of course.

Hardware XMX XeSS looks and runs better on Arc than the software fallback does on other IHVs' cards, of course.
I haven't tried XeSS. I didn't know it worked on AMD cards; I thought it was Intel-only. TBH, I think FSR on the 7600 is worthless: very little increase in performance and horrible image quality, no matter the setting.
Sounds like you are fully CPU limited then. FSR balanced vs native should be a significant fps boost at the expense of IQ.
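For context on the expected gain: FSR's documented per-axis scale factors mean Balanced renders well under half the pixels of native, so if the frame rate barely moves, the bottleneck is elsewhere (i.e. the CPU). A quick illustrative sketch (the factors are AMD's published FSR 2 values; the helper itself is mine):

```python
# Internal render resolution implied by FSR 2's documented per-axis scale factors.
FSR_SCALE = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0, "Ultra Performance": 3.0}

def render_resolution(display_w: int, display_h: int, mode: str) -> tuple[int, int]:
    s = FSR_SCALE[mode]
    return round(display_w / s), round(display_h / s)

for mode in FSR_SCALE:
    w, h = render_resolution(2560, 1440, mode)
    print(f"{mode}: {w}x{h}")  # Balanced at 1440p -> ~1506x847, about 35% of native pixels
```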
From what I understand it is much better on the current AMD architecture.
FSR 4 is better than anything but DLSS transformer.
 

Thunder 57
Diamond Member
Aug 19, 2007
It's a different architecture. Traditional GPUs are raster first and RT tacked on. This GPU is RT first and raster as an afterthought just for compatibility.

Does that explain the piss-poor FP32 performance? This doesn't look like a card designed for the average consumer.
 
Jul 27, 2020
Does that explain the piss-poor FP32 performance? This doesn't look like a card designed for the average consumer.
Depends on how they perform. If they provide decent DX12/Vulkan support on day one, I can see enthusiasts paying $3000 for the 64GB card. Its success could pave the way for cheaper 16GB and 32GB cards with higher PT performance than the 5090. The 128GB card at $5000 would be better value than Nvidia's 96GB PRO card for workstation/AI use.
 

marees
Golden Member
Apr 28, 2024

Current AMD research

We are actively researching neural techniques for Monte Carlo denoising, with the goal of moving towards real-time path tracing on AMD RDNA™ 2 or newer GPUs. Our research has the following aims:

  • Reconstruct pixels of outstanding spatial and temporal quality, with fine details, from extremely noisy images rendered at 1 sample per pixel.
  • Use minimal input by taking a noisy color image instead of separate noisy diffuse and specular signals.
  • Handle noise from all lighting effects with a single denoiser instead of multiple denoisers for different effects.
  • Support both denoising-only and denoising/upscaling modes from a single neural network for wider use cases.
  • Deliver highly optimized performance for real-time path tracing at 4K resolution.
With these goals, we are researching a Neural Supersampling and Denoising technique that uses a single neural network to generate high-quality denoised and supersampled images at a display resolution higher than the render resolution for real-time path tracing.


Our technique can replace the multiple denoisers a rendering engine uses for different lighting effects by removing all noise in a single pass, at low resolution. Depending on the use case, a denoising-only output can also be produced; this is identical to 1x upscaling, obtained by skipping the upscale filtering.
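To make the single-network idea concrete, here is a minimal PyTorch sketch of a joint denoising/upscaling model: one shared encoder feeds both a denoise-only head (the 1x mode) and a pixel-shuffle upscaling head. The architecture, channel counts, and choice of auxiliary inputs are illustrative assumptions on my part, not AMD's actual network:

```python
import torch
import torch.nn as nn

class JointDenoiseUpscale(nn.Module):
    def __init__(self, scale: int = 2, feat: int = 32):
        super().__init__()
        # Input: noisy 1-spp color (3) + albedo (3) + normal (3) + depth (1) = 10 channels.
        self.encoder = nn.Sequential(
            nn.Conv2d(10, feat, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat, feat, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.denoise_head = nn.Conv2d(feat, 3, 3, padding=1)
        # Pixel-shuffle head produces the upscaled output from the same shared features.
        self.upscale_head = nn.Conv2d(feat, 3 * scale * scale, 3, padding=1)
        self.shuffle = nn.PixelShuffle(scale)

    def forward(self, noisy, albedo, normal, depth, upscale: bool = True):
        x = torch.cat([noisy, albedo, normal, depth], dim=1)
        f = self.encoder(x)
        if not upscale:  # denoising-only mode: 1x output, upscale filtering skipped
            return self.denoise_head(f)
        return self.shuffle(self.upscale_head(f))

# Example: 1-spp render at 1080p, joint denoised + upscaled output at 4K (2x per axis).
net = JointDenoiseUpscale(scale=2)
noisy  = torch.rand(1, 3, 1080, 1920)
albedo = torch.rand(1, 3, 1080, 1920)
normal = torch.rand(1, 3, 1080, 1920)
depth  = torch.rand(1, 1, 1080, 1920)
out = net(noisy, albedo, normal, depth)  # -> shape (1, 3, 2160, 3840)
```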


https://gpuopen.com/learn/neural_su...e=twitter&utm_medium=social&utm_campaign=nssd


Reconstructing pixels in noisy rendering

Denoising is one of the techniques that address the high number of samples required in Monte Carlo path tracing. It reconstructs high-quality pixels from a noisy image rendered with few samples per pixel. Often, auxiliary buffers available in deferred rendering, such as albedo, normal, roughness, and depth, are used as guiding information. Because it reconstructs high-quality pixels from a noisy image in far less time than full path tracing would take, denoising has become an essential component of real-time path tracing.


Neural Denoising

Neural denoisers [3,4,5,6,7,8] use a deep neural network, trained on a large dataset, to predict denoising filter weights. They have achieved remarkable progress in denoising quality compared to hand-crafted analytical denoising filters [2]. Depending on the complexity of the network and how it cooperates with other optimization techniques, neural denoisers are attracting growing attention for real-time Monte Carlo path tracing.
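The kernel-prediction idea these denoisers share can be sketched independently of any particular network: the network emits per-pixel filter weights, which are then applied over each pixel's neighborhood in the noisy image. A minimal PyTorch sketch, with the network itself omitted and `weights` standing in for its output (names and the 3x3 footprint are illustrative):

```python
import torch
import torch.nn.functional as F

def apply_predicted_kernel(noisy: torch.Tensor, weights: torch.Tensor, k: int = 3) -> torch.Tensor:
    """noisy: (B, 3, H, W); weights: (B, k*k, H, W), one kernel per pixel."""
    b, c, h, w = noisy.shape
    # Gather each pixel's k x k neighborhood: (B, C*k*k, H*W) -> (B, C, k*k, H, W).
    patches = F.unfold(noisy, kernel_size=k, padding=k // 2).view(b, c, k * k, h, w)
    # Normalize the predicted weights per pixel, broadcast across color channels.
    weights = torch.softmax(weights, dim=1).unsqueeze(1)  # (B, 1, k*k, H, W)
    return (patches * weights).sum(dim=2)                 # weighted average per pixel

noisy = torch.rand(2, 3, 64, 64)
weights = torch.rand(2, 9, 64, 64)  # in practice this comes from the neural network
denoised = apply_predicted_kernel(noisy, weights)  # -> (2, 3, 64, 64)
```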

A unified denoising and supersampling network [7] takes noisy images rendered at low resolution with low samples per pixel and generates a denoised as well as upscaled image at the target display resolution. Such joint denoising and supersampling with a single neural network has the advantage of sharing learned parameters in the feature space to efficiently predict both denoising filter weights and upscaling filter weights. Most of the performance gain comes from rendering at low resolution with low samples per pixel, which leaves a larger time budget for the neural denoiser to reconstruct high-quality pixels.
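A back-of-the-envelope on where that time budget comes from (illustrative numbers, not measurements):

```python
# Rendering at half resolution per axis traces 1/4 the pixels, and 1 spp
# instead of, say, 4 spp cuts ray count another 4x.
display = (3840, 2160)
render = (1920, 1080)                      # 2x upscale per axis
pixel_ratio = (display[0] * display[1]) / (render[0] * render[1])  # 4.0
spp_ratio = 4 / 1                          # hypothetical 4 spp native vs 1 spp denoised
print(f"Path-tracing work reduced ~{pixel_ratio * spp_ratio:.0f}x")  # ~16x
# The frame time saved is the budget available for the neural network pass.
```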
 

DAPUNISHER
Super Moderator, CPU Forum Mod and Elite Member
Aug 22, 2001
Oh boy, that’s rich coming from a YouTuber.

Anyway, FG is a good option to have. No need for endless videos about it to milk views.
Yeah, the kid's clickbait game is on point. And I agree, there was no new info to be gleaned. It is also good to see Nvidia improving backwards compatibility.
 
Jul 27, 2020

Good video about the Nvidia RTX SDK problem. Games using it will have crap performance on AMD cards. But the future is bright with Unreal Engine's Lumen.
 

marees
Golden Member
Apr 28, 2024

Good video about the Nvidia RTX SDK problem. Games using it will have crap performance on AMD cards. But the future is bright with Unreal Engine's Lumen.
Nvidia's branch of RT

Another exhibit

If you've been following GPU news in recent days, you may have come across findings that some games based on Unreal Engine 4 (UE4) suffer from heavy stuttering on AMD's RX 9000-series cards when ray-tracing (RT) effects are enabled. More details have now come to light on the underlying reason: the use of an NVIDIA-optimized branch of UE4.

The initial story prompted much finger-pointing in various directions, not least at AMD's driver team, but the facts are reasonably simple. It all started with a video from Digital Foundry, whose team took a look at a handful of games that exhibited the issue on an RX 9070 XT card. At the time, the community's running theory was that AMD had an unfixed driver bug that could hypothetically reveal itself in games not tested for the video.

The Youtube channel Tech Yes City (TYC) then did some additional digging by testing Hellblade: Senhua's Sacrifice and The Ascent in depth. Both games exhibited serious stuttering on the RX 9070, up to several seconds long in Hellblade. As the investigation went on, TYC found out that these two games, although built on Unreal Engine 4, used a NVIDIA-optimized fork of the engine for their RT effects, called NvRTX. This is in contrast to the standard implementation that used the vendor-agnostic DirectX Raytracing (DXR).