Poll: Do you care about ray tracing / upscaling?


Total voters: 243

Ranulf

Platinum Member
Jul 18, 2001
2,906
2,575
136
No Man's Sky has a big update with interesting GPU rendering upgrades: DLSS 4, PSSR and XeSS 2. No FSR though... must be a conspiracy.

Via Steam forum page:
"Rendering

- Added support for NVIDIA DLSS4. Deep Learning Super Sampling is a revolutionary suite of neural rendering technologies that uses AI to boost FPS, reduce latency, and improve image quality.

- Added support for PlayStation® Spectral Super Resolution (PSSR), improving image clarity using AI-enhanced resolution, for ultra-high definition and incredible detail.

- Added support for Intel Xe Super Sampling (XeSS) 2, which uses machine learning to deliver higher performance with exceptional image quality."

Dev site with an fps comparison for Nvidia towards the bottom:

45fps to 84fps and a slider showing the "difference"

 

marees

Platinum Member
Apr 28, 2024
2,192
2,839
96
Fast-paced action game Battlefield 6 has no plans to implement ray tracing

Battlefield 6 says no to ray tracing now and in the near future — dev says decision made to ‘focus on making sure it was performant for everyone else’

News
By Jowi Morales published 5 hours ago
For fast-action titles, a more optimized game beats pixel-peeping eye candy.


 

poke01

Diamond Member
Mar 8, 2022
4,761
6,095
106
We need more developers to take a similar stance, at least until common gaming hardware can handle RT at higher framerates. Right now there's too much reliance on frame generation.
For BF6 it makes sense, but for single-player, story-driven games, give me RTRT. RDNA4 is good enough on the AMD side; what I don't like is RT being forced, like in Indy.
 

adroc_thurston

Diamond Member
Jul 2, 2023
8,253
11,001
106
at least until common gaming hardware can handle RT at higher framerates
That's kinda the issue, it won't.
RTRT in general is a bad workload for anything, but especially GPUs.
And since we ran out of xtor cost scaling, good luck putting enough shader cores into mainstream parts to get RTRT to at least acceptable perf.
 

Vikv1918

Member
Mar 12, 2025
74
195
66
I'm trying to understand how hardware companies and game devs will square the circle of pushing heavy ray tracing while also going hard on handhelds like the Switch 2, PS6 portable, Xbox Ally, etc. All the advantages of "easy lighting work" that path tracing is supposed to provide on the dev side are pointless if they need to make handheld-capable versions of the game with a raster/hybrid RT fallback. Maybe that's what AMD GI 1.0 is supposed to solve. I think even in the medium-term future, PT is just going to be an Ultra+ graphics option rather than a genuine replacement for raster for the majority of gamers.

And I don't understand why no one talks about how much Nvidia's RT performance has stagnated when normalized for raster. A 3080 performs very similarly to a 4070 in RT and PT. A 4080 performs very similarly to, and sometimes even better than, a 5070 Ti. They're basically relying on more cores/more power to push RT rather than actually improving the real-world RT performance of their GPUs. It's been seven years since the RTX meme started and, normalized for raster, it's crazy how little the RTX 50 series has improved in RT compared to the RTX 20 series. Despite all their rhetoric, Nvidia doesn't seem to want to allocate more transistor budget to RT and thereby fall behind in raster compared to AMD.
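To make the "normalized for raster" comparison concrete, here's a minimal sketch of the arithmetic. The fps figures in it are invented placeholders, not real benchmark results; plug in numbers from whatever review you trust.

```python
# Sketch of "RT performance normalized for raster": what fraction of its raster
# frame rate does a card keep once RT is enabled? All fps values below are
# invented placeholders for illustration only, NOT real benchmark data.

def rt_efficiency(raster_fps: float, rt_fps: float) -> float:
    """Fraction of raster performance retained with RT enabled."""
    return rt_fps / raster_fps

# hypothetical cards and numbers, purely illustrative
cards = {
    "older-gen GPU": {"raster": 100.0, "rt": 55.0},
    "newer-gen GPU": {"raster": 140.0, "rt": 78.0},
}

for name, fps in cards.items():
    eff = rt_efficiency(fps["raster"], fps["rt"])
    print(f"{name}: keeps {eff:.0%} of its raster fps with RT on")

# If this retained fraction barely moves from one generation to the next, the
# newer card's higher RT fps is coming from extra raw raster throughput (more
# cores, more power) rather than from relatively faster RT hardware -- which is
# the stagnation argument above.
```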
 

ToTTenTranz

Senior member
Feb 4, 2021
908
1,519
136
I'm trying to understand how hardware companies and game devs will square the circle of pushing heavy ray tracing while also going hard on handhelds like the Switch 2, PS6 portable, Xbox Ally, etc. All the advantages of "easy lighting work" that path tracing is supposed to provide on the dev side are pointless if they need to make handheld-capable versions of the game with a raster/hybrid RT fallback. Maybe that's what AMD GI 1.0 is supposed to solve. I think even in the medium-term future, PT is just going to be an Ultra+ graphics option rather than a genuine replacement for raster for the majority of gamers.

Switch 2 will be its own thing, and people ought to dismiss the idea that it's getting any kind of multiplatform titles after 2027 when the other consoles release. The Xbox Ally is a PC Game Pass handheld and it does well enough at that. Magnus will be the next-gen Xbox, not the Xbox Ally.
The PS6 Canis with RDNA5 could still be an RT "powerhouse" at low resolutions, and 1st-party developers could indeed be passing on shadow maps and pre-baked lighting for its games. With AMD setting their sights on ML upscaling, ray reconstruction and newer, more flexible and lighter RT approaches, it could be that 16 RDNA5 CUs + a 192-bit GDDR5X bus are adequate for e.g. a 540p (or lower) base resolution.
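As a back-of-the-envelope illustration of why a low base resolution changes the math (my own rough sketch; none of this is confirmed behaviour of any console):

```python
# Rough pixel-budget arithmetic for ray tracing at a low base resolution and
# letting an ML upscaler fill in the rest. Purely illustrative, not confirmed
# behaviour of any hardware.

resolutions = {
    "540p":  (960, 540),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
}

base_pixels = resolutions["540p"][0] * resolutions["540p"][1]

for name, (w, h) in resolutions.items():
    ratio = (w * h) / base_pixels
    print(f"{name}: {w * h:,} pixels ({ratio:.0f}x the 540p base)")

# 4K has 16x the pixels of 540p, so tracing rays only for the 540p base frame
# cuts the per-frame RT workload by roughly an order of magnitude -- which is
# the bet a small RDNA5 part would be making.
```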



And I don't understand why no one talks about how much Nvidia's RT performance has stagnated when normalized for raster. A 3080 performs very similarly to a 4070 in RT and PT. A 4080 performs very similarly to, and sometimes even better than, a 5070 Ti. They're basically relying on more cores/more power to push RT rather than actually improving the real-world RT performance of their GPUs. It's been seven years since the RTX meme started and, normalized for raster, it's crazy how little the RTX 50 series has improved in RT compared to the RTX 20 series. Despite all their rhetoric, Nvidia doesn't seem to want to allocate more transistor budget to RT and thereby fall behind in raster compared to AMD.
Nvidia kept the same process between Ada and Blackwell while increasing bandwidth by 50% thanks to GDDR7. They were definitely counting on much better voltage-per-clock curves than the ones they ended up with, like a Kepler -> Maxwell transition.
 
  • Like
Reactions: Tlh97
Jul 27, 2020
28,173
19,209
146
And since we ran out of xtor cost scaling, good luck putting enough shader cores into mainstream parts to get RTRT to at least acceptable perf.
Basic idea of a new patent:

Restrict RTRT to 1080p.
Upscale the RTRT frame to desired resolution (1440p or 2160p).
Use that frame for neural enhancement of final frame at target resolution.

Make the RTRT hardware a fixed-function unit that is available on all SKUs with no performance differences, since all of them have to render the base frame at 1080p anyway (toy sketch below).

Yes, I have no idea what I'm talking about but sounds good to me :D
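A toy sketch of that pipeline, just to make the three steps explicit; every function name here is a made-up placeholder, not a real API:

```python
# Toy sketch of the idea above: ray trace at a fixed 1080p on dedicated
# hardware, upscale that result, then use it to guide neural enhancement of the
# final frame at the target resolution. All names are hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class Frame:
    width: int
    height: int

def trace_fixed_function_rt(scene, width=1920, height=1080) -> Frame:
    """Step 1: RT pass on a fixed-function unit, always at 1080p on every SKU."""
    return Frame(width, height)

def upscale(frame: Frame, width: int, height: int) -> Frame:
    """Step 2: upscale the RT frame to the desired output resolution."""
    return Frame(width, height)

def neural_enhance(base: Frame, rt_guide: Frame) -> Frame:
    """Step 3: use the upscaled RT frame to enhance the final frame."""
    return Frame(base.width, base.height)

def render(scene, target_w: int = 3840, target_h: int = 2160) -> Frame:
    rt_frame = trace_fixed_function_rt(scene)
    rt_upscaled = upscale(rt_frame, target_w, target_h)
    base_frame = Frame(target_w, target_h)  # rasterized frame at target resolution
    return neural_enhance(base_frame, rt_upscaled)

print(render(scene=None))  # Frame(width=3840, height=2160)
```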
 

Ranulf

Platinum Member
Jul 18, 2001
2,906
2,575
136
Threat Interactive is back... top 10 most neglected areas of modern graphics. Starts off with FXAA and blurriness.


Edit: Drama or something; the original has been made private now. They put out a revised video on lawyer advice. Revised link:

 
Last edited:

ToTTenTranz

Senior member
Feb 4, 2021
908
1,519
136
The very first games I tried after installing my brand new RTX 5090 were CP2077 and Indiana Jones and the Great Circle in their maxed-out path tracing forms, because Digital Foundry said they were absolutely transformative and didn't look anything like a rasterized game with prebaked lighting.

Turns out they... actually don't look like that much of an upgrade? I used to play CP2077 on my RX 6900 XT with the weakest level of ray tracing, and it doesn't look like I was missing a lot, if anything. Another game I started playing was Alyx from 2020 on SteamVR and... it actually looks better in places.



Especially after playing Alyx, I honestly feel we'd all be having much more immersive and satisfying experiences had we focused on rasterization over raytracing for great Virtual Reality experiences as well as handhelds.
 

DrMrLordX

Lifer
Apr 27, 2000
23,177
13,265
136
Especially after playing Alyx, I honestly feel we'd all be having much more immersive and satisfying experiences had we focused on rasterization over raytracing for great Virtual Reality experiences as well as handhelds.

Rasterization performance is good even for traditional 2D displays. Blame NV for the RTRT nonsense. Hopefully the industry will wake up eventually and demand better rasterization performance out of cards. Wouldn't count on it though since NV has muddied the waters.
 

SolidQ

Golden Member
Jul 13, 2023
1,542
2,546
106
What happens with NV in RT mode :rolleyes:
[Attached: two GameGPU benchmark charts]
 
  • Like
Reactions: psolord
Jul 27, 2020
28,173
19,209
146
I always find these GameGPU graphs confusing. What's different in the settings? Both graphs don't seem to indicate anything other than having the same settings.

EDIT: Oh, I see it now. They just use an underline to highlight the setting used :rolleyes:
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,350
33,247
146
I always find these GameGPU graphs confusing. What's different in the settings? Both graphs don't seem to indicate anything other than having the same settings.

EDIT: Oh, I see it now. They just use an underline to highlight the setting used :rolleyes:
Is that the site that runs the memory without XMP/EXPO?

No native XeSS or FSR4 support? Thanks Nvidia!
 
  • Haha
Reactions: igor_kavinski
Jul 27, 2020
28,173
19,209
146
AT did run memory at JEDEC though.
And I hated them for it. I suspect it was more laziness, to avoid the extra work of running every test again at higher RAM speeds. With computerbase.de already testing at faster speeds, they had no good reason to stick with JEDEC other than to irk people. If they wanted to be extra anal, they should've reviewed ONLY office desktops from Dell, HP and Lenovo, since those would run at JEDEC speeds, unlike expensive Taiwanese mobos capable of so much more.
 

marees

Platinum Member
Apr 28, 2024
2,192
2,839
96
Do you care about ...... hardware Lumen

Neither Senua's Saga: Hellblade 2 nor Silent Hill F had it
(and both delivered showcase performance)

but the PC version of Senua's Saga: Hellblade 2, I believe, had hardware Lumen (the Nvidia branch of UE5 that caused optimization issues for the 9070 XT)

Silent Hill F will be a great showcase of what Unreal Engine 5 can deliver with proper work in terms of image quality and performance, judging from some early impressions.
In a new video shared on YouTube today, the tech experts at Digital Foundry revealed that the PC and PlayStation 5 build available during Gamescom last month behind closed doors did an excellent job showing how the engine by Epic is fully capable of delivering clean image quality and stable performance with no stuttering. The game's performance, in particular, sounds impressive for a game powered by the engine, as it ran at a stable 60 FPS on base PlayStation and PC, though not at native 4K resolution, with 30 FPS-locked cutscenes that still managed to look great thanks to very good usage of motion blur. Better yet, none of the obnoxious stuttering issues too often seen in Unreal Engine 5-powered games were present, which definitely sounds promising. The game looked so clean, without the typical temporal noise seen in other games, and ran so well that it was almost difficult to tell it is powered by Unreal Engine 5, according to Digital Foundry, although a more in-depth look revealed the use of software Lumen, which delivers great lighting in the vast majority of scenarios.

 

Vikv1918

Member
Mar 12, 2025
74
195
66
Looks like Epic is doing more damage control for UE5 after the recent spate of optimization debacles. First Sweeney himself blaming developers, and now they're shilling a future game for having no stuttering (apparently a great achievement). And of course they have Digifoundry as the "trusted" mouthpiece to deliver this news.