Poll: Do you care about ray tracing / upscaling?

Page 31 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Total voters: 241

DavidC1

Golden Member
Dec 29, 2023
1,884
3,030
96
Threat Interactive made a comment about something I'd suspected for a while: Nvidia is moving the entire graphics pipeline over, one piece at a time, so they keep their monopoly by forcing competitors to catch up. Since Nvidia comes up with each feature first, the focus is always on them, so no matter how well AMD/Intel do at catching up, in the end it's just that - Nvidia, but late and worse.

I know Intel invests in graphics R&D, and I read somewhere that in terms of graphics patents they have at least as many as AMD, if not more. Both vendors need to ignore what Nvidia is doing and push their own ideas. One thing I noticed after being away from recent games for so long is how noisy modern games are. I disabled all the upscaling features, because even without upscaling the image is noisy anyway. We should have the option to enable traditional AA such as SSAA and MSAA.

I'm pretty sure Matrox Parhelia's Edge AA could be implemented in modern silicon at a very acceptable die cost, with much better performance and zero blurring - better than even SSAA/MSAA.

adroc_thurston

Diamond Member
Jul 2, 2023
7,192
9,969
106
We should have an option to enable traditional AA such as SSAA and MSAA
How do I do this in deferred without doing GPU-nuking levels of overdraw?
Threat Interactive
He's an idiot.
Nvidia is moving the entire graphics pipeline one by one so they keep their monopoly by forcing competitors to catchup. Since Nvidia is the one that comes up with it first, the focus is always on them so no matter how well AMD/Intel does with the catchup, in the end, it's just that - Nvidia but late and worse.
They're the only ones trying to push the envelope.
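The "GPU-nuking" cost of MSAA in a deferred renderer can be made concrete with a back-of-the-envelope sketch: every G-buffer render target has to store one set of attributes per sample, so memory and bandwidth scale with the sample count. The render-target count and format below are illustrative assumptions, not figures from any particular engine:

```python
def gbuffer_bytes(width, height, targets, bytes_per_pixel, msaa):
    """Total G-buffer size for `targets` render targets at `msaa`x sampling."""
    return width * height * targets * bytes_per_pixel * msaa

# Assumed layout: 4 render targets at 8 bytes/pixel (e.g. RGBA16F), 4K output.
no_aa = gbuffer_bytes(3840, 2160, targets=4, bytes_per_pixel=8, msaa=1)
msaa4 = gbuffer_bytes(3840, 2160, targets=4, bytes_per_pixel=8, msaa=4)

print(no_aa / 2**20)   # roughly 253 MiB without MSAA
print(msaa4 / 2**20)   # roughly 1012 MiB at 4x, before lighting even runs
```

And that is only storage; the lighting pass would also have to shade per-sample at geometry edges, which is why forward renderers get MSAA far more cheaply.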

adroc_thurston

Diamond Member
Jul 2, 2023
7,192
9,969
106
I don't know if I'd call the current ideas of frame generation or the latest ray tracing gimmick that barely runs on their own "top" end hardware pushing the envelope.
They technically are, irrespective of their actual IQ/perf/whatever contribution.

Thunder 57

Diamond Member
Aug 19, 2007
4,038
6,753
136
Threat Interactive made a comment about something I'd suspected for a while: Nvidia is moving the entire graphics pipeline over, one piece at a time, so they keep their monopoly by forcing competitors to catch up. Since Nvidia comes up with each feature first, the focus is always on them, so no matter how well AMD/Intel do at catching up, in the end it's just that - Nvidia, but late and worse.

I know Intel invests in graphics R&D, and I read somewhere that in terms of graphics patents they have at least as many as AMD, if not more. Both vendors need to ignore what Nvidia is doing and push their own ideas. One thing I noticed after being away from recent games for so long is how noisy modern games are. I disabled all the upscaling features, because even without upscaling the image is noisy anyway. We should have the option to enable traditional AA such as SSAA and MSAA.

I'm pretty sure Matrox Parhelia's Edge AA could be implemented in modern silicon at a very acceptable die cost, with much better performance and zero blurring - better than even SSAA/MSAA.

Isn't supersampling just part of "Resolution Scaling" these days, where you can render at 100%+ of your monitor's resolution and it then gets displayed at your native resolution? Hell, Battlefield even describes it that way. I turn off the crap AA and bump the resolution up to 110-120% or higher, whatever runs well enough.
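For reference, the resolution-scale setting described above is just a multiplier on the internal render target before the image is downsampled to the display. A minimal sketch (the 1440p output and 120% setting are example values, not from any particular game):

```python
def render_resolution(out_w, out_h, scale_percent):
    """Internal render-target size for a given resolution-scale setting."""
    f = scale_percent / 100
    return round(out_w * f), round(out_h * f)

# Example: a 120% scale on a 2560x1440 display.
w, h = render_resolution(2560, 1440, 120)
print(w, h)                          # 3072 1728
print(w * h / (2560 * 1440))         # 1.44x the pixels to shade
```

Note the shading cost grows with the square of the linear scale, which is why even a modest 120% setting costs ~44% more pixels.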

DavidC1

Golden Member
Dec 29, 2023
1,884
3,030
96
Isn't supersampling just part of "Resolution Scaling" these days, where you can render at 100%+ of your monitor's resolution and it then gets displayed at your native resolution? Hell, Battlefield even describes it that way. I turn off the crap AA and bump the resolution up to 110-120% or higher, whatever runs well enough.
It's actually different, because a higher-resolution monitor gives you better quality without sacrificing clarity, while supersampled AA gives an overall blurrier result after the downsample.

4K is at the point where you don't really need more, and 1000-series cards were starting to be too powerful for it, so they had to introduce a new feature that could keep selling cards: ray tracing. But because the performance hit was so high, and Moore's Law gains were crashing, they had to make it up with ML rendering techniques, which are basically the 180-degree opposite of SSAA, reversing the better-quality trend of the past 40 years. Add to that lazy, incompetent, DEI-driven developer teams that couldn't code if their lives depended on it, and now games "need" ML-rendered subsampled AA to perform reasonably, plus literally fake frames. I know the trend isn't really going to change, because average people are deaf, dumb, and blind, sprinkled with apathy.


Mind you, games are NOWHERE near photorealism. Carefully and artistically rendered marketing shots might convince you otherwise, but at the actual in-game level they aren't, which makes artificial showpieces like ray tracing pointless in the first place. A labor of love with good art direction lasts - countless Mario games, or more modern examples like the Warcraft series - but a "photorealistic" title looks like crap 10 years later, because it never actually reached photorealism.

Thunder 57

Diamond Member
Aug 19, 2007
4,038
6,753
136
It's actually different, because a higher-resolution monitor gives you better quality without sacrificing clarity, while supersampled AA gives an overall blurrier result after the downsample.

4K is at the point where you don't really need more, and 1000-series cards were starting to be too powerful for it, so they had to introduce a new feature that could keep selling cards: ray tracing. But because the performance hit was so high, and Moore's Law gains were crashing, they had to make it up with ML rendering techniques, which are basically the 180-degree opposite of SSAA, reversing the better-quality trend of the past 40 years. Add to that lazy, incompetent, DEI-driven developer teams that couldn't code if their lives depended on it, and now games "need" ML-rendered subsampled AA to perform reasonably, plus literally fake frames. I know the trend isn't really going to change, because average people are deaf, dumb, and blind, sprinkled with apathy.


Mind you, games are NOWHERE near photorealism. Carefully and artistically rendered marketing shots might convince you otherwise, but at the actual in-game level they aren't, which makes artificial showpieces like ray tracing pointless in the first place. A labor of love with good art direction lasts - countless Mario games, or more modern examples like the Warcraft series - but a "photorealistic" title looks like crap 10 years later, because it never actually reached photorealism.

I'll agree with that. I can play 9-year-old games that look fantastic. I played Doom: The Dark Ages, which requires ray tracing - and for what? It doesn't look any better than the previous two, and performance is crap compared to them. You can run Doom 2016 on a potato. I ended my Game Pass subscription because new games do nothing for me. I'll just keep playing the oldies but goodies.

dr1337

Senior member
May 25, 2020
523
807
136
4K is at the point where you don't really need more
hard disagree
and 1000-series cards were starting to be too powerful for it
literally the last two games I bought don't use RT, and I have to use FSR to get them running at or above 60 fps at 4K - on a 5800X3D and 6900 XT
Mind you, games are NOWHERE near photorealism. Carefully and artistically rendered marketing photos might convince you otherwise, but at an actual game level it isn't.
Sure, polycounts and textures haven't massively increased over the years, but they're only one part of a scene. Photorealistic lighting is one (very important) part of how we get to photorealistic real-time rendering.

Frankly, reflections are one of the biggest parts of what makes something feel real. A lot of other things - wet surfaces, water, metallic surfaces, etc. - all go up in fidelity a lot with higher-quality lighting and true reflections.

Do old games look bad? No. Do they look better with RT? IMO not necessarily; I find the Quake 2 and HL2 RT conversions a bit uncanny valley, with only the lighting system upgraded. But does correctly implemented RT look better than baked lighting? Also not necessarily, but if the industry can make lighting something that gets its own dedicated acceleration instead of just another shader, I think we're going to wind up better off in the future. UE is already a nightmare with shader compile times, and it would be better for things to be unified instead of every engine having its own solution.

DavidC1

Golden Member
Dec 29, 2023
1,884
3,030
96
hard disagree
If you aren't slowly walking around, you won't notice most of those details. Also, the gains are so minuscule when there are other areas to improve.

Blurriness and noise are easily noticeable even in motion, and often motion makes them worse.
literally the last two games I bought don't use RT, and I have to use FSR to get them running at or above 60 fps at 4K - on a 5800X3D and 6900 XT
Because they're wasting resources out of an abundance of them - there are countless examples of people wasting whatever they're given plenty of, while innovation and efficiency come from constraints. Also, the lack of efficiency really started accelerating after frame generation debuted: enough games now list FG in their minimum/recommended system requirements!
Sure, polycounts and textures haven't massively increased over the years, but they're only one part of a scene. Photorealistic lighting is one (very important) part of how we get to photorealistic real-time rendering.
And they still look mostly like cartoons, except in the first year or two after release. I know - I grew up with them. I've read articles where people thought games in 1995 looked realistic. I thought Quake 3 looked so good too.

The games that chase photorealism almost look disgusting, while good art direction still looks good.

DavidC1

Golden Member
Dec 29, 2023
1,884
3,030
96
Kinda wrong.
Most (all?) PBR titles hold up.
Cartoons with fancy lighting and shadows are still cartoons. None of today's games look anywhere near real life, and if one does look semi-real, that will fade within a few years. You can fool gamers, who subconsciously hold games to a separate standard from real life, but objectively it's nowhere near "photorealistic".

DavidC1

Golden Member
Dec 29, 2023
1,884
3,030
96
I admire your autism but PBR mats are PBR.
I admire your constant personal attacks, finely tuned to stay just under the moderator radar, delivered from a personal reality bubble that rivals Apple's RDF in size and strength.

adroc_thurston

Diamond Member
Jul 2, 2023
7,192
9,969
106
I admire your constant personal attacks, finely tuned to stay just under the moderator radar, delivered from a personal reality bubble that rivals Apple's RDF in size and strength.
You still haven't refuted my point on PBR mats.

marees

Golden Member
Apr 28, 2024
1,777
2,397
96
ML2CODE is included in AMD's GPU computing (GPGPU) platform, ROCm 6.1 and later.

Rather than executing a trained AI network through a runtime, it compiles the network into conventional compute-shader code so it can execute natively.

ROCm vs ML2CODE

From the horse's mouth

ROCm is a GPU computing platform that enables general-purpose AI development and operation on AMD GPUs. It is developed with a focus on comprehensive support for industry standards, so it also works with non-AMD hardware.
 ML2CODE, on the other hand, is an in-house dedicated framework developed by AMD (author's note: in other words, there are no plans to release it to the public in the near future). Its biggest advantage is that it lets products built with ML2CODE, such as FSR Redstone, integrate seamlessly and directly into DirectX and Vulkan graphics pipelines with minimal latency.
 We believe the ML2CODE solution is the best way to integrate and deploy 3D graphics and AI technologies, at least for now.


https://www.4gamer.net/games/869/G086962/20250612045/
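As a rough illustration of the idea the quote describes - this is NOT AMD's actual ML2CODE tooling, just the general "bake a trained network into shader-style source" concept, with made-up weights and a hypothetical `bake_layer` helper:

```python
# Toy weights for a 4-input, 2-output dense layer (made-up values).
W = [[0.5, -0.25], [1.0, 0.0], [-0.75, 0.5], [0.125, 1.5]]
B = [0.1, -0.2]

def bake_layer(w, b, name="layer0"):
    """Emit C-like source that hard-codes one dense layer + ReLU,
    instead of dispatching it through an ML runtime at run time."""
    n_in, n_out = len(w), len(w[0])
    lines = [f"void {name}(const float x[{n_in}], float y[{n_out}]) {{"]
    for j in range(n_out):
        # Unroll the dot product into straight-line arithmetic.
        terms = " + ".join(f"{w[i][j]:.6f}f * x[{i}]" for i in range(n_in))
        lines.append(f"    y[{j}] = fmaxf(0.0f, {terms} + {b[j]:.6f}f);")
    lines.append("}")
    return "\n".join(lines)

print(bake_layer(W, B))
```

The appeal, per the quote, is that code like this can live directly inside an existing DirectX/Vulkan shader pipeline with no separate inference runtime in the loop.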

Ranulf

Platinum Member
Jul 18, 2001
2,864
2,514
136
Hardly shocking that an older engine (CryEngine) runs well and looks decent. The game was janky and buggy, though, from most reports I've read, much like the first game. When the first game came out in 2018, it pushed many systems to their limits.

Ranulf

Platinum Member
Jul 18, 2001
2,864
2,514
136
This post on the Borderlands 4 Steam forum is... a love poem to frame generation and graphics pipelines.


"DLSS Frame Generation isn’t “faking” anything. You’re just functionally illiterate with silicon."

"It does not “guess” randomly.

It does not duplicate previous frames.

It does not inject latency, unless you’re using it improperly (e.g., V-Sync + Reflex off).

It does not replace rendering pipelines - it augments them.

Latency impact is negligible when configured properly:
DLSS Frame Gen + NVIDIA Reflex = Input-to-photon latency at or below native rendering in most cases. 3X/4X modes are more aggressive but maintain perceptual smoothness and maintain animation coherency, thanks to high-frequency OFA sampling and consistent motion vector integrity. On a tuned system with 5XXX series GPUs and proper system-level latency optimization:
-Frame Gen 4x = 200-300 fps on Ultra settings in modern titles
-GPU Frame Queue stays consistent
-No perceptible artifacting in motion
-L1 and L2 cache coherency + RAM tuning (e.g., tREFI and WRRD tuning) further reduce microstutter
-Reflex holds latency to sub-15ms end-to-end even with high frame counts"

"If you still think it’s “fake” then I’m sorry, but you’re not a technical user - you’re a performance LARPer. You think “real frames” = raster pass only, but you’re living in 2006. The pipeline has evolved. Just like G-Sync and FreeSync were misunderstood when they dropped, Frame Gen is the next step. Don’t be the guy who called SSDs “cheating” back in 2010. That guy was wrong too. Please stop commenting and making posts unless you actually know what a frame buffer is, you delusional clowns."

You're living in the past, man! Contemporize past 2006! Modernize, bro!

Oh, but don't use PhysX now on Borderlands 2/3 - the new 5000-series cards don't support it anymore.

Niktek brings us some highlights from tech tube reviewers: