
Poll: Do you care about ray tracing / upscaling?


Total voters: 259
Or you can get 97%* of the experience by flipping one (1) switch.

* this is an approximation and the exact value depends on how much one fears missing out on marginal visual improvements
No no no, Tim says the experience can be transformative. Oddly, he started using that term not long before the DLSS 4 details came out.


coincidence-incredibles.gif


😛
 
Time heals everything, including the pain of fps as slow as molasses. I remember Riddick running at less than 30 fps due to turning on Shader Model 3.0 on a Geforce 6600 GT and setting everything else to max. I felt dismayed. In the moment, it felt really bad. That wasn't exactly a cheap card for me and it wasn't even mine (someone loaned it to me for a few days). And now the Quadro K1000M in my Ivy Bridge laptop runs it at frickin' 100+ fps with everything maxed out. It will be the same with RT and even Path Tracing one day.
 
It will be the same with RT and even Path Tracing one day.
Playing through the Tomb Raider games maxed at 4K is great, and I certainly hope this quote is right. Ending support for 32-bit PhysX has me concerned, though. Invoking the slippery slope fallacy: who is to say that in 10 years the current method of RT/PT won't be 💩-canned too, i.e. backwards compatibility with these features removed from future hardware?
 
Playing through the Tomb Raider games maxed at 4K is great, and I certainly hope this quote is right. Ending support for 32-bit PhysX has me concerned, though. Invoking the slippery slope fallacy: who is to say that in 10 years the current method of RT/PT won't be 💩-canned too, i.e. backwards compatibility with these features removed from future hardware?

They can’t be removed unless 64-bit computing is dead. All NV features are now 64-bit, thankfully.

Edit: NV removed 32-bit support because they can’t be bothered to develop and maintain 32-bit CUDA. Unless the same happens with 64-bit CUDA, which is very unlikely, I wouldn’t be worried.
 
Tim is being rather effusive in his appraisal and praise of DLSS 4. I have to say, it making the textures appear higher quality is a big win. If 8GB RTX cards can use DLSS 4 to make up for having to turn textures down to keep the framebuffer from overflowing, that's very good stuff for owners.

 
Time heals everything, including the pain of fps as slow as molasses. I remember Riddick running at less than 30 fps due to turning on Shader Model 3.0 on a Geforce 6600 GT and setting everything else to max. I felt dismayed. In the moment, it felt really bad. That wasn't exactly a cheap card for me and it wasn't even mine (someone loaned it to me for a few days). And now the Quadro K1000M in my Ivy Bridge laptop runs it at frickin' 100+ fps with everything maxed out. It will be the same with RT and even Path Tracing one day.

By the time 100 FPS RT becomes cheap mainstream like that we're all going to be too busy with our sex robots to care.
 
I am going to post about 32-bit PhysX backwards compatibility being deprecated on the 50 series. The OP rage-quit long ago, and they pushed it the same way they push RT. Some are claiming the current tech can't be left behind like that, but never say never.

This is how they used to promote the feature -


Much more obvious impact than RT IMO.

 
I have no problem with them dropping PhysX; nothing lasts forever. It’s also their choice, BUT shouldn’t a company have the decency to say they are dropping X support when it was supported in the prior gen?

NV marketed and promoted the hell out of PhysX and then silently killed it; that is the problem here. Inform the customer about what is missing from last gen, and not through a ******** forum post but on your Specs page.
 
Threat Interactive on Alan Wake 2's graphics.


"It's time to end the LIES about how great the graphics are and how it's a "9th gen wonder" by exposing the disgusting botchery of rasterized graphics to promote their sponsor's (Nvidia) unneeded vendor accelerated software."

 
TLDW = Good callout of DF Alex, the Nvidia human centipede middle segment.

Dig that he refers to Days Gone in his videos as an example of good optimization. Game looks great for how easy it is to run.

Ask yourself why every reviewer uses every RTX-optimized game in their test suites. My answer is shilling, plain and simple. Nothing is worse than pop-in on my list of IQ issues. Figures that it uses a bunch of baked lighting. And FINALLY! Someone who hates how bad RT shadows can be as much as I do. I have written numerous times about how bad they look in some games, and why I never use them. They almost always look worse/wrong and tank performance on my AMD cards.
 
Monster Hunter Wilds at 4K on a Gigabyte RTX 4080 Super Aero


For my final trick, I will improve my frame rate by 29% with only a couple movements of my mouse. I'm disabling ray tracing. Not lowering it, either, just flat-out disabling it. This is because I've seen a massive improvement in performance, from 68.25 fps to 88.23 fps, which reduces much of the chance of a sub-60 frame rate at any point. I don't miss it. The loss of accurate reflections in the one pond shown in the benchmark is easily negated by the fluidity of the game on my 144 Hz monitor. Maybe in the full game I'll be missing my reflection more, the vain PC gamer I am, but not right now.
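The percentage above checks out. A quick sanity check of the quoted numbers (the frame-time conversion is just my own addition for context):

```python
# Verify the quoted gain from disabling ray tracing: 68.25 fps -> 88.23 fps.
before, after = 68.25, 88.23
gain = (after - before) / before * 100
print(f"{gain:.1f}% faster")  # ~29.3%, matching the quoted 29%

# Same change expressed as frame time (ms per frame), which is what you feel:
print(f"{1000 / before:.2f} ms -> {1000 / after:.2f} ms")
```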

Strangely, my PC's best performance in-game only earns me a rating of 'Good'. Looks like Frame Generation really messes with the game's ability to give out nice compliments. It's fine.

So, there we have it. I landed on a solid combination by doing entirely the expected: enabling DLSS and disabling ray tracing. A tale as old as time—or at the very least, as old as 2018.


 
I was just roasting Alex, but that is because I call them like I see them. And that means I now have to give props where they are due. He nails, in the first couple of minutes, why 32-bit PhysX removal sucks. Backwards compatibility; what are we, filthy console peasants? And the joy of buying a new card, then loading up your old games and having the fps fly. The first clip is 17 fps on a 5080... GTFO.

 
From what I've seen of comparisons so far, no. Some games unfortunately don't have a sharpening slider so you can't disable it (DA:V), and some disable the slider when using DLAA (BG3), but in general I had more problems with forced sharpening 1-2 years ago than I've seen with the Transformer model so far (watching most comparison videos at 4K, and playing 5-10 games with swapped DLLs over the last few weeks).
 
Control was wonky with DLSS Quality, with major, frequent frame spikes compared to no DLSS. Shadow of the Tomb Raider crashed when I looked in one particular direction several times in a row. But again, this is with forcing the most recent files, so I kind of have myself to blame (the 50 series might have some bugs as well, for all I know). The only game I've played at length is HFW, and that was fine.

No idea about the difference compared to CNN except for that, but I'd assume it's the generally agreed-upon roughly 1-1.5 tiers better quality at the same setting, and on average ~8% slower (just from memory).
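The DLL swap mentioned above usually just means replacing the game's bundled `nvngx_dlss.dll` with a newer build. A minimal sketch of that workflow, assuming the standard DLL filename (the helper function itself is my own illustration, not an official tool):

```python
# Hedged sketch of the DLSS DLL-swap workflow: back up the game's bundled
# DLL once, then copy a newer build over it. swap_dlss_dll is hypothetical.
from pathlib import Path
import shutil

def swap_dlss_dll(game_dir: str, new_dll: str, name: str = "nvngx_dlss.dll") -> None:
    """Replace `name` inside `game_dir` with `new_dll`, keeping a one-time backup."""
    game_dll = Path(game_dir) / name
    backup = game_dll.parent / (game_dll.name + ".bak")
    if game_dll.exists() and not backup.exists():
        shutil.copy2(game_dll, backup)  # preserve the shipped DLL for rollback
    shutil.copy2(new_dll, game_dll)     # install the newer DLL in its place
```

Rolling back is just copying the `.bak` file over the swapped DLL again; note that games with anti-cheat may reject modified files, so treat this as at-your-own-risk.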
 