Poll: Do you care about ray tracing / upscaling?


Do you care about ray tracing / upscaling?

Total voters: 175

Mopetar

Diamond Member
Jan 31, 2011
7,941
6,242
136
There's nothing stopping Sony from working with third parties to find ways to improve engine performance through a variety of means, but I doubt they'd favor any one company.

If anything, they might look at adding specialized instructions to the ISA that can replace multiple instructions an engine would otherwise need. Even Lumen will make use of RT hardware, so I don't see either Sony or Microsoft stripping it out entirely.

The next console generation will probably drop in four years or thereabouts, so the silicon for it is likely starting development now. I don't think we'll see RT improvements massive enough for what will be mid-range PC hardware of that era to drive heavy RT workloads. Look at how poorly something like Portal with RTX runs even on today's top hardware, and that's a game that's well over a decade old.

I'd love to see either Sony or Microsoft take even more advantage of some of the process-side techniques AMD is employing in future designs. An MCM design with a massive amount of V-Cache shared between the CPU and GPU would help curb costs while enabling a sizable performance uplift. Given that consoles are a fixed development target, it might even let them try something more radical.
 
  • Like
Reactions: KompuKare

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
Yes, both are important, but upscaling is much more important.

I have played a few games with RT; Guardians of the Galaxy and The Ascent spring to mind. They looked great and there was a notable difference in specific locations. In my experience, RT is something that must be witnessed first hand, and absolutely in motion. Stills don't do it justice. There is some fine, playful lighting in there that the brain needs to process over a few frames to appreciate the difference. A picture is worth 1000 words, they say, but a picture with RT is worth at least 1500, mkay? xD


I did enable RT in Jedi Survivor in some specific locations and the difference was very easy to see. However, I cannot play it with RT, thanks to the publisher rushing the game. Also, I could swear to God they are actively boycotting anything Intel+NV with that game. Mostly NV. The good news is that even without RT it is downright great looking. Much better than Jedi Fallen Order, for sure.

I updated my screenshot collection, if anyone cares (I haven't finished the game yet). All of that at 4K/Epic/FSR2 Balanced (yeah, not even FSR2 Quality, I'm cheap xD).


RT comes with an extra cost: it needs more GPU power and more CPU power as well, which adds an extra 100W or more to the power draw. It's nice to have, if you're aware of the requirements and it works correctly, that is.

============


Now, upscaling is another can of worms for sure. It has grown to be super important for me, and that comes from a die-hard native-resolution guy. The main reason is that my gaming has mostly moved from a close-up 1080p panel to a far-away 4K panel, so I cannot really see the difference. I see the difference in the settings, but not the difference in resolution, except in very few instances.


Upscaling is the exact opposite of RT in terms of power and money requirements. It brings GPU requirements down in power draw, VRAM, expelled heat, and cost, while in my case I lose absolutely nothing. Personally, I have only to gain from it.

A few months back I was playing Chorus on the 3060 Ti at 4K/Epic, and while the card could do it, I still enabled DLSS to bring the power draw down by 100W. I mean, why not? The result was virtually the same. I am doing the same thing now with Jedi Survivor, as I said above: 4K/FSR2 Balanced, not even Quality. It looks great. OK, for people with a screen positioned less than 1m in front of them, maybe it will be more obvious, but I doubt how many rogue pixels anyone can really notice on a 4K panel in front of them.

For lower resolutions, while speaking of FSR2, I also did a couple of tests just two days ago on the two most notorious PC games of late, Jedi Survivor and The Last of Us, on my old GTX 1070 and my 1080p panel, which sits very close, typical of a PC setup. The upscaler helped the card a lot even at 1080p, and although this time I could see the difference in favor of native res, FSR2 does a damn fine job even up close. While you are getting lost in the game world, it's far easier to see the increased smoothness of the framerate than any crooked pixels that look funny. For me, it's far more important to keep higher settings at a lower internal res with an upscaler than to lower the settings to gain framerate at native res.

Proof of concept in these two vids (non-monetized):




Yeah, for people with older hardware, upscalers are super important. As the maintainer of the above hobbyist channel, I receive many, many questions about older hardware that people are looking to buy NOW, because that's what they can afford. Not everyone in the PC space is PCMR, but that's why I love PC gaming: the sheer scalability of the whole thing. I mean, you CAN play Jedi Survivor on anything from PS3-era hardware to PS5-era hardware and beyond.


PS: Fun fact, I tried writing this post before and accidentally pressed Ctrl+W, dunno how, but I did. Everything was lost. The forum needs an autosave feature for butterfingers like me, asap.
 
Last edited:

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,450
2,490
146
In my experience, DLSS and FSR are nice for squeezing out a bit more FPS without having to turn down settings too much. I do this on some modern games as I have a 270Hz 1440p monitor, and I try to keep my FPS up around there. RT, on the other hand, has far too much performance cost for my taste, and the only game I have used it on is Quake 2 RTX.

I feel that with current hardware, RT and 200+ FPS just aren't feasible yet.
 

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
I'd like to post an example of why I like upscaling, DLSS in this case, in A Plague Tale: Requiem on the 4070 Ti, which I am playing right now.

I took five screenshots with the MSI OSD enabled: four at 4K (native, DLSS Quality, Balanced, and Performance) and one at 1080p native. All at the Ultra preset, no RT shadows. The 1080p native shot is meant to be compared with the 4K Performance one, since that also renders at 1080p internally.

4K native
APlagueTaleRequiem_x64_2023_06_18_11_46_13_009 4k native.jpg

4K DLSS Quality
APlagueTaleRequiem_x64_2023_06_18_11_46_38_737 4k quality.jpg

4K DLSS Balanced
APlagueTaleRequiem_x64_2023_06_18_11_46_57_630 4k balanced.jpg

4K DLSS Performance
APlagueTaleRequiem_x64_2023_06_18_11_47_21_401 4k performance.jpg

1080p native
APlagueTaleRequiem_x64_2023_06_18_11_48_38_904 1080native.jpg

The best way to view them is to download them and look at them full screen on a 4K display. I use IrfanView in full-screen mode, so I can quickly switch between pictures with the arrow keys and see the differences in a more pronounced way.

Here is a zip file with all of them, if anyone cares.

So what are my results, you ask? I will start with the obvious. At 4K native, as you can see, the card is pulling a maximum of 270W and 60fps is not possible. At 4K Quality, 60fps is possible at 265W. At 4K Balanced we drop to 215W. 4K Performance sits at 187W, which seems a little high compared to native 1080p at only 135W. However, there's a reason for that.

And the reason is image quality. I mean, look at the trunk of the tree right in front of Amicia and Hugo. In 4K Performance it's pretty well defined, while it's blurry at 1080p native. Same goes for the flower on the right and the grass on the left. Those tensor cores are not there just to make the GPU die pretty, I guess, and that extra ~50W is really giving its money's worth.

In all the DLSS screenshots, the differences are much smaller. You can see them on the boards on the roof of the bridge, for example, in the background behind the tree. They look... different beyond the 4K Quality mode. There are others. However, while playing, I find the Balanced preset to be the most... errr... balanced.


For me, the framerate itself is the most important, then the settings, then the DLSS settings, and DLSS helps all of the above. If DLSS, or FSR for that matter, were not an option for this game, then aside from lowering resolution, which would mess up the 1:1 pixel mapping of the panel, my next option would be a much more expensive card or a much more power-hungry card. Neither of those would do for me. So yeah, upscaling all the way, baby.


PS: Why no RT shadows, you ask? Because it messes up the frametimes, and I believe the CPU is the culprit. It has become more evident that the CPU (my 12400F) is more of a problem than the video card itself when it comes to RT. The 4070 Ti can do 4K/Balanced/RT for about +25W from what I have seen. Yeah, this 12400F will become a 13600K when prices drop a bit, but I am hearing something about 14th-gen compatibility on Z690, so I'm waiting.


PS2: While we're at it, although a bit off topic, DLSS helps greatly on the VRAM side. 8GB seems to be enough for the Balanced preset, and I swear this is the best-looking game I have seen to date. Even better than Jedi Survivor. If there is something next-gen out there, this is it. And a very interesting and different game, I might add.


PS3: For those people saying they will go get a console instead, keep in mind that on the PS5 at least, as tested by Digital Foundry, this game went from 1440p/30/medium with ultra textures to 1080p/60 with less-than-medium settings and less-than-ultra textures. The extra video RAM didn't help either. You can have a great 4K/60/Ultra/DLSS Balanced gaming experience even on a 4070, and, I don't know, maybe a 4060 Ti? Go get consoles, buddies, and good luck. :)
 
Last edited:

coercitiv

Diamond Member
Jan 24, 2014
6,257
12,197
136
So what are my results, you ask? I will start with the obvious. At 4K native, as you can see, the card is pulling a maximum of 270W and 60fps is not possible. At 4K Quality, 60fps is possible at 265W. At 4K Balanced we drop to 215W. 4K Performance sits at 187W, which seems a little high compared to native 1080p at only 135W. However, there's a reason for that.

And the reason is image quality.
How much power for 1440p native?
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
...aside from lowering resolution, which would mess up the 1:1 pixel mapping of the panel...
Err, what? Using GPU scaling sends a native image to the panel regardless of chosen resolution.

Now if you say "but it isn't native 1:1 internally!" well guess what? DLSS/FSR/XeSS aren't either.
 
  • Like
Reactions: Tlh97 and Leeea

CP5670

Diamond Member
Jun 24, 2004
5,517
592
126
Both are important, but it's very game-specific for me. RT in some games looks amazing (Control, Metro Exodus), and in other games (Jedi Survivor, Cyberpunk) I can barely see any difference, or there is some difference but it's hard to say it's actually an improvement.
 

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
Err, what? Using GPU scaling sends a native image to the panel regardless of chosen resolution.

Now if you say "but it isn't native 1:1 internally!" well guess what? DLSS/FSR/XeSS aren't either.
Oh, that's great. I will have to do some on/off testing, but I guess that explains why The Callisto Protocol, which I played at 1080p on the 3060 Ti on that 4K panel, looked OK. That, and maybe because it was very dark due to the nature of the game. However, there was a clear difference for A Plague Tale: Requiem.
 

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
So ummm, is there any specific test that measures CPU performance with RT on? What CPU is best for RT? (And why does RT affect the CPU, or vice versa? I don't get it.)
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,949
7,363
136
The one test everyone seems to miss is how all this upscaling gizmogetry looks compared to old-school monitor scaling.

We see it compared against native res, but what about compared against just stretching 1080p across a 4K display?
 

coercitiv

Diamond Member
Jan 24, 2014
6,257
12,197
136
What CPU is best for RT? (And why does RT affect the CPU, or vice versa? I don't get it.)
AFAIK some of the calculations associated with RT are done on the CPU (for now at least). This means an additional CPU workload, which may introduce a bottleneck depending on the game's CPU utilization. The worst case is when the game is already CPU-bound before enabling RT.

The best "raster" gaming CPUs are usually the best at handling the RT overhead as well, if only because they already start from pole position in terms of processing the game. Beyond that, CPUs with generous caches and low-latency access to memory should be favored in this race, as AFAIK the RT-related CPU workload is particularly latency-sensitive.
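To make that "additional CPU workload" a bit more concrete, here's a rough, purely illustrative sketch in C++. This is hypothetical engine-style code, not any real graphics API; every type and function name in it is made up. The idea it shows: with RT on, the game has to make an extra pass over its dynamic objects every frame to refresh the acceleration-structure instance list before the GPU can trace anything, on top of the normal frame update.

```cpp
// Purely illustrative sketch of the per-frame CPU cost RT can add.
// None of this is a real engine or graphics API; the types and functions
// (DynamicObject, buildRtInstances, ...) are invented for illustration.
#include <chrono>
#include <cstdio>
#include <random>
#include <vector>

struct Transform { float m[12]; };  // 3x4 object-to-world matrix
struct RtInstance { Transform xform; unsigned blasId; unsigned mask; };

struct DynamicObject {
    Transform xform;
    unsigned blasId;   // which bottom-level acceleration structure it uses
    bool visible;
};

// Normal (raster-only) CPU work per frame: update object transforms.
void updateObjects(std::vector<DynamicObject>& objs, float t) {
    for (auto& o : objs)
        o.xform.m[3] = t;  // pretend animation: slide everything along X
}

// Extra CPU work when RT is enabled: rebuild the top-level instance list
// that the acceleration-structure update will consume. In a real engine the
// objects usually live behind pointers scattered around the heap, so this
// pass is more memory-latency bound than this contiguous toy version.
std::vector<RtInstance> buildRtInstances(const std::vector<DynamicObject>& objs) {
    std::vector<RtInstance> instances;
    instances.reserve(objs.size());
    for (const auto& o : objs)
        if (o.visible)
            instances.push_back({o.xform, o.blasId, 0xFF});
    return instances;
}

int main() {
    std::mt19937 rng(42);
    std::vector<DynamicObject> scene(200'000);
    for (auto& o : scene) {
        o.blasId = rng() % 500;
        o.visible = (rng() % 4) != 0;  // ~75% of objects visible
    }

    using clk = std::chrono::steady_clock;
    for (int frame = 0; frame < 5; ++frame) {
        auto t0 = clk::now();
        updateObjects(scene, float(frame));
        auto t1 = clk::now();
        auto instances = buildRtInstances(scene);  // the "RT tax" on the CPU
        auto t2 = clk::now();

        auto us = [](clk::time_point a, clk::time_point b) {
            return std::chrono::duration_cast<std::chrono::microseconds>(b - a).count();
        };
        std::printf("frame %d: raster update %lld us, RT instance pass %lld us (%zu instances)\n",
                    frame, (long long)us(t0, t1), (long long)us(t1, t2), instances.size());
    }
}
```

In a real game that bookkeeping feeds the GPU's acceleration-structure build, but the takeaway is the same: it's extra CPU work every frame that scales with scene complexity, which is presumably why generous caches and fast memory access help, and why a game that is already CPU-bound takes the biggest hit.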
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,853
3,211
126
I am too busy watching the big explosions in front of me, trying not to die and save myself a restart of 20-30 minutes of progress, to be watching the reflections off a puddle on the floor or the grass waving in unison with the wind.

So yeah... RT is nice... but I don't really care most of the time.
 

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
The one test everyone seems to miss is how all this upscaling gizmogetry looks compared to old-school monitor scaling.

We see it compared against native res, but what about compared against just stretching 1080p across a 4K display?
BFG suggested that, and I will have to look for the GPU scaling option in Nvidia's control panel. I use default settings, so if it is enabled by default, then it was enabled when I did the tests. The 4K Performance DLSS compared to native 1080p was night and day, and this can be seen in the screenshots too; I mentioned the tree trunk, for example.

One could question what happens with the motion vectors, since the screenshots were taken from a standstill, but I can only attest to my personal experience: DLSS Performance was also far better in motion.

I will also test native 1440p, as was suggested, with power measurements and all, and see. I think I did such a test on the 3060 Ti a while back, in Chorus, on the same panel, and 4K/DLSS was better there too. But I will retest, because memory falters.
 

CP5670

Diamond Member
Jun 24, 2004
5,517
592
126
I don't use GPU scaling because somehow the Nvidia drivers don't output 4K properly with it. It may be some specific issue with the LG OLEDs, but it looks cropped, with black bars, and blurry.
 

Hotrod2go

Senior member
Nov 17, 2021
298
168
86
I never use upscaling or ray tracing, at least with my current interests in gaming atm. Even into the foreseeable future, unless a game comes out that I really want these features in, I don't care about them.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
So out of 111 votes, roughly 2/3 of people don't give a crap about RT. That's telling for a tech forum that tends to be more informed than the masses.

I couldn't even tell an IQ difference with some of these. And with some others, rasterization looked better. Of course RT still universally takes a massive dump on performance.


 
  • Like
Reactions: Tlh97 and Ranulf

GodisanAtheist

Diamond Member
Nov 16, 2006
6,949
7,363
136
So out of 111 votes, roughly 2/3 of people don't give a crap about RT. That's telling for a tech forum that tends to be more informed than the masses.

I couldn't even tell an IQ difference with some of these. And with some others, rasterization looked better. Of course RT still universally takes a massive dump on performance.



-Eh, the poll isn't worded particularly well. I voted yes to both because they are important from a tech/marketing/competitive standpoint, but they're not terribly relevant to me, the games I actually play, or the settings I actually play them at (i.e. they're important, but 144fps native is even more important).

I'm sure some folks voted "no" for the same reason, and for a whole host of other reasons.

Also, this board is full of a bunch of geezers who aren't really representative of the broader market, which tends to latch on to and assign undue importance to marketing buzzwords.
 

Ranulf

Platinum Member
Jul 18, 2001
2,388
1,270
136
Vex in that second video makes a good argument that ray tracing is really good for creative work in TV/movies but not so much in games yet. There are some really good amateur videos on YouTube using Unreal Engine to make fairly realistic-looking background scenes; just people doing this in home studios with an average PC, not on major workstations with professional-level GPUs.
 

Mopetar

Diamond Member
Jan 31, 2011
7,941
6,242
136
I'm surprised that even a third of people here actually care. The technology is still too new and developers too inexperienced to really put it to good use. Unless you have a top-end GPU, the performance hit just isn't worth it in my opinion, or you have to use some other technology that reduces quality to compensate for the performance loss.

Unless we go to 8K next generation, whatever the 5090 turns out to be is going to be able to spit out more frames than most monitors will be able to handle. At least RT offers some extra eye candy for those cards. Or they'll be able to play whatever decade-old game gets a re-release with a full RT mode at passable frame rates.

It'll be more interesting to see what the next generation of consoles does. RT was their buzzword for this generation, but they never stick with the same thing twice. 8K seems like something that's out of reach for console hardware, but perhaps they'll implement some DLSS/FSR into the hardware to make it possible to get 60 FPS 8K on a console.
 

coercitiv

Diamond Member
Jan 24, 2014
6,257
12,197
136
Also, this board is full of a bunch of geezers who aren't really representative of the broader market, which tends to latch on to and assign undue importance to marketing buzzwords.
And yet the OP just posted two videos where young content creators talk about how RT isn't that impressive to them, mostly because delivering RT on affordable cards means sacrificing quality to the point where you might as well not bother. Impressive RT needs lots of compute power to deliver both impressive quality and super-fluid gameplay. That requires high-end systems, both GPU- and CPU-wise. And who has the high-end systems? Mostly the geezers :p
 
  • Like
Reactions: Ranulf

menhera

Junior Member
Dec 10, 2020
21
66
61
RT in Control.png
I believe computer graphics seeks realism, so I voted for ray tracing but not for upscaling. I just don't like bad implementations full of noise and artifacts, like in the pic.
 
  • Like
Reactions: MrPickins

Aapje

Golden Member
Mar 21, 2022
1,434
1,955
106
I'm surprised that even a third of people here actually care. The technology is still too new and developers too inexperienced to really put it to good use.
I think the issue is more that mainstream tech, and here I'm looking at consoles in particular, is just too weak for it. So developers can't actually make games just for RT, but have to make the game primarily for non-RT. That means they can't optimize the game around RT, so it generally won't take full advantage of the possibilities.

So the games are then actually mostly non-RT games with reflective glass and puddles.
Unless we go to 8K next generation, whatever the 5090 turns out to be is going to be able to spit out more frames than most monitors will be able to handle. At least RT offers some extra eye candy for those cards.
Yeah, but a proper breakthrough requires the x060-class card to be able to run it decently, not the 90-class card. And with stagnating price/performance, it will take longer for us to get to that point.

It'll be more interesting to see what the next generation of consoles does. RT was their buzzword for this generation, but they never stick with the same thing twice. 8K seems like something that's out of reach for console hardware, but perhaps they'll implement some DLSS/FSR into the hardware to make it possible to get 60 FPS 8K on a console.

They could finally go for high refresh rates rather than 30/60Hz.
 

psolord

Golden Member
Sep 16, 2009
1,968
1,205
136
A quick stop-frame benchmark again, Starfield this time: 4070 Ti, 4K, Ultra. Native res vs FSR2 at its lowest resolution setting.

41fps at native with 225W power draw; 60fps with FSR2 at 112W. The game is completely unplayable at 4K native on the 4070 Ti, while it's buttery smooth with FSR2.

I don't know, is native really 113W better looking?

Starfield_2023_09_03_20_35_59_399.jpg



Starfield_2023_09_03_20_35_13_465.jpg

Also, FSR2 uses 1GB less VRAM, which is about 18% less. And yeah, 8GB is enough once again in a very high-caliber AAA game, but that's a subject for another thread. I have many more screenshots on that specific matter, which will be posted soon.