Question: DF: Fortnite's Unreal Engine 5 Upgrade Reviewed - (Does UE5 reduce the need for HW RT?)


Heartbreaker

Diamond Member
Apr 3, 2006
4,228
5,228
136
I just watched Digital Foundry's analysis of the Unreal Engine 5 upgrade to Fortnite.

Its Lumen global illumination in software mode looks as good as the HW version most of the time - very impressive.

If more games use UE5, then I think most people don't really need to be that concerned about HW RT, while still getting excellent global illumination.

 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,827
7,190
136
If we use software ray tracing then we can no longer justify the purchase of the best GPU for hardware ray tracing.

- I don't know if you've heard, but software ray tracing isn't real.

When rays are traced by the CPU rather than by specialized hardware on a GPU, the rays are objectively of lower quality - typically made in some East Asian factory by children paid for volume over quality.
 

Spjut

Senior member
Apr 9, 2011
928
149
106
Once it was confirmed that the PS5 and Series consoles would have HW RT, I did think support for it would take off once support for the last-gen consoles was dropped. But UE5 taking such a stance in favor of software ray tracing seriously put a dent in that belief.
Especially on consoles, where the solutions can be made much more tailored, I had thought they would offload all they could to the dedicated units.

I just cannot see the big hype for HWRT right now. The current tech demos for PC are so demanding that they make the case for Lumen a lot stronger IMO.

Whether it's hardware limitations or API limitations, it just feels like HWRT came too soon. The next "big thing" in rendering has been here since 2018, yet it is deemed too slow/inflexible to be the standard solution for the most popular graphics engine.
If Epic teamed up with one of the IHVs on a proof of concept, using whatever extensions necessary, to show that it's possible to update DXR to accelerate SW Lumen - or to remove the barriers that keep them from recommending the current HW RT path in all cases - the case for dedicated HW acceleration would be a lot stronger again.
 
  • Like
Reactions: GodisanAtheist

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
This actually reminds me a lot of PhysX. When PhysX first came out, it was a hardware-only option (it required CUDA). And nVidia people went on and on about it requiring their hardware in order to work. People would even run a second GPU that would only do the physics work. Even people with AMD cards would run a second nVidia GPU for it.

Then, after a few years, with game companies thoroughly annoyed at spending time developing features that only worked on some cards, it got opened up. And lo and behold, it ran BETTER in software than it did on a GPU.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
It came out 3 years ago (2019, after I looked it up), so my memory of it was a bit off. But all the heavy work is done by the CPU, and it works on any GPU, even GPUs like the GTX 1660 that otherwise have no HW RT.

It's the video that @coercitiv linked to, and it's as I said: the GPU still does the ray tracing, but the CPU does extensive BVH setup and maintenance. Embree dramatically accelerates the BVH side of ray tracing workloads, to the point where the tracing itself can be done on the shader cores without requiring HW acceleration.
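For anyone curious what the CPU-side part looks like, below is a minimal sketch against Embree's C API (assuming Embree 3.x; the single triangle and the ray are placeholder data). Note this sketch also traces on the CPU, whereas the setup described above keeps only the BVH build/maintenance on the CPU while the shader cores trace - the point is just to show where the BVH work happens.

```cpp
// Minimal CPU ray tracing sketch with Embree 3.x. The BVH is built on the
// CPU inside rtcCommitScene(); rtcIntersect1() then traverses it.
#include <embree3/rtcore.h>
#include <cstdio>
#include <limits>

int main() {
    RTCDevice device = rtcNewDevice(nullptr);
    RTCScene scene = rtcNewScene(device);

    // One placeholder triangle facing the ray.
    RTCGeometry geom = rtcNewGeometry(device, RTC_GEOMETRY_TYPE_TRIANGLE);
    float* v = (float*)rtcSetNewGeometryBuffer(
        geom, RTC_BUFFER_TYPE_VERTEX, 0, RTC_FORMAT_FLOAT3, 3 * sizeof(float), 3);
    v[0] = -1; v[1] = -1; v[2] = 2;   // vertex 0
    v[3] =  1; v[4] = -1; v[5] = 2;   // vertex 1
    v[6] =  0; v[7] =  1; v[8] = 2;   // vertex 2
    unsigned* idx = (unsigned*)rtcSetNewGeometryBuffer(
        geom, RTC_BUFFER_TYPE_INDEX, 0, RTC_FORMAT_UINT3, 3 * sizeof(unsigned), 1);
    idx[0] = 0; idx[1] = 1; idx[2] = 2;
    rtcCommitGeometry(geom);
    rtcAttachGeometry(scene, geom);
    rtcReleaseGeometry(geom);

    rtcCommitScene(scene);  // <- BVH build happens here, entirely on the CPU

    // Trace a single ray from the origin down +Z.
    RTCRayHit rh{};
    rh.ray.org_x = 0; rh.ray.org_y = 0; rh.ray.org_z = 0;
    rh.ray.dir_x = 0; rh.ray.dir_y = 0; rh.ray.dir_z = 1;
    rh.ray.tnear = 0.0f;
    rh.ray.tfar  = std::numeric_limits<float>::infinity();
    rh.ray.mask  = 0xFFFFFFFF;
    rh.hit.geomID = RTC_INVALID_GEOMETRY_ID;

    RTCIntersectContext ctx;
    rtcInitIntersectContext(&ctx);
    rtcIntersect1(scene, &ctx, &rh);

    if (rh.hit.geomID != RTC_INVALID_GEOMETRY_ID)
        std::printf("hit at t = %f\n", rh.ray.tfar);  // expect t = 2.0
    else
        std::printf("miss\n");

    rtcReleaseScene(scene);
    rtcReleaseDevice(device);
    return 0;
}
```

In the hybrid approach from the video, the commit/refit side of this would run on the CPU every frame while the traversal and shading stay on the GPU's shader cores.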
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
It took decades for real-time RT to become feasible; I think we can wait a bit longer to have it done efficiently with a mix of software and hardware.

Meanwhile, not using "jiggery poker" is costing us an arm and a leg, both in prices and in power consumption. "Moore's Law is dead," so we decided we want to take the EXPENSIVE route to improving IQ in games. What could go wrong?

Either way is going to require a lot of computation. The consoles have to turn off Lumen in Fortnite's 120 FPS mode because the CPU is heavily taxed - and that's with consoles having dedicated hardware blocks and more efficient APIs to reduce CPU utilization for things like draw calls and compression/decompression.

PC has much more powerful CPUs available, yes, but the overhead is also going to be much greater. The optimal performance/fidelity combination may be software Lumen for GI with hardware ray tracing for reflections.
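For reference, that split is something you can already experiment with in UE5 through console variables. A rough DefaultEngine.ini sketch - the cvar names below are recalled from memory and may differ between engine versions, so treat them as assumptions to verify:

```ini
; Hedged sketch of DefaultEngine.ini settings for UE5. Cvar names are from
; memory and may vary by engine version - verify before relying on them.
[/Script/Engine.RendererSettings]
; 1 = Lumen for dynamic GI, 1 = Lumen for reflections
r.DynamicGlobalIlluminationMethod=1
r.ReflectionMethod=1
; Allow Lumen to use hardware ray tracing at all
r.Lumen.HardwareRayTracing=1
; Keep the GI gather on the software path, but let reflections use HWRT
r.Lumen.ScreenProbeGather.HardwareRayTracing=0
r.Lumen.Reflections.HardwareRayTracing=1
```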

Lumen is also UE5-only, so it won't be available in other engines unless it's licensed out.

Overall though, I think Nvidia and AMD (but especially Nvidia), as well as Microsoft and Sony, are too invested in HW-accelerated RT and upscaling to let up off the gas pedal, much less abandon these technologies.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
"jiggery poker" indeed. I see a bit of a similarity to the ornithopter vs conventional aircraft debate. AFAIK, the gimmick used by older games was quite efficient computationally vs the brute force RT method.

The argument "All that hassle just to get a reflection in a mirror." seems so ignorant of computational realities. As an aside, you need a universe to fully simulate a universe.

My point is that to get a simple reflection took a whole lot of developer effort back in the day and it wasn't reproducible throughout the game. With RT, accurate off screen reflections come with zero or very little developer effort because the algorithm takes care of such things. Same thing with global illumination.

RT is a huge benefit for game developers because they don't have to spend nearly as much time or effort making sure that lighting, shadows and reflections are properly implemented.
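To make the old gimmick concrete: a flat mirror was usually done by rendering the scene a second time through a camera reflected about the mirror plane. A self-contained sketch of that reflection matrix (plane given as n·x + d = 0; the names here are mine, purely for illustration):

```cpp
#include <array>
#include <cstdio>

// Build the 4x4 matrix that reflects points about the plane n.x + d = 0
// (n must be unit length). Old-school flat mirrors rendered the scene a
// second time with the view transformed by this matrix.
using Mat4 = std::array<float, 16>; // row-major

Mat4 reflectionMatrix(float nx, float ny, float nz, float d) {
    return {
        1 - 2*nx*nx,   -2*nx*ny,     -2*nx*nz,    -2*nx*d,
         -2*ny*nx,    1 - 2*ny*ny,   -2*ny*nz,    -2*ny*d,
         -2*nz*nx,     -2*nz*ny,    1 - 2*nz*nz,  -2*nz*d,
          0,            0,            0,            1,
    };
}

int main() {
    // Mirror on the plane y = 0 (n = (0,1,0), d = 0): (1,2,3) maps to (1,-2,3).
    Mat4 m = reflectionMatrix(0, 1, 0, 0);
    float p[4] = {1, 2, 3, 1}, q[4] = {};
    for (int r = 0; r < 4; ++r)
        for (int c = 0; c < 4; ++c)
            q[r] += m[r*4 + c] * p[c];
    std::printf("reflected point: (%g, %g, %g)\n", q[0], q[1], q[2]);
}
```

The limitation is visible in the math: the matrix is tied to a single plane, so every mirror needed its own extra scene pass, which is why old games kept mirrors rare and carefully placed.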
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Kind of an odd stance to take where it needs to be real HW ray tracing to count, but it doesn't matter if the entire frame isn't real.

That's a poor analogy, because FG produces a frame that is nearly 100% identical to one that had actually been rendered. I personally can't tell the difference at all. But with software "ray tracing," off-screen reflections (which are a distinguishing feature of ray tracing) are completely absent.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
This actually reminds me a lot of PhysX. When PhysX first came out, it was a hardware-only option (it required CUDA). And nVidia people went on and on about it requiring their hardware in order to work. People would even run a second GPU that would only do the physics work. Even people with AMD cards would run a second nVidia GPU for it.

Then, after a few years, with game companies thoroughly annoyed at spending time developing features that only worked on some cards, it got opened up. And lo and behold, it ran BETTER in software than it did on a GPU.

I see your point, but PhysX was proprietary and doomed to fail for that alone, while DXR is available to Nvidia, Intel and AMD.

In the end, I think Lumen is going to be beneficial to the evolution of ray tracing. It's a hybridized solution that works well in tandem with hardware RT, and once it's properly optimized for hardware acceleration, it could very well be the most performant approach yet.
 

biostud

Lifer
Feb 27, 2003
18,251
4,765
136
This actually reminds me a lot of PhysX. When PhysX first came out, it was a hardware-only option (it required CUDA). And nVidia people went on and on about it requiring their hardware in order to work. People would even run a second GPU that would only do the physics work. Even people with AMD cards would run a second nVidia GPU for it.

Then, after a few years, with game companies thoroughly annoyed at spending time developing features that only worked on some cards, it got opened up. And lo and behold, it ran BETTER in software than it did on a GPU.
When PhysX came out, it required a dedicated PhysX card from Ageia. Ageia was later bought by Nvidia.

[Attached image: DSC00409.jpg]
 
  • Like
Reactions: Stuka87

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Once it was confirmed that the PS5 and Series consoles would have HW RT, I did think support for it would take off once support for the last-gen consoles was dropped. But UE5 taking such a stance in favor of software ray tracing seriously put a dent in that belief.
Especially on consoles, where the solutions can be made much more tailored, I had thought they would offload all they could to the dedicated units.

I just cannot see the big hype for HWRT right now. The current tech demos for PC are so demanding that they make the case for Lumen a lot stronger IMO.

Whether it's hardware limitations or API limitations, it just feels like HWRT came too soon. The next "big thing" in rendering has been here since 2018, yet it is deemed too slow/inflexible to be the standard solution for the most popular graphics engine.
If Epic teamed up with one of the IHVs on a proof of concept, using whatever extensions necessary, to show that it's possible to update DXR to accelerate SW Lumen - or to remove the barriers that keep them from recommending the current HW RT path in all cases - the case for dedicated HW acceleration would be a lot stronger again.
There has to be something new to sell. More of the same is a much weaker selling point than new features. HWRT, AI upscaling and AI frame interpolation are new features - we will spend more to get something that supports them. Hence they will always get the hype.

That said, the "HWRT came too soon" argument never held water - plenty of good HWRT games exist that look much better with RT on. UE5 and Lumen are the way they are specifically because they were optimized for the consoles, which have rubbish HWRT; if the consoles had better HWRT support, UE5 would have been different. I'm sure Epic would have loved better HWRT on the consoles, but it didn't exist, so they made the best renderer they could with the HW available.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
When PhysX came out, it required a dedicated PhysX card from Ageia. Ageia was later bought by Nvidia.

[Attached image: DSC00409.jpg]

Oh, right! I forgot that they originally developed it. But I still think the comparison holds up. HW RT just has the special hardware built into the GPU instead of on an add-in card.
 
  • Like
Reactions: biostud

Spjut

Senior member
Apr 9, 2011
928
149
106
That said, the "HWRT came too soon" argument never held water - plenty of good HWRT games exist that look much better with RT on. UE5 and Lumen are the way they are specifically because they were optimized for the consoles, which have rubbish HWRT; if the consoles had better HWRT support, UE5 would have been different. I'm sure Epic would have loved better HWRT on the consoles, but it didn't exist, so they made the best renderer they could with the HW available.

"Much better" is subjective, but the majority of times when comparisons are made there's a vocal group about it not being worth the performance cost.

Lumen being optimized for the consoles is the other side of the coin I think people are looking at the wrong way. I think it speaks volumes of the current DXR hardware when EPIC decided to bet on a solution that's leaving the dedicated units idle. And that's on consoles, where every "wasted" CPU and GPU cycle costs them dearly. If the RT units had been able to contribute in any way, that would have benefitted their performance targets.

As it is now, these hardware blocks are idle when using the software path, and Epic is unable to recommend the hardware path for all use cases. And looking at Fortnite, I do think the HWRT is getting more praise than it deserves considering what SW RT does without having special hardware to accelerate it. Having had these hardware blocks changed for compute had probably benefitted most UE5 titles more.

The one thing that has proven itself though IMO is DLSS. It has been challenged more with the improvements of FSR2 and XeSS, but that's a feature that gives a huge performance boost with trade-offs that many think aren't noticeable to the average joe.
 

Mopetar

Diamond Member
Jan 31, 2011
7,848
6,015
136
That's a poor analogy, because FG produces a frame that is nearly 100% identical to one that had actually been rendered. I personally can't tell the difference at all. But with software "ray tracing," off-screen reflections (which are a distinguishing feature of ray tracing) are completely absent.

In the best-case scenario it produces something close enough that you won't notice unless you're looking for it (or unless the frame time is so low that it's bound to be particularly awful), but there are plenty of examples where it doesn't work, and what it outputs looks bad even in motion - almost like pure graphics nightmare fuel in isolation.

But the same argument applies in either direction. Some people won't notice the lack of RT when it's been faked to some or a total degree, so why get any more upset about that? If your own personal criteria are what you use to decide which technologies are good and which are bad, that's fine as far as buying whatever satisfies you, but posting on a forum as though it's anything other than your own subjective view is going to get pushback.

Methinks you just want to enjoy an expensive product and need to believe all of the marketing hype to justify the decision to yourself, because it was expensive, and if all those bells and whistles aren't the best, then how could the cost be justified? I can't think of any other reason why anyone would care so much about hardware ray tracing but still be perfectly fine with software frames. The problem is that you end up arguing from a conclusion without first thinking through the points that lead there, and it winds up looking ridiculous to anyone who isn't working backwards through the process and spots the glaring inconsistencies.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
In the best-case scenario it produces something close enough that you won't notice unless you're looking for it (or unless the frame time is so low that it's bound to be particularly awful), but there are plenty of examples where it doesn't work, and what it outputs looks bad even in motion - almost like pure graphics nightmare fuel in isolation.

But the same argument applies in either direction. Some people won't notice the lack of RT when it's been faked to some or a total degree, so why get any more upset about that? If your own personal criteria are what you use to decide which technologies are good and which are bad, that's fine as far as buying whatever satisfies you, but posting on a forum as though it's anything other than your own subjective view is going to get pushback.

Methinks you just want to enjoy an expensive product and need to believe all of the marketing hype to justify the decision to yourself, because it was expensive, and if all those bells and whistles aren't the best, then how could the cost be justified? I can't think of any other reason why anyone would care so much about hardware ray tracing but still be perfectly fine with software frames. The problem is that you end up arguing from a conclusion without first thinking through the points that lead there, and it winds up looking ridiculous to anyone who isn't working backwards through the process and spots the glaring inconsistencies.

That's just it: Lumen ISN'T fake ray tracing. It is ray tracing, but more like truncated ray tracing when compared to DXR - where it's 95% of the data for 50% more performance.

And yes, some people certainly have to justify their purchases.
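To spell out the "truncated" part: Lumen's software path traces against simplified proxies (signed distance fields) rather than the full triangle geometry. A toy sphere-tracing loop against an analytic SDF shows the basic idea (an illustrative sketch only, not Epic's code):

```cpp
#include <cmath>
#include <cstdio>

// Toy software ray trace against a signed distance field via sphere tracing:
// step along the ray by the distance to the nearest surface until we either
// hit something or give up. Lumen's software path uses this family of
// techniques against precomputed mesh distance fields.
static float sdfScene(float x, float y, float z) {
    // Single sphere of radius 1 centered at (0, 0, 5).
    float dx = x, dy = y, dz = z - 5.0f;
    return std::sqrt(dx*dx + dy*dy + dz*dz) - 1.0f;
}

static bool sphereTrace(float ox, float oy, float oz,
                        float dx, float dy, float dz, float* tHit) {
    float t = 0.0f;
    for (int i = 0; i < 128; ++i) {               // bounded step count
        float d = sdfScene(ox + t*dx, oy + t*dy, oz + t*dz);
        if (d < 1e-3f) { *tHit = t; return true; } // close enough: hit
        t += d;                                    // safe step: nearest surface
        if (t > 100.0f) break;                     // ray left the scene
    }
    return false;
}

int main() {
    float t;
    if (sphereTrace(0, 0, 0, 0, 0, 1, &t))
        std::printf("hit at t = %f\n", t);         // expect roughly 4.0
    else
        std::printf("miss\n");
}
```

Each iteration steps by the distance to the nearest surface, so no triangle tests or BVH are needed; the trade-off is that rays hit the simplified proxy shape rather than the exact mesh, which is roughly where the lost detail comes from.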
 
  • Like
Reactions: DAPUNISHER

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,622
146
People also buy RTX 4090s for productivity as they are much cheaper than Quadros if your workflow suits it
Who cares? This thread isn't about workflows, it's about gaming - and about how alternative approaches to Nvidia's are proving not only viable but, in important ways, preferable.

The narrative that has been pushed hard, is that you have to trace them rays! So you got to get yo self one of 'dem dar RTX cards for it! While you're at it, why not buy the most expensive one. That'll pump that stock price up so I can'z afford one using the profits I make off of it! ;)

That narrative is running out of steam. It reminds me of all the talking points posted over and over about why paying more for G-Sync was "totally worth it." Here we are years later, and expensive hardware G-Sync is practically dead, while the free-to-use implementations have proliferated. It is my fervent desire that RTX ends up experiencing the same fate, because it tries to force gamers to use Nvidia hardware... yet again.

The vast majority of us agree that inclusive is greater than exclusive. If the alternative to Nvidia hardware is another mildly inferior but universally usable option like FreeSync or FSR? Sign me up.
 

poke01

Senior member
Mar 8, 2022
741
726
106
I don't even have an RTX 4090; I have an RX 7900 XT. I believe HW-accelerated Lumen is faster and smoother, i.e. higher frame rates, than software Lumen. Look, I get that we want alternatives to RTX, but HW-based RT is not going anywhere, as Lumen also uses it. I want AMD to match or surpass Nvidia in RT. I feel like this thread was made just because AMD is behind in HW RT.

If AMD were ahead of Nvidia in HW RT, this thread would not exist - and if it did, people would say to just get a Radeon.
 
  • Like
Reactions: Carfax83

Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,801
136
I don't even have an RTX 4090; I have an RX 7900 XT. I believe HW-accelerated Lumen is faster and smoother, i.e. higher frame rates, than software Lumen. Look, I get that we want alternatives to RTX, but HW-based RT is not going anywhere, as Lumen also uses it. I want AMD to match or surpass Nvidia in RT. I feel like this thread was made just because AMD is behind in HW RT.

If AMD were ahead of Nvidia in HW RT, this thread would not exist - and if it did, people would say to just get a Radeon.

I wasn't talking about you. I think the comments have more to do with Nvidia's crappy behavior lately, which has angered many people.

There's no shortage of people complaining about AMD's 7900 XT pricing or about RDNA3 being overhyped.