Question DF: Fortnite's Unreal Engine 5 Upgrade Reviewed - (Does UE5 reduce the need for HW RT?)


Heartbreaker

Diamond Member
Apr 3, 2006
I just watched Digital Foundry's analysis of the Unreal Engine 5 upgrade to Fortnite.

Its Lumen global illumination software mode looks as good as the HW version most of the time - very impressive.

If more games use UE5, then I think most people don't really need to be that concerned about HW RT, while still getting excellent global illumination:

 

Carfax83

Diamond Member
Nov 1, 2010
If your own personal criteria are what you're using to determine which technologies are good and which are bad, it doesn't really matter as far as buying whatever satisfies you, but trying to post on a forum as though it's anything other than your own subjective view is going to get pushback.

You're acting as though I'm anti Lumen or something. I support Lumen and I've said many times in this thread that I find it impressive for a software solution. The only thing I said was that software Lumen isn't real ray tracing because it cannot do off screen reflections. That's just my opinion, I never presented it as factual.

But it is a fact that software Lumen cannot do off screen reflections. Hardware Lumen can, however.

Methinks you just want to enjoy an expensive product and need to believe all of the marketing hype to justify the decision to yourself, because if all of those bells and whistles aren't the best, then how could the cost be justified? I can't think of any other reason why anyone would care so much about hardware ray tracing but still be perfectly fine with software frames. The only problem is that you end up arguing from a conclusion without thinking through the points to get there first, and it winds up looking ridiculous to anyone who isn't working through the process backwards and spots the glaring inconsistencies.

You can believe whatever you like, but it's clear you're misrepresenting my stance on this. I'm not anti Lumen, and I'm not saying that developers should forego it in favor of hardware ray tracing. I understand exactly why software Lumen was created, believe me.

However, just because Lumen uses ray tracing in its calculations doesn't mean it's actual ray tracing. If Lumen is actual ray tracing, why is there a separation between hardware and software? Why the differentiation?
 

NTMBK

Lifer
Nov 14, 2011
Imagination deficit?

Once it can fool the brain, it's good enough.

So you look off screen?

:eek:

It's very, very easy to spot the screen space effects, because you get discontinuities at the edge of the screen, and when objects in the scene occlude one another. Just rewatch the Digital Foundry video and look for the weird shimmering that happens, especially whenever the camera moves.

It's unfortunate that DF put so many static shots in their video, because that's the best case for these effects. The temporal upsampling has time to fully resolve, with no errors (because it's sampling the exact same scene multiple times), and you don't see the errors at the edge of the scene shifting and changing. Once you see it in motion, it's honestly kind of a blotchy mess. I'd rather take no RT over that sort of distracting shimmer.
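The edge-of-screen failure described above is easy to demonstrate: a screen-space trace only has the current frame's depth buffer to sample, so any ray that marches past the viewport border has no data left and must be reported as a miss. A minimal Python sketch (hypothetical names and a toy depth buffer, not Lumen's actual implementation):

```python
def screen_space_trace(depth, x, y, dx, dy, max_steps=64):
    """March a ray across a 2D depth buffer; return the hit pixel or None.

    Illustrative only: real engines march in view space and compare
    against interpolated depth, but the failure mode is the same --
    a ray that leaves the viewport simply has nothing to sample.
    """
    h, w = len(depth), len(depth[0])
    for _ in range(max_steps):
        x, y = x + dx, y + dy
        xi, yi = int(round(x)), int(round(y))
        if not (0 <= xi < w and 0 <= yi < h):
            return None  # ray left the screen: forced miss -> edge artifact
        if depth[yi][xi] < 1.0:  # closer than the far plane: a "hit"
            return (xi, yi)
    return None

# 8x8 buffer: far plane everywhere except an occluder column at x == 6.
depth = [[0.5 if x == 6 else 1.0 for x in range(8)] for y in range(8)]

print(screen_space_trace(depth, 2, 4, 1, 0))   # hits the occluder: (6, 4)
print(screen_space_trace(depth, 2, 4, -1, 0))  # exits the left edge: None
```

Two rays from the same pixel: one finds on-screen geometry, the identical ray aimed the other way walks off the buffer and fails, which is exactly the discontinuity that shifts and shimmers as the camera moves.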
 

Carfax83

Diamond Member
Nov 1, 2010
That's just it, Lumen ISN'T fake ray tracing. It is ray tracing, but it's more like truncated ray tracing when compared to DXR: roughly 95% of the data, for 50% more performance.

Never said, or even implied that Lumen was fake ray tracing. Software Lumen uses ray tracing to a much more limited extent than DXR, which is why it lacks off screen reflections.

And yes, some people certainly have to justify their purchases.

Contrary to some of the opinions in this thread, software Lumen is not a replacement for, or even a threat to, hardware ray tracing.

Like many things in life, ray tracing has tiers or degrees. Software Lumen is just another tier of ray tracing with limited scope to improve performance. That's all it is.
 

NTMBK

Lifer
Nov 14, 2011
Never said, or even implied that Lumen was fake ray tracing. Software Lumen uses ray tracing to a much more limited extent than DXR, which is why it lacks off screen reflections.



Contrary to some of the opinions in this thread, software Lumen is not a replacement for or even a threat for hardware ray tracing.

Like many things in life, ray tracing has tiers or degrees. Software Lumen is just another tier of ray tracing with limited scope to improve performance. That's all it is.

I think it's unhelpful that Epic just called the two options "hardware" and "software". That implies that it's the same algorithm in both cases, which simply isn't true. Software Lumen is using screen-space rays, while hardware Lumen is using the BVH. They're different algorithms, with different performance trade offs and different results.

Screen-space rays are a worse approximation, which is why the technique performs fast enough to run without hardware acceleration, and why it gives visibly worse image quality. Both are crude approximations of real physics, but HW Lumen is the closer one.
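The algorithmic split in that post can be sketched in a few lines: hardware Lumen queries a world-space BVH, so geometry that never appears in the frame can still be hit by a reflection ray. A hypothetical, heavily simplified traversal (real BVHs are binary, flattened, and walked on RT cores, not Python objects):

```python
from dataclasses import dataclass, field

@dataclass
class AABB:
    lo: tuple
    hi: tuple

def ray_hits_aabb(origin, direction, box, eps=1e-9):
    """Standard slab test: does the ray (t >= 0) intersect the box?"""
    tmin, tmax = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box.lo, box.hi):
        if abs(d) < eps:          # ray parallel to this slab pair
            if o < lo or o > hi:
                return False
            continue
        t1, t2 = (lo - o) / d, (hi - o) / d
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

@dataclass
class Node:
    box: AABB
    name: str = ""
    children: list = field(default_factory=list)  # empty -> leaf

def traverse(node, origin, direction, hits):
    """Depth-first BVH walk: prune any subtree whose bounds are missed."""
    if not ray_hits_aabb(origin, direction, node.box):
        return
    if not node.children:
        hits.append(node.name)
    for child in node.children:
        traverse(child, origin, direction, hits)

# Two objects; "offscreen_wall" sits behind the camera, invisible in the
# frame, but it still lives in the BVH, so a world-space ray can reach it.
visible = Node(AABB((2, -1, -1), (3, 1, 1)), "visible_box")
behind  = Node(AABB((-3, -1, -1), (-2, 1, 1)), "offscreen_wall")
root = Node(AABB((-3, -1, -1), (3, 1, 1)), children=[visible, behind])

hits = []
traverse(root, (0, 0, 0), (-1, 0, 0), hits)  # reflection ray aimed "off screen"
print(hits)  # ['offscreen_wall']
```

A screen-space pass has no equivalent of that last query: if the wall isn't in the depth buffer, no amount of marching will find it, which is the whole off-screen reflections gap being argued about in this thread.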
 
  • Like
Reactions: Carfax83

Carfax83

Diamond Member
Nov 1, 2010
I don't even have an RTX 4090 but have an RX 7900 XT. I believe HW accelerated Lumen is faster and smoother, i.e. higher framerates, than software based Lumen. Look, I get we want alternatives to RTX, but HW based RT is not going anywhere, as Lumen also uses it. I want AMD to match or surpass Nvidia in RT. I feel like this thread was made just because AMD is behind in HW RT.

If AMD was ahead of Nvidia in HW RT this thread would not exist and if it did people would say just get a Radeon.

I'm glad you stated this. There's definitely a trend of pro AMD users mocking Nvidia features (and Nvidia users by extension) for X, Y or Z reasons... but once similar features are adopted by AMD, then all of a sudden they are all onboard and gung-ho about it. It was very noticeable with DLSS and FSR.

I admit that I was skeptical of frame generation, but once I actually used it and saw no differences in image quality and experience, I'm convinced it is extremely useful. How many games come out on PC (especially at launch) that are horribly optimized, especially when it comes to CPU utilization? FG is like a magic feature that can neutralize developer incompetence/inaction when it comes to optimizing for the PC platform.

So far, I'm only really using it in one game, Witcher 3 Next Gen edition. The CPU optimization is really bad because they used a DX12 wrapper so the game is really only using two threads for rendering and ray tracing calculations which makes it CPU bottlenecked, even with a 13900KF.
 

Thunder 57

Platinum Member
Aug 19, 2007
I'm glad you stated this. There's definitely a trend of pro AMD users mocking Nvidia features (and Nvidia users by extension) for X, Y or Z reasons... but once similar features are adopted by AMD, then all of a sudden they are all onboard and gung-ho about it. It was very noticeable with DLSS and FSR.

I admit that I was skeptical of frame generation, but once I actually used it and saw no differences in image quality and experience, I'm convinced it is extremely useful. How many games come out on PC (especially at launch) that are horribly optimized, especially when it comes to CPU utilization? FG is like a magic feature that can neutralize developer incompetence/inaction when it comes to optimizing for the PC platform.

So far, I'm only really using it in one game, Witcher 3 Next Gen edition. The CPU optimization is really bad because they used a DX12 wrapper so the game is really only using two threads for rendering and ray tracing calculations which makes it CPU bottlenecked, even with a 13900KF.

NVIDIA Marketing should look into hiring you. "DLSS 3 Technology, now powered by magic!"
 

maddie

Diamond Member
Jul 18, 2010
You're acting as though I'm anti Lumen or something. I support Lumen and I've said many times in this thread that I find it impressive for a software solution. The only thing I said was that software Lumen isn't real ray tracing because it cannot do off screen reflections. That's just my opinion, I never presented it as factual.

But it is a fact that software Lumen cannot do off screen reflections. Hardware Lumen can, however.



You can believe whatever you like, but it's clear you're misrepresenting my stance on this. I'm not anti Lumen, and I'm not saying that developers should forego it in favor of hardware ray tracing. I understand exactly why software Lumen was created, believe me.

However, just because Lumen uses ray tracing in its calculations doesn't mean it's actual ray tracing. If Lumen is actual ray tracing, why is there a separation between hardware and software? Why the differentiation?
Never said, or even implied that Lumen was fake ray tracing. Software Lumen uses ray tracing to a much more limited extent than DXR, which is why it lacks off screen reflections.



Contrary to some of the opinions in this thread, software Lumen is not a replacement for or even a threat for hardware ray tracing.

Like many things in life, ray tracing has tiers or degrees. Software Lumen is just another tier of ray tracing with limited scope to improve performance. That's all it is.
Post 1) However, just because Lumen uses ray tracing in its calculations doesn't mean it's actual ray tracing.
Post 2) Never said, or even implied that Lumen was fake ray tracing.


From two consecutive posts by you.
 

coercitiv

Diamond Member
Jan 24, 2014
I'm glad you stated this. There's definitely a trend of pro AMD users mocking Nvidia features (and Nvidia users by extension) for X, Y or Z reasons... but once similar features are adopted by AMD, then all of a sudden they are all onboard and gung-ho about it. It was very noticeable with DLSS and FSR.
You seem confused. Many were against the first version of DLSS, since it introduced a VERY noticeable drop in image quality. Since then the overall opinion about DLSS on this forum with respect to image quality has changed drastically, as DLSS 2+ is worth considering. The same applied to FSR, people cheered the open nature and the noticeable improvement in quality from 1.0 to 2.0, but I have yet to see this sudden mass of users actively promoting the use of FSR that you speak of.
 

poke01

Senior member
Mar 8, 2022
You seem confused. Many were against the first version of DLSS, since it introduced a VERY noticeable drop in image quality. Since then the overall opinion about DLSS on this forum with respect to image quality has changed drastically, as DLSS 2+ is worth considering. The same applied to FSR, people cheered the open nature and the noticeable improvement in quality from 1.0 to 2.0, but I have yet to see this sudden mass of users actively promoting the use of FSR that you speak of.
Funny you say that, the same thing happened with DLSS 3. People spoke ill of it, calling it 'fake frames' on YouTube and Twitter, but December comes and AMD says it's doing it next year, and most people were suddenly okay with frame generation.

Even with the advent of RDNA 3, people hyped it beyond the moon: it would be more efficient and powerful than Lovelace. Instead it was the opposite, and the only reason I have an AMD card is because of Linux and that they open source their drivers. The moment Nvidia open sources their drivers, is more consumer friendly, and has sane prices, I will look into an RTX card.
 
  • Like
Reactions: Carfax83

coercitiv

Diamond Member
Jan 24, 2014
Funny you say that, the same thing happened with DLSS 3. People spoke ill of it, calling it 'fake frames' on YouTube and Twitter, but December comes and AMD says it's doing it next year, and most people were suddenly okay with frame generation.
Show me the discussion on this forum where "most people" are suddenly okay with frame generation.

Even with the advent of RDNA 3 people hyped it beyond the moon. It will be more efficient and powerful than Lovelace.
That's the problem of the people who fell for the hype. I followed the RDNA3 thread as I was in the market for a GPU this year, and there was no shortage of skeptics either. More importantly though, there's no shortage of critics in the thread today, after AMD managed to screw the pooch with their marketing and relatively poor performance/dollar. The day of the RDNA3 presentation was the day I chose to buy RDNA2.

The moment Nvidia open sources their drivers, is more consumer friendly, and has sane prices, I will look into an RTX card.
So you're staying with AMD or Intel.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
I don't get why people seem to attack only Nvidia based tech.
Again, who cares? If you don't have a horse in the race, what is it to you? It's weird how people get a persecution complex about hardware they have nothing to do with, other than being a consumer of.

I will answer the question though, since you seem to be missing the forest for all the trees blocking your view. First, we will dispense with the word attack; that is some pearl clutching hyperbole. Moving on, the reason I am critical of Nvidia is that they make everything proprietary, and then further engage in consumer unfriendly behavior by not providing features that will even work for all of their own customers. Worse yet, their competitors are the ones that looked out for millions of GTX customers by including them in FSR and XeSS support. And to beat the other dead horse: having to fight out the VRR battle to make Nvidia allow their own customers to have it without buying a more expensive monitor. Consumer unfriendly behavior is going to get called out every time.

Intel is new to the market, and has a bunch of wood to chop. Yet they still took the time to make a version of XeSS available to other vendors. It may not be great, but I for one applaud the effort.

None of that gets AMD off the hook for their shenanigans either. I don't like that they follow Nvidia's lead by paying to have games exclude competitors features. For example: The Callisto Protocol not having DLSS is uncool. Yes, everyone can use FSR, but DLSS on a supporting card should be an option too. It is a stupid way to spend money, and the sooner the practice stops, the better it is for PC gamers.
 
  • Like
Reactions: KompuKare

Aapje

Golden Member
Mar 21, 2022
That's the problem of the people who fell for the hype. I followed the RDNA3 thread as I was in the market for a GPU this year, and there was no shortage of skeptics either. More importantly though, there's no shortage of critics in the thread today, after AMD managed to screw the pooch with their marketing and relatively poor performance/dollar.

Indeed. But the weird part is that I see people arguing that only Nvidia gets criticism (and not just here), while I never see the equivalent complaint about AMD.

I'm just wondering where this delusion comes from.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
Indeed. But the weird part is that I see people arguing that only Nvidia gets criticism (and not just here), but don't see the same thing about AMD.

I'm just wondering where this delusion comes from.
Persecution complex, confirmation bias, whatever it is, it's weak. It's also inaccurate, as has been pointed out.

Take a look at how many pages the RX 6400/6500 thread is, and the number of people roasting them. Then have a look at how many pages the GTX 1630 thread is, and the number of people roasting it. The 1630 is the worst fps per dollar release I can think of. Yet the reaction was largely muted, with those most critical of the RX cards being conspicuously absent from the thread.

That's only one example of how AMD gets criticized when they have it coming. While also being an example of how an Nvidia product that is the worst in ages received far less ire than it deserved.

White knighting Nvidia is bad, and they should feel bad.
 
  • Like
Reactions: Stuka87

Carfax83

Diamond Member
Nov 1, 2010
NVIDIA Marketing should look into hiring you. "DLSS 3 Technology, now powered by magic!"

It may as well be magic, as it can in some cases double or even quintuple framerates. This technology is very useful for circumventing CPU limitations in poorly optimized titles, or for increasing performance dramatically in extremely heavy RT titles like Portal RTX.
 

Carfax83

Diamond Member
Nov 1, 2010
Post 1) However, just because Lumen uses ray tracing in its calculations doesn't mean it's actual ray tracing.
Post 2) Never said, or even implied that Lumen was fake ray tracing.


From two consecutive posts by you.

You're taking an isolated sentence without looking at the overall context of my argument.

Just because I don't consider software Lumen to be actual ray tracing, doesn't mean I think it's fake. I haven't used the word "fake" in association with the word Lumen throughout this entire thread.

Lumen clearly uses ray tracing, but at a much coarser level of approximation compared to hardware solutions (including HW Lumen), to the point where a major advantage of ray tracing like off screen reflections is not implemented.

If software Lumen had off screen reflections, I would never have even said anything. That is one of the distinguishing features of ray tracing, so its absence disqualifies software Lumen from being considered ray tracing... in my opinion.
 

Carfax83

Diamond Member
Nov 1, 2010
You seem confused. Many were against the first version of DLSS, since it introduced a VERY noticeable drop in image quality. Since then the overall opinion about DLSS on this forum with respect to image quality has changed drastically, as DLSS 2+ is worth considering. The same applied to FSR, people cheered the open nature and the noticeable improvement in quality from 1.0 to 2.0, but I have yet to see this sudden mass of users actively promoting the use of FSR that you speak of.

The original incarnation of DLSS was promoted mostly to increase performance when RT was enabled. Most Nvidia users understood that, as the performance hit with RTX enabled was very high back in the day, but I recall comments from the pro AMD crowd about how DLSS was cheating and whatnot. When the technology improved in terms of image quality, those sentiments diminished.

The anti DLSS 3 crowd remind me of the anti DLSS crowd, in how they are initially resentful and distrusting towards the new technology, but once AMD introduces their version, they suddenly become more receptive.
 
  • Like
Reactions: igor_kavinski

Thunder 57

Platinum Member
Aug 19, 2007
Oh my goodness. Even AMD is bringing frame generation next year. I don't get why people seem to attack only Nvidia based tech. Do you know how silly that reply is?

They pretty much have to. Surely NVIDIA will market it heavily, and if AMD didn't bring their own version it would be one less box to tick.

Funny you say that, the same thing happened with DLSS 3. People spoke ill of it, calling it 'fake frames' on YouTube and Twitter, but December comes and AMD says it's doing it next year, and most people were suddenly okay with frame generation...

Just because you say so doesn't make it true. Give me an example. Carfax seems to think so but he surely doesn't have a pro-AMD agenda.

The original incarnation of DLSS was promoted mostly to increase performance when RT was enabled. Most Nvidia users understood that as the performance hit with RTX enabled was very high back in the day, but I recall comments from the pro AMD crowd about how DLSS was cheating and what not. When the technology improved in terms of image quality, those sentiments diminished.

The anti DLSS 3 crowd remind me of the anti DLSS crowd, in how they are initially resentful and distrusting towards the new technology, but once AMD introduces their version, they suddenly become more receptive.

I think there was probably some confusion/misunderstanding about DLSS when it first came out. I remember people arguing as to whether AA or DLSS was better. With DLSS 2 that went away. Your speculation about FSR 3, well, let's wait and see. My guess is people will still see it as a gimmick, or that games will become so demanding it will be seen as a "necessary evil". I don't think any of that has to do with company preference. There are no smoke-filled AMD misinformation-spreading rooms out there, the way userbenchmark is for Intel.
 
  • Like
Reactions: Stuka87

Mopetar

Diamond Member
Jan 31, 2011
The anti DLSS 3 crowd remind me of the anti DLSS crowd, in how they are initially resentful and distrusting towards the new technology, but once AMD introduces their version, they suddenly become more receptive.

DLSS at least had one aspect that I thought was potentially useful, which was the ability to extend the life of older cards. To what extent companies support or try to discourage this is another matter, but the use case is there. The rest of it is at best a setting to minimize loss of image quality for the performance boost.

DLSS3 does not offer the same sort of uplift prospects since it seems to work best for most titles when the frame rate is already high enough that any added latency isn't felt and the frame time is low enough so that any particular bad frame isn't around long enough to really be noticed. It really only seems to be there to make impressive looking graphs. Encouraging this sort of behavior will only lead to more of it.

Why add more shaders when that silicon area can be spent on frame generation hardware instead? Don't worry about minimizing the impact to image quality, just tune the algorithms to give us bigger numbers. After all, the people buying the top-tier thousand dollar cards keep singing our praises, and far be it from us not to give them what they want.

All of that aside, what irks me the most is that the same crowd that's now largely praising this technology are the same sort that five years ago would look down their nose at you for suggesting lowering settings to very high (or god forbid high) to sacrifice some graphical quality for the uplift in frame rate, but are now happy to do just that. Of course it still says "ultra" on the settings so it must be the best.
 

Insert_Nickname

Diamond Member
May 6, 2012
All of that aside, what irks me the most is that the same crowd that's now largely praising this technology are the same sort that five years ago would look down their nose at you for suggesting lowering settings to very high (or god forbid high) to sacrifice some graphical quality for the uplift in frame rate, but are now happy to do just that. Of course it still says "ultra" on the settings so it must be the best.

Lowering from Ultra settings? Pure heresy.

Can't say I notice much difference between very high and ultra these days. Must be getting old. Ultra often seems to tank framerates without any noticeable improvement in visual quality. Or that's my opinion anyway.
 
  • Like
Reactions: igor_kavinski

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
Lowering from Ultra settings? Pure heresy.

Can't say I notice much difference between very high and ultra these days. Must be getting old. Ultra often seems to tank framerates without any noticeable improvement in visual quality. Or that's my opinion anyway.
It isn't just you. Multiple tech tubers have turned out content about how dumb playing on max settings is in the vast majority of games. For myself, max settings are for revisiting games years later, when the latest mid tier card can run them better than the flagships did at that time.
 

Dribble

Platinum Member
Aug 9, 2005
The DLSS and DLSS 3 hate is near identical. DLSS was raged against for its "fake" resolution because it was upscaling, and being fake it should be discounted in every test and feature comparison, even if the graphical results were near identical. DLSS 3 with "fake" frames sounds exactly the same to me.

I am sure there are still a few who hate DLSS 2, but the reality is most of the opposition magically disappeared when FSR 2 came out. I am sure the same will be true for DLSS 3.
 

Aapje

Golden Member
Mar 21, 2022
The next version of DLSS 3 does seem to fix the overlay artifacts, which is a major issue I have with it from seeing it in action on YouTube.