Metro Exodus benchmarks

Hitman928

Golden Member
Apr 15, 2012
1,718
162
136
#2
Pretty different results between the 2 sites. Haven't dug in to see if they include details that could explain the difference.
 

amenx

Platinum Member
Dec 17, 2004
2,380
37
126
#3
TPU has even tougher settings, with Advanced PhysX and HairWorks on.
 

Hitman928

Golden Member
Apr 15, 2012
1,718
162
136
#4
TPU has even tougher settings, with Advanced PhysX and HairWorks on.
Why then are TPU's results showing so much higher performance than Guru3D's?


 

amenx

Platinum Member
Dec 17, 2004
2,380
37
126
#5
OK, found out why. They used different benches. Guru3D used the game's built-in benchmarking tool, which should more accurately reflect both the demanding and the lighter parts of the game. TPU benchmarked actual gameplay, and they apparently only went through the lighter parts of the game.
 

Hitman928

Golden Member
Apr 15, 2012
1,718
162
136
#6
I wish sites would post a video of their test sequence if they choose to roll their own, otherwise it's hard to put any credence in the results.

It also looks like ray tracing can only be set to high or ultra, so when they are testing high, they are really testing what I think most would call medium.

Really good-looking game though, and RTX used for GI is, I think, its best use case right now. I know someone in a "tear down" video pointed out some spots where the GI looked a bit noisy, but you'd probably never see it while playing unless you stopped to look for it. Always liked the Metro games, might have to pick this one up too.
 

pauldun170

Diamond Member
Sep 26, 2011
5,649
83
126
#7
Too bad they pulled this off Steam. As an RTX owner I would have bought this on release day.
 
Aug 11, 2008
10,457
66
126
#8
They pulled it from Steam? Is it coming back? I got a Steam gift card for Christmas and this was one of the few games I was interested in.
 

amenx

Platinum Member
Dec 17, 2004
2,380
37
126
#9

mohit9206

Golden Member
Jul 2, 2013
1,012
54
116
#10
How about some medium-settings tests at 1080p to see if cards like the RX 560/1050 Ti can provide a smooth experience?
 

Dribble

Golden Member
Aug 9, 2005
1,654
101
106
#12
Metro went with the Epic Games Store. This title at least won't be back on Steam.
https://arstechnica.com/gaming/2019/01/epic-games-store-snags-metro-exodus-away-from-steam/
It's a one-year exclusive on Epic, then it will be back on Steam. The backlash was pretty heavy, which is good, so I think a lot of other game makers will think twice when offered the same deal.

As for the game, well, the lighting is just better. It needs a seriously powerful GPU to manage it (with RTX), but you get next-gen lighting if you have it. Reminds me of the olden days when new games came out and actually made GPUs struggle.
 

Veradun

Senior member
Jul 29, 2016
259
21
86
#13
Jul 12, 2006
94,005
2,091
136
#15

railven

Diamond Member
Mar 25, 2010
6,404
99
126
#17
I'm starting to feel like the enthusiast PC Gaming Nerd is a relic and slowly dying. What I mean is, it seems people (or at least from my perspective) are more accepting of this garbage blur than when card reviewers did Image quality comparisons.

With the introduction of on-the-fly scaling (thanks consoles) there is more support for "as long as it's 4K" versus "it has to be native 4K". Got into a discussion with my oldest nephew who is into gaming, and he doesn't seem to care when I was talking about blurring. "But if your FPS is higher, why does it matter?" I'm like "because it's blurry, it's a blurry mess" and of course "do you stop to look at the scenery or are you playing a game?" Outside of "stop and smell the roses" kind of reply, I had nothing else to say.

If RTX+DLSS gives people more visual punch at the expense of blur, I get the feeling they'll take it. CGI in movies has gotten to the point where you can't even see where a frame starts and ends, it's all one blurred mess. And the kids love it!

I'm not even 40 and I need a rocking chair already.
 

n0x1ous

Platinum Member
Sep 9, 2010
2,455
37
106
#18
Yeah, the blur from DLSS is a no-go for me. I buy top-end cards for top performance and image quality. RTX BF5 performance has improved a lot since launch though, and 4K RTX ultra without DLSS on a 2080 Ti is very playable now for single player imo.
 

ZGR

Golden Member
Oct 26, 2012
1,796
37
106
#19
No DLSS + No TAA will make the game nice and crisp!
I'm starting to feel like the enthusiast PC Gaming Nerd is a relic and slowly dying. What I mean is, it seems people (or at least from my perspective) are more accepting of this garbage blur than when card reviewers did Image quality comparisons.

With the introduction of on-the-fly scaling (thanks consoles) there is more support for "as long as it's 4K" versus "it has to be native 4K". Got into a discussion with my oldest nephew who is into gaming, and he doesn't seem to care when I was talking about blurring. "But if your FPS is higher, why does it matter?" I'm like "because it's blurry, it's a blurry mess" and of course "do you stop to look at the scenery or are you playing a game?" Outside of "stop and smell the roses" kind of reply, I had nothing else to say.

If RTX+DLSS gives people more visual punch at the expense of blur, I get the feeling they'll take it. CGI in movies has gotten to the point where you can't even see where a frame starts and ends, it's all one blurred mess. And the kids love it!

I'm not even 40 and I need a rocking chair already.
Most people I know who game don't care about framerate, resolution, and all that good stuff. I can't imagine sinking nearly 1,000 hours into a game locked at 30 fps at a weird downscaled resolution when I could play it at unlocked fps at native resolution.

I have the same issue with Gears of War 4 on PC, though. TAA is globally enabled to boost performance. Just a nice smear of vaseline over the whole image.
 
Feb 19, 2016
154
7
71
#20
I'm starting to feel like the enthusiast PC Gaming Nerd is a relic and slowly dying. What I mean is, it seems people (or at least from my perspective) are more accepting of this garbage blur than when card reviewers did Image quality comparisons.

With the introduction of on-the-fly scaling (thanks consoles) there is more support for "as long as it's 4K" versus "it has to be native 4K". Got into a discussion with my oldest nephew who is into gaming, and he doesn't seem to care when I was talking about blurring. "But if your FPS is higher, why does it matter?" I'm like "because it's blurry, it's a blurry mess" and of course "do you stop to look at the scenery or are you playing a game?" Outside of "stop and smell the roses" kind of reply, I had nothing else to say.

If RTX+DLSS gives people more visual punch at the expense of blur, I get the feeling they'll take it. CGI in movies has gotten to the point where you can't even see where a frame starts and ends, it's all one blurred mess. And the kids love it!

I'm not even 40 and I need a rocking chair already.
There have always been people like that. Some just don't care about visual quality; for them, good visuals aren't required to enjoy the game. I know people like that.
 

coercitiv

Diamond Member
Jan 24, 2014
3,111
387
136
#21
Got into a discussion with my oldest nephew who is into gaming, and he doesn't seem to care when I was talking about blurring. "But if your FPS is higher, why does it matter?" I'm like "because it's blurry, it's a blurry mess" and of course "do you stop to look at the scenery or are you playing a game?" Outside of "stop and smell the roses" kind of reply, I had nothing else to say.

If RTX+DLSS gives people more visual punch at the expense of blur, I get the feeling they'll take it.
I don't get that feeling at all, and it's based on the same discussion you had with your nephew: he cares about FPS - about how the environment feels in motion - not whether he can stop and enjoy rose petal reflections in puddles.

There's an interesting paradox here: DLSS and RT were introduced as complementary features in RTX, with DLSS purposely aimed at increasing performance in order to compensate for the cost of RT features (reflections, illumination etc). However, in order for this goal to be achieved, the dichotomy is inescapable: DLSS must sacrifice image quality for performance (so that RT can do the exact opposite).

What I suspect you would like to see is DLSS working as an "independent" feature, and that is the DLSS 2X: running the game engine at native resolution and having the AI improve on that. Maybe we'll see that in the future.
 
Jul 24, 2017
65
1
41
#22

According to GN there is a significant performance difference between the benchmark and the actual gameplay.

Honestly, I don't think the performance is that bad. Ultra + RTX High = 50 fps avg at 4K on the 2080 Ti. That's actually pretty impressive, and if you ran at High + RTX High, given the large delta in performance between Ultra and High, you could probably run at 4K/60 with RTX.
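Quick napkin math on that (just a sketch; the 50 fps average is from the numbers above, but the Ultra-to-High uplift is an assumed placeholder, not a benchmark):

```python
# Napkin math for the 4K/60 estimate above.
# The 50 fps average (2080 Ti, 4K, Ultra + RTX High) comes from the post;
# the Ultra-to-High uplift below is an assumed placeholder, not a measured number.
ultra_fps = 50.0

needed_uplift = 60.0 / ultra_fps - 1.0
print(f"Uplift needed for 4K/60: {needed_uplift:.0%}")   # 20%

assumed_high_uplift = 0.25  # guess: High preset ~25% faster than Ultra
high_fps = ultra_fps * (1.0 + assumed_high_uplift)
print(f"Estimated High + RTX High: {high_fps:.0f} fps")  # ~62 fps
```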

And it looks like the RTX 2060 is capable of 1080p/60 with RTX on. That's actually pretty good.

Overall these are not bad numbers at all.
 

railven

Diamond Member
Mar 25, 2010
6,404
99
126
#23
Most people I know who game don't care about framerate, resolution, and all that good stuff. I can't imagine sinking nearly 1,000 hours into a game locked at 30 fps at a weird downscaled resolution when I could play it at unlocked fps at native resolution.

I have the same issue with Gears of War 4 on PC, though. TAA is globally enabled to boost performance. Just a nice smear of vaseline over the whole image.
My post probably came off lacking some details, but this is basically what I mean. I get that image quality is subjective, but I see more "gamers" quick to throw native resolution out the window for quasi-4K that introduces a myriad of distortions such as blurriness, rendering issues, etc.

There were always people like that. Some just doesn't care about visual quality. It's like for them good visual quality is not required to enjoy the game. I know people like that.
Oh yeah, that's always been a given. I guess what makes it weird now is that our GPU vendors are pushing for this stuff. Historically they've pushed for higher frame rates and higher resolutions. Now we see NV dipping its toes into upscaling. Someone had some great comparison shots (granted, still images, so you got that) of basically...wait, let me just link them.

https://www.techpowerup.com/forums/threads/nvidia-dlss-test-in-battlefield-v.252545/#post-3994023

To me, I'd take the 75% resolution with TAA over DLSS in this particular situation, but ultimately I'd just settle for native + TAA if I could swing it.
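To put rough pixel numbers on that comparison (a quick sketch; the ~1440p DLSS internal resolution for a 4K output is an assumption based on commonly cited figures, not something measured here):

```python
# Pixel-count sketch for the render-scale comparison above.
# The DLSS internal resolution (~1440p for a 4K output in BF V) is an
# assumption based on commonly cited figures, not verified here.
NATIVE = (3840, 2160)

def scaled(res, scale):
    # A render-scale slider scales each axis, so pixel count drops by scale**2.
    return (int(res[0] * scale), int(res[1] * scale))

options = {
    "Native 4K": NATIVE,
    "75% render scale": scaled(NATIVE, 0.75),       # 2880x1620, ~56% of pixels
    "DLSS internal (assumed 1440p)": (2560, 1440),  # ~44% of pixels
}

native_pixels = NATIVE[0] * NATIVE[1]
for label, (w, h) in options.items():
    ratio = (w * h) / native_pixels
    print(f"{label}: {w}x{h} ({ratio:.0%} of native pixels)")
```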
I don't get that feeling at all, and it's based on the same discussion you had with your nephew: he cares about FPS - about how the environment feels in motion - not whether he can stop and enjoy rose petal reflections into puddles.
If that were true he wouldn't load up mods that essentially cripple his frame rate. He asked me why I cared, since I'm the one that's all "I just bought a new video card, my frame rates improved," so he thinks I only care about frame rate. When I showed him my settings, he asked why mine look so sharp. I said I don't use SMAA or TAA, I use SSAA whenever I can, which is why I often talk about GPU performance, i.e. frame rates. He games at like <30 FPS because he uses texture mods that basically destroy his VRAM. I've tried to explain it, but "it runs better than my Xbox" - well, of course it does, but still...oh well. Let them be.

EDIT: I also set up their PC for TV gaming, so they run at 1080p@60 and their GPU requirements aren't as high. When they come over I switch over to the living room TV, which is also 1080p@60, so I've got tons of processing power to slather some SSAA on the games they'd most likely want to play (Minecraft for the younger ones, and COD for the older ones).
 

Triloby

Senior member
Mar 18, 2016
561
53
86
#25
Probably a stupid comment, but how is enabling DLSS any different from adjusting the resolution slider in a game's graphical options? To me, it looks like you're losing a fair amount of sharpness just to gain performance. You could do much the same thing by decreasing the internal resolution a bit and get a similar performance boost without losing as much image quality.

The ray tracing is a different thing, since it looks good in certain areas but doesn't seem to make much of a difference in other parts of the game. It sounds to me like most people would be happy playing with RTX off. To me, it's just a nice bonus, not a "must have" feature needed to enjoy the game.
 

