Ultra vs Low Textures in Deus Ex Mankind Divided Frametime testing


Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
...That's my point, you're trying to shoehorn those ~500 Gflops worth of computational time into effects that have been used for at least 15 years in PC games. The original Serious Sam had both, iirc, and it ran well on GPUs with less than 2 Gflops.

How about this:
[Image: Rise of the Tomb Raider texture quality performance chart]


There are a number of games that show very similar performance hits when they increase their texture atlas. Your argument just isn't sufficient as proof.

Dude are you serious? o_O

You honestly think a measly 2-3 FPS is anything? That's basically standard benchmark deviation as far as I'm concerned. You could run the benchmark 10 times and I can practically guarantee you that there'd be times when the very high setting would get a higher framerate than the low setting.. The reviewer implied as much when he said:

Performance: As you'd expect, adjusting Texture Quality has a minimal impact on performance, with only two frames per second separating Very High and Low.

This "discussion" has really hit rock bottom if all you guys have to pull up are benchmarks with a 2-3 FPS difference :rolleyes:
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
And yes, Deus Ex is exactly like GoW 4 when it comes to the use of bump mapping and specular mapping (and having it tied to the texture quality setting). The fact that you are unwilling to accept that normal mapping and specular mapping are used extensively by pretty much any modern game, and actually believe that GoW 4 is unique in this aspect, really doesn't do anything other than expose your own ignorance.

If bump mapping and specular mapping are used extensively in "pretty much any modern game" as you put it, then why isn't there a similar performance hit for textures in games like GTA V, Far Cry 4, Witcher 3, Rise of the Tomb Raider, Fallout 4, Just Cause 3, AC Syndicate, MGS V TPP? All have negligible deviation in performance with texture settings.. :rolleyes:

Out of the most recent titles on Geforce guides, only Call of Duty Black Ops 3 is similar to Gears of War 4 in performance hit from textures, and the article explains that it's because the "extra" setting affects the quality of other assets, much like the World texture detail setting does in Gears 4.

Whether or not the texture quality setting affects performance has less to do with the cost of the texture quality itself and more to do with exactly which textures are controlled by the setting. For instance, if only color map resolution is controlled by the setting, then the performance hit will be negligible (5% or less); on the other hand, if other textures such as normal maps and specular maps are controlled by the setting (as is the case in GoW 4 and Deus Ex), then the performance hit will be larger (10% or so). Some games will have even more textures controlled by the setting, such as effect textures and shadow textures.

Do you actually have any evidence that Deus Ex MD uses a similar system as Gears 4, or are you talking out of your ass like I was earlier? I guess I'm going to have to e-mail Nixxes and see what they say :D

Also, you might want to stop misquoting the devs. They clearly state that if the setting is set too high the game may start stuttering and slowing down, but otherwise (i.e. if you don't set it too high) it does not impact performance on the GPU or the CPU much.

Right, so how on earth did you get the idea that this comment somehow equals "we have other settings bundled along with the ultra quality textures" to explain Bacon1's performance loss? o_O
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
:rolleyes:

Considering that's the whole basis of your argument. :( :oops: :eek:

https://forums.anandtech.com/thread...rametime-testing.2489005/page-2#post-38522333

There is a ~4.1 fps difference in GOW4 texture settings from low -> Ultra, and I only lost 5.5 fps @ 3440x1440, which is over 30% more pixels than 2560x1440.

Do your own testing :D

You know, I have to say I have a big problem with how you're abusing the numbers here to suit your argument... The benchmark deviations for Character: 0.3, Effects: 0.4, and Lighting: 0.3, should really be viewed as negligible, because that's what they are. We are talking not even half a frame per second for these deviations! o_O You're just using those to inflate the difference as much as possible to make your argument less ridiculous than it actually is.. :rolleyes:

If NVidia had done the benchmark 10 times, they would have gotten totally different results every time, and ultra could have come out on top! So with these taken out of the picture, you're really looking at a 5.6% loss. Still statistically significant, but not as inflated as before..
 
  • Like
Reactions: Sweepr

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
You know, I have to say I have a big problem with how you're abusing the numbers here to suit your argument

Gotta say the exact same about you.

The benchmark deviations for Character: 0.3, Effects: 0.4, and Lighting: 0.3, should really be viewed as negligible, because that's what they are.

You are leaving off the biggest one... which was 3.1 or 6% by itself. Kinda goes back to my first sentence...

If NVidia had done the benchmark 10 times, they would have gotten totally different results every time, and ultra could have come out on top!

TIL guides by Nvidia aren't proper benchmarks, and that they didn't clearly show each option performing slightly worse than the previous...

Now you doubt both Nvidia's and my benchmarks, so sad you can't perform this test yourself. Do you even own the game and hardware? I'm starting to doubt you do since you won't just test it and instead have now called both me and Nvidia's benchmark guide liars.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
If bump mapping and specular mapping are used extensively in "pretty much any modern game" as you put it, then why isn't there a similar performance hit for textures in games like GTA V, Far Cry 4, Witcher 3, Rise of the Tomb Raider, Fallout 4, Just Cause 3, AC Syndicate, MGS V TPP? All have negligible deviation in performance with texture settings.. :rolleyes:

Out of the most recent titles on Geforce guides, only Call of Duty Black Ops 3 is similar to Gears of War 4 in performance hit from textures, and the article explains that it's because the "extra" setting affects the quality of other assets, much like the World texture detail setting does in Gears 4.

It's incredible that you have to have this explained to you again and again and again.

If a game only controls the resolution of its color maps with the texture quality setting, then there will be only a negligible hit to performance. If on the other hand a game controls the resolution of other types of texture (such as normal maps and specular maps) with the texture quality setting, then there will be a bigger hit to performance.

The fact that many games don't have their normal map and specular map resolution tied to the texture quality setting, doesn't mean that they don't use normal maps and specular maps.
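
To put some very rough numbers on the VRAM side of that (which is what matters for a 4GB card), here's a back-of-the-envelope sketch in Python. Every figure in it is an assumption for illustration; the material count, resolutions, and compression rate are not taken from Deus Ex MD or GoW 4:

# Back-of-the-envelope VRAM estimate for resident texture data.
# All resolutions, counts and compression rates are illustrative
# assumptions, not figures from any actual game.
MIP_OVERHEAD = 4 / 3      # a full mip chain adds roughly 33%
BYTES_PER_TEXEL = 1.0     # e.g. BC5/BC7-class block compression

def texture_mb(size):
    """Approximate VRAM (MB) for one compressed square texture with mips."""
    return size * size * BYTES_PER_TEXEL * MIP_OVERHEAD / (1024 ** 2)

def material_mb(quality_res, scale_aux_maps):
    """One material = a color map plus normal and specular maps, which
    either follow the quality setting or stay at a fixed lower res."""
    aux_res = quality_res if scale_aux_maps else 1024
    return texture_mb(quality_res) + 2 * texture_mb(aux_res)

MATERIALS = 150  # hypothetical number of materials resident at once
for label, scaled in (("color maps only", False), ("all maps scaled", True)):
    total_gb = MATERIALS * material_mb(2048, scaled) / 1024
    print(f"{label:>16}: ~{total_gb:.1f} GB of texture data at 'ultra'")

The toy numbers only matter as a ratio: if the setting scales just the color maps, the ultra footprint ends up around half of what it becomes once the normal and specular maps scale along with it, and that second case is exactly where a 4GB card starts having to swap.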

Do you actually have any evidence that Deus Ex MD uses a similar system as Gears 4, or are you talking out of your ass like I was earlier? I guess I'm going to have to e-mail Nixxes and see what they say :D

You mean evidence such as screenshots that clearly show the increase in normal map resolution, such as these ones (medium, ultra) or these ones (medium, ultra)? The difference in normal mapping is most evident in the uniforms (there are obvious artifacts from low resolution bump mapping with medium textures) or in the stitching in Jensen's coat; you can also see the difference in POM in the cobblestone.

Right, so how on Earth do you seem to think that this comment somehow equals,"we have other settings bundled along with the ultra quality textures" to explain Bacon1's performance loss? o_O

Question is, why do you seem to think that the statement equals "we control color map resolution, and only color map resolution, with the texture quality setting"?

By all means go out and test it with a 6/8GB card and provide some actual evidence to support your position.

If NVidia had done the benchmark 10 times, they would have gotten totally different results every time, and ultra could have come out on top!

Talk about circumventing the laws of physics :rolleyes:
 

Piroko

Senior member
Jan 10, 2013
905
79
91
Dude are you serious? o_O
Yes.

You honestly think a measly 2-3 FPS is anything? That's basically standard benchmark deviation as far as I'm concerned.
Let's recapitulate your basis of argumentation:

Are we looking at the same graph? The blue represents ultra textures, and it's clear that it has many more frame time spikes into the 30 and 40ms range than the orange. Low textures seem to stick around the 15-22ms range, but ultra is much higher, in the mid to upper 20s for the most part, with a lot more spikes into the 30s and 40s..
A short look at a single frametime graph is "/end discussion" for you, even after it was shown that the graph you were so insistent on analysing probably isn't representative of displayed frametimes but may get paced before final output. Then you go on to focus on the fps difference as a sign of issues, but throw out any data that notes fps differences in the same range regardless of VRAM size. Are you by chance doing gender studies?

You could run the benchmark 10 times and I can practically guarantee you that there'd be times when the very high setting would get a higher framerate than the low setting..
63.50, 61.92, 61.36, 60.41
All numbers ±0.2. Take a guess.
 
  • Like
Reactions: Bacon1

coercitiv

Diamond Member
Jan 24, 2014
7,331
17,341
136
You honestly think a measly 2-3 FPS is anything? That's basically standard benchmark deviation as far as I'm concerned. You could run the benchmark 10 times and I can practically guarantee you that there'd be times when the very high setting would get a higher framerate than the low setting.. The reviewer implied as much when he said:
Need some help over here, did you just say Nvidia chose the worst possible benchmark runs to portray their product performance? That sounds a bit counterintuitive, wouldn't they be better off choosing an averaged result to paint their product in a fair light? What Nvidia employee would willingly portray a false 5-6% loss in product performance for a setting change which is free in reality?

This "discussion" has really hit rock bottom if all you guys have to pull up are benchmarks with a 2-3 FPS difference :rolleyes:
Don't worry about that, it was rock bottom well before that. :tropicaldrink:
 
  • Like
Reactions: Bacon1

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
You are leaving off the biggest one... which was 3.1 or 6% by itself. Kinda goes back to my first sentence...

I already acknowledged that one as being legitimate. The others, however, are not. So this throws a big spike into your numbers, because for your correlation with Deus Ex MD to be accurate, you'd have to show that ultra texture quality has other enhancements that affect performance in a negative manner to a level that would reach double digits.

So far you have not done this, and instead have relied on assumptions to make sense of the fact that the Fury falls flat on its ass with ultra quality textures. I've read and seen multiple sources which suggest that ultra quality textures and 4GB GPUs don't mix well in Deus Ex MD. You're the only one saying otherwise, and given your history of hyping HBM as being able to defy the normal limitations of VRAM, you can surely understand why I'm suspicious. So the only thing that would satisfy me at this stage would be for you to do a video upload so I can see for myself the stutters and hitching, similar to what that other guy on YouTube did.

TIL guides by Nvidia aren't proper benchmarks, and that they didn't clearly show each option performing slightly worse than the previous...

It's not a question of whether they are proper benchmarks. Benchmarks by their nature always have some slight deviation. You can run any benchmark program that tests performance, whether on the CPU, RAM or GPU, and the scores will most likely be different each time. That's just the nature of benchmarking. The fact that you don't know this tells me you must be new to benchmarking or something.. :confused:

Now you doubt both Nvidia's and my benchmarks, so sad you can't perform this test yourself. Do you even own the game and hardware? I'm starting to doubt you do since you won't just test it and instead have now called both me and Nvidia's benchmark guide liars.

I don't doubt NVidia's benches, but I doubt your interpretation of their data, and your own. Your 99th percentile for ultra quality was around 35ms, which is pretty damn high. There is no way in hell there wasn't any stuttering or frame lag.. o_O
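
For what it's worth, a 99th percentile frametime of ~35ms means the slowest 1% of frames are coming in at under roughly 29 fps equivalent. Here's a quick sketch of how that figure gets pulled out of a frametime capture; the sample data below is generated, not Bacon1's actual log:

# Minimal sketch: average fps and 99th percentile frametime from a
# frametime capture (milliseconds per frame, FRAPS/PresentMon style).
# The data below is synthetic, generated purely for illustration.
import random
import statistics

random.seed(1)
frametimes_ms = ([random.uniform(16, 22) for _ in range(2970)]
                 + [random.uniform(30, 45) for _ in range(30)])  # a few spikes
random.shuffle(frametimes_ms)

avg_fps = 1000 / statistics.mean(frametimes_ms)
p99_ms = statistics.quantiles(frametimes_ms, n=100)[98]  # 99th percentile

print(f"average: {avg_fps:.1f} fps")
print(f"99th percentile: {p99_ms:.1f} ms (~{1000 / p99_ms:.1f} fps equivalent)")

Whether a 35ms 99th percentile actually feels like stutter also depends on how those slow frames are clustered, which a single percentile number won't show; that's why the raw frametime graph matters too.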

As for whether I own the game and hardware, yes I do. If you want proof, I can furnish it easily by showing you a pic of my rig (you will know it's mine because my specs are listed in my sig), or a screenshot of a specific location in the game of your choice to prove that I actually own the game..

The reason why I don't want to conduct any tests on my end, is because:

1) There's no doubt that a GTX 1080 can easily handle ultra quality textures. I've been using ultra quality textures since the game launched without any problems and no impact on performance that I've noticed...

2) I'm too lazy to conduct any in depth benchmarks, especially if the outcome is already known.

3) A comparison with an NVidia GPU isn't the best option, because NVidia's drivers have a different VRAM management algorithm. An AMD GPU with 8GB would be the best comparison, or perhaps someone else with a Fury.

In fact, I asked a guy on another forum with a Fury whether he could use ultra quality textures in Deus Ex Mankind Divided. He has yet to reply to me :D

But when he does, I'll let you know..
 
  • Like
Reactions: Sweepr

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
It's incredible that you have to have this explained to you again and again and again.

If a game only controls the resolution of its color maps with the texture quality setting, then there will be only a negligible hit to performance. If on the other hand a game controls the resolution of other types of texture (such as normal maps and specular maps) with the texture quality setting, then there will be a bigger hit to performance.

The fact that many games don't have their normal map and specular map resolution tied to the texture quality setting, doesn't mean that they don't use normal maps and specular maps.

I e-mailed Nixxes asking them about the ultra quality textures, so we'll see what they say about it.. I also asked another guy on a forum who has a Fury to test Deus Ex MD with ultra quality textures so we can have two perspectives..

My issue is, I don't believe Bacon1 when he says that he has no stuttering or frame lag with ultra quality textures. His 99th percentile was too high to suggest smooth game play.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
A short look at a single frametime graph is "/end discussion" for you, even after it was shown that the graph you were so insistent on analysing probably isn't representative of displayed frametimes but may get paced before final output. Then you go on to focus on the fps difference as a sign of issues, but throw out any data that notes fps differences in the same range regardless of VRAM size. Are you by chance doing gender studies?

Did you just jump into the tail end of this discussion without reading the entire thread? Yes I thought so.. :rolleyes:

Actually, what ended the discussion for me was the raw data dump, and antihelten putting up the percentile chart, which showed that his 99th percentile was approximately 35ms; that's way too high to afford smooth game play. His 99th percentile would have to be much lower if he wants me to believe him when he says, and I quote, "There was no noticed in game stutter while I ran around for 3 minutes in Prague while it recorded."
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Need some help over here, did you just say Nvidia chose the worst possible benchmark runs to portray their product performance? That sounds a bit counterintuitive, wouldn't they be better off choosing an averaged result to paint their product in a fair light? What Nvidia employee would willingly portray a false 5-6% loss in product performance for a setting change which is free in reality?

I don't know where you, or anyone else, is getting the idea that I somehow don't trust NVidia's numbers... I do. The point I was making is that benchmarks have minor deviations from one run to another. I run benchmarks all the time, and it's not uncommon for there to be slight differences between each run, even with the same settings.. So benchmarks with a few FPS difference between them are typically viewed as negligible and just standard deviation.

Anyone that runs benchmarks on a regular basis will tell you this.. Doesn't matter if you're testing the CPU, GPU, RAM or whatever, there will always be minor deviations.
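
And if anyone actually wants to separate signal from noise here, the boring way is to repeat the run a handful of times per setting and compare the gap between the averages against the spread of the runs. A rough sketch, with invented numbers rather than anything from this thread:

# Crude check: is the fps gap between two settings bigger than the
# run-to-run noise? The run results below are invented for illustration.
import statistics

runs_low   = [62.1, 61.7, 62.4, 61.9, 62.2]   # fps with low textures
runs_ultra = [59.8, 60.3, 59.6, 60.1, 59.9]   # fps with ultra textures

mean_low, sd_low = statistics.mean(runs_low), statistics.stdev(runs_low)
mean_ultra, sd_ultra = statistics.mean(runs_ultra), statistics.stdev(runs_ultra)

gap = mean_low - mean_ultra
noise = sd_low + sd_ultra  # crude combined spread, not a formal t-test

print(f"low:   {mean_low:.1f} ± {sd_low:.2f} fps")
print(f"ultra: {mean_ultra:.1f} ± {sd_ultra:.2f} fps")
print(f"gap of {gap:.1f} fps is",
      "outside" if gap > noise else "within",
      f"the ±{noise:.2f} fps noise band")

Either way, a single run per setting can't distinguish the two cases, which is really the only point I'm making here.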
 

stahlhart

Super Moderator Graphics Cards
Dec 21, 2010
4,273
77
91
Thread closed. Take your stupid, pointless arguing to PM.
-- stahlhart
 