Ultra vs Low Textures in Deus Ex Mankind Divided Frametime testing

Status
Not open for further replies.

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
There's still one more comparison that needs to happen - it's clear that moving to Ultra textures on the Fury gets you a net loss in frametimes. What we need to see, though, is whether that same behavior happens on a card with a bigger VRAM buffer (e.g. an 8GB card such as the 390X, since it is closest in speed and the same architecture). If both an 8GB card and the 4GB Fury show the same drop in frametimes from going to Ultra, then we would know that Ultra is just a taxing setting altogether. If it only drops on the Fury, then it is likely because of VRAM limitations.

We need the control data - does anyone with a 390 or 390X want to contribute? I only have a 4GB 290 myself...

You might need that comparison, but I don't. He used the same settings, with the only difference being ultra quality textures in one run and low quality textures in the other, which is adequate.. There is nothing at all unusual about these benchmarks. Deus Ex MD WARNS you that ultra textures are for GPUs with MORE than 4GB of VRAM, so I don't know why Bacon1 is still so obstinate about this.. :confused:
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
No it shows that it takes more FPS, not that it causes stuttering and issues. Moving up settings should take more FPS. If you or someone else wants to reproduce the test I've done with a 6/8GB card please, please feel free to prove me wrong and show that changing those settings has 0 Fps hit for you.

As I've told you before, textures don't impact FPS because there's no processing involved. Textures are simply stored in VRAM in a compressed state, and when they are needed they are decompressed on the fly either by the CPU or GPU which happens very quickly. The fact that you are getting a FRAMERATE LOSS with ultra quality textures enabled says it all, because it means that your GPU is having to swap textures in and out of RAM much more often, which takes time and increases frame latency.
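To illustrate the mechanism being argued about, here's a toy Python sketch (entirely synthetic numbers and a made-up swap penalty, not measurements from the game): when the texture working set exceeds VRAM, some frames pay an extra transfer stall, which barely registers in the average but blows up the worst frametimes:

```python
import random
import statistics

def simulate_frametimes(n, base_ms, working_set_gb, vram_gb,
                        swap_penalty_ms=8.0, swap_chance=0.3, seed=0):
    """Toy model: frames cost base_ms plus noise; if the texture working set
    exceeds VRAM, some frames also stall while a texture is re-uploaded."""
    rng = random.Random(seed)
    over_budget = working_set_gb > vram_gb
    times = []
    for _ in range(n):
        t = base_ms + rng.uniform(-0.5, 0.5)   # ordinary frame-to-frame noise
        if over_budget and rng.random() < swap_chance:
            t += swap_penalty_ms               # stall: texture swapped in over the bus
        times.append(t)
    return times

low = simulate_frametimes(1000, 18.9, working_set_gb=3.3, vram_gb=4.0)
ultra = simulate_frametimes(1000, 18.9, working_set_gb=4.5, vram_gb=4.0)

# The average moves a little, but the 99th-percentile frametime (the stutter) jumps:
print(round(statistics.mean(low), 1), round(statistics.mean(ultra), 1))
print(round(sorted(low)[989], 1), round(sorted(ultra)[989], 1))
```

The point of the sketch is only that swap stalls show up far more in the tail of the frametime distribution than in the average FPS number.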
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Yep, that would be best. However, if Nvidia cards show the same small fps drop, then it proves that it is the setting itself, not VRAM. Only if Nvidia cards showed something different would we then need to double-check whether it was AMD vs Nvidia or 4GB vs 6/8GB.

You should pay attention to your own testing methods. In your OP, these were the settings you used. The only setting that you changed between the runs was the texture setting. Everything else remained the same, so that shows that it is indeed the texture setting that negatively affected your frametime and framerate..

DX12
3440x1440
Ultra / Low Textures
16xAF
Shadow Quality: High
AO: On
CHS: Off
POM: High
DOF: On
LOD: High
Volumetric Lighting: On
Screen Space Reflections: On

TAA - On
Motion Blur - Off
Sharpen - Off
Bloom - On
Lens Flares - On
Cloth Physics - On
Subsurface Scattering - On
Chromatic Aberration - Off
Tessellation - On

I'm saying that the spikes aren't actually visible when playing, and that the graphs look worse than the gameplay actually is - and that's the case with both Nvidia and AMD (hence why I linked a 980 Ti). I don't have two systems to run FCAT with, so I'll have to make do with PresentMon.

Do your own testing and post your results; that's why this thread exists.

Just because you can't see them, doesn't mean they aren't there. Some people are oblivious to microstutters, and some people can barely tell the difference between 30 FPS and 60 FPS. So just because you can't notice it, doesn't mean it's not there..
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
You should pay attention to your own testing methods. In your OP, these were the settings you used. The only setting that you changed between the runs was the texture setting. Everything else remained the same, so that shows that it is indeed the texture setting that negatively affected your frametime and framerate..

I never said it wasn't the setting. What I was asking was for other people to test the setting to see if they also have their FPS affected. I was specifically asking people to test so we can determine if there is always an FPS drop or if it was VRAM related.

Just because you can't see them, doesn't mean they aren't there

Please explain why the same spikes happen even at low textures then? It is obviously not VRAM related there as it was at ~3.3/4GB used.
As I've told you before, textures don't impact FPS because there's no processing involved.

So prove it. Test it.

Even your wonderchild GoW4 has the texture settings split into multiple areas and shows slight fps drops even on a 1080. And those results are by Nvidia themselves.

http://www.geforce.com/whats-new/guides/gears-of-war-4-graphics-and-performance-guide

Please explain why the spikes show up on all of these, even while under 4gb of VRAM.

o9udaF2.png


b9b0Hwl.png


CWb1nz3.png


Bumping textures up from Low to High made a small overall fps difference; Very High made more of one, and Ultra more still, as shown in the graph. Only Ultra was hitting the 4GB cap, with Very High at ~3.8GB.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I never said it wasn't the setting. What I was asking was for other people to test the setting to see if they also have their FPS affected. I was specifically asking people to test so we can determine if there is always an FPS drop or if it was VRAM related.

This is fairly common knowledge amongst computer enthusiasts, so it shouldn't need to be tested.. Like I said before, you're the only person that actually finds these results surprising..

Please explain why the same spikes happen even at low textures then? It is obviously not VRAM related there as it was at ~3.3/4GB used.

I told you before, random spikes don't really matter. It's the percentiles that do. When you did the raw data dump, it proved beyond a doubt that ultra quality textures affected your frame latency in a negative manner.. This debate should have been finished with that, but of course you can't admit to being wrong..

So prove it. Test it.

There is nothing to prove or test on my end. You've done a marvelous job of deflating your own argument, with your own data.
Even your wonderchild GoW4 has the texture settings split into multiple areas and shows slight fps drops even on a 1080. And those results are by Nvidia themselves.

Are you serious? o_O

gears-of-war-4-effects-texture-detail-performance.png

gears-of-war-4-character-texture-detail-performance.png


Please explain why the spikes show up on all of these, even while under 4gb of VRAM.

I answered this on the first page. Lots of things can cause frametime spikes. It could be shaders being compiled by the CPU, or an asset that had to be loaded from storage rather than from RAM. Who knows, and who cares?

We are talking about thousands of frames here, looking at a few large spikes is meaningless in the grand scheme of things. The percentile data told us everything we need to know.
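For what it's worth, the percentile point is easy to demonstrate with a quick Python sketch over synthetic frametimes (the numbers here are made up for illustration, not from either run): a handful of big spikes barely move the average and only surface way out at the 99.9th percentile, while a shift in the bulk of the frames shows up in the lower percentiles:

```python
import statistics

def percentile(samples, p):
    """Nearest-rank percentile: the frametime that p% of frames come in under."""
    ordered = sorted(samples)
    k = min(len(ordered) - 1, max(0, round(p / 100 * len(ordered)) - 1))
    return ordered[k]

# Synthetic run: mostly smooth frames, some mild hitches, a few big spikes.
frametimes = [16.7] * 950 + [33.4] * 45 + [90.0] * 5

print(round(statistics.mean(frametimes), 1))  # a few spikes barely move the mean
print(percentile(frametimes, 95))
print(percentile(frametimes, 99))
print(percentile(frametimes, 99.9))           # the big spikes only show up out here
```

In practice you'd feed in a PresentMon log's frametime column instead of the synthetic list.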
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
I answered this on the first page. Lots of things can cause frametime spikes. It could be shaders being compiled by the CPU, or an asset that had to be loaded from storage rather than from RAM. Who knows, and who cares?

You were the one complaining about the spikes in your first post:

, and it's clear that it has many more frame time spikes

As to why I keep asking for someone else to test: with only a single set of data you can't prove anything. Also, add up all of the different texture options in GoW4 and you have:

Character: 0.3 fps, Effects: 0.4 fps, Lighting: 0.3 fps, World Texture Detail: 3.1 fps

gears-of-war-4-world-texture-detail-performance-640px.png


If you had to prioritize settings because of a lack of VRAM, we'd recommend making World Texture Detail and Lighting Texture Detail your top choices, followed by Character Texture Detail and Effects Texture Detail.

All together that is 4.1 fps at ~52 fps, or 8% slower.

I mean if you want the game to look like a blurry mess, enjoy your low world texture details...

gears-of-war-4-world-texture-detail-002-ultra.png


gears-of-war-4-world-texture-detail-002-low.png


That setting alone is 6% performance hit on a 1080 at a lower resolution than what I was playing at in DXMD.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
You were the one complaining about the spikes in your first post:

Yeah, the ones in the mid 20s to low 30s, which comprised the bulk of your readout for ultra quality textures.. The ones in the 90ms range I didn't care about..

I mean if you want the game to look like a blurry mess, enjoy your low world texture details...

If you had bothered to read the article, you'd see there is an explanation for this:

Additionally, World Texture Detail affects the resolution of bump mapping, and the very visible specular highlights that are seen throughout the game.

So that setting isn't just affecting textures.. Anyway, I'm done with this one. If you want to believe that your 4GB Fury can magically circumvent the laws of physics, then by all means, be my guest.. :cool:
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
If you had bothered to read the article, you'd see there is an explanation for this:

Actually I did read that part. The problem is we don't know everything the DXMD texture setting changes, do we? It's clear that developers lump in more than just textures. They even split their options into something like 40 settings, compared to only a handful for DXMD, and they STILL lumped other effects into the "World Textures" setting.

So again, Do your own testing and stop making assumptions.

I do think it's funny you were singing high praises for DXMD until you found out it was using AMD tech, at which point you no longer thought it was technically savvy anymore since Nvidia wasn't outpacing AMD.

Things I really like about the graphics are the screen space reflections which are detailed and have high resolution, and they also have a contact hardening effect applied as well. Also, the cloth physics setting is definitely noticeable. You can see the more realistic cloth movement, even on NPCs. The temporal AA is also very good, and gets rid of the vast majority of aliasing and shimmering. MSAA definitely isn't being missed at all here..

Then later:

That said, I think AMD's Gaming Evolved program leaves much to be desired. The graphics effects like CHS are second rate compared to what you find with NVidia's Gameworks..

Deus Ex Mankind Divided would have been a better looking game, and probably even a better performing game had it been a GW title rather than a GE title.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
If you had bothered to read the article, you'd see there is an explanation for this:

So that setting just isn't affecting textures.. Anyway, I'm done with this one. If you want to believe that your 4GB Fury can magically circumvent the laws of physics, then by all means, be my guest.. :cool:

Wow, texture settings change bump mapping and specular highlights! It's almost as if those things rely on bump maps and specular maps, which are themselves textures.

And if you had bothered looking into the game that this thread is about (Deus Ex), you would know that the texture settings in this game also visibly affect bump map resolution and specular highlighting. In fact, most games in my experience have these things controlled by the global texture setting; a notable exception would be some of the CoD titles, which have specific settings for normal map resolution and specular map resolution.

At the end of the day, the fact remains that the only way to conclusively say whether or not VRAM is an issue here is to test a 6/8GB card under the same conditions.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Actually I did read that part. The problem is we don't know everything the DXMD texture setting changes, do we? It's clear that developers lump in more than just textures. They even split their options into something like 40 settings, compared to only a handful for DXMD, and they STILL lumped other effects into the "World Textures" setting.

This is what you call a false equivalency. You assume that just because Gears 4 does it, Deus Ex MD must as well, because in your mind, that's the only explanation that can account for these things. Well, if you think so, the burden of proof is on you to show it. You also conveniently ignore the WARNING from the developers themselves that ultra quality textures are for GPUs with MORE than 4GB of VRAM.

qLc649.png


But by all means, continue with your hand wringing. It's entertaining, if anything :D

I do think it's funny you were singing high praises for DXMD until you found out it was using AMD tech, at which point you no longer thought it was technically savvy anymore since Nvidia wasn't outpacing AMD.

Last time I checked, a GTX 1080 (which is what I have) is faster than any AMD card in DXMD. Also, the reason I said those things had mostly to do with the fact that both the CHS and Very High ambient occlusion settings were bugged and actually degraded IQ rather than enhancing it. The shadow draw distance was actually reduced when using CHS (later fixed in a patch), and the Very High ambient occlusion setting introduced temporal artifacts which looked terrible..

NVidia's PCSS and HBAO+ would have been much better solutions, objectively speaking, because at least they work properly.
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Wow, texture settings change bump mapping and specular highlights! It's almost as if those things rely on bump maps and specular maps, which are themselves textures.

Specular maps and bump mapping use minor computations in the shaders, which explains the slight performance hit. Unless you want to be disingenuous however, what Bacon1 is experiencing isn't exactly a "slight" performance hit..

And if you had bothered looking into the game that this thread is about (Deus Ex), you would know that texture settings in this game also visibly affects bump map resolution and specular highlighting. In fact most games in my experience have these things controlled by the global texture setting, a notable exception would be some of the CoD titles which have specific settings for normal map resolution and specular map resolution.

No modern GPU should struggle with bump mapping and specular mapping, and the performance hit that Bacon1 gets with ultra quality textures suggests much more is going on. Also, if you scrutinize the YouTube video I linked to earlier, then you'll see that the Fury X has all the characteristics of struggling with texture swapping.

1) It's using 2.5GB more system RAM than the GTX 980 Ti, and 3.5GB more than the GTX 1080.

2) The frametime is unstable, whilst the actual framerate is more stable.

3) The frametime spiking is magnified during movement in large, dense areas like Prague.

At the end of the day, the fact remains that the only way to conclusively say whether or not VRAM is an issue here is to test a 6/8GB card under the same conditions.

I think most common sense people have already written this off for what it is and don't require further testing.. Bacon1 is suggesting that he knows better than the devs, which is not only misguided but exceedingly arrogant. You think they would have put this warning in there just for nothing?

qLc649.png
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Specular maps and bump mapping use minor computations in the shaders, which explains the slight performance hit. Unless you want to be disingenuous however, what Bacon1 is experiencing isn't exactly a "slight" performance hit..

I think most common sense people have already written this off for what it is and don't require further testing.. Bacon1 is suggesting that he knows better than the devs, which is not only misguided but exceedingly arrogant. You think they would have put this warning in there just for nothing?

The FPS difference in my testing is 11% at 3440x1440 (53 -> 47.5 avg fps). The GoW4 settings on a 1080 show an 8% loss @ 2560x1440 (56.1 -> 52). I'd call those nearly even, as I'm rendering 4,953,600 pixels vs 3,686,400, or 34% more.

When testing all low settings + Low textures vs all low + Ultra textures, FPS was 68.2 vs 65.4, or a 4% loss (still at 3440x1440). Low + Very High had a 2% loss (68.2 -> 66.7).
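For reference, those percentages can be checked with a couple of lines of Python (the FPS figures are the ones quoted in this thread; note the loss comes out slightly smaller when measured against the faster run as the baseline, which is why the rounded ~11%/~8% figures wobble a bit):

```python
def pct_loss(before_fps, after_fps):
    """FPS loss as a percentage of the faster (baseline) run."""
    return (before_fps - after_fps) / before_fps * 100

print(round(pct_loss(53.0, 47.5), 1))  # DXMD 3440x1440, Low -> Ultra textures
print(round(pct_loss(56.1, 52.0), 1))  # GoW4 on a 1080, World Texture Detail
print(round(pct_loss(68.2, 65.4), 1))  # DXMD all-low preset, Low -> Ultra textures
print(round(pct_loss(68.2, 66.7), 1))  # DXMD all-low preset, Low -> Very High
```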

I find it funny how you are resorting to personally attacking me instead of just testing it yourself.

Guess what, that SAME prompt shows up when you pick Very High, yet it doesn't go over 3.9GB used (hovers around 3.8GB). I showed that in my testing here: https://forums.anandtech.com/thread...rametime-testing.2489005/page-2#post-38521483

I'm asking for someone with 6/8GB or more to test the setting and see if it affects their fps. You said that texture settings only affect VRAM, but that is proven false, as games will bundle other settings in, and thus it does affect fps. Each of the other texture-only settings also affected FPS (2%), but you conveniently ignored those.

1) It's using 2.5GB more system RAM than the GTX 980 Ti, and 3.5GB more than the GTX 1080.

You keep bringing this up, which means you think it's important, which also means that the 980 Ti is having issues since it's using more system RAM than the 1080. You can't have it both ways.

You also keep bringing up a very old video as evidence instead of doing your own testing like I've been asking for days now. If you aren't going to bother testing, don't bother replying as you have nothing to offer this thread except for your misplaced insults and theories. I'm asking for hard data and testing.
 
Last edited:

96Firebird

Diamond Member
Nov 8, 2010
5,711
316
126
Thanks. Here's a quick couple of comparisons of the data, which should hopefully make it a bit clearer exactly how the different settings compare to each other performance-wise:

dMEuV3r.png

RiSLJxe.png

Note that frametime variance is calculated similarly to PCPer's methodology, with a 20-frame running average.

This should have ended the thread right here; variance is definitely affected by ultra textures.

There is nothing wrong with that, anyone expecting to use a 4GB card should keep in mind that they might have to lower a setting or two to keep VRAM in check. Not exactly groundbreaking stuff.
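The variance metric described above (each frame measured against a 20-frame running average, similar to PCPer's approach) can be sketched in a few lines of Python. This is an assumed reading of the methodology - PCPer's exact weighting may differ - but it captures the idea:

```python
def frame_variance(frametimes, window=20):
    """Each frame's deviation (ms) from the running average of the
    preceding `window` frames: spikes stand out, steady runs sit near zero.
    (Assumed reading of the PCPer-style metric; exact weighting may differ.)"""
    out = []
    for i in range(window, len(frametimes)):
        running_avg = sum(frametimes[i - window:i]) / window
        out.append(frametimes[i] - running_avg)
    return out

steady = [16.7] * 60                            # perfectly smooth run
spiky = [16.7] * 40 + [50.0] + [16.7] * 19      # one 50 ms hitch

print(max(abs(v) for v in frame_variance(steady)) < 1e-9)  # steady run: no variance
print(round(max(frame_variance(spiky)), 1))                # the 50 ms frame sticks out
```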
 

Azix

Golden Member
Apr 18, 2014
1,438
67
91
the question is what kind of impact textures have on performance. it varies by game. Someone should test this with a 480 or 390(x)

This should have ended the thread right here; variance is definitely affected by ultra textures.

There is nothing wrong with that, anyone expecting to use a 4GB card should keep in mind that they might have to lower a setting or two to keep VRAM in check. Not exactly groundbreaking stuff.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
the question is what kind of impact textures have on performance. it varies by game. Someone should test this with a 480 or 390(x)

Yeah, it's sad no one else is willing to step up and test. I've done a dozen or so runs for this so far.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
the question is what kind of impact textures has on performance. it varies by game. Someone should test this with a 480 or 390(x)

Textures only have an impact if:

1) Not enough VRAM

2) Specular and/or bump mapping applied to them, which is game dependent.

Textures by themselves though don't take any processing power from the GPU other than decompression, and from what I've read, that happens extremely quickly on modern GPUs. Fast enough for them to be able to do it on the fly in fact, so we're talking just a few milliseconds..
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
Textures only have an impact if:

1) Not enough VRAM

2) Specular and/or bump mapping applied to them, which is game dependent.

Textures by themselves though don't take any processing power from the GPU other than decompression, and from what I've read, that happens extremely quickly on modern GPUs. Fast enough for them to be able to do it on the fly in fact, so we're talking just a few milliseconds..

And once again, we don't know what all is modified by the Ultra texture setting. Since neither you nor anyone else will test, we don't know. It's been proven that games do other things under a "texture" setting.

It would be simple for you to run this test @ 1440p or 4K with Ultra and High settings and prove whether or not there is an FPS difference.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
Specular maps and bump mapping use minor computations in the shaders, which explains the slight performance hit. Unless you want to be disingenuous however, what Bacon1 is experiencing isn't exactly a "slight" performance hit.
Those minor computations in the shaders you're so eager to use as an explanation - 8% on a GTX 1080 between Low and High - amount to almost a GTX 460's worth of GFLOPS...
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
Specular maps and bump mapping use minor computations in the shaders, which explains the slight performance hit.

So in other words, when you previously said this: "textures don't impact FPS because there's no processing involved", you were just talking out of your ass, gotcha.

Unless you want to be disingenuous however, what Bacon1 is experiencing isn't exactly a "slight" performance hit..

It's blatantly obvious that the only one being disingenuous here is you, unless you want to argue that an 11% hit (Deus Ex) is significantly different from an 8% hit (GoW4)

Textures only have an impact if:

1) Not enough VRAM

2) Specular and/or bump mapping applied to them, which is game dependent.

Textures by themselves though don't take any processing power from the GPU other than decompression, and from what I've read, that happens extremely quickly on modern GPUs. Fast enough for them to be able to do it on the fly in fact, so we're talking just a few milliseconds..

Again bump maps and specular maps are textures, just like color maps. You can also have stuff like normal maps, glossiness maps, alpha maps, parallax maps and displacement maps. Some of these texture types can have very high performance hits (displacement maps for instance). So saying that textures don't take any processing power is just plain false. You could have said that color maps take no processing power (or at least so little that you won't really notice), which would have been largely true.

Generally speaking all modern games use at least normal maps (instead of bump maps) and specular maps and many of them also utilize parallax occlusion mapping or similar (this includes Deus Ex, although it is a separate setting from texture resolution). So the usage of bump mapping (or normal mapping) and specular mapping is not really game dependent these days.

Also, a few milliseconds is a lot when you're running at about 50 FPS. A 2 millisecond hit would be equal to a 10% performance hit (which coincidentally is almost exactly what we see in both Deus Ex and GoW4).
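The arithmetic here is worth spelling out, since frame-time and FPS percentages don't line up exactly - a 2 ms hit at 50 FPS is a 10% frame-time increase but only about a 9% FPS drop:

```python
def fps_after_delay(base_fps, extra_ms):
    """FPS once every frame costs extra_ms more than it did at base_fps."""
    base_frame_ms = 1000.0 / base_fps          # 50 FPS -> 20 ms per frame
    return 1000.0 / (base_frame_ms + extra_ms)

new_fps = fps_after_delay(50.0, 2.0)
fps_drop_pct = (50.0 - new_fps) / 50.0 * 100
frametime_increase_pct = 2.0 / (1000.0 / 50.0) * 100

print(round(new_fps, 1))                 # FPS after the 2 ms hit
print(round(fps_drop_pct, 1))            # loss measured in FPS
print(round(frametime_increase_pct, 1))  # same hit measured in frame time
```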
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Those minor computations in the shaders you're so eager to use as an explanation - 8% on a GTX 1080 between Low and High - amount to almost a GTX 460's worth of GFLOPS...

Gears of War 4 is likely an exception because of prevalent specular and bump mapping used throughout the game. But I'm confident that most games aren't like this.

Specular and bump mapping are fairly old technologies that modern-day GPUs can handle with ease..
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
And once again, we don't know what all is modified by the Ultra texture setting. Since neither you nor anyone else will test, we don't know. It's been proven that games do other things under a "texture" setting.

It would be simple for you to run this test @ 1440p or 4K with Ultra and High settings and prove whether or not there is an FPS difference.

How about this? Apparently Nixxes put up a PC optimization guide for Deus Ex MD on their forums. This is what they say about texture quality:

Texture Quality
This alters the resolution of textures used in the game. This primarily impacts GPU memory pressure. If this is set too high the game may start stuttering and slowing down. It does not impact performance on the GPU or the CPU much otherwise. We recommend it at High for 4GB cards. You have to restart the game to apply changes to Texture Quality.

Source

So as you can see, the texture quality setting doesn't really affect GPU or CPU performance according to the developers of the game :D
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
So in other words, when you previously said this: "textures don't impact FPS because there's no processing involved", you were just talking out of your ass, gotcha.

I will admit that I wasn't fully correct, but I will stand by my statement that for the vast majority of games, textures don't impact GPU performance unless you exceed your VRAM budget.

It's blatantly obvious that the only one being disingenuous here is you, unless you want to argue that an 11% hit (Deus Ex) is significantly different from an 8% hit (GoW4)

You're treading down the same path of false equivalency that Bacon1 is. Rather than accepting the obvious, you want to believe that Deus Ex MD is just like Gears 4, when it isn't..

Gears 4 is an exception. Having played it, it has specular and bump mapping all over the place, especially in a certain level at night.

Generally speaking all modern games use at least normal maps (instead of bump maps) and specular maps and many of them also utilize parallax occlusion mapping or similar (this includes Deus Ex, although it is a separate setting from texture resolution). So the usage of bump mapping (or normal mapping) and specular mapping is not really game dependent these days.

I can't think of one game off the top of my head where the texture quality setting affected performance in a significant manner on the GPU or CPU. Only when there was an obvious VRAM limitation did performance issues manifest..

This is what Nixxes said about the texture quality setting in Deus Ex MD:

Texture Quality
This alters the resolution of textures used in the game. This primarily impacts GPU memory pressure. If this is set too high the game may start stuttering and slowing down. It does not impact performance on the GPU or the CPU much otherwise. We recommend it at High for 4GB cards. You have to restart the game to apply changes to Texture Quality.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
Gears of War 4 is likely an exception because of prevalent specular and bump mapping used throughout the game. But I'm confident that most games aren't like this.

Specular and bump mapping are fairly old technologies, that modern day GPUs can handle with ease..
...That's my point: you're trying to shoehorn those ~500 GFLOPS worth of computational time into effects that have been used for at least 15 years in PC games. The original Serious Sam had both, IIRC, and it ran well on GPUs with less than 2 GFLOPS.

How about this:
rise-of-the-tomb-raider-texture-quality-performance-640px.png


There are a number of games that show very similar performance hits when they increase their texture atlas. Your argument just isn't sufficient as proof.
 

antihelten

Golden Member
Feb 2, 2012
1,764
274
126
I will admit that I wasn't fully correct, but I will stand by my statement that for the vast majority of games, textures don't impact GPU performance unless you exceed your VRAM budget.

You are free to stand by that statement, but that doesn't make it any less wrong.

You're treading down the same path of false equivalency that Bacon1 is. Rather than accepting the obvious, you want to believe that Deus Ex MD is just like Gears 4, when it isn't..

Gears 4 is an exception. Having played it, it has specular and bump mapping all over the place, especially in a certain level at night.

Stating that an 8% and an 11% hit are not significantly different is false equivalence now? By all means, do elaborate. And yes, Deus Ex is exactly like GoW4 when it comes to the use of bump mapping and specular mapping (and having it tied to the texture quality setting). The fact that you are unwilling to accept that normal mapping and specular mapping are used extensively by pretty much every modern game, and actually believe that GoW4 is unique in this respect, really doesn't do anything other than expose your own ignorance.

I can't think of one game off the top of my head where the texture quality setting affected performance in a significant manner on the GPU or CPU. Only when there was an obvious VRAM limitation did performance issues manifest..

This is what Nixxes said about the texture quality setting in Deus Ex MD:

Whether or not the texture quality setting affects performance, has less to do with the performance hit of the texture quality itself and more to do with exactly which textures are controlled by the setting. For instance if only color map resolution is controlled by the setting, then the performance hit will be negligible (5% or less), on the other hand if other textures such as normal maps and specular maps are controlled by the setting (as is the case in GoW4 and Deus Ex), then the performance hit will be larger (10% or so). Some games will have even more textures controlled by the setting, such as effect textures and shadow textures.

Performance hit from VRAM limitations then comes on top of this.

Also, you might want to stop misquoting the devs. They clearly state that if the setting is set too high the game may start stuttering and slowing down, but otherwise (i.e. if you don't set it too high) it does not impact performance on the GPU or the CPU much.
 

Bacon1

Diamond Member
Feb 14, 2016
3,430
1,018
91
How about this? Apparently Nixxes put up a PC optimization guide for Deus Ex MD on their forums. This is what they say about texture quality:



Source

So as you can see, the texture quality setting doesn't really affect GPU or CPU performance according to the developers of the game :D

So let me read the actual quote and point out some of the words for you.

Texture Quality
This alters the resolution of textures used in the game. This primarily impacts GPU memory pressure. If this is set too high the game may start stuttering and slowing down. It does not impact performance on the GPU or the CPU much otherwise. We recommend it at High for 4GB cards. You have to restart the game to apply changes to Texture Quality.

They never said it only affects memory.

I've already shown that even while under the 4GB memory cap, textures affect performance. I showed how Low + Low -> Low + Very High (~3.8GB used, vs 3.9GB for Low + Ultra) still had a performance hit.

https://forums.anandtech.com/thread...rametime-testing.2489005/page-2#post-38521483
 