
Middle-earth: Shadow of Mordor - 6GB VRAM required (ultra textures)

Even if it says it requires 6GB I bet a 3 or 4GB will run this game faster at 4K than trying to run Crysis 3 😀

You can monitor how much VRAM is being used and just lower texture quality until you aren't hitting 99% VRAM usage. It's just that simple: there's pretty much just one variable directly responsible.
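The "watch usage, step down until you're under the cap" loop above can be sketched in a few lines. Everything here is hypothetical: the quality levels, sample numbers, and 99% threshold are stand-ins, and real usage figures would come from a monitoring tool such as GPU-Z or an NVML binding.

```python
# Hypothetical sketch of the "lower textures until you're under 99% VRAM" loop.
# Peak-usage samples per setting are made up; in practice they'd come from a
# monitoring tool while playing.

QUALITY_LEVELS = ["ultra", "high", "medium", "low"]  # highest to lowest

def pick_texture_quality(samples_by_level, vram_total_mib, threshold=0.99):
    """Return the highest quality level whose peak VRAM usage stays below threshold."""
    for level in QUALITY_LEVELS:
        peak = max(samples_by_level[level])
        if peak / vram_total_mib < threshold:
            return level
    return QUALITY_LEVELS[-1]  # nothing fits; settle for the lowest

# Example: a 4 GiB (4096 MiB) card, with made-up peak-usage samples per setting.
samples = {
    "ultra":  [4090, 4096, 4096],   # pinned at the cap -> reload stutter
    "high":   [3900, 4060, 4096],   # still touches the cap
    "medium": [2500, 2700, 2650],
    "low":    [1800, 1900, 1850],
}
print(pick_texture_quality(samples, 4096))  # -> medium
```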

By the way, people sometimes tell prospective GPU buyers things like "this GPU isn't fast enough to benefit from more VRAM," when really there is no basis for that. The amount of VRAM an application uses is basically arbitrary, so you have to know the specific games or applications to say how much VRAM is necessary.
 
Optional SSAA and 4K textures. These are definitely PC-centric features; kudos to the developers for bothering to put these in their games.

If the game doesn't actually look good enough to justify only 0.0001% of the market being able to max it out, it negates that idea.
 
no! how dare they create options that won't allow us to ignorantly crank all the settings up! :colbert:
That's not the point; the point is that almost no one can crank the settings up. People with a 6GB card are most likely playing beyond 1080p, so they can't afford the performance cost of these high-resolution textures anyway. Given that, what's the point of even having such textures?

Also, even recent games don't use 6GB of video RAM at triple 4K.
 
That's not the point; the point is that almost no one can crank the settings up. People with a 6GB card are most likely playing beyond 1080p, so they can't afford the performance cost of these high-resolution textures anyway. Given that, what's the point of even having such textures?

Also, even recent games don't use 6GB of video RAM at triple 4K.

Believe it or not, tons of people who visit the forums consider any game that can't be maxed out to be unplayable or poorly optimized, whether or not there are settings they can play at which look good.

As far as creating textures for a rare few goes; does it actually take much time to add those textures?
 
A good example of how you can waste VRAM.

8532-1.jpg

AndroidTextureComp.png
 
Perhaps proper use of technology like metatextures

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/6

could plug the "memory leaks" we're currently seeing in GPUs' VRAM.
I can't help but get the feeling it is one of those areas where "hey, let's just throw more hardware at it" applies instead of "hey, let's do this smarter."

A little pause in hardware evolution may not be a bad thing at all; it might just force the devs to compete on a different set of parameters.

You probably mean to write "megatextures" ... and they are not a requirement for optimized VRAM usage. Texture streaming is good enough, but implementing a good streaming system is "fakin' hard" with D3D11. We can stream the same textures on consoles with nearly four times lower VRAM usage. So this is purely an optimization issue.
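As a rough illustration of what a streaming system buys you (this is a sketch of the general idea, not any engine's actual implementation), here is a minimal budget-bound residency cache: textures are made resident on demand, and the least-recently-used ones are evicted when the VRAM budget would be exceeded. The names, sizes, and budget are made up.

```python
from collections import OrderedDict

# Minimal sketch of budget-bound texture streaming: only what the scene needs
# stays resident, and least-recently-used textures are evicted to fit a budget.

class TextureStreamer:
    def __init__(self, budget_mib):
        self.budget = budget_mib
        self.resident = OrderedDict()  # name -> size in MiB, ordered by recency

    def request(self, name, size_mib):
        if name in self.resident:
            self.resident.move_to_end(name)      # mark as recently used
            return
        while self.resident and sum(self.resident.values()) + size_mib > self.budget:
            self.resident.popitem(last=False)    # evict least recently used
        self.resident[name] = size_mib

streamer = TextureStreamer(budget_mib=100)
streamer.request("rock_albedo", 40)
streamer.request("wall_albedo", 40)
streamer.request("ground_albedo", 40)            # over budget -> evicts rock_albedo
print(list(streamer.resident))  # -> ['wall_albedo', 'ground_albedo']
```

A real engine would stream individual mip levels rather than whole textures, but the budget-and-eviction core is the same.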
 
That's not the point; the point is that almost no one can crank the settings up. People with a 6GB card are most likely playing beyond 1080p, so they can't afford the performance cost of these high-resolution textures anyway. Given that, what's the point of even having such textures?

Also, even recent games don't use 6GB of video RAM at triple 4K.

no, that is the point

1. We don't yet really know how much VRAM is used at 1080p for each of these settings; for now we only know what is recommended (not required)

2. we don't know how good/bad the various settings are

3. Ultra textures is an extra, optional install...

4. why would it matter if they went a step further and created a SuperDuperUltraMega-High setting that required 12GB of VRAM in preparation for 8K, despite the fact that there are no consumer products available for such options?
 
I think the reason a lot of people are viewing this with reservations is that so far we have seen a large increase in system requirements for games like Watch Dogs, Wolfenstein, and Thief, but graphics and gameplay that have not been a huge improvement over what was available before in more smoothly running games that used fewer resources. This just smacks of poor optimization to me, and I think people are rightly skeptical that the requirement for so much VRAM will lead to commensurate increases in image quality.
 
no, that is the point

1. We don't yet really know how much VRAM is used at 1080p for each of these settings; for now we only know what is recommended (not required)

2. we don't know how good/bad the various settings are

3. Ultra textures is an extra, optional install...

4. why would it matter if they went a step further and created a SuperDuperUltraMega-High setting that required 12GB of VRAM in preparation for 8K, despite the fact that there are no consumer products available for such options?
It isn't the point; everyone adjusts settings if they don't get the performance they want. If you want better performance, you decrease the graphics settings; acting like people don't know that is, well, insulting.

1. Recommended means that you'll potentially run out of VRAM if you don't have that much. What else could it mean?

2. The screenshot showed off some of the medium textures, which didn't look like anything amazing.

3. So? Just because it's optional doesn't mean that it should be (more or less) unusable.

The point is that there should be no reason to give us textures that are only usable by the smallest percentage of gamers; if they want to include these textures, they should at least put a modicum of effort into making them playable.

Furthermore, take Crysis 3; it has extremely high-resolution textures along with all kinds of advanced graphics effects. How much video RAM do you need at 1080p? 2GB. Take Battlefield 4; its highest texture setting uses 6GB at triple 4K. If these games can manage great-looking textures with 2GB of video RAM, no game should require 6GB.
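For context on where 6GB at triple 4K could go, a back-of-the-envelope calculation shows the render targets themselves are only a small slice of it. The target count and formats below are assumptions for illustration, not Battlefield 4's actual pipeline:

```python
# Back-of-the-envelope check: at "triple 4K" (3 x 3840x2160), how much VRAM do
# the framebuffers themselves take? Target count and formats are assumptions.

MIB = 1024 * 1024
width, height = 3 * 3840, 2160
pixels = width * height

color   = pixels * 4        # RGBA8 back buffer, 4 bytes/pixel
depth   = pixels * 4        # D24S8 depth/stencil, 4 bytes/pixel
gbuffer = pixels * 4 * 4    # assume four RGBA8 G-buffer targets

total_mib = (color + depth + gbuffer) / MIB
print(round(total_mib))  # -> 570 (MiB); the remaining gigabytes are textures, meshes, etc.
```

So even with a generous deferred-rendering setup, the screen buffers account for well under a gigabyte; the rest of any 6GB figure is asset data, which is why texture settings dominate VRAM usage.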
 
People are reporting no need for 6GB of VRAM for Mordor's Ultra textures; they're saying usage is around 3GB.
 
Debating picking this up. A bunch of games I've bought lately were garbage so going to wait to hear some user reviews of this one. Could be shovelware.
 
A good example of how you can waste VRAM.

8532-1.jpg

AndroidTextureComp.png
Those are the worst comparison pics I've seen in a while.
The image should always be the same, and information about the raw memory requirement would be nice. (The PNG is identical in terms of quality, but it must be decompressed for display.)
Difference/error information would also be good to have.

Agreed that textures should always be compressed unless the use case dictates otherwise.
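To put numbers on that: here is the size of a single 4096x4096 texture, uncompressed versus block-compressed, with a full mip chain (which adds roughly a third). The bytes-per-pixel figures are the standard rates for RGBA8, BC7, and BC1:

```python
# Why compressed textures matter: size of a single 4096x4096 texture,
# uncompressed vs. block-compressed, including a full mip chain (~4/3 factor).

MIB = 1024 * 1024
side = 4096
base_pixels = side * side

uncompressed = base_pixels * 4      # RGBA8: 4 bytes per pixel
bc7          = base_pixels * 1      # BC7:   1 byte per pixel (4:1 vs RGBA8)
bc1          = base_pixels * 0.5    # BC1/DXT1: 0.5 bytes per pixel (8:1 vs RGBA8)

for name, size in [("RGBA8", uncompressed), ("BC7", bc7), ("BC1", bc1)]:
    with_mips = size * 4 / 3        # mip chain: 1 + 1/4 + 1/16 + ... = 4/3
    print(f"{name}: {with_mips / MIB:.1f} MiB with mips")
```

That is roughly 85 MiB uncompressed versus 11-21 MiB compressed per 4K texture, and the GPU can sample BCn formats directly, so there is no decompression cost at draw time.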
 
http://www.pcgameshardware.de/Mitte...pecials/Hands-on-Test-Ultra-Texturen-1137689/

@google translate:

Reason enough for us to examine what the requirements actually amount to. One note in advance: the "Ultra-HD Texture Pack," which retail users have to download and install separately, was apparently already included in the test version provided to us by Warner. At the time these lines were written, we did not know where the texture pack can be downloaded; we have already sent an inquiry about this and will of course keep you up to date.

After naively setting "Ultra" texture detail in conjunction with our Radeon R9 290X/4G, we were quickly taught a lesson: the VRAM regularly filled up, leading to reload stutter. So the graphics card was unceremoniously swapped for a GTX Titan "Classic" with 6 GiB, whose VRAM usage GPU-Z put at a good 5.3 GiByte. Switching from "Ultra" to "Ultra HD" pushes that to a good 5.8. The other settings come in lower: "High" lands at 3.5 GiByte and "Medium" at 2.5.

Anyone who was looking forward to consistently crisp, high-resolution textures will unfortunately be disappointed. Quality varies quite a lot even on "Ultra": mountains, for example, look rather muddy, while walls are quite respectable. Since the rocks frequently take up a lot of screen space, this stands out (negatively) while playing, and the ground textures, which are not exactly high-resolution either, don't help in these circumstances. Characters, however, look consistently good.
 
Slightly off topic, but I thought it might be interesting in the context of what we might see when it comes to VRAM usage in future games:

http://www.theastronauts.com/2014/03/visual-revolution-vanishing-ethan-carter/

Note that the game in question (The Vanishing of Ethan Carter) doesn't go all-in with this technique, so the VRAM usage isn't too bad here, but the approach does have the potential for very heavy VRAM usage (luckily, that should come with a proportional increase in texture quality).
 