Middle Earth: Shadow of Mordor - 6GB VRAM required (ultra textures)

IllogicalGlory

Senior member
 
*required for Ultra textures

i.e. not really required to play

also, it clearly states the amounts are recommendations, not requirements
 
6GB at only 1080p for ultra textures? lol, that means you'll need 6GB for just high textures at 1440p.
 
I plan on getting this game so I'll post back here once I try it out. We'll see how the 980 twins handle this "ultra" 6gb recommendation.
 
What exactly happens when it's 6GB (which is impossible) and the GPU reaches its VRAM limit?

I had Watch Dogs on max settings except AA turned down some tiers, but it ran at an average of 45 fps. What did the game do with the 1.5GB of VRAM?
 
What the heck? Are devs really this lazy now? Ugh... What kind of idiot releases a game like this? Excuse me while I smash my head through a wall.

I'd say that it doesn't matter to me because I have no interest in this game, but I'm concerned about the trend. I guess I'm gonna have to hold out a full extra year with my possibly dying 7950, meaning I'll have to play Arkham Knight and GTAV at medium settings. :/ 8GB midrange cards and 12-16GB high-end cards are going to have to come next year. Yay for not compressing things anymore!
 
What exactly happens when it's 6GB (which is impossible) and the GPU reaches its VRAM limit?

I had Watch Dogs on max settings except AA turned down some tiers, but it ran at an average of 45 fps. What did the game do with the 1.5GB of VRAM?

The game will pause momentarily and resume as the VRAM is filled with new data. It hitches and is a really awful experience. Watchdogs did exactly this on my system if I used settings that were too much for my 3GB cards.
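To make that eviction behavior concrete, here's a minimal Python sketch treating VRAM as an LRU cache of textures (all sizes, counts, and names are made up for illustration, not measurements from any real game): once the working set exceeds the budget, every frame forces evictions and fresh uploads, and each upload over PCIe is a potential hitch.

```python
from collections import OrderedDict

# Hypothetical model: VRAM as an LRU cache of textures.
VRAM_BUDGET_MB = 3072  # e.g. a 3GB card

class TextureCache:
    def __init__(self, budget_mb):
        self.budget = budget_mb
        self.resident = OrderedDict()  # texture id -> size in MB
        self.used = 0
        self.stalls = 0  # each miss means an upload, i.e. a potential hitch

    def request(self, tex_id, size_mb):
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)  # cache hit: no hitch
            return
        # Miss: evict least-recently-used textures until the new one fits.
        while self.used + size_mb > self.budget and self.resident:
            _, evicted = self.resident.popitem(last=False)
            self.used -= evicted
        self.resident[tex_id] = size_mb
        self.used += size_mb
        self.stalls += 1  # upload over PCIe pauses the frame

cache = TextureCache(VRAM_BUDGET_MB)
# A working set of "ultra" textures larger than the budget thrashes:
for frame in range(3):
    for tex in range(10):
        cache.request(tex, 400)  # 10 x 400MB = 4GB working set on a 3GB card
print(cache.stalls)  # every request misses once the set stops fitting
```

With a 4GB working set on a 3GB budget, every texture gets evicted before it's needed again, so all 30 requests stall; shrink the sizes so the set fits and only the first 10 uploads do.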
 
Man that is some SLOP programming if I've ever seen some. Just taking advantage of the PC users on their ports now?
 
What the heck? Are devs really this lazy now? Ugh... What kind of idiot releases a game like this? Excuse me while I smash my head through a wall.

Not laziness per se, but I'm sure developers aren't taking the PC into account when designing the game. That most people's discrete GPUs only have 2, maybe 3GB of memory is of no concern to them.
 
Man that is some SLOP programming if I've ever seen some. Just taking advantage of the PC users on their ports now?

How can you come to that conclusion based on a screenshot of just the settings screen? For all we know, Low could look amazing, with Medium being what we'd be used to, and High and Ultra being settings that let people truly leverage their high-end cards.

for all we know, Ultra really does take the textures to 11

Not laziness per se, but I'm sure developers aren't taking the PC into account when designing the game. That most people's discrete GPUs only have 2, maybe 3GB of memory is of no concern to them.

which is why they have low (1GB) medium (2GB) and high (3GB)...? seems they were concerned enough to create those tiers of textures and were also good enough to spell out what is recommended...

can't really judge it until we see it
 
6GB of VRAM is for Ultra textures at 1080p.
So if you're running a 2560x1440 or 4K monitor, you can't even buy a GPU "today" with enough VRAM to run Ultra textures.

So you'd basically be stuck running at a lower game resolution just to use higher resolution textures... or run at a higher game resolution with lower resolution textures.....
Hmmmm........yeah..ok........Good Plan! :thumbsup:
 
which is why they have low (1GB) medium (2GB) and high (3GB)...? seems they were concerned enough to create those tiers of textures and were also good enough to spell out what is recommended...

I doubt they are creating different textures. I imagine it'd be like the CPU cutting the resolution of the texture before sending it off to the GPU; as if you went into photoshop and resized the image to some percentage smaller. Then just tweak the percentage based upon the minimal testing that gets done.
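If the engine really did just scale textures down by a percentage, the memory math would look something like this rough sketch (uncompressed RGBA8 and a 4K source size are assumptions for illustration; the game's actual formats and sizes are unknown):

```python
# Illustrative footprint of one uncompressed RGBA8 texture at
# different scale factors (tier names and sizes are assumed).
def texture_mb(width, height, bytes_per_texel=4):
    return width * height * bytes_per_texel / 1024**2

base = 4096  # assume 4K source art
for label, scale in [("ultra", 1.0), ("high", 0.5), ("medium", 0.25)]:
    side = int(base * scale)
    print(label, side, round(texture_mb(side, side), 1), "MB")
```

Halving each side quarters the footprint (64 MB at 4096, 16 MB at 2048, 4 MB at 1024), which is why resolution tiers are such a cheap knob for an engine to expose.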
 
6GB of VRAM is for Ultra textures at 1080p.
So if you're running a 2560x1440 or 4K monitor, you can't even buy a GPU "today" with enough VRAM to run Ultra textures.

So you'd basically be stuck running at a lower game resolution just to use higher resolution textures... or run at a higher game resolution with lower resolution textures.....
Hmmmm........yeah..ok........Good Plan! :thumbsup:

so by your "logic", if they just flat out didn't include the Ultra textures, it would be a better game because then you wouldn't feel like you were missing out on anything because you didn't have to compromise with the settings, you could just ignorantly crank everything all the way up......Hmmmm........yeah..ok........Good Plan! :thumbsup:

I doubt they are creating different textures. I imagine it'd be like the CPU cutting the resolution of the texture before sending it off to the GPU; as if you went into photoshop and resized the image to some percentage smaller. Then just tweak the percentage based upon the minimal testing that gets done.

ah ok, I see how it is, as PC gamers we're just going to prejudge everything on frivolous speculation. It's ok, it's not uncommon that our pessimistic "hunches" turn out to be true, so we're justified in treating every situation the same 🙄



Who knows, maybe the game will turn out to be a sloppy port to the PC, but I'm going to reserve my judgement until I can see how things actually play out
 
A sloppy console port would ship with "medium" quality and that's it, no "high" or "ultra" options for PC.

The fact that it's available as an ultra texture download is a sign the devs care, since their original art assets are most likely 4K resolution.
 
Even if their textures are 4K, that doesn't really explain why 6GB is needed for 1080p. Either that or their mipmaps need some optimizing.
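For what it's worth, a full mip chain only adds about a third on top of the base level, so mipmaps by themselves can't explain a doubling. A quick back-of-the-envelope check (uncompressed RGBA8 assumed for simplicity):

```python
# A full mip chain adds ~33% over the base level, so mipmaps alone
# don't turn a 3GB texture set into a 6GB one.
def mip_chain_mb(width, height, bytes_per_texel=4):
    total = 0
    w, h = width, height
    while w >= 1 and h >= 1:
        total += w * h * bytes_per_texel
        if w == 1 and h == 1:
            break
        w, h = max(w // 2, 1), max(h // 2, 1)
    return total / 1024**2

base = 4096 * 4096 * 4 / 1024**2      # 64 MB base level
full = mip_chain_mb(4096, 4096)
print(round(full / base, 3))          # ~1.333, the geometric-series limit
```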
 
Well, since nobody seemed to utilize these huge amounts of VRAM previously, I tend to think a big part of it will be just sloppy porting from the consoles, but granted, we won't know for sure until the game comes out. But Watch Dogs used a lot of VRAM, and that was far from a highly optimized port to the PC.
 
HAHAHAHAHAHAHAHA, so you just bought a 980/970, looking at a new AAA game to play, and you can't "max" it purely because of VRAM. It isn't some uber setting. AHAHAHAHAHAHAHAHAHAHAHAHHAHAHHAHAHAHAHHA. So now you'll need an i7 with 16GB minimum to take care of the console's Atom-equivalent CPU, and an 8GB VRAM card to take care of the 7790/7850 GPU requirements. HAHAHAHAHAHAHHAHHA.

Let's give the devs more grunt! See what happens when the 512MB RAM restriction is removed!
 
so by your "logic", if they just flat out didn't include the Ultra textures, it would be a better game because then you wouldn't feel like you were missing out on anything because you didn't have to compromise with the settings, you could just ignorantly crank everything all the way up......Hmmmm........yeah..ok........Good Plan! :thumbsup:



ah ok, I see how it is, as PC gamers we're just going to prejudge everything on frivolous speculation. It's ok, it's not uncommon that our pessimistic "hunches" turn out to be true, so we're justified in treating every situation the same 🙄



Who knows, maybe the game will turn out to be a sloppy port to the PC, but I'm going to reserve my judgement until I can see how things actually play out

What??! Logic?? Calm rational thoughts? I thought I was at the Anandtech GPU forums but maybe I got lost
 
The game will pause momentarily and resume as the VRAM is filled with new data. It hitches and is a really awful experience. Watchdogs did exactly this on my system if I used settings that were too much for my 3GB cards.

Oh, that did happen (btw I've got a GTX 580).
But it happened even on the lowest settings: max fps, but it still hitched.
So I went max on everything so the game at least looks like something lol.
I did read that a lot of people were affected, no matter how much VRAM they had.
 
The game will pause momentarily and resume as the VRAM is filled with new data. It hitches and is a really awful experience. Watchdogs did exactly this on my system if I used settings that were too much for my 3GB cards.

Perhaps proper use of technology like megatextures

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/6

could put a plug in the 'memory leaks' we're currently seeing in GPUs' VRAM.
I can't help but get the feeling this is one of those areas where "hey, let's just throw more hardware at it" applies instead of "hey, let's do this smarter".

A little pause in hardware evolution may not be a bad thing at all; it might just force the devs to compete on a different set of parameters.
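The core idea behind megatextures / partially resident textures that the linked article covers can be sketched roughly like this (tile size, texture size, and sample coordinates are all illustrative, not any real engine's): only the tiles a frame actually samples need to be resident, so memory tracks the visible working set rather than the full texture.

```python
# Hypothetical sketch of partially resident textures (PRT):
# only the tiles a frame actually touches get committed to VRAM.
TILE = 128  # texels per tile side (illustrative)

def tiles_touched(samples, tile=TILE):
    """Map sampled (u, v) texel coordinates to the set of tiles needed."""
    return {(u // tile, v // tile) for u, v in samples}

# A 16K x 16K texture has 128 x 128 = 16384 tiles in total...
total_tiles = (16384 // TILE) ** 2
# ...but a frame sampling one small region only commits a handful:
samples = [(u, v) for u in range(1000, 1400, 50)
                  for v in range(2000, 2400, 50)]
resident = tiles_touched(samples)
print(len(resident), "of", total_tiles, "tiles resident")
```

Here a cluster of samples spanning four tile rows and four tile columns commits 16 tiles out of 16384, which is the whole point: residency scales with what's on screen, not with the source asset.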
 
Perhaps proper use of technology like megatextures

http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review/6

could put a plug in the 'memory leaks' we're currently seeing in GPUs' VRAM.
I can't help but get the feeling this is one of those areas where "hey, let's just throw more hardware at it" applies instead of "hey, let's do this smarter".

A little pause in hardware evolution may not be a bad thing at all; it might just force the devs to compete on a different set of parameters.

Hahaha.. I bet this is the exact same thing console gamers told us in the PS3/360 era. Pathetic. Let's ban kids with shiny toys from our sandbox! LOL

You can always use more VRAM. Make all textures 8K. Done? Don't reuse one texture on half of the objects in the scene. Done? Make 'em 16K textures!
 
Correct me if I am mistaken, but didn't Doom 3 block out Ultra textures unless your GPU had 512MB? And when it came out, I don't think any GPUs had 512MB for another generation or even two (the 9800XT had 256MB; even the X800XT/6800 GT/Ultra era didn't have 512MB yet). If the textures look amazing, I give props to a developer pushing the envelope. Frankly, since the 7970 had 3GB and the 680 had 4GB versions from early 2012, if anything flagship cards at $550+ should come with 6-8GB. Now if it's a case of horrible optimization with wannabe "Ultra" textures like Watch Dogs, then it's a problem. However, in general, after using a 3GB card for nearly 3 years, I am not even remotely impressed by the 980's 4GB at $549.

I think AMD and NV need to anticipate that next gen console ports will use more memory and produce 6-8GB products. GM200 should have at least 6GB. I also don't see why NV can't ship an 8GB 980 for $600 when their mobile cards ship with 6-8GB!! It also sucks that a 1 year old 780Ti got gimped this badly that it's nearly "outdated" just 1 year later due to VRAM gimping vs. Titan Black.

The 680 2GB, a flagship just 2 years ago, is on death row. I guess the developers did warn us last year: if buying a new PC GPU, while the GPU power would be more than adequate for PS4/XB1 ports, buy one with as much VRAM as you can afford. Looks like with The Evil Within, Watch Dogs, Wolfenstein, Titanfall and Mordor, 3-4GB is now the midrange amount, with 6-8GB for high end.
 
I think AMD and NV need to anticipate that next gen console ports will use more memory and produce 6-8GB products. GM200 should have at least 6GB. I also don't see why NV can't ship an 8GB 980 for $600 when their mobile cards ship with 6-8GB!! It also sucks that a 1 year old 780Ti got gimped this badly that it's nearly "outdated" just 1 year later due to VRAM gimping vs. Titan Black.

I doubt it can be done for a mere $50 more; it would most likely be $75-100. The PS4 BOM was $11 per 1GB of 5.5GHz GDDR5, and that has increased quite significantly since.
 