The main reasons for inflated VRAM requirements


Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I guess PC elitists want to stick with their 2GB cards.
It's not like we need different textures for different objects. Concrete looks almost the same as steel when in motion, so that shaves off a few hundred MB. Let's put the ground texture on character clothes and call it camo - more VRAM savings!

The Middle-earth game uses 3GB of VRAM maxed out. Ultra textures add another 3GB to that. That is 3GB of textures that go into VRAM. Wanna stream textures from system RAM? Why bother with a dGPU? Get a Kaveri IGP, since it's already overwhelming 2400MHz DDR3 system memory anyway.

You can't simply transfer hUMA memory management in a game straight to the PC without any changes, since (most) PCs do not support such a thing.

Everything you say here is wrong and a total misrepresentation of what I am saying. Try and follow the discussion.

Shadow of Mordor does NOT require 6GB of VRAM for ultra textures. There are plenty of people playing the game right now on 2GB, 3GB and 4GB cards using ultra textures.

One example. He's using a GTX 970 and playing at 1440p with ultra textures..
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I think I remember reading in the OCN thread that this game is also using 8GB of RAM.
If that is right, then I don't think the "less RAM & more VRAM" logic applies here.

EDIT:
I asked in the thread,
Here is the reply



So..
Logic flawed.

Edit 2: The guy made a mistake.
It's about 5GB.
The point still stands.

That's a load of bull. There is no way this game is using 5GB of main memory. Whoever you asked does not know how to properly read the memory utilization graph.

He's probably looking at the entire memory usage, rather than just the game..
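
If anyone wants to check it the same way, here's a minimal sketch of the per-process vs. whole-system distinction (Python with the psutil package; the executable name below is a guess, so substitute whatever Task Manager actually shows):

Code:
import psutil

GAME_EXE = "ShadowOfMordor.exe"  # hypothetical process name -- check Task Manager for the real one

vm = psutil.virtual_memory()
print(f"System RAM: {vm.used / 2**30:.1f} GiB used of {vm.total / 2**30:.1f} GiB total")

# Find the game's own process and report its resident set (roughly the Task Manager number).
for proc in psutil.process_iter(["name", "memory_info"]):
    if proc.info["name"] == GAME_EXE:
        rss = proc.info["memory_info"].rss
        print(f"{GAME_EXE}: {rss / 2**30:.2f} GiB resident")
        break
else:
    print(f"{GAME_EXE} not running")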
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
System RAM has higher latency to the GPU than VRAM, obviously. I don't see any reason why they need to be unified, and I don't see anything wrong with massive VRAM usage. In fact, I prefer the higher performance that results from increased caching in larger, low-latency memory pools such as VRAM.

I agree that there's nothing wrong with using VRAM as storage. However, using VRAM as the main storage for textures can cause performance issues in certain types of large open-world games that have high texture variance and fast movement, like Watch Dogs.

Because the VRAM isn't large enough to store all the textures, texture swapping to SSD/HDD is inevitable and that can lead to performance issues if they haven't properly coded the game to use main memory as a buffer to the VRAM.
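
As a rough illustration of that "system RAM as a buffer to the VRAM" idea, here's a toy two-tier LRU cache sketch (Python, purely illustrative -- the budgets and texture sizes are made up and no real engine API is used):

Code:
from collections import OrderedDict

class TwoTierTextureCache:
    """Toy model: a VRAM budget backed by a system-RAM staging pool, backed by disk."""

    def __init__(self, vram_budget, sysram_budget):
        self.vram = OrderedDict()      # texture id -> size, kept in LRU order
        self.sysram = OrderedDict()
        self.vram_budget, self.sysram_budget = vram_budget, sysram_budget
        self.vram_used = self.sysram_used = 0

    def request(self, tex_id, size):
        """Make a texture resident in VRAM, preferring the RAM staging pool over disk."""
        if tex_id in self.vram:
            self.vram.move_to_end(tex_id)
            return "VRAM hit"
        if tex_id in self.sysram:
            source = "system RAM staging pool (fast)"
            self.sysram_used -= self.sysram.pop(tex_id)
        else:
            source = "disk (slow -- this is where stutter comes from)"
        # Evict least-recently-used textures from VRAM into the RAM staging pool, not to disk.
        while self.vram_used + size > self.vram_budget and self.vram:
            old_id, old_size = self.vram.popitem(last=False)
            self.vram_used -= old_size
            self._stage(old_id, old_size)
        self.vram[tex_id] = size
        self.vram_used += size
        return "loaded from " + source

    def _stage(self, tex_id, size):
        # The staging pool is itself LRU-limited; anything pushed out must come from disk next time.
        while self.sysram_used + size > self.sysram_budget and self.sysram:
            _, dropped = self.sysram.popitem(last=False)
            self.sysram_used -= dropped
        self.sysram[tex_id] = size
        self.sysram_used += size

cache = TwoTierTextureCache(vram_budget=2 * 2**30, sysram_budget=4 * 2**30)
print(cache.request("rock_diffuse", 64 * 2**20))   # first touch: from disk
print(cache.request("rock_diffuse", 64 * 2**20))   # now a VRAM hit

With a small VRAM budget, repeat requests that would otherwise go back to the SSD/HDD get served from the staging pool instead, which is all the buffering amounts to.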

And like the guy I quoted in my OP stated, GPUs with very large frame buffers like the Titan are a waste of money, because they don't have the processing power to even utilize such a large frame buffer to begin with.

That 6GB frame buffer is mostly being used to store textures, and not for render targets..
 

rtsurfer

Senior member
Oct 14, 2013
733
15
76
That's a load of bull. There is no way this game is using 5GB of main memory. Whoever you asked does not know how to properly read the memory utilization graph.

He's probably looking at the entire memory usage, rather than just the game..

Yeah, actually 5GB seems a bit too outlandish.
I asked another guy, who said 3GB.

Here are the two people I asked.

They posted one after another
http://www.overclock.net/t/1515461/...-of-vram-for-ultra-textures/690#post_22933192

While there might be some inconsistencies here, I think 3GB sounds about right.
I would do my own tests but my system is down & will be down for quite some time.

I actually just PMed a guy with a Titan Black & asked him more specifically about just the game's RAM usage in Task Manager. This should be the ideal test case, as he has 6GB of VRAM, so we'll see how much the game consumes after being allowed to go all out on VRAM usage.

Get back to you when he answers.
 

jpiniero

Lifer
Oct 1, 2010
14,510
5,159
136
Everything you say here is wrong and a total misrepresentation of what I am saying. Try and follow the discussion.

Shadow of Mordor does NOT require 6GB of VRAM for ultra textures. There are plenty of people playing the game right now on 2GB, 3GB and 4GB cards using ultra textures.

One example. He's using a GTX 970 and playing at 1440p with ultra textures..

That guy only played an hour, so you don't know if there are more VRAM-heavy parts later in the game. But 3500MB of memory usage suggests that 2 or 3 GB cards are not going to be a good idea with Ultra. It also suggests that any testing done with Ultra textures was done with a 6 GB card.
 

SlickR12345

Senior member
Jan 9, 2010
542
44
91
www.clubvalenciacf.com
What does memory usage on the Xbox One and PS4 have to do with VRAM requirements on the PC? The PS4 and Xbox One don't even have access to ultra level textures, among many other things..

At any rate, ultra textures in Shadow of Mordor do not NEED 6GB of VRAM. That's the point.. Most of the VRAM requirement stems from caching, and not from rendering.. Even high-level textures will use up your VRAM for caching, because that's how the game engine has been designed to work.

This isn't really problematic, but it's very misleading. With Watch Dogs on the other hand, it IS problematic because the texture variance is much greater and so texture swapping is much more frequent, which leads to stuttering.



Nobody insinuated that pulling textures from main memory to the GPU for rendering was advisable. Main memory should never be used for rendering by the GPU as it's not fast enough..

System memory should be used as a buffer for VRAM. But when developers like Ubisoft don't take advantage of system memory for cache and instead hammer the VRAM, it can cause lots of performance issues in certain types of games with large texture variation..

A large open world game with the kind of texture variation that Watch Dogs has should be using 5GB of system memory as a buffer, and not 2GB if it was properly optimized for PC. Why on Earth did Ubisoft even make the game 64 bit if they're not going to take advantage of the increased memory address space?

Because they are frauds. Remember the E3 demo? That game looked as close to real life as possible; even today that demo looks better than most games by far.

What did we get as the end result? A crappy console port that has none of the high-quality textures shown at E3, but still uses an insane amount of resources as if it were using the E3 textures.

Watch Dogs has been one of the biggest gaming frauds in history. The game looked 5x better on that E3 demo.

So the developers are lazy and don't care. They just have the PR machine insisting that they care, but we can see from their games that they don't. Compared to what we were shown and promised, we get fraudulent games that are unoptimized resource hogs for nothing.

So yeah, I'd expect 6GB of VRAM to at least bring me lifelike graphics, but no, it's still the same quality we had 5 years ago, yet it uses 5x the VRAM it used to.

And of course, when no one buys these fraudulent, terrible, unoptimized console ports on the PC, you see the developers and publishers instantly whining and lying about how there aren't enough gamers on PC, how there is too much piracy, on and on with the lies, when all it comes down to is that PC gamers don't want to play fraudulent, unoptimized console ports.


I can't count how many times I've played one of these absurd console ports and the game requires me to press "O" to confirm, press "X" to cancel, press "triangle" to go back, etc... And I'm like, really? This is the gaming experience you are bringing me? Console controller prompts in my mouse-and-keyboard PC game?

Look at EA's latest Need for Speed. It requires a console controller, every system and every UI is designed for console controllers, and the frame rate is capped at 30fps; if you mod it to remove the cap, the game literally plays at 2x speed or more. You don't get more frames, you get a game running in fast mode. And it still requires an absurd amount of resources for its 5-year-old crappy graphics.

So I get freaking PS3 and Xbox 360 graphics, but it still requires me to have absurd amounts of power in my PC to be able to play it properly.
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
Why on Earth did Ubisoft even make the game 64 bit if they're not going to take advantage of the increased memory address space?
A 32-bit executable hard crashes when the address-space limit is reached, which is not exactly nice for development or for players.
Having a 64-bit executable just makes everything a lot easier.
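
For anyone who wants to see that ceiling in practice, here's an illustrative Python sketch (the cap is kept small so it's safe to run anywhere):

Code:
import struct

print(f"{struct.calcsize('P') * 8}-bit Python process")

CHUNK = 64 * 2**20   # allocate in 64 MiB pieces
CAP = 1 * 2**30      # stop at 1 GiB so this is safe to run; push it past ~2 GiB and a
                     # 32-bit interpreter will hit MemoryError where a 64-bit one won't
chunks, allocated = [], 0
try:
    while allocated < CAP:
        chunks.append(bytearray(CHUNK))
        allocated += CHUNK
    print(f"Allocated {allocated / 2**30:.1f} GiB without hitting the address-space ceiling")
except MemoryError:
    print(f"MemoryError after {allocated / 2**30:.1f} GiB -- the 32-bit ceiling in practice")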
 

Dufus

Senior member
Sep 20, 2010
675
119
101
One example. He's using a GTX 970 and playing at 1440p with ultra textures..
It would have been nice if it had also shown shared memory usage. AB is only showing dedicated bytes.

For instance, while this screenshot was used to show that a 32-bit program can use a combination of VRAM and DRAM exceeding 4GB, it also shows a graphics card with 2GB of VRAM and software loading somewhere near 2.8GB of textures.
2m2b220.png


As can be seen, the VRAM requests beyond the 2GB (or what was left of the 2GB of VRAM) spill over into system "shared memory", and this is done auto-magically by the OS and drivers.

Would you also please show your calculation of 256MB of render frame buffering for 1920x1080 from the first post? Thanks.
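
For reference, the usual back-of-envelope math for render targets at 1920x1080 looks like the sketch below (Python; the buffer list is an assumption for illustration, not the calculation from the first post):

Code:
W, H = 1920, 1080
MiB = 2**20

def target(bytes_per_pixel, samples=1):
    # One full-resolution render target of the given format.
    return W * H * bytes_per_pixel * samples / MiB

buffers = {
    "back buffer, RGBA8":            target(4),
    "depth/stencil, D24S8":          target(4),
    "HDR scene target, RGBA16F":     target(8),
    "4x G-buffer (deferred), RGBA8": 4 * target(4),
    "4x MSAA color, RGBA8":          target(4, samples=4),
}
for name, mib in buffers.items():
    print(f"{name:32s} {mib:6.1f} MiB")
print(f"{'total':32s} {sum(buffers.values()):6.1f} MiB")

Even with a fairly generous buffer list, the total lands well under 256MB, which is why render targets alone can't account for multi-gigabyte VRAM footprints.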
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
So Watch Dogs is a VRAM hog, and as a side note, stuttering got worse in the last update (9/27/14, I think).

The theory of how they rely on VRAM is sound but people with 6GB cards still have stutter.

One person claimed that moving from his highly overclocked 4770k to a 5930K solved his stuttering issues.

Others have experimented with putting the whole game and its shader cache on RAMdisk with varying results. One person says that resolved his stutter and another says it still stutters.

The person that said the stutter is gone reports low frame rates with under-utilization of both the CPU and GPU. This person is using a 780GTX SLi setup.

I don't know what to believe and what to think.
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
A post that I found on OC.net forums sums it up quite nicely:

Source

Developers are targeting higher VRAM specs not because the texture resolutions are so much higher and more detailed, but because they are using VRAM as a storage cache in an effort to circumvent doing proper optimization for memory management on the PC, a NUMA platform..

Watch Dogs is the perfect example. Before the game even launched, I was worried because I had heard that the Disrupt engine would be an amalgamation of the AnvilNext (from the AC series) and Dunia engine (from the Far Cry series).. Both engines in my experience relied HEAVILY on using VRAM as a storage cache, and used very little system memory. AC IV for instance uses less than 700mb of RAM on the highest settings, despite being a very large game.

And sure enough, the low system memory utilization and overt reliance on VRAM for storage made their way from those engines into the Disrupt engine, causing the pervasive stuttering problem we see in Watch Dogs today (the game itself uses around 2GB of RAM on ultra settings as reported by resource monitor, a small amount for a big 64 bit game) when VRAM swapping takes place and the textures have to be pulled from HDD/SSD rather than from system memory which is much faster.

So basically developers are either just lazy, or want to save on development time and costs by relying on the PC's brute force rather than smarter and more effective optimization techniques. I'd say it was more the latter.

Though that's not to say that there is anything wrong with using VRAM as a cache. However, relying on it so extensively can be misleading, as seen with Shadow of Mordor, where the VRAM requirements are highly inflated; or worse, it can be very detrimental to performance, as seen in Watch Dogs..

Thanks for the read. I figured it was poor optimization. But are they really that short-sighted? Why even bother releasing it??? They're not making any money when they say 4GB of RAM is needed, 'cause no one's gonna buy it. Doesn't make ANY business sense.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Everything you say here is wrong and a total misrepresentation of what I am saying. Try and follow the discussion.

Shadow of Mordor does NOT require 6GB of VRAM for ultra textures. There are plenty of people playing the game right now on 2GB, 3GB and 4GB cards using ultra textures.

One example. He's using a GTX 970 and playing at 1440p with ultra textures..
Shame, too. I was really hoping we'd start seeing HQ textures in stock games with this one.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Developers are targeting higher VRAM specs not because the texture resolutions are so much higher and more detailed, but because they are using VRAM as a storage cache
...and they have been doing that regularly since DX9 became mainstream.
in an effort to circumvent doing proper optimization for memory management on the PC, a NUMA platform..
No. That is proper optimization. System RAM is slow, SSDs are slower, and HDDs even slower. Keeping tons in VRAM is good. That in itself is not poor memory management.

And sure enough, the low system memory utilization and overt reliance on VRAM for storage made their way from those engines into the Disrupt engine, causing the pervasive stuttering problem we see in Watch Dogs today (the game itself uses around 2GB of RAM on ultra settings as reported by resource monitor, a small amount for a big 64 bit game) when VRAM swapping takes place and the textures have to be pulled from HDD/SSD rather than from system memory which is much faster.
You PC players. Yeah, you. You don't matter. Kthxbai. That's the real problem.

So basically developers are either just lazy, or want to save on development time and costs by relying on the PC's brute force rather than smarter and more effective optimization techniques. I'd say it was more the latter.
Publishers consider human labor a bad thing to pay for, so devs tend to be mediocre, get burned out, etc..
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
I believe the problem is that they're not using dynamic texture streaming anymore, like in many Unreal Engine and id Tech games where you can sometimes see textures slowly fade in. Only the lowest-detail textures were kept permanently in VRAM, and the higher detail levels were cached. The new consoles have lots of RAM and it's a unified pool, so they must have figured they can drop that layer of complication. Just dump it all in VRAM. The problem doesn't seem to be a lack of VRAM so much as a lack of management of VRAM.

I dunno if I'd call it lazy devs per se...saving yourself the trouble of complex workarounds and optimization required by weak hardware is a perfectly legitimate use of brute force. I think we're better off in the long run if they don't need to treat high end PCs like they're last gen consoles. Just suck it up and buy a card with more VRAM.
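
As a rough picture of what that streaming layer amounts to, here's a toy sketch (Python; the mip-selection heuristic and budget are made up and are not any engine's actual scheme):

Code:
import math

def mip_chain_sizes(base_width, base_height, bytes_per_pixel=4):
    """Approximate per-mip sizes in bytes, from mip 0 (full res) down to 1x1."""
    sizes, w, h = [], base_width, base_height
    while True:
        sizes.append(w * h * bytes_per_pixel)
        if w == 1 and h == 1:
            break
        w, h = max(1, w // 2), max(1, h // 2)
    return sizes

def choose_desired_mip(distance, mip_count, lod_bias=0.0):
    # Farther objects get by with a coarser mip; clamp to the chain length.
    desired = int(max(0.0, math.log2(max(distance, 1.0)) + lod_bias))
    return min(desired, mip_count - 1)

def stream_plan(textures, vram_budget):
    """textures: list of (name, width, height, distance). Returns the mip kept resident per texture."""
    plan, used = {}, 0
    # Closest textures first: they need the sharpest mips.
    for name, w, h, dist in sorted(textures, key=lambda t: t[3]):
        chain = mip_chain_sizes(w, h)
        mip = choose_desired_mip(dist, len(chain))
        # Fall back to coarser mips until this texture fits in what's left of the budget.
        while mip < len(chain) - 1 and used + chain[mip] > vram_budget:
            mip += 1
        used += chain[mip]
        plan[name] = mip
    return plan

print(stream_plan([("wall", 4096, 4096, 1.0), ("distant_cliff", 4096, 4096, 60.0)],
                  vram_budget=96 * 2**20))

Real streamers are far more involved (GPU feedback, compressed formats, async I/O), but the budget-versus-mip trade-off is the core of it.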
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
I believe the problem is that they're not using dynamic texture streaming anymore, like in many Unreal Engine and id Tech games where you can sometimes see textures slowly fade in. Only the lowest-detail textures were kept permanently in VRAM, and the higher detail levels were cached. The new consoles have lots of RAM and it's a unified pool, so they must have figured they can drop that layer of complication. Just dump it all in VRAM. The problem doesn't seem to be a lack of VRAM so much as a lack of management of VRAM.

I dunno if I'd call it lazy devs per se...saving yourself the trouble of complex workarounds and optimization required by weak hardware is a perfectly legitimate use of brute force. I think we're better off in the long run if they don't need to treat high end PCs like they're last gen consoles. Just suck it up and buy a card with more VRAM.

Yeah, I kind of understand and agree. PC gamers spend an order of magnitude or two more than console gamers, and we can change out parts easily. So I guess, yeah, suck it up and buy parts with more VRAM. Right now I can't decide if a 4GB GTX 980 is an upgrade over a 3GB 780 Ti. So this puts me in a tight spot. If the GTX 980 were significantly faster, it would be an easy decision.
 

Ajay

Lifer
Jan 8, 2001
15,332
7,792
136
You could switch that to "development time is too short"..
IMHO, it's incredible that projects this big are ready by their release dates.

It's almost a miracle - it's not, though, because it turns users into the QA department. I've been on teams where we've had to fight feature creep, press on after losing a key member or two to another company, and even had a project pulled in, after a year of work, from an 18-month to a 15-month schedule. It's really crazy and usually just means we'd be working really long hours to make up the difference.

By definition, most developers are closer to average - not mediocre. But, as Cerb just pointed out, even average developers are expensive in the mind of management. On just about every project I've worked on, I wished we had anywhere from 2-5 more developers; but I've worked for companies that tried to hire good developers - and since our reputations were on the line, we would just work 60 or 80 hours (or 100 for the crazy single guys working their way up) a week to make sure we did a good job.

TL;DR - blame company management; you have no idea what's going on inside the development team.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I didn't mean mediocre in terms of ability, necessarily. The dev company and/or publisher favors more work over better work, and you are replaceable, because you are an unknown in a very popular field. If you don't have much experience, you are replaceable, in debt, and have few job prospects that don't involve adding the debt of moving hundreds or thousands of miles to your existing schooling debt. In many cases, the developers may treat their people like fellow humans, but get the screws put to them by publishers. But since there are so many people wanting to get in, the big publishers can, within legal limits (EA got sued, not long ago, after all...), treat programmers like little worker bees. Plus, people keep buying their products, and thus paying for the big publishers to keep doing what they've been doing...
 

brandon888

Senior member
Jun 28, 2012
537
0
0
I can play Middle-earth: Shadow of Mordor with a 7950 3GB on ultra... with ultra textures, yes... is it just me, or? Average 50 FPS at 1080p, of course.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
While there might be some inconsistencies here, I think 3GB sounds about right.

I seriously doubt it's even 3GB. I would wager it's no more than 2GB..

Tell them to use resource monitor, as that is pretty accurate for determining what the game itself is using..
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
That guy only played an hour, so you don't know if there are more VRAM-heavy parts later in the game. But 3500MB of memory usage suggests that 2 or 3 GB cards are not going to be a good idea with Ultra. It also suggests that any testing done with Ultra textures was done with a 6 GB card.

You're still not getting it dude. The game is going to use as much VRAM as you can throw at it because it's using the VRAM as a cache.

A much smaller portion of the VRAM is actually being used for render targets.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
A 32-bit executable hard crashes when the address-space limit is reached, which is not exactly nice for development or for players.
Having a 64-bit executable just makes everything a lot easier.

There's that, but with games getting larger, more seamless and more complex, a higher memory ceiling is definitely necessary.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
For instance, while this screenshot was used to show that a 32-bit program can use a combination of VRAM and DRAM exceeding 4GB, it also shows a graphics card with 2GB of VRAM and software loading somewhere near 2.8GB of textures.
2m2b220.png

Interesting, can you tell me what the name of that program is?

Would you also please show your calculation of 256MB of render frame buffering for 1920x1080 from the first post? Thanks.

If I had done the calculation, I would tell you. But I was quoting someone else from another forum..
 

rtsurfer

Senior member
Oct 14, 2013
733
15
76
I seriously doubt it's even 3GB. I would wager it's no more than 2GB..

Tell them to use resource monitor, as that is pretty accurate for determining what the game itself is using..

I am not even sure who to believe anymore.
The last guy I PMed first responded with this screenshot:

ccdddcd2_ShadowOfMordor_2014_10_01_09_27_48_711.jpeg


Then I asked him to specifically tell me the game executable's memory usage from Task Manager, to which he said 792.7MB. He is running SLI Titans. Maybe it's this low because he is just running the in-game bench & not actually playing.

But some other guy in the thread is saying his uses 4GB, & yes, he is specifically looking at the game only.

I would do my own tests as I have a 290X but my PC is going to be down for a while.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
So Watch Dogs is a VRAM hog, and as a side note, stuttering got worse in the last update (9/27/14, I think).

The theory of how they rely on VRAM is sound but people with 6GB cards still have stutter.

One person claimed that moving from his highly overclocked 4770k to a 5930K solved his stuttering issues.

Others have experimented with putting the whole game and its shader cache on RAMdisk with varying results. One person says that resolved his stutter and another says it still stutters.

The person that said the stutter is gone reports low frame rates with under-utilization of both the CPU and GPU. This person is using a 780GTX SLi setup.

I don't know what to believe and what to think.

Watch Dogs is a tough nut to crack if you want to figure out the source of the stuttering. I can't tell you how many hours I've wasted testing this and that in an effort to reduce the stuttering, only to have Ubisoft release a new "patch" and be right back to square one D:

Anyway, from what I've discerned, the game doesn't store ANY textures in system memory and relies solely on VRAM. As per Resource Monitor, the game itself uses only 2GB of system memory (with the pagefile disabled), which isn't enough to include textures.. It's only using the system memory for things like animations, audio, meshes, game code, etcetera...

But you're right that raw VRAM capacity isn't a silver bullet for this game. So it has to be something related to how the game is managing the VRAM..
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I am not even sure who to believe anymore.
The last guy I PMed first responded with this screenshot:

ccdddcd2_ShadowOfMordor_2014_10_01_09_27_48_711.jpeg


Then I asked him to specifically tell me the game executable's memory usage from Task Manager, to which he said 792.7MB. He is running SLI Titans. Maybe it's this low because he is just running the in-game bench & not actually playing.

But some other guy in the thread is saying his uses 4GB, & yes, he is specifically looking at the game only.

I would do my own tests as I have a 290X but my PC is going to be down for a while.
VRAM use and VRAM needs are not the same. Many game engines will use much more VRAM than they need, to keep from having to reload textures into VRAM all the time. Frame time distribution and IQ comparisons (making sure it isn't lowering quality with less VRAM) would be the best way to test it. If 3GB gameplay results in substantially more abnormally long frames (or any, if the game is normally pretty smooth) but 6GB does not, that would indicate >3GB to be useful. So far, the anecdotal evidence is against that.
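
Here's a simple way to run that comparison from captured frame-time logs (Python; the one-value-per-line log format and the file names are assumptions):

Code:
import statistics

def summarize(path, spike_factor=2.0):
    with open(path) as f:
        frame_ms = sorted(float(line) for line in f if line.strip())
    median = statistics.median(frame_ms)
    p99 = frame_ms[int(0.99 * (len(frame_ms) - 1))]
    spikes = sum(1 for ms in frame_ms if ms > spike_factor * median)  # "abnormally long" frames
    return {"frames": len(frame_ms), "median_ms": round(median, 2),
            "p99_ms": round(p99, 2), "spikes": spikes}

# Hypothetical file names for a 3 GB card run vs. a 6 GB card run of the same area:
for label, path in [("3GB card", "frametimes_3gb.csv"), ("6GB card", "frametimes_6gb.csv")]:
    print(label, summarize(path))

If the 3GB run shows a clearly higher spike count at the same image quality, that's the "substantially more abnormally long frames" signal.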
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
No. That is proper optimization. System RAM is slow, SSDs are slower, and HDDs even slower. Keeping tons in VRAM is good. That in itself is not poor memory management...

I agree with you, Cerb. I don't have anything against developers using VRAM as a cache for textures and other graphics data.. But I also think that developers are relying too much on brute-force methods to reduce development time and resources on the PC platform, and using VRAM to store textures to the extent that we are seeing in games today is just a symptom of a greater problem..