The main reasons for inflated VRAM requirements

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
A post that I found on OC.net forums sums it up quite nicely:

User: RagingCain:

I wish people in here would not talk about a subject they do not know about.

People who know better see this as the equivalent of Windows 8 saying it needs 4GB of system RAM minimum, then Windows 9 saying it needs 12GB of RAM minimum, while doing the exact same thing.

The reason we don't need 6GB of VRAM for 1080p is the simple fact that the PC is a NON-UNIFIED MEMORY ARCHITECTURE. They are using the VRAM for plain storage because they have to on consoles. Anybody who doesn't understand that needs to stop talking when we complain, because we are trying to get them NOT to do this. We aren't bloody consoles; we generally have anywhere from 8GB to 32GB of accessible RAM, plus pagefiles, SSDs, PCIe SSD cards, etc.

The main system RAM is for secondary storage and caching, not VRAM. Period. End of discussion.

At 1920x1080 with 4xAA and double buffering, the framebuffer uses around 256MB, or to be fair about 512MB with all post-processing included. How the rest of VRAM gets used is essentially up to the developer and the drivers.

Memory usage is not linear; it does not go up every single year. It isn't suddenly time to need "2GB+" for 1080p. When the resolution stays the same, there is only so much memory usage can increase without dumping EXTRA crap into VRAM. For comparison's sake, a 2GB frame buffer works out to a 128-megapixel image per frame; the resolution needed to produce that is 56,633x8300.

When publishers tell developers to do this for the PC, the developers:
1.) Don't have to optimize for a NUMA architecture.
2.) Don't have to prioritize assets.
3.) Don't have to write efficient rendering methods.


A 780 Ti 6GB or Titan 6GB is a waste of money; you are paying for extra storage of cached extras. Those GPUs are incapable of rendering a framebuffer that would even remotely fill that, let alone the weak GPUs in these consoles.
Source
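
To put rough numbers on the framebuffer claim above, here is a quick back-of-the-envelope sketch. The assumptions are mine: 4 bytes per color sample, 4 samples per pixel for 4xAA, a matching depth/stencil buffer, and double buffering. Real totals also include extra render targets, which is how you get into the 256-512MB range quoted.

```python
# Rough framebuffer math for 1080p; numbers are illustrative, not authoritative.
MiB = 1024 ** 2
GiB = 1024 ** 3

width, height = 1920, 1080
pixels = width * height

bytes_color = pixels * 4 * 4      # 4 bytes per sample * 4 samples (4xAA)
bytes_depth = pixels * 4 * 4      # 4-byte depth/stencil per sample
double_buffered = (bytes_color + bytes_depth) * 2

print(f"1080p color+depth, 4xAA, double buffered: {double_buffered / MiB:.0f} MiB")

# The "128 megapixel" comparison: pixels a 2 GiB buffer could hold at 4xAA.
pixels_in_2gib = (2 * GiB) // (4 * 4)   # 16 bytes per pixel at 4xAA
print(f"Pixels that fit in 2 GiB at 4xAA: {pixels_in_2gib / 1e6:.0f} million")
```

Either way you slice it, the actual render targets for 1080p are a small fraction of a 2-6GB card; the rest is asset storage.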

Developers are targeting higher VRAM specs not because the texture resolutions are so much higher and more detailed, but because they are using VRAM as a storage cache in an effort to circumvent doing proper memory-management optimization on the PC, a NUMA platform.

Watch Dogs is the perfect example. Before the game even launched, I was worried because I had heard that the Disrupt engine would be an amalgamation of AnvilNext (from the AC series) and the Dunia engine (from the Far Cry series). Both engines, in my experience, relied HEAVILY on using VRAM as a storage cache and used very little system memory. AC IV, for instance, uses less than 700MB of RAM on the highest settings, despite being a very large game.

And sure enough, the low system memory utilization and overt reliance on VRAM for storage made their way from those engines into the Disrupt engine. That is what causes the pervasive stuttering we see in Watch Dogs today: the game itself uses only around 2GB of RAM on ultra settings as reported by Resource Monitor, a small amount for a big 64-bit game, so when VRAM swapping takes place the textures have to be pulled from the HDD/SSD rather than from system memory, which is much faster.

So basically, developers are either just lazy, or they want to save on development time and costs by relying on the PC's brute force rather than smarter and more effective optimization techniques. I'd say it's more the latter.

That's not to say there is anything wrong with using VRAM as a cache. However, relying on it so extensively can be misleading, as seen with Shadow of Mordor where the VRAM requirements are highly inflated; or worse, it can be very detrimental to performance, as seen in Watch Dogs.
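
A minimal sketch of what this amounts to: a tiered texture cache where system RAM sits between VRAM and disk. Everything below (class names, sizes, per-access costs) is an illustrative assumption of mine, not anything from the Disrupt engine:

```python
# Illustrative tiered texture residency: VRAM -> system RAM -> disk.
# All names, sizes and per-access costs are made-up examples.
from collections import OrderedDict

COST_MS = {"ram": 0.5, "disk": 15.0}     # rough refill cost per texture, in ms

class TextureStreamer:
    def __init__(self, vram_mb, ram_cache_mb):
        self.vram = OrderedDict()        # texture id -> size in MB, LRU order
        self.ram = OrderedDict()
        self.vram_budget = vram_mb
        self.ram_budget = ram_cache_mb
        self.stall_ms = 0.0

    def _evict(self, cache, budget):
        while cache and sum(cache.values()) > budget:
            cache.popitem(last=False)    # drop least recently used texture

    def request(self, tex_id, size_mb):
        if tex_id in self.vram:          # already resident: no stall
            self.vram.move_to_end(tex_id)
            return
        # VRAM miss: refill from system RAM if cached there, else from disk.
        self.stall_ms += COST_MS["ram"] if tex_id in self.ram else COST_MS["disk"]
        self.ram[tex_id] = size_mb       # keep a copy in the system-RAM cache
        self.ram.move_to_end(tex_id)
        self._evict(self.ram, self.ram_budget)
        self.vram[tex_id] = size_mb
        self._evict(self.vram, self.vram_budget)

# Two hypothetical setups streaming the same 4GB working set of 64MB textures:
scene = [(i % 64, 64) for i in range(512)]          # 8 laps over 64 textures
tiny_ram_cache = TextureStreamer(vram_mb=2048, ram_cache_mb=256)
big_ram_cache = TextureStreamer(vram_mb=2048, ram_cache_mb=4096)
for tex, size in scene:
    tiny_ram_cache.request(tex, size)
    big_ram_cache.request(tex, size)

print("total stall, tiny system-RAM cache:", round(tiny_ram_cache.stall_ms), "ms")
print("total stall, big system-RAM cache :", round(big_ram_cache.stall_ms), "ms")
```

With a cyclic working set bigger than VRAM, the setup that keeps evicted textures in a big system-RAM cache pays sub-millisecond refills, while the one that has to go back to disk stalls for tens of milliseconds per miss, which is exactly the stutter pattern described above.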
 

MTDEW

Diamond Member
Oct 31, 1999
4,284
37
91
Yeah, I already ranted about this several times on here.
Joking that once I have a GPU with 8GB of VRAM, the console ports can run entirely on my GPU alone and the rest of my PC can just watch. :p

The problem is, we all know it's happening, but there is nothing you can do about it except not buy the games.
But that just hurts PC gaming even more, because the less profit the PC versions make, the less chance we have of convincing devs to spend the resources to optimize the ports for the PC.

We all know the reality is that PC ports are just not profitable enough vs console versions for a developer to spend the extra resources porting games to the PC.
So they do what is best for them as a business to maximize profits, and that is to do the least costly PC port possible.
That's business, and it's part of what we have to accept as PC gamers, because it's not going to change any time soon.

So basically, the way I see it... it's a necessary evil for PC gaming to survive.

So I may joke now about wanting a GPU with 8GB of VRAM... but you can bet I will own one as soon as I feel the right one with the right price/perf becomes available. :biggrin:
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
14,487
5,155
136
I haven't seen anything that actually mentions the amount of texture memory used by either game on the consoles. So saying that the VRAM is being inflated is a little much without more information. But, yeah, the game is being designed around the consoles and the PC is just kind of there.

when VRAM swapping takes place the textures have to be pulled from the HDD/SSD rather than from system memory, which is much faster.

PCI Express really is slow enough that even pulling textures from main memory to the video card in the middle of a frame is going to cause stuttering. You could design the game to avoid that, but of course that isn't happening.
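
A rough back-of-the-envelope on why mid-frame transfers hurt; the ~12 GB/s effective figure for PCIe 3.0 x16 is my own assumption (theoretical peak is about 15.75 GB/s):

```python
# How much of a 60 fps frame budget a PCIe texture upload would eat (rough numbers).
pcie_gb_per_s = 12.0              # assumed effective PCIe 3.0 x16 throughput
frame_budget_ms = 1000 / 60       # ~16.7 ms per frame at 60 fps

for tex_mb in (16, 64, 256):
    transfer_ms = tex_mb / 1024 / pcie_gb_per_s * 1000
    share = transfer_ms / frame_budget_ms
    print(f"{tex_mb:4d} MB upload: {transfer_ms:5.2f} ms ({share:4.0%} of a 60 fps frame)")
```

Small residency updates are fine; pulling hundreds of MB of textures on demand is not, which is where the stutter comes from.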
 

amenx

Diamond Member
Dec 17, 2004
3,842
2,003
136
Brilliant. Thanks for that. Always wondered why I had no problems running any and all games smoothly with a 2GB 770 @ 1440p while everyone screams in your face that you need 4GB and above or suffer a hellish gaming experience. Sure, a couple of games may offer extra textures only available with higher VRAM, but I suspect I won't see anything meaningful enough to make me say "wow". Never bought Watch Dogs, but visually enjoyed WNO without the highest texture settings. Will check it out again when my 4GB 970 arrives, but needless to say there is no suspense in the waiting.
 

MTDEW

Diamond Member
Oct 31, 1999
4,284
37
91
Brilliant. Thanks for that. Always wondered why I had no problems running any and all games smoothly with a 2GB 770 @ 1440p while everyone screams in your face that you need 4GB and above or suffer a hellish gaming experience. Sure, a couple of games may offer extra textures only available with higher VRAM, but I suspect I won't see anything meaningful enough to make me say "wow". Never bought Watch Dogs, but visually enjoyed WNO without the highest texture settings. Will check it out again when my 4GB 970 arrives, but needless to say there is no suspense in the waiting.
Yeah, my brother played WNO on a GTX 470 with 1.5GB of VRAM, with compressed textures turned on and everything else maxed that he could at 1080p, and it ran and looked great.
And I personally played Watch Dogs on a 7950 and an R9 290, both without any stuttering at all on my two PCs.
On the one with the 7950 I ran Ultra textures and all Ultra settings except for shadows on high, in borderless window mode at 1080p... and my R9 290 ran it just fine as well with everything on Ultra @ 2560x1440.

So yeah, what you're saying does make sense since being smart about what settings you use can go a long way.
But the OP has a point also, in that we all know the current console ports aren't as optimized for PC architecture as they could be...and he's right.

But the days of games being developed and optimized for PC architecture first, then scaled down for the console versions, are over... It's just the reality of PC gaming today, IMO.

We'll eventually strike a balance between developers' "cost effective" console-to-PC ports and the right amount of GPU VRAM to run those kinds of ports at their best on PC, and then we should be back to fully reaping the rewards of gaming on a high-end PC... we're just not quite there yet, but I don't personally think we're that far away from that balance either.
 
Last edited:

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I guess PC elitists want to stick with their 2GB cards.
It's not like we need different textures for different objects. Concrete looks almost the same as steel when in motion, so that shaves off a few hundred MB. Let's put the ground texture on character clothes and call it camo - more VRAM savings!

The Middle-earth game uses 3GB of VRAM maxed out. Ultra textures add another 3GB on top of that; that is 3GB of textures that go into VRAM. Want to stream textures from system RAM? Why bother with a dGPU at all? Get a Kaveri IGP, since it already saturates 2400MHz DDR3 system memory as it is.
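
For a sense of scale, a rough peak-bandwidth comparison (spec-sheet numbers; the 7 Gbps, 256-bit GDDR5 card is just a generic example of mine):

```python
# Peak theoretical bandwidth; real-world throughput is lower across the board.
ddr3_2400_dual = 2400e6 * 8 * 2 / 1e9    # MT/s * 8 bytes/transfer * 2 channels
gddr5_256bit   = 7e9 * 256 / 8 / 1e9     # 7 Gbps per pin * 256-bit bus
pcie3_x16      = 15.75                   # GB/s, theoretical

print(f"DDR3-2400 dual channel : {ddr3_2400_dual:6.1f} GB/s")
print(f"GDDR5 256-bit @ 7 Gbps : {gddr5_256bit:6.1f} GB/s")
print(f"PCIe 3.0 x16           : {pcie3_x16:6.1f} GB/s")
```

Textures the GPU is actively sampling really do need to live in VRAM; the argument in this thread is only about what gets parked there as a cache.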

You can't simply transfer hUMA memory management in a game straight over to the PC without any changes, since (most) PCs do not support such a thing.
 

rtsurfer

Senior member
Oct 14, 2013
733
15
76
I think I remember reading in the OCN thread that this game is also using 8GB of RAM.
If that is right, then I don't think the "less RAM, more VRAM" logic applies here.

EDIT:
I asked in the thread; here is the reply.

User: cstkl1
It took a full 8.5GB alone, on top of a 1.5GB background load. But that was like 1 hr into gameplay, without the ultra texture pack installed.

So...
Logic flawed.

Edit 2: The guy made a mistake.
It's about 5GB.
The point still stands.
 
Last edited:

MTDEW

Diamond Member
Oct 31, 1999
4,284
37
91
I think I remember reading in the OCN thread that this game is also using 8GB of RAM.

If that is right, then I don't think the "less RAM, more VRAM" logic applies here.
Yeah, that does seem a bit "out of balance".
I never really paid attention to Mordor except when the recent headlines mentioned it needing 6GB of VRAM for ultra textures @ just 1080p!
I don't own it personally since it doesn't look like my type of game.
I would just assume that the less VRAM you have, the more system memory will be used to swap those textures into VRAM as needed... which, when using ultra textures, may cause stutter from transferring such large textures. Hence the "recommended" 6GB of VRAM for ultra textures @ 1080p is there to avoid that swapping and the possible stutter.
 
Last edited:

Ajay

Lifer
Jan 8, 2001
15,332
7,792
136
GFX cards stopped being mere 'frame buffers' a long time ago. So while the OP's source has a point, it's fairly outdated considering how much storage and computation data a modern GPU now needs to hold. Developers have a lot on their plate - they aren't likely being lazy, given the delivery timelines for games with ever more realistic effects, art assets, etc.
 

rtsurfer

Senior member
Oct 14, 2013
733
15
76
Yeah, that does seem a bit "out of balance".
I never really paid attention to Mordor except when the recent headlines mentioned it needing 6GB of VRAM for ultra textures @ just 1080p!
I don't own it personally since it doesn't look like my type of game.
I would just assume that the less VRAM you have, the more system memory will be used to swap those textures into VRAM as needed.

Well, the user reviews say that it has a combat system like the Batman & Assassin's Creed games, so I'll give it a shot.
Don't know if that's your cup of tea.

I was turned off by the game at first too since it looked kind of Skyrimish...
which is not my genre of gaming at all.

Here is a TL;DW of the game from the TotalBiscuit review. (I stole it from Reddit.)
See if it strikes your fancy.

Pros:
Very enjoyable combat which gives you appropriate amounts of choice. Many ways to take down the orcs.
Nemesis system introduces a whole new meaning to dying, which is fantastic.
Upgrade system is deep enough with good progression.
Generally he seems pleased with the PC port, performance is good and controls work great.
Great animations, combined with responsive controls gives combat a great feeling.
Cons:
No SLI support yet, but expected to come.
All missions rely heavily on fighting orcs, so if you don't enjoy the combat you might have an issue.
Pretty basic stealth system.
No obvious reason why it would use 3/6GB of video RAM; the textures aren't that amazing.
Mixed bag in the graphical fidelity. Generally a positive impression, but Mordor in itself is a pretty grey and boring place.
Bottom line:
If you like killing Orcs you will love this game.

I am trying to get some people to report on their system RAM usage. Let's see.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
I can attest to the devs' laziness. BF4 after this new patch is still a memory leak waiting to happen after about 1 hr of playtime in MP. It's disgusting; I can't even alt-tab now because 95% of my RAM is filled, and BF4, which starts out needing 2GB when you just enter a game, ends up eating something like 5.5GB of system RAM.
 

MTDEW

Diamond Member
Oct 31, 1999
4,284
37
91
Well, the user reviews say that it has a combat system like the Batman & Assassin's Creed games, so I'll give it a shot.
Don't know if that's your cup of tea.

I was turned off by the game at first too since it looked kind of Skyrimish...
which is not my genre of gaming at all.

Here is a TL;DW of the game from the TotalBiscuit review. (I stole it from Reddit.)
See if it strikes your fancy.



I am trying to get some people to report on their system RAM usage. Let's see.
Right now I'm busy replaying both Witcher games in preparation for The Witcher 3. (I absolutely loved The Witcher 1 & 2.)
But I may try the game if I can get it on sale... for me the story would have to be good. I'll check out the TB review and the specific game forums to see what I think.
Thanks for the suggestion... I value other "gamer" suggestions more highly than the usual Gamespot/IGN reviews! :thumbsup:
 
Last edited:

Ajay

Lifer
Jan 8, 2001
15,332
7,792
136
I can attest to the devs' laziness. BF4 after this new patch is still a memory leak waiting to happen after about 1 hr of playtime in MP. It's disgusting; I can't even alt-tab now because 95% of my RAM is filled, and BF4, which starts out needing 2GB when you just enter a game, ends up eating something like 5.5GB of system RAM.

Well, that's pretty egregious if you are not the only one (i.e., if it's not just some strange combination of hardware and software that screws with BF4 for a tiny minority of people - that case could be very difficult to suss out). I've worked in high-availability embedded development where that sort of thing happened from time to time (except we were dealing with larger customers and had to find a fix fast). It's somewhat sad that DICE doesn't have someone they can task with fixing unusual problems, which is what I did 25% of my time in that position (sometimes with a few others working the problem as well). I just happened to be good at fixing those types of problems and hence was called on often.

In any case, the devs probably want to fix it - almost all of the devs I've worked with take a lot of pride in the code they write. It's likely an issue of manpower, where DICE just won't give the developers the time to fix it.
 

MTDEW

Diamond Member
Oct 31, 1999
4,284
37
91
Well, that's pretty egregious if you are not the only one (i.e., if it's not just some strange combination of hardware and software that screws with BF4 for a tiny minority of people - that case could be very difficult to suss out). I've worked in high-availability embedded development where that sort of thing happened from time to time (except we were dealing with larger customers and had to find a fix fast). It's somewhat sad that DICE doesn't have someone they can task with fixing unusual problems, which is what I did 25% of my time in that position (sometimes with a few others working the problem as well). I just happened to be good at fixing those types of problems and hence was called on often.

In any case, the devs probably want to fix it - almost all of the devs I've worked with take a lot of pride in the code they write. It's likely an issue of manpower, where DICE just won't give the developers the time to fix it.
So can I go way OT and ask you a specific question?
When someone reports an issue like the one you quoted, do they try and build a system with the exact same hardware to try and replicate the issue?
Or is that simply deemed not cost effective and all the testing/troubleshooting is just done on what is considered the most common PC hardware configs?
Just a curiosity question that I've always wondered about. :thumbsup:
 
Last edited:

PPB

Golden Member
Jul 5, 2013
1,118
168
106
Well, that's pretty egregious if you are not the only one (i.e., if it's not just some strange combination of hardware and software that screws with BF4 for a tiny minority of people - that case could be very difficult to suss out). I've worked in high-availability embedded development where that sort of thing happened from time to time (except we were dealing with larger customers and had to find a fix fast). It's somewhat sad that DICE doesn't have someone they can task with fixing unusual problems, which is what I did 25% of my time in that position (sometimes with a few others working the problem as well). I just happened to be good at fixing those types of problems and hence was called on often.

In any case, the devs probably want to fix it - almost all of the devs I've worked with take a lot of pride in the code they write. It's likely an issue of manpower, where DICE just won't give the developers the time to fix it.

Memory leaks in BF4 are commonplace, especially with Mantle. New Mantle drivers sorted it out a bit, but this new patch (without any driver change) screwed it up again.

Also, to get a patch after 5 months and not be able to find a single performance optimization for any renderer in the BF4 patch notes is just disgusting. As if the performance degradation with vsync doesn't exist at all, or the horrible draw distance regardless of mesh quality and terrain quality settings, or the weird performance degradation caused by irregular server performance. Yeah, the game is just fine, really.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
No shit, textures are stored in VRAM as well as the framebuffer? :rolleyes:

The present model is fine considering the cost of giving system memory GPU-like transfer rates. Lots of fast VRAM keeps everything running nice and smooth. And I believe that in a few years, 6 GB or 8 GB will be necessary. I would say that 4 GB is a happy medium to be at right now.

The 360 and PS3 were already constrained with 512 MB (and to think that 256 MB was the original plan!). At the time of the 360's release, 256 MB was the norm for anyone who took PC gaming seriously and the 360 had as good a GPU for 720p rendering as any PC GPU. In two years it became painfully obvious that 512 MB graphics cards would be necessary for higher than 720p PC game renders as devs got better at utilizing 360 and PS3's memory and hardware. The 360's 512 MB of memory in particular is all available to devs for use as VRAM.

The Xbone and PS4 are not as memory constrained. They have 16x the memory but are not 16x as powerful overall. However, they use a substantial portion of their memory for background processes. The PS4 IIRC has 5 GB available to devs, which is, it's safe to say, quite a bit.

The console dev-available memory will only increase in the coming years as their OSs get revised and patched. The argument also lies with whether or not the consoles have the graphics horsepower to really make use of the amount of memory they have. From a purely graphics standpoint, I don't think they do, unless new memory intensive but GPU-friendly rendering effects come into being. One also has to remember that memory usage profiles are different engine-to-engine.

In PCs, I see 2 GB being OK for the time being, but 3 or 4 GB for graphics cards seems much more logical when you consider the higher AA, AF, and possibly higher resolutions that may be run with PC versions of games. 4 GB also has a good deal of longevity built into it, and is likely enough to last a gamer most of the way through this generation.
 

Ajay

Lifer
Jan 8, 2001
15,332
7,792
136
So can I go way OT and ask you a specific question?
When someone reports an issue like the one you quoted, do they try and build a system with the exact same hardware to try and replicate the issue?
Or is that simply deemed not cost effective and all the testing/troubleshooting is just done on what is considered the most common PC hardware configs?
Just a curiosity question that I've always wondered about. :thumbsup:

For our hardware/firmware - I would simply build a subset of the client's network and use traffic generators to simulate their network. Occasionally, I or another team member would need to travel to the customer site to gather diagnostic info. We worked with much more closed systems than the PC world.

The open nature of the PC ecosystem presents a very different challenge. I didn't work much with that ecosystem, but another group in my former company had a lot more QA guys per product than we did - even though it was enterprise-class software (designed to run on much more specific hardware configs).

You can PM me if you have more questions.
 

Ajay

Lifer
Jan 8, 2001
15,332
7,792
136
Memory leaks in BF4 are commonplace, especially with Mantle. New Mantle drivers sorted it out a bit, but this new patch (without any driver change) screwed it up again.

Also, to get a patch after 5 months and not be able to find a single performance optimization for any renderer in the BF4 patch notes is just disgusting. As if the performance degradation with vsync doesn't exist at all, or the horrible draw distance regardless of mesh quality and terrain quality settings, or the weird performance degradation caused by irregular server performance. Yeah, the game is just fine, really.

This reeks of being a major screw-up by DICE management. Apparently, they are raking in so much profit that 'good enough for most' is all they really care about. I'm sorry for your troubles.
 

videogames101

Diamond Member
Aug 24, 2005
6,777
19
81
System RAM has higher latency to the GPU than VRAM, obviously - I don't see any reason why they need to be unified, and I don't see anything wrong with massive VRAM usage. In fact, I prefer the higher performance that results from increased caching in larger available low-latency memory pools, such as VRAM.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
I wish people would stop using the "developers are lazy" platitude.

We can switch to "devs are mediocre these days" if you want. We say they are lazy just to avoid concluding that they are ultimately bad at what they do.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
We all know the reality is that PC ports are just not profitable enough vs console versions for a developer to spend the extra resources porting games to the PC.
So they do what is best for them as a business to maximize profits, and that is to do the least costly PC port possible.

I'm not sure profitability is a big factor in this. By all reports, the PC platform is extremely profitable due to digital distribution, much longer sales cycles, and no cut going to Microsoft or Sony. In fact, the PC gaming market is bigger than the console market by a massive margin, and is projected to continue growing.

It's also important to note that not every developer does this, and some game engines are better than others at utilizing PC hardware. That's the crux of the matter, I think. Developers are designing engines to be inherently multiplatform, which is a daunting task. CryEngine 3 is one of the most optimized engines out there for PC, along with Frostbite 3 and Unreal Engine 3.

To this day I still marvel at how well optimized Crysis 3 was for PC. The first time I played it was back when I had my overclocked GTX 580s. I feared their 1.5GB of VRAM would cause performance issues, but the game played surprisingly well; I had an average frame rate of 35 @ 1440p on very high settings with SMAA 1x.

Ubisoft's Disrupt engine, on the other hand, should never have seen the light of day. If Ubisoft intended the Disrupt engine to be a major multiplatform engine, they really need to go back to the drawing board.
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
A few GB is a lot of texture/vertex data, and a lot of it should be streamed in. (You do not need desert textures/objects if the area is a castle in the woods.)
This doesn't fix problems with bad texture choices, which can appear everywhere (e.g. a 4-layer 4K texture on an arrow: 32MB-64MB depending on compression, 256MB without).
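
A quick check of that arrow example (ignoring mipmaps, which add roughly another third; the specific block-compression formats are my own illustrative picks):

```python
# Memory footprint of a 4-layer 4K texture set at different compression rates.
texels = 4096 * 4096
layers = 4
MiB = 1024 ** 2

for name, bytes_per_texel in [("uncompressed RGBA8", 4.0),
                              ("BC3/DXT5 (1 byte/texel)", 1.0),
                              ("BC1/DXT1 (0.5 byte/texel)", 0.5)]:
    total_mib = texels * layers * bytes_per_texel / MiB
    print(f"{name:26s}: {total_mib:6.0f} MiB")
```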

This is where virtual texturing or fine-grained streaming wins: the VRAM cost no longer depends on source texture sizes, since what stays resident is just a couple of constant-size buffers.
We can switch to "devs are mediocre these days" if you want. We say they are lazy just to avoid concluding that they are ultimately bad at what they do.
You could switch that to "development time is too short"...
IMHO, it's incredible that projects this big are ready by their release dates.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I haven't seen anything that actually mentions the amount of texture memory used by either game on the consoles. So saying that the VRAM is being inflated is a little much without more information. But, yeah, the game is being designed around the consoles and the PC is just kind of there.

What does memory usage on the Xbox One and PS4 have to do with VRAM requirements on the PC? The PS4 and Xbox One don't even have access to ultra-level textures, among many other things.

At any rate, ultra textures in Shadow of Mordor do not NEED 6GB of VRAM. That's the point. Most of the VRAM requirement stems from caching, not from rendering. Even high-level textures will fill your VRAM up with cache, because that's how the game engine has been designed to work.

This isn't really problematic, but it's very misleading. With Watch Dogs on the other hand, it IS problematic because the texture variance is much greater and so texture swapping is much more frequent, which leads to stuttering.

PCI Express really is slow enough that even pulling textures from main memory to the video card in the middle of a frame is going to cause stuttering. You could design the game to avoid that, but of course that isn't happening.

Nobody insinuated that pulling textures from main memory to the GPU for rendering was advisable. Main memory should never be used for rendering by the GPU, as it's not fast enough.

System memory should be used as a buffer for VRAM. But when developers like Ubisoft don't take advantage of system memory as a cache and instead hammer the VRAM, it can cause lots of performance issues in certain types of games with large texture variation.

A large open-world game with the kind of texture variation Watch Dogs has should be using 5GB of system memory as a buffer, not 2GB, if it were properly optimized for PC. Why on Earth did Ubisoft even make the game 64-bit if they're not going to take advantage of the increased memory address space?
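
A rough illustration of that last point, using approximate Windows user-mode address-space limits (exact figures vary by OS version; the 5GB staging cache is just the hypothetical from above):

```python
# Why a multi-GB system-RAM texture cache effectively requires a 64-bit build.
GiB = 1024 ** 3
user_address_space = {
    "32-bit process (default)":            2 * GiB,
    "32-bit process (/LARGEADDRESSAWARE)": 4 * GiB,
    "64-bit process":                      8 * 1024 * GiB,   # roughly 8 TiB on this era of Windows
}
staging_cache = 5 * GiB    # hypothetical system-RAM texture buffer
other_game_data = 2 * GiB  # rough allowance for everything else the game allocates

for name, limit in user_address_space.items():
    verdict = "fits" if staging_cache + other_game_data <= limit else "does NOT fit"
    print(f"{name:38s}: 5 GiB cache + ~2 GiB game data {verdict}")
```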