The main reasons for inflated VRAM requirements


Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I am not even sure who to believe anymore.
The last guy I PMed first responded with this screenshot

Well the screenshot definitely isn't accurate as that's his entire system memory usage..

Then I asked him to specifically tell me the game's executable memory usage from the Task Manager, to which he said 792.7MB. He is running SLI Titans. Maybe it's this low because he is just running the in-game bench and not actually playing.

Yeah, that seems really low to me. Isn't Shadow of Mordor a 64-bit title?
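For anyone wanting to sanity-check a number like that 792.7MB figure: Task Manager's default memory column typically reflects the private working set, not total committed memory, so a 64-bit game can look deceptively small. Below is a minimal Windows sketch that reads both counters for a given PID; the PID handling and printout are purely illustrative, not any particular tool's output.

```cpp
// Minimal sketch: read a process's working set and private bytes, the two numbers
// people usually mix up when quoting "memory usage" from Task Manager.
#include <windows.h>
#include <psapi.h>
#include <cstdio>
#include <cstdlib>
#pragma comment(lib, "psapi.lib")

int main(int argc, char** argv)
{
    DWORD pid = (argc > 1) ? static_cast<DWORD>(std::atoi(argv[1])) : GetCurrentProcessId();
    HANDLE proc = OpenProcess(PROCESS_QUERY_INFORMATION | PROCESS_VM_READ, FALSE, pid);
    if (!proc) { std::printf("OpenProcess failed for PID %lu\n", pid); return 1; }

    PROCESS_MEMORY_COUNTERS_EX pmc{};
    if (GetProcessMemoryInfo(proc, reinterpret_cast<PROCESS_MEMORY_COUNTERS*>(&pmc), sizeof(pmc)))
    {
        // WorkingSetSize: physical RAM in use right now, roughly what Task Manager's
        // default column reflects. PrivateUsage: committed private bytes, which for a
        // 64-bit game is usually a much larger number.
        std::printf("Working set   : %.1f MB\n", pmc.WorkingSetSize / (1024.0 * 1024.0));
        std::printf("Private bytes : %.1f MB\n", pmc.PrivateUsage   / (1024.0 * 1024.0));
    }
    CloseHandle(proc);
    return 0;
}
```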
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
Watch Dogs is a tough nut to crack if you want to figure out the source of the stuttering. I can't tell you how many hours I've wasted testing this and that in an effort to reduce the stuttering, only to have Ubisoft release a new "patch" and be right back to square one D:

Anyway, from what I discerned, the game doesn't store ANY textures in system memory and relies solely on VRAM. As per resource monitor, the game itself uses only 2GB of system memory (with pagefile disabled), which isn't enough to include textures.. It's only using the system memory for things like animations, audio, mesh, game code etcetera...

But you're right that raw VRAM capacity isn't a silver bullet for this game. So it has to be something related to how the game is managing the VRAM..


I'm telling you, it's the lack of dynamic texture streaming. Last gen, most good engines didn't stall because they kept a very low-quality texture resident in VRAM to render while the higher-quality one was loaded. They don't need to do that anymore with so much RAM on the next-gen consoles, and PC gamers with very recent 2GB and even 3GB cards are suffering as a result.

This is not a bad use of VRAM if you have lots of it, and consoles do.
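For reference, the last-gen fallback scheme described here usually amounts to something like the sketch below: keep a tiny low-mip copy of every texture permanently resident and render with it until the full-quality version finishes streaming in. The Texture/StreamingCache types and function names are hypothetical, not taken from any real engine.

```cpp
// Rough sketch of a "render the low-res mip rather than hitch" texture streamer.
#include <chrono>
#include <cstdint>
#include <future>
#include <unordered_map>

struct Texture { std::uint32_t gpu_handle; std::uint32_t top_mip; }; // stand-in for a GPU resource

class StreamingCache {
public:
    // Called by the renderer whenever a material needs this texture for the current frame.
    const Texture& acquire(std::uint32_t id) {
        if (auto it = resident_.find(id); it != resident_.end())
            return it->second;                               // full-res copy already in VRAM

        auto& pending = pending_[id];
        if (!pending.valid())                                // kick off the async load exactly once
            pending = std::async(std::launch::async, &StreamingCache::load_full_res, this, id);

        if (pending.wait_for(std::chrono::seconds(0)) == std::future_status::ready) {
            Texture ready = pending.get();                   // disk read + upload finished
            pending_.erase(id);
            return resident_[id] = ready;                    // swap in without stalling the frame
        }
        return low_res_.at(id);                              // render blurry rather than hitch
    }

private:
    Texture load_full_res(std::uint32_t id) { return Texture{id, 0}; } // placeholder for real I/O

    std::unordered_map<std::uint32_t, Texture> low_res_;     // always resident (filled at level load)
    std::unordered_map<std::uint32_t, Texture> resident_;    // streamed-in full-quality copies
    std::unordered_map<std::uint32_t, std::future<Texture>> pending_;
};
```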
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
I agree with you, Cerb. I don't have anything against developers using VRAM as a cache for textures and other graphics data.. But I also think that developers are relying too much on brute-force methods to reduce development time and resources for the PC platform, and using VRAM to store textures to the extent we are seeing in games today is just a symptom of a greater problem..


The problem is that GPUs haven't come with enough VRAM. The next gen consoles surprised everyone with 8GB vs the 4GB they were predicted to have. There have been 6GB cards on the market for a while now, but I doubt they were popular because everyone thought they didn't need it. Well guess what....you do.

Using brute force to overcome this kind of stuff isn't a symptom of a disease, it's progress.
 

poohbear

Platinum Member
Mar 11, 2003
2,284
5
81
The problem is that GPUs haven't come with enough VRAM. The next gen consoles surprised everyone with 8GB vs the 4GB they were predicted to have. There have been 6GB cards on the market for a while now, but I doubt they were popular because everyone thought they didn't need it. Well guess what....you do.

Using brute force to overcome this kind of stuff isn't a symptom of a disease, it's progress.

Surprised everyone? Who was surprised in 2013 that they came with 8GB?? Most of us were disappointed, since most high-end systems had 16GB of RAM. And that's supposed to last for 10 years??

I don't call it progress, I call it poor optimization. If you're not using system RAM and only video RAM, then yes, I would call that a poorly optimized port. They don't even CARE that only 1% of users have 4GB of VRAM; it was all rushed and reeks of being last-minute.
 

TreVader

Platinum Member
Oct 28, 2013
2,057
2
0
ITT: People doing their best to justify their recent 2/3GB video card purchase.



I don't recall any area where people complain that we have "too much memory", except for this. Coincidentally, all the people complaining about 4+ GB VRAM usage have <4GB cards. Who'd've thunk it.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I'm telling you, it's the lack of dynamic texture streaming. Last gen, most good engines didn't stall because they kept a very low-quality texture resident in VRAM to render while the higher-quality one was loaded. They don't need to do that anymore with so much RAM on the next-gen consoles, and PC gamers with very recent 2GB and even 3GB cards are suffering as a result.

This is not a bad use of VRAM if you have lots of it, and consoles do.

Yeah but if the problem is strictly related to VRAM capacity, why do Titans and other 6GB cards still stutter? It has to be more than just VRAM capacity causing this problem. I suspect it's due to the inefficient way in which the engine manages VRAM..

The Disrupt engine wasn't ready for prime time. While other major 3D engines like CryEngine 3, Frostbite 3 and Unreal Engine 3.5 manage memory resources effectively across multiple platforms, the Disrupt engine falls flat on its ass on the PC..
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
The problem is that GPUs haven't come with enough VRAM. The next gen consoles surprised everyone with 8GB vs the 4GB they were predicted to have. There have been 6GB cards on the market for a while now, but I doubt they were popular because everyone thought they didn't need it. Well guess what....you do.

Using brute force to overcome this kind of stuff isn't a symptom of a disease, it's progress.

If you've been following the thread, you'd know that VRAM is mainly being used for storage, and not for rendering. In fact, modern GPUs are incapable of rendering a frame that actually requires the full frame buffer they're equipped with, due to a lack of computational power and bandwidth..

And while I concede that it is better, performance-wise, to store textures in VRAM than in system memory, a larger VRAM capacity isn't the silver bullet you seem to think it is.. Watch Dogs proves that you can't brute-force your way around proper resource management by forcing PC gamers to upgrade their video cards to higher-capacity models..

Writing tight and efficient code is still just as important as it's ever been..
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
If you've been following the thread, you'd know that VRAM is mainly being used for storage, and not for rendering. In fact, modern GPUs are incapable of rendering a frame that actually requires the full frame buffer they're equipped with, due to a lack of computational power and bandwidth..

And while I concede that it is better, performance-wise, to store textures in VRAM than in system memory, a larger VRAM capacity isn't the silver bullet you seem to think it is.. Watch Dogs proves that you can't brute-force your way around proper resource management by forcing PC gamers to upgrade their video cards to higher-capacity models..

Writing tight and efficient code is still just as important as it's ever been..


When devs feel that they need texture streaming in the consoles, it'll filter down to PC. Remember, they're fixed platforms with a finite amount of ram. This is a temporary issue while PC GPUs catch up to consoles. This IS tight code, because it's stripping out what should be an unnecessary hack.

Also, I don't know where this notion is coming from that storing anything in VRAM is some unusual thing. VRAM has been more than just a frame buffer since what, the Voodoo 3? Referring to it like that demonstrates a complete lack of understanding about why it's even there. You're talking about it like it's equivalent to a disk cache crowding programs out of main memory.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Texture streaming has been used in PC titles for years; I don't think devs will remove the feature from their engines just because the new consoles don't need it.

That said, some devs might not have many resources for the PC version and do a Titanfall. In that case you'll need lots of VRAM for the highest texture setting (and lots of hard disk space for uncompressed audio in all languages :mad: )
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
ITT: People doing their best to justify their recent 2/3GB video card purchase.

I don't recall any area where people complain that we have "too much memory", except for this. Coincidentally, all the people complaining about 4+ GB VRAM usage have <4GB cards. Who'd've thunk it.

Scenario 1: GPU VRAM is used as effectively as possible but eventually 3GB isn't enough and newer games require 4-6GB+. We have to upgrade.

Scenario 2: GPU VRAM is not used effectively and because of poor coding 3GB isn't enough and newer games require 4-6GB+. We have to upgrade.

There is no point in fighting the inevitable. I had my 7970s for nearly 3 years and they have 3GB of VRAM. To me 4GB is mid-range at best, and flagship cards imo should have 6-8GB for $500+. It's not our fault NV once again gimped their $550 980 with only 4GB of VRAM. They did this with the 1.5GB 480/580, the 2GB 680, and now the 4GB 980, which will be borderline in 2 years, but a "flagship" $500+ GPU should last more than 2 years.

I sure hope AMD brings out 6-8GB on their 390X so that NV has to bring 6-8GB cards to the market (980 8GB, GM200 6GB, etc.). In the past, I do not recall PC gamers complaining that their 9800 Pro 128MB would run out of VRAM. Instead, they embraced upgrading to X800XT 256MB/6800U 256MB because it was required to run new games. Today, all you hear are constant complaints from the very same people who tend to throw $300-500 on a new GPU for 30-50% performance improvements when in the past it was 70-100%. No one is forcing anyone to buy a 4GB card or side-grade from a 780. I am actually pretty disappointed that my 7970s 3GB slice through 99% of games with ease at 1080P and that hardly more than a handful of games take advantage of 3GB of VRAM.

I want developers to start taking advantage of next gen graphics/game engines/features because I am sick and tired of gimped PS4/XB1 graphics on the PC. If it means using 6-12GB of VRAM and my 7970s run at 10fps, I am going to be finally thrilled to upgrade for real, not just for internet e-peen/because I am bored of using the same GPUs for 3 years. Sure, there are 'artificial' ways to induce upgrading such as moving from 1080P to 1440/4K but bumping resolution beyond 1080p does not significantly improve graphics (Polygons, lighting effects, particle effects, shadows, AI, number of objects on the screen, physics complexity/motion animations, etc.); you just get more crispness and less aliasing but the game still looks 99% the same.

Therefore, as far as I am concerned, bring on games with 6GB VRAM requirements. 3-4GB is mid-range at this point when 7970 had it for 3 years and R9 290 for 1 year. Now a mid-range 970 has 4GB for $330, which suggests flagship cards should have 6-8GB moving forward, especially with the popularity of 4K gaming on the rise.
 

TreVader

Platinum Member
Oct 28, 2013
2,057
2
0
Scenario 1: GPU VRAM is used as effectively as possible but eventually 3GB isn't enough and newer games require 4-6GB+. We have to upgrade.

Scenario 2: GPU VRAM is not used effectively and because of poor coding 3GB isn't enough and newer games require 4-6GB+. We have to upgrade.

There is no point in fighting the inevitable. I had my 7970s for nearly 3 years and they have 3GB of VRAM. To me 4GB is mid-range at best, and flagship cards imo should have 6-8GB for $500+. It's not our fault NV once again gimped their $550 980 with only 4GB of VRAM. They did this with the 1.5GB 480/580, the 2GB 680, and now the 4GB 980, which will be borderline in 2 years, but a "flagship" $500+ GPU should last more than 2 years.

I sure hope AMD brings out 6-8GB on their 390X so that NV has to bring 6-8GB cards to the market (980 8GB, GM200 6GB, etc.). In the past, I do not recall PC gamers complaining that their 9800 Pro 128MB would run out of VRAM. Instead, they embraced upgrading to X800XT 256MB/6800U 256MB because it was required to run new games. Today, all you hear are constant complaints from the very same people who tend to throw $300-500 on a new GPU for 30-50% performance improvements when in the past it was 70-100%. No one is forcing anyone to buy a 4GB card or side-grade from a 780. I am actually pretty disappointed that my 7970s 3GB slice through 99% of games with ease at 1080P and that hardly more than a handful of games take advantage of 3GB of VRAM.

I want developers to start taking advantage of next gen graphics/game engines/features because I am sick and tired of gimped PS4/XB1 graphics on the PC. If it means using 6-12GB of VRAM and my 7970s run at 10fps, I am going to be finally thrilled to upgrade for real, not just for internet e-peen/because I am bored of using the same GPUs for 3 years. Sure, there are 'artificial' ways to induce upgrading such as moving from 1080P to 1440/4K but bumping resolution beyond 1080p does not significantly improve graphics (Polygons, lighting effects, particle effects, shadows, AI, number of objects on the screen, physics complexity/motion animations, etc.); you just get more crispness and less aliasing but the game still looks 99% the same.

Therefore, as far as I am concerned, bring on games with 6GB VRAM requirements. 3-4GB is mid-range at this point when 7970 had it for 3 years and R9 290 for 1 year. Now a mid-range 970 has 4GB for $330, which suggests flagship cards should have 6-8GB moving forward, especially with the popularity of 4K gaming on the rise.


I agree, and I really think nvidia has a good product with the 970, but they've gimped it (and even the 980!) in the hopes of charging people $700 more for GM210 + 8GB


The developers are lazy, so what? It's not like you can force them to optimize for <8GB. If nvidia had put 8GB on the card, nobody would be complaining. And they should've! At least the 980, right?
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I called the impact of the consoles completely wrong. Their relatively low GPU compute performance combined with their relatively large amount of VRAM is changing the way developers code for consoles to make things run fast. They are using the VRAM to cache a lot more things, and the poor I/O speed of the consoles means they can't stream effectively, not to mention the poor CPU performance. The end result is that the compute power on a PC is vastly more than necessary, but the VRAM isn't really enough, certainly not for the highest quality of textures and models the games can provide. We can have better post-processing, and in time, as the PC accelerates off in performance, we will see that matter more; the asset quality on consoles will drop to allow more compute use, and that might reduce VRAM use somewhat. But more VRAM is definitely necessary.

My understanding of the current pipelines for these games is that much of the VRAM goes into the lighting. They render many different images for different purposes so that later post-processing can use all of this information to generate the final scene. For example, you can have an image that shows how occluded a pixel is for the purposes of ambient occlusion, or a pre-HDR lighting image to determine what the brightness should be and then scale the lighting later on accordingly. These images are nothing like what we see; they are often in a different colour space (lighting being one of three components, for example) that aids the calculations, which is why VRAM use is far more than the 1920x1080x4 (for 32-bit colour) = 8MB of a single image. Textures are becoming a big part of it too, and that is because developers are using the strengths of the console, lots of VRAM (not fast, but lots of it), while avoiding its weaknesses (CPU, GPU and I/O performance).
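To put rough numbers on that, here is what the render targets alone can come to at 1080p. The per-pixel formats below are assumptions for a generic deferred renderer, not any specific engine's layout.

```cpp
// Back-of-the-envelope render-target sizes at 1920x1080 with assumed formats.
#include <cstdio>

int main()
{
    const double MB     = 1024.0 * 1024.0;
    const double pixels = 1920.0 * 1080.0;

    // A single 32-bit (4 bytes/pixel) colour target: the ~8 MB figure above.
    std::printf("32-bit back buffer    : %6.1f MB\n", pixels * 4 / MB);

    // A plausible G-buffer: albedo (4B) + normals (8B) + material params (4B)
    // + depth/stencil (4B) + HDR light accumulation (8B) = 28 bytes per pixel.
    std::printf("Deferred target set   : %6.1f MB\n", pixels * 28 / MB);

    // With 4x MSAA on those targets, render targets alone exceed 200 MB before a
    // single texture, shadow map or mesh is resident.
    std::printf("Same set with 4x MSAA : %6.1f MB\n", pixels * 28 * 4 / MB);
    return 0;
}
```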

I suspect we will see reasonably priced high-VRAM GPUs soonish. It will likely top out at 6GB for the grand majority of games anyway because of the impact of the consoles, and we can still expect PC games to target the GPUs that are actually available.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
I want developers to start taking advantage of next gen graphics/game engines/features because I am sick and tired of gimped PS4/XB1 graphics on the PC. If it means using 6-12GB of VRAM and my 7970s run at 10fps, I am going to be finally thrilled to upgrade for real, not just for internet e-peen/because I am bored of using the same GPUs for 3 years. Sure, there are 'artificial' ways to induce upgrading such as moving from 1080P to 1440/4K but bumping resolution beyond 1080p does not significantly improve graphics (Polygons, lighting effects, particle effects, shadows, AI, number of objects on the screen, physics complexity/motion animations, etc.); you just get more crispness and less aliasing but the game still looks 99% the same.

Therefore, as far as I am concerned, bring on games with 6GB VRAM requirements. 3-4GB is mid-range at this point when 7970 had it for 3 years and R9 290 for 1 year. Now a mid-range 970 has 4GB for $330, which suggests flagship cards should have 6-8GB moving forward, especially with the popularity of 4K gaming on the rise.

You said it... :thumbsup:
 

janii

Member
Nov 1, 2013
52
0
0
1080p with 1.5GB VRAM user here.

Games have been running absolutely fine on my PC, no matter if my VRAM got maxed out or not. Games won't simply say "sorry, but the game can't run because the RAM is maxed".

Literally every game I play maxes out my VRAM now. But I see no drawback. No, there are no mini-stutters (from attempting to load stuff from RAM or whatever).

Shadow of Mordor: I have everything on max except shadows on medium and texture quality on high. VRAM is maxed. Game runs at 60 fps (of course there are places where it dips).
GTX 580 btw.

The only time it starts to weaken is when I go higher resolution.
 

NTMBK

Lifer
Nov 14, 2011
10,232
5,013
136
"Storage"? Hahaha. How about "keeping all the data needed to render close to the GPU"?

If you start needing to stream things in from main memory (or god forbid, from a hard drive), expect a stuttery mess.

Hopefully Intel or AMD will sort their stuff out and give us a 200W APU with quad channel unified DDR4 memory. Get rid of this split memory nonsense.
 

Madpacket

Platinum Member
Nov 15, 2005
2,068
326
126
Hopefully Intel or AMD will sort their stuff out and give us a 200W APU with quad channel unified DDR4 memory. Get rid of this split memory nonsense.

Yes, AMD has a good chance here to take advantage of unified memory pools / address spaces, like what Kaveri did. Unfortunately I don't think DDR4 will really be fast or cheap enough for a little while, but eventually it'll come down and then AMD can release the super APU.
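For a sense of the bandwidth gap behind the "DDR4 won't be fast enough" point, here is a quick comparison of nominal peak numbers; the DDR4 speed grade is an assumption, and real-world figures are lower for all of these.

```cpp
// Nominal peak-bandwidth comparison: quad-channel DDR4 vs discrete-card GDDR5.
#include <cstdio>

int main()
{
    // Per-channel DDR bandwidth = transfer rate (MT/s) x 8 bytes per transfer.
    const double ddr4_2400_per_channel = 2400e6 * 8;                 // ~19.2 GB/s
    const double quad_channel_pool     = 4 * ddr4_2400_per_channel;  // ~76.8 GB/s unified

    // Discrete-card GDDR5 for comparison: (bus width / 8) x data rate.
    const double hd7970 = (384 / 8) * 5.5e9;                         // ~264 GB/s
    const double gtx980 = (256 / 8) * 7.0e9;                         // ~224 GB/s

    std::printf("Quad-channel DDR4-2400 : %6.1f GB/s\n", quad_channel_pool / 1e9);
    std::printf("HD 7970 GDDR5          : %6.1f GB/s\n", hd7970 / 1e9);
    std::printf("GTX 980 GDDR5          : %6.1f GB/s\n", gtx980 / 1e9);
    return 0;
}
```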
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I called the impact of the consoles completely wrong. Their relatively low GPU compute performance combined with their relatively large amount of VRAM is changing the way developers code for consoles to make things run fast. They are using the VRAM to cache a lot more things, and the poor I/O speed of the consoles means they can't stream effectively, not to mention the poor CPU performance. The end result is that the compute power on a PC is vastly more than necessary, but the VRAM isn't really enough, certainly not for the highest quality of textures and models the games can provide. We can have better post-processing, and in time, as the PC accelerates off in performance, we will see that matter more; the asset quality on consoles will drop to allow more compute use, and that might reduce VRAM use somewhat. But more VRAM is definitely necessary.

My understanding of the current pipelines for these games is that much of the VRAM goes into the lighting. They render many different images for different purposes so that later post-processing can use all of this information to generate the final scene. For example, you can have an image that shows how occluded a pixel is for the purposes of ambient occlusion, or a pre-HDR lighting image to determine what the brightness should be and then scale the lighting later on accordingly. These images are nothing like what we see; they are often in a different colour space (lighting being one of three components, for example) that aids the calculations, which is why VRAM use is far more than the 1920x1080x4 (for 32-bit colour) = 8MB of a single image. Textures are becoming a big part of it too, and that is because developers are using the strengths of the console, lots of VRAM (not fast, but lots of it), while avoiding its weaknesses (CPU, GPU and I/O performance).

I suspect we will see reasonably priced high-VRAM GPUs soonish. It will likely top out at 6GB for the grand majority of games anyway because of the impact of the consoles, and we can still expect PC games to target the GPUs that are actually available.
1920x1080x4? 1920x1080x16 :), or more, plus all those intermediate buffers, ideally. You could use several hundred MB today at 1080p, especially with AA. The consoles aren't driving the many-buffer thing, though; they are just finally catching up. Lighting happens to be very hard to make look good while also being fast. What deferred lighting could be done in DX9 worked, and could look good, but still wasn't nearly as good as what DX11+ can do. But it requires ever more buffers. These buffers may require more space than on the prior consoles, but they are much more constrained by bandwidth than by space. Each buffer is fairly small, but must be generated and then combined with others, making for several loads and stores per pixel/point in each buffer.
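A rough illustration of that last point, with assumed buffer counts and formats rather than a real engine profile: even small per-pixel buffers add up once each one is written and then re-read every frame.

```cpp
// Rough render-target bandwidth estimate for one frame of a deferred pipeline at 1080p.
#include <cstdio>

int main()
{
    const double pixels  = 1920.0 * 1080.0;
    const double gbuffer = pixels * 28;           // ~55 MB of targets at 28 assumed bytes/pixel

    // Say the deferred lighting pass reads the whole G-buffer once and writes an
    // 8-byte HDR target, then two post-process passes each read and write that target.
    const double per_frame  = gbuffer              // G-buffer reads
                            + pixels * 8           // HDR light-accumulation write
                            + 2 * 2 * pixels * 8;  // two post passes, read + write each
    const double per_second = per_frame * 60.0;    // at 60 fps

    std::printf("~%.1f GB/s of render-target traffic alone, before textures,\n"
                "shadow maps, overdraw or any MSAA resolves.\n", per_second / 1e9);
    return 0;
}
```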

1080p with 1.5GB VRAM user here.

Games have been running absolutely fine on my PC, no matter if my VRAM got maxed out or not. Games won't simply say "sorry, but the game can't run because the RAM is maxed".
Yes, they can. Not all will, but they can and do. Try FO3, F:NV, or Skyrim with mesh and/or texture packs, especially with a higher-than-normal cell loading distance.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Ohh please expand on that matter.

What's there to expand on that hasn't already been said? If it were simply a matter of VRAM capacity, then the Titans would play Watch Dogs without stuttering.

But evidently this isn't the case. Brute VRAM capacity can't make up for shoddy programming and bad design..
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
What's there to expand on that hasn't already been said? If it were simply a matter of VRAM capacity, then the Titans would play Watch Dogs without stuttering.

But evidently this isn't the case. Brute VRAM capacity can't make up for shoddy programming and bad design..


What makes you so certain it's VRAM that's causing the stutter?
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
When devs feel that they need texture streaming in the consoles, it'll filter down to PC. Remember, they're fixed platforms with a finite amount of ram. This is a temporary issue while PC GPUs catch up to consoles. This IS tight code, because it's stripping out what should be an unnecessary hack.

PC GPUs already exceed console GPUs by leaps and bounds. You're acting as though these isolated examples are the new rule, but they're not.

Most next-gen games are going to be made on Frostbite 3, Unreal Engine 4.x and CryEngine 3, and all of these engines are very efficient at managing resources, unlike the Watch Dogs engine, which was a disaster..

Resource management is really the key to all of this because, as has been stated numerous times, render targets don't need anywhere near this much VRAM.. Proper management of resources is why Shadow of Mordor can run high-quality textures on a GTX 580 1.5GB @ 1080p, despite the developers recommending 3GB and greater..

Also, I don't know where this notion is coming from that storing anything in VRAM is some unusual thing. VRAM has been more than just a frame buffer since what, the Voodoo 3? Referring to it like that demonstrates a complete lack of understanding about why it's even there. You're talking about it like it's equivalent to a disk cache crowding programs out of main memory.
I never claimed it's unusual, so I don't know where you're getting that from o_O

There's nothing wrong with storing textures in VRAM. But people are acting as though VRAM capacity is the be-all and end-all. What matters more than VRAM capacity is how well the engine manages its resources.

That's the defining factor, and not VRAM capacity..
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
It's a free, open-source program called Process Hacker, similar to Process Explorer.

You can find info and download from here.

http://processhacker.sourceforge.net/

Hey thanks for that, it's a very useful program :awe:

Just played around with it a bit and loaded up Crysis 3 and Watch Dogs. The GPU shared bytes in both games were very low, around 130MB.

This just goes to show that fears of games using main memory for a frame buffer are completely unfounded. The game will only use VRAM as a frame buffer. The system memory's purpose is to buffer the VRAM itself..

If done properly, this causes no performance hit..
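For what it's worth, the dedicated-vs-shared split that Process Hacker reports ultimately comes from the Windows video memory manager's accounting; on newer Windows, a game or tool can query its own usage and budget directly through DXGI 1.4. A hedged sketch follows, with error handling trimmed and the first adapter assumed.

```cpp
// Sketch: query this process's dedicated (VRAM) and shared (system) GPU memory usage,
// roughly what tools report as "GPU dedicated/shared bytes". Requires DXGI 1.4.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter1;
    factory->EnumAdapters1(0, &adapter1);          // first adapter; a real tool would enumerate

    ComPtr<IDXGIAdapter3> adapter;
    adapter1.As(&adapter);                         // QueryInterface to the DXGI 1.4 interface

    DXGI_QUERY_VIDEO_MEMORY_INFO local{}, shared{};
    adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);      // VRAM
    adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &shared); // system RAM visible to the GPU

    std::printf("Dedicated (VRAM) usage: %.1f MB of %.1f MB budget\n",
                local.CurrentUsage / (1024.0 * 1024.0), local.Budget / (1024.0 * 1024.0));
    std::printf("Shared (system) usage : %.1f MB of %.1f MB budget\n",
                shared.CurrentUsage / (1024.0 * 1024.0), shared.Budget / (1024.0 * 1024.0));
    return 0;
}
```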
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Scenario 1: GPU VRAM is used as effectively as possible but eventually 3GB isn't enough and newer games require 4-6GB+. We have to upgrade.

Scenario 2: GPU VRAM is not used effectively and because of poor coding 3GB isn't enough and newer games require 4-6GB+. We have to upgrade.

And here's Scenario 3: GPU VRAM is used as effectively as possible and combined with effective resource management, 3GB and 4GB cards are viable for years to come as rendering uses a small portion of the VRAM for buffering.

There is no point in fighting the inevitable. I had my 7970s for nearly 3 years and they have 3GB of VRAM. To me 4GB is mid-range at best, and flagship cards imo should have 6-8GB for $500+. It's not our fault NV once again gimped their $550 980 with only 4GB of VRAM. They did this with the 1.5GB 480/580, the 2GB 680, and now the 4GB 980, which will be borderline in 2 years, but a "flagship" $500+ GPU should last more than 2 years.

This is a total overreaction. janii in this thread claimed he was using a GTX 580 1.5GB card for Shadow of Mordor and playing with high-quality textures @ 1080p.

Yet the developer recommends TWICE that amount of VRAM for high-quality textures.. :|

And back when I had my GTX 580 SLI, I played Crysis 3 at 1440p on very high settings with no issue at all..

Like I've been saying, resource management is the real factor in all of this. If an engine is well coded and has efficient resource management, then sky-high VRAM requirements are unnecessary.

It's only with poorly coded and inefficient engines that it becomes a problem, and in a case like Watch Dogs, all the VRAM in the world still won't prevent stuttering because the engine is just so inefficient with it.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
1080p with 1.5GB VRAM user here.

Games have been running absolutely fine on my PC, no matter if my VRAM got maxed out or not. Games won't simply say "sorry, but the game can't run because the RAM is maxed".

Literally every game I play maxes out my VRAM now. But I see no drawback. No, there are no mini-stutters (from attempting to load stuff from RAM or whatever).

Shadow of Mordor: I have everything on max except shadows on medium and texture quality on high. VRAM is maxed. Game runs at 60 fps (of course there are places where it dips).
GTX 580 btw.

The only time it starts to weaken is when I go higher resolution.

Well stated :thumbsup: