A look at VRAM use with AA enabled

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
GTA V even passes 4GB without AA at 4K.

Both GTA V and Shadow of Mordor pass 4GB with AA on at 1440p.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
6GB looks to be pretty safe and 8GB even better. It's pretty clear that next gen (Pascal, for example) will likely be 4GB/8GB. 4GB should continue to be fine for most 1080p use and 8GB will be preferred with HBM2.

I really am interested in Fury, but at anything higher than 1440p, 4GB makes me worried. Hoping an 8GB version arrives this year. :)
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
With Pascal I think we will see 16GB cards in the 980 Ti/Titan segment, while 8GB will be the standard in the 970/980 replacements.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
But why use AA at 4K?

Well, the article mentions that:

With AA applied - and some people really do love using AA - those numbers really start to climb above 6GB in some games, like Shadow of Mordor, or GTA V.

We aren't even maxing out the anti-aliasing settings, as you can force NVIDIA or AMD settings much higher, but that is overkill, isn't it? For some it won't be, so if that's something you want to see us do, please do let us know. For now, we've shown you just how much VRAM maxed-out in-game settings with AA (or AO in SoM) can cause games to consume.
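For a rough sense of why MSAA pushes VRAM numbers up, here's a hypothetical back-of-the-envelope estimate (assuming RGBA8 color and 32-bit depth/stencil, one stored sample per pixel per MSAA level; real drivers add resolve targets, compression metadata, and so on, so treat these as lower bounds):

```python
# Back-of-the-envelope estimate of render-target memory at 4K with MSAA.
# Assumes 4 bytes/sample for color (RGBA8) and 4 bytes/sample for
# depth/stencil (D24S8). MSAA stores N samples per pixel.

def render_target_mib(width, height, samples, bytes_color=4, bytes_depth=4):
    pixels = width * height
    total_bytes = pixels * samples * (bytes_color + bytes_depth)
    return total_bytes / 2**20  # MiB

for samples in (1, 2, 4, 8):
    mib = render_target_mib(3840, 2160, samples)
    print(f"{samples}x MSAA: {mib:.0f} MiB")
```

At 8x MSAA the raw render targets alone approach half a gigabyte before a single texture is loaded, which is consistent with the jumps the article measured.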
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
The most important aspect of all of this was not addressed: the impact on frame timing. If I had a huge stack of money and lots of time, I would do it myself... What we need to see is a vast array of GPUs all pushed to the max settings/AA like TT did, but with min/max frames, avg frames, and frame times/FCAT results added in, to determine just what the impact of consuming all this VRAM truly is. Anecdotally, based on my stuttery experience with my old 970, I can tell you it makes a difference, but I want to see hard evidence in a single repository compared against a whole bunch of other GPUs.
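The analysis being asked for here can be sketched in a few lines. Given per-frame render times (the kind of log FRAPS or FCAT produces; the numbers below are made up for illustration), average FPS barely moves when VRAM-exhaustion stutter appears, while minimum FPS and the worst frame times give it away:

```python
# Sketch of a frame-time analysis: avg/min FPS plus the 99th-percentile
# frame time, which is where stutter shows up even when average FPS
# still looks fine.

def frame_stats(frametimes_ms):
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    min_fps = 1000.0 / max(frametimes_ms)
    ordered = sorted(frametimes_ms)
    p99_ms = ordered[min(n - 1, int(n * 0.99))]
    return avg_fps, min_fps, p99_ms

# A smooth 60 fps run vs. the same run with three 100 ms stutter spikes:
smooth = [16.7] * 100
stuttery = [16.7] * 97 + [100.0] * 3
print(frame_stats(smooth))    # avg and min both near 60 fps
print(frame_stats(stuttery))  # avg still above 50 fps, min drops to 10 fps
```

Average FPS hides the spikes almost completely; the minimum and the tail of the frame-time distribution are what expose a card that is paging VRAM.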
 
Feb 19, 2009
10,457
10
76
GTA V even passes 4GB without AA at 4K.

Both GTA V and Shadow of Mordor pass 4GB with AA on at 1440p.

Even the last COD used more than 4GB of VRAM. The question isn't so much usage as whether it's required.

i.e. performance suffers due to lack of VRAM.

Then one has to factor in whether there's actually enough GPU grunt to run games at 4K with 4x MSAA, or lol 8x MSAA (yes, TT tested 4K with 8x MSAA or SSAA!). Considering many modern games tank hard in performance as soon as MSAA is used, it's really only for those who QUAD SLI Titan X.

The most realistic usage scenario for single or dual card of Titan X performance caliber at 4K is with MSAA disabled.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
This is a terrible review they did. Showing consumption measured in Afterburner tells you nothing about how much VRAM you actually need. You can use two different video cards with different amounts of VRAM in the same system, same game and same settings - but see two different values of VRAM consumed. You have to show framerate over time to give a real indication of how much VRAM you actually need. When you actually run out of VRAM your game drops down to sub 5fps and stutters while the memory is refilled. Just because software says the game is using X amount of VRAM, it does not mean that is how much VRAM you need.

For example my VRAM used in Battlefield 4 was different on my 780ti cards vs my Titan X cards; the Titan Xs used more VRAM. This has no impact on visuals or gameplay, both were identical, just different amounts of VRAM consumed.

The only time I've actually run out of VRAM in recent memory was trying to run Shadow of Mordor at 2560x1600 on my 3GB 780 Ti cards. The game ran out of VRAM constantly and stuttered horribly; I had to reduce the texture settings to make it playable. On my Titan X cards I saw VRAM usage over 5GB, but if you look at reviews of GTX 980 cards with 4GB, they don't stutter at 2560x1600 with maxed settings. The 980 with 4GB can even run Mordor at Ultra settings at 4K without running out of VRAM.




TLDR: This only shows how much VRAM is being filled according to monitoring software, not how much VRAM you actually need. To show actual VRAM requirements, you need to show minimum framerate over time in a graph that also displays VRAM usage over the same synchronized timeline.
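A synchronized timeline like that also lends itself to a simple automated check. This is a hypothetical sketch (made-up log data, made-up thresholds): flag the stutter spikes that coincide with VRAM pinned near the card's capacity, the signature of genuinely running out, as opposed to a spike with plenty of headroom left (say, from shader compilation):

```python
# Given per-frame (vram_mb, frametime_ms) samples logged together, return
# the indices of frames where a frame-time spike coincides with VRAM
# sitting within `margin_mb` of the card's capacity.

def vram_limited_spikes(samples, vram_capacity_mb, spike_ms=50.0, margin_mb=128):
    return [
        i for i, (vram_mb, ft_ms) in enumerate(samples)
        if ft_ms > spike_ms and vram_mb > vram_capacity_mb - margin_mb
    ]

# A 3 GB card: one spike while VRAM is pinned at capacity (frame 2),
# one spike with over 500 MB of headroom (frame 4).
log = [(2900, 16.7), (3000, 16.7), (3040, 120.0), (2500, 16.7), (2500, 90.0)]
print(vram_limited_spikes(log, vram_capacity_mb=3072))  # flags only frame 2
```

Only the spike that happens at the VRAM ceiling gets attributed to memory; the other one is counted as ordinary stutter.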
 
Last edited:

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
In my own subjective experience, going from 2560x1600 @ 30" with 2xMSAA to 3840x2160 @ 32" with no AA, the latter looks better. I played a lot of GTA V over the first few days of having a 4K monitor, and a lot of the geometry, especially curved geometry, doesn't have visible aliasing.

I posted this in another thread: the most visible aliasing I see is between high-contrast colours on lines which are near vertical or horizontal, the kind where the stepping is 1px offset every 30px or so. That occurred most obviously on the mini-map, where the GPS highlights your route in yellow and the line is near vertical but ever so slightly off.

Otherwise, things like the geometry of the cars, with their smoothly changing curves, and even things like thin power lines in the distance all look surprisingly good.

Remember that going from something like 1080p to 4K, that's 2x the horizontal and 2x the vertical resolution, so you're actually quadrupling the information per given area (for a monitor the same size); 4K @ 27" ought to need AA even less.
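The arithmetic is easy to check (standard resolutions, nothing assumed):

```python
# Pixel counts: 4K doubles both axes vs. 1080p, so the total pixel count
# (and the pixel density at equal monitor size) quadruples.

def pixels(width, height):
    return width * height

print(pixels(3840, 2160) / pixels(1920, 1080))  # 4.0  vs. 1080p
print(pixels(3840, 2160) / pixels(2560, 1440))  # 2.25 vs. 1440p
```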

More generally speaking, if you're running out of memory near the 4GB limit, you've probably already hit a GPU bottleneck anyway. I can't imagine even 2xAA is cheap at 4K; who cares whether there's enough memory for it if the final frame rate is like 10fps anyway?
 

xthetenth

Golden Member
Oct 14, 2014
1,800
529
106
Naive metrics are bad ways to determine things, news at 11.

Sloppy, bad work, timed so that it may as well be a hit piece, and except for GTA V it looks like they tried to tune the AA settings so usage stayed under 6GB.
 

Udgnim

Diamond Member
Apr 16, 2008
3,679
122
106
don't see it mentioned anywhere what AA method was used

isn't MSAA much more VRAM intensive than FXAA / SMAA?

nm I'm seeing it listed on their specific charts

they really need to also show min & avg FPS with the charts

if a game needs more than 4GB of VRAM to render scenes at 4K+AA but the framerate tanks no matter how much VRAM the GPU has access to, then the GPU is the issue
 
Last edited:

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
don't see it mentioned anywhere what AA method was used

isn't MSAA much more VRAM intensive than FXAA / SMAA?

nm I'm seeing it listed on their specific charts

they really need to also show min & avg FPS with the charts

if a game needs more than 4GB of VRAM to render scenes at 4K+AA but the framerate tanks no matter how much VRAM the GPU has access to, then the GPU is the issue

All good points. I feel we will likely see more in-depth reviews on this topic following the Fury launch, assuming it is 4GB initially. The 4GB vs. 6GB question will be a deciding factor for some, and solid, extensive reviews can hopefully highlight whether this is an issue or not. HBM is a wild card here too...does it change anything from what we see today with GDDR5? Don't know yet. :)
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
It is very interesting to see the same reactions (this time around) from many who had the opposite comments when the whole 2GB vs. 3GB debate was happening. Same with 1GB/1.5GB vs. 2GB further back.

What I learned personally is that both my 5870 1GB and 670 2GB were fantastic cards for the time I used them. I grabbed both at launch, so they were great buys and worked great for me, but both eventually hit a VRAM wall. I do think we will see similar results at 4K; that's just my gut feeling. Being 'good enough' now doesn't mean it will be great in 1 year, 18 months, or 2 years out.

The head-in-the-sand reactions here are alarming, and kind of sad. Let's look at the issue objectively, not through green or red lenses. If I am buying a 4GB card right now for anything more than 1080p, I should know that its days could be numbered in some cases, for some games. It doesn't mean it will not work, but I might have to compromise on some titles down the road.

More concrete info and results would be helpful, but let's also not kid ourselves and say there is zero advantage to 6GB/8GB.
 

MiRai

Member
Dec 3, 2010
159
1
91
But why use AA at 4K?
Unfortunately, 2160p isn't the savior people were touting a year ago, when the claim was that you wouldn't need AA at such a high resolution. As someone who is sensitive to any aliasing on screen, I prefer, at the very least, 2x MSAA, if not 4x MSAA, when playing at 3840x2160. "MSAA" is just used to fill in the blank; I actually prefer a method of AA that handles transparency as well.

If someone isn't sensitive to aliasing (or doesn't care), then I envy them, but jagged lines and "shimmering" bother me when I play games, and I'm the guy who will sacrifice a bit of FPS in order to get that small bump in quality.
 

96Firebird

Diamond Member
Nov 8, 2010
5,734
327
126
If I am buying a 4GB card right now for anything more than 1080p, I should know that its days could be numbered in some cases, for some games. It doesn't mean it will not work, but I might have to compromise on some titles down the road.

More concrete info and results would be helpful, but let's also not kid ourselves and say there is zero advantage to 6GB/8GB.

This. It all depends on the gamer, and if they are OK with lowering settings if/when their card runs out of VRAM.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
This. It all depends on the gamer, and if they are OK with lowering settings if/when their card runs out of VRAM.

Yeah, I think this is what people are missing. It's not going to be a black-or-white issue. We know 2GB is too small for a majority of newer games with high/ultra textures. That's just how it is. 4GB will likely be enough for many games, but likely not all, in every possible configuration. Some adaptations or compromises may be required. It is what it is.

Edit: Not putting all the blame on the card company. Whether 4GB is a limit also depends on the game engine/code. Some engines are more efficient than others: TW3 is great at keeping a small VRAM footprint, while other games are not, for example.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Very suspicious timing for this article. ;)

Agreed, but show me another time in the last 10-15 years when VRAM has differed so much between the top 3 options:

Titan X: 12GB
980Ti: 6GB
Fury: 4GB (rumored)

Add to that the fact that we also have HBM mixed in. This isn't 2GB vs 1GB, 2GB vs 1.5GB, or even 3GB vs 2GB. We are talking 2-3x differences in some cases, with the LEAST difference being 50%. That's pretty unique.
 

kasakka

Senior member
Mar 16, 2013
334
1
81
This article doesn't even factor in that games use VRAM in different ways. They can either keep in VRAM only about what is needed for the current set of textures and other data, or they can use a portion of it, or as much as is available, as a fast cache.

If a caching strategy is in place, a game can eat every last bit of VRAM you have available if the driver allows it (for example, the 970 driver tries to keep usage around 3.5GB but will go over if needed). You can't read that as "this game requires X amount of VRAM". The Titan X has enough excess VRAM that it can show the most a game will reserve, but the actual amount needed (data for drawing the current frame plus some cache) will be far less, most likely in the 2-3GB range for everything except 4K.
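The cache-versus-requirement distinction can be illustrated with a toy model (entirely hypothetical; no real driver or engine works exactly like this): an LRU texture cache that only evicts when its VRAM budget is exceeded. The per-frame working set is identical on both "cards", but a monitoring tool reading total residency reports very different numbers:

```python
from collections import OrderedDict

class TextureCache:
    """Toy LRU texture cache: keeps textures resident until the
    VRAM budget forces an eviction."""

    def __init__(self, budget_mb):
        self.budget_mb = budget_mb
        self.resident = OrderedDict()  # texture id -> size in MB

    def touch(self, tex_id, size_mb):
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)  # mark recently used
            return
        while sum(self.resident.values()) + size_mb > self.budget_mb:
            self.resident.popitem(last=False)  # evict least recently used
        self.resident[tex_id] = size_mb

    def used_mb(self):
        return sum(self.resident.values())

# Stream 12 GB of unique textures through a 4 GB and a 12 GB budget.
# Each frame only ever draws 1 GB of textures (8 x 128 MB).
for budget_mb in (4096, 12288):
    cache = TextureCache(budget_mb)
    for frame in range(12):
        for tex_id in range(frame * 8, frame * 8 + 8):
            cache.touch(tex_id, 128)
    print(budget_mb, cache.used_mb())  # each card reports "full"
```

Both runs report VRAM at the budget ceiling even though drawing any single frame only needs about 1 GB, which is exactly why "VRAM used" in Afterburner differs between a 780 Ti and a Titan X on identical settings.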

Many of these situations also require a dual TX/980 Ti setup in the first place, and performance walls are hit well before VRAM becomes an issue.

If the upcoming Fury only has 4 GB, then it's probably not going to be an ideal 4K card but it could still be pretty great for 1080p and 1440p.