How good can game graphics look with modest VRAM?

amenx

Diamond Member
Dec 17, 2004
4,299
2,629
136
How good can game graphics look with modest VRAM? Witcher 3 is probably the best-looking game I've seen. Yet look at the VRAM it consumes:

vram.gif


Tbh, I haven't played Mordor (a known VRAM hog), so I can't comment on how good it looks vs. W3. Other great titles noted for their graphics are Metro 2033 and Last Light, yet again under 2 GB of VRAM usage. Sure, there are great-looking games that consume a lot of VRAM, but is or was that necessary to the design of the game? Just shoving tons of textures into VRAM isn't the only way to make great-looking games... or even necessary... is it?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Witcher 3 uses more.

witcher3_vram.jpg (gamegpu.ru, The Witcher 3 v1.04 VRAM test chart)


And that's not even at ultra settings.

You need high-quality textures if you want a good result.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
I can try running Witcher 3 in a few locations and measure usage. Then we can rule out any benchmark that was run in a "low VRAM" area or with certain limiting settings. I will post back with results.

There is no way I can get remotely near the memory usage you posted. On the other hand, I am pretty close to gamegpu's numbers. Just 100 MB shy.

There is about a 300 MB difference depending on the location I tried.
 
Last edited:

happy medium

Lifer
Jun 8, 2003
14,387
480
126
At 1080p, almost all games don't use more than 2 GB of memory.
At 1440p, most games still don't use more than 2 GB of memory.
Any resolution higher than that needs more than 2 GB.

The very latest games seem to be using more memory.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I haven't been logging it and looking for a max used amount, or anything, but I've been seeing 1.4-1.6 GB in TW3 when I'm checking out other stuff in Afterburner. Settings are ultra, but with most of the post-processing off.

In general, games can look very good with re-use of textures, and not so good with many unique textures. A rich urban environment is probably going to need much more VRAM, if done well, than a forest, for instance, to get about the same perceived quality. Part of why modded Bethesda and Rockstar games use so much VRAM is the high-res textures for all the NPCs and clutter, which can have a great deal of variation within the same scene.
 
Last edited:

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
How good can game graphics look with modest VRAM? Witcher 3 is probably the best-looking game I've seen. Yet look at the VRAM it consumes:

vram.gif


Tbh, I haven't played Mordor (a known VRAM hog), so I can't comment on how good it looks vs. W3. Other great titles noted for their graphics are Metro 2033 and Last Light, yet again under 2 GB of VRAM usage. Sure, there are great-looking games that consume a lot of VRAM, but is or was that necessary to the design of the game? Just shoving tons of textures into VRAM isn't the only way to make great-looking games... or even necessary... is it?

I'm no expert, but I think there are methods of streaming textures that reduce the number of textures actually loaded into memory and dynamically load more depending on what's on screen. It's a way of optimizing performance. Of course, that sort of thing can lead to texture pop-in. I read that the reason Shadow of Mordor uses such a monstrous amount of memory is not because its textures are more detailed than any other game's, but because its highest texture setting disables texture streaming entirely and loads every texture in a whole game area simultaneously. It's kind of cool that they've enabled that for people with 8 GB of VRAM to waste, but there's not a lot of practical, visual benefit to it.
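I'm not claiming this is how any particular engine does it, but the basic mechanism would look roughly like this: a minimal C++ sketch of a budgeted streaming cache with a Mordor-style "load everything" toggle. All the class and function names here are made up.

```cpp
#include <cstddef>
#include <list>
#include <string>
#include <unordered_map>

// Hypothetical texture streamer: keeps resident textures under a VRAM budget,
// evicting the least-recently-used ones. With streaming disabled, every
// requested texture just stays resident, which is roughly what Mordor's top
// texture setting is reported to do.
class TextureStreamer {
public:
    TextureStreamer(std::size_t budgetBytes, bool streamingDisabled)
        : budget_(budgetBytes), noStreaming_(streamingDisabled) {}

    // Called when a texture becomes visible (or belongs to the loaded area).
    void request(const std::string& name, std::size_t sizeBytes) {
        auto it = resident_.find(name);
        if (it != resident_.end()) {               // already in VRAM: mark as recently used
            lru_.splice(lru_.begin(), lru_, it->second.lruPos);
            return;
        }
        upload(name, sizeBytes);                   // stand-in for the actual GPU upload
        lru_.push_front(name);
        resident_[name] = {sizeBytes, lru_.begin()};
        used_ += sizeBytes;

        if (noStreaming_) return;                  // "ultra": nothing ever gets evicted
        while (used_ > budget_ && lru_.size() > 1) {
            const std::string& victim = lru_.back();
            used_ -= resident_[victim].size;
            evict(victim);                         // stand-in for freeing the GPU copy
            resident_.erase(victim);
            lru_.pop_back();
        }
    }

    std::size_t residentBytes() const { return used_; }

private:
    struct Entry { std::size_t size; std::list<std::string>::iterator lruPos; };

    void upload(const std::string&, std::size_t) {} // stub
    void evict(const std::string&) {}               // stub

    std::size_t budget_;
    std::size_t used_ = 0;
    bool noStreaming_;
    std::list<std::string> lru_;                    // front = most recently used
    std::unordered_map<std::string, Entry> resident_;
};
```

With the streaming toggle off, nothing is ever evicted, so resident VRAM just grows to whatever the whole area holds, which matches that "monstrous amount of memory" behaviour without much visual payoff.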
 

Stormflux

Member
Jul 21, 2010
140
26
91
It is absolutely about the design of the game, or rather the assets and the management of those assets.

Usage of textures is only one piece of the puzzle in making a game look good.

As Cerb mentions, there are situations that call for unique textures, and others that can re-use textures.

See this link for an extreme example of re-using a single texture:
https://www.unrealengine.com/showcase/amazing-one-texture-enviroment

Characters that are a main focus of a game could use multiple 2K or 4K textures, for different channels as well: diffuse, specular and/or reflection (whether the game is PBR or not), SSS, normals, etc.

Other characters may not need that attention to detail, so they'll receive lower-quality textures.

The re-use of textures is mostly a shader capability, something The Witcher 3 utilizes a lot: mixing and tweaking various textures procedurally in many different ways. This method isn't open to most modding tools that I'm aware of (or it requires more effort), which is why texture packs that fill VRAM to the gills are common.

But as noted at the start, textures are only one contributor to VRAM usage.

Edit: And as I've come to learn recently as I look more into gaming tech (I come from a film/TV background), the frame buffer is a big reason why VRAM usage goes up as resolution increases.
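To put rough numbers on that last point: here's a back-of-the-envelope calculation of render-target memory versus resolution. The buffer counts and formats are purely illustrative (a generic deferred-style setup), not from any particular engine.

```cpp
#include <cstdio>

// Rough render-target memory at a given resolution, assuming an illustrative
// setup: five fat targets (G-buffer + HDR color) at 8 bytes/pixel each,
// a 4-byte depth/stencil buffer, and two 4-byte swap-chain images.
static double renderTargetMB(int w, int h) {
    const double pixels = static_cast<double>(w) * h;
    const double bytesPerPixel = 5 * 8.0   // G-buffer + HDR color targets
                               + 4.0       // depth/stencil
                               + 2 * 4.0;  // front + back buffer
    return pixels * bytesPerPixel / (1024.0 * 1024.0);
}

int main() {
    std::printf("1080p: ~%.0f MB\n", renderTargetMB(1920, 1080)); // ~103 MB
    std::printf("1440p: ~%.0f MB\n", renderTargetMB(2560, 1440)); // ~183 MB
    std::printf("4K:    ~%.0f MB\n", renderTargetMB(3840, 2160)); // ~411 MB
}
```

MSAA or extra post-processing targets multiply those figures further, so the jump in resolution hits VRAM harder than textures alone would suggest.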
 
Last edited:

sm625

Diamond Member
May 6, 2011
8,172
137
106
You only need a couple of GB to make a game look as good as possible given the compute resources available today. The only reason some games use more is due solely to the laziness of the devs. 2 GB is enough to store all the visual information within a normal first-person view range. If you were to switch to a scope, it is perfectly acceptable for there to be a 100 ms lag while additional textures are loaded. Games wouldn't even be realistic if you could just toggle a scope and instantly see at 50x magnification anyway, so the lag induced by filling memory synergizes with the desire for realism.

Aside from scoping, there is no need to load more ultra-high-res textures than what the character can see or travel to in a set amount of time. Unless we're talking about superhuman-type characters that can zoom around massive areas quickly and the devs want all that data to be instantly accessible. But really, how many games are like that? That basically only happens in driving games, which are usually very good at dynamic loading. Even in driving games, the frame of reference moves around at somewhat predictable speeds, which makes dynamic loading relatively easy.

Memory bandwidth is just so much more important than actual capacity. Dynamic loading will occur very frequently regardless of how much VRAM you have, because it is never enough to hold everything.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
You only need a couple of GB to make a game look as good as possible given the compute resources available today. The only reason some games use more is due solely to the laziness of the devs. 2 GB is enough to store all the visual information within a normal first-person view range. If you were to switch to a scope, it is perfectly acceptable for there to be a 100 ms lag while additional textures are loaded. Games wouldn't even be realistic if you could just toggle a scope and instantly see at 50x magnification anyway, so the lag induced by filling memory synergizes with the desire for realism.

Aside from scoping, there is no need to load more ultra-high-res textures than what the character can see or travel to in a set amount of time. Unless we're talking about superhuman-type characters that can zoom around massive areas quickly and the devs want all that data to be instantly accessible. But really, how many games are like that? That basically only happens in driving games, which are usually very good at dynamic loading. Even in driving games, the frame of reference moves around at somewhat predictable speeds, which makes dynamic loading relatively easy.

Memory bandwidth is just so much more important than actual capacity. Dynamic loading will occur very frequently regardless of how much VRAM you have, because it is never enough to hold everything.
In how many games can you walk up to things and not see them get fuzzy? So far I'm at zero, which, as far as I'm concerned, completely destroys your argument. Textures will not be detailed enough until every object that's right up against the "camera" shows no obvious stretching of textures on its surfaces. There's no need to be zooming; it happens with typical 60-90 degree FOVs.

As well, if you have a free camera, be it third- or first-person, you can generally make a 360 in 100-200 ms with no disorientation, and the game should be able to remain detailed and fluid throughout that time. That should generally be plenty of time to grab a texture and put it in VRAM, too, if the game can use system RAM to cache it in lieu of huge VRAM amounts, and/or can assume it will be loaded from an SSD (assuming only a few textures actually aren't already in VRAM, since it's the same space, and most of it will be needed at any angle).
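Some very rough numbers on that window, using nominal peak rates for era-typical links and ignoring latency, overhead, and decompression (purely illustrative):

```cpp
#include <cstdio>

// How much data can cross a link in a given window at a nominal rate.
// The rates below are ballpark peak figures, not measured throughput.
static double mbInWindow(double gbPerSec, double windowMs) {
    return gbPerSec * 1024.0 * (windowMs / 1000.0);
}

int main() {
    const double window = 150.0; // ms, roughly a quick 180-360 degree turn
    std::printf("SATA SSD   (~0.5 GB/s): ~%.0f MB\n", mbInWindow(0.5, window));  // ~77 MB
    std::printf("PCIe 3 x16 (~16 GB/s):  ~%.0f MB\n", mbInWindow(16.0, window)); // ~2458 MB
}
```

So caching in system RAM and pushing over PCIe can plausibly refill most of what a quick turn exposes; pulling it cold off a SATA SSD in the same window is a much tighter budget.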

Today, we still don't have anything even close to that, even in games that genuinely do look amazing, like Ryse (if only its gameplay could match its graphics--it's like Crysis in that respect). A texture that appears 2x as detailed, or 2x less fuzzy, takes just under 3x more space than the combination of less detailed textures, so the VRAM usage, if not managed by the necessary LOD at any given moment, would balloon, not merely bump up a tad. Instead of using LODs and high-res textures to give that kind of detail for close objects, but still dropping down at distance, we get things that look decent at a perceived 10-20 ft, and then that fuzzy wet-marker look up close.
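The mip-chain arithmetic behind that, with nothing engine-specific in it:

```cpp
#include <cstdio>

// Bytes for a square texture plus its full mip chain, at a given
// bytes-per-texel (4 = uncompressed RGBA8; block-compressed formats are
// smaller but scale the same way).
static long long textureWithMipsBytes(int size, int bytesPerTexel) {
    long long total = 0;
    for (int s = size; s >= 1; s /= 2)
        total += static_cast<long long>(s) * s * bytesPerTexel;
    return total;
}

int main() {
    const long long low  = textureWithMipsBytes(2048, 4); // the "fuzzy" version
    const long long high = textureWithMipsBytes(4096, 4); // 2x the linear detail
    std::printf("2048^2 + mips: %.1f MB\n", low / 1048576.0);  // ~21.3 MB
    std::printf("4096^2 + mips: %.1f MB\n", high / 1048576.0); // ~85.3 MB
    std::printf("ratio: %.2fx\n", static_cast<double>(high) / low); // ~4x total
}
```

Doubling the linear detail roughly quadruples the storage (i.e., about 3x on top of what was already there), so pushing every texture up a notch balloons VRAM rather than nudging it.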

What you see at a distance is of course a different matter, and should be more aggressively managed, IMO, to free up some of that VRAM that isn't needed, but we should have the resources today to remove perceived pop-in. Maybe with the decent amount of [V]RAM in the new consoles, engines over the next few years will take care of that.
 
Last edited:

alcoholbob

Diamond Member
May 24, 2005
6,367
435
126
Metro 2033 and Last Light are corridor shooters. Basically 2004 FPS game design. Certainly not the greatest examples of VRAM usage.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
Much more interesting than looking at "VRAM Usage" - which can mean everything or nothing depending on the situation - is to look at actual results in some real games with 2GB vs 4GB cards of the same line.

http://www.gamersnexus.net/guides/1888-evga-supersc-4gb-960-benchmark-vs-2gb


Here's one where 4GB makes a huge difference at 1080p:

960-4v2gn-acu.jpg



Here's a mix; sometimes it doesn't matter even at 4K:

960-4v2gn-grid.jpg


960-4v2gn-mll.jpg


960-4v2gn-bfhl.jpg


960-4v2gn-3dm1.jpg



If you look through them, these pretty much alternate yes-no-yes-no on whether 4GB is of benefit or not.

The article's conclusion seems to be spot on:

The answer to the “is a 4GB video card worth it?” question is a decidedly boring “it depends.”

Most of the time I've seen an article come to a different conclusion, it's because their sample size of games was limited and just happened to point in one particular direction.
 

96Firebird

Diamond Member
Nov 8, 2010
5,734
327
126
Whether 2GB vs 4GB cards matter depends not only on the game you want to play, but also on whether you're OK turning down a setting or two. If you don't want to sacrifice max settings, go with the larger VRAM.
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
Quality settings that would cause a difference between a midrange 2GB card and 4GB card would usually end up with FPS I wouldn't consider acceptable with either.
 

gamervivek

Senior member
Jan 17, 2011
490
53
91
Tiled resources will allow for even lower memory usage. Of course, developers will be quick to fill it up again.
 

dave1029

Member
May 11, 2015
94
1
0
What settings, and where in the world? Is there a hidden, "disable pop-in" somewhere, are you running multiple GPUs, or...?
4K resolution, maxed except no AA, just out in the wilderness. No hidden disable pop-in. SLI Titan X.
 

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
Games don't look good based on memory requirements. You can have a game use max memory and not look good; it all depends on how the game is developed.

The only reason we see games with such high memory use now is the consoles.
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
As others have said above, there are several factors. Just because a game's peak usage can be measured at x GB doesn't always mean it "needs" it for smooth gameplay if it has fairly intelligent streaming code (and there isn't always stutter when streaming it in, either). Bioshock Infinite used only 1.0-1.5 GB of VRAM yet initially stuttered like crazy even on 2-3 GB cards. This was related to certain loading zones streaming data into VRAM and was later patched out. The problem wasn't the need to dynamically stream data in, but rather the way it was poorly implemented in that specific game.

A well-written game can look and run surprisingly well on a 2 GB VRAM card. The key phrase being "well written". Since the new consoles were released, it seems easier to "dump" the whole lot into VRAM when porting to PC, resulting in very average-looking games "needing" lots of VRAM. Previous generations' console games were optimised for low VRAM usage due to hardware limitations. Meanwhile, modded Skyrim still looks great with 2K textures even on a 1 GB VRAM card, with little stutter, even though it will use nearer 2 GB when given it.

It's all down to the game / game engine in question. If Witcher 3 looks good and needs little VRAM, CDPR should be applauded for the effort in an age where Ubisoft couldn't even write Pacman without "needing" 3GB VRAM. :D