VRAM usage in lower resolutions

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
Just curious how some of those VRAM-hungry games behave at lower resolutions, say, 1366x768. Do you still need 2+ gigs of video memory for Ultra textures at this res?
 

nsafreak

Diamond Member
Oct 16, 2001
7,093
3
81
I doubt they would need a full 4 gigs or more for that resolution. But it'd also be pretty pointless to run with ultra level detail at that low of a resolution since you likely wouldn't be able to tell the difference between ultra and high at that point.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
I would think not. It's debatable if games need more than 2-3 GB even at 1080p. At 768p, 2 GB should be more than enough. I would not go down to 1 GB, though. I tried running Dragon Age Inquisition at high textures on my brother's 1 GB Radeon HD 5770, and the game did not take kindly to it at all even at 1440x900.
 

Magic Carpet

I doubt they would need a full 4 gigs or more for that resolution. But it'd also be pretty pointless to run with ultra level detail at that low of a resolution since you likely wouldn't be able to tell the difference between ultra and high at that point.
Thanks. But I wonder at what res Shadow of Mordor can be fully maxed out with a 2/4/? gig card, considering I heard it requires 6 gigs of VRAM at 1080p just to select Ultra?

I tried running Dragon Age Inquisition at high textures on my brother's 1 GB Radeon HD 5770, and the game did not take kindly to it at all even at 1440x900.
So, you had to tone down just one setting? Not bad for a five-year-old card. Did you fraps it?
 

Red Hawk

So, you had to tone down just one setting? Not bad for a five-year-old card. Did you fraps it?

I had just about everything else set to medium; I was just seeing if it could handle high textures. :biggrin: There's no way that card could run DAI even close to full blast.

Thanks. But I wonder at what res Shadow of Mordor can be fully maxed out with a 2/4/? gig card, considering I heard it requires 6 gigs of VRAM at 1080p just to select Ultra?

At the maximum setting, Shadow of Mordor loads ALL of its textures for a gameplay area, rather than dynamically streaming them. That's why it takes up so much memory. It's really an unnecessary thing that doesn't aid visual quality all that much, and Shadow of Mordor is the only game I've heard of that does it.
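For a rough sense of why preloading dwarfs streaming, here's a back-of-envelope sketch in Python. Every number in it (the texture counts, the 2K size, the BC1-style compression ratio, the mip overhead) is an illustrative assumption, not data from Shadow of Mordor:

```python
# Rough, illustrative math for preloading vs streaming textures.
# All counts and sizes are made-up example numbers, not game data.

def texture_mb(width, height, bytes_per_texel=0.5, mip_overhead=1.33):
    """Approximate size of one block-compressed (BC1-like, ~0.5 B/texel)
    texture including its mipmap chain, in MB."""
    return width * height * bytes_per_texel * mip_overhead / (1024 ** 2)

per_tex = texture_mb(2048, 2048)   # one 2K texture
streamed = 400 * per_tex           # hypothetical working set near the player
preloaded = 2000 * per_tex         # hypothetical full set for the whole area

print(f"one 2K texture: {per_tex:.1f} MB")
print(f"streamed working set (400 textures): {streamed / 1024:.2f} GB")
print(f"preloaded full set (2000 textures): {preloaded / 1024:.2f} GB")
```

With those made-up counts, the streamed working set lands around 1 GB while the preloaded set lands over 5 GB, which is the right order of magnitude for the 6 GB Ultra recommendation people quote.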
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
Yes, you need more than 2 GB. You could lower some settings, but when 2 GB cards are already getting slow at higher resolutions and gimped at lower ones, you know it's time for a new card. It's really up to the user at this point whether a new card is worth turning on those settings.

If I were too casual to give a damn about having every setting on, the settings in the demanding games I can run that use under 2 GB at 768p give an excellent fps experience, but turn on those memory-hogging settings and you're pretty much screwed.

Hardcore users at 768p can simply get high fps with the settings that run under 2 GB, or they can get a new card, turn on those settings, and get back to awesomeness. :biggrin:
 

Red Hawk

Yes, you need more than 2 GB. You could lower some settings, but when 2 GB cards are already getting slow at higher resolutions and gimped at lower ones, you know it's time for a new card. It's really up to the user at this point whether a new card is worth turning on those settings.

If I were too casual to give a damn about having every setting on, the settings in the demanding games I can run that use under 2 GB at 768p give an excellent fps experience, but turn on those memory-hogging settings and you're pretty much screwed.

Hardcore users at 768p can simply get high fps with the settings that run under 2 GB, or they can get a new card, turn on those settings, and get back to awesomeness. :biggrin:

Do you have an example of a card being bottlenecked by just 2 GB of RAM at 768p on high settings?
 

skipsneeky2

Do you have an example of a card being bottlenecked by just 2 GB of RAM at 768p on high settings?

Titanfall and COD AW. Titanfall's Insane textures can cause major stuttering for me, and in some cases and on some maps even the Very High ones do. I end up running High and 2x MSAA on my 770 for the best experience. COD AW and its advanced shadows can tank fps and memory, and this is with zero AA applied.

BF4 is the only game I play on my computer where I can drop under 60 fps with maxed settings at 768p and not touch 2 GB of memory. The drops aren't constant, but if there were such a thing as a 120 Hz 768p screen, I would need to drop as far as High with no MSAA to benefit...

I oftentimes use Nvidia's DSR and run right up to 2732x1536 in some cases; I need a new card, period, whether I want to keep enjoying DSR with new games or to max out certain games at 768p and pull 60+ with no issues.
 

Red Hawk

Titanfall and COD AW. Titanfall's Insane textures can cause major stuttering for me, and in some cases and on some maps even the Very High ones do. I end up running High and 2x MSAA on my 770 for the best experience. COD AW and its advanced shadows can tank fps and memory, and this is with zero AA applied.

BF4 is the only game I play on my computer where I can drop under 60 fps with maxed settings at 768p and not touch 2 GB of memory. The drops aren't constant, but if there were such a thing as a 120 Hz 768p screen, I would need to drop as far as High with no MSAA to benefit...

I oftentimes use Nvidia's DSR and run right up to 2732x1536 in some cases; I need a new card, period, whether I want to keep enjoying DSR with new games or to max out certain games at 768p and pull 60+ with no issues.

Titanfall was just notoriously poorly optimized and its textures didn't look good enough to justify the performance cost. I'm not sure about Advanced Warfare, but it could be another case of poor optimization as I have a hard time believing the developer that kept Call of Duty graphically stagnant for years would suddenly be legitimately pushing the limits of graphics processing even down to 768p.
 

Magic Carpet

Yes, you need more than 2 GB. You could lower some settings, but when 2 GB cards are already getting slow at higher resolutions and gimped at lower ones, you know it's time for a new card.
I was just thinking how badly GK104-class cards with 2 GB of VRAM are going to age with the increasing consumption of VRAM even at 1080p, from 2015 going forward. Maybe at 768p they might still be relevant. Or rather, how dependent is the actual memory consumption on resolution? Most of the benches I have seen are 1080p and up.

Okay, this has been mentioned earlier here. And it looks like even a GTX 480 is doing OK in some games, with just, what, 1.5 GB?

I oftentimes use Nvidia's DSR and run right up to 2732x1536 in some cases; I need a new card, period, whether I want to keep enjoying DSR with new games or to max out certain games at 768p and pull 60+ with no issues.
Speaking of DSR, does it require an extra framebuffer? Thanks.
 

skipsneeky2

Titanfall was just notoriously poorly optimized and its textures didn't look good enough to justify the performance cost. I'm not sure about Advanced Warfare, but it could be another case of poor optimization as I have a hard time believing the developer that kept Call of Duty graphically stagnant for years would suddenly be legitimately pushing the limits of graphics processing even down to 768p.

Yeah, I agree about Titanfall and COD AW there. There are other titles I could guarantee offer the same experience, and god knows there are enough crap PC ports, so it's almost something you just deal with. BO2 I could max out at 2732x1536 with AA and hold 50+ fps easily while staying under 2 GB usage, but even without AA and with shadows on at 768p, COD AW looked like garbage. Both fun games, surely. :)

I was just thinking how badly GK104-class cards with 2 GB of VRAM are going to age with the increasing consumption of VRAM even at 1080p, from 2015 going forward. Maybe at 768p they might still be relevant. Or rather, how dependent is the actual memory consumption on resolution? Most of the benches I have seen are 1080p and up.

Going forward, 3 GB+ really should be the default choice for anyone who is building or considering upgrading their card for 1080p. The 280/280X are dirt cheap and offer all that is needed for the majority of people, and maybe the new 960 Ti when it debuts. The 290/290X/970 are mainstream and affordable enough as well.

When true next-generation games come out that ditch the PS3/Xbox 360 ports, then it's gonna get a bit more serious, I am certain of it.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
RAM needed for textures is independent of RAM needed for buffers. So, yes, you'll still need XGB for some arbitrary texture res minimum. All you've done by lowering the res a little is saved some tens of MB, or maybe a 100-200MB, for some games, while you have GBs in use for meshes and textures. So, if you want fine detail on the ground, and on the shirt of the NPC you just bumped into, you need that VRAM (but first, you need textures of high enough resolution to give you that detail level!).

At the same detail levels, lacking enough VRAM to handle it will cause stutters, which won't show up in avg FPS. For game engines that cache well, amounts above that will increase performance slightly, up to some point of severely diminishing returns (basically, like system RAM, all your VRAM gets used up, but to no added benefit over having a bit less). Just like system RAM, once you have "enough," more offers very little, and after a point, no improvements at all. It's not having enough that can hurt, and very few games come with stock settings that are too hard for 2 GB today. Modded games aplenty, and games shipping with nice textures are here, but they all still back down, so you're not going to be left with an unplayable game, just one that might be able to look better with a newer card sporting more VRAM.
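The buffers-vs-textures point is easy to sanity-check with arithmetic. The render-target count and pixel format below are generic assumptions (think a deferred renderer's color, depth, and G-buffer targets), not any specific engine:

```python
# Back-of-envelope check: shrinking the output resolution only shrinks
# the per-pixel render targets, not the texture pool. The target count
# (6) and 4 bytes/pixel are illustrative assumptions.

def render_targets_mb(width, height, targets=6, bytes_per_pixel=4):
    """Approximate VRAM for all per-pixel render targets, in MB."""
    return width * height * targets * bytes_per_pixel / (1024 ** 2)

mb_1080p = render_targets_mb(1920, 1080)
mb_768p = render_targets_mb(1366, 768)
print(f"1080p buffers: {mb_1080p:.0f} MB")
print(f" 768p buffers: {mb_768p:.0f} MB")
print(f"saved by dropping to 768p: {mb_1080p - mb_768p:.0f} MB")
```

Dropping from 1080p to 768p saves roughly 23 MB here, i.e. "some tens of MB", while the texture pool, which is where the gigabytes live, doesn't shrink at all.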
 

skipsneeky2

Techspot is damn cool with their articles, Magic. The issue for me is that those are averages, where you could safely assume a minimum fps of half the average, making even a 480 something I wouldn't use for Ultra in BF4, which is one title off that article that I play. Many will, and that is fine. :)

Anyone playing with a 980 is out of their crack-smoking mind if they are only at 768p and not applying 4x MSAA; a 160 fps average without it in BF4 is overkill in the third degree. Helpful article, sure, but that is beyond silly, period. :) My freaking 3770 non-K can't even provide that average fps, lol.
 

Magic Carpet

RAM needed for textures is independent of RAM needed for buffers. So, yes, you'll still need XGB for some arbitrary texture res minimum. All you've done by lowering the res a little is saved some tens of MB, or maybe a 100-200MB, for some games, while you have GBs in use for meshes and textures. So, if you want fine detail on the ground, and on the shirt of the NPC you just bumped into, you need that VRAM (but first, you need textures of high enough resolution to give you that detail level!).

At the same detail levels, lacking enough VRAM to handle it will cause stutters, which won't show up in avg FPS. For game engines that cache well, amounts above that will increase performance slightly, up to some point of severely diminishing returns (basically, like system RAM, all your VRAM gets used up, but to no added benefit over having a bit less). Just like system RAM, once you have "enough," more offers very little, and after a point, no improvements at all. It's not having enough that can hurt, and very few games come with stock settings that are too hard for 2 GB today. Modded games aplenty, and games shipping with nice textures are here, but they all still back down, so you're not going to be left with an unplayable game, just one that might be able to look better with a newer card sporting more VRAM.
So basically it's up to the devs to optimize these things. And once the market saturation of 3 GB+ cards reaches a certain point, should we expect a bump in quality? I am under the impression that larger textures equal more detail and clarity?

But thanks for a rather in-depth explanation :thumbsup:
 

skipsneeky2

We need better-looking games, games that can use what we have at any given resolution, so I welcome any game using 2-3 GB without question, when the usage/performance/quality makes it warranted.

Performance and VRAM have both increased this latest generation; I say let's just use it, simple as that. If a game looked beautiful enough and trashed my 770 at a lowly 768p, I could accept it in a second.
 

Red Hawk

RAM needed for textures is independent of RAM needed for buffers. So, yes, you'll still need XGB for some arbitrary texture res minimum. All you've done by lowering the res a little is saved some tens of MB, or maybe a 100-200MB, for some games, while you have GBs in use for meshes and textures. So, if you want fine detail on the ground, and on the shirt of the NPC you just bumped into, you need that VRAM (but first, you need textures of high enough resolution to give you that detail level!).

At the same detail levels, lacking enough VRAM to handle it will cause stutters, which won't show up in avg FPS. For game engines that cache well, amounts above that will increase performance slightly, up to some point of severely diminishing returns (basically, like system RAM, all your VRAM gets used up, but to no added benefit over having a bit less). Just like system RAM, once you have "enough," more offers very little, and after a point, no improvements at all. It's not having enough that can hurt, and very few games come with stock settings that are too hard for 2 GB today. Modded games aplenty, and games shipping with nice textures are here, but they all still back down, so you're not going to be left with an unplayable game, just one that might be able to look better with a newer card sporting more VRAM.

It's interesting that you say meshes can consume a lot of memory. I've recently been trying to run Dragon Age Inquisition as best I can on my brother's PC, which is a stock C2Q Q6600, 4 GB of DDR2 system RAM, and a 1 GB Radeon HD 5770. I turned most settings down to medium (and tessellation down to low), but I tried running the "mesh" setting on high, because a good amount of details and effects were lost by going from high to medium. Playing at that setting, though, seemed to cause very noticeable stutter and brief hanging, which mostly went away upon turning the mesh setting back down to medium. Do you think that's memory size related, pushing past the 5770's 1 GB of memory? And if that's the case, would overclocking the 5770's memory help at all, or would it be fruitless?
 

Cerb

Like textures, models need roughly 3-4x the points to appear double the detail, so while generally lighter on VRAM than textures, much higher-detail models will take much more room, and more bandwidth. Stutter is usually a problem of having to go back out to system RAM, but unless you watch it and see usage going near the limit with one setting, hitting it with another, and getting stuttering, it's hard to say for certain. Depending on the game, a 5770 could very well be limited by the act of tessellation itself, if the game is going crazy with it (FI, Crysis), and I'm not sure how that works from one frame to the next, as far as the GPU's processing needs (IE, does it recompute a tessellated model each frame?).

Try it and see, I guess.
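The vertex-scaling claim above can be sketched numerically. The 48-byte vertex layout and the base vertex count are assumptions for illustration, and the 3.5x factor is just the midpoint of the 3-4x range mentioned:

```python
# Sketch of the scaling claim: roughly doubling a mesh's apparent
# detail needs ~3-4x the vertices (3.5x used here as a midpoint).
# The 48-byte vertex (3 floats position + 3 normal + 2 UV + 4 tangent
# = 12 floats) and the 100k base count are illustrative assumptions.

VERTEX_BYTES = 48

def mesh_mb(vertices, bytes_per_vertex=VERTEX_BYTES):
    """Approximate VRAM for one mesh's vertex buffer, in MB."""
    return vertices * bytes_per_vertex / (1024 ** 2)

base = 100_000  # hypothetical character model
for step in range(4):
    verts = base * 3.5 ** step
    print(f"detail x{2 ** step}: {verts:,.0f} verts ~ {mesh_mb(verts):.1f} MB")
```

Three "doublings" of apparent detail take the hypothetical model from under 5 MB to nearly 200 MB, which is why a high mesh setting can push a 1 GB card over the edge even though meshes start out cheaper than textures.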