Just curious: how do some of those VRAM-hungry games behave at lower resolutions, say 1366x768? Do you still need 2+ GB of video memory for Ultra textures at this res?
> Thanks. But I wonder, at what res can Shadow of Mordor be fully maxed out with a 2/4/? GB card, considering it requires 6 GB of VRAM just to select Ultra at 1080p, or so I heard?

I doubt it would need a full 4 GB or more at that resolution. But it'd also be pretty pointless to run ultra-level detail at that low a resolution, since you likely wouldn't be able to tell the difference between Ultra and High at that point.
> So, you had to tone down just one setting? Not bad for a five-year-old card. Did you Fraps it?

I tried running Dragon Age: Inquisition with High textures on my brother's 1 GB Radeon HD 5770, and the game did not take kindly to it at all, even at 1440x900.
Yes, you need more than 2 GB. You could lower some settings, but when 2 GB cards are already getting slow at higher resolutions and gimped at lower ones, you know it's time for a new card. It's really up to the user at this point whether a new card is worth it to turn on those settings.
If I were too casual to give a damn about having every setting on: in the demanding games I can run, the settings that stay under 2 GB at 768p give an excellent FPS experience, but turn on those memory-hogging settings and you're pretty much screwed.
Hardcore users at 768p can simply get high FPS with the settings that run under 2 GB, or they can get a new card, turn those settings on, and get back to awesomeness. :biggrin:
Do you have an example of a card being bottlenecked by just 2 GB of VRAM at 768p on high settings?
Titanfall and COD: AW. Titanfall's Insane textures can cause major stuttering for me, and in some cases and maps even the Very High ones do. I end up running High with 2x MSAA on my 770 for the best experience. COD: AW and its advanced shadows can tank FPS and memory, and that's with zero AA applied.
BF4 is the only game I play on my computer where I can drop under 60 FPS with maxed settings at 768p without touching 2 GB of memory. The drops aren't constant, but if there were such a thing as a 120 Hz 768p screen, I would need to drop as far as High with no MSAA to benefit...
I oftentimes use Nvidia's DSR and run right up to 2732x1536 in some cases. I need a new card, period, whether I want to keep enjoying DSR with new games or to max out certain games at 768p, pull 60+ FPS, and crank the settings with no issues.
Speaking of DSR, does it require extra framebuffer memory? Thanks.
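DSR does render internally at the higher resolution and then filters the result down, so screen-sized buffers scale with the DSR factor. A quick sketch of that scaling (the buffer count and 32-bit pixel format here are assumptions for illustration, not measurements):

```python
# DSR renders at a scaled internal resolution, then downsamples to native.
# Estimate how screen-sized render targets grow with the DSR factor.
# target_count and bytes_per_pixel are illustrative assumptions.

def render_targets_mb(width, height, bytes_per_pixel=4, target_count=4):
    return width * height * bytes_per_pixel * target_count / 2**20

NATIVE_W, NATIVE_H = 1366, 768

for factor in (1.0, 2.0, 4.0):    # DSR factor multiplies total pixel count,
    scale = factor ** 0.5         # so each axis scales by sqrt(factor)
    w, h = round(NATIVE_W * scale), round(NATIVE_H * scale)
    print(f"DSR {factor}x -> {w}x{h}: ~{render_targets_mb(w, h):.0f} MB")
```

At 4x, a 768p screen renders at 2732x1536, and these assumed buffers quadruple from roughly 16 MB to roughly 64 MB; real engines allocate more targets than this, but the scaling is the point.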
Titanfall was just notoriously poorly optimized and its textures didn't look good enough to justify the performance cost. I'm not sure about Advanced Warfare, but it could be another case of poor optimization as I have a hard time believing the developer that kept Call of Duty graphically stagnant for years would suddenly be legitimately pushing the limits of graphics processing even down to 768p.
I was just thinking how badly GK104-class cards with 2 GB of VRAM are going to age with the increasing consumption of VRAM, even at 1080p, from 2015 going forward. Maybe at 768p they might still be relevant. Or rather, how dependent is memory consumption on the actual resolution? Most of the benches I have seen are 1080p and up.
So basically it's up to the devs to optimize these things. And once the market saturation of 3 GB+ cards reaches a certain number, should we expect a bump in quality? I am under the impression that larger textures equal more detail and clarity?
RAM needed for textures is independent of RAM needed for buffers. So yes, you'll still need X GB for some arbitrary texture-resolution minimum. All you've done by lowering the display resolution a little is save some tens of MB, or maybe 100-200 MB in some games, while you have GBs in use for meshes and textures. So if you want fine detail on the ground, and on the shirt of the NPC you just bumped into, you need that VRAM (but first, you need textures of high enough resolution to give you that detail level!).
At the same detail levels, lacking enough VRAM will cause stutters that don't show up in average FPS. For game engines that cache well, amounts above that will increase performance slightly, up to a point of severely diminishing returns (basically, like system RAM, all your VRAM gets used, but to no added benefit over having a bit less). Just like system RAM, once you have "enough," more offers very little, and after a point no improvement at all. It's not having enough that can hurt, and very few games today ship with stock settings that are too hard for 2 GB. Modded games aplenty, and games shipping with nice textures, are here, but they all back down gracefully, so you're not going to be left with an unplayable game, just one that might look better with a newer card sporting more VRAM.
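The buffers-versus-textures split described above can be put in rough numbers. A back-of-envelope sketch, where the buffer count, texture count, and formats are all illustrative assumptions rather than figures from any particular game:

```python
# Rough VRAM arithmetic: resolution-dependent buffers vs. the texture set.
# All counts and formats below are illustrative assumptions.

def buffers_mb(width, height, bytes_per_pixel=4, buffer_count=3):
    """A few screen-sized buffers (e.g. swap chain + depth; count assumed)."""
    return width * height * bytes_per_pixel * buffer_count / 2**20

def textures_mb(size_px, count, bytes_per_texel=1, mip_overhead=4/3):
    """Square compressed textures (~1 byte/texel, DXT5-like) with mip chains."""
    return size_px * size_px * bytes_per_texel * mip_overhead * count / 2**20

print(f"768p buffers:  ~{buffers_mb(1366, 768):.0f} MB")
print(f"1080p buffers: ~{buffers_mb(1920, 1080):.0f} MB")
# Dropping 1080p -> 768p saves only ~12 MB of buffer space here, while a
# set of 500 'ultra' 2048px textures still wants multiple GB on its own:
print(f"texture set:   ~{textures_mb(2048, 500):.0f} MB")
```

Under these assumptions the screen-sized buffers come to roughly 12-24 MB while the texture set alone is over 2.5 GB, which is why lowering the display resolution barely dents the VRAM bill for Ultra textures.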