
GPU Memory usage/when will we get more VRAM?

Jackie60

Member
I'm playing Arma 3 with 5500 m view distance, 4500 m object distance, all settings maxed/Ultra, plus 4x transparency SSAA. With 1080 Ti SLI, image quality and general performance are great, but I've been on the same rig for a while with SLI XPs and then 1080 Tis (5960X @ 4.5 GHz, 2666 MHz DDR4), and I'm concerned that I'm already at the limit of my VRAM. Is this routine, i.e. is the game just taking all the VRAM it can, or am I genuinely at the limit? If I'm at the maximum, when will we see video cards with more RAM, e.g. 24GB or whatever? The 2080 'only' has 8GB and the 2080 Ti has 11GB, so shouldn't both those numbers be higher? I'm using Afterburner to measure usage.
 
A couple of thoughts:
1) If you extend the view distance in any game, VRAM usage just goes up. Stick to the preset settings. Just because devs give you a slider to go beyond reason doesn't mean you need to use it.
2) Just because a game keeps VRAM reserved doesn't mean it needs that much VRAM for great performance.
 
You don't need 24GB of VRAM to render a crappy-ass, 10-year-old game like Arma 3 properly. In fact, I doubt that anything more than 8GB is actually used as opposed to reserved, and 8GB is plenty for 99% of games even at 4K.
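The "8GB is plenty at 4K" claim can be sanity-checked with a back-of-the-envelope calculation: the render targets themselves only account for a few hundred MiB even at 4K. The buffer counts and formats below are illustrative assumptions, not measurements from Arma 3 or any real engine:

```python
# Back-of-the-envelope: raw render-target memory at 4K.
# Buffer counts and formats are assumed for illustration only.

W, H = 3840, 2160             # 4K resolution
BPP = 4                       # bytes per pixel (RGBA8)

pixels = W * H
color = pixels * BPP          # one color buffer
depth = pixels * 4            # 32-bit depth/stencil
gbuffer = pixels * BPP * 4    # assume 4 extra G-buffer targets
ssaa4x = pixels * BPP * 4     # 4x supersampled color target

total_mib = (color + depth + gbuffer + ssaa4x) / 2**20
print(f"~{total_mib:.0f} MiB of render targets at 4K")  # ~316 MiB
```

Even with these generous assumptions, render targets come to a few hundred MiB; the multi-GiB figures Afterburner reports are dominated by streamed textures, geometry, and driver-side reservations, much of which is cached rather than strictly required.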
 
24GB of VRAM? I think we will be waiting a while before we see consumer-level GPUs with that much memory... Very few people even have that much system memory.
 
I just bought an EVGA 1050 with 3GB of GDDR5, which is miles above what I'm running. Figuring to buy a cheap 40" 4K display is why. I'm told that's about the minimum for Netflix 4K streaming. Should have it in hand by Monday, but I haven't ordered the display yet. Don't know if I've lowballed myself here.
 
> I just bought an EVGA 1050 with 3GB of GDDR5, which is miles above what I'm running. Figuring to buy a cheap 40" 4K display is why. I'm told that's about the minimum for Netflix 4K streaming. Should have it in hand by Monday, but I haven't ordered the display yet. Don't know if I've lowballed myself here.
It is the DRM used by Netflix for 4K streaming that eats memory.
 
> I just bought an EVGA 1050 with 3GB of GDDR5, which is miles above what I'm running. Figuring to buy a cheap 40" 4K display is why. I'm told that's about the minimum for Netflix 4K streaming. Should have it in hand by Monday, but I haven't ordered the display yet. Don't know if I've lowballed myself here.
I thought that you said that you didn't play any modern games? If you had responded affirmatively, I would have pointed you to the 8GB GTX 1070 (Ti) for around $350-400.
 
> I thought that you said that you didn't play any modern games? If you had responded affirmatively, I would have pointed you to the 8GB GTX 1070 (Ti) for around $350-400.
It's totally true: I don't play any modern games. It was 4K Netflix streaming that I indicated as my barometer for acceptability above... see below.
> I just bought an EVGA 1050 with 3GB of GDDR5, which is miles above what I'm running. Figuring to buy a cheap 40" 4K display is why. I'm told that's about the minimum for Netflix 4K streaming. Should have it in hand by Monday, but I haven't ordered the display yet. Don't know if I've lowballed myself here.
 
Jackie, I play ARMA 3 at 4K and the GPU is only taxed in the jungle on Tanoa, and it still maintains 60fps easily.

I did turn off FSAA, and I run 3D resolution at 100%. The game looks great without AA at native 4K, imo.

I play at 4K because 60fps is hardly attainable anyway with the amount of AI on screen.

Maxing out ARMA 3's view distance hits the CPU way harder than the GPU. The buildings in a city are all handled by the CPU, which is a major downfall in urban combat.
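The view-distance point can be sanity-checked with simple geometry: the ground area the engine has to consider grows with the square of the view distance, so the object count (and the CPU-side work per frame) grows roughly quadratically. A hedged sketch, assuming a uniform object density (the density figure is purely illustrative):

```python
import math

def visible_area_km2(view_distance_m: float) -> float:
    """Ground area inside the view-distance circle, in km^2."""
    return math.pi * (view_distance_m / 1000) ** 2

# Assumed uniform density of renderable objects; illustrative only.
OBJECTS_PER_KM2 = 500

for vd in (2000, 3500, 5500):
    area = visible_area_km2(vd)
    print(f"{vd} m view distance: ~{area:.0f} km^2, "
          f"~{area * OBJECTS_PER_KM2:.0f} objects to consider")
```

Going from a 2000 m preset to the 5500 m slider maximum multiplies the area, and hence the rough object count, by (5500/2000)² ≈ 7.6x, which is why the slider hurts the CPU far more than a resolution bump does.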

I do think 16GB will be needed soon, and that 16GB will not feel as ample as we think once it is mainstream.

ARMA 4 isn't even in development yet, and ARMA 3 is all CPU-limited once a large mission is loaded.

At this point, in games that are largely CPU-limited at 4K, I would rather invest in faster, lower-latency RAM and a higher-clocked CPU.
 
And AMD has the gall to release a new card with a measly 16GB of HBM2...

If 4GB of HBM was "enough" is 16GB of HBM2 now "too much?"

Joking aside, it really is going to be interesting to see whether AMD sticks with 16GB of VRAM on their next cards (is this mid-tier now?). Anyway, if the Navi product aimed at replacing the Radeon VII doesn't have 16GB of VRAM, I wonder if opinion on "how much VRAM is enough" will change again.
 
Sure, why not? You can just load the whole of Quake right onto the video card 😛

Would this be the equivalent of a PCI Voodoo 3 or 5, or an ISA 56k modem?
 