What is the connection between Vram and performance?

topeira

Member
May 19, 2011
77
0
0
Hello, good people.
I asked this question in a less related thread, so I'm starting a new thread dedicated to trying to understand this:

How is VRAM used on the GPU? I used to think that high-res textures and extra data for NPCs and physics need RAM space, but I've read that higher display resolution and AA require more RAM and more VRAM. What is the connection between the resolution I game at, AA, and VRAM? Why do I need more VRAM if I want to play at a higher resolution? If I stay at a lower resolution (1680x1050), can I scale up other VRAM-heavy graphical features?

What graphical features benefit most from a lot of VRAM?

Why is VRAM so important? Or is it important? Is 3GB enough? Do I really need 4GB if I game at 1680x1050?

...etc etc

:) I'm just a curious cat, so I'm asking because Google didn't help me one bit...
 

el etro

Golden Member
Jul 21, 2013
1,584
14
81
One thing is the VRAM wall: the amount of VRAM needed to play a given game at a particular graphical setting.

Another thing is VRAM bandwidth, which is the total speed (clock x bus width) of the RAM modules (e.g. 7GHz on a 384-bit bus vs. 5.5GHz on a 512-bit bus). How effectively that bandwidth is used varies depending on the GPU.


The VRAM wall has to do with the size of the textures loaded. The higher the resolution, the more VRAM is needed to render the same game. AA also needs extra VRAM.
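For what it's worth, the bandwidth figure is just the effective memory clock multiplied by the bus width. A quick sketch in Python, using the example numbers from the post above:

```python
def bandwidth_gbs(effective_clock_ghz, bus_width_bits):
    """Peak theoretical memory bandwidth in GB/s:
    effective clock (billions of transfers/s) * bus width (bits) / 8 bits-per-byte."""
    return effective_clock_ghz * bus_width_bits / 8

# The two example configurations: a fast, narrow bus vs. a slower, wide one.
print(bandwidth_gbs(7.0, 384))  # 336.0 GB/s
print(bandwidth_gbs(5.5, 512))  # 352.0 GB/s
```

Note that the wider, slower-clocked bus actually comes out slightly ahead here, which is why neither clock nor width alone tells you much.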
 
Last edited:

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
If the game needs more VRAM than the card has, you will suffer a huge performance penalty, because the overflow has to be stored in system RAM, which has far higher latency and far less bandwidth than the VRAM directly attached to the GPU.
 

realibrad

Lifer
Oct 18, 2013
12,337
898
126
Data starts out on the hard drive. VRAM data gets stored in the VRAM on the graphics card, and from there it is sent to the GPU to be processed. If you have 20 gigs of VRAM but not enough bandwidth to feed the GPU, the size is useless. If the GPU cannot process the data fast enough, the extra VRAM is useless. And if you have a fast GPU but not enough VRAM to store the data and send it to the GPU, that's useless too.

The GPU processes the data. VRAM holds data until it needs to be processed. Bandwidth is the "highway" the data travels on to get to the GPU.
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
To have enjoyable and playable performance, you have to have enough shaders before you even begin considering VRAM.

Without enough shaders, performance will choke before the VRAM ceiling is ever reached.

That is the overall principle.

-----

VRAM usage depends on the game, the resolution, the in-game settings, and any additional mods.

Here is an example; keep in mind it is only valid for this specific configuration: BF4 at 5760x1080 and 1920x1080, at max Ultra settings.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
> is 3GB enough? do i really need 4 if i game on 1680X1050?

For 1680x1050, even 1GB of VRAM is going to be enough for games at normal settings. You will only run into problems if you turn on some ultra-quality setting or install a "hi-res texture pack" or other image-quality add-on.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
The GPU stores a lot of different things in VRAM: the models that represent all the objects in the world, the textures, and anything else associated with the game world, including the shader programs and whatever else is needed to render.

These days most of the VRAM goes either to textures or to intermediate images that games use for various post-processing techniques. An example would be the occlusion image used by ambient occlusion to produce its shadows, or a full-colourspace image used to determine HDR colourspace and glare. IIRC BF4 uses about six of these major images, each encoding a different intermediate stage of the rendering. MSAA is another user of VRAM, because it needs to store the additional information for its multisampling.

VRAM doesn't directly translate into additional performance, in the same way that more RAM doesn't speed up a PC by itself. You need a sufficient amount of it to support the game's needs at particular settings. If you start running low, the GPU will page some of its data out to main RAM, with a serious reduction in performance as the result.

Most games on high settings at 1080p seem to need about 1-1.5GB of VRAM today, with a few using more, but it's by and large dependent on settings, and mods can seriously increase VRAM usage when they dramatically increase texture or model quality.
 

topeira

Member
May 19, 2011
77
0
0
Thank you, BrightCandle. That was a pretty understandable explanation by my standards :p I'm a lamer about these things.
But besides GTA IV and Skyrim there aren't a lot of games that add such high-res textures anyway.

However, we can all expect that next-gen games ported from PS4/XB1 will require more and more VRAM due to the consoles' GPUs.
Since I want to upgrade my GPU soon, do you think future games will require more than 2 or 3 gigabytes of VRAM if I'm running games at 1680x1050 (not higher) with the highest-res textures and effects?
I mean, how high can PC texture resolutions get in next-gen games? 4096?!? Will it ever get THAT high? I kind of feel it won't, since textures THAT high won't look much better than 2048 anyway... and even if they do, I wouldn't pick that res. Useless, IMO.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
If you do not care about max settings, you can get away with a LOT less... hell, I bet that most PC games two years from now will not need more than 2GB of VRAM for 1050p.

But check out Watch Dogs Ultra specs:
http://www.maximumpc.com/youll_need_burly_pc_run_watch_dogs_ultra_settings_2014

"Ultra specs: Intel Core i7 4770K with a GeForce 780. The PC video was recorded on an Intel Core i7 3930K with a GeForce Titan IIRC," Morin said.
As for the minimum requirements listed on Steam, you'll need one of the aforementioned processors or higher, 6GB of RAM, GeForce GTX 460 / AMD Radeon HD 5770 or higher, and 25GB of hard drive space. For the recommended requirements, hardware gets bumped up to an Intel Core i7 3770 / AMD FX-8350 X8 or higher, 8GB of RAM, GeForce GTX 560 Ti / AMD Radeon HD 7850 or higher, and the same amount of hard drive space (25GB).
Watch Dogs will be available to play May 27, 2014.

Note that they didn't specify the resolution, but for the sake of argument let's assume somewhere between 1080p and 1600p. Imagine how much more horsepower you'd need for 3K or 4K resolutions.
 

UaVaj

Golden Member
Nov 16, 2012
1,546
0
76
Unless you are running an exotic resolution, VRAM is not something you need to lose sleep over.

The current generation of consoles (PS4 and Xbox One) aren't even at 1080p. That VRAM hype is nothing more than a marketing gimmick to sell more consoles.
 

topeira

Member
May 19, 2011
77
0
0
> If you do not care for max settings, you can get away with a LOT less... hell I bet that most PC games 2 years from now will not need more than 2GB VRAM for 1050p.
>
> But check out Watch Dogs Ultra specs:
> http://www.maximumpc.com/youll_need_burly_pc_run_watch_dogs_ultra_settings_2014
>
> "Ultra specs: Intel Core i7 4770K with a GeForce 780. The PC video was recorded on an Intel Core i7 3930K with a GeForce Titan IIRC," Morin said.
> As for the minimum requirements listed on Steam, you'll need one of the aforementioned processors or higher, 6GB of RAM, GeForce GTX 460 / AMD Radeon HD 5770 or higher, and 25GB of hard drive space. For the recommended requirements, hardware gets bumped up to an Intel Core i7 3770 / AMD FX-8350 X8 or higher, 8GB of RAM, GeForce GTX 560 Ti / AMD Radeon HD 7850 or higher, and the same amount of hard drive space (25GB).
> Watch Dogs will be available to play May 27, 2014.
>
> Note that they didn't specify the resolution, but for sake of argument let's assume somewhere between 1080p and 1600p. Imagine how much more horsepower you'd need for 3K or 4K resolutions.

1) Have you SEEN the system requirements for "Shadow of Mordor"? They are even higher than WD's. I can't imagine what AC: Unity or The Witcher 3 will require...
http://www.ign.com/articles/2014/04/08/middle-earth-shadow-of-mordor-pc-requirements-unveiled

I don't plan on ever playing at a higher res than 1920x1200. Even 1080p is more than I'm used to. I'm gaming on a PC connected to a TV I sit 3 meters away from. From that range I can barely notice a difference at resolutions above 1680x1050, so why lose FPS for a negligible difference?
Same with AA. I find lower AA adequate enough.

So assuming I won't be playing above 1080p, with mediocre AA (but everything else on high or above, especially view distance), do you reckon a 280X can hold me nicely for the next 3 years?! (It's hard to guess, I know. We can't tell how the new generation of games will look or what it will require, but I've got to ask, because any guess is better than mine.)
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
> 1) have you SEEN the sys-req of "shadows of mordor"? they are even higher than WD's. i cant imagine what AC:Unity or the witcher 3 will require...
> http://www.ign.com/articles/2014/04/08/middle-earth-shadow-of-mordor-pc-requirements-unveiled
>
> i dont plan on playing on a higher res than 1900X1200 ever. even 1080 is more than im used to. im gaming on the PC connected to a TV im sitting 3 meters away from. from that range i can barely notice a difference in resolutions higher than 1680X1050 so why lose FPS for a negligible difference?
> same with AA. i find lower AA adequate enough.
>
> so assuming i wont be playing on higher than 1080P and mediocre AA (but with everything else on high or above. especially view distance) do you recon a 280X can hold me nicely for the next 3 years?! (it's hard to guess. i know. we cant tell how the new generation of games will look and what it will require, but i gotta ask cuz any guess is better than mine)

Re-read those Mordor specs; Watch Dogs has higher specs for Ultra settings.

The Witcher 3 is too far away to say for sure, but last I heard a 780 Ti was getting 35-45 fps at max settings at 1080p: http://www.guru3d.com/news_story/the_witcher_3_gtx_780ti_could_push_35_45_fps_at_max.html

As for your 1080p, high settings, mediocre AA (what is that supposed to mean? 2x MSAA? Or FXAA-type fake AA? I'm going to take it to mean 2x MSAA): a 280X 3GB should have enough VRAM for at least a couple more years. But whether it will have enough GPU power to keep you comfortably above 30 fps (let's say 45 fps average in the latest games) is the real question.

If you get a 280X or equivalent today, 3 years from now you will probably have to play with lower-than-highest settings and 2x MSAA in the latest games because of your GPU, not because you don't have enough VRAM. It's like a 5870 2GB today: you have to turn down settings in some games because the GPU isn't strong enough, not because of VRAM.
 
Last edited:

Rakehellion

Lifer
Jan 15, 2013
12,181
35
91
> how is the Vram used on the GPU? i used to think that high res textures and more info about NPCs and physics need ram space but i read that higher display resolution and AA requires more ram and more Vram. what is the connection between the resolution i game at and AA and VRAM? why do i need more VRAM if i wanna play on higher resolution? if i stay on lower resolution (1680X1050) can i scale up other VRAM heavy graphical features?

Texture resolution and anisotropic filtering have the biggest effect on VRAM usage. High detail models, high display resolution, shadows, and post-processing also have an effect. NPCs and physics, not so much.
 

Rakehellion

Lifer
Jan 15, 2013
12,181
35
91
> thanks you, BrightCandle. that was a pretty understandable explanation in my standards :p im a lamer about those things.
> but besides GTAIV and Skyrim there arent a lot of games that add such high res textures, anyway.
>
> however we can all expect that the next gen games that are ported from PS4\XB1 will require more and more VRAM due to the consoles GPUs.
> since i wanna upgrade GPU soon do u think that more than 2 or 3 vram gigabytes will be required in the future games if im running games on 1680X1050 (not higher) and on highest res textures and effects?
> i mean, how high can the PC texture resolutions get in the next gen games? 4096?!? will it ever get THAT high? i kinda feel it won't since textures THAT high wont look much better than 2048 anyways.... and even if they do, i wouldnt pick this rez. useless, IMO.

2GB is plenty.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
I game at 1680x1050 with pretty much everything turned on, with 2GB. I rarely see a situation where it reaches 2GB. Maybe in Rome 2 with huge armies on a city map, but it is still playable.

Whilst monitoring my card, I have not noticed anisotropic filtering make any difference to the amount of memory used; MSAA does, though. If memory becomes an issue and you still want AA, you can use something like SweetFX to inject SMAA into the game instead of MSAA. SMAA uses shaders instead of video memory and doesn't blur the image like FXAA or TXAA.
 

topeira

Member
May 19, 2011
77
0
0
> I game at 1680x1050 with pretty much everything turned on with 2GB. I rarely see a situation where it reaches 2GB. Maybe in Rome 2 with huge armies on a city map, but it is still playable.
>
> Whilst monitoring my card, I have not noticed Anisotropic filtering make any difference to the amount of memory used, MSAA does though. If memory becomes an issue and you still want AA you can use something like SweetFX to inject SMAA into the game instead of MSAA. SMAA uses Shaders instead of video memory and doesn't blur the image like FXAA or TXAA.

How can you check how much VRAM is being used, on AMD cards and Nvidia?
I didn't know I could do that... (I'm using an AMD card, with CCC.)

And the hard question is whether we'll be able to game at 1050p with everything on high or highest in next-gen games. I consider WD to be the first next-gen game (I'd consider Infamous: SS or Ryse: SoR too, but those aren't on PC).

Does anyone think the next-gen games 3 years from now will have different system requirements than the games of THIS year? If those games are optimized to work best on cards like the 7950, why would they need better PC specs than today? How high-res can textures get?! How much better can the new shaders, HBAO, or AA tech get if the same tech needs to work on GPUs like the 7950?
I would assume there will be PC-specific advancements, but nothing that noticeable. Not as noticeable as the jump from PS3 games to PS4 games...
What do you guys think?!
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
You can either run GPU-Z in the background, or set up something like RivaTuner Statistics Server to monitor various stats in-game with an overlay.

It's hard to predict exactly how games will be developed, but I expect that 2GB will be fine for 1050p for quite some time.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
> Texture resolution and anisotropic filtering have the biggest effect on VRAM usage. High detail models, high display resolution, shadows, and post-processing also have an effect. NPCs and physics, not so much.

Incorrect. These days, anti-aliasing has the most dramatic effect on VRAM usage; you can eat up 1-2GB of VRAM in surround simply by stepping through the AA levels from FXAA up to SSAA. FXAA uses zero VRAM; SSAA in its various forms, such as SGSSAA and resolution scale, eats an insane amount of it. It's pretty sad when AA uses more VRAM than game assets, but that's the current situation due to the anemic next-gen consoles.

More VRAM for 1680x1050? Really? Do I see that mentioned in this thread? 2GB is going to be fine, I assure you. And in the .0000000001% chance that it isn't a year from now, lower the AA.

VRAM also does not affect performance. From what I've seen, if you don't have enough, the game simply fails. For instance, Far Cry 3 at 1600p x3 fails with 8x MSAA; 2x MSAA works. You can look at numerous benchmarks of 2GB vs. 4GB cards and there is no performance difference. More VRAM is not about performance; it is about more anti-aliasing at higher surround resolutions. But if you want more, get more: the GTX 780 and 780 Ti both have 6GB versions at a $50 premium over the 3GB base models, and I think those hit the market in a month. Or go the AMD route if you're an AMD fan. But you don't need to lose sleep over VRAM. Anyone suggesting that more VRAM increases performance is wrong, period, and while I could link 20 million benchmarks to corroborate that fact, you could simply Google 2GB vs. 4GB benchmarks (GTX 680 or 770, for instance) instead of starting an argument here on a forum. In fact I'm quite sure HardOCP did numerous surround 2GB vs. 4GB benchmarks. Performance difference? Zero. 4GB lets you run more mods and more AA at higher surround resolutions or 4K. That's it. No performance difference. More VRAM = more modding, more anti-aliasing, more SSAA, especially at surround or 4K resolutions.
 
Last edited:

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
VRAM is like a workbench: the bigger the workbench, the more stuff you can keep in memory to work on without having to put it completely away or put it somewhere else. The GPU itself is like a series of tools working in concert, so the more tools, the more work you can do.
 

bonehead123

Senior member
Nov 6, 2013
559
19
81
Remember the ominous statement attributed to Bill Gates:

"Nobody will ever need more than 640K of RAM."

hahahahahahahahahahahaha :) :D

Games are essentially just apps, and as we have seen over the past 20+ years, every new generation of apps ALWAYS requires more and more system resources...

So BE PREPARED: the UHD 4K and 8K video revolution is here, and sooner or later (probably sooner) it WILL make its way into the PC gaming arena...
 

Automaticman

Member
Sep 3, 2009
176
0
71
A couple of really good answers here. One thing I haven't seen specifically mentioned about why higher resolutions require more VRAM is that VRAM holds the frame buffer in addition to texture assets. That is, when the GPU has finished drawing a frame that is ready to be sent to the monitor, the uncompressed image is written to the frame buffer, where it is held. Since these images are not compressed, they can be large, and they scale dramatically as resolution increases.

In addition, you may have multiple frame buffers if you are using Vsync with triple buffering, which takes even more VRAM.

The reason some types of AA dramatically increase VRAM use comes down to how they work. FXAA and AMD's MLAA are essentially just fancy shaders that blur the image a bit before it is sent to the buffer. This does not take up any extra VRAM.

SSAA, on the other hand, draws the image at a multiple of the actual output resolution. In other words, if you have a 1920x1080 monitor, your GPU might draw the image at 3840x2160 or even higher, and then downsample it at the very end before it is sent to the monitor. You can see why this takes massively more VRAM (and just as big a hit to your game's performance, which is why it is not often used). Have fun with the "Resolution Scale" slider in BF4 if you want to play around with it. Now imagine applying SSAA to an Eyefinity setup that already has a ridiculously high resolution.

MSAA essentially does the same thing, but only increases the resolution around the edges of objects, making it much easier to run.
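The SSAA cost is easy to see with a little arithmetic. A rough sketch, using the standard 2x-per-axis case for 4x supersampling (a generic illustration, not any particular game's implementation):

```python
def pixels(width, height):
    """Total pixel count of a render target."""
    return width * height

# Native 1080p vs. 4x SSAA, which renders at twice the width and height.
native = pixels(1920, 1080)           # 2,073,600 pixels
ssaa_4x = pixels(1920 * 2, 1080 * 2)  # 8,294,400 pixels

# Every full-screen buffer (and the shading work) grows by the same factor.
print(ssaa_4x / native)  # 4.0
```

MSAA avoids most of this because only the edge samples are stored at the higher rate, not every shaded pixel.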
 

Mand

Senior member
Jan 13, 2014
664
0
0
> A couple really good answers here. One thing I haven't seen specifically mentioned about why higher resolutions require more VRAM is that the VRAM holds the Frame Buffer in addition to texture assets. That is, when the GPU has finished drawing a frame and is ready to be sent to the monitor, that uncompressed image gets sent to the frame buffer, where it is held. Since these images are not compressed, they can be very large, and can scale dramatically as resolution increases.

Not as much as you might think. Here's a table of the size of a single frame, in bytes, for various resolutions and color bit depths (keep in mind most TN panels are actually 6-bit color with dithering, which is why it's on the list):

Resolution || 6-bit || 8-bit || 10-bit
1920x1080 || 4,665,600 || 6,220,800 || 7,776,000
2560x1440 || 8,294,400 || 11,059,200 || 13,824,000
3840x2160 || 18,662,400 || 24,883,200 || 31,104,000

While it does scale, even at 4K with 10-bit color a single frame is a mere 31 megabytes. Even double- or triple-buffered, that's still pretty insignificant.

The real issue is that the scaling, from roughly 4 MB per frame to 31 MB, applies to the entire rendering process, including all the intermediate calculations. More pixels means more memory for intermediate calculated values, and an 8x increase in the bits a frame takes up does become significant there. But having to buffer uncompressed frames before sending them to the display is not why you need a lot of VRAM at higher resolutions.
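The table above works out to width x height x 3 color channels x (bits per channel / 8) bytes per frame. A small sketch that reproduces a couple of the entries:

```python
def frame_bytes(width, height, bits_per_channel):
    """Size in bytes of one uncompressed RGB frame:
    pixels * 3 channels * bits-per-channel, converted to bytes."""
    return width * height * 3 * bits_per_channel // 8

# Matches the 1080p 8-bit and 4K 10-bit entries in the table.
print(frame_bytes(1920, 1080, 8))    # 6220800
print(frame_bytes(3840, 2160, 10))   # 31104000, i.e. ~31 MB
```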

Edit: Bah. Why have a table function in the post window if it doesn't actually do that when it posts?
 
Last edited:

CropDuster

Senior member
Jan 2, 2014
371
54
91
> The reason that some types of AA dramatically increase VRAM use is by how they work. FXAA and AMD's MLAA are esentially just fancy shaders that blur the image a bit before it is sent to the buffer. This does not take up any extra VRAM.
>
> SSAA, on the other hand, draws the image multiple times larger than the actual output resolution. In other words, if you have a 1920x1080 monitor, your GPU might draw the image at 3840x2160 or even higher, and then downsample the image at the very end before it is sent to the monitor. You can see why this would take massively more VRAM (and just as big a hit on your game's performance - hence why it is not often used). Have fun with the "Resolution Scale" slider in BF4 if you want to play around with it. Now imagine applying SSAA to an Eyefinity setup that already has a ridiculously high resolution.
>
> MSAA essentially does the same thing, but only increases the resolution around the edges of objects, making it much easier to run.

A question regarding BF4, if you know:

Is there a percentage on the resolution scale that would correlate with 2x, 4x, etc. MSAA? Also, turning on MSAA seems to add input lag for me. I don't know if that's normal, or if I'm just noticing the reduced FPS, since I'm usually at 100+ with Mantle.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Resolution scale is not MSAA; it is OGSSAA, which is by far the most VRAM-intensive and resource-intensive form of SSAA. I would personally suggest not using it unless you have tons of GPU headroom to spare, or are running a low 1080p-type resolution on a cutting-edge GPU.

Percentage-wise, the description is literal: multiply your current resolution by the percentage you apply, and that is the image being downsampled. This is why resolution scale and SSAA take up insane amounts of VRAM: you can take a 1080p game and render it at 4K. Obviously at 4K you're using more VRAM, and obviously at 4K, downsampled, your GPU is going to bog down.

As I said earlier, due to the rather pathetic lack of power in the PS4 / XB1, we're at a point where games take more VRAM for anti-aliasing than for game assets. Is that bass-ackwards? Yes. Yes it is. Had Sony and MS used a two-chip solution for CPU and graphics (even if retaining AMD for graphics, where AMD does well; the CPU cores are _not_ good), we would not be in this situation. But presumably they wanted to shave costs, and console ports will suffer as a result. Since these next-gen consoles are struggling to even hit 1080p, it is doubtful that the situation will ever change, but we'll see. Optimized programming will improve things, but I'm not optimistic.
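The slider math described above can be sketched like this (a generic illustration, not BF4's actual code; the point is that pixel count, and thus per-buffer VRAM, grows with the square of the scale factor):

```python
def scaled_resolution(width, height, scale_percent):
    """Internal render resolution for a given resolution-scale percentage."""
    s = scale_percent / 100
    return int(width * s), int(height * s)

# 1080p at a 200% resolution scale renders internally at 4K...
print(scaled_resolution(1920, 1080, 200))  # (3840, 2160)

# ...which is 2.0 * 2.0 = 4x the pixels, so every full-screen
# buffer needs roughly four times the memory.
```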
 
Last edited:

Automaticman

Member
Sep 3, 2009
176
0
71
> Question regarding BF4 if you know.
>
> Is there a % on the resolution scale that would correlate with 2x, 4x, etc MSAA? Also, turning on MSAA seems to add input lag for me. I don't know if that's normal or maybe I'm just noticing the reduced FPS as I'm usually 100+ with Mantle.

> Resolution scale is not MSAA. It is OGSSAA. OGSSAA is by far the most VRAM intensive and resource intensive form of SSAA. I would personally suggest not using it unless you have tons of GPU to spare, or are using a low 1080p type resolution with a cutting edge GPU.
>
> Percentage wise, the description is resolution scale. Multiply your current resolution by the % you apply and that is your downsampled image. This is why resolution scale and SSAA take up insane amounts of VRAM - you can take a 1080p game and downsample it to 4k. Obviously at 4k you're using more VRAM. Obviously at 4k, downsampled, your GPU is going to bog down.


I think what CropDuster is asking is: if resolution scale at 200% = 4x SSAA (I think), how does that compare to MSAA at 4x, 8x, 16x, etc.? I really don't think there's a good answer for that, though. SSAA supersamples the entire image and then downsamples it at the very end, while MSAA supersamples only the bits and pieces of the image that are likely to be aliased.

That is why MSAA historically has issues with transparent textures like fences and leaves: it only applies AA to the outside edges. Adaptive MSAA tried to fix that by applying SSAA to sections with transparent textures like those, but I don't think it has seen much use since DX10/11 came along.

BF4 might not be the best game to test MSAA in, since it was really designed and optimized with post-process FXAA in mind. I think the "lag" you are experiencing is just the performance hit that even MSAA takes on your hardware. SSAA hits with a sledgehammer, but it makes for great screenshots.


Thanks to Blackened and Mand for the great info and clarification on frame buffer sizes.