Are VRAM requirements overhyped?

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
So I was looking through Nordic Hardware's performance analysis of recent games and I stumbled upon this graph:

[Graph: per-game VRAM usage at 4K (mordormemory.png)]

Do note, this is 4K resolution. Since the vast majority of people play at 1080p, I'm just not really buying this VRAM craze. Shadow of Mordor stands out, but that's because the devs were lazy and put stuff into VRAM which doesn't need to be there.

I keep seeing this inflation of ever-growing VRAM requirements, and I just think that if you're not playing at 4K, you probably don't need more than 2 GB at this stage, and I doubt we'll breach 3 GB for quite some time. I think Nvidia's approach with the 770, a 2 GB base model and a 4 GB version for SLI, is the better one. Obviously I think that 3 GB should be standard today if you're looking a few years down the road, but anything above that is pretty pointless unless you're doing SLI@4K, which most people just aren't.

Still, I guess it's something that GPU vendors can push as a "tick-box item" for each generation. I wouldn't be surprised if 6 GB of VRAM was standard in new cards just 12-18 months from now, and that would be completely unnecessary.
 

Annisman*

Golden Member
Aug 20, 2010
1,931
95
91
Yes and no. Yes, because on a per-game basis the 'recommended' VRAM requirements can often be greatly inflated. Like in Shadow of Mordor, which said 6GB of VRAM for Ultra textures.

I ran SoM with my GTX Titan (6GB) and it worked fine; the next day I threw in my GTX 980 (4GB) and had no issue running ultra textures whatsoever.

No, because you really don't want to be that guy who skimps on VRAM when upgrading his graphics card and two years later is legitimately hurting for more. Think of the 8800GTS 320, and I'm sure there are more recent examples too.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
If you exclude poorly made games, it's not inflated. 2GB is still plenty today.
 

Spanners

Senior member
Mar 16, 2014
325
1
0
If you exclude poorly made games, it's not inflated. 2GB is still plenty today.

This is just plainly wrong. 2GB is not plenty today, and you can't just call any game that has higher than 2GB VRAM requirements poorly made.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
This is just plainly wrong. 2GB is not plenty today, and you can't just call any game that has higher than 2GB VRAM requirements poorly made.

How many games need more than 2GB today, and how many don't?

And I didn't say a game is poorly made just because it uses more than 2GB. However, lazy console ports give you a lot more VRAM usage than they should.
 
Feb 19, 2009
10,457
10
76
Game artists will always tell you there's never enough, because all their hard work is done at ultra resolution and then, in most games, scaled down to reduce memory usage.
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
I think half the "inflation" problem is badly optimized games with uncompressed textures. Generally 2-3GB will run most decently written games perfectly well. It's the same problem as 50GB install sizes for a 10-15GB game with 35-40GB of uncompressed WAVs... :rolleyes:
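
To put some rough numbers on it (my assumptions: RGBA8 at 4 bytes/texel, the usual 8:1 and 4:1 ratios for BC1/BC3 block compression, and ~1/3 extra for mipmaps; back-of-the-envelope, not figures from any shipped game):

Code:
# Back-of-the-envelope texture memory, uncompressed vs. block-compressed.
# Assumptions (mine, not from any specific game): RGBA8 = 4 bytes/texel,
# BC1 compresses 8:1, BC3 compresses 4:1, a full mip chain adds ~1/3.

def texture_mb(side, bytes_per_texel):
    base = side * side * bytes_per_texel
    base *= 4 / 3  # mip levels form a geometric series summing to ~4/3
    return base / (1024 * 1024)

for side in (1024, 2048, 4096):
    raw = texture_mb(side, 4)      # uncompressed RGBA8
    bc3 = texture_mb(side, 4 / 4)  # BC3 (DXT5), 4:1
    bc1 = texture_mb(side, 4 / 8)  # BC1 (DXT1), 8:1
    print(f"{side}x{side}: raw {raw:5.1f} MB | BC3 {bc3:5.1f} MB | BC1 {bc1:5.1f} MB")

A few dozen 4096x4096 textures left uncompressed and you're suddenly "requiring" gigabytes that a properly compressed build wouldn't.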
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Some games also use spare VRAM as a cache and always try to fill it where possible. So just because it's full, it doesn't necessarily mean the game can't run properly with less.
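
As a toy illustration of that caching behaviour (not any particular engine's code, and the budget numbers are made up):

Code:
from collections import OrderedDict

class TextureCache:
    """Toy LRU cache over spare VRAM: fill it while there's room, evict the
    least-recently-used texture when there isn't. Illustrative only."""

    def __init__(self, budget_mb):
        self.budget = budget_mb        # spare VRAM we're allowed to fill (made up)
        self.used = 0
        self.resident = OrderedDict()  # texture name -> size in MB, in LRU order

    def request(self, name, size_mb):
        if name in self.resident:      # already cached: no streaming needed
            self.resident.move_to_end(name)
            return
        while self.resident and self.used + size_mb > self.budget:
            _, freed = self.resident.popitem(last=False)  # evict oldest entry
            self.used -= freed
        self.resident[name] = size_mb  # stream it in from disk
        self.used += size_mb

# Same workload on a big card and on an artificially tiny budget:
big, small = TextureCache(5000), TextureCache(200)
for tex in [("rock", 85), ("bark", 85), ("armor", 85), ("rock", 85)]:
    big.request(*tex)
    small.request(*tex)
print(f"big card holds {big.used} MB cached, small card holds {small.used} MB")

The big card ends up "using" far more VRAM for the exact same scene, which is why a full memory readout doesn't tell you the minimum the game needs.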
 

DarkKnightDude

Senior member
Mar 10, 2011
981
44
91
Yeah, BF4 has a nice VRAM scheme where it uses extra VRAM as needed, but it had no effect on your fps if you had less.
 

Dannar26

Senior member
Mar 13, 2012
754
142
106
I think half the "inflation" problem is badly optimized games with uncompressed textures. Generally 2-3GB will run most decently written games perfectly well. It's the same problem as 50GB install sizes for a 10-15GB game with 35-40GB of uncompressed WAVs... :rolleyes:

I see you too have installed Titanfall.

My GTX 260 that I recently retired had 896MB. It ran things fine for the most part. I guess when you have too little VRAM it will impact fps?
 

Morbus

Senior member
Apr 10, 2009
998
0
0
This is just plainly wrong. 2GB is not plenty today, and you can't just call any game that has higher than 2GB VRAM requirements poorly made.

For 1080p, it's perfectly fine.

Then again, a single 770 or even a 760 is perfectly fine for EVERY game out there today, if you're only playing at 1080p.
 
Feb 19, 2009
10,457
10
76
For 1080p, it's perfectly fine.

Then again, a single 770 or even a 760 is perfectly fine for EVERY game out there today, if you're only playing at 1080p.

There are a few with ultra textures that will stutter like crazy, or where ultra isn't even an option, on 2GB cards.
 

kasakka

Senior member
Mar 16, 2013
334
1
81
The "ultra" settings more often than not are just thrown in for good measure rather than actually providing superior visuals because it's easy to simply include the uncompressed highest res assets. I can't tell much of a difference between high and ultra in Shadow of Mordor except ultra makes my 2 GB GTX770 SLI setup stutter while high is smooth as silk. Same thing for Wolfenstein New Order for example.

Maybe at 4K resolutions the ultra settings make a noticeable difference (more pixels to convey the texture's surface), but otherwise not so much.
 

nenforcer

Golden Member
Aug 26, 2008
1,777
20
81
I played Rage at 1600x1200 on a CRT monitor and at 1080p on my television at 60fps with a GeForce GTX 460 768MB only 3 years ago. 512MB (Radeon HD 4200 / GeForce 8800GT) was the minimum, I believe.

Titanfall, The Evil Within and LoTR:SoM are now requiring >= 3GB for Ultra textures. The fact that the new consoles have 8GB of memory to split between game code and VRAM but can't even hit 60fps at 1080P is disheartening. We may be stuck with these poorly optimized PC ports for years to come.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Yeah, the Ultra textures in Mordor seem to be meant for 4K. I found that the game looked great with the next ones down, and even better if I sat back a bit from the monitor; the whole picture came together better.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
Titanfall, The Evil Within and LoTR:SoM are now requiring >= 3GB for Ultra textures. The fact that the new consoles have 8GB of memory to split between game code and VRAM but can't even hit 60fps at 1080P is disheartening. We may be stuck with these poorly optimized PC ports for years to come.

New generation of games, higher system requirements.

Also, both new consoles may have 8 GB, but a good portion is used for their respective OSs and background functions. And the amount of VRAM =/= FPS, and console devs tend to target 30 FPS, not 60.

I think AC: Unity, with its 2 GB VRAM requirement, certainly seems ridiculous right now, but part of it is devs not wanting to waste budget creating lower-end assets that may or may not meet their visual standards. Each new gen, the system requirements have gone up, so it's not outside expectations. If it comes down to a cache issue, that's still not a huge problem, because the game is using the extra memory to smooth out rendering and loading, which leads to a better experience.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
I played Rage at 1600x1200 on a CRT monitor and at 1080p on my television at 60fps with a GeForce GTX 460 768MB only 3 years ago. 512MB (Radeon HD 4200 / GeForce 8800GT) was the minimum, I believe.

Titanfall, The Evil Within and LoTR:SoM are now requiring >= 3GB for Ultra textures. The fact that the new consoles have 8GB of memory to split between game code and VRAM but can't even hit 60fps at 1080P is disheartening. We may be stuck with these poorly optimized PC ports for years to come.

I definitely do not like the stagnation of consoles affecting PC game dev, but more RAM is OK by me. RAM is relatively cheap, so if 4-6GB becomes the new norm, gets used as a cache for console ports, and then REALLY gets taken advantage of in PC-dedicated titles, that's cool by me.

The 'positive' I see from consoles affecting PCs is that more and more titles will be built around the XB/PS memory bandwidth, and this will help drive more bandwidth and 'smarter' use of bandwidth, like what we are now starting to see on the 285 and Maxwell (memory compression, etc.).

We should see PC GPUs settle at around 6-8GB max alongside this console gen, and bandwidth increase a LOT. That's great news for us, console ports or no. :)
 

SlickR12345

Senior member
Jan 9, 2010
542
44
91
The best optimization is to use as much of the available resources as possible. That means the game takes advantage of 6GB of VRAM if you have it, but still works on 2GB of VRAM if that's what you have.

That is game optimization. Unfortunately, game developers don't care these days.
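
Something like this, in other words (a hypothetical sketch; the tier cutoffs are invented, and real engines scale streaming budgets more gradually than a single setting):

Code:
# Hypothetical sketch of that idea: pick texture quality from whatever VRAM
# is actually present instead of hard-coding one requirement. The tier
# cutoffs below are invented for illustration.

def pick_texture_tier(vram_mb):
    if vram_mb >= 6144:
        return "ultra"   # keep the biggest assets resident
    if vram_mb >= 3072:
        return "high"
    if vram_mb >= 2048:
        return "medium"
    return "low"         # smaller mip cap, heavier compression

for card_mb in (1024, 2048, 4096, 6144):
    print(f"{card_mb} MB card -> {pick_texture_tier(card_mb)} textures")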
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
Do note, this is 4K resolution. Since the vast majority of people play at 1080p, I'm just not really buying this VRAM craze.
1080p vs. 4K is going to be, at worst, a difference of a few hundred MBs. Probably less, on average.

These days there's more than enough RAM for buffers on average, and most engines use spare VRAM well as a cache, so VRAM usage and need are only very weakly proportional to resolution, but strongly proportional to maximum model and texture sizes. Mesh sizes will vary much more than textures, but each step up in texture size takes another 4x the space of the next smaller size (and a mipmapped texture also includes some number of the smaller sizes).

How much you want or need is going to vary, but don't consider the resolution itself to be the important part.
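
To put rough numbers on it (assuming 4 bytes per pixel and about 8 full-resolution render targets, which is a simplification):

Code:
# Rough framebuffer math: why resolution alone barely moves VRAM needs.
# Assumes 4 bytes/pixel and ~8 full-res render targets (G-buffer, depth,
# post-processing, etc.), which is a deliberate simplification.

BYTES_PER_PIXEL = 4
RENDER_TARGETS = 8

def buffers_mb(w, h):
    return w * h * BYTES_PER_PIXEL * RENDER_TARGETS / (1024 * 1024)

mb_1080p = buffers_mb(1920, 1080)   # ~63 MB
mb_4k = buffers_mb(3840, 2160)      # ~253 MB
print(f"1080p -> 4K buffer cost: ~{mb_4k - mb_1080p:.0f} MB")

# Textures, by contrast, quadruple with every size step at any resolution:
for side in (1024, 2048, 4096):
    print(f"{side}x{side} RGBA8: {side * side * 4 / 2**20:.0f} MB")

Going from 1080p to 4K adds maybe ~200 MB of buffers, while one step up in texture size quadruples every texture in the game.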
 

Jovec

Senior member
Feb 24, 2008
579
2
81
Option 1) Devs and Pubs hire qualified programmers, pay them accordingly, and give them enough time to thoroughly optimize game performance.

Option 2) Have the customers throw more hardware at performance problems.

Which do you think is the more likely option?

Edit: A little hyperbole, and the truth is somewhere in the middle, but rising hardware requirements will always be with us.
 