The main reasons for inflated VRAM requirements


Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
What makes you so certain it's VRAM that's causing the stutter?

I didn't say it was VRAM specifically. I said it's ineffective management of VRAM. The game is terrible at managing VRAM, which is why you still get stuttering on cards with 6GB of VRAM.

The onus for this is totally on Ubisoft, not on PC gamers to upgrade their video cards to higher-capacity models.

It remains to be seen whether AC Unity will follow suit. AC Unity uses an updated version of the AnvilNext engine, so it may escape the issues plaguing Watch Dogs.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
You're acting like there's no overhead associated with this management. They not only have to track which textures are visible, they also have to predict which won't be visible so they know which one to drop, all of which takes CPU time. Then they have to read them from a system memory buffer which has a latency penalty...if there's enough memory....because it's not like system memory is an inexhaustible resource either. Then they have to blend the texturing smoothly so you barely notice it, wasting precious GPU resources. Worst case they have to read from disk and then you might be staring at a blurry wall, a load screen, an unskippable cutscene, a long windy hallway or any other number of techniques to stall you so you don't overrun the buffer. They have to track and implement all sorts of bullshit to suit a scenario that no longer exists on next gen consoles, and soon will no longer exist on PC once VRAM catches up.
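Roughly, the kind of bookkeeping I'm talking about looks like this. Toy sketch only, with made-up names; no actual engine does it exactly this way:

Code:
// Toy texture residency manager -- illustrative only, not from any real engine.
#include <cstdint>
#include <iostream>
#include <list>
#include <string>
#include <unordered_map>
#include <utility>

struct TextureStreamer {
    uint64_t budgetBytes;                       // how much VRAM we let textures use
    uint64_t residentBytes = 0;
    std::list<std::string> lru;                 // most-recently-used at the front
    std::unordered_map<std::string, std::pair<uint64_t, std::list<std::string>::iterator>> resident;

    explicit TextureStreamer(uint64_t budget) : budgetBytes(budget) {}

    // Called every frame for each texture the visibility pass says we need.
    void touch(const std::string& name, uint64_t sizeBytes) {
        auto it = resident.find(name);
        if (it != resident.end()) {             // already in VRAM: just bump it in the LRU
            lru.splice(lru.begin(), lru, it->second.second);
            return;
        }
        // Evict least-recently-used textures until the new one fits (the CPU cost lives here).
        while (residentBytes + sizeBytes > budgetBytes && !lru.empty()) {
            const std::string victim = lru.back();
            residentBytes -= resident[victim].first;
            resident.erase(victim);
            lru.pop_back();                     // real engine: also free/downgrade the GPU copy
            std::cout << "evict " << victim << "\n";
        }
        // Real engine: kick off an async upload from system RAM (or worse, disk),
        // and render a low-res mip until it lands -- that's the "blurry wall" case.
        lru.push_front(name);
        resident[name] = { sizeBytes, lru.begin() };
        residentBytes += sizeBytes;
        std::cout << "stream in " << name << "\n";
    }
};

int main() {
    TextureStreamer s(96ull << 20);             // pretend we only have 96 MB for textures
    s.touch("wall_diffuse", 48ull << 20);
    s.touch("floor_diffuse", 32ull << 20);
    s.touch("statue_diffuse", 32ull << 20);     // forces an eviction
}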

This isn't smart vs dumb, they're making real performance and even gameplay sacrifices by shifting the burden to other components. You're just mad because you have a card that doesn't have enough VRAM to keep up with it, and you're telling yourself this fancy tale about how lazy and incompetent they are to make yourself feel better about that. Be careful what you wish for, because one day you'll get a card with enough VRAM, and then you'll be glad they left that stuff behind.
 
Last edited:

BD2003

Lifer
Oct 9, 1999
16,815
1
76
Hey thanks for that, it's a very useful program :awe:

Just played around with it a bit and loaded up Crysis 3 and Watch Dogs. The GPU shared bytes in both games were very low, around 130 MB.

This just goes to show that fears of games using main memory for a frame buffer are completely unfounded. The game will only use VRAM as a frame buffer; system memory's purpose is to buffer the VRAM itself.

If done properly, this causes no performance hit.


Add the GPU dedicated bytes column to see the gigabytes of textures and other stuff Crysis 3 is storing in VRAM.
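Side note: on newer Windows (DXGI 1.4, so Windows 10) a program can query those same two pools for itself; "local" is the card's dedicated VRAM and "non-local" is the shared system memory pool Process Explorer is showing. Rough sketch, and the API is newer than the games we're talking about:

Code:
// Minimal sketch: query dedicated (local) vs shared (non-local) GPU memory use
// for the calling process. Needs Windows 10 / DXGI 1.4. Error handling trimmed.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter1;
    factory->EnumAdapters1(0, &adapter1);       // first (primary) adapter

    ComPtr<IDXGIAdapter3> adapter;              // IDXGIAdapter3 adds the memory queries
    adapter1.As(&adapter);

    DXGI_QUERY_VIDEO_MEMORY_INFO local{}, shared{};
    adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &local);      // dedicated VRAM
    adapter->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_NON_LOCAL, &shared); // shared sysmem

    printf("dedicated: %llu MB used of %llu MB budget\n",
           local.CurrentUsage >> 20, local.Budget >> 20);
    printf("shared:    %llu MB used of %llu MB budget\n",
           shared.CurrentUsage >> 20, shared.Budget >> 20);
}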
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
You're acting like there's no overhead associated with this management. They not only have to track which textures are visible, they also have to predict which won't be visible so they know which one to drop, all of which takes CPU time.

Last time I checked, we're PC gamers, not console gamers, and modern CPUs have more than enough processing power to handle that overhead, unlike the anemic CPUs in the consoles. I know this because there are lots of very intensive games that run much better on PC than they do on console. BF4 is one such example.

And that overhead has been decreasing over time. DX11.x has a significantly lower CPU overhead cost than DX9, and DX12, due out next year, will have a lower CPU overhead cost than DX11.x. That's just progress.

Also, when developers start taking advantage of tiled resources, VRAM requirements should be reduced even more.
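For anyone who hasn't looked at how tiled resources work: in D3D11.2 a huge texture only needs physical 64KB tiles behind the regions you actually sample, so the VRAM cost is the tile pool, not the full texture. Bare-bones sketch of the idea (not production code; error handling and tier checks omitted):

Code:
// D3D11.2 tiled-resource sketch: a big "virtual" texture backed by a small tile
// pool, with only one 64KB tile actually mapped. Illustrative only.
#include <d3d11_2.h>
#include <wrl/client.h>
#pragma comment(lib, "d3d11.lib")

using Microsoft::WRL::ComPtr;

void MapOneTile(ID3D11Device2* device, ID3D11DeviceContext2* ctx)
{
    // 1) A large tiled texture: no physical memory is committed at creation.
    D3D11_TEXTURE2D_DESC texDesc = {};
    texDesc.Width = 16384;
    texDesc.Height = 16384;
    texDesc.MipLevels = 1;
    texDesc.ArraySize = 1;
    texDesc.Format = DXGI_FORMAT_BC1_UNORM;
    texDesc.SampleDesc.Count = 1;
    texDesc.Usage = D3D11_USAGE_DEFAULT;
    texDesc.BindFlags = D3D11_BIND_SHADER_RESOURCE;
    texDesc.MiscFlags = D3D11_RESOURCE_MISC_TILED;
    ComPtr<ID3D11Texture2D> tiledTex;
    device->CreateTexture2D(&texDesc, nullptr, &tiledTex);

    // 2) A small tile pool: this is the only part that costs real VRAM (here 16 MB).
    D3D11_BUFFER_DESC poolDesc = {};
    poolDesc.ByteWidth = 256 * 64 * 1024;       // 256 tiles of 64KB
    poolDesc.Usage = D3D11_USAGE_DEFAULT;
    poolDesc.MiscFlags = D3D11_RESOURCE_MISC_TILE_POOL;
    ComPtr<ID3D11Buffer> tilePool;
    device->CreateBuffer(&poolDesc, nullptr, &tilePool);

    // 3) Back just one tile of the texture with one tile from the pool.
    D3D11_TILED_RESOURCE_COORDINATE coord = {}; // tile (0,0,0), subresource 0
    D3D11_TILE_REGION_SIZE region = {};
    region.NumTiles = 1;
    UINT rangeFlags = 0;                        // 0 = map to the given pool offset
    UINT poolOffset = 0;                        // first 64KB tile in the pool
    UINT rangeTileCount = 1;
    ctx->UpdateTileMappings(tiledTex.Get(), 1, &coord, &region,
                            tilePool.Get(), 1, &rangeFlags,
                            &poolOffset, &rangeTileCount, 0);
    // The streaming system then copies texture data into that tile and remaps
    // tiles as the camera moves, instead of keeping the whole texture resident.
}

int main() {
    ComPtr<ID3D11Device> dev;
    ComPtr<ID3D11DeviceContext> ctx;
    D3D_FEATURE_LEVEL fl;
    D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                      nullptr, 0, D3D11_SDK_VERSION, &dev, &fl, &ctx);
    ComPtr<ID3D11Device2> dev2;
    ComPtr<ID3D11DeviceContext2> ctx2;
    dev.As(&dev2);
    ctx.As(&ctx2);
    MapOneTile(dev2.Get(), ctx2.Get());
}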

Then they have to read them from a system memory buffer which has a latency penalty...if there's enough memory....because it's not like system memory is an inexhaustible resource either. Then they have to blend the texturing smoothly so you barely notice it, wasting precious GPU resources. Worst case they have to read from disk and then you might be staring at a blurry wall, a load screen, an unskippable cutscene, a long windy hallway or any other number of techniques to stall you so you don't overrun the buffer. They have to track and implement all sorts of bullshit to suit a scenario that no longer exists on next gen consoles, and soon will no longer exist on PC once VRAM catches up.
It sounds to me like you're making excuses for shoddy game development practices, and you seem to think that high-capacity VRAM is going to magically solve all of these issues.

But it won't. High-end hardware can make up for a lot of things, but it can't make up for bad programming.

This isn't smart vs dumb, they're making real performance and even gameplay sacrifices by shifting the burden to other components. You're just mad because you have a card that doesn't have enough VRAM to keep up with it, and you're telling yourself this fancy tale about how lazy and incompetent they are to make yourself feel better about that. Be careful what you wish for, because one day you'll get a card with enough VRAM, and then you'll be glad they left that stuff behind.
I have two Gigabyte G1 GTX 970s that are equipped with 4GB each. That's more than enough to play any game out right now, and for the foreseeable future. If it isn't, I'll just upgrade.

But the evidence is on my side. Shadow of Mordor, which "supposedly" requires 6GB of VRAM for ultra-level textures, plays just fine on 4GB cards by all appearances @ 1440p, and on 3GB cards @ 1080p.

Watch Dogs, on the other hand, requires 3GB for ultra-level textures, yet still stutters on 3GB, 4GB and even 6GB cards... I wonder why that is? Maybe Watch Dogs requires an 8GB card?

Makes you go hmmmm :hmm:
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Take a look at the last 5 years of console ports before this new generation was released. Did we have lots and lots of lazy ports that had glaring problems on the PC? Were they clearly optimised for the console and hence didn't use the graphical performance of the PC GPU well? The answer to both is a resounding yes. So the problem is quite simply that some games are focussed on the console architecture; they are using it as well as they can (high VRAM availability, moderate GPU compute, poor IO and poor CPU performance). But the PC has different traits, and this console optimisation is what breaks them. There have always been these lazy ports, lots and lots of them. Right now the hardware is close enough that optimisation for the console matters; in a few years it won't, and we will be complaining about the graphics again.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
Last time I checked, we're PC gamers, not console gamers, and modern CPUs have more than enough processing power to handle that overhead, unlike the anemic CPUs in the consoles. I know this because there are lots of very intensive games that run much better on PC than they do on console. BF4 is one such example.

And that overhead has been decreasing over time. DX11.x has a significantly lower CPU overhead cost than DX9, and DX12, due out next year, will have a lower CPU overhead cost than DX11.x. That's just progress.

Also, when developers start taking advantage of tiled resources, VRAM requirements should be reduced even more.

It sounds to me like you're making excuses for shoddy game development practices, and you seem to think that high-capacity VRAM is going to magically solve all of these issues.

But it won't. High-end hardware can make up for a lot of things, but it can't make up for bad programming.

I have two Gigabyte G1 GTX 970s that are equipped with 4GB each. That's more than enough to play any game out right now, and for the foreseeable future. If it isn't, I'll just upgrade.

But the evidence is on my side. Shadow of Mordor, which "supposedly" requires 6GB of VRAM for ultra-level textures, plays just fine on 4GB cards by all appearances @ 1440p, and on 3GB cards @ 1080p.

Watch Dogs, on the other hand, requires 3GB for ultra-level textures, yet still stutters on 3GB, 4GB and even 6GB cards... I wonder why that is? Maybe Watch Dogs requires an 8GB card?

Makes you go hmmmm :hmm:

I'm not going to make excuses for Watch Dogs, that one is indeed coded like garbage. But Mordor seems fine. There are lots of things that affect VRAM, and if it plays fine on lower cards, what are you complaining about? They're probably just being conservative in their recommendation for 6GB; there might be only a few scenes in the game that hit that level when combined with ultra-high resolution/AA.

Any gains due to APIs are irrelevant, because devs will just use that to push limits. If tiled resources let them get away with 50% less texture memory for the same detail, they'll just double the detail. If DX12 reduces CPU overhead, they'll use those freed up resources somewhere else.

And while you're absolutely right that most PCs can handle that overhead, I'm just being realistic. Consoles drive development of big games, and PC keeps pace with brute force. That's how it's always been, and how it always will be. You wanted to know why there's an explosion in VRAM reqs, and that's the answer. If you've got 4GB cards, you're fine for a while, so stop complaining. Everyone else can just lower the texture detail and live with it.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
What's there to expand on that hasn't already been said? If it were simply a matter of VRAM capacity, then the Titans would play Watch Dogs without stuttering.

But evidently this isn't the case. Brute VRAM capacity can't make up for shoddy programming and bad design.

Exactly. Seems like everyone runs their mouth about VRAM, but no matter how many times this is pointed out, nobody backs off the VRAM explanation. Sorry, just sick and tired of it.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Any gains due to APIs are irrelevant, because devs will just use that to push limits. If tiled resources let them get away with 50% less texture memory for the same detail, they'll just double the detail. If DX12 reduces CPU overhead, they'll use those freed up resources somewhere else.

Yep, and as they should.

And while you're absolutely right that most PCs can handle that overhead, I'm just being realistic. Consoles drive development of big games, and PC keeps pace with brute force. That's how it's always been, and how it always will be. You wanted to know why there's an explosion in VRAM reqs, and that's the answer. If you've got 4GB cards, you're fine for a while, so stop complaining. Everyone else can just lower the texture detail and live with it.

I wasn't complaining. I just found all the flipping out over VRAM requirements to be silly. There are people who cancelled their GTX 980/970 orders when they found out Shadow of Mordor "required" 6GB of VRAM for ultra-level textures. They think this is the new norm, but the truth is much more complicated than that, as we've been discussing.

BTW, here's a really informative presentation by an NVIDIA rep about hardware-based tiled resources and how they will impact the future of games
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Take a look at the last 5 years of console ports before this new generation was released. Did we have lots and lots of lazy ports that had glaring problems on the PC? Were they clearly optimised for the console and hence didn't use the graphical performance of the PC GPU well? The answer to both is a resounding yes. So the problem is quite simply that some games are focussed on the console architecture; they are using it as well as they can (high VRAM availability, moderate GPU compute, poor IO and poor CPU performance). But the PC has different traits, and this console optimisation is what breaks them. There have always been these lazy ports, lots and lots of them. Right now the hardware is close enough that optimisation for the console matters; in a few years it won't, and we will be complaining about the graphics again.

Last-generation console ports admittedly weren't very good, but current-generation ports should be much better because the PS4 and Xbox One both use the x86-64 architecture and DX11.2-class hardware.

Once hardware-based tiled resources come into play, the VRAM issues should be mitigated considerably.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
so consoles fucking up pc gaming isn't just fanboi ranting, it is actually the truth.

fuck that.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
So, the next game is coming - The Evil Within. Here are the recommended specs:
http://www.tomshardware.com/news/bethesda-evil-within-specs-gaming,27815.html#xtor=RSS-998
the recommended system requirements [for The Evil Within] include a 64-bit version of Windows 7 or Windows 8, an Intel Core i7 processor with four cores or more, and 4 GB of RAM. The game also needs an Nvidia GeForce GTX 670 or equivalent GPU with 4 GB of VRAM,

What is worth noting is that all these games are multi-platform, for both console generations. Here is a shocker: those games using 4GB of VRAM on PC also run on a PlayStation 3 with 256MB of VRAM.

It's not like they dump everything into VRAM because they can. It is a design decision to make use of what's available. And it's not news that PCs are way behind the new-gen consoles now.

What is the average VRAM capacity for a PC?

Just as before, consoles will push the PC industry forward and will help make the shift. And just as before, people will quickly adapt and wait for the next-next-gen console to change PC gaming again.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
So, the next game is coming - The Evil Within. Here are the recommended specs:
http://www.tomshardware.com/news/bethesda-evil-within-specs-gaming,27815.html#xtor=RSS-998


What is worth noting is that all these games are multi-platform, for both console generations. Here is a shocker: those games using 4GB of VRAM on PC also run on a PlayStation 3 with 256MB of VRAM.

It's not like they dump everything into VRAM because they can. It is a design decision to make use of what's available. And it's not news that PCs are way behind the new-gen consoles now.

What is the average VRAM capacity for a PC?

The Evil Within is nothing new:
http://forums.anandtech.com/showthread.php?t=2401373

Same crap. Abuse of storage and VRAM.

Just as before, consoles will push the PC industry forward and will help make the shift. And just as before, people will quickly adapt and wait for the next-next-gen console to change PC gaming again.

You don't move the PC industry forward just by abusing VRAM. And especially not when the bar is a middle-to-low-end card with just 4GB tacked on. Also, the consoles are so utterly far behind on other metrics that the combined sum drags things downward rather drastically.

PC gaming revenue is also higher than console.

Evil Within got its own thread:
http://forums.anandtech.com/showthread.php?t=2401373

Same crap and abuse of VRAM and storage. More 50GB installs for uncompressed textures and audio.
 
Last edited:

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
You don't move the PC industry forward just by abusing VRAM. And especially not when the bar is a middle-to-low-end card with just 4GB tacked on. Also, the consoles are so utterly far behind on other metrics that the combined sum drags things downward rather drastically.

PC gaming revenue is also higher than console.

Are they storing cat pictures in the VRAM? Do you have any proof of this abuse, or are you just talking out of your backside?

I'd love to see you repeat that statement when consoles start to use compute for games, destroying every castrated (nvPR: "optimized for efficiency") Kepler card.

You can have all the processing power in the world, but without enough VRAM you will lag behind -> the Kaveri symptom.
 

jpiniero

Lifer
Oct 1, 2010
14,591
5,214
136
You don't move the PC industry forward just by abusing VRAM. And especially not when the bar is a middle-to-low-end card with just 4GB tacked on. Also, the consoles are so utterly far behind on other metrics that the combined sum drags things downward rather drastically.

It's just a side effect of the developers designing their games for the consoles, really. The real solution is to replace PCI Express with something that lets a dGPU directly access system memory (e.g. NVLink). Good luck getting Intel to agree to that, though.

PC gaming revenue is also higher than console.

If you include WoW and the F2P games, I suppose. But for the AAA titles that have been discussed in this thread, PC sales are only a small fraction. Mobile at this point is probably bigger than both combined, heh.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
You don't move the PC industry forward just by abusing VRAM. And especially not when the bar is a middle-to-low-end card with just 4GB tacked on. Also, the consoles are so utterly far behind on other metrics that the combined sum drags things downward rather drastically.

PC gaming revenue is also higher than console.

Evil Within got its own thread:
http://forums.anandtech.com/showthread.php?t=2401373

Same crap and abuse of VRAM and storage. More 50GB installs for uncompressed textures and audio.


No amount of GPU power can make up for a lack of VRAM. The other metrics are irrelevant.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
It's just a side effect of the developers designing their games for the consoles, really. The real solution is to replace PCI Express with something that lets a dGPU directly access system memory (e.g. NVLink). Good luck getting Intel to agree to that, though.

I don't think PCI Express is the limiting factor here, as it provides more than enough bandwidth between the GPU and system memory for mere gaming, even if that data has to go through the CPU first. Remember, the CPU has a PCIe controller embedded in it now.

NVLink is primarily designed for massive supercomputers that need to share terabytes or petabytes of data between the CPUs and GPUs, so it's a completely different scenario.

For gaming, that kind of bandwidth would just be overkill, considering that PCI-E 3.0 x16 is nowhere near twice as fast as PCI-E 3.0 x8 in games (even on the highest settings), so clearly gaming GPUs aren't being bandwidth constrained.
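Just to put rough numbers on it, using the published link rates (back-of-the-envelope, nothing exotic):

Code:
// Back-of-the-envelope bandwidth comparison (published spec numbers, rounded).
#include <cstdio>

int main() {
    // PCIe 3.0: 8 GT/s per lane with 128b/130b encoding ~= 0.985 GB/s per lane.
    const double perLaneGBs = 8.0 * 128.0 / 130.0 / 8.0;   // ~0.985 GB/s
    const double pcie3x8  = perLaneGBs * 8;                 // ~7.9 GB/s
    const double pcie3x16 = perLaneGBs * 16;                // ~15.8 GB/s

    // GTX 970 local VRAM: 7 Gbps GDDR5 on a 256-bit bus (spec sheet figure).
    const double gtx970VramGBs = 7.0 * 256 / 8;             // 224 GB/s

    printf("PCIe 3.0 x8 : ~%.1f GB/s\n", pcie3x8);
    printf("PCIe 3.0 x16: ~%.1f GB/s\n", pcie3x16);
    printf("GTX 970 VRAM: ~%.0f GB/s -> over an order of magnitude faster than the bus\n",
           gtx970VramGBs);
}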

Now, when GPUs are eventually capable of photorealistic graphics, it may be a limitation, but that's still around 10 years down the road or so.

If you include WoW and the F2P games, I suppose. But for the AAA titles that have been discussed in this thread, PC sales are only a small fraction. Mobile at this point is probably bigger than both combined, heh.
You are seriously downplaying the profitability of PC gaming. If PC sales are as small a fraction as you seem to believe, why do developers even bother with it?
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I find it strange that The Evil Within recommends either a GTX 670 (most models are equipped with 2GB of VRAM) or an equivalent GPU with 4GB of VRAM.

It seems almost contradictory.
 
Last edited:

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
What is worth nothing is, all these games are multi-platforms for both console generations. Here is a shocker: Those games using 4GB of VRAM run on 256MB VRAM PlayStation3.

It's not like they dump everything into VRAM because they can. It is design decision to make use of whats available. And its not news that PCs are way behind the new gen consoles now.

I find it strange that The Evil Within recommends either a GTX 670 (most models are equipped with 2GB of VRAM) or an equivalent GPU with 4GB of VRAM.

It seems almost contradictory.

I agree, Carfax. I also find it strange that it's a console port using x86 hardware and requires an i7 or equivalent. This is an Xbox 360 title.
 

jpiniero

Lifer
Oct 1, 2010
14,591
5,214
136
You are seriously downplaying the profitability of PC gaming. If PC sales are as small a fraction as you seem to believe, why do developers even bother with it?

Because the additional cost isn't much, especially if they are using a licensed engine. Now, spending the effort to optimize it is another story. Compare this to the Wii U, which third-party developers pretty much abandoned because they would have to do quite a bit of work to get a title designed around the PS4/XBone running on it.
 
Aug 11, 2008
10,451
642
126
Well, so far it seems the system requirements have been overrated for most of the games that are requiring an i7. For instance, Mordor runs fine on an i3. The only exception is Watch Dogs, which runs like crap no matter what. Not sure what is going on here, but it seems like they could be deterring a lot of potential buyers by overstating the requirements.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Well, so far it seems the system requirements have been overrated for most of the games that are requiring an i7. For instance, Mordor runs fine on an i3. The only exception is Watch Dogs, which runs like crap no matter what. Not sure what is going on here, but it seems like they could be deterring a lot of potential buyers by overstating the requirements.

Recommended and minimum system requirements are two different things. They recommend 4GB of VRAM for the best experience. It may very well run on 1GB, but texture quality, rendering distance and scene complexity may have to be lowered.

It's PC gaming, FFS! You can customize in-game settings to fit your system spec. If that's not enough, you can lower settings in the drivers to make it run on ancient hardware.
 

Spjut

Senior member
Apr 9, 2011
928
149
106
Well, so far it seems the system requirements have been overrated for most of the games that are requiring an i7. For instance, Mordor runs fine on an i3. The only exception is Watch Dogs, which runs like crap no matter what. Not sure what is going on here, but it seems like they could be deterring a lot of potential buyers by overstating the requirements.

To be fair, there are lots of different i7 CPUs.
Shadow of Mordor specifically mentions an i7 750 or Phenom II X4 965, and a Sandy Bridge i3 or above has superior single-threaded performance.


Overstating the requirements is sure to affect sales though. It's even worse for The Evil Within, which only has official recommended requirements.
I don't get why people are so surprised about games needing more VRAM though. We've had plenty of high-end GPUs in the past reaching the VRAM limit.
 
Last edited:

BD2003

Lifer
Oct 9, 1999
16,815
1
76
To be fair, there are lots of different i7 CPUs.
Shadow of Mordor specifically mentions an i7 750 or Phenom II X4 965, and a Sandy Bridge i3 or above has superior single-threaded performance.


Overstating the requirements is sure to affect sales though. It's even worse for The Evil Within, which only has official recommended requirements.
I don't get why people are so surprised about games needing more VRAM though. We've had plenty of high-end GPUs in the past reaching the VRAM limit.


I get why people are upset... 2GB cards are still the norm. Even most 780 Tis were 3GB. But it's a chicken-and-egg problem. The video card market is ultra competitive, and while there were a few high-VRAM cards out there, no one was buying them because they didn't perceive the need for it. Then all these true next-gen ports start coming out, the VRAM reqs double overnight, and everyone finds out the hard way that their "high end" GPU is starved for memory at max settings, and they're looking for someone to blame.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I don't get why people are so surprised about games needing more VRAM though. We've had plenty of high-end GPUs in the past reaching the VRAM limit.

The surprise is that the games are targeting Titan GPUs only and nothing else with those settings. Not very many people bought those cards, especially since they are primarily about compute and not really about gaming.

One thing I have said is that it makes little sense for developers to target hardware no one has. It's one thing to make people want to upgrade for the improved visuals your game brings, but it's a whole different thing to be asking people to buy professional-class GPUs instead of gaming ones. It's made doubly worse when we find out these games are using uncompressed textures, which take up vastly more space, and the reason is that the Xbox/PS don't seem to be doing compressed textures (I don't really understand why, though; it's hardware decoded and DX exposes a tonne of options, although what the console APIs do I don't know).
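To put the compression point in perspective, rough numbers for a single 4096x4096 texture at standard block-compression rates (no mip chain, nothing game-specific):

Code:
// Rough VRAM cost of one 4096x4096 texture at common formats (no mip chain).
#include <cstdio>

int main() {
    const double pixels = 4096.0 * 4096.0;
    const double mib = 1024.0 * 1024.0;

    const double uncompressedRGBA8 = pixels * 4.0 / mib;  // 32 bits per pixel -> 64 MiB
    const double bc7 = pixels * 1.0 / mib;                // BC7: 8 bits per pixel -> 16 MiB
    const double bc1 = pixels * 0.5 / mib;                // BC1/DXT1: 4 bits per pixel -> 8 MiB

    printf("RGBA8 uncompressed: %.0f MiB\n", uncompressedRGBA8);
    printf("BC7  (8 bpp)      : %.0f MiB\n", bc7);
    printf("BC1  (4 bpp)      : %.0f MiB\n", bc1);
    // Shipping uncompressed instead of BC1 is an 8x difference per texture;
    // multiply that across a level and the VRAM (and 50GB install) numbers add up fast.
}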

If you want a game to sell, it has to target hardware people have or would buy. No one is going to buy a Titan because the developers put no effort into porting the game well; for some it would be a step back in performance, and for most it's simply not worth the cost. It's this requirement for 6GB that makes no sense: the hardware doesn't exist in sufficient quantity, so it's not marketable. Hence I think these are just classic signs of super lazy, bad console ports.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
The surprise is that the games are targeting Titan GPUs only and nothing else with those settings. Not very many people bought those cards, especially since they are primarily about compute and not really about gaming.



One thing I have said is that it makes little sense for developers to target hardware no one has. It's one thing to make people want to upgrade for the improved visuals your game brings, but it's a whole different thing to be asking people to buy professional-class GPUs instead of gaming ones. It's made doubly worse when we find out these games are using uncompressed textures, which take up vastly more space, and the reason is that the Xbox/PS don't seem to be doing compressed textures (I don't really understand why, though; it's hardware decoded and DX exposes a tonne of options, although what the console APIs do I don't know).



If you want a game to sell, it has to target hardware people have or would buy. No one is going to buy a Titan because the developers put no effort into porting the game well; for some it would be a step back in performance, and for most it's simply not worth the cost. It's this requirement for 6GB that makes no sense: the hardware doesn't exist in sufficient quantity, so it's not marketable. Hence I think these are just classic signs of super lazy, bad console ports.


To be fair, I haven't seen any games require 6GB. They just recommend 4GB for high, and Shadow of Mordor gives a 6GB option, because why not? Textures are created at ultra-high resolution before being downsampled to reasonable sizes. Why not give people the unreasonably high option if they have an unreasonable amount of VRAM? Is it really fair to hold that back from them for fear of hurting the egos of everyone with 2-4GB cards?