The main reasons for inflated VRAM requirements


Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
BC, you speak as if those games couldn't be played on anything other than a Titan. These aren't the first games that are hard to max out, and they won't be the last.
 

Spjut

Senior member
Apr 9, 2011
928
149
106
BC, you speak as if those games couldn't be played on anything other than a Titan. These aren't the first games that are hard to max out, and they won't be the last.

And specifically for Shadow of Mordor, the settings menu makes it clear that it scales down to 1GB VRAM for low settings.
6GB VRAM is for Ultra. That may be high, but PC games have always had options for both weaker and more powerful hardware.
 

kasakka

Senior member
Mar 16, 2013
334
1
81
There aren't even any good examples of those super high res textures providing superior visuals. They're just easy to offer because that's how the textures were made.

Until 4K resolutions become actually viable I don't think we will see a notable difference between the highest quality textures and whatever is normal. There's a point where there just aren't enough pixels on the screen to show the tiniest detail differences. Plus with movement it all goes to hell anyway.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
There aren't even any good examples of those super high res textures providing superior visuals. They're just easy to offer because that's how the textures were made.

Until 4K resolutions become actually viable I don't think we will see a notable difference between the highest quality textures and whatever is normal. There's a point where there just aren't enough pixels on the screen to show the tiniest detail differences. Plus with movement it all goes to hell anyway.

Here they show difference between PS4 Xbone and PC Ultra textures:
http://www.eurogamer.net/articles/digitalfoundry-2014-shadow-of-mordor-face-off

About vram usage:
PC performance is a curious subject, especially in light of the huge video memory requirements for ultra level textures. In planting a GTX 780 Ti (featuring 3GB of GDDR5) into a Core i7 3770K PC backed by 16GB of RAM, the top-end results are startlingly choppy with this card. Even with a circa £450 GPU here, we're beset by huge downward spikes in frame-rate; the ultra texture setting even impacting performance with a 30fps cap set.

I think it is the same as with geometry: diminishing returns. Each doubling of polygon count yields a smaller visual improvement, and the same goes for textures - each extra MB of texture data adds less. We need a stop-motion, side-by-side, magnifying-glass analysis to see the difference.
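The diminishing-returns point can be put in rough numbers. Doubling a texture's resolution quadruples its memory footprint, while a 1080p screen only has about 2 million pixels to show it on. A quick sketch (uncompressed RGBA8 assumed, figures purely illustrative):

```python
# Rough sketch: VRAM cost of a square texture vs. the pixels that can
# actually be resolved on screen. All sizes are illustrative.

def texture_mib(side_px, bytes_per_texel=4):
    """Uncompressed RGBA8 texture size in MiB for a square texture."""
    return side_px * side_px * bytes_per_texel / (1024 * 1024)

for side in (1024, 2048, 4096):
    print(f"{side}x{side}: {texture_mib(side):.0f} MiB")

# A 1080p frame has ~2.07 million pixels; a 2048x2048 map already has
# ~4.2 million texels, so past that point most of the extra detail can
# never be shown at 1:1 on a 1080p screen.
screen_px = 1920 * 1080
print(f"1080p pixels: {screen_px / 1e6:.2f}M vs 4096^2 texels: {4096**2 / 1e6:.2f}M")
```

Each resolution step quadruples the cost (4, 16, 64 MiB) while the screen's pixel budget stays fixed, which is exactly the diminishing-returns curve described above.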
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
Weird how they report the stuttering, when other people with similar specs report smooth gameplay with ultra level textures @ 1080p on video cards with 3GB of VRAM.

There must be factors other than just VRAM which can increase susceptibility to stuttering. If I had to guess, it could be their memory speed (most likely they were using DDR3-1600), or maybe they weren't running the game on an SSD. Of the two, I'd wager on the latter being the culprit.

But then again, given Shadow of Mordor's system memory usage (which is only slightly over 1GB on ultra), I don't even think the game is using system memory to cache the textures. If it were, system memory usage would be much higher.
 

jpiniero

Lifer
Oct 1, 2010
14,584
5,206
136
I don't think it's a good comparison because there are a lot of other differences. PC version with High vs Ultra textures but other settings the same would be better.

The texture res increase on some of the ground areas is the biggest difference though. There's some additional vegetation and other subtle things, but that's about it.
 
Aug 11, 2008
10,451
642
126
To be fair, there are lots of different i7 CPUs.
Shadow of Mordor specifically mentions an i7 750 or Phenom II X4 965, and a Sandybridge i3 or above has superior single threaded performance.


Overstating the requirements is sure to affect sales though. It's even worse for The Evil Within, which only has official recommended requirements.
I don't get why people are so surprised about games needing more VRAM though. We've had plenty of high-end GPUs in the past reaching their VRAM limit.

Yes, but the "recommended" CPU is a 3770K. Part of the problem is that no one really knows what "minimum" and "recommended" actually mean. I tend to ignore the minimum spec, because in the past that has usually meant a barely playable experience. At least for Mordor, though, it seems like the minimum CPU is more than adequate. It is also strange that the recommended CPU is hyper-threaded, because technically the vast majority of gaming systems don't meet this, since they are running an i5 CPU or lower. Even more inexplicable is that what little data I have found seems to show little or no benefit from hyper-threading.

Edit: Kind of forgot this was the VRAM forum. Didn't really mean to go off topic on the CPU requirements.
 
Last edited:

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Nvidia and AMD are trying to solve the recommended-settings problem a bit with their GeForce Experience/Raptr programs, albeit in quite different ways. But my experience of those is that they target 45 fps with drops into the 30s, and I don't like to play below 45 at any point (it used to be 60, but G-Sync has given me more room - visually, it was almost the same impact as upgrading to 2x 780 Tis!). Recommended and minimum settings have this problem multiplied, especially in a game like Shadow of Mordor where the frame rate can drop to 1/3 of its average or spike to 3x its average at some points. These vast swings make choosing the settings very difficult.
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
Yes, but the "recommended" CPU is a 3770K. Part of the problem is that no one really knows what "minimum" and "recommended" actually mean. I tend to ignore the minimum spec, because in the past that has usually meant a barely playable experience. At least for Mordor, though, it seems like the minimum CPU is more than adequate. It is also strange that the recommended CPU is hyper-threaded, because technically the vast majority of gaming systems don't meet this, since they are running an i5 CPU or lower. Even more inexplicable is that what little data I have found seems to show little or no benefit from hyper-threading.

Edit: Kind of forgot this was the VRAM forum. Didn't really mean to go off topic on the CPU requirements.

It's the other side of the same coin. What do next gen consoles have that most PCs don't? Lots of VRAM and lots of cores. Both consoles have 8 single threaded cores. Just like they're building the new games to rely heavily on lots of VRAM, they're leaning heavily on multithreading now.

I know what people are going to say already... but my i5 is SO much more powerful than those Jaguar cores! And that's true, but it's not necessarily the case that a single, powerful core can achieve the same things as a bunch of smaller cores in a heavily multithreaded workload. That's why Intel themselves created a coprocessor filled with dozens of cores (the Xeon Phi) rather than just trying to sell people entirely on bigger, more powerful CPUs. And PC games have a ton of overhead that consoles don't have to deal with. So it's no surprise that the ability to run more simultaneous threads is increasing in tandem with VRAM. The requirements for a gaming PC are changing in direct response to the new consoles.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
It's the other side of the same coin. What do next gen consoles have that most PCs don't? Lots of VRAM and lots of cores. Both consoles have 8 single threaded cores. Just like they're building the new games to rely heavily on lots of VRAM, they're leaning heavily on multithreading now.

I know what people are going to say already... but my i5 is SO much more powerful than those Jaguar cores! And that's true, but it's not necessarily the case that a single, powerful core can achieve the same things as a bunch of smaller cores in a heavily multithreaded workload. That's why Intel themselves created a coprocessor filled with dozens of cores (the Xeon Phi) rather than just trying to sell people entirely on bigger, more powerful CPUs. And PC games have a ton of overhead that consoles don't have to deal with. So it's no surprise that the ability to run more simultaneous threads is increasing in tandem with VRAM. The requirements for a gaming PC are changing in direct response to the new consoles.

An i5 is more than twice as fast per core as the consoles, and there's no way the consoles even get near that.

And the Xeon Phi uses many cores, like GPUs, because those workloads scale that way and because building a single core with that much performance is impossible. It's the same reason your i5 isn't a single-core chip. And multithreading something like a game doesn't scale perfectly, so there's a performance penalty that grows with the number of cores. It's not running six times faster on 6 cores vs. 1; you may be lucky if it even runs four times as fast.
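The scaling penalty described here is essentially Amdahl's law: if a fraction p of the work is parallelizable, n cores give a speedup of 1/((1-p) + p/n). A quick sketch (the p = 0.9 figure is an assumption for illustration, not a measured value for any game):

```python
# Amdahl's law: speedup on n cores when a fraction p of the work is
# parallelizable. Even at p = 0.9, six cores fall well short of 6x.

def amdahl_speedup(p, n):
    """Ideal speedup on n cores with parallel fraction p (0 <= p <= 1)."""
    return 1.0 / ((1.0 - p) + p / n)

for cores in (2, 4, 6, 8):
    print(f"{cores} cores: {amdahl_speedup(0.9, cores):.2f}x")
```

With p = 0.9, six cores yield exactly a 4.0x speedup - which matches the "lucky if it even runs four times as fast" figure above - and eight cores only reach about 4.7x.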

The overhead? We already see an i3 running a console game at 100 FPS, something the console CPUs can't do. And with DX12 the API overhead gets even lower.

And remember, on the consoles only 6 of the 8 cores can be used for games.

But again, how many 6-8 threaded games do we see today from the Xbox 360/PS3 era?
 
Last edited:

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I'm with Shintai on this. The CPU is the Achilles' heel of the consoles. Devs complain about it all the time and how it bottlenecks the system. Even with the much higher overhead that you find on PC, the desktop CPUs (especially the Core i series) are powerful enough to easily mitigate the handicaps from the drivers and the Direct3D API. That's even more so once they are overclocked, as many gamers and hardware enthusiasts tend to do.

When DX12 comes, PC gaming is going to get a big shot in the arm, because the shackles will finally be off.
 

monstercameron

Diamond Member
Feb 12, 2013
3,818
1
0
I won't comment either way on the issue, but here are some interesting slides to color the conversation.
infamous-second-son-4.5gb-gddr5-memory-analysis.jpg

infamous-second-son-graphics-buffer-memory-1024x571.jpg

ps4-cpu-decent-sucker-punch-1024x571.jpg

ps4-gpu-compute-infamous-second-son-gdc-2014.jpg

http://www.redgamingtech.com/infamous-second-son-engine-postmortem-analysis-breakdown/
http://www.dualshockers.com/2014/04...am-cpu-and-gpu-compute-to-make-our-jaws-drop/
 
Aug 11, 2008
10,451
642
126
It's the other side of the same coin. What do next gen consoles have that most PCs don't? Lots of VRAM and lots of cores. Both consoles have 8 single threaded cores. Just like they're building the new games to rely heavily on lots of VRAM, they're leaning heavily on multithreading now.

I know what people are going to say already... but my i5 is SO much more powerful than those Jaguar cores! And that's true, but it's not necessarily the case that a single, powerful core can achieve the same things as a bunch of smaller cores in a heavily multithreaded workload. That's why Intel themselves created a coprocessor filled with dozens of cores (the Xeon Phi) rather than just trying to sell people entirely on bigger, more powerful CPUs. And PC games have a ton of overhead that consoles don't have to deal with. So it's no surprise that the ability to run more simultaneous threads is increasing in tandem with VRAM. The requirements for a gaming PC are changing in direct response to the new consoles.

You totally missed the point of what I was saying. The point is, a hyperthreaded quad core is *not* needed to run any of the new console ports, except perhaps Watchdogs, and one could argue that game runs poorly on any cpu. In fact a dual core i3 runs most of them quite well.
 

Carfax83

Diamond Member
Nov 1, 2010
6,841
1,536
136
I've seen those slides before. It's difficult to extrapolate from the PS4's memory usage to that of a PC, however, because the PS4's memory is unified.

Infamous SS uses nearly 4.5GB of memory, but not all of that is dedicated to graphics. According to the article, roughly 3GB is for graphics. On a PC, you can currently have 6GB of VRAM and all of it will be used for graphics in some form.

So basically, to equal the PS4 you need a GPU with about 3GB of VRAM, and that corresponds with Shadow of Mordor, as 3GB will allow you to use the high texture setting @ 1440p and below. You can also use high textures with 2GB, but only at 1080p and below.
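The unified-memory comparison can be put into a rough budget sketch. The 4.5GB and 3GB figures are the approximate ones from the article; the PC-side split is an illustration of the reasoning, not a measurement:

```python
# Rough budget: PS4 unified memory vs. a PC's split memory pools.
# ~4.5 GB usable by the game, ~3 GB of that for graphics (per the
# Infamous Second Son slides); the remainder is CPU-side data that a
# PC would hold in system RAM instead of VRAM.

ps4_game_budget_gb = 4.5
ps4_graphics_gb = 3.0
ps4_cpu_side_gb = ps4_game_budget_gb - ps4_graphics_gb

# On a PC the pools are separate, so matching the PS4 needs roughly:
pc_vram_needed_gb = ps4_graphics_gb   # graphics allocations only
pc_system_ram_gb = ps4_cpu_side_gb    # held in ordinary system RAM

print(f"PS4: {ps4_graphics_gb} GB graphics + {ps4_cpu_side_gb} GB CPU-side")
print(f"PC equivalent: ~{pc_vram_needed_gb} GB VRAM + ~{pc_system_ram_gb} GB system RAM")
```

The point of the split is that a PC does not need a 4.5GB card to match the PS4; only the graphics portion of the unified pool has to fit in VRAM.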
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
You totally missed the point of what I was saying. The point is, a hyperthreaded quad core is *not* needed to run any of the new console ports, except perhaps Watchdogs, and one could argue that game runs poorly on any cpu. In fact a dual core i3 runs most of them quite well.

A dual-core i3 has about as much throughput as all 8 console cores put together, while running half as many threads.
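As a back-of-envelope check on that claim (the clock, IPC, and Hyper-Threading figures below are rough assumptions for illustration, not measurements):

```python
# Back-of-envelope core throughput comparison. IPC and clock figures
# are assumed round numbers, not benchmark results.

def relative_throughput(cores, ghz, ipc):
    """Crude aggregate throughput proxy: cores x clock x per-clock work."""
    return cores * ghz * ipc

# Assumed: 8 Jaguar cores at 1.6 GHz with baseline IPC; a desktop i3
# with 2 Haswell-class cores at 3.5 GHz and roughly twice the per-clock
# throughput, plus a modest uplift from Hyper-Threading.
jaguar = relative_throughput(8, 1.6, 1.0)
i3 = relative_throughput(2, 3.5, 2.0) * 1.25  # ~25% HT uplift assumed

print(f"8x Jaguar: {jaguar:.1f}  dual-core i3: {i3:.1f}  ratio: {i3 / jaguar:.2f}")
```

Under these assumptions the two land in the same ballpark, which is the "about as much throughput" claim - though the i3 only exposes 4 hardware threads to the consoles' 8, so a heavily threaded engine still sees them differently.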
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
An i5 is more than twice as fast per core as the consoles, and there's no way the consoles even get near that.

And the Xeon Phi uses many cores, like GPUs, because those workloads scale that way and because building a single core with that much performance is impossible. It's the same reason your i5 isn't a single-core chip. And multithreading something like a game doesn't scale perfectly, so there's a performance penalty that grows with the number of cores. It's not running six times faster on 6 cores vs. 1; you may be lucky if it even runs four times as fast.

The overhead? We already see an i3 running a console game at 100 FPS, something the console CPUs can't do. And with DX12 the API overhead gets even lower.

And remember, on the consoles only 6 of the 8 cores can be used for games.

But again, how many 6-8 threaded games do we see today from the Xbox 360/PS3 era?


Theoretically all of that is true, yet this isn't the first and won't be the last game to recommend an i7. To the extent that it's a result of devs being sloppy, I just consider that another layer of overhead that PC gamers have always had to deal with. I'm aware they only have 6 cores available to use because 2 are reserved for the OS... but it's not like Windows isn't running in the background of every PC game either, so it's a moot point.

Again, I'm just trying to be realistic. Whether or not this should be happening is beside the point - it IS happening, whether we like it or not, and whether it's justified or not. And the proximate cause for it is clearly the next generation of consoles. Seeing that those aren't going away anytime soon, I don't expect these inflated requirements will either.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Theoretically all of that is true, yet this isn't the first and won't be the last game to recommend an i7. To the extent that it's a result of devs being sloppy, I just consider that another layer of overhead that PC gamers have always had to deal with. I'm aware they only have 6 cores available to use because 2 are reserved for the OS... but it's not like Windows isn't running in the background of every PC game either, so it's a moot point.

Again, I'm just trying to be realistic. Whether or not this should be happening is beside the point - it IS happening, whether we like it or not, and whether it's justified or not. And the proximate cause for it is clearly the next generation of consoles. Seeing that those aren't going away anytime soon, I don't expect these inflated requirements will either.

I think we already established that those recommendations and reality don't mix ;)

And Windows in the background uses what, 0.1% CPU time?
 

BD2003

Lifer
Oct 9, 1999
16,815
1
76
I think we already established that those recommendations and reality don't mix ;)

And Windows in the background uses what, 0.1% CPU time?


Well, in that case they'll be unrealistic, inflated numbers going forward.

It's not so much Windows at idle as whatever else can be going on in the background - open browser windows, Skype, lots of little helper utilities, etc... all that little stuff can add up. In Win 8 and newer, the desktop window manager and all those Metro apps can soak up a few hundred megs of VRAM too.
 

SlickR12345

Senior member
Jan 9, 2010
542
44
91
www.clubvalenciacf.com
We'll have to wait and see, since stacked RAM is coming next. That likely means capacities stay about the same - around 4GB for flagship, 3GB for mainstream and 2GB for entry level - but the throughput is going to be a lot higher anyway.

It will be interesting to see whether game developers optimize for stacked RAM then, or whether we'll just get washed-out garbage console ports.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Weird how they report the stuttering, when other people with similar specs report smooth gameplay with ultra level textures @ 1080p on video cards with 3GB of VRAM..

There must be factors other than just VRAM which can increase susceptibility to stuttering. If I had to guess, it could be their memory speed (most likely they were using DDR3-1600), or maybe they weren't running the game on an SSD. Out of the two, I'd wager on the latter being the culprit..

But then again, given Shadow of Mordor's system memory usage (which is only slightly over 1GB on ultra), I don't even think the game is using system memory to cache the textures.. If it were, then the system memory would be much higher..

I have Watch Dogs on an 840 Evo and my RAM is at 2100MHz, and it stutters. The only person who claims his is smooth said it became so after he switched to a 5930K.

Also, my PC is solely for gaming. I have no background apps and I have Aero disabled. Everything is set for "best performance". 3570K at 4.8GHz, 780 Ti at 1243MHz.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Well, in that case they'll be unrealistic, inflated numbers going forward.

It's not so much Windows at idle as whatever else can be going on in the background - open browser windows, Skype, lots of little helper utilities, etc... all that little stuff can add up. In Win 8 and newer, the desktop window manager and all those Metro apps can soak up a few hundred megs of VRAM too.

Here is a little follow-up on the greatness of the console cores:
http://techreport.com/news/27168/assassin-creed-unity-is-too-much-for-console-cpus

The CPU in the consoles is simply a disaster.

It also explains all the uncompressed audio and textures. The CPU in the consoles simply can't handle it otherwise.

Developer Ubisoft Montreal apparently didn't expect the CPU limitation. The team anticipated "a tenfold improvement over everything AI-wise," Pontbriand said, but they were "quickly bottlenecked." He describes the limitation as "frustrating" and says the game could be running at 100 FPS if it was "just graphics." Seems like this generation of consoles has enough pixel-pushing prowess for prettier visuals but not enough CPU power for richer gameplay.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Here is a little follow-up on the greatness of the console cores:
http://techreport.com/news/27168/assassin-creed-unity-is-too-much-for-console-cpus

The CPU in the consoles is simply a disaster.

It also explains all the uncompressed audio and textures. The CPU in the consoles simply can't handle it otherwise.

LOL @ the CPU can't handle sound. 10/10, would laugh again!

Wasn't the Star Swarm demo running on an 8-core FX downclocked to 2GHz? That's about the same as Jaguar. If I recall correctly, that demo had more objects, AI, etc. going on than every Assassin's Creed combined.

The new consoles have dedicated sound hardware.
Also, do supposedly uncompressed audio and textures require less CPU work?

If anything here is a disaster, it's not the console, but the game.
 

96Firebird

Diamond Member
Nov 8, 2010
5,711
316
126
Also, do supposedly uncompressed audio and textures require less CPU work?

Do you understand what compression is? If a file gets compressed, it needs to be decompressed in the same fashion before use. That takes CPU work. So yes, uncompressed audio and textures require less CPU work.
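The round-trip cost is easy to demonstrate: a compressed asset has to burn CPU time in a decompressor before it's usable, while a raw asset doesn't. A small sketch with zlib standing in for whatever codec a game might actually use (the payload here is synthetic):

```python
# Sketch: decompressing an asset costs CPU time that a raw asset does
# not. zlib and the synthetic payload stand in for real game codecs.
import time
import zlib

raw = b"some repetitive game asset data " * 100_000  # ~3.2 MB synthetic asset
packed = zlib.compress(raw, level=6)

t0 = time.perf_counter()
unpacked = zlib.decompress(packed)
decompress_s = time.perf_counter() - t0

assert unpacked == raw  # lossless round trip
print(f"raw: {len(raw) / 1e6:.1f} MB, packed: {len(packed) / 1e3:.1f} KB, "
      f"decompress took {decompress_s * 1000:.2f} ms of CPU time")
```

Shipping the asset uncompressed trades that per-load CPU cost for disk space - which is the trade-off being attributed to the weak console CPUs above.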
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
Do you understand what compression is? If a file gets compressed, it needs to be decompressed in the same fashion before use. That takes CPU work. So yes, uncompressed audio and textures require less CPU work.

Do you know what part of the hardware is responsible for compressing/decompressing textures?