Evil Within - mandatory 4GB VRAM requirement


ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
I assume it's all uncompressed textures, to lower the CPU overhead on the consoles when they move textures into VRAM from the HDD/Blu-ray.
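For a sense of scale, a rough back-of-the-envelope sketch in Python (purely illustrative figures; the game's actual texture formats aren't public) of how much uncompressed textures balloon versus the block-compressed formats GPUs normally sample directly:

# Rough texture-size arithmetic: uncompressed RGBA8 (4 bytes/texel) vs. BC3/DXT5 (1 byte/texel).
def texture_mib(size_px, bytes_per_texel, with_mips=True):
    """Approximate GPU memory for one square texture, in MiB."""
    base = size_px * size_px * bytes_per_texel
    # A full mipmap chain adds roughly one third on top of the base level.
    return (base * 4 / 3 if with_mips else base) / (1024 * 1024)

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: {texture_mib(size, 4):6.1f} MiB uncompressed RGBA8, "
          f"{texture_mib(size, 1):5.1f} MiB as BC3/DXT5")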
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
But the Xbox 360 has 512MB of shared memory (plus 10MB of eDRAM), and the PS3 has 256MB of system RAM and 256MB of VRAM. You probably wouldn't want to play ports from those old consoles on a PC with a total of 512MB split between system RAM and VRAM. Likewise, the new consoles have 5GB available out of 8GB total. So I wouldn't be surprised if, just as with the old consoles, you need a fair amount more RAM and VRAM in your PC than is packed inside the new consoles.

It's usually also apples and oranges: features, IQ and function changes.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
This is why I'm holding out for 8GB cards. I usually skip two generations of GPUs between upgrades (3-5 years). Given my upgrade cycle, there's nothing for me to buy at the moment that I feel comfortable with. I assume we'll see non-reference 8GB 9xx-series cards soon enough. If AMD goes with a 256-bit or 512-bit bus for the 3xx series, then I assume we'll get 4GB/8GB configurations of those as well.

When I saw that the new consoles were coming with 8GB of RAM (combined in one way or another), that 4K displays were coming to market much cheaper than people predicted, and that high-res texture packs were getting much more popular, combined with the industry's less-than-stellar port reputation, I knew I would be holding out for an 8GB card. Even though I KNOW it isn't necessary for 4K gaming and I KNOW that consoles don't dedicate all 8GB to graphics, I figured it's CYA.

I'd rather have overkill than the alternative and it's worked well for me since the 90s.
 

Pinstripe

Member
Jun 17, 2014
197
12
81
How much VRAM can the consoles even dedicate to textures? 2GB, 3GB? They have puny CPUs/GPUs, so I doubt they can push through more than 2GB reserved for textures, 3GB at most for Killzone, but that was a showcase tech demo. I love how people always go into panic mode over a couple of shoddily optimized games with outdated engines.
Same with Watch Dogs. It turned out it runs just fine with 2GB of VRAM after a couple of patches and reasonable settings.
 

It's Not Lupus

Senior member
Aug 19, 2012
838
3
81
That's a contradictory statement about requirements.

It sounds like console developers developing on the PC for the first time. 4GB VRAM and an i7?
 

Pinstripe

Member
Jun 17, 2014
197
12
81
That's a contradictory statement about requirements.

It sounds like console developers developing on the PC for the first time. 4GB VRAM and an i7?

Well, the developer is Japanese and the engine has received virtually zero support from id Software for years, so that makes sense.
 
Last edited:

Pinstripe

Member
Jun 17, 2014
197
12
81
This is what a well-optimized next-gen PC port looks like:

RYSE: Son of Rome:


System Requirements


  • Minimum:
    • OS: Windows Vista SP1, Windows 7 or Windows 8 (64bit)
    • Processor: Dual core with HyperThreading technology or quad core CPU (4+ logical processors)
    • Memory: 4 GB RAM
    • Graphics: DirectX 11 graphics card with 1 GB video RAM
    • DirectX: Version 11
    • Hard Drive: 26 GB available space
    • Sound Card: DirectX compatible Sound Card with latest drivers


  • Recommended:
    • OS: Windows Vista SP1, Windows 7 or Windows 8 (64bit)
    • Processor: Quad Core or Six Core CPU (6+ logical processors)
    • Memory: 8 GB RAM
    • Graphics: DirectX 11 graphics card with 2 GB video RAM
    • DirectX: Version 11
    • Hard Drive: 26 GB available space
    • Sound Card: DirectX compatible Sound Card with latest drivers
 

SlowSpyder

Lifer
Jan 12, 2005
17,305
1,002
126
It's usually also apples and oranges: features, IQ and function changes.


Yes. But the point is that since the new consoles can use 5GB of shared system RAM/VRAM, a 2GB-3GB video card isn't automatically going to be enough for ports, just like a 256MB video card wouldn't have been for Xbox 360 or PS3 ports.
 

rolodomo

Senior member
Mar 19, 2004
269
9
81
6GB VRAM needed for ultra textures in Shadow of Mordor

Lol, I guess that makes the GTX 970/980 the next paperweight card. Good thing I automatically assume game developers have the best technical talent and know what they're doing! ;)
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Lol, I guess that makes the GTX 970/980 the next paperweight card. Good thing I automatically assume game developers have the best technical talent and know what they're doing! ;)

I have a tough time believing that those sorts of requirements make sense. The consoles are really weak compared to PCs, but now we need 6GB of VRAM on PC for ultra textures at 1080p in console ports?! :awe:
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
How much VRAM can the consoles even dedicate to textures? 2GB, 3GB? They have puny CPUs/GPUs, so I doubt they can push through more than 2GB reserved for textures, 3GB at most for Killzone, but that was a showcase tech demo. I love how people always go into panic mode over a couple of shoddily optimized games with outdated engines.
Same with Watch Dogs. It turned out it runs just fine with 2GB of VRAM after a couple of patches and reasonable settings.
You don't need horsepower to push higher-res textures. They use virtually no more power than low-res textures; they only need video RAM.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I have a tough time believing that those sorts of requirements make sense. The consoles are really weak compared to PCs, but now we need 6GB of VRAM on PC for ultra textures at 1080p in console ports?! :awe:

This is for Ultra textures only. The game will likely run perfectly well with much less VRAM, just not with Ultra textures.

The problem here is that they're claiming 4GB is the minimum requirement for the game at any setting.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
I'm rolling on the floor when someone mentions an 'unoptimized port that eats VRAM'.
WTF is that? Console master race and dirty PC peasants?

It was obvious that the 8GB consoles would push game requirements forward. Taking the ~256MB VRAM pool of the last-gen consoles into account, one should expect next-gen games to take at least 8x more VRAM (more like 16x).
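In numbers (a tiny sketch using the post's own ~256MB figure for the last-gen VRAM pool):

# The 8x / 16x scaling claim, spelled out (256 MB is this post's last-gen figure, not an official spec).
last_gen_vram_mb = 256
for factor in (8, 16):
    print(f"{factor}x of {last_gen_vram_mb} MB = {last_gen_vram_mb * factor / 1024:.0f} GB")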

There was so much high-horse talk from PC elitists about low-res textures at the end of the last-gen lifecycle, and now here we are: PC peasants crying that their shiny $$$$ boxes can't run a simple stealth game?

All those posts are laugh-worthy and should be quoted for later.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
How much VRAM can the consoles even dedicate to textures? 2GB, 3GB?
As much as they see fit, within the 5-6GB the game can use.
They have puny CPUs/GPUs, so I doubt they can push through more than 2GB reserved for textures, 3GB at most for Killzone, but that was a showcase tech demo. I love how people always go into panic mode over a couple of shoddily optimized games with outdated engines.
Video cards equivalent to what they have can readily use 2+GB of textures. They aren't high-end, but the GPUs are hardly "puny."

Bigger textures allow greater detail up close, all else being equal. Farther away, a lower-detail version can be used, without using any more GPU memory bandwidth than if the maximum texture size were smaller. More space can also be used to give more objects unique textures, which can be done without significantly increasing the bandwidth needed (for instance, more textures for concrete surfaces or trees, to increase immersion). Not bothering to scale down on the PC, however, is a really bad idea.
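A quick sketch of that trade-off, assuming idealized mipmapping (Python, illustrative numbers only): a larger base texture mostly costs VRAM, while the mip level actually sampled tracks how big the surface is on screen, so the fetched data stays about the same.

import math

def mip_chain_mib(base_px, bytes_per_texel=4):
    """Total memory for a square RGBA8 texture plus its full mip chain, in MiB."""
    total, size = 0, base_px
    while size >= 1:
        total += size * size * bytes_per_texel
        size //= 2
    return total / (1024 * 1024)

def mip_level_used(base_px, screen_coverage_px=256, texels_per_screen_pixel=1.0):
    """Mip level sampled when the surface spans ~screen_coverage_px pixels on screen."""
    needed = screen_coverage_px * texels_per_screen_pixel
    return max(0, int(math.log2(base_px / needed)))

for base in (2048, 8192):
    print(f"{base}px base: {mip_chain_mib(base):6.1f} MiB with mips, "
          f"samples mip {mip_level_used(base)} when covering 256 screen pixels")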
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
Not sure why you are shocked.

We saw this coming with the next-gen consoles having ~6GB of available VRAM, which is a huge leap from the older consoles. Of course any AAA dev worth his salt would take advantage of that and raise the bar in IQ.

Take an old game like Skyrim: it looks ugly by current standards, but add in 4K texture mods and shader/lighting/shadow mods and it looks great. You can achieve a lot if you're given the resources to do so. In fact, it's terrible that AAA games don't do this, as in not having an "Ultra" setting.

The new consoles don't have 6GB of VRAM. They don't even have VRAM. They have unified system RAM, an inordinate amount of which is reserved for the OS. The PS4 uses 3.5GB of RAM for the OS, though that is likely to change; the Xbone uses 3GB for its OS(es). So if you call it 4GB for system memory, that leaves a puny 1GB or less for VRAM. Generously allowing a 33% reduction in total RAM usage due to non-duplication of assets between system RAM and VRAM still only gets you 2-3GB at the very best for VRAM. In real terms, the GTX 660/670 and 7850 are already basically equivalent in specs AND in RAM.
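That budget argument as plain arithmetic (all figures are this post's assumptions, not official numbers):

total_ram_gb = 8.0
os_reserved_gb = 3.0                           # assumed OS reservation (Xbone figure; PS4 may differ)
game_pool_gb = total_ram_gb - os_reserved_gb   # ~5 GB left for the game
system_side_gb = 4.0                           # assumed "system memory" share of that pool
vram_like_gb = game_pool_gb - system_side_gb   # ~1 GB left acting as VRAM
# Credit unified memory for not duplicating assets between system RAM and VRAM:
effective_vram_gb = vram_like_gb + game_pool_gb * 0.33
print(f"Game pool: {game_pool_gb:.0f} GB, VRAM-like share: {vram_like_gb:.0f} GB, "
      f"with a ~33% dedup credit: ~{effective_vram_gb:.1f} GB")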

The real issue is how ridiculously hungry the background OSs are this time around. They might as well just run Vista at that rate...

http://www.ibtimes.com/ps4-vs-xbox-...-xbox-according-report-does-it-matter-1361395
 
Last edited:

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
The new consoles don't have 6GB of VRAM. They don't even have VRAM. They have unified system RAM, an inordinate amount of which is reserved for the OS. The PS4 uses 3.5GB of RAM for the OS, though that is likely to change; the Xbone uses 3GB for its OS(es). So if you call it 4GB for system memory, that leaves a puny 1GB or less for VRAM. Generously allowing a 33% reduction in total RAM usage due to non-duplication of assets between system RAM and VRAM still only gets you 2-3GB at the very best for VRAM. In real terms, the GTX 660/670 and 7850 are already basically equivalent in specs AND in RAM.

The real issue is how ridiculously hungry the background OSs are this time around. They might as well just run Vista at that rate...

http://www.ibtimes.com/ps4-vs-xbox-...-xbox-according-report-does-it-matter-1361395

Not true.
http://gearnuke.com/memory-biggest-bottleneck-playstation4xbox-one-developers-claims-insider/
In a statement issued to Eurogamer's Digital Foundry, Sony addresses a key technical matter from the original story.

We would like to clear up a misunderstanding regarding our "direct" and "flexible" memory systems. The article states that "flexible" memory is borrowed from the OS, and must be returned when requested - that's not actually the case. The actual true distinction is that:

"Direct Memory" is memory allocated under the traditional video game model, so the game controls all aspects of its allocation
"Flexible Memory" is memory managed by the PS4 OS on the game's behalf, and allows games to use some very nice FreeBSD virtual memory functionality. However this memory is 100 per cent the game's memory, and is never used by the OS, and as it is the game's memory it should be easy for every developer to use it.
We have no comment to make on the amount of memory reserved by the system or what it is used for.
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
This is untrue. :|
It's true if one uses mipmapping and there is no magnification.
Also, even with modest magnification, it is pretty close.

This is due to the texture caching doing the work.
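A rough sense of the bandwidth side (Python, with assumed overdraw and filtering figures): with mipmaps and no magnification, texel fetches per frame scale with screen pixels, not with the size of the source textures.

# Illustrative bandwidth estimate; texture caches cut actual DRAM traffic well below this.
width, height = 1920, 1080
overdraw = 1.5            # assumed average overdraw
texels_per_sample = 8     # trilinear filtering reads 8 texels per sample
bytes_per_texel = 4       # uncompressed RGBA8, worst case
fps = 60

bytes_per_frame = width * height * overdraw * texels_per_sample * bytes_per_texel
gb_per_second = bytes_per_frame * fps / 1e9
print(f"~{gb_per_second:.1f} GB/s of texture reads at 1080p60, whether the sources are 1K or 8K")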
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
This is untrue. :|
Please show me one game where using higher-res textures has any meaningful impact on framerates if you have enough VRAM. I have never seen more than a 1 fps difference in any game from choosing very high textures over low.
 

MTDEW

Diamond Member
Oct 31, 1999
4,284
37
91
They've gotta sell new PC hardware somehow.
And right now it seems they're using VRAM requirements to do it.

Heck, once we all own GPUs with 8GB of VRAM, equal to the amount of memory in an entire console... shouldn't the console ports be able to run entirely on our GPUs while the rest of the hardware in the PC just sits back and watches the GPU do its thing?
LOL... yeah right. o_O
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
Please show me one game where using higher-res textures has any meaningful impact on framerates if you have enough VRAM. I have never seen more than a 1 fps difference in any game from choosing very high textures over low.

I don't see why I need to provide any examples. GPUs aren't magical, they can't process arbitrarily large textures at a fixed speed. At some point you will hit fillrate or bandwidth limits.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I don't see why I need to provide any examples. GPUs aren't magical, they can't process arbitrarily large textures at a fixed speed. At some point you will hit fillrate or bandwidth limits.
So you have no real-world proof then, just like I thought. I, on the other hand, have actually compared low-res to very-high-res textures and saw basically no performance difference.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
I don't see why I need to provide any examples. GPUs aren't magical, they can't process arbitrarily large textures at a fixed speed. At some point you will hit fillrate or bandwidth limits.
At some point, yes. The question is: where is that point?

If there is enough spare bandwidth to easily process another 5-10% of texture data, the effect should be within testing margins of error. But you may need 70-100% more VRAM to fit the complete textures to allow that. If every surface used an 8K texture, that would be a problem. Most won't, though, instead using a smaller one. The big texture lets a surface not look fuzzy as it gets very close to your viewpoint, and/or allows surfaces covering large areas of the screen to have one big texture.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
So you have no real-world proof then, just like I thought. I, on the other hand, have actually compared low-res to very-high-res textures and saw basically no performance difference.

Nm, link below.
 
Last edited:

realibrad

Lifer
Oct 18, 2013
12,337
898
126
So you have no real-world proof then, just like I thought. I, on the other hand, have actually compared low-res to very-high-res textures and saw basically no performance difference.

Here is a link that backs you up.

http://www.tweakguides.com/Crysis3_5.html

The graph shows results that are quite normal for texture resolution-related settings: there is no difference in FPS at each level of the setting. The key issue with Texture Resolution is the amount of Video RAM on your GPU that is consumed to hold textures. If you have a graphics card with a lower amount of VRAM, then you may experience periodic stuttering, or visible texture streaming, if Texture Resolution is set too high. Texture Resolution of Medium is generally a good balance of image quality and smooth performance.