Evil Within - mandatory 4GB vRAM requirement

Not sure why you are shocked.

We saw this coming with next-gen consoles having ~6GB of available VRAM, which is a huge leap from the older consoles. Of course any AAA dev worth his salt would take advantage of that and raise the bar in IQ.

Take an old game like Skyrim: it looks ugly by current standards, but add in 4K texture mods and shader/lighting/shadow mods and it looks great. You can achieve a lot if you're given the resources to do so. In fact, it's a shame that AAA games don't do this themselves by shipping a proper "Ultra" setting.
 
Not sure why you are shocked.

We saw this coming with next-gen consoles having ~6GB of available VRAM, which is a huge leap from the older consoles. Of course any AAA dev worth his salt would take advantage of that and raise the bar in IQ.

Take an old game like Skyrim: it looks ugly by current standards, but add in 4K texture mods and shader/lighting/shadow mods and it looks great. You can achieve a lot if you're given the resources to do so. In fact, it's a shame that AAA games don't do this themselves by shipping a proper "Ultra" setting.
Um, people are shocked because it's listed as mandatory, and probably only about 10% of people even have 4GB of VRAM.
 
If a 780 Ti can't run it on Ultra purely because it's missing 1GB of vRAM, then that is rubbish.

High-res textures take up a lot of space in VRAM. A 2048 x 2048 texture = 16MB of VRAM in its raw form. I'm sure compression is used to lower that, but if they went with 4K textures (~67MB per object texture), that is a massive leap in VRAM requirements, especially if their levels are large, as with Watch Dogs.
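The arithmetic above can be sanity-checked with a quick sketch. This assumes 32-bit RGBA (4 bytes per texel) and ignores mipmaps and compression, so the numbers are an upper bound for a single uncompressed texture:

```python
# Back-of-envelope raw texture size, assuming 4 bytes per texel (32-bit RGBA),
# no mipmaps, no compression. Real engines use block compression (e.g. DXT/BC),
# which typically cuts this 4-8x, plus roughly a third extra for a mip chain.
def raw_texture_mb(width, height, bytes_per_texel=4):
    return width * height * bytes_per_texel / (1024 * 1024)

print(raw_texture_mb(2048, 2048))  # 16.0 -> the ~16MB figure above
print(raw_texture_mb(4096, 4096))  # 64.0 MiB (~67MB in decimal megabytes)
```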

Some people did not believe the 1GB VRAM advantage of the R9 290/290X vs the 780/Ti would ever come into play; they are clearly wrong, and future releases will prove them comprehensively wrong. A few years from now, CF R9 290s will still handle many AAA games on "Ultra" whereas SLI 780/Ti will have to turn it down. :/
 
This is really weird. From the comments on the official blog post:

erfan said on Thursday, September 25, 2014 at 4:11 pm:

why 4gb VRAM ? can i run it with 2gb Vram or no?

gstaff (Bethesda) said on Thursday, September 25, 2014 at 6:28 pm:

You really should have 4 GB of VRAM to run it.

EDIT: Talking further with folks here, it sounds like you might be able to play with under 4 GB VRAM, but it’s still not recommended.

Bob Loblaw said on Thursday, September 25, 2014 at 8:17 pm:

That’s not a direct answer… is it playable on 2GB cards? Even if only at low settings?

gstaff (Bethesda) said on Thursday, September 25, 2014 at 8:24 pm:

Talking with folks on the team, you can give it a go with 2 GB VRAM, but it’s definitely not recommended. As mentioned in the blog post, we’re not posting minimum requirements, because we’re looking to share requirements that show the game the way it was meant to be played.

So... this game has customizable graphics options, right? What's the point of lowering my graphics settings if I still have to use 4GB of VRAM regardless? I'm a little confused. It still feels like there's some sort of miscommunication here, or there's something they're hiding.

Edit: Texture quality is the primary factor in VRAM requirements (I think; correct me if I'm wrong), so if the game has any sort of texture-quality option at all you should be able to play it just fine with a 2GB card. This is some exceptionally poor communication on Bethesda's part.
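As a rough illustration of why a texture-quality option matters (the numbers below are purely hypothetical, not from the game): each quality step typically halves the texture edge length, which quarters the raw texture memory, so even one step down frees a huge chunk of VRAM:

```python
# Hypothetical texture budgets: each quality step down halves the texture
# edge length, quartering raw texture memory. Illustrative numbers only;
# other VRAM consumers (render targets, geometry) are not modeled here.
def texture_budget_mb(ultra_budget_mb, steps_down):
    return ultra_budget_mb / (4 ** steps_down)

ultra = 3000  # pretend "Ultra" textures alone need ~3GB
print(texture_budget_mb(ultra, 1))  # 750.0 ("High")
print(texture_budget_mb(ultra, 2))  # 187.5 ("Medium")
```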
 
It can't be possible that 4GB of VRAM is required for 1080p or 1440p; maybe they are recommending it for 4K.

Games like Shadow of Mordor, Lords of the Fallen, Alien Isolation and Ryse don't require 4GB of VRAM, and in fact they have 10x better graphics. This shows why OpenGL is a failure for AAA games.
 
Bethesda: Awesome games, tons of bugs.

You're thinking of Bethesda Game Studios, of Elder Scrolls/Fallout fame. This isn't being made by them, it's being made by Tango Gameworks in Japan. Bethesda is only publishing it.
 
So much for the great advances and easy porting between consoles and PC. It seems like developers are just becoming even lazier about optimizing for PC, even though the platforms are more similar than in the past. Maybe the developers were right that a console can give image quality close to a high-end PC. Ironically, though, I think it will be because the games are lazy ports terribly optimized for PC, not because the consoles are so great.

I have an HD 7770 and might consider upgrading to a 2GB card, but at current prices there is no game on the horizon that I would want to play badly enough to pay the price of a 4GB GPU. These high VRAM requirements are even more troubling than the steeply increasing CPU requirements. At least there is some hope that DX12 will hold down the CPU requirements, but I don't know what can be done about the VRAM requirements.

Guess I will just have to stick to older games. The only games coming out now that really interest me anyway are Borderlands: The Pre-Sequel and Dragon Age: Inquisition.
 
Just because a game asks for 4GB of video RAM doesn't mean it is automatically a crap console port. I'd wait to see the reviews and/or play the game before deciding whether it's crap and unoptimised.

Personally I don't think this is a one-off, and quite a few newer games are going to require 3GB+ of video RAM.
 
It can't be possible that 4GB of VRAM is required for 1080p or 1440p; maybe they are recommending it for 4K.

Games like Shadow of Mordor, Lords of the Fallen, Alien Isolation and Ryse don't require 4GB of VRAM, and in fact they have 10x better graphics. This shows why OpenGL is a failure for AAA games.
Whether it's OpenGL or DX changes nothing, and what other games use or need changes nothing either. Even 1080p changes nothing: if textures and meshes are detailed enough that they rarely get more than one screen pixel per polygon or per texel, VRAM use can go pretty high even at 1080p. Whether that is a good idea, however, is another matter...

I doubt there will be issues; rather, I suspect this stuff was written by marketing people, and those people are not the ones actually handling it.

If it does run like crap without 4GB of VRAM, though, it will be a failure in the market, because there just aren't enough people with 4GB cards, and not enough cheap 4GB cards either.
 
Just because a game asks for 4GB of video RAM doesn't mean it is automatically a crap console port. I'd wait to see the reviews and/or play the game before deciding whether it's crap and unoptimised.

Personally I don't think this is a one-off, and quite a few newer games are going to require 3GB+ of video RAM.

If the game scales well and can be played on, say, 2GB, I can accept that. But a game that isn't programmed well enough to scale below 4GB of VRAM (as the guys in this quote seem to be saying) is, if not crap, at least stupid, because it eliminates the vast majority of potential customers.

As an example, look at Frostbite. It looks great on a high-end system but still scales well to lesser hardware. A game had better be freaking phenomenal to justify requiring 4GB of VRAM to play it. I could accept that kind of requirement for Ultra quality, but these guys seemed to be saying it takes 4GB of VRAM to even play at decent settings, although as someone else said, they didn't seem to be that knowledgeable about what it really takes to run the game.
 
After Watch Dogs I am not that surprised. As more developers start targeting next-gen consoles, they have less incentive to optimize for lower VRAM, since next-gen consoles give them 4.5-5GB available. Granted, since the 7970 3GB and 670/680 4GB came out in early 2012, it's been more than 2.5 years. I hope at Ultra settings this game pushes the envelope enough to warrant 4GB. I am sure at medium and high texture settings you could get away with 2-3GB.

We shouldn't complain too much, though. It's good to see developers moving forward; otherwise the extra VRAM on cards like the 290/970/980 and next gen (390X/GM200) would be wasted even though gamers paid for it. There will be a way to adjust settings to lower the VRAM requirements, but the suggestion here is that if you want to max the game out, you want 4GB.

I kind of expected 670/680 2GB to be the next "paperweight" cards, just as the 470/480/570/580 1.5GB all suffered the same fate. Then again, many of these gamers will have an even greater incentive to upgrade to Maxwell, which keeps the industry healthy versus the eight-year PS360 console stagnation period.
 
After Watch Dogs I am not that surprised. As more developers start targeting next-gen consoles, they have less incentive to optimize for lower VRAM, since next-gen consoles give them 4.5-5GB available. Granted, since the 7970 3GB and 670/680 4GB came out in early 2012, it's been more than 2.5 years. I hope at Ultra settings this game pushes the envelope enough to warrant 4GB. I am sure at medium and high texture settings you could get away with 2-3GB.

The consoles' memory is shared between the CPU and GPU, though. Still, I agree with the overall gist of your argument.

With games getting so much bigger and seamlessness being the trend, a 2GB frame buffer is no longer enough to enjoy the highest texture resolutions.
 
That's nothing.

[Image: Shadow of Mordor system requirements]
Mordor has a requirement of 6 GB VRAM for ultra textures, and that's just for 1080p. Though this is with the optional texture pack apparently.
 
I'm skeptical. It's either a really bad port or the person posting for Bethsoft is an idiot who thinks his "hard drive box" needs 4 gigamabits of that video stuff on the card RAM part.
 