VRAM requirements... I don't buy it for a second

moonbogg

Lifer
Jan 8, 2011
10,669
3,200
136
I was playing Crysis 3 last night and that game looks totally stunning, better than any new game I have seen or played myself. It looks better than Far Cry 4, GTA V and many other games, yet Crysis 3 only uses about 2.5GB of VRAM at 1440p, fully maxed with 4xAA. How is that possible? Some would say it's because it's not an open world game, but that's BS. Far Cry 3 looks almost the same as FC4, and it uses about half the VRAM that FC4 does.
As new GPUs are released, games seem to require right up to the limit of the VRAM those new GPUs offer, regardless of graphical fidelity. Why do VRAM requirements keep increasing in parallel with new GPU VRAM amounts?
Crysis 3 was released in February 2013 and pushed just past the limits of the 2GB GTX 680 while staying just within the limits of the 3GB 7970. If you had the money and wanted the game to run at high res with no VRAM concerns, you could buy the GTX Titan, which coincidentally launched the same day as Crysis 3.
There are exceptions though, such as the COD games. Every COD game can still be maxed on an 8800GT and a Core 2 Duo. And if one can't, it's because they don't want you to have it like that, just for fun and just to troll customers into buying new hardware for the lolz. All COD games still look like COD 4.
So why the inflated, ridiculous VRAM requirements?
 

NTMBK

Lifer
Nov 14, 2011
10,378
5,523
136
It's because it's not an open world game. Open world games need more generalised, pessimistic caching algorithms.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Witcher 3 doesn't use a lot.

I think many games use it just because it's there; you can always ship higher quality textures, even if they only get you slightly better image quality.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
I agree about the COD games; I own some of the series, so I can certainly agree with the OP.

Well, disasters like COD: AW require double what games like COD: BO2 did. A half-decent quad core and a 1GB GTX 650 can run BO2 with ease, with the majority of settings enabled, at over 80fps at 1080p.

The older games up to BO2 run fine on a 9500GT if you drop to 800x600 low. I know this from experience.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
^ What he said.

Fallout 4 will use every last byte.

That's not an excuse. FO4 isn't loading high resolution textures, shadows or details. Everything in it is primitive, last gen. Even the shadows on houses get loaded at the last minute, per the video in this overview:
http://www.computerbase.de/2015-11/fallout-4-benchmarks/

Open world is not the only explanation for why VRAM requirements have skyrocketed with nothing to show for it, because other games besides FO4 suffer from this issue: Black Ops 3, Advanced Warfare, Titanfall, Dead Rising 3, Batman AK, Mortal Kombat X, etc. FO4 is a perfect example of an outdated game engine that is also an unoptimized pile. The game looks horrible and runs like a turd even on a 980Ti/Fury X given its level of graphics. A single PS4 patch improved frame rates 40-50%, with no changes to GPU drivers or hardware. When a game is that poorly optimized at launch, it's not plausible that it's well-optimized in any other area, including VRAM usage. So a game's open-world nature isn't a blanket excuse when some open world games are clearly plagued by a lack of proper optimization:
https://www.youtube.com/watch?v=oL5-nYt2Rvw

I was playing Crysis 3 last night and that game looks totally stunning, better than any new game I have seen or played myself. It looks better than Far Cry 4, GTA V and many other games, yet Crysis 3 only uses about 2.5GB of VRAM at 1440p, fully maxed with 4xAA. How is that possible?

That's because some developers know how to optimize PC games and use cutting-edge, efficient game engines, while others use bloated, horribly optimized engines and on top of that couldn't optimize a game if their lives depended on it; or, most likely, they're never given the chance, because the publisher forces the game out early to hit quarterly earnings/holidays.

BTW, play SW:BF, I think it looks better than Crysis 3 and runs even better.

https://www.youtube.com/watch?v=6XCHBrkBc8E
https://www.youtube.com/watch?v=KtZd7mOS8bI

There is no doubt in my mind if Crytek or DICE made games like FO4 or Black Ops 3 they would look way better using the same hardware we have today.

The other explanation is that some console to PC ports or PC games made to cater to consoles too are just very poorly optimized with memory leaks - Watch Dogs, Black Ops 3, Titanfall, Batman AK, AC Unity/Syndicate all come to mind.

Expect VRAM requirements to increase even more, because we are now in the 3rd year of this console generation and developers have 4.5-5GB of memory available for games on those consoles. Most AAA games can only exist if they sell on both consoles and PC, which means AAA developers will be targeting the XB1/PS4 as the primary platforms. Since these consoles have weak CPUs/GPUs, developers are going to use VRAM/virtual memory tricks as one of the main ways to get performance up to par.

And it won't stop there: even Crytek has mentioned that the 8GB of RAM in the current consoles will become a bottleneck over time:
http://www.escapistmagazine.com/new...-Limiting-Factor-For-PS4-Xbox-One-Development

Thankfully, 16GB of RAM can be purchased for $70-100, and next gen GPUs should start to offer 6-8GB of VRAM in the $300-550 price brackets.
 

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
Fallout 4 is rather badly programmed. Boris Vorontsov, the guy behind the ENB Series, has been fighting with Bethesda's renderers for a while now. He found that Fallout 4 uses six render targets (i.e., six full-size copies of the current frame) when three would do just fine.

That right there ain't helping the VRAM. Then when you factor in the needlessly large textures, you're gonna want every GB you can get.
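For a rough sense of scale, here's a back-of-the-envelope sketch of what those extra render targets cost. The 1440p resolution and the 8-byte RGBA16F format are assumptions on my part; Vorontsov's findings don't specify either:

[code]
#include <cstdio>

int main() {
    // Assumed numbers, for illustration only: 1440p targets in an
    // 8-byte-per-pixel format (e.g. RGBA16F, common for G-buffers).
    const int w = 2560, h = 1440;
    const int bytesPerPixel = 8;
    const double mib = 1024.0 * 1024.0;

    const double one = w * h * bytesPerPixel / mib;
    printf("one target:    %6.1f MiB\n", one);      // ~ 28.1 MiB
    printf("three targets: %6.1f MiB\n", 3 * one);  // ~ 84.4 MiB
    printf("six targets:   %6.1f MiB\n", 6 * one);  // ~168.8 MiB
}
[/code]

So the three redundant targets waste something like 85MiB at 1440p under these assumptions. Real waste, but it also suggests the oversized textures are the bigger share of the bloat.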
 

zlatan

Senior member
Mar 15, 2011
580
291
136
The problem is WDDM 1.x. It is impossible to fit everything into VRAM with today's graphics complexity, so many engines use streaming, which is a really good option on the consoles but extremely hard to optimize on the PC. Even if there is a lot of unused data in VRAM, it is not recommended to delete it anyway, because WDDM 1.x does a lot of resource checking to ensure that it is safe to delete the selected allocation, and this leads to stuttering. Today's streaming algorithms just use all the available VRAM, because optimizing them for better resource allocation is simply too expensive.
Explicit memory management in D3D12 and Vulkan will change this, alongside WDDM 2.0. In these models, deleting an allocation will be much, much faster and won't lead to stuttering.
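To make the "explicit" part concrete: under D3D12 the app, not the driver, decides when an allocation is provably dead, so freeing it becomes a fence read plus a Release(). Here's a minimal sketch of the deferred-deletion pattern D3D12 engines typically use; the DeferredDeleter type and its member names are my own illustration, not from any particular engine:

[code]
#include <d3d12.h>
#include <wrl/client.h>
#include <deque>
#include <utility>

using Microsoft::WRL::ComPtr;

// Illustrative helper: defer Release() of a resource until the GPU has
// passed the fence value that was current when the resource was retired.
struct DeferredDeleter {
    ComPtr<ID3D12Fence> fence;  // signaled by the command queue once per frame
    std::deque<std::pair<UINT64, ComPtr<ID3D12Resource>>> retired;

    // Called when the engine no longer needs a resource.
    void Retire(ComPtr<ID3D12Resource> res, UINT64 fenceValueWhenRetired) {
        retired.emplace_back(fenceValueWhenRetired, std::move(res));
    }

    // Called once per frame: free everything the GPU is provably done with.
    // No driver-side safety checks, no stall -- just a fence read.
    void Collect() {
        const UINT64 completed = fence->GetCompletedValue();
        while (!retired.empty() && retired.front().first <= completed)
            retired.pop_front();  // ComPtr drops the allocation here
    }
};
[/code]

Under WDDM 1.x the driver has to do the equivalent bookkeeping itself, conservatively, for every allocation, which is where the stutter comes from.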
 

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76
My 980 Ti playing Fallout 4 at 3840x2160 (4K res) with Ultra Preset (Ultra on everything, Godrays on High) uses up about 3.5GB-4GB of VRAM.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
That right there ain't helping the VRAM. Then when you factor in the needlessly large textures, you're gonna want every GB you can get.

FO4 isn't even the worst offender. Look at F1 2015: you can't play it maxed out at 1080p with less than 3GB of VRAM:

http://www.gamegpu.ru/images/stories/Test_GPU/Simulator/F1_2015_/test/f1_1920.jpg

http://www.gamegpu.ru/images/stories/Test_GPU/Simulator/F1_2015_/test/f1_vram.jpg


Mortal Kombat X.
http://www.gamegpu.ru/images/stories/Test_GPU/Action/Mortal_Kombat_X_/test/mkx_2560.jpg


Those aren't even open world games.

To play devil's advocate, GPUs are way behind the curve with respect to VRAM.

Nov 2005: Xbox 360 = 512MB
June 2008:
GTX 280 = 1GB
GTX 260 = 896MB

Nov 2013: PS4/XB1 = 8GB
Summer 2016: by the same ratio we should have GPUs with 12-16GB of VRAM. :D

That means the programmers are not entirely to blame. In relative terms, the VRAM on our GPUs is way behind. Even the $199 HD 4850 launched with 512MB of VRAM, easily matching the Xbox 360/PS3.
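Spelling that ratio out (trivial arithmetic, using the launch numbers above):

[code]
#include <cstdio>

int main() {
    // Last gen: ~2.5 years after the 512MB Xbox 360, the GTX 280 shipped with 1GB.
    const double lastGenRatio = 1024.0 / 512.0;  // GPU VRAM vs. console memory = 2x
    // The same ratio applied to this gen's 8GB consoles:
    printf("expected ~2016 GPU VRAM: %.0f GB\n", 8 * lastGenRatio);  // 16 GB
}
[/code]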

Are we going to have a $199-249 GPU in 2016 with 8GB of memory? The R9 390 is the only card that comes close.
 

skipsneeky2

Diamond Member
May 21, 2011
5,035
1
71
All my cards have usually run out of my desired 60+ fps minimums before VRAM even became an issue.

The only exception was my brief ownership of a 320MB 8800 GTS. CS:GO, BF4 and COD BO2 would certainly be held back by the 320MB if I still had it; BF4 in particular wouldn't work well on anything less than a 512MB card.
 

Mercennarius

Senior member
Oct 28, 2015
466
84
91
Just for reference, I ran the Ashes of the Singularity benchmark with the latest update in both DX11 and DX12, and this is what GPU-Z recorded:

DX11 Maximum VRAM: 4,067MB
DX12 Maximum VRAM: 4,338MB
 

moonbogg

Lifer
Jan 8, 2011
10,669
3,200
136
Just for reference, I ran the Ashes of the Singularity benchmark with the latest update in both DX11 and DX12, and this is what GPU-Z recorded:

DX11 Maximum VRAM: 4,067MB
DX12 Maximum VRAM: 4,338MB

Holy crap, for an RTS?
 

Mondozei

Golden Member
Jul 7, 2013
1,043
41
86
Expect VRAM requirements to increase even more, because we are now in the 3rd year of this console generation and developers have 4.5-5GB of memory available for games on those consoles. Most AAA games can only exist if they sell on both consoles and PC, which means AAA developers will be targeting the XB1/PS4 as the primary platforms. Since these consoles have weak CPUs/GPUs, developers are going to use VRAM/virtual memory tricks as one of the main ways to get performance up to par.


If there's a take-away from the thread, it's this.
 

boozzer

Golden Member
Jan 12, 2012
1,549
18
81
Fallout 4 is rather badly programmed. Boris Vorontsov, the guy behind the ENB Series, has been fighting with Bethesda's renderers for a while now. He found that Fallout 4 uses six render targets (i.e., six full-size copies of the current frame) when three would do just fine.

That right there ain't helping the VRAM. Then when you factor in the needlessly large textures, you're gonna want every GB you can get.
That reads like it's deliberate. Bethesda did it to up the hardware requirements. In collusion with hardware vendors?

At least now we know why the game can look like a piece of turd and still have sky-high requirements. It uses essentially 200% of what it needs. Now everything makes sense.
 

psolord

Platinum Member
Sep 16, 2009
2,095
1,235
136
The problem is WDDM 1.x. It is impossible to fit everything into VRAM with today's graphics complexity, so many engines use streaming, which is a really good option on the consoles but extremely hard to optimize on the PC. Even if there is a lot of unused data in VRAM, it is not recommended to delete it anyway, because WDDM 1.x does a lot of resource checking to ensure that it is safe to delete the selected allocation, and this leads to stuttering. Today's streaming algorithms just use all the available VRAM, because optimizing them for better resource allocation is simply too expensive.
Explicit memory management in D3D12 and Vulkan will change this, alongside WDDM 2.0. In these models, deleting an allocation will be much, much faster and won't lead to stuttering.

good post

thanks
 

Mercennarius

Senior member
Oct 28, 2015
466
84
91
Holy crap, for an RTS?

Yes. This benchmark has significantly higher scale and graphical fidelity than any other RTS I know of. The game is also still in early access and not fully optimized. But it still shows how much VRAM next gen games can utilize.
 

Innokentij

Senior member
Jan 14, 2014
237
7
81
Because since the PS4/X1 arrived, devs have built new game engines that use all of the GPU, its RAM and its power, because the CPUs sitting in these consoles are low-powered AMD cores on par with iPhones. TL;DR: lazy coding. Companies that still make good engines, off the top of my head: Rockstar, DICE and Codemasters.
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
The problem is WDDM 1.x. It is impossible to fit everything into VRAM with today's graphics complexity, so many engines use streaming, which is a really good option on the consoles but extremely hard to optimize on the PC. Even if there is a lot of unused data in VRAM, it is not recommended to delete it anyway, because WDDM 1.x does a lot of resource checking to ensure that it is safe to delete the selected allocation, and this leads to stuttering. Today's streaming algorithms just use all the available VRAM, because optimizing them for better resource allocation is simply too expensive.
Explicit memory management in D3D12 and Vulkan will change this, alongside WDDM 2.0. In these models, deleting an allocation will be much, much faster and won't lead to stuttering.
^This.
There was an AMD presentation where they explained all of that in good detail.
Texture streaming is what mucks up all recent games; if devs switch to DX12, VRAM needs will drop dramatically, because getting textures onto the GPU will no longer force it to stop rendering.
https://www.youtube.com/watch?feature=player_detailpage&v=H1L4iLIU9xU#t=941
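The mechanism behind that claim, roughly: D3D12 exposes a dedicated copy queue, so texture uploads can run in parallel with rendering instead of serializing against it. A minimal sketch of the idea; the function and parameter names are mine, and device/resource creation plus error handling are omitted:

[code]
#include <d3d12.h>

// Illustrative sketch: record a texture upload on a dedicated COPY queue so
// the graphics queue keeps rendering while the transfer is in flight.
void StreamTextureAsync(ID3D12Device* device,
                        ID3D12CommandQueue* copyQueue,       // D3D12_COMMAND_LIST_TYPE_COPY
                        ID3D12GraphicsCommandList* copyList, // from a COPY allocator
                        ID3D12Resource* uploadBuffer,        // CPU-visible staging memory
                        ID3D12Resource* texture,             // DEFAULT-heap destination
                        ID3D12Fence* fence, UINT64 fenceValue)
{
    // Describe how the texture's mip 0 is laid out in the staging buffer.
    D3D12_RESOURCE_DESC texDesc = texture->GetDesc();
    D3D12_TEXTURE_COPY_LOCATION src = {};
    src.pResource = uploadBuffer;
    src.Type = D3D12_TEXTURE_COPY_TYPE_PLACED_FOOTPRINT;
    device->GetCopyableFootprints(&texDesc, 0, 1, 0,
                                  &src.PlacedFootprint, nullptr, nullptr, nullptr);

    // Destination: mip 0 of the GPU-local texture.
    D3D12_TEXTURE_COPY_LOCATION dst = {};
    dst.pResource = texture;
    dst.Type = D3D12_TEXTURE_COPY_TYPE_SUBRESOURCE_INDEX;
    dst.SubresourceIndex = 0;

    copyList->CopyTextureRegion(&dst, 0, 0, 0, &src, nullptr);
    copyList->Close();

    ID3D12CommandList* lists[] = { copyList };
    copyQueue->ExecuteCommandLists(1, lists);

    // The graphics queue does a Wait(fence, fenceValue) before first sampling
    // the texture; until then it keeps rendering uninterrupted.
    copyQueue->Signal(fence, fenceValue);
}
[/code]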