I actually haven't checked RAM usage of games lately, but if this is true that's really cool!
It's about time modern games started making use of more system memory. The 4 GB cap in 32-bit x86 OSes (2^32 bytes of address space) caused a massive delay in the adoption of more than 4 GB; these days, with gamers running 16-32 GB, it's about time we put it to some good use.
I noticed when playing this game that you can zip across the map at high speed and there are no discernible loading times or streaming issues with 16 GB of RAM and a fast SSD, which is super cool.
I bet that's including the pagefile. There's no way all of that is system memory.
Far Cry 4, according to the GameGPU review, uses around 1.2 GB of RAM for the game itself. AC Unity, from my own tests, will use around 2.5 GB of RAM.
Ubisoft engines have always been reluctant to use system memory, which probably explains why stuttering is so prevalent in their games.
It is, look at the screenshots. Looks like the 1.5 patch more or less fixed the stuttering, but it leaks memory like a sieve and is unstable. Unity seems about right. And really, reviews are only an indication. They don't play games the whole way through, across multiple patches and/or driver versions. Playing a tiny cross-section doesn't necessarily prove GameGPU is correct.
How do you get the console readout in your HUD? The top-left part with the resource stats?
What is it actually using? Does anyone understand exactly how the game engine works? Is it buffering lots of texture data in system RAM that isn't stored in the video card's own RAM? If it's just loading textures into RAM to have them ready and make all loads happen faster, then I'm not impressed... but I have no clue how this engine, let alone game engines in general, works.
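For what it's worth, here's roughly what "staging textures in system RAM" could look like. This is a made-up sketch of the general technique, not this game's actual engine code; RamTextureCache, loadFromDisk, and the sizes are all assumptions for illustration:

```cpp
#include <cstdint>
#include <iostream>
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical sketch of a system-RAM texture staging cache.
// Real engines stream compressed texture data asynchronously; this
// just shows the basic idea: keep texture bytes resident in RAM so
// VRAM uploads never have to wait on the disk.
class RamTextureCache {
public:
    // Returns the texture bytes, reading from "disk" only on a cache miss.
    const std::vector<uint8_t>& get(const std::string& name) {
        auto it = cache_.find(name);
        if (it != cache_.end()) {
            return it->second;  // hit: already staged in system RAM
        }
        // Miss: do the slow disk read once, then keep the result resident.
        std::vector<uint8_t> data = loadFromDisk(name);
        return cache_.emplace(name, std::move(data)).first->second;
    }

    // Total bytes held in system RAM by the cache.
    size_t bytesResident() const {
        size_t total = 0;
        for (const auto& [name, data] : cache_) total += data.size();
        return total;
    }

private:
    // Stand-in for reading and decompressing a texture file.
    std::vector<uint8_t> loadFromDisk(const std::string& name) {
        std::cout << "disk read: " << name << "\n";
        return std::vector<uint8_t>(4 * 1024 * 1024);  // fake 4 MB texture
    }

    std::unordered_map<std::string, std::vector<uint8_t>> cache_;
};

int main() {
    RamTextureCache cache;
    // First request hits the disk; repeats are served from system RAM,
    // which is why RAM usage grows as you move around the map.
    cache.get("rock_diffuse");
    cache.get("rock_diffuse");  // no disk read this time
    cache.get("grass_normal");
    std::cout << "RAM staged: " << cache.bytesResident() / (1024 * 1024)
              << " MB\n";
}
```

If the engine is doing something like that, high RAM usage is just a prefetch cache, which would also explain the lack of streaming hitches people are reporting.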
