PrincessFrosty
Platinum Member
Here's a graph I put together in Excel using PC games (2-3 PC games a year, mostly PC exclusives but some console ports, from 2003 to 2013). Many points overlap, so it looks like there are far fewer than there actually are:
RAM requirements have increased by a factor of 2^3 to 2^4 (8x to 16x) since 2003. However, the rate has stagnated since 2010, hovering around 4GB, likely because of "legacy" support for 32-bit OSes. Even before then, you can see that games take longer and longer between each doubling (a year at 256MB, three years at 512MB, three years at 2GB, four years at 4GB). Until a majority of computers and consoles can use more than 8GB (in the case of consoles, split between VRAM and system RAM), I don't think we'll see more than 4GB. Even the new consoles probably won't push past 5 or 6GB of RAM, since much of their shared memory pool needs to go to the GPU.
Thanks for doing that graph.
I think you're right. We've seen a steady increase in the amount of RAM we can afford to put in our PCs as the modules get bigger and cheaper, but we hit the 32-bit "wall" and stagnated. In the case of games, I think that's also partly down to the long console lifecycles and their fixed hardware.
Once the new consoles launch with 8GB of RAM (I kind of wish it was 16GB), hopefully we should see usage skyrocket. We have these huge games these days, up to 30GB on disk, and still a measly 2GB of RAM usage. With spare CPU cores, streaming media off the disk into RAM while playing should be possible, and then, when it comes to loading art assets into video memory, going straight from RAM to VRAM should make loading very fast.
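Just to illustrate the kind of thing I mean, here's a rough sketch in CUDA-style C++ (the asset file name and sizes are made up, and a real engine would do this very differently): a spare CPU thread pulls an asset off disk into pinned RAM, then one async copy moves it straight into VRAM.

// Rough sketch: a spare CPU thread streams an asset from disk into pinned
// system RAM, then the data goes straight to VRAM with one async copy.
// "asset.bin" and the 64MB size are made up for illustration.
#include <cuda_runtime.h>
#include <cstdio>
#include <fstream>
#include <thread>

int main() {
    const size_t kAssetBytes = 64 * 1024 * 1024;   // pretend the asset is 64MB

    // Pinned (page-locked) host memory lets the GPU DMA it directly,
    // without an extra staging copy.
    unsigned char* hostBuf = nullptr;
    cudaMallocHost((void**)&hostBuf, kAssetBytes);

    unsigned char* deviceBuf = nullptr;
    cudaMalloc((void**)&deviceBuf, kAssetBytes);

    // A spare CPU core streams the asset off disk into RAM while the game runs.
    std::thread loader([&] {
        std::ifstream file("asset.bin", std::ios::binary);   // hypothetical asset file
        file.read(reinterpret_cast<char*>(hostBuf), kAssetBytes);
    });

    // ... the game keeps rendering here while the loader thread works ...

    loader.join();   // asset is now sitting in system RAM

    // Straight RAM -> VRAM, overlapped with other GPU work via a stream.
    cudaStream_t stream;
    cudaStreamCreate(&stream);
    cudaMemcpyAsync(deviceBuf, hostBuf, kAssetBytes, cudaMemcpyHostToDevice, stream);
    cudaStreamSynchronize(stream);   // only wait when the asset is actually needed

    printf("asset resident in VRAM\n");

    cudaStreamDestroy(stream);
    cudaFree(deviceBuf);
    cudaFreeHost(hostBuf);
    return 0;
}

The pinned buffer is the important bit: that's what makes the RAM-to-VRAM step a single DMA transfer instead of a trip through a driver staging buffer.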
The unified memory coming with Volta in 2014 is an exciting prospect for gaming; sharing VRAM and RAM should have been done a LONG time ago.
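For anyone curious what that programming model looks like, here's a minimal sketch of managed memory as CUDA exposes it (cudaMallocManaged); the kernel and buffer are just toy examples, not how a game engine would actually use it.

// Minimal sketch of unified (managed) memory in CUDA: one pointer is visible
// to both the CPU and the GPU, and the driver migrates the data as needed,
// instead of the explicit RAM -> VRAM copy in the sketch above.
#include <cuda_runtime.h>
#include <cstdio>

__global__ void brighten(float* pixels, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) pixels[i] *= 1.1f;   // toy "work" on the shared buffer
}

int main() {
    const int n = 1 << 20;
    float* pixels = nullptr;

    // One allocation, addressable from both the CPU and the GPU.
    cudaMallocManaged((void**)&pixels, n * sizeof(float));

    for (int i = 0; i < n; ++i) pixels[i] = 0.5f;    // CPU writes it directly

    brighten<<<(n + 255) / 256, 256>>>(pixels, n);   // GPU uses the same pointer
    cudaDeviceSynchronize();                         // wait before the CPU touches it again

    printf("pixels[0] = %f\n", pixels[0]);           // CPU reads the GPU's result

    cudaFree(pixels);
    return 0;
}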
