A post that I found on the OC.net forums sums it up quite nicely:
Developers are targeting higher VRAM specs not because the texture resolutions are so much higher and more detailed, but because they are using VRAM as a storage cache in an effort to circumvent proper memory-management optimization on the PC, a NUMA platform.
Watch Dogs is the perfect example. Before the game even launched, I was worried because I had heard that the Disrupt engine would be an amalgamation of AnvilNext (from the Assassin's Creed series) and the Dunia engine (from the Far Cry series). Both engines, in my experience, relied HEAVILY on using VRAM as a storage cache and used very little system memory. AC IV, for instance, uses less than 700 MB of RAM on the highest settings, despite being a very large game.
And sure enough, the low system-memory utilization and overt reliance on VRAM for storage made their way from those engines into the Disrupt engine. That is what causes the pervasive stuttering problem we see in Watch Dogs today: when VRAM swapping takes place, textures have to be pulled from the HDD/SSD rather than from system memory, which is much faster (the game itself uses around 2 GB of RAM on ultra settings as reported by Resource Monitor, a small amount for a big 64-bit game).
So basically, developers are either just lazy, or they want to save on development time and costs by relying on the PC's brute force rather than on smarter and more effective optimization techniques. I'd say it's more the latter.
That's not to say there is anything wrong with using VRAM as a cache. But relying on it so extensively can be misleading, as with Shadow of Mordor, where the VRAM requirements are highly inflated; or worse, it can be very detrimental to performance, as with Watch Dogs.
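The stutter mechanism described above is easiest to picture as a tiered cache: a texture that isn't resident in VRAM has to be fetched from system RAM, and if the engine never staged a copy there, from disk, which is orders of magnitude slower. Here is a minimal Python sketch of that hierarchy; the class, the names, and the latency figures are illustrative assumptions, not measurements from any engine:

```python
# Illustrative three-tier texture fetch: VRAM -> system RAM -> disk.
# Latency figures are rough orders of magnitude, not measurements.

TIER_LATENCY_US = {"vram": 1, "sysram": 10, "disk": 10_000}

class TextureStreamer:
    def __init__(self, vram_budget_mb):
        self.vram_budget_mb = vram_budget_mb
        self.vram = {}       # texture_id -> size_mb, resident on the GPU
        self.sysram = set()  # textures with a staged copy in system RAM
        self.vram_used_mb = 0

    def fetch(self, tex_id, size_mb):
        """Return the simulated cost, in microseconds, of making a texture resident."""
        if tex_id in self.vram:
            return TIER_LATENCY_US["vram"]      # hit: already on the GPU
        # Miss: evict oldest textures until the new one fits (crude FIFO).
        while self.vram_used_mb + size_mb > self.vram_budget_mb and self.vram:
            victim, victim_size = next(iter(self.vram.items()))
            del self.vram[victim]
            self.vram_used_mb -= victim_size
        self.vram[tex_id] = size_mb
        self.vram_used_mb += size_mb
        if tex_id in self.sysram:
            return TIER_LATENCY_US["sysram"]    # staged in RAM: cheap upload
        self.sysram.add(tex_id)                 # worst case: read from disk first
        return TIER_LATENCY_US["disk"]          # this is the frame-time spike

streamer = TextureStreamer(vram_budget_mb=2048)
print(streamer.fetch("street_albedo", 64))      # 10000 us: cold, pulled from disk
print(streamer.fetch("street_albedo", 64))      # 1 us: VRAM hit
```

An engine that stages its working set in system RAM pays the cheap middle-tier cost on a VRAM miss; one that treats VRAM as its only cache pays the disk-tier cost on every miss, and that spike is the stutter.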
Source

User RagingCain:
I wish people in here would not talk about a subject they do not know about.
People who know better see this as the equivalent of Windows 8 saying it needs a minimum of 4 GB of system RAM, then Windows 9 saying it needs a minimum of 12 GB, while doing the exact same thing.
The reason we don't need 6 GB of VRAM for 1080p is the simple fact that the PC is a NON-UNIFIED memory architecture. They are using the VRAM for plain storage because they have to on consoles. Anybody who doesn't understand that needs to basically stop talking when we complain, because we are trying to get them NOT to do this; we aren't bloody consoles, and we generally have anywhere from 8 GB to 32 GB of accessible RAM, plus pagefiles, SSDs, PCIe SSD cards, and so on.
The main system RAM is for secondary storage and caching, not VRAM. Period. End of discussion.
1920x1080 uses about 256 MB of framebuffer with 4xAA and double buffering, or, to be fair, about 512 MB with all post-processing included. The majority of what remains is up to the developer and the drivers to use.
Memory usage is not linear; it does not go up every freaking year. When the resolution stays the same, there is only so much memory usage can increase without dumping EXTRA crap into VRAM, so it isn't suddenly time to need "2GB+" for 1080p. For comparison's sake, a 2 GB frame buffer comes out to a 128-megapixel image per frame (at 16 bytes per pixel); the 16:9 resolution needed to produce that is roughly 15,400x8,700.
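A quick back-of-the-envelope check of those figures, in Python (the bytes-per-pixel and sample-count assumptions are ours; the post does not spell them out):

```python
# Back-of-the-envelope framebuffer math for 1920x1080.
# Assumptions (ours, not the poster's): 32-bit RGBA color,
# 4xAA stored as 4 color samples per pixel, double buffering.

BYTES_PER_PIXEL = 4
MSAA_SAMPLES = 4
BUFFERS = 2

pixels = 1920 * 1080                                   # ~2.07 million pixels
color_bytes = pixels * BYTES_PER_PIXEL * MSAA_SAMPLES * BUFFERS
print(color_bytes / 2**20)                             # ~63 MiB of raw color buffers

# The 2 GB comparison: at 16 bytes per pixel (4 samples x 4 bytes each),
# 2 GiB holds a 128-megapixel frame.
pixels_in_2gb = 2 * 2**30 // 16
print(pixels_in_2gb / 2**20)                           # 128.0 megapixels

# The 16:9 resolution with that pixel count:
width = round((pixels_in_2gb * 16 / 9) ** 0.5)
print(width, round(width * 9 / 16))                    # ~15447 x 8689
```

Depth/stencil buffers and intermediate render targets are what push the raw ~63 MiB of color buffers toward the 256 to 512 MB the poster allows for; either way, the frame itself is nowhere near 2 GB.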
When publishers tell developers to do this for the PC, they:
1.) Don't have to optimize for a NUMA architecture.
2.) Don't have to prioritize assets.
3.) Don't have to write efficient rendering methods.
A 780 Ti 6 GB or Titan 6 GB is a waste of money; you are paying for extra storage of cached extras. These GPUs are incapable of rendering a frame buffer that even remotely fills that much memory, let alone the crappy GPUs in these consoles.
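Asset prioritization, point 2 in the list above, is the kind of work being skipped. Instead of parking every texture in VRAM, an engine can rank assets by how much detail the current view actually needs and keep only that resident. A toy Python sketch of the idea follows; the distances, sizes, budget, and mip heuristic are all invented for illustration:

```python
import math

# Toy asset prioritization: keep only the mip level each asset needs,
# nearest assets first. All values here are made up for illustration.

def required_mip(distance_m):
    """Farther objects can use smaller mips; each mip halves the resolution."""
    return max(0, int(math.log2(max(distance_m, 1.0))))

def mip_size_mb(base_mb, mip):
    return base_mb / (4 ** mip)    # each mip level is a quarter the size

# (name, distance from camera in meters, full-resolution size in MB)
assets = [("hero_npc", 1.5, 64.0), ("building", 40.0, 64.0), ("far_hills", 900.0, 64.0)]

budget_mb, resident = 128.0, []
for name, dist, base_mb in sorted(assets, key=lambda a: a[1]):  # nearest first
    mip = required_mip(dist)
    size = mip_size_mb(base_mb, mip)
    if size <= budget_mb:          # claim VRAM at the mip the view needs
        budget_mb -= size
        resident.append((name, mip, round(size, 3)))

print(resident)  # full-res (mip 0) up close, tiny mips in the distance
```

Under a scheme like this, a modest VRAM budget holds all the detail the player can actually see; caching everything regardless of need is, on the posters' argument, what inflates the requirement to 4 or 6 GB.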