I believe that is unique to this one game. The devs may not have taken into account many gigabytes of mod textures causing such issues. And even then, it may not strictly be a hardware issue (lack of vidmem) but one of poor game coding that doesn't efficiently handle the loading or culling of textures. After a particularly demanding scene, i.e. outdoors with lots of detailed vegetation, dense clouds, streaming water etc., if you go indoors to simpler scenes, does the vidmem usage drop? If it doesn't, that may be a clue that bad coding is involved (a quick way to check this yourself is sketched below). The game's code serves the vanilla game well without exaggerated texture mods thrown at it; anything excessive may become a problem.

Benchmarks don't measure this well, so even when the VRAM pool is too small they won't show much. I ran out of VRAM a lot in Skyrim at 1920x1200 on a GTX 780 and had to remove some texture mods to get back to good performance. In terms of average FPS it made hardly any difference, but when I ran out of VRAM I got frequent "stutters," i.e. isolated frame times that were 10x my average frame time.
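For the "does vidmem drop when you go indoors" test, here's a minimal polling sketch, assuming an NVIDIA card and the pynvml Python bindings (installable as nvidia-ml-py); it's not anything game-specific, just a second window you watch while you play. If used VRAM plateaus near the card's total and never falls after the scene gets simpler, textures probably aren't being evicted.

```python
# Minimal VRAM polling sketch. Assumes an NVIDIA GPU and the pynvml
# bindings (pip install nvidia-ml-py). Run alongside the game, walk
# from a heavy outdoor scene to a simple interior, and watch whether
# "used" ever drops back down.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**20:7.0f} MiB "
              f"of {mem.total / 2**20:.0f} MiB")
        time.sleep(2.0)  # poll every couple of seconds
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```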
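To see why average FPS hides the stutter problem, here's a toy calculation with made-up numbers (not my actual Skyrim measurements): a handful of 10x frame-time spikes barely moves the mean, yet every one of them is a visible hitch.

```python
# Why average FPS hides stutter: 5 spikes at 10x the normal frame
# time in 1000 frames barely move the mean. Illustrative numbers only.
frametimes_ms = [16.7] * 995 + [167.0] * 5

avg_ms = sum(frametimes_ms) / len(frametimes_ms)
worst_ms = max(frametimes_ms)
print(f"average frame time: {avg_ms:.1f} ms (~{1000 / avg_ms:.0f} FPS)")
print(f"worst frame time:  {worst_ms:.1f} ms (~{1000 / worst_ms:.0f} FPS)")
# average: ~17.5 ms (~57 FPS) -- looks fine on a bench graph,
# but each 167 ms frame is a noticeable stutter.
```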
Some of the mods I've seen are just gratuitously upscaled (4K resolutions) and make little visual difference beyond 1024x1024, yet can hamper performance drastically. If that's the course of new game development, where every Tom, Dick, and Harry gets to throw in their mega-res mods, then yes, by all means, go buy a card with 6 or 8 GB of vidmem. But fortunately, most games are closed to major modding of this nature. I would think devs are typically more concerned with making the best-looking AND best-performing games that can accommodate the widest range of hardware, even low-vidmem cards, simply to broaden their market and sales potential, while of course allowing high-end components to make the most of it with higher settings.
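Some back-of-the-envelope arithmetic on why 4K textures eat VRAM so fast. This assumes uncompressed RGBA8 with a full mipmap chain for simplicity; real games use block-compressed formats, but the 16x jump from 1K to 4K holds either way.

```python
# Rough VRAM cost of a single texture at different resolutions.
# Assumes uncompressed RGBA8 (4 bytes/pixel) plus a full mip chain
# (~1.33x overhead). Real engines use compression, but the ratios hold.
BYTES_PER_PIXEL = 4
MIP_OVERHEAD = 4 / 3  # full mip chain adds about one third

for side in (1024, 2048, 4096):
    total = side * side * BYTES_PER_PIXEL * MIP_OVERHEAD
    print(f"{side}x{side}: {total / 2**20:6.1f} MiB with mips")
# 1024x1024:  ~5.3 MiB
# 2048x2048: ~21.3 MiB
# 4096x4096: ~85.3 MiB -- 16x the 1K cost, for detail you rarely see
```

A few hundred modded textures at 4K instead of 1K is the difference between fitting in a 2-3 GB card and spilling out of it, which is exactly the stutter scenario above.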