You're acting like there's no overhead associated with this management. They not only have to track which textures are visible, they also have to predict which won't be visible so they know which ones to drop, all of which takes CPU time.
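To make that concrete, here's a rough sketch of the kind of per-frame bookkeeping a texture streamer does. The structures and the scoring heuristic are made up for illustration, not lifted from any real engine:

```cpp
// Hypothetical per-frame texture residency bookkeeping -- illustration only.
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

struct ResidentTexture {
    uint32_t id               = 0;    // engine-side handle, not a real D3D/GL object
    size_t   sizeBytes        = 0;    // how much VRAM this mip chain occupies
    uint64_t lastVisibleFrame = 0;
    float    distanceToCamera = 0.0f;
};

// Predict how likely a texture is to be needed again soon.  Real engines use
// visibility queries, feedback buffers, etc.; this is a stand-in heuristic.
static float EvictionScore(const ResidentTexture& t, uint64_t currentFrame) {
    const float framesUnseen = static_cast<float>(currentFrame - t.lastVisibleFrame);
    return framesUnseen + 0.1f * t.distanceToCamera;   // higher = better candidate to drop
}

// Walk the resident set and decide what to evict to get back under budget.
// All of this runs on the CPU every frame -- that's the overhead in question.
std::vector<uint32_t> ChooseEvictions(std::vector<ResidentTexture>& resident,
                                      size_t vramBudgetBytes,
                                      uint64_t currentFrame) {
    size_t used = 0;
    for (const auto& t : resident) used += t.sizeBytes;

    std::vector<uint32_t> toEvict;
    if (used <= vramBudgetBytes) return toEvict;

    // Sort so the least-likely-to-be-seen textures come first.
    std::sort(resident.begin(), resident.end(),
              [currentFrame](const ResidentTexture& a, const ResidentTexture& b) {
                  return EvictionScore(a, currentFrame) > EvictionScore(b, currentFrame);
              });

    for (const auto& t : resident) {
        if (used <= vramBudgetBytes) break;
        toEvict.push_back(t.id);
        used -= t.sizeBytes;
    }
    return toEvict;
}
```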
Last time I checked, we're PC gamers, not console gamers, and modern CPUs have more than enough processing power to handle that overhead, unlike the anemic CPUs in the consoles.. I know this because there are plenty of CPU-intensive games that run much better on PC than they do on console. BF4 is one such example..
And that overhead has been decreasing over time. DX11.x has a significantly lower CPU overhead than DX9, and DX12, due out next year, will have a lower CPU overhead than DX11.x. That's just progress..
Also, when developers start taking advantage of tiled resources, VRAM requirements should be reduced even more..
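For what it's worth, the idea behind tiled resources is that a huge texture only gets physical VRAM for the 64KB tiles that are actually being sampled; everything else stays unmapped. The real mechanism in D3D11.2 is the D3D11_RESOURCE_MISC_TILED flag plus UpdateTileMappings; the sketch below deliberately uses made-up types instead, just to show the concept:

```cpp
// Rough illustration of the tiled-resources idea: a large virtual texture
// whose 64 KB tiles are only backed by physical VRAM when something actually
// samples them.  These types and functions are invented for this example.
#include <cstddef>
#include <cstdint>
#include <unordered_map>
#include <vector>

constexpr size_t kTileBytes = 64 * 1024;   // tile size the hardware pages in

struct TiledTexture {
    size_t virtualTileCount = 0;                         // e.g. a 16K x 16K texture
    std::unordered_map<uint32_t, uint32_t> mappedTiles;  // virtual tile -> tile-pool slot
};

// Map only the tiles the renderer reported as needed this frame.
// Everything else stays unmapped and costs no VRAM at all.
size_t UpdateMappings(TiledTexture& tex,
                      const std::vector<uint32_t>& neededTiles,
                      uint32_t tilePoolCapacity) {
    tex.mappedTiles.clear();
    uint32_t slot = 0;
    for (uint32_t tile : neededTiles) {
        if (slot >= tilePoolCapacity) break;   // pool full: lower-priority tiles wait
        tex.mappedTiles[tile] = slot++;        // the real API call here is UpdateTileMappings()
    }
    return tex.mappedTiles.size() * kTileBytes;  // actual VRAM spent vs. huge virtual size
}
```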
Then they have to read them from a system memory buffer, which has a latency penalty... if there's enough memory... because it's not like system memory is an inexhaustible resource either. Then they have to blend the new textures in smoothly so you barely notice the swap, wasting precious GPU resources. Worst case they have to read from disk, and then you might be staring at a blurry wall, a load screen, an unskippable cutscene, a long winding hallway or any number of other techniques to stall you so you don't overrun the buffer. They have to track and implement all sorts of bullshit to suit a scenario that no longer exists on next-gen consoles, and soon will no longer exist on PC once VRAM catches up.
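To illustrate that fallback path (names invented, not any particular engine's code): when a visible surface wants a sharper mip than what's resident, the streamer has to work out where that data currently lives and either fade it in, wait on a PCIe upload, or wait on a disk read.

```cpp
// Sketch of the fallback path described above -- invented names, just to show
// where the latency penalty and the "blurry wall" come from.
#include <algorithm>

enum class Location { Vram, SystemRam, Disk };   // where the needed mip data lives

struct StreamState {
    float blend            = 0.0f;  // 0 = old blurry mip, 1 = new sharp mip fully in
    bool  uploadInFlight   = false;
    bool  diskReadInFlight = false;
};

// Called when a visible surface wants a sharper mip than what's resident.
// Returns the blend factor to draw with this frame so the swap isn't a visible pop.
float RequestSharperMip(StreamState& s, Location where, float blendPerFrame) {
    switch (where) {
    case Location::Vram:
        // Data already copied across PCIe: cross-fade it in over a few frames.
        // The fade itself costs GPU time -- the "wasting resources" part.
        s.blend = std::min(1.0f, s.blend + blendPerFrame);
        break;
    case Location::SystemRam:
        // Staged in system memory: kick off an async upload and keep showing
        // the low-res mip until it lands.  Latency is a few frames.
        s.uploadInFlight = true;
        break;
    case Location::Disk:
        // Worst case: not even in RAM.  A disk read is many frames away, which
        // is when the blurry walls / corridors / stall tricks show up.
        s.diskReadInFlight = true;
        break;
    }
    return s.blend;
}
```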
It sounds to me like you're making excuses for shoddy game development practices, and you seem to think that high-capacity VRAM is going to magically solve all of these issues..
But it won't. High-end hardware can make up for a lot of things, but it can't make up for bad programming..
This isn't smart vs. dumb; they're making real performance and even gameplay sacrifices by shifting the burden to other components. You're just mad because you have a card that doesn't have enough VRAM to keep up with it, and you're telling yourself this fancy tale about how lazy and incompetent they are to make yourself feel better about that. Be careful what you wish for, because one day you'll get a card with enough VRAM, and then you'll be glad they left that stuff behind.
I have two Gigabyte G1 GTX 970s that are equipped with 4GB each. That's more than enough to play any game out right now, and for the foreseeable future.. If it isn't, I'll just upgrade..
But the evidence is on my side. Shadow of Mordor, which "supposedly" requires 6GB of VRAM for ultra-level textures, plays just fine on 4GB cards by all appearances @ 1440p, and on 3GB cards @ 1080p..
Watch Dogs, on the other hand, requires 3GB for ultra-level textures, yet it still stutters on 3GB, 4GB, and even 6GB cards... I wonder why that is? Maybe Watch Dogs requires an 8GB card?
Makes you go hmmmm :hmm: