I would agree. The upcoming 3080 Ti w/ 12 gigs should be safe.
Who knows... maybe when the mining bubble pops, the price of Coca-Cola will even come back down... I would love a firesale/dumping of 8GB 3070 cards so I can snatch one up for cheap and play all the games up to 2021 at eye-wateringly high framerates.

BS. Not enough data to go on. Looks like an anomaly, glitch, or driver issue with the 3070 in GR Breakpoint. There's also conflicting data from other sites in this game. Look at GameGPU's results for the 2080 vs the 16GB RVII @ 4K:
View attachment 43448
Not that I think comparing different cards with different arches is a proper way to conclude VRAM deficiencies; I'm just using the same criteria you think is somehow 'definitive' in your POV. The only proper way to know VRAM is the issue is to compare otherwise identical cards with different VRAM capacities.
Also, the 3070 seems to be doing very well at 4K maxed out vs the 2080 Ti here, which is a big difference vs Techspot's results:
I should add that Steve Walton of Techspot has a fishy record with these types of examples. He did the same with the RTX 2080 vs the RVII in Doom Eternal, with results that contradicted other sites. I believe he slyly corrected the outcome much later on Techspot's website, but left the YT vid on that as is.
Even 12GB is borderline on that card. I see many VR games routinely going over that in actual process usage. Although, to be fair, exceeding the memory doesn't necessarily mean the card would tank. Not like anyone will be able to buy it anyway.
Some people apparently believe there's a conspiracy to make poor little nVidia look bad. Also apparently developers are forbidden to use more VRAM until nVidia gives them "approval" to do so. I have no idea how people could ever argue that 8GB is enough for the next 3 years in AAA titles in one of the threads here that I'm too lazy to dig up.
The VRAM usage in ray tracing might be massive. This is because the standard PC implementations don't support flexible LOD systems. Every object must be loaded into memory at really high resolution, no matter how far it is from the camera. The LOD must be selected before shooting the rays, and there is no standardized way to change it afterwards. The only solution that works is to drastically limit the length of the rays. If a program won't do this, then the RT effect will be a memory hog.
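To get a feel for the scale of the problem, here's a back-of-the-envelope sketch. All the numbers (object count, texture sizes, the share of distant objects) are assumptions for illustration, not measurements from any engine; the point is just how fast "everything resident at full resolution" blows past a consumer card's VRAM compared to per-distance mip selection:

```python
# Rough illustration: VRAM cost of keeping every texture resident at full
# resolution (because LOD can't be changed after ray dispatch) vs. only
# streaming a low mip for distant objects. All scene numbers are made up.

def texture_bytes(width, height, bytes_per_texel=4):
    """Size of a single mip level, uncompressed RGBA8."""
    return width * height * bytes_per_texel

def full_mip_chain_bytes(width, height, bytes_per_texel=4):
    """Whole mip chain; ~4/3 of the top mip for a square power-of-two texture."""
    total = 0
    while width >= 1 and height >= 1:
        total += texture_bytes(width, height, bytes_per_texel)
        width //= 2
        height //= 2
    return total

# Assumed scene: 500 objects, each with one 4096x4096 albedo texture.
objects = 500
top_mip = texture_bytes(4096, 4096)   # 64 MiB per texture
full_res_total = objects * top_mip    # everything ray-traceable at full res

# If 80% of the objects are far away and a 512x512 mip would suffice:
near = objects * 20 // 100
far = objects - near
lod_total = near * top_mip + far * texture_bytes(512, 512)

print(f"all full-res: {full_res_total / 2**30:.1f} GiB")  # ~31 GiB
print(f"with LOD:     {lod_total / 2**30:.1f} GiB")       # ~6.6 GiB
```

Even with these toy numbers, the no-LOD case is roughly 5x the memory footprint, which is why capping ray length (so distant full-res assets never need to be resident) is the workaround mentioned above.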
I honestly think that the 3080 will have enough VRAM for 4K in most situations, but not for ray tracing. If that matters to you, then don't buy a VGA with less than 16 GB of memory. To be honest, I would go for the 3090 with 24 GB of VRAM. Sounds overkill, but not for ray tracing: 16 GB is the absolute minimum for 4K, and 12 GB would be OK for 1080p-1440p.
I see well over 12GB of actual usage in some games (reported by the game itself), not just the allocation, which is higher. You're right that the game may simply use less memory on another card, though, and reload assets during loading screens instead.
Some people apparently believe there's a conspiracy to make poor little nVidia look bad. Also apparently developers are forbidden to use more VRAM until nVidia gives them "approval" to do so.
Here's a Twitter quote from the lead engine designer of Doom Eternal itself:
View attachment 43520
Basically, he's saying 8GB is the minimum, but it's better to go with the highest amount if possible. If you are spending good money on it, might as well pick up one with more memory. Makes sense.
Some people apparently believe there's a conspiracy to make poor little nVidia look bad. Also apparently developers are forbidden to use more VRAM until nVidia gives them "approval" to do so.
Hence the dozen or so concrete examples we've already seen continually get dismissed as "somebody else's fault that nVidia rips off their customers for VRAM".
Well, that won’t happen for another decade, likely. You can still run modern games with 2GB of VRAM today, albeit on the lowest preset. The point is to run maximum graphical settings/textures and RT. For that, an 8-10GB frame buffer might not be enough in 1-2 years. I can’t imagine there being a big installed base for a game that requires 10GB or more of video card memory.
Well, that depends on resolution as well. For example, Doom Eternal maxed out at 4K already needs 9 gigs of VRAM. So, the 3080 would be the MINIMUM nvidia card I'd go with for 4K. Compare the screens to see how badly 8GB cards get slaughtered at 4K due to lack of VRAM. Bear in mind he's literally saying 8 GB is the minimum, not "plenty for the next 3 years".
Does it really need 12 GB, or is it just a case of not unloading older data that it doesn't currently need, keeping it in memory so it won't have to be loaded again should it be needed?
If you tested the game on a card with only 8 GB or 10 GB, does performance tank more than it should for that card? If it doesn't, it's probably just the game not clearing textures or other data out of memory until it needs the space for something else.
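One way to sanity-check the "allocation vs. actual need" question is to watch the board's reported memory while the game runs. A minimal sketch using `nvidia-smi` (the query flags are the standard ones; the canned sample line below is a fabricated example, not real benchmark data):

```python
# Poll GPU memory via nvidia-smi to see how close a card sits to its limit.
# Note this reports board-level usage, which includes cached/allocated data,
# so a near-full reading alone doesn't prove the game *needs* that much.
import subprocess

def query_vram(sample=None):
    """Return (used_MiB, total_MiB).

    `sample` lets canned output be fed in for testing; real use shells
    out to nvidia-smi with its standard CSV query flags.
    """
    if sample is None:
        sample = subprocess.check_output(
            ["nvidia-smi",
             "--query-gpu=memory.used,memory.total",
             "--format=csv,noheader,nounits"],
            text=True)
    used, total = (int(x) for x in sample.strip().split(", "))
    return used, total

# Fabricated example: what an 8 GB card near its limit might report.
used, total = query_vram("7734, 8192\n")
print(f"{used}/{total} MiB ({used / total:.0%})")
```

The decisive evidence is still the comparison described above: if the smaller card sits at 100% usage but its frame times don't fall off a cliff relative to its compute deficit, the game was caching, not starving.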