More examples of 8GB VRAM not enough

BFG10K

Lifer
Aug 14, 2000
People aren't really talking about this given non-existent availability, but I came across two new examples where 8GB absolutely cripples performance.

Videos are time-stamped to the right place:

[embedded videos]
No such problems with the 6700 XT 12GB or Radeon VII 16GB used in the same tests.
 

fleshconsumed

Diamond Member
Feb 21, 2002
Entirely predictable. I have no idea how people could ever argue, in one of the threads here that I'm too lazy to dig up, that 8GB is enough for AAA titles for the next 3 years.

I ended up with a 3070 FE because that's all I was lucky enough to score in this horrible GPU market. However, based on prior experience there is a good chance the AMD 6000 series will age a lot better than the Nvidia 30 series, both because it will be less likely to run into VRAM limitations and because Nvidia will probably stop maintaining driver performance for Ampere after Lovelace is out. I would have gladly paid $150 extra for a reference 6800 XT if I could get one; unfortunately, those are like unicorns.
 

CP5670

Diamond Member
Jun 24, 2004
The upcoming 3080 Ti w/ 12 gigs should be safe.

Even 12GB is borderline on that card. I see many VR games routinely going over that in actual process usage. To be fair, though, that doesn't necessarily mean the card would tank if the memory were exceeded. Not like anyone will be able to buy it anyway.
 

amenx

Diamond Member
Dec 17, 2004
BS. Not enough data to go on. Looks like an anomaly, glitch or driver issue with the 3070 in GR Breakpoint. There's also conflicting data from other sites for this game. Look at GameGPU's results for the 2080 vs the 16GB Radeon VII at 4K:

[GameGPU chart: Ghost Recon Breakpoint GPU test, RTX 2080 vs Radeon VII at 4K]

Not that I think comparing different cards with different architectures is a proper way to conclude VRAM deficiencies; I'm just using the same criteria you seem to consider 'definitive'. The only proper way to know VRAM is an issue is to compare the exact same card with different VRAM capacities.

Also, the 3070 seems to be doing very well at 4K maxed out vs the 2080 Ti here, which is a big difference from TechSpot's results:

[embedded video]

I should add that Steve Walton of TechSpot has a fishy record with these types of examples. He did the same with the RTX 2080 vs Radeon VII in Doom Eternal, with results that contradicted other sites'. I believe he quietly corrected the outcome much later on TechSpot's website, but left the YouTube video as is.
 

DeathReborn

Platinum Member
Oct 11, 2005
amenx said: [full post quoted above]

Breakpoint says it needs a 3GB card to run, but it runs on a 2GB 7870 at 1600x900 low with 30+ fps. I used to run it on a 4GB card: it reported 5GB used but still held ~60 fps at 1080p high. 4K on a 5700 XT wasn't pretty, but 1440p was fine, and now a 3080 handles 4K just fine.
 

Mopetar

Diamond Member
Jan 31, 2011
I'm still not sure it's that big of a deal given it's at 4K. While the 3070 is certainly capable of 4K gaming, it's at the edge where, in a few years, the more graphically demanding titles will be more likely to hit a wall due to the number of shaders before the amount of VRAM. There are probably some people who will want to use a 3070 for 4K, but personally I'd go with 1440p, which will likely maintain playable framerates even with all the settings cranked up years from now.
 

Justinus

Diamond Member
Oct 10, 2005
Even 12GB is borderline on that card. I see many VR games routinely going over that in actual process usage. To be fair, though, that doesn't necessarily mean the card would tank if the memory were exceeded. Not like anyone will be able to buy it anyway.

Seeing high VRAM usage on a card with more memory also doesn't mean the game would use as much memory on a more constrained card. A lot of engines pre-allocate, or keep assets that are no longer needed in memory until the space is required for something else.

This leads to very high observed usage even though the game would perform identically with considerably less.
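Roughly the pattern I mean, as a minimal sketch (the names and the simple LRU policy here are made up, not from any real engine): assets stay resident after their last use, so observed usage creeps toward whatever budget the card has, even though only the current working set is actually needed.

```cpp
#include <cstdint>
#include <list>
#include <string>
#include <unordered_map>

// Hypothetical texture residency cache: evicts least-recently-used assets
// only when a new allocation would exceed the VRAM budget.
struct Texture { std::string name; uint64_t bytes; };

class TextureCache {
    uint64_t budget_;       // e.g. 8 GB on one card, 16 GB on another
    uint64_t used_ = 0;
    std::list<Texture> lru_;  // front = most recently used
    std::unordered_map<std::string, std::list<Texture>::iterator> index_;
public:
    explicit TextureCache(uint64_t budget) : budget_(budget) {}

    void request(const std::string& name, uint64_t bytes) {
        auto it = index_.find(name);
        if (it != index_.end()) {  // already resident: just touch it
            lru_.splice(lru_.begin(), lru_, it->second);
            return;
        }
        // Evict stale assets only under memory pressure -- until then they
        // sit in VRAM and inflate the "usage" an overlay reports.
        while (used_ + bytes > budget_ && !lru_.empty()) {
            used_ -= lru_.back().bytes;
            index_.erase(lru_.back().name);
            lru_.pop_back();
        }
        lru_.push_front({name, bytes});
        index_[name] = lru_.begin();
        used_ += bytes;
    }

    uint64_t residentBytes() const { return used_; }
};
```

Run the same access pattern with an 8GB budget and a 16GB budget and residentBytes() will report very different numbers, while the set of textures a frame actually touches is identical.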
 

CP5670

Diamond Member
Jun 24, 2004
I see well over 12GB of actual usage in some games (reported by the game itself), not just the allocation, which is higher. You're right, though, that the game might just use less memory on another card and reload assets during loading screens.
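For what it's worth, on Windows a program can query both numbers for itself through DXGI: CurrentUsage is the dedicated VRAM the OS has actually committed to the process, as opposed to the Budget it's allowed to target. A minimal sketch (error handling mostly trimmed):

```cpp
#include <dxgi1_4.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    IDXGIFactory4* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory4), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    factory->EnumAdapters1(0, &adapter);   // 0 = primary adapter

    IDXGIAdapter3* adapter3 = nullptr;
    if (SUCCEEDED(adapter->QueryInterface(__uuidof(IDXGIAdapter3),
                                          (void**)&adapter3))) {
        DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
        // LOCAL = dedicated VRAM; NON_LOCAL would be shared system memory.
        adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);
        printf("VRAM budget for this process: %.2f GB\n", info.Budget / 1e9);
        printf("Committed by this process:    %.2f GB\n", info.CurrentUsage / 1e9);
        adapter3->Release();
    }
    adapter->Release();
    factory->Release();
    return 0;
}
```

Even CurrentUsage still includes cached-but-idle assets, so a figure like this can overstate what the game strictly needs to hold framerate.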
 

BFG10K

Lifer
Aug 14, 2000
I have no idea how people could ever argue, in one of the threads here that I'm too lazy to dig up, that 8GB is enough for AAA titles for the next 3 years.
Some people apparently believe there's a conspiracy to make poor little nVidia look bad. Also apparently developers are forbidden to use more VRAM until nVidia gives them "approval" to do so.

Hence the dozen or so concrete examples we've already seen continually get dismissed as "somebody else's fault that nVidia rips off their customers for VRAM".
 

Magic Carpet

Diamond Member
Oct 2, 2011
I agree with Zlatan on the VRAM issue.
The VRAM usage in ray tracing might be massive. This is because the standard PC implementations don't support flexible LOD systems. Every object must be loaded into memory at really high resolution, no matter how far it is from the camera. The LOD must be selected before shooting the rays, and there is no standardized way to change it afterwards. The only solution that works is to drastically limit the length of the rays. If a program won't do this, then the RT effect will be a memory hog.

I honestly think that the 3080 will have enough VRAM for 4K in most situations, but not for ray tracing. If this matters, then don't buy a card with less than 16GB of memory. To be honest, I would go for the 3090 with 24GB of VRAM. Sounds overkill, but not for ray tracing; 16GB is the absolute minimum for 4K. 12GB would be OK for 1080p-1440p.
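A conceptual sketch of the ray-length workaround he describes (all names and values here are hypothetical, not any real RT API): because the acceleration structure is built before any rays are cast, everything in it must already be resident at its chosen LOD, so capping the ray length shrinks the set of objects that ever needs to be there.

```cpp
// Hypothetical illustration of capping ray length to bound RT memory use.
struct Ray { float origin[3]; float dir[3]; float tMin; float tMax; };

constexpr float kMaxRayLength = 150.0f;  // engine-tuned cap (made-up value)

// Only geometry a capped ray could ever reach has to be kept in the
// acceleration structure -- and therefore in VRAM -- at full detail.
bool needsRtResidency(float distanceToCamera) {
    return distanceToCamera <= kMaxRayLength;
}

Ray makeReflectionRay(const float o[3], const float d[3]) {
    Ray r{};
    for (int i = 0; i < 3; ++i) { r.origin[i] = o[i]; r.dir[i] = d[i]; }
    r.tMin = 0.001f;         // small offset to avoid self-intersection
    r.tMax = kMaxRayLength;  // the memory knob: shorter rays, fewer resident objects
    return r;
}
```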
 

Mopetar

Diamond Member
Jan 31, 2011
I see well over 12GB of actual usage in some games (reported by the game itself), not just the allocation, which is higher. You're right, though, that the game might just use less memory on another card and reload assets during loading screens.

Does it really need 12GB, or is it just a case of it not unloading older data it doesn't currently need, because keeping it in memory may save loading it again should it be needed?

If you tested the game on a card with only 8GB or 10GB, would performance tank more than it should for that card? If not, it's probably just the game not clearing textures or other data out of memory until it needs the space for something else.

Some people apparently believe there's a conspiracy to make poor little nVidia look bad. Also apparently developers are forbidden to use more VRAM until nVidia gives them "approval" to do so.

Personally I think it's a bit overblown, but the original conversation was around the 3080 and whether or not 10GB would be enough. For most titles I expect that to be the case, but I wouldn't want less than that for 4K gaming. By the end of the current console cycle I expect we'll see upwards of 12GB, since that's what developers are going to be able to use, but it probably won't be the case for every single title.

It might be more of an issue for people who like to play games with a lot of mods and custom texture packs, where they're loading a lot of additional data and need the extra space.
 

Magic Carpet

Diamond Member
Oct 2, 2011
Here's a Twitter quote from the lead engine designer of Doom Eternal itself:

[screenshot of the tweet]

Basically, he's saying 8GB is the minimum, but it's better to go with the highest amount if possible. If you're spending good money, you might as well pick up one with more memory. Makes sense.
 

Justinus

Diamond Member
Oct 10, 2005
Magic Carpet said: [full post quoted above]

Bear in mind he's literally saying 8 GB is the minimum, not "plenty for the next 3 years".
 
Feb 4, 2009
Some people apparently believe there's a conspiracy to make poor little nVidia look bad. Also apparently developers are forbidden to use more VRAM until nVidia gives them "approval" to do so.

Hence the dozen or so concrete examples we've already seen continually get dismissed as "somebody else's fault that nVidia rips off their customers for VRAM".

I think there is something to this, but I think it's more like "Nvidia outsells AMD, so we have to cater to what the majority have." I can't imagine there being a big installed base for a game that requires 10GB or more of video card memory.
 

Magic Carpet

Diamond Member
Oct 2, 2011
I can't imagine there being a big installed base for a game that requires 10GB or more of video card memory.
Well, that likely won't happen for another decade. You can still run modern games with 2GB of VRAM today, albeit on the lowest preset. The point is to run maximum graphical settings/textures and RT. For that, in 1-2 years an 8-10GB frame buffer might not be enough.
 

Magic Carpet

Diamond Member
Oct 2, 2011
Bear in mind he's literally saying 8 GB is the minimum, not "plenty for the next 3 years".
Well, that depends on resolution as well. For example, Doom Eternal maxed out at 4K already needs 9GB of VRAM. So the 3080 would be the MINIMUM Nvidia card I'd go with for 4K. Compare the attached screens to see how badly 8GB cards get slaughtered at 4K due to lack of VRAM.
 

Attachments: doom_eternal.png (4K results) · doom_eternal_1440.png (1440p results)

CP5670

Diamond Member
Jun 24, 2004
Does it really need 12GB, or is it just a case of it not unloading older data it doesn't currently need, because keeping it in memory may save loading it again should it be needed?

If you tested the game on a card with only 8GB or 10GB, would performance tank more than it should for that card? If not, it's probably just the game not clearing textures or other data out of memory until it needs the space for something else.

Yeah, that would be useful to compare against. Some VR games can go way beyond 12GB though, occasionally even over 20GB. I think the earlier comment about 12GB not being good enough for 4K RT is even more true for VR. It's enough for most non-RT 4K games, though.