8GB VRAM not enough (and 10 / 12)

BFG10K

Lifer
Aug 14, 2000
22,709
2,978
126
8GB
In Horizon Forbidden West the 3060 is faster than the 2080 Super, despite the 3060 usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
pFJi8XrGZfYuvhvk4952je-970-80.png.webp
In Resident Evil Village the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT when ray tracing:
RE.jpg
In Company of Heroes the 3060 has a higher minimum framerate than the 3070 Ti:
CH.jpg

10GB / 12GB

Reasons why shipping 8GB cards since 2014 still isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others, please let me know and I'll add them to the list. Cheers!
 
Last edited:

blckgrffn

Diamond Member
May 1, 2003
9,151
3,092
136
www.teamjuchems.com
Pretty amazing that the 10GB 3080 is "fine", but it makes sense when you consider that developers are likely optimizing for the 10GB fast pool on the Series X. I would hazard a guess that lots of games released on console will be optimized for that much graphics memory at 1440p and up.

And maybe optimized for the 10GB 3080 :D

2080 Ti buyers from that little bitty window when they were being fire-sold before the 3080 launch are looking like geniuses. I definitely should have snagged the one for $450 that I had a chance at, but I hesitated.

That graph makes me feel better about my 6800 purchase. I really wanted a 6800 XT, but the extra hundreds of dollars soured me on it, while it was relative peanuts to step up from the 6700 XT to the 6800. Looks like both are OK in this title, anyway.

Final edit: Sad for the 3070 Ti if this is the way things are going to go. It should have gone to 16GB of GDDR6 and used the power budget for core clocks or something.
 
Last edited:

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
Why do you need to run "ultra nightmare" when just dropping to "ultra" would look practically identical while playing and run fine?

Everyone agrees that one day 8GB won't be enough, but picking silly settings that don't really add much, just because you can, isn't the way to prove it.

Truth is, games are built around the limitations of the cards, not the other way around. So as long as most cards have 4-8GB of memory, games will have settings that run well within that much memory. 8GB is only going to become a problem when most low-to-mid-range cards in use (e.g. the popular cards in the Steam survey) have more memory than that, which is not going to happen for a long time.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,750
746
136

Even at 1440p, 8GB cards sink like a rock to the bottom, and the 3070 gets absolutely rekt by the 3060:

image-21.png

4K is even more hilarious, with even the 6700XT being more than twice as fast as the 3070:

image-23.png

Amazingly there are still people on the internet claiming 8GB is just fine.

So if you don't play DOOM Eternal (at Ultra Nightmare or at all) it's fine right??? :tonguewink:
 

blckgrffn

Diamond Member
May 1, 2003
9,151
3,092
136
www.teamjuchems.com
Why do you need to run "ultra nightmare" when just dropping to "ultra" would look practically identical while playing and run fine?

True enough, one day 8GB won't be enough, but picking silly settings that don't really add much, just because you can, isn't the way to prove it.

Truth is, games are built around the limitations of the cards, not the other way around. So as long as most cards have 4-8GB of memory, games will have settings that run well within that much memory. 8GB is only going to become a problem when most low-to-mid-range cards in use (e.g. the popular cards in the Steam survey) have more memory than that, which is not going to happen for a long time.

Maybe. When (how is this normalized, lol) I pay over $1k for a GPU, I expect to be able to turn all titles up to 11.
 

coercitiv

Diamond Member
Jan 24, 2014
6,262
12,233
136
This gamegpu review produced a very interesting exchange between Hardware Unboxed and DF on Twitter, with Alexander Battaglia from DF pointing out that "texture setting is merely a Cache for streaming and has no visual effect. High looks the same as Ultra Nightmare." It sounds like a very sensible point, which should be taken into account when discussing DOOM Eternal RTX benchmarks. Right?!

Well, here's the big irony: last year, before the official 3080 reviews, Digital Foundry was given exclusive rights to publish a preview... and they chose to pit the 3080 against the 2080 at 4K using the Ultra Nightmare quality setting, inducing the same texture buffer pressure on the 2080's 8GB of VRAM. The aggressive settings were fine back then to showcase the superiority of the 3080, but now that they've come back to bite the 3070... we need to discuss how Ultra Nightmare isn't really a good benchmark option, since it has "no visual effect". :cool:
 

Hail The Brain Slug

Diamond Member
Oct 10, 2005
3,182
1,536
136
This gamegpu review produced a very interesting exchange between Hardware Unboxed and DF on Twitter, with Alexander Battaglia from DF pointing out that "texture setting is merely a Cache for streaming and has no visual effect. High looks the same as Ultra Nightmare." It sounds like a very sensible point, which should be taken into account when discussing DOOM Eternal RTX benchmarks. Right?!

Well, here's the big irony: last year, before the official 3080 reviews, Digital Foundry was given exclusive rights to publish a preview... and they chose to pit the 3080 against the 2080 at 4K using the Ultra Nightmare quality setting, inducing the same texture buffer pressure on the 2080's 8GB of VRAM. The aggressive settings were fine back then to showcase the superiority of the 3080, but now that they've come back to bite the 3070... we need to discuss how Ultra Nightmare isn't really a good benchmark option, since it has "no visual effect". :cool:

I would be extremely skeptical of this. Too bad DF has no sources to cite for this claim, nor did they back it up with image quality comparisons.
 
Last edited:
  • Like
Reactions: Leeea and Tlh97

Hail The Brain Slug

Diamond Member
Oct 10, 2005
3,182
1,536
136
I would be extremely skeptical of this. Too bad DF has no sources to cite for this claim, nor did they back it up with image quality comparisons.

I tested it myself. He's right that it targets the same texture quality, but the incidence of texture streaming artifacts is higher due to the smaller pool, especially in some of the game's very large, detailed outdoor scenes.
 
Feb 4, 2009
34,636
15,832
136
Why do you need to run "ultra nightmare" when just dropping to "ultra" would look practically identical while playing and run fine?

Everyone agrees that one day 8GB won't be enough, but picking silly settings that don't really add much, just because you can, isn't the way to prove it.

Truth is, games are built around the limitations of the cards, not the other way around. So as long as most cards have 4-8GB of memory, games will have settings that run well within that much memory. 8GB is only going to become a problem when most low-to-mid-range cards in use (e.g. the popular cards in the Steam survey) have more memory than that, which is not going to happen for a long time.

This, this, this, this, this, and this!
What's the obsession with ultra settings? Can anyone tell the difference between ultra and normal while gaming?
 
  • Like
Reactions: AnitaPeterson

blckgrffn

Diamond Member
May 1, 2003
9,151
3,092
136
www.teamjuchems.com
This, this, this, this, this, and this!
What's the obsession with ultra settings? Can anyone tell the difference between ultra and normal while gaming?

Today's ultra settings are next year's (or maybe the year after, but being accurate is ruining my point) normal settings.

I mean, games should keep looking better and better if they can. If we broke the cycle, we might be content with the hardware we have! ;)

(Also, I love it when games look amazing, not gonna lie. It's one of my biggest hang ups with the Switch now.)
 

Hail The Brain Slug

Diamond Member
Oct 10, 2005
3,182
1,536
136
This, this, this, this, this, and this!
What's the obsession with ultra settings? Can anyone tell the difference between ultra and normal while gaming?

the human eye can't see over 720p
the human eye can't see 4k or 8k texture resolution
the human eye can't see above 30 fps
the human eye can't see ultra settings
the human eye can't see texture streaming artifacts...

wait, it can.

I agree though: the Doom Eternal settings menu clearly states that raising the texture pool setting to Nightmare/Ultra Nightmare, with everything else maxed at 4K, will use ~8600MB. It's stupid to blatantly ignore the game's own indicator and exceed your VRAM.
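The arithmetic behind that menu warning is easy to reproduce. Here's a rough back-of-envelope sketch; every number below is my own illustrative assumption (the pool sizes, render-target count, and overhead figure are made up to show the shape of the math), not id Tech's real accounting:

```python
# Rough, illustrative VRAM budget at 4K (3840x2160).
# All sizes are assumptions for the sake of the arithmetic, not measured values.
width, height = 3840, 2160
pixels = width * height
mib = 1024 * 1024

# A handful of render targets (color, depth, G-buffer layers, etc.)
# at a few bytes per pixel each adds up fast at 4K.
render_targets = 8 * pixels * 6 / mib   # ~8 targets, ~6 B/px average (assumed)

# Hypothetical texture streaming pool sizes in MiB, loosely mirroring how a
# "Texture Pool Size" setting scales a streaming cache.
pool = {"High": 3072, "Nightmare": 4096, "Ultra Nightmare": 7168}

overhead = 1024  # geometry, shaders, OS/compositor share, etc. (assumed)

for setting, pool_mib in pool.items():
    total = render_targets + pool_mib + overhead
    verdict = "fits in 8GB" if total < 8192 else "exceeds 8GB"
    print(f"{setting}: ~{total:.0f} MiB -> {verdict}")
```

With these assumed numbers, the Ultra Nightmare pool pushes the total to roughly the ~8600MB the menu reports, while High and Nightmare stay under 8192 MiB. The point isn't the exact figures, it's that the pool setting alone moves the budget past an 8GB card.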
 

ultimatebob

Lifer
Jul 1, 2001
25,135
2,445
126
That sounds more like an optimization problem with Doom Eternal's RTX mode than an issue with the hardware. 8GB of video memory should be more than enough for any title in 2021, especially considering that the chances of being able to upgrade to a better card right now are slim to none.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,974
7,401
136
There would be a deep sense of irony associated with NV having the more capable 2nd gen RT hardware, only to cripple it with stingy VRAM outlays.

This is exactly the kind of issue DLSS should not be a crutch to resolve...
 
  • Like
Reactions: Leeea

fleshconsumed

Diamond Member
Feb 21, 2002
6,483
2,352
136
Meh, nvidia skimping on VRAM and no longer optimizing drivers for older cards is planned obsolescence. This way the next card gets the selling point of having 2GB more VRAM, while the old one is crippled by not having enough. Nothing is going to change as long as gamers fall for nVidia's marketing and as long as AMD can't produce enough cards for gamers to actually buy - the reference 6800/6800 XT is impossible to find.
 
Feb 4, 2009
34,636
15,832
136
Meh, nvidia skimping on VRAM and no longer optimizing drivers for older cards is planned obsolescence. This way the next card gets the selling point of having 2GB more VRAM, while the old one is crippled by not having enough. Nothing is going to change as long as gamers fall for nVidia's marketing and as long as AMD can't produce enough cards for gamers to actually buy - the reference 6800/6800 XT is impossible to find.

And this is exactly why I want Intel to succeed in the discrete graphics arena. We have needed a viable third option for about a decade.
 

geokilla

Platinum Member
Oct 14, 2006
2,012
3
81
Not as smart or nerdy as you guys, but why doesn't the bigger memory bus on the RTX 3060 Ti and above make up for the lack of VRAM in this scenario?
 

blckgrffn

Diamond Member
May 1, 2003
9,151
3,092
136
www.teamjuchems.com
Not as smart or nerdy as you guys, but why doesn't the bigger memory bus on the RTX 3060 Ti and above make up for the lack of VRAM in this scenario?

It's like a hard drive filling up: it doesn't matter how fast or wide the interface is if the drive is full.

The 8GB of VRAM fills up and then the card has to start using system memory, which is farther away and slower, so performance drops.

It looks like 10GB and up fits enough of the data that any performance degradation is too small to detect.
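The penalty is worse than intuition suggests, because average access time is dominated by the slow path. A toy model makes the point; the bandwidth figures are round assumptions on my part (~448 GB/s for a 3060 Ti's GDDR6, ~32 GB/s theoretical peak for PCIe 4.0 x16), not measurements:

```python
# Toy model of effective memory bandwidth when a fraction of the GPU's
# working set spills out of VRAM into system RAM over PCIe.
# Bandwidth figures are round assumptions, not measurements.
VRAM_BW = 448.0   # GB/s, e.g. a 3060 Ti's GDDR6 (assumed)
PCIE_BW = 32.0    # GB/s, PCIe 4.0 x16 theoretical peak (assumed)

def effective_bandwidth(spill_fraction: float) -> float:
    """Harmonic mix: time per GB is the weighted sum of both paths."""
    time_per_gb = spill_fraction / PCIE_BW + (1 - spill_fraction) / VRAM_BW
    return 1.0 / time_per_gb

for spill in (0.0, 0.05, 0.10, 0.25):
    print(f"{spill:4.0%} spilled -> {effective_bandwidth(spill):6.1f} GB/s")
```

In this model, spilling even 10% of accesses more than halves effective bandwidth, which is why no amount of bus width on the card itself helps once the 8GB is full.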
 
Last edited: