The "8GB not enough" thread

BFG10K

Lifer
Aug 14, 2000
22,173
1,457
126
Reasons why 8GB, nine years after its first release, isn't nVidia's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. 4K is irrelevant.
  11. Texture quality is irrelevant as long as it matches a console's.
  12. Detail levels are irrelevant as long as they match a console's.
  13. There's no reason a game should use more than 640K, er, 8GB, because a forum user said so.
  14. It's completely acceptable for 3070/3070TI/3080 owners to turn down settings while 3060 users have no issue.
  15. It's an anomaly.
According to some people here, 8GB is neeeevaaaaah nVidia's fault and the objective evidence below "doesn't count" because of reasons(tm) above. If you have others please let me know and I'll add them to the list. Cheers!

Deathloop

Resident Evil Village
The 3060TI/3070 tank at 4K and are slower than the 3060/6700XT with ray tracing enabled:



Doom Eternal

Even at 1440p, 8GB cards sink like a rock to the bottom, and the 3070 gets absolutely rekt by the 3060:

4K is even more hilarious, with even the 6700XT being more than twice as fast as the 3070:


Ghost Recon Breakpoint

Resident Evil 2

Cyberpunk 2077
https://youtu.be/U0Ay8rMdFAg?t=423

Far Cry 6


Portal RTX
Without DLSS, the 3060 is faster than the 3070/3080TI/2080TI. Even with DLSS, the 3060 is still faster than the 3070.



Company of Heroes
The 3060 has a higher minimum framerate than the 3070TI, even at just 1080p.


Hogwarts Legacy
At 1080p with ray tracing, the 3060, 6750XT and even Arc are faster than the 3070, 3070TI and the 3080. Imagine paying $2500 for a 3080 in 2021, only to find 18 months later that it's obsolete.



Godfall / Forspoken / Watchdogs Legion
- Godfall stutters more on the 3070 despite higher overall FPS.
- Even 12 GB isn't enough in Forspoken, as confirmed by the need to drop to low textures.
- Watchdogs Legion is slower on the 3070 and shows up to 2x run-to-run performance variance.

 
Last edited:

blckgrffn

Diamond Member
May 1, 2003
8,764
2,416
136
www.teamjuchems.com
Pretty amazing that the 10GB 3080 is "fine", but it makes sense when you consider that developers are likely optimizing for the 10GB fast pool on the Series X. I would hazard a guess that lots of console-first releases will be optimized for that much graphics memory at 1440p and up.

And maybe optimized for the 10GB 3080 :D

2080ti buyers in that little bitty window when they were fire sold before the 3080 launch are looking like geniuses. I definitely should have snagged the one for $450 that I had a chance at but hesitated on.

That graph makes me feel better about my 6800 purchase. I really wanted a 6800XT but the extra hundreds of dollars soured me on it, while it was relative peanuts to step from 6700xt to the 6800. Looks like both are OK in this title, anyway.

Final edit: Sad for the 3070ti if this is the way things are going to go. Should have gone to 16GB GDDR6 and used the power budget for core clocks or something.
 
Last edited:

Dribble

Platinum Member
Aug 9, 2005
2,024
566
136
Why do you need to run in "ultra nightmare" when just dropping it to "ultra" would look pretty well identical when playing and run fine?

Everyone agrees one day 8gb won't be enough, but picking some silly settings that don't really add much just because you can isn't the way to prove it.

Truth is, games are built around the limitations of the cards, not the other way around. So as most cards have 4-8gb of memory, games will have settings that run well with that much memory. 8gb is only going to become a problem when most low-mid range cards in use (e.g. popular in the steam survey) have more memory than that, which is not going to be for a long time.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,588
554
136

Even at 1440p, 8GB cards sink like a rock to the bottom, and the 3070 gets absolutely rekt by the 3060:


4K is even more hilarious, with even the 6700XT being more than twice as fast as the 3070:


Amazingly there are still people on the internet claiming 8GB is just fine.
So if you don't play DOOM Eternal (at Ultra Nightmare or at all) it's fine right??? :tonguewink:
 

blckgrffn

Diamond Member
May 1, 2003
8,764
2,416
136
www.teamjuchems.com
Why do you need to run in "ultra nightmare" when just dropping it to "ultra" would look pretty well identical when playing and run fine?

True enough one day 8gb won't be enough, but picking some silly settings that don't really add much just because you can isn't the way to prove it.

Truth is, games are built around the limitations of the cards, not the other way around. So as most cards have 4-8gb of memory, games will have settings that run well with that much memory. 8gb is only going to become a problem when most low-mid range cards in use (e.g. popular in the steam survey) have more memory than that, which is not going to be for a long time.
Maybe. When I pay over $1k for a GPU (however you normalize that, lol), I expect to be able to turn all titles up to 11.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
230
106
There has been an update since. Honestly, how many of you thought back in February that the 3070 was going to be beaten by the 3060 this soon due to a lack of VRAM?

 
Last edited:

coercitiv

Diamond Member
Jan 24, 2014
5,458
9,286
136
This gamegpu review produced a very interesting exchange between Hardware Unboxed and DF on Twitter, with Alexander Battaglia from DF pointing out that the "texture setting is merely a Cache for streaming and has no visual effect. High looks the same as Ultra Nightmare." It sounds like a very sensible point to make, one which should be taken into account when discussing DOOM Eternal RTX benchmarks. Right?!

Well, here's the big irony: last year, before the official 3080 reviews, Digital Foundry was given the exclusive right to publish a preview... and they chose to pit the 3080 against the 2080 at 4K using the Ultra Nightmare quality setting, inducing the same texture buffer pressure on the 2080, which also has 8GB of VRAM. The aggressive settings were fine back then to showcase the superiority of the 3080, but now that they come back and bite the 3070... we need to discuss how Ultra Nightmare isn't really a good benchmark option, since it has "no visual effect". :cool:
 

Justinus

Diamond Member
Oct 10, 2005
3,027
1,336
136
This gamegpu review produced a very interesting exchange between Hardware Unboxed and DF on Twitter, with Alexander Battaglia from DF pointing out that the "texture setting is merely a Cache for streaming and has no visual effect. High looks the same as Ultra Nightmare." It sounds like a very sensible point to make, one which should be taken into account when discussing DOOM Eternal RTX benchmarks. Right?!

Well, here's the big irony: last year, before the official 3080 reviews, Digital Foundry was given the exclusive right to publish a preview... and they chose to pit the 3080 against the 2080 at 4K using the Ultra Nightmare quality setting, inducing the same texture buffer pressure on the 2080, which also has 8GB of VRAM. The aggressive settings were fine back then to showcase the superiority of the 3080, but now that they come back and bite the 3070... we need to discuss how Ultra Nightmare isn't really a good benchmark option, since it has "no visual effect". :cool:
I would be extremely skeptical of this. Too bad DF has no sources to cite for this claim, nor did they back it up with image quality comparisons.
 
Last edited:
  • Like
Reactions: Leeea and Tlh97

Kenmitch

Diamond Member
Oct 10, 1999
8,504
2,245
136
Why do you need to run in "ultra nightmare" when just dropping it to "ultra" would look pretty well identical when playing and run fine?
I have a 6800 XT and play on the nightmare setting at 1440p with RT enabled, as I prefer the higher framerates during the many intense battles.
 
Last edited:

Justinus

Diamond Member
Oct 10, 2005
3,027
1,336
136
I would be extremely skeptical of this. Too bad DF has no sources to cite for this claim, nor did they back it up with image quality comparisons.
I tested it myself. He's right that it targets the same texture quality, but the incidence of texture streaming artifacts is higher due to the smaller pool, especially in some of the very large, detailed outdoor scenes in the game.
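To make the mechanism concrete: the texture pool behaves roughly like a fixed-size cache of streamed texture data, so a smaller pool means more evictions and more streaming work, and every extra fetch is a chance for visible pop-in. The Python sketch below is purely conceptual (synthetic tile accesses, a plain LRU policy, made-up pool sizes) and is not a model of id Tech's actual streamer, but it shows why shrinking the pool drives the miss count up:

```python
# Conceptual sketch: treat the texture pool as an LRU cache of tiles and count
# misses (each miss stands in for a tile that must be streamed in on demand).
# Synthetic access pattern and pool sizes; illustrative only.
from collections import OrderedDict
import random

def streaming_misses(pool_tiles: int, accesses: list) -> int:
    cache, misses = OrderedDict(), 0
    for tile in accesses:
        if tile in cache:
            cache.move_to_end(tile)          # recently used, keep it resident
        else:
            misses += 1                      # not resident: stream it in
            cache[tile] = None
            if len(cache) > pool_tiles:
                cache.popitem(last=False)    # evict least-recently-used tile
    return misses

random.seed(0)
scene = [random.randint(0, 4000) for _ in range(50_000)]   # synthetic tile requests
for pool in (1000, 2000, 3000):
    print(f"pool={pool:4d} tiles -> {streaming_misses(pool, scene):6d} misses")
```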
 
Feb 4, 2009
33,079
13,953
136
Why do you need to run in "ultra nightmare" when just dropping it to "ultra" would look pretty well identical when playing and run fine?

Everyone agrees one day 8gb won't be enough, but picking some silly settings that don't really add much just because you can isn't the way to prove it.

Truth is, games are built around the limitations of the cards, not the other way around. So as most cards have 4-8gb of memory, games will have settings that run well with that much memory. 8gb is only going to become a problem when most low-mid range cards in use (e.g. popular in the steam survey) have more memory than that, which is not going to be for a long time.
This, this, this, this, this, and this!
What’s the obsession with ultra settings? Can anyone tell the difference between ultra and normal while gaming?
 
  • Like
Reactions: AnitaPeterson

blckgrffn

Diamond Member
May 1, 2003
8,764
2,416
136
www.teamjuchems.com
This, this, this, this, this, and this!
What’s the obsession with ultra settings? Can anyone tell the difference between ultra and normal while gaming?
Today's Ultra settings are next year's (or maybe the year after's, but being accurate is ruining my point) normal settings.

I mean, games should keep looking better and better if they can. If we broke the cycle, we might be content with the hardware we have! ;)

(Also, I love it when games look amazing, not gonna lie. It's one of my biggest hang ups with the Switch now.)
 

Justinus

Diamond Member
Oct 10, 2005
3,027
1,336
136
This, this, this, this, this, and this!
What’s the obsession with ultra settings. Can anyone tell the difference between ultra and normal while gaming?
the human eye can't see over 720p
the human eye can't see 4k or 8k texture resolution
the human eye can't see above 30 fps
the human eye can't see ultra settings
the human eye can't see texture streaming artifacts...

wait, it can.

I agree though, the Doom Eternal settings menu clearly states that increasing the texture pool setting to Nightmare/Ultra Nightmare with everything else maxed out at 4K will use ~8600MB; it's stupid to blatantly ignore the game's own indicator and exceed your VRAM.
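For the arithmetic behind that indicator: an 8GB card exposes 8192 MiB, and the desktop and driver already hold a slice of it before the game allocates anything, so an on-screen estimate of ~8600MB is over budget from the start. A trivial Python sketch (the ~8600MB figure is the game's own estimate quoted above; the overhead number is purely an assumed placeholder):

```python
# Rough headroom check. CARD_VRAM_MIB and GAME_ESTIMATE_MIB come from the post
# above; OS_DRIVER_OVERHEAD_MIB is an assumed, illustrative figure only.
CARD_VRAM_MIB = 8 * 1024          # an 8 GB card exposes 8192 MiB
GAME_ESTIMATE_MIB = 8600          # Doom Eternal's own estimate at 4K Ultra Nightmare
OS_DRIVER_OVERHEAD_MIB = 300      # assumed desktop/driver reservation (placeholder)

available = CARD_VRAM_MIB - OS_DRIVER_OVERHEAD_MIB
print(f"available: {available} MiB, requested: {GAME_ESTIMATE_MIB} MiB, "
      f"short by roughly {GAME_ESTIMATE_MIB - available} MiB")
```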
 

ultimatebob

Lifer
Jul 1, 2001
25,135
2,438
126
That sounds more like an optimization problem for Doom Eternal RTX than an issue with the hardware. 8 GB of video memory should be more than enough for any title in 2021, especially considering that the chances of being able to upgrade to a better card right now are slim to none.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
4,771
4,084
136
There would be a deep sense of irony associated with NV having the more capable 2nd gen RT hardware, only to cripple it with stingy VRAM outlays.

This is exactly the kind of issue DLSS should not be a crutch to resolve...
 
  • Like
Reactions: Leeea

fleshconsumed

Diamond Member
Feb 21, 2002
6,407
2,103
136
Meh, nVidia skimping on VRAM and no longer optimizing drivers for older cards is planned obsolescence by design. This way the next card will have the selling point of 2GB more VRAM, and the old one is going to be crippled by not having enough. Nothing is going to change so long as gamers fall for nVidia marketing and so long as AMD cannot produce enough cards for gamers to actually buy - reference 6800/6800XT cards are impossible to find.
 
Feb 4, 2009
33,079
13,953
136
Meh, nVidia skimping on VRAM and no longer optimizing drivers for older cards is planned obsolescence by design. This way the next card will have the selling point of 2GB more VRAM, and the old one is going to be crippled by not having enough. Nothing is going to change so long as gamers fall for nVidia marketing and so long as AMD cannot produce enough cards for gamers to actually buy - reference 6800/6800XT cards are impossible to find.
And this is exactly why I want Intel to succeed in the discrete graphics arena. We have needed a viable third option for about a decade.
 

geokilla

Platinum Member
Oct 14, 2006
2,012
3
81
Not smart or nerdy like you guys, but why doesn't the bigger memory bus on the RTX 3060Ti and above make up for the lack of VRAM in this scenario?
 

blckgrffn

Diamond Member
May 1, 2003
8,764
2,416
136
www.teamjuchems.com
Not smart or nerdy like you guys, but why doesn't the bigger memory bus on the RTX 3060Ti and above make up for the lack of VRAM in this scenario?
It's like a hard drive filling up. It doesn't matter how fast/wide the interface is if the drive is full.

The 8GB of VRAM fills up and then the card has to start pulling from system memory, which is farther away and slower, so performance drops.

It looks like 10GB and up fits enough of the data that any performance degradation is too small to detect.
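To put rough numbers on that: a 3070's GDDR6 is nominally around 448 GB/s, while PCIe 4.0 x16 tops out around 32 GB/s, so even a small slice of the working set living in system memory drags effective bandwidth way down. The sketch below is a back-of-the-envelope harmonic-mean model using those assumed nominal figures, not a measurement, and it ignores caching and latency entirely:

```python
# Back-of-the-envelope: effective bandwidth when a fraction of memory traffic
# spills over PCIe instead of hitting local VRAM. Nominal, assumed figures.
VRAM_BW_GBPS = 448.0   # RTX 3070 GDDR6, 256-bit bus (nominal)
PCIE_BW_GBPS = 32.0    # PCIe 4.0 x16, theoretical peak

def effective_bandwidth(spill_fraction: float) -> float:
    """Harmonic-mean bandwidth when `spill_fraction` of traffic goes over PCIe."""
    local = 1.0 - spill_fraction
    return 1.0 / (local / VRAM_BW_GBPS + spill_fraction / PCIE_BW_GBPS)

for spill in (0.0, 0.05, 0.10, 0.25):
    print(f"{spill:4.0%} spilled -> ~{effective_bandwidth(spill):6.1f} GB/s effective")
```

Even 5% of traffic going over the bus cuts the usable bandwidth by roughly 40% in this toy model, which is why no amount of bus width on the card itself can paper over running out of capacity.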
 
Last edited:
