8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite the 3060 usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB:
[benchmark chart]
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT when ray tracing:
[benchmark chart]
Company of Heroes: the 3060 has a higher minimum than the 3070 Ti:
[benchmark chart]

10GB / 12GB

Reasons why shipping 8GB since 2014 still isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

Furious_Styles

Senior member
Meh, Nvidia skimping on VRAM and no longer optimizing drivers for older cards is planned obsolescence by design. This way the next card will have a selling point of 2GB more VRAM, and the old one will be crippled by not having enough. Nothing is going to change so long as gamers fall for Nvidia marketing and so long as AMD cannot produce enough cards for gamers to actually buy - the reference 6800/6800XT is impossible to find.

Reference 6700/6800/6900 cards are pretty much impossible to find. The AIBs are charging big premiums, so they're a lot less desirable.
 

Timorous

Golden Member
This gamegpu review produced a very interesting exchange between Hardware Unboxed and DF on Twitter, with Alexander Battaglia from DF pointing out that "texture setting is merely a Cache for streaming and has no visual effect. High looks the same as Ultra Nightmare." It sounds like a very sensible point to make, and one that should be taken into account when discussing DOOM Eternal RTX benchmarks. Right?!

Well, here's the big irony: last year, before the official 3080 reviews, Digital Foundry was given the exclusive right to publish a preview... and they chose to pit the 3080 against the 2080 at 4K using the Ultra Nightmare quality setting, inducing the same texture-buffer pressure on the 2080 with its 8GB of VRAM. The aggressive settings were fine back then to showcase the superiority of the 3080, but now that they come back and bite the 3070... we need to discuss how Ultra Nightmare isn't really a good benchmark option, since it has "no visual effect". :cool:

I have done some reading, and the setting impacts texture streaming. More textures in VRAM = less streaming required = better IQ. Eternal seems pretty good with streaming, though, so it is a minor thing. So while High might look the same as Ultra Nightmare once the textures load in, if you are whizzing through sections you are more likely to be looking at a low-res temporary texture at High than at Ultra Nightmare.
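To make that concrete, here's a minimal sketch of the idea: the texture pool is a fixed VRAM budget managed as an LRU cache, and when a new high-res mip doesn't fit, the coldest resident mip gets evicted back to its placeholder. This is purely illustrative; id's actual streamer is far more sophisticated, and every name here is invented.

```cpp
#include <cstdint>
#include <list>
#include <unordered_map>

// Minimal sketch of a texture-streaming pool: a fixed VRAM budget with
// least-recently-used eviction. Purely illustrative; all names invented.
class TexturePool {
public:
    explicit TexturePool(std::uint64_t budgetBytes) : budget_(budgetBytes) {}

    // The renderer asks for a texture's high-res mip to be resident.
    // Returns true if it already is; false means this frame samples a
    // low-res placeholder while the upload happens asynchronously.
    bool request(int textureId, std::uint64_t mipBytes) {
        if (auto it = resident_.find(textureId); it != resident_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second.lruPos); // mark as hot
            return true;
        }
        // Evict the coldest mips until the new one fits in the budget.
        while (used_ + mipBytes > budget_ && !lru_.empty()) {
            int victim = lru_.back();
            lru_.pop_back();
            used_ -= resident_.at(victim).bytes;
            resident_.erase(victim); // victim falls back to its placeholder
        }
        if (used_ + mipBytes > budget_)
            return false;            // pool too small for this mip at all
        lru_.push_front(textureId);
        resident_[textureId] = {mipBytes, lru_.begin()};
        used_ += mipBytes;
        return false;                // visible as pop-in until upload lands
    }

private:
    struct Entry { std::uint64_t bytes; std::list<int>::iterator lruPos; };
    std::uint64_t budget_;
    std::uint64_t used_ = 0;
    std::list<int> lru_;             // front = most recently used
    std::unordered_map<int, Entry> resident_;
};
```

With a bigger budget the eviction loop almost never runs, which is presumably why the 12GB 3060 can max the pool while the 8GB cards end up recycling mips mid-traversal.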

My issue here is that Alex was very particular about pointing out the IQ issues with FSR, but now that there is an IQ issue impacting the 8GB 3060 Ti, 3070 and 3070 Ti, he wants to give it a pass by telling people to use a smaller texture pool. The problem is the 12GB 3060 does not have this issue and can run with the texture pool maxed out, but Alex does not seem to want to acknowledge that.

When it comes to PC stuff I am just going to ignore anything DF have to say. Between GN, HUB, Anand, Ian, LTT, L1T etc. I won't miss out on anything of importance, and I can avoid obvious NV bias.
 

coercitiv

Diamond Member
Timorous said:
When it comes to PC stuff I am just going to ignore anything DF have to say. Between GN, HUB, Anand, Ian, LTT, L1T etc. I won't miss out on anything of importance, and I can avoid obvious NV bias.
I wouldn't go that far. DF offers good coverage of new tech in games; for Doom Eternal alone they managed to produce some nice interviews with id Software staff, full of interesting insights, and they're a valuable source of information overall. They're not consistent, though, so their own analysis and interpretation of new tech must be filtered through their own previous standards. This applies to image reconstruction, driver overhead, and now VRAM requirements.
 

Timorous

Golden Member
coercitiv said:
I wouldn't go that far. DF offers good coverage of new tech in games; for Doom Eternal alone they managed to produce some nice interviews with id Software staff, full of interesting insights, and they're a valuable source of information overall. They're not consistent, though, so their own analysis and interpretation of new tech must be filtered through their own previous standards. This applies to image reconstruction, driver overhead, and now VRAM requirements.

I think it will depend on who writes the article.
 

coercitiv

Diamond Member
Timorous said:
I think it will depend on who writes the article.
Not necessarily. Here's Alex Battaglia talking about the 3070's VRAM and how he feels AMD may have done a better job with the 6000 series from this point of view.


I like to think that DF will become more aware of their own need for consistency in review standards. To me it seems they got a bit carried away with RTX/DLSS tech from Nvidia, and they're still in the process of figuring out proper criteria for judging some of the emerging tech from both vendors. Personally, I'll still listen to what they have to say.
 

CP5670

Diamond Member
I don't think the 3080's 10GB is enough for 4K with RT, HD texture packs, VR, etc. It should be fine for 4K without RT, or for 1440p. Ultra settings are admittedly useless most of the time, but RT and texture packs are well worth it. Even the 3080 Ti's 12GB is borderline and will fall short in a few cases, particularly given how expensive that card is. AMD's 16GB is probably the sweet spot, but VR can occasionally use beyond 20GB.
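For a sense of scale, here is some rough back-of-envelope math showing how 4K with RT and an HD texture pack crowds a 10GB card. Every figure is an assumption picked for illustration, not a measurement from any particular game:

```cpp
#include <cstdio>

// Rough VRAM budget at 4K. All numbers are illustrative assumptions,
// not measurements from any specific title.
int main() {
    const double MiB = 1024.0 * 1024.0;
    // One 3840x2160 RGBA16F render target: 8 bytes per pixel.
    const double target   = 3840.0 * 2160.0 * 8.0 / MiB;  // ~63 MiB
    const double gbuffer  = target * 5.0;   // assume a 5-target G-buffer
    const double rt_bvh   = 1.5 * 1024.0;   // assume ~1.5 GiB of RT BVH data
    const double textures = 6.0 * 1024.0;   // assume a ~6 GiB HD texture pool
    const double misc     = 1.5 * 1024.0;   // meshes, buffers, overhead (assumed)
    std::printf("one 4K RGBA16F target: %.0f MiB\n", target);
    std::printf("estimated total: %.1f GiB\n",
                (gbuffer + rt_bvh + textures + misc) / 1024.0);
    return 0;
}
// prints roughly: one 4K RGBA16F target: 63 MiB / estimated total: 9.3 GiB
```

Even with these fairly conservative guesses you're already brushing against 10GB before VR enters the picture.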
 

BFG10K

Lifer
And we have another game, Resident Evil Village, added to the OP. Once again the 3070 tanks with ray tracing and is slower than the 3060/6700 XT at 4K.

The delicious irony is the game doesn't have DLSS but just got FSR patched in. Thanks AMD! o_O
 

Mopetar

Diamond Member
Well I guess we can be thankful that no gamers were able to buy any of these cards!

Maybe by the time the mining boom subsides, Nvidia will refresh the lineup to include more VRAM and we won't have to worry about it.

Of course that may take so long that the 16 GB of VRAM Nvidia is including in the mid-range will be widely regarded as inadequate. Hopefully Musk or Bezos will have space flight worked out in the interim, because I want off this crazy world.
 

moonbogg

Lifer
8GB is for 1080p casual gaming. It's hilarious that Nvidia marketed the 3070 as a 2080 Ti replacement for only $500. I can't wait for the Super cards to arrive with the same VRAM amounts, lol.
 

BFG10K

Lifer
And we have yet another game added to the OP, Deathloop:

8GB cards show visible texture pop-in at close range as shown in the video, and that's with ray tracing off. Turning on ray tracing uses even more VRAM.

Imagine paying $1500 USD for a 3070 Ti and immediately having to start dropping texture quality because it has the same VRAM capacity as a five-year-old 1070. Utter lunacy.
 

Golgatha

Lifer
Pretty amazing that the 10GB 3080 is "fine", but it makes sense when you consider that developers are likely optimizing for the 10GB fast pool on the Series X. I would hazard a guess that lots of console-led releases will be optimized for that much graphics memory at 1440p-and-up resolutions.

And maybe optimized for the 10GB 3080 :D

2080 Ti buyers in that little bitty window when they were fire-sold before the 3080 launch are looking like geniuses. I definitely should have snagged the one for $450 that I had a chance at but hesitated on.

That graph makes me feel better about my 6800 purchase. I really wanted a 6800 XT, but the extra hundreds of dollars soured me on it, while it was relative peanuts to step up from the 6700 XT to the 6800. Looks like both are OK in this title, anyway.

Final edit: Sad for the 3070 Ti if this is the way things are going to go. It should have gone to 16GB GDDR6 and used the power budget for core clocks or something.

The 3070 Ti should definitely have had 16GB of VRAM, if for no other reason than the huge price increase over the 3070.
 

blckgrffn

Diamond Member
Golgatha said:
The 3070 Ti should definitely have had 16GB of VRAM, if for no other reason than the huge price increase over the 3070.

I still think the 8GB frame buffers are there to ensure that IF mining crashes, they'll still have a reason to move a lot of 40xx-series GPUs, as they'll likely move to 12GB+ across the board.

The 3060 Ti to 3070 Ti resale values could tank pretty hard.

(glances at 3060ti wrapped on the parts shelf, shudders)
 

VirtualLarry

No Lifer
blckgrffn said:
I still think the 8GB frame buffers are there to ensure that IF mining crashes, they'll still have a reason to move a lot of 40xx-series GPUs, as they'll likely move to 12GB+ across the board.

The 3060 Ti to 3070 Ti resale values could tank pretty hard.

(glances at 3060ti wrapped on the parts shelf, shudders)
Speaking of which... the 12GB RTX 2060... do you think Nvidia will re-brand it as the RTX 4050?
 

CakeMonster

Golden Member
The 2060 12GB seems like one of the few realistic options for a 'cheaper' card with no immediate VRAM problems, but that also assumes games that can run at a low input resolution for DLSS. You might not like the artifacts, but given the options...
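For context, DLSS renders internally at a fraction of the output resolution, which is where both the performance gain and some of the VRAM relief come from. A quick sketch below; the per-axis scale factors are the commonly cited ones (Quality about 2/3, Balanced about 0.58, Performance 0.5), so treat them as approximate:

```cpp
#include <cstdio>

// DLSS internal (input) resolutions for a 1440p output. Scale factors
// are the commonly cited per-axis values; treat them as approximate.
int main() {
    const int outW = 2560, outH = 1440;
    const struct { const char* mode; double scale; } modes[] = {
        {"Quality",     2.0 / 3.0},
        {"Balanced",    0.58},
        {"Performance", 0.5},
    };
    for (const auto& m : modes)
        std::printf("%-12s -> %4d x %d\n", m.mode,
                    static_cast<int>(outW * m.scale),
                    static_cast<int>(outH * m.scale));
    return 0;
}
// prints: Quality -> 1706 x 960, Balanced -> 1484 x 835, Performance -> 1280 x 720
```

At Performance mode the card is effectively rendering at 720p internally, which takes pressure off both the GPU and the render-target side of the VRAM budget (the texture pool itself doesn't shrink, though).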
 

blckgrffn

Diamond Member
Then prepare to get extra spicy, my friend: it ain't happening. :D Current transportation costs and increased shipping times, tariffs, unfulfilled orders across various sectors, the list goes on.

Yeah, I hear that.

Just saying, some people might get mad about even the concept of turning a high-VRAM 2060 into a "4050", but that really doesn't bother me at all. I mean, a "50" card would be like $175 in olden times? Now $299 (again?!? lol). A 2060 as a direct replacement for the 1650 Super is ¯\_(ツ)_/¯

It also makes sense to me that lower-tier cards might be older cards repackaged on process technology that is a step (or two?) back from the bleeding edge. I've got no beef with that so long as they are DX12U-compliant, even at lower performance levels.

But why even bother caring unless we can buy it? And that's where your point comes into play :D
 

DeathReborn

Platinum Member
BFG10K said:
And we have yet another game added to the OP, Deathloop:

8GB cards show visible texture pop-in at close range as shown in the video, and that's with ray tracing off. Turning on ray tracing uses even more VRAM.

Imagine paying $1500 USD for a 3070 Ti and immediately having to start dropping texture quality because it has the same VRAM capacity as a five-year-old 1070. Utter lunacy.

Considering 3090s with 24GB run out of VRAM running Deathloop, I'd consider the game just plain badly made.