8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
8GB
In Horizon Forbidden West the 3060 is faster than the 2080 Super, despite the former usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
In Resident Evil Village the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700XT when ray tracing.
In Company of Heroes the 3060 has a higher minimum framerate than the 3070 Ti.

10GB / 12GB

Reasons why shipping 8GB cards since 2014 still isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

aleader

Senior member
Oct 28, 2013
502
150
116
I get a chuckle out of these '8GB isn't enough' threads. I paid $829 CAD ($640 USD) for my EVGA XC3 Ultra 3070, and was just playing DCS (a VRAM hog) at 1440p on my 32" IPS monitor at about 110 fps with high settings on the Syria map (most performance-heavy map in the game), and always marvel at just how smooth and excellent it runs compared to my 1070. It also eats Squad and Post Scriptum for lunch. These games are FAR more system-intensive than Doom, which can run easily on a 1050. If you want to spend TRIPLE or more to buy a 3090 just to 'future-proof' your card for maybe another year...have at 'er I guess ;)
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
I get a chuckle out of these '8GB isn't enough' threads.
I get a chuckle when every single game is hand-waved away, which means 8GB @ $1500 is never nVidia's fault. :rolleyes:

If you want to spend TRIPLE or more to buy a 3090 just to 'future-proof' your card for maybe another year...have at 'er I guess ;)
Nobody is saying that. In fact a lot of people are skipping Ampere because of horrific pricing. Factoring in cost, the product stack is far worse than Turing, which was already bad. Especially the 8GB 3070/3070 Ti, which is simply farcical.

If I was forced to buy a graphics card right now, I'd get a 3060.
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
While this doesn't really qualify as "8GB is not enough": I have an RX 5700 and an RTX 3060, and both cards perform exactly the same. I ended up keeping the 3060 in my PC and moved the 5700 to the mining rig for one simple reason: 12GB vs 8GB. A 12GB card lets me play a lot of games without shutting down the miner (the GPU does a really good "QoS" by dropping the mining hashrate). While this is not the case for EVERY game, I can still play most of them this way. It's not possible with 8GB of VRAM, as you end up with only about 3.6GB free for games.
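The headroom math in this post can be sketched as follows. Note this is just an illustration: the miner's ~4.4GB footprint is a hypothetical figure inferred from the "3.6GB free on 8GB" claim above, not a measured value.

```python
# Rough VRAM headroom math from the post above: with a miner resident,
# an 8GB card leaves ~3.6GB for games, while a 12GB card leaves ~7.6GB.
# MINER_VRAM_GB is an assumed allocation (DAG plus buffers), inferred
# from the post rather than measured.
MINER_VRAM_GB = 4.4

def free_for_games(total_vram_gb: float, miner_gb: float = MINER_VRAM_GB) -> float:
    """VRAM left over for a game while the miner keeps its allocation."""
    return max(total_vram_gb - miner_gb, 0.0)

for total in (8.0, 12.0):
    print(f"{total:.0f}GB card -> {free_for_games(total):.1f}GB free for games")
```

The asymmetry is the point: the miner's footprint is fixed, so every extra gigabyte of VRAM goes straight to the game's budget.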
 

CropDuster

Senior member
Jan 2, 2014
366
45
91
I get a chuckle out of these '8GB isn't enough' threads. I paid $829 CAD ($640 USD) for my EVGA XC3 Ultra 3070, and was just playing DCS (a VRAM hog) at 1440p on my 32" IPS monitor at about 110 fps with high settings on the Syria map (most performance-heavy map in the game), and always marvel at just how smooth and excellent it runs compared to my 1070. It also eats Squad and Post Scriptum for lunch. These games are FAR more system-intensive than Doom, which can run easily on a 1050. If you want to spend TRIPLE or more to buy a 3090 just to 'future-proof' your card for maybe another year...have at 'er I guess ;)

Try the F14 in VR :confused:
 

dr1337

Senior member
May 25, 2020
331
559
106
I get a chuckle out of these '8GB isn't enough' threads. I paid $829 CAD ($640 USD) for my EVGA XC3 Ultra 3070, and was just playing DCS (a VRAM hog) at 1440p on my 32" IPS monitor at about 110 fps with high settings on the Syria map (most performance-heavy map in the game), and always marvel at just how smooth and excellent it runs compared to my 1070. It also eats Squad and Post Scriptum for lunch. These games are FAR more system-intensive than Doom, which can run easily on a 1050. If you want to spend TRIPLE or more to buy a 3090 just to 'future-proof' your card for maybe another year...have at 'er I guess ;)
Yeah, I always felt my 3GB 7950s could run anything at 1440p with a solid 60fps 8 years ago, but these days the 8GB on my RX 580 is really at the limit for 4K or my Oculus. Funny thing is I could have saved some decent change on my 580 had I gone with a 4GB model, but I don't think it would be performing nearly as well. I don't think anyone needs to go straight to the top with VRAM capacity in order to be future-proof, but I also don't think 8GB is going to be enough at all going forward in a 4K or VR gaming setup.
 

aleader

Senior member
Oct 28, 2013
502
150
116
I get a chuckle when every single game is hand-waved away, which means 8GB @ $1500 is never nVidia's fault. :rolleyes:


Nobody is saying that. In fact a lot of people are skipping Ampere because of horrific pricing. Factoring in cost, the product stack is far worse than Turing, which was already bad. Especially 8GB 3070/307TI which is simply farcical.

If I was forced to buy a graphics card right now, I'd get a 3060.

Totally Nvidia's fault, and I wouldn't buy a 3070 at $1,500 either...you'd have to be a full-on idiot. I sold 2 of them to said idiots...likely miners. Yes, it should have more than 8GB, but my point is that in the end it really doesn't matter. I guess if you play at 4K (like, less than 0.1% of gamers?) and just have to play Doom Eternal at 200fps on Ultra-Nightmare...then, ok.
 

aleader

Senior member
Oct 28, 2013
502
150
116
Try the F14 in VR :confused:

Well, I don't do VR (again, very niche) because it isn't ready for mainstream yet IMO, especially the well-documented poor implementation in DCS. I believe in ED's last newsletter they say they are 'working on it'. Way too expensive to get a good headset, and they still haven't fixed motion sickness...so I'll be waiting.
 

aleader

Senior member
Oct 28, 2013
502
150
116
Yeah I always felt my 3gb 7950s could run anything at 1440p with a solid 60fps 8 years ago but these days the 8gb on my rx580 is really at the limit for 4k or my oculus. Funny thing is I could have saved some decent change on my 580 had I gone with a 4gb model, but I don't think it would be performing nearly as well. I don't think anyone needs to go straight to the top with VRAM capacity in order to be future proof but I also don't think 8gb is going to be enough at all going forward in a 4k or vr gaming setup.

This is a phish right? You're using a 580 for 4K VR gaming and you're blaming the amount of VRAM? 🤔 As I said, moving from my 1070 to a 3070 (both 8GB) more than doubled my performance at 1440p.
 
  • Like
Reactions: Stuka87

Spjut

Senior member
Apr 9, 2011
928
149
106
It of course comes down to the price brackets, but I do not get why people still question the advantage of more VRAM. Whether we go back to the 512MB vs 1GB models, the 1GB vs 2GB models, and so on, the higher-VRAM model has always proven more useful in the end. One cannot predict the future, but it has always been typical for people to say no, today's games do not benefit from it, so get the cheaper card/model. Then fast forward three years and you can typically still run Ultra textures on the higher-VRAM card whereas the other is limited to High or perhaps even Medium.

I can get why people who are prepared to upgrade often, say perhaps every three years, get the lower VRAM models though.
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,483
2,352
136
I can get why people who are prepared to upgrade often, say perhaps every three years, get the lower VRAM models though.
It was much less of an issue in the past when you could get a new midrange card every year or two at most and when price creep was non existent. It's a far bigger problem nowadays that release cadence has slowed down considerably to 2 years at best and when every new generation brings little to no improvement in performance/dollar metric. Lots of people are still using 4-5 year old GPUs, so yeah, assuming 4 year upgrade cycle the amount of VRAM you buy today matters.
 
  • Like
Reactions: Tlh97 and coercitiv

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,400
2,437
146
I think 8GB of VRAM is enough for me, as I usually turn down settings for higher FPS in a lot of games. I am happy with my 3090 though. 24GB is a lot, and 16GB, as offered on the AMD 6800(XT) and 6900XT, is also a great amount. Even a little while back, the R9 Fury cards were great deals due to being faster than Polaris, even if they only had 4GB of HBM. I am still not happy that they were EOLed in drivers so quickly. That said, for someone at 1080p, depending on the games they play, something like a used 290 or Fury could still be a great buy.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Yeah I always felt my 3gb 7950s could run anything at 1440p with a solid 60fps 8 years ago but these days the 8gb on my rx580 is really at the limit for 4k or my oculus. Funny thing is I could have saved some decent change on my 580 had I gone with a 4gb model, but I don't think it would be performing nearly as well. I don't think anyone needs to go straight to the top with VRAM capacity in order to be future proof but I also don't think 8gb is going to be enough at all going forward in a 4k or vr gaming setup.

You are using a card that is just adequate at 1080P for 4K VR? I guarantee that VRAM is not your limiting factor.
 
  • Like
Reactions: amenx

dr1337

Senior member
May 25, 2020
331
559
106
This is a phish right? You're using a 580 for 4K VR gaming and you're blaming the amount of VRAM? 🤔 As I said, moving from my 1070 to a 3070 (both 8GB) more than doubled my performance at 1440p.
Um, no, I'm saying the only reason my 580 is even managing to keep up at 4K is that it has the VRAM to do it. It would be much worse being both compute-limited and memory-limited.
You are using a card that is just adequate at 1080P for 4K VR? I guarantee that VRAM is not your limiting factor.
Actually, it is becoming a bottleneck for me at times, and I see 60-80% compute utilization when VRAM is getting slammed. Again, my point was that if I had a 4GB 580 I severely doubt I'd be getting away with all of this. And again, when thinking about the future or long term, 8GB of VRAM is obviously fine for lower resolutions, and compute and bandwidth are still bigger problems, but as time goes on more people will want to run VR, 4K, or even higher resolutions. This is why a card like the 3070 seems gimped, and the links in the OP of this thread show how 2GB more of much faster memory on the 3080 makes all of the ray tracing and high-res bottlenecks go away. Imagine how much worse the problem will be in three years with new games.
 

Furious_Styles

Senior member
Jan 17, 2019
492
228
116
Um, no, I'm saying the only reason my 580 is even managing to keep up at 4K is that it has the VRAM to do it. It would be much worse being both compute-limited and memory-limited.

Actually, it is becoming a bottleneck for me at times, and I see 60-80% compute utilization when VRAM is getting slammed. Again, my point was that if I had a 4GB 580 I severely doubt I'd be getting away with all of this. And again, when thinking about the future or long term, 8GB of VRAM is obviously fine for lower resolutions, and compute and bandwidth are still bigger problems, but as time goes on more people will want to run VR, 4K, or even higher resolutions. This is why a card like the 3070 seems gimped, and the links in the OP of this thread show how 2GB more of much faster memory on the 3080 makes all of the ray tracing and high-res bottlenecks go away. Imagine how much worse the problem will be in three years with new games.

Honestly though, a 580 is just not a good card for 4K. You're really underpowered regardless of VRAM. 1440p would be much more sensible.
 

aleader

Senior member
Oct 28, 2013
502
150
116
It of course comes down to the price brackets, but I do not get why people still question the advantage of more VRAM. Whether we go back to the 512MB vs 1GB model, 1GB vs 2GB model and so on, the higher VRAM has always proven more useful in the end. One cannot predict the future, but it has always been typical for people to say, no, today's games do not benefit from it, so get the cheaper card/model. Then fast forward three years and you can typically still have Ultra textures on the higher model card whereas the other is limited to high or perhaps even Medium.

I can get why people who are prepared to upgrade often, say perhaps every three years, get the lower VRAM models though.

I'm certainly not questioning the advantage of more VRAM...to a point. 12GB would be nice to have, but really by the time you need that much (you'll likely need a 4K monitor to hit the wall), the 4000 series or 5000 series (or whatever) will be out, probably for a few years already. It's also been shown umpteen times that moving from High to Ultra textures in a game has virtually no visible benefit...none that the vast majority will ever notice anyways. Of course those who spent $2,500 on a GPU will always say they can see a difference if only to protect their ego...

What's the option? Spending $2,000 on a 3090 just to get the extra VRAM? That's just dumb, and the aforementioned 4000/5000 cards will blow the 3090 out of the water anyways. Would you want a 1080ti rather than a 3080 just because it has more VRAM? I'll use my example of a 1070/3070 in DCS. The VRAM is the same, but the game experience is completely different to the point I don't think I could play it with a 1070 anymore.

Also, if everyone has 8GB cards game companies can't make their games require more than that or nobody will buy them. The way things are going there are going to be a LOT of 8GB cards around for a very long time.
 
  • Like
Reactions: Fanatical Meat

Hitman928

Diamond Member
Apr 15, 2012
5,244
7,792
136
Wow. That's also telling 1080 Ti and 2080 Ti owners to be ready for stuttering issues too? Does it even let you select it if you don't have enough hardware (looks like you could check it)?

From the experiences I've read, if you don't have enough VRAM for the HD textures the main symptom is that you sometimes get a mix of HD and non-HD textures in the scene and blatant pop-in texture issues.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,746
740
136
Wow. That's also telling 1080 Ti and 2080 Ti owners to be ready for stuttering issues too? Does it even let you select it if you don't have enough hardware (looks like you could check it)?

Just downloaded it on a PC with just 4GB of VRAM (could have done it on one with 2GB too) and it is selectable, but FPS tanks on an R9 290. I didn't have FPS issues at 1440p on a 3080 though.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
16GB is the right number for VRAM on a high-end card today and in the near future. Not 8, not 10, not 12, not 24. It's 16. I would be pissed seeing something like that Far Cry 6 VRAM suggestion if I bought a 3080 for $700, let alone the $1800 most people pay for it these days.

I think it's ridiculous how Nvidia has managed to dance all around that 16GB number without landing on it. It's like a land mine or something; they wouldn't dare step on it. But why? It's the right number and has a nice, even feel to it. But no. Gaming customers would simply like that too much, and when you have power, you never give people what they want. You keep them on a leash and feed them just enough so they don't bite you.
 

blckgrffn

Diamond Member
May 1, 2003
9,123
3,056
136
www.teamjuchems.com
16GB is the right number for VRAM on a high-end card today and in the near future. Not 8, not 10, not 12, not 24. It's 16. I would be pissed seeing something like that Far Cry 6 VRAM suggestion if I bought a 3080 for $700, let alone the $1800 most people pay for it these days.

I think it's ridiculous how Nvidia has managed to dance all around that 16GB number without landing on it. It's like a land mine or something; they wouldn't dare step on it. But why? It's the right number and has a nice, even feel to it. But no. Gaming customers would simply like that too much, and when you have power, you never give people what they want. You keep them on a leash and feed them just enough so they don't bite you.

They are saving it for 4xxx cards. 3xxx are mining cards. Silly moonbogg.
 
  • Like
Reactions: Leeea

gdansk

Platinum Member
Feb 8, 2011
2,078
2,559
136
Nvidia was deliberate about it. The lack of memory on the 3080 is intended to drive buyers worried about future-proofing into the 3090 and its much higher profit margins.
 
  • Like
Reactions: Leeea

samboy

Senior member
Aug 17, 2002
217
77
101
For 1440p gaming, 8GB should be sufficient, similar to 1080p gaming where 4GB is usually sufficient.
That said, an extra 2GB on top of the above never hurts, as there will be a few titles that are exceptions.

For 4K gaming, AMD is spot on with their 16GB offering, and the Nvidia 3080 is short-changed at 10GB, which will become a problem down the road. For this reason, I was in the market for an AMD 6800XT, as extra memory was more important than RT performance for my expected use. However, in these pandemic-crazy times I could not get hold of a 6800XT and ended up with the 10GB Nvidia 3080! Nvidia had better availability, and in this market it's a matter of what you can get your hands on.

The rule of thumb of using 4-6GB at 1080p and simply scaling this up for the higher resolutions (8-12GB for 1440p and 16-24GB for 4K) should be a pretty safe bet.
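The scaling rule above can be sketched numerically. This is only an illustration of the poster's rule of thumb, not a measured requirement; the base figures are taken from the post, and note that strict pixel scaling yields roughly 7-11GB at 1440p, slightly under the quoted 8-12GB (the rule rounds 1440p up).

```python
# Sketch of the rule of thumb: scale a 1080p VRAM budget by the pixel
# count of the target resolution. Base figures (4-6GB at 1080p) are the
# poster's estimates, not measured game requirements.
BASE_PIXELS = 1920 * 1080

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def vram_estimate(res: str, base_low: float = 4.0, base_high: float = 6.0):
    """Return a (low, high) VRAM estimate in GB for the given resolution."""
    w, h = RESOLUTIONS[res]
    scale = (w * h) / BASE_PIXELS
    return (base_low * scale, base_high * scale)

for name in RESOLUTIONS:
    lo, hi = vram_estimate(name)
    print(f"{name}: {lo:.1f}-{hi:.1f} GB")
```

4K works out to exactly 4x the 1080p pixel count, which is where the 16-24GB figure comes from; 1440p is only 1.78x, so the quoted 8-12GB builds in some safety margin.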
 
  • Like
Reactions: Leeea and Tlh97