8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite the former usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
[benchmark chart]
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT with ray tracing enabled:
[benchmark chart]
Company of Heroes: the 3060 has a higher minimum than the 3070 Ti:
[benchmark chart]

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others, please let me know and I'll add them to the list. Cheers!
 

Mopetar

Diamond Member
Jan 31, 2011
8,436
7,631
136
HUB has a new video out showing how much VRAM games utilize at various resolutions and performance settings. It covers a lot of the same titles that have already been discussed in this thread, but I thought it was interesting.

 

poke01

Diamond Member
Mar 8, 2022
3,754
5,087
106
HUB has a new video out showing how much VRAM games utilize at various resolutions and performance settings. It covers a lot of the same titles that have already been discussed in this thread, but I thought it was interesting.

Yep, do NOT buy an 8GB VRAM card in 2024. 12GB is the bare minimum and 16GB is ideal.
 

Iron Woode

Elite Member
Super Moderator
Oct 10, 1999
31,252
12,777
136
Well, he convinced me! 8 GB cards? Nah, I am now running my old GTX 1060 3GB card for 4K gaming at the "correct settings".

Never mind that the "correct settings" are 1024x768 upscaled with everything set to low, but hey, getting 6 FPS is obviously so worth it.
 

Thunder 57

Diamond Member
Aug 19, 2007
3,808
6,418
136
Well, he convinced me! 8 GB cards? Nah, I am now running my old GTX 1060 3GB card for 4K gaming at the "correct settings".

Never mind that the "correct settings" are 1024x768 upscaled with everything set to low, but hey, getting 6 FPS is obviously so worth it.

I've read 640x480 upscaled works better because it was the original VGA resolution. You might get some more frames out of that.
 

Ranulf

Platinum Member
Jul 18, 2001
2,822
2,423
136
Heh, I wish I still had a good CRT monitor at 1280x1024 res or so. It would be great for older games especially. If only LCD tech could scale as well as CRT did.
 

Thunder 57

Diamond Member
Aug 19, 2007
3,808
6,418
136
Heh, I wish I still had a good CRT monitor at 1280x1024 res or so. It would be great for older games especially. If only LCD tech could scale as well as CRT did.

That's one thing I miss about CRTs: options. I ran 1152x864 as my standard for a while, but I could always go a bit higher or lower if necessary. Their massive weight, though, I do not miss at all! EDIT: Didn't they fix that with Integer Scaling?
 
Jul 27, 2020
26,029
17,958
146
In terms of quality, OLED reminds me more of CRT, maybe coz CRTs were also good at searing retinas with actual electrons fired at the phosphor dots :S
 

coercitiv

Diamond Member
Jan 24, 2014
7,225
16,982
136
EDIT: Didn't they fix that with Integer Scaling?
The part that is harder to fix (as in emulate) is the inherent properties of CRT. Due to the way it works, one could argue it has both black frame insertion and frame generation baked in. :cool:
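The easy part is the scaling itself: integer scaling just duplicates each source pixel into an NxN block, so it stays sharp but only offers the whole-number factors that fit the panel. A quick sketch of the idea (Python, resolutions borrowed from the thread, purely illustrative):

Code:
# Integer scaling: find the largest whole-number factor at which the
# source mode still fits the panel, then letterbox whatever is left over.
def integer_scale(src_w, src_h, panel_w, panel_h):
    n = min(panel_w // src_w, panel_h // src_h)  # 0 means the mode doesn't fit
    return n, (src_w * n, src_h * n)             # factor and displayed size

print(integer_scale(1152, 864, 3840, 2160))   # (2, (2304, 1728)) plus black borders
print(integer_scale(1280, 1024, 3840, 2160))  # (2, (2560, 2048)) plus black borders

Anything in between falls back to fractional scaling and the usual blur, which is the flexibility a CRT gave you for free.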
 

Mopetar

Diamond Member
Jan 31, 2011
8,436
7,631
136
I remember getting a 17" CRT and thinking that I was king of the world. One could argue that given the weight it still might be useful to this day as gym equipment. That thing was a beast.

Has anyone actually tried out frame generation at a sub-720p resolution? I'm kind of curious, if only for a good laugh at what it might spit out. If it's nightmarish enough, someone could even design a game around it.
 

Iron Woode

Elite Member
Super Moderator
Oct 10, 1999
31,252
12,777
136
I remember getting a 17" CRT and thinking that I was king of the world. One could argue that given the weight it still might be useful to this day as gym equipment. That thing was a beast.

Has anyone actually tried out frame generation at a sub-720p resolution? I'm kind of curious, if only for a good laugh at what it might spit out. If it's nightmarish enough, someone could even design a game around it.
I had a 20" Sony CRT and that was one heavy monitor. It used BNC connectors.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,684
31,541
146
The Spider-Man Remastered testing in the OP by some YouTube rando is okay, but leaves something to be desired. It doesn't show system RAM usage, for starters. This morning my allocated amount was more than twice what was actually used. The vid shows a 3-4GB increase for the 8GB 4060 Ti, but seeing actual usage would have been more revealing: the 8GB card may have been using almost all of it, while the 16GB card may have been using half or less of what was allocated.
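If anyone wants to check this on their own card, here's a minimal sketch using NVML via the pynvml package (assumes an NVIDIA GPU and a pip-installed pynvml; NVML's "used" figure is the driver-level total across all processes, so treat it as a rough proxy rather than a per-game number):

Code:
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)    # totals reported in bytes
print(f"total: {mem.total / 2**30:5.1f} GiB")
print(f"used:  {mem.used  / 2**30:5.1f} GiB")   # all processes combined
print(f"free:  {mem.free  / 2**30:5.1f} GiB")
pynvml.nvmlShutdown()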

Running down the street in a not particularly crowded area is not stressful enough. Anyone with tens of kilometers of "skimmer" time knows that swinging near street level in busy areas like Broadway taxes the system with the amount of asset streaming.

Last-gen RX 6800 in the Horizon: Forbidden West expansion's toughest area, doing 4K high + FSR Quality + frame generation with impressive results, while using more VRAM than the 4060 Ti 8GB has to give.

 

marees

Golden Member
Apr 28, 2024
1,252
1,804
96
HUB has a new video out showing how much VRAM games utilize at various resolutions and performance settings. It covers a lot of the same titles that have already been discussed in this thread, but I thought it was interesting.

Text version (for the 4060 Ti, 8GB vs 16GB)

with the exception of entry-level options, you shouldn't be buying 8GB graphics cards anymore. 12GB is now the bare minimum, and 16GB is the ideal target.
https://www.techspot.com/review/2856-how-much-vram-pc-gaming/

What Does Running Out of VRAM Look Like?​

  1. Halo Infinite —​

    • The ultra preset with ray tracing enabled is out of the question, as it uses 10.3GB at 1080p and 11GB at 4K.
  2. Forspoken​

    • Upon release, Forspoken would miss all textures without enough VRAM, resulting in a horrible-looking game.
    • Now, it removes textures for everything not currently in view and tries to load them where they are most obvious. This results in texture pop-in and sometimes texture cycling, where high-quality textures appear and disappear (as we've also observed in Hogwarts Legacy).
  3. Ratchet & Clank: Rift Apart —​

    • It is possible to use the very high setting at 1080p on an 8GB GPU, but you will likely notice some frame time issues, and it's unlikely that all textures will be rendered at full quality. Enabling ray tracing with the very high preset won't end well, as this uses 11.2GB of VRAM. If you enable frame generation, it will push you over 12GB.
  4. Cyberpunk 2077: Phantom Liberty —​

    • Problems start to occur for 8GB graphics cards when enabling ray tracing, which pushes memory usage to 11GB at 1080p and 12GB at 1440p. If you enable frame generation as well, you're looking at 12GB of usage at 1080p, 13.5GB at 1440p, and 16.5GB at 4K.
  5. Forza Motorsport —​

    • The ultra preset with ray tracing enabled is out of the question, though, as it uses 10.3GB at 1080p and 11GB at 4K.
  6. The Last of Us Part 1 —​

    • Even after several optimization patches, the game still requires more than 8GB of VRAM at 1080p if you wish to use the 'ultra' preset.
  7. Avatar: Frontiers of Pandora —​

    • A quick in-game comparison between the 8GB and 16GB versions of the RTX 4060 Ti at 1440p using the ultra preset showed the 16GB model delivering 25% higher 1% lows.
  8. Homeworld 3 —​

    • Those playing Homeworld 3 with an 8GB graphics card should avoid the 'epic' quality preset and instead stick to 'high' as the maximum setting. If 'high' results in noticeable frame time issues, we recommend 'medium', as this will keep you within an 8GB buffer, even at 1440p.
  9. Hogwarts Legacy —​

    • 8GB graphics cards are limited to the 'high' preset without any ray tracing at 1080p and 1440p. Enabling the 'ultra' preset will result in either missing textures or poor frame time performance.
  10. Starfield — not memory-hungry, until 4K​

  11. Horizon: Forbidden West —​

    • In Horizon Forbidden West, in some in-game cutscenes, the 16GB model was up to 35% faster and had fewer frametime spikes, which is more noticeable during gameplay.
    • On average, the 16GB model was 40% faster and offered significantly better frame time performance, resulting in a much smoother and more consistent experience.
    • Even with DLSS upscaling, the 16GB model is much better, offering higher frame rates with better frame time consistency. In our tests, the 16GB model achieved 73 fps on average compared to 49 fps for the 8GB model – nearly a 50% improvement, explained by having sufficient VRAM capacity (see the quick check after this list).
  12. Senua's Saga: Hellblade 2 —​

    • Senua's Saga: Hellblade II will use more than 8GB of VRAM at 1080p with the 'high' preset and well over that if you want to enable frame generation.
  13. Ghost of Tsushima —​

    • If you enable DLSS frame generation, you might see some issues at 1440p and almost certainly at 4K.
  14. Alan Wake 2 —​

    • At 1080p, an 8GB GPU will suffice for the medium and high presets. However, enabling ray tracing will be problematic, as this requires over 10GB, so a 12GB VRAM buffer is necessary here.
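Quick check on the Horizon Forbidden West numbers in item 11, since "nearly a 50% improvement" from VRAM alone sounds dramatic:

Code:
# Relative gain of the 16GB 4060 Ti over the 8GB model (figures from item 11)
fps_16gb, fps_8gb = 73, 49
gain = (fps_16gb - fps_8gb) / fps_8gb * 100
print(f"{gain:.0f}%")   # prints 49%, so "nearly 50%" checks out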
 

marees

Golden Member
Apr 28, 2024
1,252
1,804
96

What would be the maximum you would pay for an 8GB card?

Assuming GPU size is not the bottleneck, it could be as powerful as a 4080, for example.


Current Nvidia prices on Newegg:
4060 - $300
4060 Ti 8GB - $390
4060 Ti 16GB - $450
4070 - $550
4070 Super - $600
 

MrTeal

Diamond Member
Dec 7, 2003
3,916
2,700
136
What would be the maximum you would pay for an 8GB card?

Assuming GPU size is not the bottleneck, it could be as powerful as a 4080, for example.


Current Nvidia prices on Newegg:
4060 - $300
4060 Ti 8GB - $390
4060 Ti 16GB - $450
4070 - $550
4070 Super - $600
Best Buy Canada had a Zotac 4060 for $310 Canadian pesos yesterday, which is US$228. I think in that price range it's perfectly acceptable to pick an 8GB card.
US$390 for an 8GB 4060 Ti is just out to lunch though.
 

Mopetar

Diamond Member
Jan 31, 2011
8,436
7,631
136

$200 if it's a new card from the latest generation which itself isn't over a year old. Otherwise $150 is where I'd draw the line, as moving up to something with 16 GB, even if it costs $300 more, will at least ensure it's still useful in 5 years.

I'd like to see Strix Halo included in some of these tests in the future. It's the first truly top-end APU in terms of what it brings to the table with its 40 CU GPU.

Although it's a premium product, it also has to live within a more limited TDP and doesn't benefit from faster GDDR memory (and further has to split its available bandwidth with the CPU), but I think it will be able to beat these discrete cards in the cases where they hit the 8 GB wall.
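Back-of-envelope on that bandwidth split (assuming the commonly reported 256-bit LPDDR5X-8000 configuration for Strix Halo and the 4060 Ti's 128-bit 18 Gbps GDDR6; both figures are assumptions worth double-checking):

Code:
# Peak memory bandwidth in GB/s: bus width in bytes times transfer rate in GT/s
def peak_bw(bus_bits, rate_gtps):
    return bus_bits / 8 * rate_gtps

print(peak_bw(256, 8))    # Strix Halo LPDDR5X-8000: 256.0 GB/s, shared with the CPU
print(peak_bw(128, 18))   # RTX 4060 Ti GDDR6:       288.0 GB/s, all for the GPU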

We'll probably see a more reasonable mainstream APU with around 24 CU in the near future and those will make 8 GB cards pointless. Even if the card has more shaders, the memory bottleneck will prevent them from being fully utilized.
 

Mopetar

Diamond Member
Jan 31, 2011
8,436
7,631
136
Strix Point has up to 16 CU, and while there's no guarantee that AMD increases to 24 CU with their next generation, if they don't, the one after definitely will.

I don't think it will be more than 3 years to see some mainstream part that hits those targets.
 

jpiniero

Lifer
Oct 1, 2010
16,493
6,989
136
Strix Point has up to 16 CU, and while there's no guarantee that AMD increases to 24 CU with their next generation, if they don't, the one after definitely will.

Well... If you think about it, Kraken is the real replacement for Phoenix, and that has fewer CUs. With node costs not getting cheaper and all of tech obsessing over AI, there's no telling what AMD does with iGPUs. The people who want good iGPU/low-end GPU performance also don't want to pay for it (see above).
 

Mopetar

Diamond Member
Jan 31, 2011
8,436
7,631
136
I think if they had a modular chiplet based solution they could cater to the market. Imagine something similar to their current desktop Zen but with an IO die that can connect to three chiplets, which can be any combination of CPU or GPU cores.

After RDNA3's issues I'm a little more leery about AMD going down that path anytime soon, but a few years ago it seemed like something they might be aiming at.