8GB VRAM not enough (and 10 / 12)

Page 145 - AnandTech forums

BFG10K

Lifer
Aug 14, 2000
8GB
In Horizon Forbidden West, the 3060 is faster than the 2080 Super despite usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and 4060 Ti 8GB.
[attached benchmark chart]
In Resident Evil Village, the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT with ray tracing enabled:
[attached benchmark chart]
In Company of Heroes, the 3060 has a higher minimum frame rate than the 3070 Ti:
[attached benchmark chart]

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

Rigg

Senior member
May 6, 2020
Concerning Doom: The Dark Ages: has anyone seen the 5060 Ti 8GB tested in PCIe 4 or 3 mode? This is the tier of card cats on older hardware upgrade to. AM4 is far and away the best-selling platform of all time. There are millions of 300/400/500-series boards and CPUs out there, and none of them are PCIe 5.0. There are also a lot of Intel 600-series B and H chipset owners, and even older, without 5.0. I'd especially be interested in 3.0, as the Ryzen 5500 still frequently sells in the top 5 and has been a top-10 mainstay since last year.

Will the texture pool size or another factor come more into play with less throughput?
It seems to be a good rule of thumb that any settings that cause the 8 GB card to lose performance vs. the 16 GB card will be exacerbated by PCIe 3 or 4. 4K Ultra Nightmare with quality upscaling seems to be the tipping point in Doom: The Dark Ages. I view the PCIe bandwidth thing as more of an interesting side effect of running out of VRAM than a real consideration. Even on PCIe 5 it doesn't look like it's worth using settings that exceed the available VRAM. W1zzard tested this on the 16GB card and there wasn't a noticeable difference. I think it's safe to assume the same will hold true for the 8 GB card unless it's in a scenario where it's out of VRAM.
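To put rough numbers on why the link generation matters once VRAM spills over, here's a back-of-envelope sketch. The bandwidth figures are approximate peak one-way numbers for an x8 link (the 5060 Ti uses an x8 interface); real-world throughput is lower.

```python
# Back-of-envelope: time to move spilled assets over the PCIe link.
# Approximate peak one-way bandwidth for an x8 link, in GB/s.
PCIE_X8_GBPS = {"3.0": 7.9, "4.0": 15.8, "5.0": 31.5}

def transfer_ms(overflow_gb, gen):
    """Milliseconds to shuffle `overflow_gb` of spillover one way."""
    return overflow_gb / PCIE_X8_GBPS[gen] * 1000

for gen in PCIE_X8_GBPS:
    print(f"PCIe {gen} x8: {transfer_ms(1.0, gen):6.1f} ms per GB of spillover")
```

So a card that has to page assets in and out over a 3.0 x8 link pays roughly four times the transfer-time penalty of the same card on 5.0, which fits the pattern of VRAM-limited settings getting worse on older platforms.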
 

Ranulf

Platinum Member
Jul 18, 2001
Came across this in the Dune Awakening beta this weekend: an experimental setting called "virtual shadow maps" that wants a minimum of 10GB of VRAM. Lumen is turned off automatically at medium graphics settings.

[attached screenshot]
 
Feb 4, 2009
Came across this in the Dune Awakening beta this weekend: an experimental setting called "virtual shadow maps" that wants a minimum of 10GB of VRAM. Lumen is turned off automatically at medium graphics settings.

Any thoughts on the beta please add them here

 

Ranulf

Platinum Member
Jul 18, 2001
I was just reading that thread earlier today for the laughs. Not surprised they locked it, though. So many people just bringing up the "it's just the market, man, it sucks but it's not that bad" mentality. Even worse than some of the defenders of the 6500 XT cards a couple years ago.
 

Rigg

Senior member
May 6, 2020
Even worse than some of the defenders of the 6500 XT cards a couple years ago.
Wut??? This was a thing? Lol, I actually own one. I got it in a trade and don't have the heart to sell it to some poor bastard. I got it for like $100 in trade value, so what ev. It might come in handy for troubleshooting or as an emergency backup.
 

Ranulf

Platinum Member
Jul 18, 2001
Wut??? This was a thing? Lol, I actually own one. I got it in a trade and don't have the heart to sell it to some poor bastard. I got it for like $100 in trade value, so what ev. It might come in handy for troubleshooting or as an emergency backup.

Yeah it was a thing with some, though this was in early '22 when those laptop chips on a card were really the only thing available at under $300. They were sort of right, but if you waited 6 months, the 6600 cards were dropping down to $250.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
@Rigg @Ranulf

This was me

[attached reaction GIF]


Don't be the dude clinking the bottles together. :p

Yeah it was a thing with some, though this was in early '22 when those laptop chips on a card were really the only thing available at under $300.
Indeed. Context is key. If it had enough VRAM for Ethereum, it was massively overpriced or nigh impossible to buy. That's why 4GB cards were a thing. People were paying $250-$300 for a 1050 Ti, for spaghetti monster's sake. Times were so grim they started manufacturing them again. The 1650 was another $50-$75 on top. The glorified display adapter known as the GT 1030 2GB, overall slower than a 5700G, was averaging $140. And don't forget the inflated prices AMD APUs were going for, all the way down to the 2200G. It was pure insanity.

Way too many people paid $350 for a GTX 1650 during the 2021 holiday season. The single-slot, slot-powered XFX 6400 I picked up on Prime Day for $117 shipped was not only by far the best slot-powered deal out there; it was the best deal, period. It hasn't been that good a deal since. Why? Because few cards can do what it does. But all of the 6400 and 6500 XT models were sold for less than the competing Nvidia offerings. My comments are for the U.S. market where I live; I have no input on other markets, and my conclusions may not hold up for them.

Ironically, both the 6400 and 6500 XT were perfect for pairing with lower-stack Intel 11th/12th-gen chips with UHD graphics, since the iGPU made up for the media features those cards lack, something Zen 2/3 on 500-series boards couldn't offer. All of them had the needed PCIe 4 support. For people on 400-series or older AM4, or older Intel, I did concede the GTX cards were the only real option under $400.
They were sort of right, but if you waited 6 months, the 6600 cards were dropping down to $250.
We were correct, not sorta correct. Because unless you had a time machine, there was no knowing when the crazy train would reach the station. It eventually left the station again. Ain't this market grand?
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Aug 22, 2001
The Wayback Machine shows "(So Many Compromises)" included in the article title from the review date.
Thanks man. That's why I was as confused as Rigg about the discussion. Aside from justuserbenchmarking, that dude has to be getting compensated for the endless hours of trolling and shilling.
 

Thunder 57

Diamond Member
Aug 19, 2007
Thanks man. That's why I was as confused as Rigg about the discussion. Aside from justuserbenchmarking, that dude has to be getting compensated for the endless hours of trolling and shilling.

justbenchmarking is a next-level shill.
 

coercitiv

Diamond Member
Jan 24, 2014
I still don’t get why people are happy paying more for less regarding 8GB vs 12 or 16GB. The memory isn’t that expensive, the margins on these cards are high.
I bet there's some product in your home that got shrunk in the last few years and you're still buying it at equal or higher prices. I'm pretty sure there's more than one in my home too. Don't blame the people; many are just unaware and don't have the energy to research each and every purchase they make to avoid pitfalls (we're talking all products and services in their life). This is why making noise on the subject is important; eventually it reaches the average buyer, even in their relatively low-effort run to upgrade the toy in their life.
 
Feb 4, 2009
I bet there's some product in your home that got shrunk in the last few years and you're still buying it at equal or higher prices. I'm pretty sure there's more than one in my home too. Don't blame the people; many are just unaware and don't have the energy to research each and every purchase they make to avoid pitfalls (we're talking all products and services in their life). This is why making noise on the subject is important; eventually it reaches the average buyer, even in their relatively low-effort run to upgrade the toy in their life.
My point was about the "8GB is fine" crowd or the "settings are wrong" crowd.
 

coercitiv

Diamond Member
Jan 24, 2014
My point was about the "8GB is fine" crowd or the "settings are wrong" crowd.
That's a different story indeed. That crowd ain't buying 8GB; they usually have more powerful hardware and pretend their knowledge about gaming should be the norm among all the Joes. That crowd also conveniently forgets to mention how they play games while browsers and other Chromium-based programs (e.g. Discord) stay open in the background. They conveniently forget to mention how 1GB+ of VRAM is usually lost to other programs before the game even starts. The $400 gamer should not expect such luxury, I guess.

We don't make endless threads about 16GB of RAM being fine for gaming with correct settings, we just tell people they should go with 32GB in new systems. Just as you said, the cost of memory is more than acceptable.
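The arithmetic behind that point is simple; here's a sketch of an 8 GB card's effective budget, with hypothetical background-app figures (the exact numbers vary by system and are only illustrative):

```python
# Illustrative VRAM budget on an 8 GB card (all figures hypothetical).
capacity_gb = 8.0
background_gb = {"compositor/OS": 0.4, "browser": 0.5, "Discord": 0.3}

reserved = sum(background_gb.values())   # claimed before the game launches
left_for_game = capacity_gb - reserved
print(f"{reserved:.1f} GB taken by background apps, "
      f"{left_for_game:.1f} GB left for the game")
```

With numbers like these, the game never actually sees 8 GB; it sees something closer to 7, which is exactly why "8GB is fine on the right settings" claims from people testing on clean benches don't transfer to real desktops.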
 

jpiniero

Lifer
Oct 1, 2010
I still don’t get why people are happy paying more for less regarding 8GB vs 12 or 16GB. The memory isn’t that expensive, the margins on these cards are high.

In case you haven't noticed, consoles are getting more expensive instead of getting cheaper (as had happened in previous generations).
 
Feb 4, 2009
In case you haven't noticed, consoles are getting more expensive instead of getting cheaper (as had happened in previous generations).
Point taken; however, I'm not going to say they're playing on the wrong settings.
I will say we need far more competition, more production (see the CHIPS Act), and/or the people who make games should optimize them for what's currently on the market.
 

marees

Golden Member
Apr 28, 2024
Doom on the id Tech engine:


Based on our testing, multi-frame generation is buggy on the 8 GB model and flawless on the 16 GB model. Reducing the texture pool size to 1.5 GB on an 8 GB GPU is essential. For those wondering, the texture pool size can be increased to 4 GB on a 16 GB GPU without any performance hit, though we couldn't identify any visual or performance improvements as a result.

VRAM Debate: 8GB vs. 16GB

Now let's talk about 8 GB GPUs in Doom: The Dark Ages, specifically the new 8 GB version of the RTX 5060 Ti. For the most part, 8 GB GPUs perform reasonably well in this game, and it's clear the developer has put effort into optimizing for that configuration. This makes sense, as the majority of PC gamers are still stuck on 8 GB GPUs, largely due to AMD and Nvidia, who continue to ship low-VRAM models and appear to be actively trying to kill PC gaming, but I digress.

As was the case with Space Marine 2, this game could greatly benefit from a proper 4K texture pack. While some textures look excellent, many appear low-resolution and lack detail when viewed at higher resolutions. We made similar comments about Space Marine 2, which some pushed back on – until the 4K texture pack was released. At that point, the game looked dramatically better but became unplayable on 8 GB cards.

[benchmark chart from the article]

To illustrate the difference, here's a look at how the 8 GB and 16 GB versions of the RTX 5060 Ti perform at 4K using DLSS Balanced upscaling during a large horde battle. While the 16 GB model's frame rate isn't great, the game is at least playable. In contrast, the 8 GB version is completely broken in this scenario – though this is an extreme case, meant to test VRAM limits.

Now, if we enable DLSS Quality upscaling at 1440p, the 8 GB 5060 Ti sees very little improvement over native performance, while the 16 GB model is roughly 40% faster. To further test VRAM saturation, we moved beyond the 30-second benchmark pass and played for several minutes.

[benchmark chart from the article]

Initially, the 16 GB card was about 42% faster. But a few minutes in, VRAM usage overwhelmed the 8 GB model, tanking its performance. This resulted in the 16 GB model delivering an 82% higher average frame rate – and over 200% better 1% lows.

We observed similar differences with the Nightmare and Ultra presets. Even the High preset showed some discrepancy, though Medium provided nearly identical performance and visuals, making it a more viable option for 8 GB GPUs.
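For what it's worth, the uplift figures quoted above are straightforward ratios. A quick sketch, with made-up frame rates chosen only to reproduce numbers of the same shape as the quoted 82% and 200% results:

```python
# Percent uplift of one card over another (frame rates are hypothetical,
# chosen only to mirror the shape of the figures quoted in the article).
def uplift_pct(fast_fps, slow_fps):
    return (fast_fps / slow_fps - 1) * 100

print(f"{uplift_pct(60, 33):.0f}% higher average frame rate")  # ~82%
print(f"{uplift_pct(45, 15):.0f}% better 1% lows")             # 200%
```

The takeaway is that "200% better 1% lows" means the 16 GB card's lows are three times the 8 GB card's, which is the difference between playable and broken.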

[attached screenshot]