8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
22,709
3,000
126
8GB
In Horizon Forbidden West the 3060 is faster than the 2080 Super, despite the 3060 usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
[Horizon Forbidden West benchmark chart]
In Resident Evil Village the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT with ray tracing enabled:
[Resident Evil Village benchmark chart]
In Company Of Heroes the 3060 has a higher minimum framerate than the 3070 Ti:
[Company Of Heroes benchmark chart]

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

poke01

Diamond Member
Mar 8, 2022
3,394
4,640
106
Iceberg lamenting how 8GB has become more of a problem just since he last visited the 5700XT in 2023.


In AC: Shadows, if you don't have enough VRAM, increasing the texture streaming quality does nothing; you still get bad textures.
As consoles get better this problem will worsen. If PS6 has 24GB of GDDR6 then PC will need 32GB VRAM minimum for AAA games released with PS6 in mind.
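A rough sketch of how that kind of estimate can be put together (every number below, from the OS reservation to the GPU share and the PC overhead factor, is an assumption for illustration rather than a published figure; push the assumptions harder and you land closer to the 32GB mentioned above):

```python
# Back-of-the-envelope: PC VRAM needed to match a console's unified memory budget.
# All split ratios below are illustrative assumptions, not published specs.

def pc_vram_estimate(console_total_gb, os_reserved_gb=3.0, gpu_share=0.7, pc_overhead=1.25):
    """Estimate the VRAM a PC port might want to match a console target.

    console_total_gb : total unified memory on the console
    os_reserved_gb   : assumed memory held back for the console OS
    gpu_share        : assumed fraction of the game budget acting as "VRAM"
    pc_overhead      : assumed extra headroom a PC port needs (separate CPU/GPU
                       pools, driver copies, higher settings)
    """
    game_budget = console_total_gb - os_reserved_gb
    return game_budget * gpu_share * pc_overhead

# Hypothetical PS6 with 24GB of unified memory:
print(f"{pc_vram_estimate(24):.1f} GB")   # ~18.4 GB with these particular assumptions
```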
 

Mopetar

Diamond Member
Jan 31, 2011
8,317
7,347
136
As consoles get better this problem will worsen. If PS6 has 24GB of GDDR6 then PC will need 32GB VRAM minimum for AAA games released with PS6 in mind.

I don't know if we'll get there for a while. Developers can always make more detailed textures and try to cram more of them into their game areas, and perhaps AI being able to cheaply generate these assets will drive this faster than it would otherwise occur, but until 8K becomes mainstream I don't think most games will go beyond 24 GB.

Keep in mind that only recently has 8 GB become insufficient in enough cases that most people should avoid it if possible. Most games in development are going to target consoles as well as PC and that means 12 GB is a practical maximum for most games. A few will likely have PC settings that push or eclipse 16 GB, but 24 is likely to be safe for a while.
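For a sense of scale behind "developers can always make more detailed textures", here's the usual texture-memory arithmetic (the bytes-per-texel, mip overhead, and 4:1 block-compression figures are standard rules of thumb, not numbers from any specific game):

```python
# Approximate VRAM cost of a single texture at various resolutions.
# Assumes 4 bytes/texel uncompressed, ~1.33x for a full mip chain,
# and 4:1 block compression, which are typical rules of thumb.

def texture_mb(size_px, bytes_per_texel=4, mip_factor=4/3, compression=4):
    return size_px * size_px * bytes_per_texel * mip_factor / compression / 2**20

for size in (1024, 2048, 4096, 8192):
    print(f"{size}x{size}: {texture_mb(size):6.1f} MB")
# 1024x1024:    1.3 MB
# 2048x2048:    5.3 MB
# 4096x4096:   21.3 MB
# 8192x8192:   85.3 MB
```

A few hundred unique 4K-class textures resident at once already adds up to several GB, which is why it's the jump to 8K-class assets that would really blow past 24 GB.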
 

SolidQ

Golden Member
Jul 13, 2023
1,350
2,085
96
Looking at Street Fighter 6:
Xbox Series S vs Switch 2, 8GB vs 12GB.
Switch 2 looks better, which suggests the RAM really does matter. Microsoft's two-pool RAM setup is pathetic.



I don't think most games will go beyond 24 GB.
Some devs want 32GB for a fully maxed-out configuration.
 

poke01

Diamond Member
Mar 8, 2022
3,394
4,640
106
I don't know if we'll get there for a while. Developers can always make more detailed textures and try to cram more of them into their game areas, and perhaps AI being able to cheaply generate these assets will drive this faster than it would otherwise occur, but until 8K becomes mainstream I don't think most games will go beyond 24 GB.

Keep in mind that only recently has 8 GB become insufficient in enough cases that most people should avoid it if possible. Most games in development are going to target consoles as well as PC and that means 12 GB is a practical maximum for most games. A few will likely have PC settings that push or eclipse 16 GB, but 24 is likely to be safe for a while.
I don’t think so. Even Switch 2 has 12GB of unified RAM. That’s the baseline now.
 

Mopetar

Diamond Member
Jan 31, 2011
8,317
7,347
136
I don’t think so. Even Switch 2 has 12GB of unified RAM. That’s the baseline now.

The Switch 2, despite being able to output 4K when docked, is primarily a 1080p system. If Nintendo has something akin to DirectStorage it'll definitely help with those limitations, but at the end of the day this is still a console whose GPU is not even as powerful as a 3050M.

I think it will lean heavily on DLSS, and while that might normally be an issue, with a console there's a single hardware specification, which means developers can tune things until the visuals match their expectations. When there's no native render to compare against, no one is the wiser.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,640
3,197
136
Looking at Street Fighter 6:
Xbox Series S vs Switch 2, 8GB vs 12GB.
Switch 2 looks better, which suggests the RAM really does matter. Microsoft's two-pool RAM setup is pathetic.
Microsoft uses 2 RAM pools; where is the problem with that? Because the Switch 2 looks better? It has 2GB more in total than the Series S, and from that video I didn't notice any meaningful difference.
And the Series S is already 3.5 years old, while being a lot cheaper than the Switch 2.

If nothing else, the Switch 2 should have been released with at least 16GB of RAM (24GB would be even better), not a pathetic 12GB, if they're asking that much for it.
 
Jul 27, 2020
23,691
16,618
146
Yeah, the Switch 2 screenshots look much more vibrant and detailed with a lot more contrast.

I hate to say it, but I can predict that some people will use this screenshot as evidence that Nvidia is the best in graphics.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,640
3,197
136

Display resolution is one story; render resolution is another.
It's true that plenty of games on the Switch 1 didn't render at native 720p, but the Switch 2 is many times more powerful, so Full HD should be more than doable.
Even if they release a game that can't render at that resolution and needs DLSS, that doesn't mean 12GB is "fine".
It will all depend on texture quality, because that's the biggest VRAM consumer.
Seriously, adding 4GB more would have cost only a few dollars in BOM; talk about being greedy when they're already asking $150 more.
A lot of devs were crying about low RAM in the XSS. It would be interesting to see how BG3 runs on the SW2; the BG3 devs complained a lot about the XSS.
So what's the problem, low RAM or the two pools of RAM (2GB+8GB) in the XSS? To me it looks like the problem is the amount of RAM.
If the problem is low RAM, then they will likely complain about the SW2's low RAM too; after all, it's only 2GB more.
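For what the two-pool complaint amounts to in numbers, here's a minimal sketch (the Series S split of 8GB fast plus 2GB slow is public; both OS reservations below are assumptions for illustration):

```python
# Back-of-the-envelope on the "2 pools" question. The Series S split
# (8GB fast + 2GB slow) is public; both OS reservations below are
# assumptions for illustration only.

def game_budget(total_gb, os_gb):
    return total_gb - os_gb

xss_fast, xss_slow = 8.0, 2.0
xss_game = game_budget(xss_fast + xss_slow, os_gb=2.0)   # assume the OS lives mostly in the slow pool
sw2_game = game_budget(12.0, os_gb=3.0)                  # assumed Switch 2 OS reservation

print(f"Series S game budget: ~{xss_game:.0f} GB (fast pool is {xss_fast:.0f} GB)")
print(f"Switch 2 game budget: ~{sw2_game:.0f} GB (one uniform pool)")
```

With these assumptions the two budgets end up within a gigabyte or two of each other, which is the point above: the total amount, not the pool layout, is the wall developers keep hitting.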
 

Mopetar

Diamond Member
Jan 31, 2011
8,317
7,347
136
Yeah, the Switch 2 screenshots look much more vibrant and detailed with a lot more contrast.

I'm assuming it's due to slight, intentional differences between the two. The Switch version may have changes made so that it looks better on the Switch's own display.

I hate to say it, but I can predict that some people will use this screenshot as evidence that Nvidia is the best in graphics.

I'm sure some pillocks will do exactly that. I suggest switching the labels on the images before showing them because they're going to answer how they want to and will make up justifications as they go. It's also a lot funnier when you tell them what you did and they now have to refute their own arguments.

The funny thing is that the Switch 2 APU has more GPU cores so if it could clock high enough it probably would beat the Series S GPU. The Ampere cores in the Switch 2 are contemporary with the RDNA 2 cores in the Series S, but it's likely that the Switch 2 does have some more recent tech that has been back ported. It's probably comparable to a PS4, which is nothing to sneeze at.
 

marees

Senior member
Apr 28, 2024
953
1,270
96
I'm assuming it's due to slight, intentional differences between the two. The Switch version may have changes made so that it looks better on the Switch's own display.



I'm sure some pillocks will do exactly that. I suggest switching the labels on the images before showing them because they're going to answer how they want to and will make up justifications as they go. It's also a lot funnier when you tell them what you did and they now have to refute their own arguments.

The funny thing is that the Switch 2 APU has more GPU cores so if it could clock high enough it probably would beat the Series S GPU. The Ampere cores in the Switch 2 are contemporary with the RDNA 2 cores in the Series S, but it's likely that the Switch 2 does have some more recent tech that has been back ported. It's probably comparable to a PS4, which is nothing to sneeze at.
I'm guessing that in handheld mode power limitations come into play.

It's probably HDR on the Switch 2 that makes it look better than the Series S.
Plus DLSS 2 might allow 540p-resolution gaming in handheld mode.
 

Mopetar

Diamond Member
Jan 31, 2011
8,317
7,347
136
There are plenty of Switch 2 games that could pull off good-looking native 1080p with the hardware it has. Not everything is trying to be a graphical powerhouse, and I think many Switch games look better because of their art direction even if the poly count is lower.

Ports from other systems will likely run at lower internal resolutions and use DLSS to upscale though. The Switch 2 just doesn't have the muscle to compete against something with over twice the cores clocked at over twice the speed.
 

poke01

Diamond Member
Mar 8, 2022
3,394
4,640
106
Keep in mind the next console generation will likely use more than 16GB. So what, in like 3 years?

The PS5 Pro already has 16GB dedicated for gaming only. It does have an extra 2GB for the OS. Most likely the PS6 will double the base PS5 memory.

So buying a 12GB $500 GPU in 2025 is crazy, as it will likely be dead for high texture settings in games like GTA6 when it releases on PC.
 

psolord

Platinum Member
Sep 16, 2009
2,107
1,244
136
Wait, what?

So it has the "concept" of 1080p but isn't actually running at 1080p? And it's OK to list this as 1080p performance because you subjectively think it's probably "pretty close"?

And you keep saying how these cards are running 1080p high/ultra settings when... they aren't.

That's disingenuous, man.

Maybe I missed the asterisk, but when the text with the video is "hey, look at how these do at 1080p" I am expecting the game to be set to 1080p, not just the monitor resolution to be set to 1080p.
There's nothing disingenuous. Everything is recorded. These are the default settings.

I said it before and let me repeat: the 3060 Ti on the Indiana Jones run is using up to 165W. That's way lower than its max power. For dynamic resolution to kick in, the card needs to be pushed hard.

This system is now decommissioned, but I will rebuild it in a couple of weeks. Then I will do the same run without dynamic res and you will see it's the same thing.
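For anyone wondering why dynamic resolution only engages when the card is pushed hard, here's a minimal sketch of how a typical controller behaves (the frametime target, thresholds, and step size are invented for illustration; real engines use smoother heuristics):

```python
# Minimal sketch of a dynamic-resolution controller. The target frametime,
# thresholds, and step size are invented for illustration; real engines
# use more sophisticated, smoothed heuristics.

def update_scale(scale, gpu_frametime_ms, target_ms=16.7,
                 lo=0.95, hi=1.03, step=0.05, min_scale=0.5, max_scale=1.0):
    """Adjust the render-resolution scale based on how close the GPU is to budget."""
    if gpu_frametime_ms > target_ms * hi:     # over budget: drop resolution
        return max(min_scale, scale - step)
    if gpu_frametime_ms < target_ms * lo:     # comfortable headroom: raise it back
        return min(max_scale, scale + step)
    return scale

# A GPU drawing ~165W out of a much higher power limit usually means frametimes
# are under budget, so the scale just sits at 1.0 (native) for the whole run:
scale = 1.0
for frametime in (12.0, 13.5, 12.8, 14.0):   # ms, all under a 16.7 ms budget
    scale = update_scale(scale, frametime)
print(scale)   # 1.0
```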
 

psolord

Platinum Member
Sep 16, 2009
2,107
1,244
136
I think your comprehension is failing. In what way is it academic for a 3060 owner to run a new game at 1080p and maximum settings? The specific example you responded to seems like a very practical setting, and one at which playable frame rates are achievable. I would not be at all surprised to find that most 3060 owners use those exact settings. In other words, not at all academic.
It is academic because they are using the max texture pool size, which causes the card to fail. A low pool size does not mean low texture quality in this game. The solution is two clicks away, and yet you are paying attention to a manufactured result, which was made specifically to make 8GB cards fail.

Well you're failing at that as well for reasons that similarly seem beyond your grasp or more likely that you just refuse to understand.

Explain to me why a 3060 and even a 2060 can run a new game at 1080p and achieve better frame rates? That should never happen. A newer product should not perform worse than an older one from the last generation, much less one from two generations prior.

No one disputes that you can turn down the visual settings to increase the frame rate, but that's typically something you have to do on an older card to extend the life of it. The problem is that you shouldn't have to do that with a newer card when the same model of card from several years ago can still run the game fine.

Once again the problem with your argument is that it's entirely subjective. No one actually needs a discrete GPU because with correct settings you could run the game on the iGPU. That's how asinine your argument is because you have no better justification for why your settings are correct than anyone else does.

This is like claiming that after drinking the correct number of beers the troglodyte at the other end of the bar will be pretty. Or you could buy an appropriate amount of VRAM so you don't wake up tomorrow and realize what a terrible mistake you made.
The real problem here is that I am not the one failing to grasp the problem; the rest of you are.

I show you Indiana Jones running fine, being pretty and absolutely usable and you are like NO SET IT AT MAX. wtf?

The reason these 12GB cards can do better with THESE SPECIFIC settings is that this test is set up to do so. According to my maxed-out run of the 3060 Ti at 165W, there is no actual usability problem with the 8GB 4060 in that game. Only if you set it up to fail.

If you set the texture pool size to low, which will largely not affect the quality, the 4060 will jump in front of the other two 12GB cards. And that's the problem I have with this thread.

Your iGPU and troglodyte arguments are silly and a far cry from what I'm saying. Good image quality with correct settings is not subjective; it's a fact.
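On the texture pool point, a simplified sketch of what a streaming pool budget does when it doesn't fit in VRAM (the numbers and the cutoff rule are illustrative assumptions, not any engine's actual code):

```python
# Simplified sketch of a texture streaming pool. If the configured pool budget
# doesn't fit alongside everything else in VRAM, textures spill over the PCIe
# bus and frametimes collapse; a smaller pool just means slightly lower mips
# get streamed in, which is often hard to notice. Numbers are illustrative.

def streaming_outcome(vram_gb, other_usage_gb, pool_setting_gb):
    free_for_pool = vram_gb - other_usage_gb
    if pool_setting_gb <= free_for_pool:
        return f"pool {pool_setting_gb} GB fits -> smooth, mips chosen within budget"
    return (f"pool {pool_setting_gb} GB > {free_for_pool:.1f} GB free -> "
            "spills to system RAM, frametime spikes")

# Hypothetical 8GB card with ~5GB already used by render targets, geometry, etc.
for pool in (2, 3, 4, 6):
    print(streaming_outcome(vram_gb=8, other_usage_gb=5, pool_setting_gb=pool))
```

The sketch only shows why a "max pool" setting falls off a cliff on an 8GB card while a lower pool setting can look nearly identical.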