8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite usually competing with the 2070, and it also has better 1% lows than the 4060 and 4060 Ti 8GB:
[Horizon Forbidden West benchmark chart]
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT with ray tracing enabled:
[Resident Evil Village benchmark chart]
Company Of Heroes: the 3060 has a higher minimum framerate than the 3070 Ti:
[Company Of Heroes benchmark chart]

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

Jaskalas

Lifer
Jun 23, 2004
Which games are struggling at low/medium texture settings with 8GB of VRAM?

I mean, they can look ugly but run fine, right? Isn't that what you expect when you turn the settings down?
The issue was never performance at lower settings, it is quality at lower settings. Worse than expected, worse than something 10 years ago, and some games have resolved it with a patch.
It is almost like they had a binary texture setting and skipped everything in-between. Those 8GB frame buffers were looking like 2GB frame buffers.

So the complaint isn't having to run on medium, it was the shocking result when doing so.
 

blckgrffn

Diamond Member
May 1, 2003
www.teamjuchems.com
The issue was never performance at lower settings, it is quality at lower settings. Worse than expected, worse than something 10 years ago, and some games have resolved it with a patch.
It is almost like they had a binary texture setting and skipped everything in-between. Those 8GB frame buffers were looking like 2GB frame buffers.

So the complaint isn't having to run on medium, it was the shocking result when doing so.

Yes. They are going to optimize for current consoles (which have more than 8GB of RAM) and then have a setting that works on a potato. Or as close to a potato as they can conceivably support.

Everything in between takes time, and time is money. When they can judge future investments based on sales and release updates later if needed, it's no surprise we've gotten here. This is the natural state of things.

And buying a midrange or better card released in 2023 should take this (this being reality) into account. It's a frustrating and terribly timed regression by nvidia to walk VRAM back on the vanilla 60 series.
 
Feb 4, 2009
Yes. They are going to optimize for current consoles (which have more than 8GB of RAM) and then have a setting that works on a potato. Or as close to a potato as they can conceivably support.

Everything in between takes time, and time is money. When they can judge future investments based on sales and release updates later if needed, it's no surprise we've gotten here. This is the natural state of things.

And buying a midrange or better card released in 2023 should take this (this being reality) into account. It's a frustrating and terribly timed regression by nvidia to walk VRAM back on the vanilla 60 series.
While I'm living in fantasy land: how awesome would it be if intel went F-it and provided Battle Mage cards with a ridiculous amount of memory, starting at 16GB on the low end and going up to 64GB on the high end? Basically a big FU to nvidia & AMD.
 

VirtualLarry

No Lifer
Aug 25, 2001
While I'm living in fantasy land: how awesome would it be if intel went F-it and provided Battle Mage cards with a ridiculous amount of memory, starting at 16GB on the low end and going up to 64GB on the high end? Basically a big FU to nvidia & AMD.
I could see that happening if two things are true:
1) BM could be used for AI.
2) BM has the brute rasterization power to handle that much VRAM.

Even still, I think 64GB of VRAM would be a "deluxe SKU" (or a pro/workstation card version) rather than a "standard gamer version".
 
Feb 4, 2009
I could see that happening if two things are true:
1) BM could be used for AI.
2) BM has the brute rasterization power to handle that much VRAM.

Even still, I think 64GB of VRAM would be a "deluxe SKU" (or a pro/workstation card version) rather than a "standard gamer version".
I'm going to roll with any part number that begins or ends with the letters F and U in that exact order.
 
Jul 27, 2020
Even still, I think 64GB of VRAM would be a "deluxe SKU" (or a pro/workstation card version) rather than a "standard gamer version".
I bet a pretty decent AAA game could be made that puts all required texture assets into that much VRAM. No more hitching! No more GPU resources going idle waiting on data requests from system RAM!
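For a sense of scale, here is a rough back-of-the-envelope sketch (in Python) of how many 4K textures a given VRAM pool could keep resident; the BC7 compression rate and mip-chain overhead are assumptions for illustration, not figures from any particular game or engine.

```python
# Back-of-the-envelope estimate: how many 4K textures fit in a given VRAM pool?
# Assumes BC7 block compression (~1 byte per texel) and a full mip chain (~4/3 overhead).
# Purely illustrative numbers, not taken from any specific game or engine.

def texture_size_mib(width: int, height: int, bytes_per_texel: float = 1.0,
                     with_mips: bool = True) -> float:
    """Approximate size of one texture in MiB."""
    base = width * height * bytes_per_texel
    if with_mips:
        base *= 4.0 / 3.0  # geometric series for the full mip chain
    return base / (1024 ** 2)

def textures_that_fit(vram_gib: float, tex_mib: float) -> int:
    """How many textures of tex_mib each fit in vram_gib of VRAM (ignoring everything else)."""
    return int(vram_gib * 1024 // tex_mib)

if __name__ == "__main__":
    tex = texture_size_mib(4096, 4096)  # ~21.3 MiB per 4K BC7 texture with mips
    for vram in (8, 16, 64):
        print(f"{vram:>2} GiB VRAM holds roughly {textures_that_fit(vram, tex):,} "
              f"4K BC7 textures ({tex:.1f} MiB each)")
```

Running it prints roughly 384, 768 and 3,072 textures for 8, 16 and 64 GiB respectively, ignoring everything else that has to share the pool.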
 

Mopetar

Diamond Member
Jan 31, 2011
While I'm living in fantasy land: how awesome would it be if intel went F-it and provided Battle Mage cards with a ridiculous amount of memory, starting at 16GB on the low end and going up to 64GB on the high end? Basically a big FU to nvidia & AMD.

64GB is almost pointless for consumer cards, at least for gamers. CAD users would probably love it and a few hobbyists would find a way to put it to use, but very few games would even try to take advantage of it.

If they offered up to 32GB that'd be solid well beyond the useful life of the card. 8K might be able to utilize that much VRAM, but that's still at the fringes of what's possible and widely supported and most people would rather have faster frames at 4K than a passable frame rate at 8K.
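To put rough numbers on the 8K point, here is a small sketch of render-target memory at 8K; the buffer formats and counts are illustrative assumptions, not any real engine's layout.

```python
# Rough math on 8K render-target memory, to gauge how far 32GB of VRAM goes.
# The buffer set below is an assumption for illustration only.

WIDTH, HEIGHT = 7680, 4320           # 8K UHD
PIXELS = WIDTH * HEIGHT

# name -> effective bytes per displayed pixel
buffers = {
    "back buffer (RGBA8, triple buffered)": 4 * 3,
    "HDR scene color (RGBA16F)": 8,
    "G-buffer (4x RGBA8)": 4 * 4,
    "depth/stencil (D32S8)": 5,
}

total_bytes = sum(PIXELS * bpp for bpp in buffers.values())
for name, bpp in buffers.items():
    print(f"{name:40s} {PIXELS * bpp / 2**20:7.0f} MiB")
print(f"{'total render targets':40s} {total_bytes / 2**30:7.2f} GiB")
```

Even with a fairly fat set of targets this lands around 1.3 GiB, so it would be textures and other assets, not the framebuffers themselves, that had to soak up 32GB.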
 
Feb 4, 2009
64GB is almost pointless for consumer cards, at least for gamers. CAD users would probably love it and a few hobbyists would find a way to put it to use, but very few games would even try to take advantage of it.

If they offered up to 32GB that'd be solid well beyond the useful life of the card. 8K might be able to utilize that much VRAM, but that's still at the fringes of what's possible and widely supported and most people would rather have faster frames at 4K than a passable frame rate at 8K.
Come on don’t ruin my fantasy.
 

DeathReborn

Platinum Member
Oct 11, 2005
Yes. They are going to optimize for current consoles (which have more than 8GB of RAM) and then have a setting that works on a potato. Or as close to a potato as they can conceivably support.

Everything in between takes time, and time is money. When they can judge future investments based on sales and release updates later if needed, it's no surprise we've gotten here. This is the natural state of things.

And buying a midrange or better card released in 2023 should take this (this being reality) into account. It's a frustrating and terribly timed regression by nvidia to walk VRAM back on the vanilla 60 series.
The Series S has 8GB of VRAM (out of 10GB shared) and targets 1080p/1440p. Perhaps PC games are unoptimised, or just need a setting labelled Series S Mode for 8GB VRAM cards. I do think $250 is the absolute max for 8GB, and that's being generous by $50+. We also need memory manufacturers to step up with 24/32Gbit modules.
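On the module point: a card's capacity is essentially fixed by bus width and chip density, which is why denser GDDR parts matter. A small sketch of that arithmetic follows; the bus widths used are just typical midrange examples, not any specific upcoming product.

```python
# Why module density matters: capacity = (bus width / 32) chips x chip density,
# doubled in clamshell mode. GDDR6/GDDR6X devices are 32 bits wide.
# Bus widths below are typical midrange examples, not a specific product.

def card_capacity_gb(bus_width_bits: int, chip_gbit: int, clamshell: bool = False) -> int:
    chips = bus_width_bits // 32           # one x32 device per 32-bit channel
    if clamshell:
        chips *= 2                         # two chips share each 32-bit channel
    return chips * chip_gbit // 8          # Gbit per chip -> GB total

for bus in (128, 192, 256):
    for density in (16, 24, 32):
        print(f"{bus:3d}-bit bus, {density}Gbit chips: "
              f"{card_capacity_gb(bus, density):2d} GB "
              f"({card_capacity_gb(bus, density, clamshell=True)} GB clamshell)")
```

That's the crux: on a 128-bit bus, 16Gbit chips cap you at 8GB without going clamshell, while 24Gbit and 32Gbit parts get you to 12GB or 16GB on the same bus.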
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
8GB vs 11GB. Game go blurrrrrr.

I don't know why he calls it a nightmare for reviewers though. Most have been churning out content with little to no image analysis for years now. It's good to see more of them returning to it. Even if it's perforce.

blckgrffn

Diamond Member
May 1, 2003
www.teamjuchems.com
The Series S has 8GB of VRAM (out of 10GB shared) and targets 1080p/1440p. Perhaps PC games are unoptimised, or just need a setting labelled Series S Mode for 8GB VRAM cards. I do think $250 is the absolute max for 8GB, and that's being generous by $50+. We also need memory manufacturers to step up with 24/32Gbit modules.

Yeah, I mentioned that in another post, but they might make up what, 10% to 15% of current gen console sales? They’ve had a very cool reception.

Developers have been scathing about its existence, especially its reduced RAM amount.

It's had to be heavily discounted to really move, and its combination of no disc drive and small storage makes it a frustrating appliance for Gamepass.

Since it has too little RAM, back-compat games use the vanilla Xbox One profile rather than the One X profile, meaning lower resolutions and detail settings.

Gamers also complain about how bad games look on it, sound familiar? 😂 It looks like it gets the potato treatment. As expected.
 

Spjut

Senior member
Apr 9, 2011
I find the excuses that it's the developers' fault annoying. It may explain the problem, but it doesn't solve the issue people are running into.

The Last of Us got some much-needed patches, but you cannot count on patches like that for every other game. People running into VRAM limitations will always be at the mercy of post-launch support, and that's something reviewers ought to point out alongside their pronouncements about developers being lazy.

AMD's Bulldozer CPUs have aged better than the i3s of the time, but back when they were released, nobody made excuses that developers were lazy because a game didn't benefit from eight cores/threads; the popular statement of the time was that AMD's single-threaded performance was simply lackluster.

Likewise, games with high requirements kept using DX9 all the way until the PS4 generation started, and reviewers showed no mercy at the time, stating these games could run better on modern CPUs and GPUs with a DX10/11 patch.

TLOU also reminded me of Red Dead Redemption 2 because of the contrast. When you played it on low, it really reminded you of PS2 textures, on GPUs that had once compared and performed favorably against the PS4 version.
I guess it was kind of glossed over because many reviewers thought it was no big deal to replace eight-year-old GPUs. But accusations of texture quality being lower than it should be are really nothing new; it has just gotten more attention because it sticks out negatively when it happens on new and expensive hardware.
 

Mopetar

Diamond Member
Jan 31, 2011
8GB vs 11GB. Game go blurrrrrr.

I don't know why he calls it a nightmare for reviewers though. Most have been churning out content with little to no image analysis for years now. It's good to see more of them returning to it. Even if it's perforce.

Because until recently you could trust that regardless of whether you had an AMD or NVidia card or how game settings were configured, they would both produce essentially the same image.

Between all of the new DLSS/FSR versions and settings, and games not loading textures instead of dropping frame rates to allow time to load them, it now becomes necessary to do a detailed image analysis; the raw numbers aren't worth as much when everyone is trying to cheat them.

Doing a 30 game benchmark suddenly becomes a nightmarish amount of work when you need to do image analysis on all of it to make sure that the FPS it claims to deliver is actually giving you the settings level you're benchmarking at.
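As a minimal sketch of the kind of automated first-pass check that could flag suspect results, assuming matched screenshots have been captured from two cards at identical settings (the file names below are hypothetical placeholders):

```python
# Minimal sketch: compare two captured frames of the same scene (e.g. an 8GB card vs a
# 16GB card at identical settings) and flag large per-pixel differences that could hint
# at missing or low-res textures. File names are hypothetical placeholders.

import numpy as np
from PIL import Image

def frame_difference(path_a: str, path_b: str) -> float:
    """Mean absolute per-channel difference (0-255) between two same-size screenshots."""
    a = np.asarray(Image.open(path_a).convert("RGB"), dtype=np.float32)
    b = np.asarray(Image.open(path_b).convert("RGB"), dtype=np.float32)
    if a.shape != b.shape:
        raise ValueError("screenshots must be captured at the same resolution")
    return float(np.abs(a - b).mean())

if __name__ == "__main__":
    diff = frame_difference("card_8gb_ultra.png", "card_16gb_ultra.png")
    # The threshold here is arbitrary; flagged scenes still need a human eyeball.
    print(f"mean per-pixel difference: {diff:.2f}",
          "(worth a closer look)" if diff > 5 else "")
```

It's only a crude filter, and a human still has to look at whatever it flags, which is exactly why adding this step to a 30-game suite balloons the workload.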
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
Because until recently you could trust that regardless of whether you had an AMD or NVidia card or how game settings were configured, they would both produce essentially the same image.

Between all of the new DLSS/FSR versions and settings, and games not loading textures instead of dropping frame rates to allow time to load them, it now becomes necessary to do a detailed image analysis; the raw numbers aren't worth as much when everyone is trying to cheat them.

Doing a 30 game benchmark suddenly becomes a nightmarish amount of work when you need to do image analysis on all of it to make sure that the FPS it claims to deliver is actually giving you the settings level you're benchmarking at.
I don't think you intend it as condescension, I suspect it just reads that way to me. And if you think I am that dumb, that I need these things explained to me, you're wrong. I'm dumber than that.

Aussie Steve pointed out recently how his bar charts can mislead. They have been doing it a long time, as I've pointed out over the years. And my image analysis remark was not the extent of my criticisms. I was merely constraining myself to it for brevity.

30 card shootouts would be too tough eh? Great, stop doing them. I'd prefer each product be thoroughly evaluated instead of the equivalent of a fake rasslin' royal rumble for clicks/views. They can, they have, and they should continue, to do better.

If my opinion is unpopular, cool. I don't require external validation. There are prevailing mentalities about this hobby I don't subscribe to. Ferengi math, acceptance of lazy testing for dollars, and white knighting trillion dollar companies among them. I can understand clicks for dollars, but I don't have to like or accept it.
 

fralexandr

Platinum Member
Apr 26, 2007
www.flickr.com
I don't think you intend it as condescension, I suspect it just reads that way to me. And if you think I am that dumb, that I need these things explained to me, you're wrong. I'm dumber than that.

Aussie Steve pointed out recently how his bar charts can mislead. They have been doing it a long time, as I've pointed out over the years. And my image analysis remark was not the extent of my criticisms. I was merely constraining myself to it for brevity.

30 card shootouts would be too tough eh? Great, stop doing them. I'd prefer each product be thoroughly evaluated instead of the equivalent of a fake rasslin' royal rumble for clicks/views. They can, they have, and they should continue, to do better.

If my opinion is unpopular, cool. I don't require external validation. There are prevailing mentalities about this hobby I don't subscribe to. Ferengi math, acceptance of lazy testing for dollars, and white knighting trillion dollar companies among them. I can understand clicks for dollars, but I don't have to like or accept it.
Hah, I remember way back when they always did image quality comparisons, because new graphics cards actually had an impact on how great games looked at the same settings. Sure there weren't as many AAA games, and games were a lot simpler back then, but reviewers actually checked.

edit: Also, wow the main site has some obnoxious ads now...
 

Mopetar

Diamond Member
Jan 31, 2011
I don't think you intend it as condescension, I suspect it just reads that way to me.

It's just my interpretation of why someone would call it a nightmare. Previously you could get away with just the bar charts because you could trust the data to present an accurate enough picture.

I'm more annoyed at NVidia/AMD and the game developers for going down this road because it removes objective measures of performance and just leads to people picking favorable reviews of image quality and calling anyone else biased.

To actually review image quality in an objective manner would require a double-blinded study of dozens of individuals. No review site can afford to do something like that, and most probably don't want to take the massive amount of time to do their own subjective analysis because no matter what they conclude someone in the comment section is going to call them a biased shill and scream about linking their results. We already get plenty of that just based on test-setup, settings, etc.

So if I sound irritated it's because I am, but it's not at you or anyone here, or even the reviewers themselves. It's being upset at the awful state of affairs that I knew was going to be an issue as soon as DLSS was trotted out, and that everyone is only just starting to wake up to. It's probably going to get even worse since we've gone from upscaling to generating fake frames. I can only imagine how great the image quality is going to be when the game engine can't load the textures and the AI is creating fake frames from a low-resolution texture file.

It makes it even worse for me as a consumer because now it's a lot harder to do any kind of meta-analysis of reviews. How many people have ten hours to devote to watching all of the different reviews and trying to figure out how to sum up the different reviewers' opinions on image quality differences?
 

BFG10K

Lifer
Aug 14, 2000
I find the excuses that it's the developers' fault annoying.
What I find even more annoying is that the same people demanding developers patch games to work on 8GB because "it's the developer's fault" never demand games work on dual-core CPUs with 4GB of system RAM and 5400rpm HDDs.

In fact a lot of them are trumpeting the benefits of DirectStorage as the savior messiah, which ironically needs high-end Gen 5 SSDs to work best. I'll bet the number of people with >8GB GPUs far outnumbers those with Gen 5 SSDs.
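For a rough sense of the gap being pointed at, here is a small sketch comparing how long it takes just to move a hypothetical 4GB texture working set over various links; all bandwidth figures are ballpark assumptions, not measurements.

```python
# Rough streaming-time comparison: even a fast Gen 5 SSD is an order of magnitude or
# more slower than simply having the assets resident in VRAM.
# Bandwidth figures are ballpark assumptions, not measurements.

WORKING_SET_GB = 4.0   # hypothetical texture working set for one game area

tiers_gb_per_s = {
    "SATA SSD":                        0.55,
    "PCIe 3.0 NVMe SSD":               3.5,
    "PCIe 5.0 NVMe SSD":               12.0,
    "GDDR6 VRAM (already resident)":   400.0,
}

for tier, bw in tiers_gb_per_s.items():
    print(f"{tier:32s} ~{WORKING_SET_GB / bw * 1000:7.1f} ms to move {WORKING_SET_GB:.0f} GB")
```

Even a Gen 5 drive needs hundreds of milliseconds for that much data, versus essentially nothing if the assets already sit in VRAM.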

It's almost like...there's an intrinsic bias toward the nine year old VRAM time capsules NV keeps releasing.
 