8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
[benchmark chart]
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT when ray tracing:
[image: RE.jpg]
Company of Heroes: the 3060 has a higher minimum than the 3070 Ti:
[image: CH.jpg]

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

jpiniero

Lifer
Oct 1, 2010
They just might. They know that the low VRAM is making them look pretty bad in benchmarks and now they get to charge possibly up to $100 more for the highest VRAM configuration. But that will be MSRP. Actual prices will be crazy high for the 16GB one.

I think this is just a case where Gigabyte has no idea what nVidia is planning and just made up 10/12/16 GB SKUs. A theoretical 16 GB model would be 128-bit with clamshell mode or 4 GB chips.
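The bus-width arithmetic behind that last point: GDDR6/GDDR6X devices have 32-bit interfaces, so a 128-bit bus hosts four of them, and reaching 16 GB requires either two devices per channel (clamshell) or 4 GB (32 Gbit) chips. A minimal sketch:

```python
# Sketch: VRAM capacity from bus width and chip size.
# GDDR6/6X chips use 32-bit interfaces, so channel count = bus_width / 32.
def vram_gb(bus_width_bits, chip_gb, clamshell=False):
    channels = bus_width_bits // 32
    # Clamshell mode puts two chips on each channel, doubling capacity.
    chips = channels * (2 if clamshell else 1)
    return chips * chip_gb

print(vram_gb(128, 2))                  # -> 8
print(vram_gb(128, 2, clamshell=True))  # -> 16
print(vram_gb(128, 4))                  # -> 16
```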
 

TESKATLIPOKA

Platinum Member
May 1, 2020
There is also a strong belief on my part that GPU manufacturers unduly influence developers to make their games as difficult to run as possible, and that includes going extra heavy on the VRAM even when not necessary. It's the way they stay in business.
AD106 and AD107 have only 8GB of VRAM and AD104 has 12GB, so I don't think influencing game developers to increase VRAM usage is what Nvidia does. If they wanted to stay in business, they would influence them to use less VRAM.
 
  • Like
Reactions: Tlh97 and scineram

GodisanAtheist

Diamond Member
Nov 16, 2006
AD106 and AD107 have only 8GB of VRAM and AD104 has 12GB, so I don't think influencing game developers to increase VRAM usage is what Nvidia does. If they wanted to stay in business, they would influence them to use less VRAM.

- While I don't entirely buy into the base theory, NV has shown itself absolutely willing to screw over its own customers if it means screwing over AMD more.

However, given that AMD tends to have the VRAM advantage per tier (at least for the last gen), I can definitely see this as a play on their part (they even did it with Far Cry 6's high-res texture pack).
 

Heartbreaker

Diamond Member
Apr 3, 2006
No, I don't believe it's "a lie". It's just that other sites report no such issue, instead describing smooth or "fluent" gameplay at the same settings.

There is definitely something odd going on that HWUB/Techspot should have noticed about the FC6 results:

October 8, 2021

The FC6 GPU test, Ultra 4K HD texture results (1% low / average fps):

Cards of interest:
6800 57/64
3070 Ti 51/60
3070 45/56
6700 XT 42/49

So initially the 3070 has no issues at all at 4K Ultra/HD textures. Then:

March 22, 2022

The 3070 vs 6700 XT test, FC6, High 4K HD texture results:

3070 9/44
6700 XT 56/63

Now the 3070 is completely crushed. But later on, in a 3070 Ti vs 6800 comparison:

October 17, 2022

Unfortunately we don't have detailed results for Far Cry this time, just the percentage difference (the 3070 Ti is 10% slower than the 6800). But I think that's enough to see that the 3070 Ti (also with 8GB) is NOT being crushed here...

Back when the 3070 was tested against the 6700 XT, the 6700 XT won FC6 by over 40%, while the 3070 won at 4K overall by something like 20%...

But the 3070 Ti is a bit slower overall vs the 6800, and only loses FC6 by 10%...

It really looks like the March results were some kind of anomaly.
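The swing in the quoted averages can be sanity-checked with quick arithmetic (a sketch using only the average-fps figures quoted above, and assuming the "x/y" pairs are 1% low / average fps; the "crush" is even starker in the 1% lows, 9 vs 56):

```python
# Relative average-fps deltas from the HWUB/Techspot figures quoted above.
results = {
    "Oct 2021, Ultra 4K + HD textures": {"3070": 56, "6700 XT": 49},
    "Mar 2022, High 4K + HD textures":  {"3070": 44, "6700 XT": 63},
}

for test, avg in results.items():
    delta = (avg["6700 XT"] / avg["3070"] - 1) * 100
    print(f"{test}: 6700 XT {delta:+.0f}% vs 3070 (average fps)")
```

The October run has the 3070 ahead by roughly 13%, while the March run flips to the 6700 XT ahead by over 40%, which is the inconsistency being discussed.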
 

Stuka87

Diamond Member
Dec 10, 2010
I picked up Hogwarts Legacy yesterday, and I am unable to play the game at max settings, even with FSR enabled (equivalent to 960p), because I run out of VRAM with my 5700 XT. The game runs smooth from a GPU point of view, but if you hit a new area it's stutter city.

If I drop the settings to "high", the stutter goes away completely. VRAM usage averages around 7.5 to 7.8GB on High.
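For context, the "960p" figure follows from FSR's fixed per-axis scale factors (1.5x for Quality mode), so a 1440p output renders internally at 960 lines. A quick sketch:

```python
# Sketch: internal render resolution for FSR's fixed quality modes.
# Per-axis scale factors: Quality 1.5x, Balanced 1.7x, Performance 2.0x.
FSR_SCALE = {"Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def internal_res(out_w, out_h, mode):
    s = FSR_SCALE[mode]
    return round(out_w / s), round(out_h / s)

print(internal_res(2560, 1440, "Quality"))  # -> (1707, 960)
```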
 

jpiniero

Lifer
Oct 1, 2010
I picked up Hogwarts Legacy yesterday, and I am unable to play the game at max settings, even with FSR enabled (equivalent to 960p), because I run out of VRAM with my 5700 XT. The game runs smooth from a GPU point of view, but if you hit a new area it's stutter city.

If I drop the settings to "high", the stutter goes away completely. VRAM usage averages around 7.5 to 7.8GB on High.

Really need more games with DirectStorage to see how well developers can leverage it.
 

BFG10K

Lifer
Aug 14, 2000
There is definitely something odd going on that HWUB/Techspot should have noticed about the FC6 results:
They're testing different parts of the game, just like how Hogwarts Legacy isn't VRAM limited in every area. This is really quite elementary.

Ubisoft clearly lists the requirements: https://www.ubisoft.com/en-us/help/...g-the-hd-texture-pack-for-far-cry-6/000099142

Please note that the minimum requirement for running the HD Texture Pack for Far Cry 6 is 12GB of VRAM. For 4K configurations, you will need 16GB of VRAM.
If you download and run the HD Texture Pack with lower VRAM capabilities, you will encounter performance issues while playing.

The stuttering has been confirmed by a plethora of people including HWUB. If you think HWUB, Ubisoft and all those people are lying, you need to provide evidence.

It really looks like the March results were some kind of anomaly.
LMFAO. That's a new one so let's add it to the list:

Reasons why 8GB, nine years after its first release, isn't nVidia's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. 4K is irrelevant.
  11. Texture quality is irrelevant as long as it matches a console's.
  12. Detail levels are irrelevant as long as they match a console's.
  13. There's no reason a game should use more than 640K 8GB, because a forum user said so.
  14. It's completely acceptable for 3070/3070TI/3080 owners to turn down texture settings while 3060 users have no issue.
  15. It's an anomaly - guidryp (2023)
If you have any other reasons why 8GB isn't nVidia's fault, please let me know and I'll add them to the list. Cheers!
 

Borealis7

Platinum Member
Oct 19, 2006
They're testing different parts of the game, just like how Hogwarts Legacy isn't VRAM limited in every area. This is really quite elementary.

Ubisoft clearly lists the requirements: https://www.ubisoft.com/en-us/help/...g-the-hd-texture-pack-for-far-cry-6/000099142

Please note that the minimum requirement for running the HD Texture Pack for Far Cry 6 is 12GB of VRAM. For 4K configurations, you will need 16GB of VRAM.
If you download and run the HD Texture Pack with lower VRAM capabilities, you will encounter performance issues while playing.

The stuttering has been confirmed by a plethora of people including HWUB. If you think HWUB, Ubisoft and all those people are lying, you need to provide evidence.


LMFAO. That's a new one so let's add it to the list:

Reasons why 8GB, nine years after its first release, isn't nVidia's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. 4K is irrelevant.
  11. Texture quality is irrelevant as long as it matches a console's.
  12. Detail levels are irrelevant as long as they match a console's.
  13. There's no reason a game should use more than 640K 8GB, because a forum user said so.
  14. It's completely acceptable for 3070/3070TI/3080 owners to turn down texture settings while 3060 users have no issue.
  15. It's an anomaly - guidryp (2023)
If you have any other reasons why 8GB isn't nVidia's fault, please let me know and I'll add them to the list. Cheers!
DRAM prices went up for 3 straight years (Samsung/Micron/SK Hynix...)
 

Heartbreaker

Diamond Member
Apr 3, 2006
The stuttering has been confirmed by a plethora of people including HWUB. If you think HWUB, Ubisoft and all those people are lying, you need to provide evidence.

Don't put words in my mouth. I never said they were lying. I just said something odd was going on that HWUB should have noticed.

For 8GB cards:

Oct 2021: No problem with HD textures at any resolution, including 4K.
Mar 2022: Big problem at 4K (none at 1440P)
Oct 2022: No problem at 4K again...

It's something that bears a little more investigation.

Even if this were the consistent result for Far Cry, it's hardly a big deal. You could simply use DLSS/FSR for that game if you want to run 4K on a mid-series card like the 3070...
 

amenx

Diamond Member
Dec 17, 2004
I think it's become clear that 8GB (midrange) GPUs are not capable of running modern games at 4K at ultra++ settings. Do any folks here or anywhere use their 8GB cards under such scenarios? Are they not aware of the higher-tier GPU options designed for those conditions?

Of course, there are also many games at 4K that run well at decent settings with 8GB cards (just don't go overboard).
[image: doom4.png]

So the focus should really be on 1440p and 1080p, since these are the most likely resolutions used by 8GB GPU owners. Two years ago (at the start of this thread) this was not an issue, other than an isolated example here or there run at some crazy setting (despite still finding the odd contradictions to some of those examples). Yes, the VRAM is holding you back despite getting over 100 fps. :D
[image: doom 1440p hub.png]
But for newer games today at 1440p played on older 8GB GPUs, the GPU manufacturers' message is loud and clear: you must upgrade your cards... Or ignore them for another year or two and continue to enjoy the vast majority of games at 1440p that have not yet been hampered by the lack of VRAM.
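One way to see why settings matter more than resolution alone: the resolution-dependent render targets are a small slice of VRAM compared with multi-GB texture pools. A back-of-envelope sketch (bytes_per_pixel and num_targets are illustrative guesses, not any real engine's memory model):

```python
# Sketch: rough VRAM footprint of resolution-dependent render targets,
# to show that dropping 4K -> 1440p frees only a few hundred MB, far
# less than the multi-GB gap an HD texture pack can occupy.
def render_target_mb(width, height, bytes_per_pixel=8, num_targets=6):
    """G-buffer/backbuffer estimate; both defaults are guesses."""
    return width * height * bytes_per_pixel * num_targets / 2**20

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{render_target_mb(w, h):.0f} MB of render targets")
```

Under these assumptions, 4K costs roughly 380 MB of render targets versus about 170 MB at 1440p, so an 8GB card that struggles is usually struggling with assets, not the framebuffer.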
 

Stuka87

Diamond Member
Dec 10, 2010
But for newer games today at 1440p played on older 8GB GPUs, the GPU manufacturers' message is loud and clear: you must upgrade your cards... Or ignore them for another year or two and continue to enjoy the vast majority of games at 1440p that have not yet been hampered by the lack of VRAM.

I can't even play Hogwarts Legacy at 1440p, though. I'm running out of VRAM at 960p (FSR Quality with 1440p native) with the settings on High, not Ultra.
 
  • Like
Reactions: KompuKare

Stuka87

Diamond Member
Dec 10, 2010
Don't put words in my mouth. I never said they were lying. I just said something odd was going on that HWUB should have noticed.

For 8GB cards:

Oct 2021: No problem with HD textures at any resolution, including 4K.
Mar 2022: Big problem at 4K (none at 1440P)
Oct 2022: No problem at 4K again...

It's something that bears a little more investigation.

Even if this were the consistent result for Far Cry, it's hardly a big deal. You could simply use DLSS/FSR for that game if you want to run 4K on a mid-series card like the 3070...

HWUB likely tested a different area because there were threads on forums/Reddit of people having issues. The vast majority of testers only test somewhere near the start of the game.
 
  • Like
Reactions: Mopetar

amenx

Diamond Member
Dec 17, 2004
More HUB inconsistencies. Look at HUB's video review of Hogwarts Legacy vs their Techspot review. Again, same reviewer, same settings, same results, except the 3080 10GB: it scores higher in the Techspot review.

HUB Hogwarts Legacy - 1440p (Ultra quality) TAA high, Ray tracing ultra
[image: HL 1440p vid.png]

Techspot Hogwarts Legacy - 1440p (Ultra quality) TAA high, Ray tracing ultra

[image: HL 1440p RT.png]
 

Heartbreaker

Diamond Member
Apr 3, 2006
HWUB likely tested a different area because there were threads on forums/Reddit of people having issues. The vast majority of testers only test somewhere near the start of the game.

Or differences between game patches and/or driver updates. We just don't know what happened, because there was one anomalous result out of three tests that wasn't investigated or explained.
 

amenx

Diamond Member
Dec 17, 2004
That's because the benchmark built into the game doesn't represent gameplay in areas that have the issue. It also gets worse the longer you play; the benchmark doesn't replicate this.
Sounds like sloppy memory management by the game engine, allowing assets to accumulate without being purged.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
Sounds like sloppy memory management by the game engine, allowing assets to accumulate without being purged.

Bad PC ports are the rule, not the exception, though. Needing to go overboard on hardware for AAA PC gaming has been a general rule for at least this century.
 
  • Like
Reactions: KompuKare