8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite the 3060 usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
[Horizon Forbidden West benchmark chart]
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT when ray tracing:
[Resident Evil Village benchmark chart]
Company of Heroes: the 3060 has a higher minimum than the 3070 Ti:
[Company of Heroes benchmark chart]

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 
Last edited:

mikeymikec

Lifer
May 19, 2011
20,378
15,071
136
Not at all. This is what Intel did at their peak. Stopped caring about their product and serving their customer base.

I disagree, though perhaps it depends on what you mean by their peak. IMO their peak was Haswell/Broadwell, but even then they had been making incremental performance gains, probably solely through die shrinks and minor tweaks. IMO Intel rested on its laurels for too long: around the time the Core architecture should have been put out to pasture (14nm, Skylake) with a successor in the wings, AMD released Zen and Intel was caught completely with its pants down. Intel's inability to regroup from an eight-year-old event is Intel's problem.

However, bear in mind that (AFAIK) Zen 5 also involves only incremental gains over Zen 4. IMO incremental gains are fine; a hardware manufacturer ideally wants to produce an architecture that scales well for a few generations so they're not having to hit the drawing board every other generation. They need a cash cow with which to fund the R&D for the next major rework.

Nvidia might also be in a similar rut to Intel (the 4000 and 5000 ranges are on the same fabrication node, TSMC 4N), but that doesn't explain how they've likely also completely fluffed their drivers.

I hope Nvidia crashes one day. Puts arrogant millionaires on the street.

I agree that market abusers like Intel and Nvidia deserve a painful outcome.
 

Ranulf

Platinum Member
Jul 18, 2001
2,822
2,423
136
Seeing it be worse than the 2060 6G is even more telling, IMO ...

Turing aging like fine wine, apparently. This has to be driver issues, which, by the way, I see more and more of the general gaming public talking about and acknowledging, usually to tell people on non-5000-series cards to roll back their drivers if they're having problems. I laugh every time Dune Awakening tells me I'm running old drivers and need to get the latest to run the game.

Though I'm more interested in this remastered HZD, which shows as a $10 upgrade that you apparently can't see if you don't own the base game on Steam. Another game that was $20 or less and is now back to $50. My GOG version apparently doesn't get the upgrade.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
8,738
7,347
136
I disagree, though perhaps it depends on what you mean by their peak. IMO their peak was Haswell/Broadwell, but even then they had been making incremental performance gains, probably solely through die shrinks and minor tweaks. IMO Intel rested on its laurels for too long: around the time the Core architecture should have been put out to pasture (14nm, Skylake) with a successor in the wings, AMD released Zen and Intel was caught completely with its pants down. Intel's inability to regroup from an eight-year-old event is Intel's problem.
I thought their peak was more Sandy Bridge. I remember people holding on to their 2500K, 2600K, and 2700K forever, as there wasn't anything much better for years. Though I did hold onto my Haswell CPU (Xeon E3-1231 v3) for 8 years, until Elden Ring became the first game I couldn't run at 60 fps on it in 2022.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,680
31,538
146
Turing aging like fine wine, apparently.
I was stoked with both my 2060 and 2070 Supers. I only sold them because the market was so good. I gave buyers what would have been considered deals at the time, and still had money left over after buying a 3060 12GB and a 3060 Ti via the Newegg Shuffle and the EVGA queue.

Using the 3060 that has powered my kid's system for 4 years, I am impressed with it. Steve was adamant about not focusing on VRAM issues in the comparison, but it skewed his perspective as a consequence. He said the 3060 was a terrible card. I could not disagree more. Between the 12GB and access to DLSS TM, I have not yet tried a game that looks or plays poorly sitting 2.5 meters from a 55" gaming TV. It can use RT reflections while never compromising on textures, something that cannot be said for any of the other 60-series RTX cards.

In every game he used medium settings; he could have added "plus max textures with the 3060." That, and the texture issues at higher settings on the 8GB cards, is a big deal. Don't let people making fun of you for it, like GN Steve, dissuade you from staying on point. As a consequence, he produces those dumb geomeans instead. I think that here, past the midway point of the third decade of the 21st century, we can no longer encapsulate things that way. The math lies by omission.
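
To make the "lies by omission" point concrete, here's a rough sketch with entirely made-up relative frame rates, showing how a geometric mean across a dozen titles can bury a single VRAM-induced collapse:

```python
# Minimal sketch with made-up numbers: how a geometric mean across many games
# can hide a single VRAM-induced collapse. Values are hypothetical relative
# frame rates (8GB card vs. a 12GB card), not measured data.
import math

def geomean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

# 11 games where the 8GB card matches the 12GB card, one where VRAM runs out.
relative_perf = [1.0] * 11 + [0.3]

print(f"Geomean: {geomean(relative_perf):.2f}")   # ~0.90 -- looks like a mild 10% deficit
print(f"Worst case: {min(relative_perf):.2f}")    # 0.30 -- the game that actually matters
```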

I continue to maintain my position that I would rather have to use correct settings™ because of compute limitations than because of VRAM limitations.
 

marees

Golden Member
Apr 28, 2024
1,249
1,803
96
Dragon Age: The Veilguard
This is a new one to me

The game is based on the Frostbite engine & has a texture slider


EDIT:

Apparently this game is 'unoptimized' & has some VRAM bugs ??
 
Last edited:

marees

Golden Member
Apr 28, 2024
1,249
1,803
96
This is a new one to me

The game is based on the Frostbite engine & has a texture slider


EDIT:

Apparently this game is 'unoptimized' & has some VRAM bugs ??
Apparently it has memory leaks.

Obviously, lower-VRAM cards will hit the issue faster.
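
Back-of-the-envelope on why that is, with made-up numbers for baseline usage and leak rate:

```python
# Rough sketch of why a leak bites smaller cards first:
# time to exhaustion = free headroom / leak rate. All numbers are hypothetical.
def minutes_until_full(vram_gb, baseline_usage_gb, leak_gb_per_min):
    headroom = vram_gb - baseline_usage_gb
    return headroom / leak_gb_per_min

# Assume the game needs ~7GB up front and leaks ~0.05GB per minute.
for vram in (8, 12, 16):
    print(f"{vram}GB card: ~{minutes_until_full(vram, 7.0, 0.05):.0f} min before VRAM runs out")
```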

 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,680
31,538
146
Aussie Steve does the AMD edition of the four-generations-of-60-series comparison. When paired with gen 5 hardware, the 9060 XT 8GB is beastly. Unlike with team green, where the 30-series' VRAM is a fly in the ointment for the latest iteration, RDNA 4 wins everything handily.

As is pointed out repeatedly in the video, paired with older-gen hardware it'll get ugly for the 9 series. I like that he included the inflation-adjusted pricing too. Most of the talking points against hardware pricing are erroneous, since they are based on non-adjusted numbers. You can't expect anyone to take your position seriously when it amounts to old dudes complaining about how a loaf of bread cost 25 cents when they were a kid. Is $300 in today's money too much? Yes. $250 is probably the price where we should realistically stop bitching about 8GB 60-series pricing, though. $200 for the 5050.
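
For anyone who wants to sanity-check that kind of adjustment, it's just a CPI ratio. A rough sketch with placeholder CPI figures (swap in real CPI data; the GTX 1060 launch price here is from memory):

```python
# Minimal sketch of the inflation adjustment being discussed: scale a launch MSRP
# by the ratio of CPI values. The CPI figures below are illustrative placeholders,
# not official numbers -- substitute real CPI data for an actual comparison.
def adjust_for_inflation(price, cpi_then, cpi_now):
    """Convert a historical price into today's dollars via a CPI ratio."""
    return price * (cpi_now / cpi_then)

# Hypothetical example: a $249 GTX 1060 (2016) in today's dollars,
# assuming CPI rose from ~240 to ~320 over that span.
print(f"${adjust_for_inflation(249, 240, 320):.0f} in today's money")  # ~$332
```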

 

Thunder 57

Diamond Member
Aug 19, 2007
3,805
6,414
136
Aussie Steve does the AMD edition of the four-generations-of-60-series comparison. When paired with gen 5 hardware, the 9060 XT 8GB is beastly. Unlike with team green, where the 30-series' VRAM is a fly in the ointment for the latest iteration, RDNA 4 wins everything handily.

As is pointed out repeatedly in the video, paired with older-gen hardware it'll get ugly for the 9 series. I like that he included the inflation-adjusted pricing too. Most of the talking points against hardware pricing are erroneous, since they are based on non-adjusted numbers. You can't expect anyone to take your position seriously when it amounts to old dudes complaining about how a loaf of bread cost 25 cents when they were a kid. Is $300 in today's money too much? Yes. $250 is probably the price where we should realistically stop bitching about 8GB 60-series pricing, though. $200 for the 5050.


Great, more people will start saying that HUB hates Nvidia and is pro-AMD.
 

Thunder 57

Diamond Member
Aug 19, 2007
3,805
6,414
136
Haters gonna hate. I think he does a good job of explaining his testing methodology and what it is he is testing, comparing, etc.

They're pretty solid, though I like GN better. I used to like LTT, but they're crap, to put it nicely. I haven't watched the video yet, but I look forward to it. I almost got a 5600 XT but was worried about the VRAM. That worry sure turned out to be right.
 

Ranulf

Platinum Member
Jul 18, 2001
2,822
2,423
136
The 5600 XT was a joke, since it released about a month after the 2019 holiday sales, which had the 5700 down to $300-330 or so. I think there were even reference versions at Dell for $279 (the 5600 XT's MSRP). The same or almost the same money for 6GB vs 8GB? Total joke.
 

Thunder 57

Diamond Member
Aug 19, 2007
3,805
6,414
136
The 5600 XT was a joke, since it released about a month after the 2019 holiday sales, which had the 5700 down to $300-330 or so. I think there were even reference versions at Dell for $279 (the 5600 XT's MSRP). The same or almost the same money for 6GB vs 8GB? Total joke.

Why do you think I got the 5700? ;)
 
  • Like
Reactions: Tlh97 and Ranulf