8GB VRAM not enough (and 10 / 12)

Page 127 - AnandTech Forums

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
8GB
In Horizon Forbidden West the 3060 is faster than the 2080 Super, despite the 3060 usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and 4060 Ti 8GB:
[Horizon Forbidden West benchmark chart]
In Resident Evil Village the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT with ray tracing:
[Resident Evil Village benchmark chart]
In Company of Heroes the 3060 has a higher minimum than the 3070 Ti:
[Company of Heroes benchmark chart]

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault, and objective evidence "doesn't count" because of reasons(tm). If you have others, please let me know and I'll add them to the list. Cheers!
 

Ranulf

Platinum Member
Jul 18, 2001
2,822
2,423
136
See? You can make FPS bar bigger better by just turning down textures on your 3080. Sure textures will pop in, but who cares if number bigger.

Exactly! It is all about appropriate settings. Of course, I will probably have to overclock my 2500K a bit more to get it to run Indy, but it's not the years, it's the mileage.

Edit to add: Re the 2060, we need to see the 12GB variant results, lol.
 

MrTeal

Diamond Member
Dec 7, 2003
3,916
2,700
136
Is that settings, or does the game not run at all on cards that don't support ray tracing?
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,681
31,538
146
Here's how I see it.

8GB is the new 4, 12GB the new 6, 16GB the new 8.

With 8GB you could play everything for many years with scant few, if any, compromises involving VRAM. 6GB, as @GodisanAtheist can tell you, held its own for years too. It hit the wall before 8GB did, however, and the compromises started sooner. With 4GB you could play the games, but more and more often the necessary compromises resulted in increasingly undesirable resolutions and visuals.

The real stinker is that 8GB cards now, unlike the 4GB cards of yesteryear, can't even use the features they are promoted with. FG on a 4060 is often unusable at 1080p, and DLSS looks like ass when starting from that res = lolno.
 
Jul 27, 2020
26,024
17,952
146
Nvidia users await Lord Jensen's neural texture compression tech in Geforce 5000 series with bated breath to double their VRAM :p
 
Jul 27, 2020
26,024
17,952
146
If that happened, it'd be kind of badass, wouldn't it?
Nothing is ever free. Would be a miracle if there are no caveats, like doubling of VRAM in only certain games that use a lot of repeated textures or graphical glitches accompanying higher compression ratios.

Remember the old demoscene demos that would generate hundreds of megabytes of textures from only 64K executables? It could be possible with AI, where it consumes all the textures in the game's assets and then churns out similar-looking ones that take up less space.
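The demoscene idea above can be sketched in a few lines: a tiny deterministic function expands into arbitrarily large texture data, so the "storage" is the code itself. This is just a toy illustration using classic value noise, not anything like Nvidia's actual neural compression:

```python
def hash2(x: int, y: int, seed: int = 0) -> float:
    """Deterministic pseudo-random value in [0, 1) for a lattice point."""
    h = (x * 374761393 + y * 668265263 + seed * 2147483647) & 0xFFFFFFFF
    h = ((h ^ (h >> 13)) * 1274126177) & 0xFFFFFFFF
    return (h ^ (h >> 16)) / 2**32

def value_noise(u: float, v: float, freq: int) -> float:
    """Bilinearly interpolated lattice noise -- a classic procedural texture."""
    x, y = u * freq, v * freq
    x0, y0 = int(x), int(y)
    fx, fy = x - x0, y - y0
    # smoothstep the interpolation weights for softer blending
    sx, sy = fx * fx * (3 - 2 * fx), fy * fy * (3 - 2 * fy)
    top = hash2(x0, y0) * (1 - sx) + hash2(x0 + 1, y0) * sx
    bot = hash2(x0, y0 + 1) * (1 - sx) + hash2(x0 + 1, y0 + 1) * sx
    return top * (1 - sy) + bot * sy

def make_texture(size: int) -> bytearray:
    """Expand the tiny generator above into size*size grayscale bytes."""
    tex = bytearray(size * size)
    for j in range(size):
        for i in range(size):
            n = value_noise(i / size, j / size, freq=8)
            tex[j * size + i] = int(n * 255)
    return tex

tex = make_texture(256)  # 64 KB of texture data from ~25 lines of code
print(len(tex))
```

The same principle scales: bump `size` to 4096 and the generator, unchanged, emits 16 MB. The open question with neural approaches is whether the reconstructed textures are close enough to the originals, which is exactly the caveat raised above.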
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,681
31,538
146
Nothing is ever free. Would be a miracle if there are no caveats, like doubling of VRAM in only certain games that use a lot of repeated textures or graphical glitches accompanying higher compression ratios.
Since buying the gaming TV I've been playing Stalker 2 at 4K with FSR Q and FG. It's ummm... great. I hate saying that about FG after railing on it so hard a couple years ago. I feel like a hypocrite. But I'm using it and liking it; no way around it. It makes all the drama about how much better DLSS is seem exaggerated. I suppose I could see some differences in a side-by-side, but like the TV itself, that's meaningless to me since I'm only using the one.

There may be no free lunch, but I've had some kickass blue plate specials. ;) If we are going to immerse ourselves in complaint culture and nitpick incessantly, we will never be happy. I know an acceptable compromise when I experience it. My lifetime's experience has been that the technology gets better over time. And sometimes they have to drag some of us into the future kicking and screaming. Once you get them there, and give them the pacifier, they STFU and enjoy it.
 

poke01

Diamond Member
Mar 8, 2022
3,754
5,084
106
Okay, I finally tried the game on my PC. 1440p Ultra without full RT gets 130 FPS.

With path tracing it's 20 FPS; it's not worth it.

But even with PT off, the game is so nice and crisp. I'll enjoy my 130 FPS.
 

MrTeal

Diamond Member
Dec 7, 2003
3,916
2,700
136
Neither support DX12U though, and honestly if you’re drawing a line it’s as good as any other…
Their decision obviously, but I'd somewhat disagree with that. There's a decent percentage of relatively new hardware that isn't DX12U capable, some of it still being sold, like the Turing 16 series or even a 5700G. It'd suck if the guy who built a cheap work-and-gaming system with a 5700G a year ago started getting locked out of even launching most games going forward because DX12U is mandatory and there's no DX12 codepath.
 

blckgrffn

Diamond Member
May 1, 2003
9,676
4,308
136
www.teamjuchems.com
Their decision obviously, but I'd somewhat disagree with that. There's a decent percentage of relatively new hardware that isn't DX12U capable, some of it still being sold, like the Turing 16 series or even a 5700G. It'd suck if the guy who built a cheap work-and-gaming system with a 5700G a year ago started getting locked out of even launching most games going forward because DX12U is mandatory and there's no DX12 codepath.

Yes. As a developer if you wanted to cut the cruft and streamline your testing and support, it does all that. It’s a feature.

It’s their product and it hearkens back to the days of making sure your accelerator was supported - and then running the right driver version - to enable hardware support for a title before you bought it.

I am not saying it’s a “fair” thing, but it’s absolutely the developers right to set their own hardware requirements. They do so at their own peril, but my guess is the 5700G gamer isn’t buying a lot of $70 games when they drop, either. The developer has no agreement or moral obligation to support all GPUs in existence or sold in the last year. I am pretty sure there are some GT 740s and the like still new in box somewhere ;)

If it turns out to be a FAFO moment for the developer, I guess we will all get that entertainment too.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,681
31,538
146
The 4060 provides a worse gaming experience than the 3060 12GB in multiple games. Daniel-san's conclusion is bad, and I think he is setting us up in an effort to explain why the B580 having more VRAM isn't a reason to pick it over the 4060. I hope I am wrong.

 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,681
31,538
146
If it turns out to be a FAFO moment for the developer, I guess we will all get that entertainment too.
The money and assistance Nvidia provides lowers the ROI needed to be successful. It runs well enough on consoles so sales won't be negatively impacted there. It will also get some PC gamers to upgrade to an RTX card, even if it's on the used market. Nvidia isn't going to miss a trick either - https://blogs.nvidia.com/blog/geforce-now-thursday-indiana-jones-and-the-great-circle/

I am certain MS ponied up to have it on Game Pass too. That's where I'll be playing. And of course they can get some to pay for Ultimate to play it from the cloud on their TV, Phone, whatev.
 
  • Like
Reactions: Tlh97 and blckgrffn

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
31,681
31,538
146
Any bets on whether Nvidia will equip the 5060 with more than 8GB of VRAM?

That thread is gold. Big LOLs were had over the advice to turn down textures on a $300 card. The 6700 XT/6750 XT, and lately the 7600 XT, have been $300 or less here in the U.S. NIB.

And my guess is there is no chance a $300-or-under RTX card has more than 8GB. Their $300-and-under cards are for esports now, not AAA. I agree with everyone else saying they are looking to upsell you to $500 and above for the AAA experience. The 4060 Ti SKUs are for suckers; no one else would touch one.
 

blckgrffn

Diamond Member
May 1, 2003
9,676
4,308
136
www.teamjuchems.com
100%. I expect the 5060 Ti to be 8 GB too, although they might offer a 12 GB option when the 3 GB chips come out.

I think they'll still give us the option of a 16GB 5060 Ti for an extra $100 with 4GB chips (I'm assuming they'll be available at some point). This requires the least possible effort, imo.
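For reference, the capacity guesses above follow from simple memory-config arithmetic: each GDDR chip occupies one 32-bit channel, so total VRAM is (bus width / 32) x capacity per chip, doubled in clamshell mode with two chips per channel. A minimal sketch, assuming a 128-bit bus for the 5060 Ti (a guess, not a confirmed spec):

```python
def vram_gb(bus_width_bits: int, gb_per_chip: int, clamshell: bool = False) -> int:
    """Total VRAM for a GDDR config: one chip per 32-bit channel,
    two per channel in clamshell mode."""
    channels = bus_width_bits // 32
    return channels * gb_per_chip * (2 if clamshell else 1)

print(vram_gb(128, 2))                  # 2GB chips on a 128-bit bus -> 8GB
print(vram_gb(128, 3))                  # 3GB chips -> the rumored 12GB option
print(vram_gb(128, 4))                  # 4GB chips -> a 16GB card without clamshell
```

This is why the bus width boxes a vendor in: on 128 bits, the only ways past 8GB are denser chips or a clamshell board, both of which cost more.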