8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite the former usually competing with the 2070. The 3060 also has better 1% lows than the 4060 and the 4060 Ti 8GB.
[Horizon Forbidden West benchmark chart]
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT with ray tracing:
[Resident Evil Village benchmark chart]
Company of Heroes: the 3060 has a higher minimum than the 3070 Ti:
[Company of Heroes benchmark chart]

10GB / 12GB

Reasons why shipping 8GB since 2014 still isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,846
3,190
126
Can't be A. Based on UE4.

I can think of a lot of broken UE4 games that were riddled with memory issues.
So I'm gonna go with

D, which is all of the above plus a studio that doesn't have much experience with optimization.

If a game can make it to a Nintendo Switch, you know it has been optimized to its peak, which this game obviously has not been, like almost all UE4 games in general.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,361
2,849
106
It is, and I am sure it will look terrible. Similar to how The Witcher 3 is terrible on the switch. It will likely run below 720P, 30fps, and with very muddy textures. But they will still sell a crap load of copies.
Considering the specs of the Switch, it's not surprising it looks like sh*t. That handheld is just too weak, and instead of releasing a new version Nintendo is still milking this one even though it's 6 years old and was never subsidized by them.
They basically just need to put another mobile SoC inside, and there are many options available. Even if they needed something closer to a desktop (laptop) GPU architecture, they could still have used the Exynos 2200 with AMD's technology. I don't think Samsung would have had a big problem with that, especially if they made it in their fabs.
Nintendo just doesn't want to increase the BOM if sales are still good.
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,498
20,615
146
It is, and I am sure it will look terrible. Similar to how The Witcher 3 is terrible on the switch. It will likely run below 720P, 30fps, and with very muddy textures. But they will still sell a crap load of copies.
Nintendo has perfected the Kool-Aid formula. No wonder Nvidia is working with them. ;) No matter what you do, millions of Frys line up and shout

[Futurama Fry GIF]


The last segment in the Nvidia human-caterpillar marketing chain, better known as Alex of DF, will find a way to be impressed. :p
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,971
126
4060/4060TI are expected to have 8GB. If true they're DOA. It's almost like the people who design nVidia memory configurations don't know ray tracing exists.

First consumer 8GB card:


That card was released in 2014. Nine years later nVidia regurgitates 8GB while adding miner markup pricing. But don't worry folks, we have TV "aye eye" frames to the rescue!
 

Golgatha

Lifer
Jul 18, 2003
12,651
1,514
126
Nintendo has perfected the Kool-Aid formula. No wonder Nvidia is working with them. ;) No matter what you do, millions of Frys line up and shout

[Futurama Fry GIF]


The last segment in the Nvidia human-caterpillar marketing chain, better known as Alex of DF, will find a way to be impressed. :p

I mean, their 1st party games and Rabbids are fun. Everything else is available on a better platform. For me, a Nintendo Switch purchase is in addition to what I've already got available. So yeah, shut up and take my money.

P.S. If it's a game where graphics aren't the end-all, be-all, the portability is great too.
 
  • Like
Reactions: DAPUNISHER

jpiniero

Lifer
Oct 1, 2010
14,607
5,226
136
4060/4060TI are expected to have 8GB. If true they're DOA. It's almost like the people who design nVidia memory configurations don't know ray tracing exists.

They do. They are trying to deal with TSMC's wafer prices. Cutting the number of memory chips helps keep the costs down.

At some point 4 GB chips will be available, maybe?
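For anyone following along, here's a rough sketch of why chip count and density matter: each GDDR6 device has a 32-bit interface, so the bus width fixes the number of chips and the per-chip density fixes the frame buffer size. The configurations below are just the commonly discussed options, not confirmed specs, and clamshell mode is ignored.

```python
# Rough sketch: frame buffer size from bus width and per-chip density.
# Assumes one 32-bit GDDR6 device per 32 bits of bus; clamshell mode
# (two chips sharing 16 bits each) is ignored here.

def vram_gb(bus_width_bits: int, chip_density_gb: int) -> int:
    chips = bus_width_bits // 32      # GDDR6 devices are 32 bits wide
    return chips * chip_density_gb    # total capacity in GB

for bus, density in [(128, 2), (192, 2), (128, 4)]:
    print(f"{bus}-bit bus x {density}GB chips -> {vram_gb(bus, density)}GB")
# 128-bit x 2GB chips -> 8GB   (the rumored 4060/4060 Ti config)
# 192-bit x 2GB chips -> 12GB  (the 3060-style config)
# 128-bit x 4GB chips -> 16GB  (only if higher-density chips actually ship)
```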
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
They do. They are trying to deal with TSMC's wafer prices. Cutting the number of memory chips helps keep the costs down.

At some point 4 GB chips will be available, maybe?

This makes no sense. If that was the case, Intel and AMD would also be limiting memory. But they aren't. nVidia has the highest MSRPs with the highest volume, meaning they are the least affected by BoM costs.
 

jpiniero

Lifer
Oct 1, 2010
14,607
5,226
136
This makes no sense. If that was the case, Intel and AMD would also be limiting memory. But they aren't.

N33 is also 128-bit and they haven't announced any desktop parts yet, so we don't exactly know which nVidia product it will compete with. Presumably if they can fix N31's performance in time it will be re-released at higher prices.
 

blckgrffn

Diamond Member
May 1, 2003
9,127
3,069
136
www.teamjuchems.com
Anything costing more than $300 will need to have at least 12GB of memory. Not only are the 6700 XT and A770 lurking around there but, more importantly for the folks who never consider any other brand, the 3060 itself is sitting right there, not to mention the 2060 12GB, as forgotten as it is…

If you wanted to ignore the weird timeline, you could argue, and not be wrong, that the last two x60 products featured 12GB, first optionally and then as standard. The 4060 retreating to 8GB is a regression, and an important one at that.

If die size is the problem they should have kept L2 the same (or maybe just doubled it, whatever keeps it efficient for the L1) and kept the 192 bit bus as an option.
 
  • Like
Reactions: Ranulf

jpiniero

Lifer
Oct 1, 2010
14,607
5,226
136
If die size is the problem they should have kept L2 the same (or maybe just doubled it, whatever keeps it efficient for the L1) and kept the 192 bit bus as an option.

It's also more power efficient, but suggesting they should have done something like that (if 4 GB chips are unlikely to be available in time) is totally fair.
 

blckgrffn

Diamond Member
May 1, 2003
9,127
3,069
136
www.teamjuchems.com
It's also more power efficient, but suggesting they should have done something like that (if 4 GB chips are unlikely to be available in time) is totally fair.

Yup, I get that. If a 160W card becomes a 180W card that performs the same but has 12GB instead of 8GB of frame buffer, I think the vast majority would find the peak power bump, and the likely smaller increase in average power cost, completely irrelevant.
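For a sense of scale, a back-of-the-envelope sketch of what that extra 20W could cost, assuming (purely hypothetically) two hours of gaming a day and $0.15/kWh:

```python
# Hypothetical numbers: 2 hours of gaming per day, $0.15 per kWh.
extra_watts = 180 - 160
hours_per_day = 2
price_per_kwh = 0.15

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/year -> ${cost_per_year:.2f}/year")
# ~14.6 kWh/year, roughly $2/year -- effectively noise on a power bill
```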

The 4060 mobile part needs that power focus; the desktop versions of everything have seemingly thrown down the gauntlet on power anyway, and when a nice card comes in under 200W, that's just a nice-to-have 😂
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,896
5,832
136
4060/4060TI are expected to have 8GB. If true they're DOA. It's almost like the people who design nVidia memory configurations don't know ray tracing exists.

First consumer 8GB card:


That card was released in 2014. Nine years later nVidia regurgitates 8GB while adding miner markup pricing. But don't worry folks, we have TV "aye eye" frames to the rescue!

Can't wait to see that gimpy 4060 launch at $400 too.
 

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,699
136
4060/4060TI are expected to have 8GB. If true they're DOA. It's almost like the people who design nVidia memory configurations don't know ray tracing exists.

First consumer 8GB card:


That card was released in 2014. Nine years later nVidia regurgitates 8GB while adding miner markup pricing. But don't worry folks, we have TV "aye eye" frames to the rescue!
A quick perusal of r/buildapcsales also shows you could buy one of those pretty much exactly 8 years ago for $350, or $446 adjusted for inflation to 2023. Wonder how the 3060 Ti will compare to that.
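A quick sanity check on that inflation figure, assuming roughly 27% cumulative US CPI inflation between 2015 and 2023 (an approximation plugged in for illustration, not an official number):

```python
# Approximate check: ~27% cumulative US CPI inflation from 2015 to 2023.
price_2015 = 350
cumulative_inflation = 0.274
price_2023 = price_2015 * (1 + cumulative_inflation)
print(f"${price_2015} in 2015 is about ${price_2023:.0f} in 2023 dollars")
# ~$446, matching the figure quoted above
```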
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,361
2,849
106
They do. They are trying to deal with TSMC's wafer prices. Cutting the number of memory chips helps keep the costs down.

At some point 4 GB chips will be available, maybe?
Poor Nvidia, they already had such low margins, and now the evil TSMC asks high prices for wafers compared to Samsung, so they have to keep using only 8GB of VRAM just to break even. :rolleyes:

Now, without the sarcasm. :)
For the **70 Ti they increased the MSRP by $200, which is 1/3 more, and as a bonus you gain 1/2 more VRAM.
It's pretty unrealistic to expect the RTX 4060 (AD107) or RTX 4060 Ti (AD106) to cost only $329 or $399 like their predecessors.

GDDR6 shouldn't be that expensive to begin with, when the RX 6500 XT 8GB costs only $20 more than the 4GB version. If the extra 4GB adds $20 at retail, the cost of 8GB of VRAM is under $40.
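Spelling out that arithmetic (the MSRPs are the widely reported launch prices; the $20 gap between the 6500 XT variants is the retail difference quoted above, not a BoM figure):

```python
# 3070 Ti launched at $599, 4070 Ti at $799 (reported launch MSRPs).
msrp_3070ti, msrp_4070ti = 599, 799
print(f"MSRP: +${msrp_4070ti - msrp_3070ti} ({msrp_4070ti / msrp_3070ti - 1:.0%} more)")

vram_3070ti, vram_4070ti = 8, 12
print(f"VRAM: {vram_4070ti / vram_3070ti - 1:.0%} more")

# If the extra 4GB on the RX 6500 XT adds ~$20 at retail, then a full
# 8GB of GDDR6 comes out to something under ~$40.
extra_4gb_retail = 20
print(f"Estimated 8GB of GDDR6: < ${2 * extra_4gb_retail}")
```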
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Correct, but it's apples to apples with the original TechPowerUp article, and I think shows even highly optimized games are going to use over 8GB at 4K.

True, it matches their test. I tried to find somebody who had measured The Witcher 3 Enhanced Edition, but wasn't able to find one in my quick search. But I know on my own machine it's using significantly more than the old version.
 
  • Like
Reactions: Tlh97 and Golgatha

gamervivek

Senior member
Jan 17, 2011
490
53
91
The whole thing is a chicken-and-egg situation, since game creators reduce VRAM usage if the mainstream cards don't have enough of it.

Cyberpunk severely reduces texture quality at 1080p to keep it within 8GB. Far Cry 6 went ahead with much higher VRAM usage, but even that was patched to reduce the LoD. Reviewers also don't test for long and don't have anything else open or a second screen going. You lose almost 1GB of VRAM just watching videos on YouTube.
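If anyone wants to check the "background apps eat VRAM" point on their own system, here's a minimal sketch that logs usage over a longer session with nvidia-smi (ships with the NVIDIA driver; AMD users would need radeontop or similar). The 30-second interval is just an arbitrary choice.

```python
# Minimal VRAM logger using nvidia-smi (NVIDIA only).
import subprocess
import time

def vram_used_mib() -> int:
    out = subprocess.check_output([
        "nvidia-smi",
        "--query-gpu=memory.used",
        "--format=csv,noheader,nounits",
    ])
    return int(out.decode().strip().splitlines()[0])  # first GPU only

while True:
    print(f"{time.strftime('%H:%M:%S')}  {vram_used_mib()} MiB in use")
    time.sleep(30)  # sample every 30 seconds across a long play session
```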

And that's before texture mods come into the picture, which have long been a staple of PC gaming. Cyberpunk has a few texture mods that improve the blurry textures the game uses in some scenarios, and they can easily push total VRAM usage over 16GB at 4K.

Since my last post, Cyberpunk was patched for frame generation, and using it also requires more VRAM. The 4070 Ti can get to 4K60 with DLSS + FG, but starts running out of VRAM in the video below at 26m50s,



The 4090 can get 4K60 with native + FG, but VRAM requirements start nearing 16GB, or even higher if you're using texture mods.
[Cyberpunk 2077 VRAM usage screenshot]
 

amenx

Diamond Member
Dec 17, 2004
3,909
2,132
136
But how can we differentiate between games that dynamically use (not necessarily need) VRAM according to the card's VRAM capacity, and games that actually won't run properly due to a lack of VRAM? And how do we avoid confusing the latter with a GPU that's already weaker due to its other specs?
 

Golgatha

Lifer
Jul 18, 2003
12,651
1,514
126
But how can we differentiate between games that dynamically use (not necessarily need) VRAM according to the card's VRAM capacity, and games that actually won't run properly due to a lack of VRAM? And how do we avoid confusing the latter with a GPU that's already weaker due to its other specs?

Honestly, you'll probably just see it in the benchmarks: things like a 3060 beating a 3070 Ti in FPS past a certain resolution, etc.
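If you wanted to automate that heuristic over a pile of review data, it might look like the toy sketch below: flag any resolution where the card with more VRAM but weaker raw compute overtakes the nominally faster card. All the FPS numbers are made up for illustration.

```python
# Toy illustration: detect "benchmark inversions" that hint at a VRAM limit.
# All FPS figures below are invented for the example.
results = {
    "3060 12GB":   {"1080p": 70,  "1440p": 50, "4K": 28},
    "3070 Ti 8GB": {"1080p": 105, "1440p": 74, "4K": 22},
}

faster_card, bigger_vram_card = "3070 Ti 8GB", "3060 12GB"
for res in ("1080p", "1440p", "4K"):
    if results[bigger_vram_card][res] > results[faster_card][res]:
        print(f"{res}: {bigger_vram_card} beats {faster_card} -> likely VRAM-limited")
# Prints: 4K: 3060 12GB beats 3070 Ti 8GB -> likely VRAM-limited
```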