Question: Is 10GB of VRAM enough for 4K gaming for the next 3 years?

Page 10 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
The question is simple. Will 10GB be enough moving forward for the next 3 years? We all know what happens when VRAM gets breached: skips, stutters and chugging.
 

MrTeal

Diamond Member
Dec 7, 2003
3,554
1,658
136
EVGA confirmed it right there. They used POS CAPS, just like I said.
Yeah, again, not cheap caps. The SP-Caps are basically the gold standard in low voltage power supply filtering, that’s why if you flip the "overbuilt" FE board you'll find them used everywhere under the VRMs.

If you look up unit pricing on the SP-Caps, you'll find they're generally not far off 10x the cost of a good X7R/X8R 0603 ceramic cap, which appears to be the size used there. Typically if you're buying a reel at Digikey it's about $1/piece for one of the 220uF SP-Caps, and $0.05-$0.10 for a ceramic cap.

They're built for different purposes. The ceramic caps in the 2x5 grid are probably in the ballpark of 1uF each, maybe up to a few microfarads, so for one grid you're looking at perhaps 10-40uF of total capacitance. Each of those SP-Caps is 470uF. The SP-Caps are built for the bulk capacitance needs of VRMs, which typically switch in the ballpark of 1MHz with harmonics above that.

tl;dr: the polymer electrolytics and sintered tantalums aren't cheap-o capacitors. Without the additional higher-frequency ceramics the board might be unstable at the highest frequencies and power draws, which would mean the decoupling needs at high frequency were underestimated. More likely it was a rushed launch and poor guidance from Nvidia on what the actual needs were; it's not fair to lay this at the board partners' feet and say they cheaped out.
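To put rough numbers on the trade-off described above, here's a back-of-envelope sketch using only the ballpark figures from this post (per-ceramic capacitance and unit prices are my assumed midpoints of the ranges quoted, not datasheet values):

```python
# Rough cost/capacitance comparison of one 2x5 MLCC grid vs one SP-Cap,
# using the approximate figures quoted in this thread (assumptions, not specs).

MLCC_COST_USD = 0.075    # assumed midpoint of the $0.05-$0.10 ceramic price
SP_CAP_COST_USD = 1.00   # ~$1/piece for a 220uF SP-Cap at reel quantities
MLCC_UF = 2.0            # assumed per-ceramic value: 1uF up to a few uF
GRID_COUNT = 2 * 5       # the 2x5 ceramic grid under each pad
SP_CAP_UF = 470.0        # per the post, each SP-Cap is 470uF

grid_capacitance = GRID_COUNT * MLCC_UF   # bulk capacitance of one grid
grid_cost = GRID_COUNT * MLCC_COST_USD    # total part cost of one grid

print(f"One MLCC grid: {grid_capacitance:.0f} uF for ${grid_cost:.2f}")
print(f"One SP-Cap:    {SP_CAP_UF:.0f} uF for ${SP_CAP_COST_USD:.2f}")
# The SP-Cap wins massively on bulk capacitance per dollar; the ceramics
# win on high-frequency behavior, which is why a design wants both.
```

So even at 10x the unit price, the SP-Cap is the cheaper way to buy bulk capacitance; the ceramics are there for the high-frequency transients, not cost savings.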
 

TheF34RChannel

Senior member
May 18, 2017
786
309
136
I'll take back what I said before with this evidence in hand.

Glad I didn't get a card. Doubt they'll change production. Rather they'll change the BIOS, much cheaper, quicker and thus easier.

I'm already annoyed by the fact that for whatever card I get, I'll have to somehow check which components are on it (but how?), and whether I'll even get a straight answer to begin with. Ugh Ampere, ugh! A terrible start after a terrible preceding series.
 

loki1944

Member
Apr 23, 2020
99
35
51
Gonna say no. I can tell MS flight sim 2020 is making my 1080ti chug. I've seen some other games like resident evil 2 and call of duty use up my vram pretty quickly if I want to turn up the settings like I used to.

I would prefer more especially for 4k but I currently game in 3440x1440 and in VR sometimes too.

That said, VRAM is not the only potential factor: a GPU could have 30GB of VRAM and still chug if it's not fast enough or if the game is just badly put together. I've seen both scenarios, with an 880M 8GB in multiple games and then with a 1080 Ti in Gears of War 5; chug and stutter central.
 

amenx

Diamond Member
Dec 17, 2004
3,851
2,019
136
The 24GB RTX 3090 is not far off the 3080 in 4K perf in MSFS 2020. In fact it's ahead by roughly the same margin as in other non-VRAM-intensive games (10-15%).
 

sze5003

Lifer
Aug 18, 2012
14,177
622
126
The 24GB RTX 3090 is not far off the 3080 in 4K perf in MSFS 2020. In fact it's ahead by roughly the same margin as in other non-VRAM-intensive games (10-15%).
In general the 3090 is not much more performant than the 3080 from what I have seen in most benchmarks. But ms flight sim 2020 is pretty heavy on visuals, especially if you use all live assets.
 

Bingy

Junior Member
Oct 19, 2005
21
0
66
Anecdotally, this exact debate has been happening my entire gaming career :) I remember people arguing about buying a 16MB or 32MB TNT2. It's certainly important, but there are so many other factors at play. How fast is the memory? How efficient/big is the cache? The new Radeon cards are rumoured to still be using GDDR6, with some leaks even suggesting the new 6700 XT will have 6GB. There's more to it than just the total amount of memory.

Another thing I think causes people to worry a bit more than they need to is the use of monitoring software that tends to report memory allocation rather than usage. Monitoring Flight Sim might show full allocation of VRAM but that doesn't necessarily mean it's actually using all that.

For the poll I answered yes, but I think what constitutes 'enough' is a lot more nuanced. The stock 2080 has about 448GB/s of memory bandwidth for its 8GB, while the 3080 cranks that up to 760GB/s for its 10GB. More room for data and much less time spent holding on to it.
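One quick way to see the "less time holding on to it" point: compute how long each card would take to sweep its entire VRAM pool once at peak bandwidth. A minimal sketch, using only the capacity and bandwidth figures above (peak theoretical numbers, not real-world throughput):

```python
# Time to touch every byte of VRAM once at peak memory bandwidth.
# Capacity and bandwidth figures are the ones quoted in the post above.

cards = {
    "RTX 2080": {"vram_gb": 8, "bandwidth_gbs": 448},
    "RTX 3080": {"vram_gb": 10, "bandwidth_gbs": 760},
}

for name, c in cards.items():
    sweep_ms = c["vram_gb"] / c["bandwidth_gbs"] * 1000
    print(f"{name}: {c['vram_gb']} GB @ {c['bandwidth_gbs']} GB/s "
          f"-> {sweep_ms:.1f} ms per full sweep")
```

Despite having 25% more memory, the 3080 can cycle through its whole pool noticeably faster than the 2080 can, which is the sense in which faster memory partially offsets a modest capacity.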

I get the temptation to wait for the 20gb card but I think that will only benefit people doing workstation/rendering types of tasks. For gaming the 3080's overall performance will likely be the thing that causes you to buy a new card within the next 3-4 years, not simply its 10gb of memory. The answer depends on what level of performance is 'enough' for you. If you're set on running ultra 4k in all games all the time with 60fps+ then yeah you'll probably want to upgrade within a few years. That doesn't mean the 20gb card will make much difference.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,229
9,990
126
Slightly OT, but screw it, I just ordered 2x RX 5600 XT 6GB cards, factory refurb, for $249.99 ea. from Newegg. Couldn't stand to wait for "Big Navi". Got to get more mining in.

Now THAT's a "VRAM limitation" - 6GB. Hopefully, enough to mine ETH for two more years, at least.
 

Soulmetzger

Member
Sep 30, 2020
26
3
51
www.heatware.com
There is nothing worse than buying hardware that doesn't fulfill your needs for the expected timeline. If you plan to purchase a 3080, then I would opt to spend the extra on the 20GB version. Especially if you're spending that much to begin with, the extra premium shouldn't break the bank.
 

Kuiva maa

Member
May 1, 2014
181
232
116
I went back to Resident Evil 2 remake for another playthrough today. Full ultra 3440x1440, 12.5GB of VRAM usage and an orange warning (my card has 16GB). Marvel Avengers spiked above 8GB as well with settings below max. Just 10GB feels way too low for anything above 1080p to me.
 

sze5003

Lifer
Aug 18, 2012
14,177
622
126
I went back to Resident Evil 2 remake for another playthrough today. Full ultra 3440x1440, 12.5GB of VRAM usage and an orange warning (my card has 16GB). Marvel Avengers spiked above 8GB as well with settings below max. Just 10GB feels way too low for anything above 1080p to me.
Was that actual usage measured by an app in the background? Because if it showed orange in the settings page of the game, I don't think that is memory that is actually used, but what the game "thinks" it will need. But I do agree: if you want all the bells and whistles on at higher than 1080p with ray tracing, you'll feel safer with more than 10GB.
 

Kuiva maa

Member
May 1, 2014
181
232
116
Was that actual usage measured by an app in the background? Because if it showed orange in the settings page of the game, I don't think that is memory that is actually used, but what the game "thinks" it will need. But I do agree: if you want all the bells and whistles on at higher than 1080p with ray tracing, you'll feel safer with more than 10GB.

I did two quick runs, one under DX11 and one under DX12, with the AB performance overlay running. After the game starts, it takes a few seconds for VRAM usage to climb to roughly 8300-8400MB, which is the baseline. Depending on the area or level of action, it can rise all the way to roughly 10500MB, and if I open the map, it will spike to 11400 or so. There doesn't seem to be a noticeable difference between APIs as far as VRAM is concerned, but this wasn't a deterministic scenario, just a quick and broad comparison. I assume the game knows that, given the selected settings, it will probably request the declared amount at some point (12930MB in my case, not too far from the 11400 I saw). The card is a Radeon VII btw. I am in the market for an upgrade, and from what I have been seeing lately, 16GB is the baseline I am comfortable with, no less.
 
  • Like
Reactions: Tlh97

sze5003

Lifer
Aug 18, 2012
14,177
622
126
I did two quick runs, one under DX11 and one under DX12, with the AB performance overlay running. After the game starts, it takes a few seconds for VRAM usage to climb to roughly 8300-8400MB, which is the baseline. Depending on the area or level of action, it can rise all the way to roughly 10500MB, and if I open the map, it will spike to 11400 or so. There doesn't seem to be a noticeable difference between APIs as far as VRAM is concerned, but this wasn't a deterministic scenario, just a quick and broad comparison. I assume the game knows that, given the selected settings, it will probably request the declared amount at some point (12930MB in my case, not too far from the 11400 I saw). The card is a Radeon VII btw. I am in the market for an upgrade, and from what I have been seeing lately, 16GB is the baseline I am comfortable with, no less.
Makes sense, and it pretty much confirms what I saw when I had the RE2 remake installed a month or two ago. I saw similar VRAM usage on my 1080 Ti, and then MSFS 2020 came along and did the same thing. I think the RE3 remake is similar in usage as well. I know people are saying 10GB is enough for 4K and whatnot, but I've seen game requirements and usage go up each year as more and more options become available in graphics settings.

Sure you can turn down some settings but I'm thinking most people won't do that if they have been used to playing their games a certain way.
 
  • Like
Reactions: Tlh97

psolord

Golden Member
Sep 16, 2009
1,875
1,184
136
I went back to Resident Evil 2 remake for another playthrough today. Full ultra 3440x1440, 12.5GB of VRAM usage and an orange warning (my card has 16GB). Marvel Avengers spiked above 8GB as well with settings below max. Just 10GB feels way too low for anything above 1080p to me.

I did two tests on RE3 remake a while back. One on my 7950 and one on my GTX 970.

I uploaded the videos if anyone cares (not clickbaiting, non monetized, non sponsored, hobbyist channel)



The game estimates 12.34GB for maxed settings, yet both the 7950 with 3GB and the 970 with *ahem* 3.5GB did OK.

I tend to think that Capcom overestimates the VRAM requirements.
 

Kuiva maa

Member
May 1, 2014
181
232
116
The game does tell you that you might encounter bugs if VRAM usage is too high. Other than that, to verify these cards actually did OK we would need to compare against a similar card with more VRAM across various video-memory-intensive settings. I recall people did report annoying stutters above a certain threshold. I would have tried to reproduce these myself, but I couldn't make the game go above 16GB.
 

Mopetar

Diamond Member
Jan 31, 2011
7,797
5,899
136
Looks like another case of allocated vs. used where it will happily set aside a boatload of memory even though it doesn't really need all of it. Even if it could load all of that allocated memory up with game assets, it likely just saves time when going between areas since it already has what it would want to load in memory.

I suppose at some point you'd have enough where all of a game's assets could be loaded into VRAM and it never has to go to disk or system memory for anything. Seems like less of an issue given how good an SSD is compared to a spinning disk and the new technologies we're seeing companies develop to push that even further.

Outside of a few specific titles that may legitimately need a lot of data in VRAM for whatever reason, I'm leaning more towards thinking 10GB will wind up being fine for the next 3 years. Eventually it will start to hit a wall, but if people are using DLSS then they're not running the game at 4K native anyway, and the memory requirements will naturally be more relaxed.
 
  • Like
Reactions: psolord and Tlh97

samboy

Senior member
Aug 17, 2002
217
77
101
My take is that if you plan on gaming in 4K then 10GB is not sufficient.

1080p seemed to converge on needing 4GB
1440p seemed to converge on needing 8GB (80% more pixels than 1080p. Some folks have 1600p which is 100% more pixels)

This would imply that 2160p or 4K would need around 16GB.
Essentially, all the graphics assets need to scale roughly proportionally to the resolution you are driving.
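The proportional-scaling argument above can be sketched in a few lines. To be clear, the linear per-pixel model is this post's assumption, not a measured requirement; it just takes 1080p needing ~4GB as the anchor and scales by pixel count:

```python
# Scale a 1080p VRAM baseline linearly with pixel count (an assumption of
# this argument, not a benchmark result) to estimate needs at higher resolutions.

RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "1600p": (2560, 1600),
    "2160p": (3840, 2160),
}

base_pixels = 1920 * 1080
base_vram_gb = 4.0  # the "1080p converged on ~4GB" anchor from above

for name, (w, h) in RESOLUTIONS.items():
    ratio = (w * h) / base_pixels
    print(f"{name}: {ratio:.2f}x the pixels -> ~{base_vram_gb * ratio:.0f} GB")
```

Since 2160p has exactly 4x the pixels of 1080p, the linear model lands on 16GB, matching the conclusion above (1440p comes out around 7GB under this model, a bit below the 8GB cards actually shipped with).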

As someone mentioned earlier on, game developers will still target 10GB; it will be popular enough and they can do this by using lower resolution graphics assets (which they need anyway to support lower resolutions).

My expectation is that you will be disappointed down the road if you plan on gaming at 4K with the highest-fidelity graphics assets (such as textures); especially once the install base of 16GB cards grows enough for developers to target, which may take 1-2 years.
 
  • Like
Reactions: VirtualLarry

TheF34RChannel

Senior member
May 18, 2017
786
309
136
Seeing as it depends on a case-by-case basis, I'd opt for 16GB. It's better to have and not need than to need and not have.

I'm also not buying an expensive card to turn down settings <1yr. :p
 
  • Like
Reactions: Magic Carpet

n0x1ous

Platinum Member
Sep 9, 2010
2,572
248
106
Quake Champions maxed at 4K on a 3090 uses 12GB. I know it may just be allocated, not needed, but it's another data point FWIW. I couldn't deal with the 10GB anxiety, so I had to go 3090.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
Cross post from Ampere Architecture thread:

That goalpost is now 3 time zones from where it started.

We've had this conversation for two plus decades now. Basically every time those who say less vram will be fine have been wrong, especially immediately before a new console generation.

That conversation was not about future-proofing; it was about someone claiming a memory size would be good forever. No one is saying 8GB will be fine forever.

I am just pointing out that the one example people keep harping on is NOT a problem. It just hints at a problem.

Bottom line: more VRAM won't future-proof current cards, because you will need to turn settings down sooner and more often for lack of bandwidth/cores than for lack of VRAM.

The above is demonstrably true, because this is already the case.
 
Last edited:
  • Like
Reactions: deathBOB

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
The people have spoken. Whether 10GB is "enough" or not is case dependent and is now beside the point. However, with a "flagship" card there should be no question about it, and clearly there is nothing but questions about Nvidia's choice to screw people with 10GB of VRAM. People want more and aren't happy with a new "flagship" card having less VRAM than the previous two generations had. It's looking more and more likely that Dr. Su is about to kick their VRAM-skimping butts. They deserve to get rekt.