Question: Is 10GB of VRAM enough for 4K gaming for the next 3 years?


moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
The question is simple. Will 10GB be enough moving forward for the next 3 years? We all know what happens when VRAM gets breached: skips, stutters and chugging.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
The people have spoken. Whether 10GB is "enough" or not is case-dependent and is now beside the point. However, with a "flagship" card, there should be no question about it, and clearly, there is nothing but questions about Nvidia's choice to screw people with 10GB of RAM. People want more and aren't happy with a new "flagship" card having less RAM than the previous two generations had. It's looking more and more likely that Dr. Su is about to kick their VRAM-skimping butts. They deserve to get rekt.

"The people" have created larger demand for that 10GB card than just about any GPU in history.
 
  • Like
Reactions: amenx

dr1337

Senior member
May 25, 2020
311
514
106
Bottom line: More VRAM won't future-proof current cards, because you will need to turn settings down sooner and more often for lack of bandwidth/cores before you need to turn them down for lack of VRAM.
This is false and I have proof.


Please, I implore you to read this whole article. Really pay attention to the summary and the tests at max settings, but the real bottom line is that more VRAM 100% can contribute to the performance of a GPU. Back then, a 50% increase in memory was a big selling point, and while the 680 handily beat the 7970 at lower resolutions, it really fell behind at higher res. And now, here in the future, the 7970 absolutely eclipses the 680 and handles high texture settings much, much better.

One could argue something along the lines of AMD fine wine, or Nvidia gimping drivers / not adding new support. And maybe that's part of it, but seeing this massive divergence in performance historically, why would you want to artificially limit how "future proof" your GPU is? Especially considering consoles are easily going to be using 12GB+ of VRAM in a few years.
 

amenx

Diamond Member
Dec 17, 2004
3,851
2,019
136
Can't compare different architectures to make a case for VRAM. All other variables must be identical. Best to wait for the RTX 3070 8GB and 16GB versions and put them through the tests.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
This is false and I have proof.


Please, I implore you to read this whole article. Really pay attention to the summary and the tests at max settings, but the real bottom line is that more VRAM 100% can contribute to the performance of a GPU. Back then, a 50% increase in memory was a big selling point, and while the 680 handily beat the 7970 at lower resolutions, it really fell behind at higher res. And now, here in the future, the 7970 absolutely eclipses the 680 and handles high texture settings much, much better.

One could argue something along the lines of AMD fine wine, or Nvidia gimping drivers / not adding new support. And maybe that's part of it, but seeing this massive divergence in performance historically, why would you want to artificially limit how "future proof" your GPU is? Especially considering consoles are easily going to be using 12GB+ of VRAM in a few years.

Aside from the completely different architectures, the real problem here is that you're making an argument for keeping a top-end GPU past the point where new entry-level GPUs are beating it.

By that point, it's kind of moot. Sure, 7 years from now 16GB will likely be a much bigger factor, but a 7-year-old GPU will be considered obsolete by then, and if you are the type of person who buys brand-new top-end GPUs near release, chances are you will replace it before 5 years are out.
 

CastleBravo

Member
Dec 6, 2019
119
271
96
This is false and I have proof.


Please, I implore you to read this whole article. Really pay attention to the summary and the tests at max settings, but the real bottom line is that more VRAM 100% can contribute to the performance of a GPU. Back then, a 50% increase in memory was a big selling point, and while the 680 handily beat the 7970 at lower resolutions, it really fell behind at higher res. And now, here in the future, the 7970 absolutely eclipses the 680 and handles high texture settings much, much better.

One could argue something along the lines of AMD fine wine, or Nvidia gimping drivers / not adding new support. And maybe that's part of it, but seeing this massive divergence in performance historically, why would you want to artificially limit how "future proof" your GPU is? Especially considering consoles are easily going to be using 12GB+ of VRAM in a few years.

So there are a tiny handful of modern games that, eight years later, the 680 chokes on while the 7970 can still manage 60+ fps at 1080p medium settings. Looking at the 2017 or earlier titles, the 680 trades blows with the 7970, so even with its tiny 2GB of VRAM, it still performed well for at least five years.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,248
136
For starters, the intent of this posting is the memory-demand topic. Sure, it's an AMD-sponsored game, but they haven't really leveraged their console dominance in the past. Nvidia, on the other hand, is a different story.....


I'm leaning towards the consoles and the new AMD offerings having the same 16GB of memory for more than just the bigger number.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
That's just AMD creating a setting that uses a little bit more VRAM than NVidia cards have, as a slime move.

The exact kind of thing that AMD fans whine about NVidia doing.

There are no good guys when it comes to corporate behavior.
 
  • Haha
Reactions: spursindonesia

VirtualLarry

No Lifer
Aug 25, 2001
56,229
9,990
126
That's just AMD creating a setting that uses a little bit more VRAM than NVidia cards have, as a slime move.

The exact kind of thing that AMD fans whine about NVidia doing.

There are no good guys when it comes to corporate behavior.
Did you likewise criticize Nvidia and their sponsored games for "over-tessellation" of things like ground surfaces and "hidden" objects, just to spite AMD, when Nvidia cards had greater tessellation power?

Thought not.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
For starters, the intent of this posting is the memory-demand topic. Sure, it's an AMD-sponsored game, but they haven't really leveraged their console dominance in the past. Nvidia, on the other hand, is a different story.....


I'm leaning towards the consoles and the new AMD offerings having the same 16GB of memory for more than just the bigger number.

And this is what AMD's boot tastes like as it rapidly enters Nvidia's throat at Su-speed.
 

Mopetar

Diamond Member
Jan 31, 2011
7,797
5,899
136
The more I think about it, the more I think that 10 GB will probably be fine for another 2-3 years for the vast majority of titles. Sure, there will be a few where that isn't quite enough, or some settings you can enable to make 10 GB become a wall, but I suspect it won't be a problem in the same way that 8 GB will be in the same time frame.

First of all, even though the consoles will have 16 GB of RAM, that's not the same as 16 GB of VRAM, since the console OS will use some of that memory and the game will need some memory to store anything that normally gets put in system RAM on a PC. How much that is exactly remains to be seen.

The other thing that makes me think that 10 GB will be enough is that the new Xbox has 10 GB of fast RAM. Titles on that console which use more than that will have to manage it a bit more carefully or just accept that performance may suffer a little as a result.

Obviously not every title will play by those rules, but all those factors make me think that 10 GB will be a common target that developers of the big cross platform titles will aim for.
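
To put rough numbers on that, here's a back-of-the-envelope sketch in Python. The OS reserve and CPU-side game data figures are assumptions for illustration only, not published numbers:

```python
# Rough console memory budget, all figures in GB.
# The OS reserve and CPU-side game data values below are assumed
# for illustration only; the actual split varies per title and platform.
TOTAL_SHARED = 16.0   # unified memory on the new consoles
OS_RESERVE = 2.5      # assumed OS/system reservation
CPU_SIDE_DATA = 3.5   # assumed game data that would sit in system RAM on a PC

effective_vram = TOTAL_SHARED - OS_RESERVE - CPU_SIDE_DATA
print(f"Effective graphics budget: ~{effective_vram:.1f} GB")  # -> ~10.0 GB
```

Under those assumptions the graphics budget lands right around the 10 GB of fast RAM the new Xbox exposes, which is why 10 GB looks like a plausible common target.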
 
  • Like
Reactions: psolord

Heartbreaker

Diamond Member
Apr 3, 2006
4,222
5,224
136
Did you likewise criticize Nvidia and their sponsored games for "over-tessellation" of things like ground surfaces and "hidden" objects, just to spite AMD, when Nvidia cards had greater tessellation power?

Thought not.

Same kind of thing. I am not defending NVidia. It's the way corporations behave when they perceive an upper hand.

I am just pointing out that one is just as bad as the other. As AMD rises, expect more of this from them...
 

Glo.

Diamond Member
Apr 25, 2015
5,661
4,419
136
Well, the guys from Hardware Unboxed tested the RTX 3070's 8GB VRAM framebuffer and its impact on performance in Watch Dogs Legion at 1440p, at the highest possible settings, and they say that at 1440p, 8 GB of VRAM was not sufficient.

So yeah, I'd say that 10 GB is not enough for future games. 8 GB for 1080p, 10 GB for 1440p(!) and 12 GB at bare minimum for 4K. Depending on the title.

The more VRAM you have, the longer your GPU's lifespan will be.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
What do we PC gamers always want? We've always wanted developers to push games to use high-end PC hardware. AMD is leading the way in this area with larger frame buffers, and developers are using that ability. If Nvidia falls behind with their hilarious 8 and 10GB buffers, that's not my problem and it shouldn't be yours either. Just don't buy overpriced crap with inadequate RAM and you'll be in good shape.
Also, I wouldn't be happy with 10GB for 1440p either. Just saying. The 6000 series has 16GB all around, so you aren't forced to compromise in an area as critical as RAM. If you save money with a 6800 and have a 1440p monitor, you can upgrade it to an ultrawide or 4K without stressing about RAM. Nvidia just screws you there. Want a decently priced Nvidia card? You get a slow 3070 with 8GB of RAM, LOL. Nice one there. The 6800 absolutely REKS that card with twice the RAM for only $80 more. Good Lord.
 

Hitman928

Diamond Member
Apr 15, 2012
5,182
7,632
136
What do we PC gamers always want? We've always wanted developers to push games to use high-end PC hardware. AMD is leading the way in this area with larger frame buffers, and developers are using that ability. If Nvidia falls behind with their hilarious 8 and 10GB buffers, that's not my problem and it shouldn't be yours either. Just don't buy overpriced crap with inadequate RAM and you'll be in good shape.
Also, I wouldn't be happy with 10GB for 1440p either. Just saying. The 6000 series has 16GB all around, so you aren't forced to compromise in an area as critical as RAM. If you save money with a 6800 and have a 1440p monitor, you can upgrade it to an ultrawide or 4K without stressing about RAM. Nvidia just screws you there. Want a decently priced Nvidia card? You get a slow 3070 with 8GB of RAM, LOL. Nice one there. The 6800 absolutely REKS that card with twice the RAM for only $80 more. Good Lord.

It seems to me like game engines have become a lot smarter about memory management. My point is that we have games today that can use (not just allocate) more than 8 GB of VRAM; however, unlike in the past, these games don't become a stuttering mess when run on 8 GB cards at max settings. Maybe it's just because they aren't going over 8 GB by much, but it seems like the games can handle this situation better than before, going into a resource-management mode to deal with the lack of VRAM. The end result is lowered performance overall and maybe some minor stuttering, but not like it used to be.

Maybe it's also that system RAM and bus speeds have gotten so much faster that going out to system RAM is less of a penalty. Maybe a combination of a lot of things. The main point is that going over VRAM doesn't seem as torturous as it used to be, though obviously you'd rather not take the performance penalty in doing so. Maybe as games use more and more VRAM, extending well past 8GB, we'll see the major issues pop up again. I also know some devs have started using dynamic quality settings to manage such situations, where you won't get a performance penalty but the game will dynamically load in lower-quality assets to keep performance up. I think this has mostly been used on consoles but can and probably will be used on PC as well.
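
A minimal sketch of what that kind of dynamic quality fallback could look like, assuming a least-recently-used mip-dropping policy; every name and number here is hypothetical, not any real engine's API:

```python
from dataclasses import dataclass

@dataclass
class Texture:
    name: str
    mip_sizes_mb: list    # resident size in MB at each mip level, largest first
    mip: int = 0          # 0 = full resolution
    last_used_frame: int = 0

    @property
    def resident_mb(self):
        return self.mip_sizes_mb[self.mip]

def enforce_vram_budget(textures, budget_mb):
    """Drop mip levels on the least-recently-used textures until the
    resident set fits the budget, instead of stuttering on overflow."""
    used = sum(t.resident_mb for t in textures)
    for tex in sorted(textures, key=lambda t: t.last_used_frame):
        while used > budget_mb and tex.mip < len(tex.mip_sizes_mb) - 1:
            used -= tex.resident_mb
            tex.mip += 1  # swap in the next-smaller mip
            used += tex.resident_mb
    return used
```

On an 8 GB budget, a scene authored for 10 GB would quietly settle at lower texture mips rather than thrashing over the bus, which matches the "lowered performance, minor stuttering" behavior described above.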
 

Mopetar

Diamond Member
Jan 31, 2011
7,797
5,899
136
So your theory is that AMD convinced the developers to practice bad memory management just to be sure to exceed 10 GB of VRAM?

I assume it doesn't take too much effort to convince developers to be lazy and not try to optimize something harder.

If we wanted a real comparison, wouldn't AMD need to have the developers load in some large assets that are basically beneath the map and never actually seen by the players?
 

Golgatha

Lifer
Jul 18, 2003
12,640
1,482
126
Thinking more on it, I normally wouldn't worry about VRAM size, as I used to upgrade often enough that it didn't matter. Now video cards are much more expensive, but they also tend to get a year or two more use from me. I want something I can use and have a great experience with for an expected lifetime of 2-4 years in my computer. I don't think cards with 8-10GB of RAM can deliver that over such a time frame.

I also want to see some reviews and read forum posts from people using Big Navi. I haven't purchased a Radeon board since my 5850 Crossfire days, and I haven't seen anything in reviews or forum posts to give me confidence that AMD's drivers have improved to the level of "it just works", which is the experience I've had with nVidia hardware for years now. Honestly, I can't remember the last time I had a game crash on my computer, and I have hundreds of games installed.
 
  • Like
Reactions: spursindonesia

Leeea

Diamond Member
Apr 3, 2020
3,599
5,340
106
Two games at more than 10 GB of VRAM now. But Godfall is associated with AMD's marketing; MS Flight Sim is more worrying.

Games designed for consoles will have 16 GB of shared RAM. Shared console RAM runs at GPU speeds, giving consoles 16 GB minus their non-graphics usage as video RAM, and most of that RAM is used for graphics.

If a console game allocates more than 10 GB for graphics, any 10 GB card is going to have to compromise, likely on texture quality.


I would be leery of a midrange card with 10 GB now, much less a high-end one. Texture quality is frequently the biggest difference, and it is determined by RAM more than by the GPU.
 
  • Love
Reactions: spursindonesia

sze5003

Lifer
Aug 18, 2012
14,177
622
126
Nvidia is probably going to counter with this once the AMD series reviews and benchmarks are out, although I don't see how it's meant to swing buyers over so much. It's probably meant to trade blows and make up for the 10GB of RAM on the 3080.


Again, it's only a leak, and this is about the second time I've seen it so far.
 

sze5003

Lifer
Aug 18, 2012
14,177
622
126
If true, this 3080 Ti would have the same number of CUDA cores as the 3090 but 20GB of RAM. I think it would be competing with the 6800 XT here.

It would probably still be cheaper for me to get this 3080 Ti than to do an all-AMD build with a 6800 XT. Although I've been wondering what the experience would be like using an all-AMD build with a G-Sync monitor lol.
 

Rakehellion

Lifer
Jan 15, 2013
12,182
35
91
The people have spoken. Whether 10GB is "enough" or not is case-dependent and is now beside the point. However, with a "flagship" card, there should be no question about it, and clearly, there is nothing but questions about Nvidia's choice to screw people with 10GB of RAM. People want more and aren't happy with a new "flagship" card having less RAM than the previous two generations had. It's looking more and more likely that Dr. Su is about to kick their VRAM-skimping butts. They deserve to get rekt.

Yeah, I've been with Intel and Nvidia for the last 20 years, but now it just straight up doesn't make sense anymore. I already bought a Ryzen.