Question Is 10GB of VRAM enough for 4K gaming for the next 3 years?


moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
The question is simple: will 10GB be enough for the next 3 years? We all know what happens when VRAM gets breached: skips, stutters, and chugging.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
Are you actually using more RAM at 1080p 360Hz than at 1080p 144Hz with the same settings? 1080p isn't really the issue; I run 4K, and that's where the concern is.

The comment was geared more or less towards the 3080's performance. Depending on reviews they might be worthy of one another.

I'm a 144Hz FreeSync RX 5700 part-time peasant gamer who balks at cards over $400. I only paid $300 at launch for my 5700 thanks to Microcenter's double $50 bundle deal.
 
  • Like
Reactions: Tlh97 and blckgrffn

Golgatha

Lifer
Jul 18, 2003
12,650
1,512
126
That RTX 3070 is basically obsolete right out of the box IMO. 8GB should be for 3060-class cards for 1080p-1440p gaming. Is 8GB even enough for 1440p moving forward?

I would say no. Wildlands at 1440p is bumping up against 6.5GB of VRAM on Ultra. It's only going to get worse, and the consoles will have about 10GB to use, so developers will target that standard over the next few years. Personally, I'll wait for a 20GB 3080 or a Ti with a little more RAM.
 
  • Like
Reactions: Tlh97

Jaskalas

Lifer
Jun 23, 2004
33,428
7,489
136
It might be, with RTX IO. Otherwise, I can see at least some XSX/PS5 ports being bottlenecked.

Assuming the PS5 shares RAM between CPU and GPU, it only has 16GB total. An even split affords the PS5 just 8GB of VRAM.

You can start to see how even a 3070 is future proof.
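Back-of-the-envelope, that split looks like this (a sketch; the OS reserve and the CPU/GPU shares below are my assumptions, not published figures):

```python
# Rough PS5 memory-split arithmetic. The 16GB unified pool is public;
# the OS reserve and CPU/GPU shares are assumptions for illustration.
TOTAL_GB = 16.0
OS_RESERVE_GB = 2.5  # assumed: consoles hold back a chunk for the OS

game_pool = TOTAL_GB - OS_RESERVE_GB
for cpu_share in (0.50, 0.35, 0.25):
    gpu_gb = game_pool * (1 - cpu_share)
    print(f"CPU share {cpu_share:.0%}: ~{gpu_gb:.1f} GB left for GPU data")
```

So a GPU-heavy split could push past 10GB, but anything close to an even one lands under 8GB.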
 

HurleyBird

Platinum Member
Apr 22, 2003
2,684
1,267
136
Assuming the PS5 shares RAM between CPU and GPU, it only has 16GB total. An even split affords the PS5 just 8GB of VRAM.

You can start to see how even a 3070 is future proof.

Well, you aren't going to have a perfectly even split, and some games are going to use >10GB for GPU.

Besides that, a lot of games, especially PS5 ones, will use the SSD more like memory than storage.

You can probably get away with not replicating the SSD trick if you have a ton of memory. And you can probably get away with not having a ton of memory if you replicate the SSD trick. If you have neither you're going to run into problems, the only question being how soon into the generation you do.
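A toy sketch of that trade-off (the class, names, and sizes are made up for illustration; real engines stream at tile/mip granularity):

```python
from collections import OrderedDict

class TextureCache:
    """Toy model of the 'SSD trick': keep a small resident set in VRAM
    and stream evicted assets back in from SSD on demand."""

    def __init__(self, vram_budget_mb: int):
        self.budget = vram_budget_mb
        self.used = 0
        self.resident = OrderedDict()  # texture id -> size (MB), LRU order

    def request(self, tex_id: str, size_mb: int) -> str:
        if tex_id in self.resident:
            self.resident.move_to_end(tex_id)  # hit: mark recently used
            return "hit"
        # Evict least-recently-used textures until the new one fits.
        while self.resident and self.used + size_mb > self.budget:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size
        self.resident[tex_id] = size_mb  # "stream in" from SSD
        self.used += size_mb
        return "streamed"

cache = TextureCache(vram_budget_mb=10_000)  # a 10GB card
print(cache.request("rock_4k", 256))  # -> streamed
print(cache.request("rock_4k", 256))  # -> hit
```

The smaller the budget, the more often you take the "streamed" path, and whether that hitches depends entirely on how fast that path is.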
 
  • Like
Reactions: Tlh97 and Saylick

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
I have yet to see anyone actually show 10GB+ of VRAM being used rather than just reserved. Unlike the rest of us, Nvidia has the tools to see actual VRAM usage and not the reservation that user-side tools show.

360Hz is just nuts, makes my 3x 1080p 120Hz displays seem slow as hell.

Hardware Unboxed did a test showing the 8GB 2080 suddenly dropping off at 4K, allowing the 1080Ti to blow past it even though the 2080 was killing it at 1440p. I've read about (but not confirmed myself) performance tests showing 8GB cards hitching, stuttering, and losing FPS due to VRAM bottlenecks. I've experienced these issues myself in the past with GTX 570s in SLI (1.5GB of VRAM instead of the 2GB they should have had) when playing BF3, so I know what happens when a generation of cards gets skimped on VRAM. They are fine at release, but when a new game comes out even a few months down the road, suddenly you have issues that can't be resolved unless you lower settings.

There is no feeling more frustrating than having to lower settings to prevent hitching when you have plenty of GPU horsepower but are being held back by low VRAM. There's just no way I can justify this in my mind and buy a 3080 on release. I'm in no mood to buy a $700 card only to swap it out a few months later for the card it should have been from the start.
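One caveat when reading the user-side numbers: overlays report what the driver has allocated, not what the game actively touches each frame. A minimal NVML query (assuming the pynvml package is installed) returns exactly that allocated figure:

```python
# Prints VRAM *allocated* on GPU 0 - the same style of number that
# overlay tools surface; it says nothing about the active working set.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"allocated {mem.used / 2**30:.1f} GiB of {mem.total / 2**30:.1f} GiB")
pynvml.nvmlShutdown()
```

Which is why the hitching and frame-time tests matter more than the raw usage readouts.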
 

CakeMonster

Golden Member
Nov 22, 2012
1,389
496
136
There should be all kinds of VRAM benchmarks out very soon, maybe even on older cards before the release. I bet all the hardware sites and channels are on it.
 

Ranulf

Platinum Member
Jul 18, 2001
2,348
1,165
136
I would not feel comfortable with 10GB of VRAM on a flagship-level card with new 4K consoles about to be released. That's just me though. 8GB for the 3070 is just crazy IMO. I'd definitely get the 16GB version of that model.

Kinda reminds me of the 3GB vs 6GB debate, and we all know how that panned out (spoiler alert: the 3GB version aged like milk).

And that debate was just a rehash of the 2GB vs 4GB one from the generation before, with cards like the GTX 960.
 

Thala

Golden Member
Nov 12, 2014
1,355
653
136
They are fine at release, but when a new game comes out even a few months down the road, suddenly you have issues that can't be resolved unless you lower settings.

Chances are higher that you'll need to lower settings due to performance issues before you need to lower them due to VRAM issues. Case in point: Flight Simulator 2020 seems to work fine with 8GB, yet even with an RTX 3080 you will not hit 60fps at the highest settings in 4K.
 

ryrynz

Junior Member
Feb 8, 2009
16
0
66
So for at least 1 game, it appears the new 8 and 10GB cards are obsolete right out of the gate. I'm wondering if the game is just "caching" instead of requiring that framebuffer amount.

You're wrong; this game hasn't even been benchmarked on a 10GB 30-series card, so you don't know its texture requirements on this hardware.
Tensor Memory Compression literally changes the game here.
 

Capt Caveman

Lifer
Jan 30, 2005
34,547
651
126

Q: Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?

We’re constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4K resolution with all the settings maxed out at the best possible price. In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples – if you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4K with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory. Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.
 

CakeMonster

Golden Member
Nov 22, 2012
1,389
496
136
You're wrong; this game hasn't even been benchmarked on a 10GB 30-series card, so you don't know its texture requirements on this hardware.
Tensor Memory Compression literally changes the game here.

I'd like to bump my question from earlier in the thread:

Can we clear up compression once and for all? I seem to remember the 980 or possibly the 1080 had a compression scheme that suddenly improved bandwidth but that's not what we're talking about here? I see tons of people making outrageous claims about this new memory acting like twice the amount. Even I understand that that's bunk, but has there been any real reduction whatsoever of the actual space assets take up while in GPU RAM, and is that even possible?
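For reference, the 980-era scheme I'm thinking of was lossless delta color compression, which as I understand it saves bandwidth, not capacity: the compression ratio isn't guaranteed, so buffers still have to be allocated at full size. A toy analogy of the distinction, with zlib standing in for the hardware compressor:

```python
# Lossless compression can shrink the bytes *moved*, but the *allocation*
# stays full size, because worst-case data may not compress at all.
import os
import zlib

tile_flat = bytes(64 * 1024)        # a flat, highly compressible 64KB tile
tile_noisy = os.urandom(64 * 1024)  # incompressible noise, same size

for name, tile in (("flat", tile_flat), ("noisy", tile_noisy)):
    moved = len(zlib.compress(tile))
    print(f"{name}: allocate {len(tile)} bytes, transfer ~{moved} bytes")
```

If Ampere really reduced the space assets occupy in VRAM, that would be something new, and I'd like to see it demonstrated.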
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
You're wrong; this game hasn't even been benchmarked on a 10GB 30-series card, so you don't know its texture requirements on this hardware.
Tensor Memory Compression literally changes the game here.

I don't believe you. I think the truth is that the 3080 is simply a 2080 replacement and it's not even all that much faster than a 2080Ti outside of the bandwidth-sensitive 4K scenarios shown by Nvidia. Nvidia hasn't announced an upgrade for 1080Ti people yet. It's going on what, 4 years now? I'm waiting...
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
I don't believe you. I think the truth is that the 3080 is simply a 2080 replacement and it's not even all that much faster than a 2080Ti outside of the bandwidth-sensitive 4K scenarios shown by Nvidia. Nvidia hasn't announced an upgrade for 1080Ti people yet. It's going on what, 4 years now? I'm waiting...

Waiting for a replacement or backlash from the community? /s

The 3080 is a salvaged die. The real question is whether it was an engineering oversight that caused it, or whether it was intentionally gimped to 10GB of memory.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
just buy the 3090, problem solved

Very true. I've seen some people on other forums who feel stuck like me between downgrading to a 3080 or overspending on a 3090. There isn't an appropriate offering yet, and I think people have caught on to that. Not everyone though. Most people already dumped their 2080Ti on eBay for peanuts, a decision they may come to regret. I've got a feeling a 2080Ti will prove to be a faster card than a 3070 and have more VRAM to boot, and at $400-ish? Dang. Question is, how will all those 2080Ti people feel if that 3080 isn't quite as good as they thought? I'm really hoping the only downside is that missing gig of RAM, but I'm starting to get some bad vibes.
 

brianmanahan

Lifer
Sep 2, 2006
24,233
5,632
136
Very true. I've seen some people on other forums who feel stuck like me between downgrading to a 3080 or overspending on a 3090. There isn't an appropriate offering yet, and I think people have caught on to that. Not everyone though. Most people already dumped their 2080Ti on eBay for peanuts, a decision they may come to regret. I've got a feeling a 2080Ti will prove to be a faster card than a 3070 and have more VRAM to boot, and at $400-ish? Dang. Question is, how will all those 2080Ti people feel if that 3080 isn't quite as good as they thought? I'm really hoping the only downside is that missing gig of RAM, but I'm starting to get some bad vibes.

i think there's a 50% chance i go nuts and buy the 3090

and 50% that i do the sensible thing and wait for a 3080 ti/super/whatever for like half the price
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
i keep confusing moonbogg with moonbeam

and then i get surprised when his posts aren't off the wall philosophical lectures

though perhaps an off the wall philosophical lecture is what i need to help me decide between the 3090 and the 3080 v2

Being able to fork over the cash for a product is totally separate from an offensively priced product making your stomach turn. Paying an offensive price is a behavioral act independent of all other factors, financial or otherwise. It's a symptom of one's character that manifests itself as a badge of shame during the contemplation of such a purchase. However, once the act is committed, it is forever sealed in both time and our collective memory. The badge then transitions into an iron-branded symbol of TOMFOOLERY upon the center of the forehead of the self-victimized one, remaining forever recognizable as such within the social context of their experience.
 

brianmanahan

Lifer
Sep 2, 2006
24,233
5,632
136
Being able to fork over the cash for a product is totally separate from an offensively priced product making your stomach turn. Paying an offensive price is a behavioral act independent of all other factors, financial or otherwise. It's a symptom of one's character that manifests itself as a badge of shame during the contemplation of such a purchase. However, once the act is committed, it is forever sealed in both time and our collective memory. The badge then transitions into an iron-branded symbol of TOMFOOLERY upon the center of the forehead of the self-victimized one, remaining forever recognizable as such within the social context of their experience.

(reaction GIF)
 

sze5003

Lifer
Aug 18, 2012
14,182
625
126
Very true. I've seen some people on other forums who feel stuck like me between downgrading to a 3080 or overspending on a 3090. There isn't an appropriate offering yet, and I think people have caught on to that. Not everyone though. Most people already dumped their 2080Ti on eBay for peanuts, a decision they may come to regret. I've got a feeling a 2080Ti will prove to be a faster card than a 3070 and have more VRAM to boot, and at $400-ish? Dang. Question is, how will all those 2080Ti people feel if that 3080 isn't quite as good as they thought? I'm really hoping the only downside is that missing gig of RAM, but I'm starting to get some bad vibes.

I think all of us 1080Ti owners are in this situation now. On one hand, I don't want to wait another year, or even 6 months, IF a 3080Ti ever comes.

And the inverse is to spend a ton on the 3090 and keep it as long as you can, as it should be pretty darn good for the next 3 years. The thing with this model is that if you can't get one from Nvidia directly, you'll spend nearly $2k or more on the third-party brands.
 
  • Like
Reactions: Tlh97 and moonbogg

MrTeal

Diamond Member
Dec 7, 2003
3,569
1,698
136
$2k is insanity for a GeForce card.
No one at the time said the 2080 Ti was a good value, but at least it was 42% more shaders and an extra 3GB for 72% more money. The 3090 is 21% more shaders and 23% more bandwidth for 114% more money. It has 24GB of VRAM, but that's meaningless for a gaming card. Having to look at spending an extra $800 at launch for an ok-ish bump in shaders and a really safe amount of RAM is nuts.
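Those ratios check out if you run the numbers (I'm assuming the 2080 Super as the Turing baseline and Founders Edition list prices):

```python
# Sanity check of the value math; shader counts, bandwidth (GB/s), and FE
# list prices below are the assumed baselines for the comparison.
cards = {
    "2080 Super": {"shaders": 3072, "price": 699},
    "2080 Ti":    {"shaders": 4352, "price": 1199},
    "3080":       {"shaders": 8704, "price": 699, "bw": 760},
    "3090":       {"shaders": 10496, "price": 1499, "bw": 936},
}

def pct_more(a: str, b: str, key: str) -> float:
    return 100 * (cards[a][key] / cards[b][key] - 1)

print(f"2080 Ti vs 2080 Super: +{pct_more('2080 Ti', '2080 Super', 'shaders'):.0f}% "
      f"shaders for +{pct_more('2080 Ti', '2080 Super', 'price'):.0f}% money")
print(f"3090 vs 3080: +{pct_more('3090', '3080', 'shaders'):.0f}% shaders, "
      f"+{pct_more('3090', '3080', 'bw'):.0f}% bandwidth, "
      f"+{pct_more('3090', '3080', 'price'):.0f}% money")
```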

The more I think about it, the more I'm leaning toward waiting, since I'll probably be breaking my loop later in the fall for Zen 3 anyway.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
I've decided to just accept that Nvidia is holding their cards close to the vest to see what AMD offers. They have a halo card in the 3090 that they are sure AMD can't touch; a premium card in the 3080 that is generally a performance increase over the 2080Ti in all but a few situations; and, in the 3070, a card that is clearly better than the best AMD currently makes, priced to draw a lot of people in for a reasonable performance upgrade.

If AMD decides to come out with VRAM guns blazing, pushing 16GB cards, then Nvidia can easily match and beat that with a 3070Ti at 16GB and a 3080Ti at 20GB, and still holds the crown with the 3090. The only snag would be if AMD went with 32GB of VRAM, but that would just be insane: priced even higher than the 3090 just to break even, and still bringing no performance benefit in games.
 
  • Like
Reactions: Tlh97 and moonbogg