
Question: Is 10GB of VRAM enough for 4K gaming for the next 3 years?


Is 10GB of VRAM enough for 4K gaming for the next 3 years?

  • Yes

    Votes: 47 32.6%
  • No

    Votes: 97 67.4%

  • Total voters
    144

CakeMonster

Golden Member
Nov 22, 2012
1,025
96
91
I can't imagine this being a problem within 2 years (definitely a new generation by then, with lots more RAM and a smaller process). 3-4 years is probably safe too, but getting harder to predict. If you are on a 1-2 year upgrade cycle anyway, there's no reason not to get the 3080, if you need/want the performance, that is.

Can we clear up compression once and for all? I seem to remember the 980 or possibly the 1080 had a compression scheme that suddenly improved bandwidth, but that's not what we're talking about here? I see tons of people making outrageous claims about this new memory acting like twice the amount. Even I understand that that's bunk, but has there been any real reduction whatsoever in the actual space assets take up while in GPU RAM, and is that even possible?
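For what it's worth, the Maxwell/Pascal-era scheme being remembered here is lossless delta color compression, and it saves bandwidth, not capacity: the surface is still allocated at full size in VRAM, only the bytes moved per access shrink. A toy zlib sketch of that distinction (illustrative numbers only, not how the hardware actually works):

```python
import zlib

# A 1 MiB "framebuffer" whose neighboring values are similar, so it
# compresses well -- the favorable case for delta color compression.
framebuffer = bytes([x % 8 for x in range(1 << 20)])

# What crosses the memory bus can be the compressed representation...
bus_bytes = len(zlib.compress(framebuffer))

# ...but the VRAM allocation is still reserved at full size.
allocated_bytes = len(framebuffer)

assert allocated_bytes == 1 << 20
assert bus_bytes < allocated_bytes  # bandwidth saved, capacity unchanged
```

So claims that the new memory "acts like twice the amount" conflate the two: effective bandwidth goes up, but a 10GB card still holds 10GB of assets.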
 

MrTeal

Diamond Member
Dec 7, 2003
3,104
775
136
I can't imagine this being a problem within 2 years (definitely new generation by then with lots more ram and smaller process). 3-4 years is probably safe too but getting harder to predict. If you are at a 1-2 year upgrade cycle anyway, there's no reason not to get the 3080, if you need/want the performance that is.

Can we clear up compression once and for all? I seem to remember the 980 or possibly the 1080 had a compression scheme that suddenly improved bandwidth but that's not what we're talking about here? I see tons of people making outrageous claims about this new memory acting like twice the amount. Even I understand that that's bunk, but has there been any real reduction whatsoever of the actual space assets take up while in GPU RAM, and is that even possible?
Are you talking about RTX IO?
I don't believe so. My understanding is that it allows the GPU to directly stream and decompress game data off storage, rather than the CPU handling that and the GPU fetching the decompressed data from system memory. Once the data is on the GPU, it would behave the same as it traditionally does.
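A rough Python sketch of the two data paths being described, with zlib standing in for whatever codec the engine actually uses (hypothetical sizes, just to show where the savings are):

```python
import zlib

# A compressed "game asset" as it would sit on disk.
asset_on_disk = zlib.compress(b"texture block " * 100_000)

# Traditional path: the CPU decompresses, then the full-size data
# crosses the PCIe bus to the GPU.
decompressed = zlib.decompress(asset_on_disk)
traditional_bus_bytes = len(decompressed)

# RTX IO-style path: the still-compressed stream crosses the bus and is
# decompressed on the GPU. Either way, once resident the data occupies
# the same amount of VRAM.
direct_bus_bytes = len(asset_on_disk)

assert direct_bus_bytes < traditional_bus_bytes
assert len(decompressed) == 14 * 100_000  # "texture block " is 14 bytes
```

Which matches the point above: the win is in I/O and CPU overhead on the way in, not in how much VRAM the data takes once it's there.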
 

DeathReborn

Platinum Member
Oct 11, 2005
2,329
232
106
Yeah, I might actually have misunderstood that. The storage might need to be PCIe connected, which doesn't necessarily preclude SATA SSDs, but it might not be as simple.

This is the page he linked about GPUDirect storage

So, how does RTX IO work for NVMe drives that are directly connected to a CPU x4 interface? The data would still need to be shuttled through the CPU, obviously, but it would still be a benefit since it can move in compressed form and without the CPU having to process it beyond routing it to the x16 GPU interface?
You know what would make it so much simpler... putting a SSD on the graphics card, I guess someone was ahead of the curve on that front.

 
  • Like
Reactions: Tlh97 and SamMaster

CakeMonster

Golden Member
Nov 22, 2012
1,025
96
91
Are you talking about RTX IO?
I don't believe so. My understanding is that it allows the GPU to directly stream and decompress game data off storage, rather than the CPU handling that and the GPU fetching the decompressed data from system memory. Once the data is on the GPU, it would behave the same as it traditionally does.
Nope, not RTX IO but GPU memory compression. I'll take any bonus from SSD game data handling, but I'm not taking anything for granted and making my decision purely on GPU RAM for now. But there seems to be a lot of (dis)info about GPU memory in various discussions right now.
 

blckgrffn

Diamond Member
May 1, 2003
7,937
1,125
126
www.teamjuchems.com
I'm wondering about that 10 GB as well. My trusty 1070 gave me 5 years and it would be nice if the 3080 would do the same. I've been back and forth on waiting for AMD but I will probably try to buy an FE on launch. I game at 1440p/165 Hz anyway and need to play Cyberpunk 2077.

There's always lowering settings .. I often use mostly low/medium settings on newer games with my 1070 and don't mind.
If you are willing to turn settings down to get FPS up, the 3080 is an outstanding step from your 1070, I'd say. It's huuuuuuge :)
 

blckgrffn

Diamond Member
May 1, 2003
7,937
1,125
126
www.teamjuchems.com
Yeah, I might actually have misunderstood that. The storage might need to be PCIe connected, which doesn't necessarily preclude SATA SSDs, but it might not be as simple.

This is the page he linked about GPUDirect storage

So, how does RTX IO work for NVMe drives that are directly connected to a CPU x4 interface? The data would still need to be shuttled through the CPU, obviously, but it would still be a benefit since it can move in compressed form and without the CPU having to process it beyond routing it to the x16 GPU interface?
Pretty sure SATA SSDs, by virtue of being on the SATA interface, are not PCIe-attached drives.

The alternative to NVMe drives is those "old school" enterprise drives that plugged right into a PCIe slot, I'm pretty sure.

Burst bandwidth on SATA is way too slow.
 
  • Like
Reactions: Tlh97

CP5670

Diamond Member
Jun 24, 2004
4,855
229
106
I posted this in the big thread, but some current games (e.g. Deus Ex Mankind Divided) use right around 10GB at 4K on my 1080ti, as reported by EVGA Precision in some maps or situations. It's not clear if there would actually be a performance hit if there was less memory though. 10GB is certainly enough in general but maybe not with "ultra" settings at 4K.
 

MrTeal

Diamond Member
Dec 7, 2003
3,104
775
136
Pretty sure SATA SSDs, by virtue of being on the SATA interface, are not PCIe-attached drives.

The alternative to NVMe drives is those "old school" enterprise drives that plugged right into a PCIe slot, I'm pretty sure.

Burst bandwidth on SATA is way too slow.
Yeah, I should just retract everything I said. They showed drives behind a RAID controller, but even a PCIe RAID controller doesn't make the drives NVMe, which seems to be a requirement for this to work.

Which honestly is a real PITA. I have been looking at replacing my boot drive with an NVMe one, but I had planned on keeping the 2TB of SATA storage I already have to use as a Steam and games drive. Now to use RTX IO I either need to drop another couple hundred bucks on a 2TB main drive, or deal with moving games back and forth between active and storage drives.
 

DiogoDX

Senior member
Oct 11, 2012
706
162
116
Having Fury X flashbacks with the 3080 10GB. I always keep my cards for 2 or 3 years, so I am not regressing from the 11GB of the 1080 Ti that I use at 4K.

I will wait for the 20GB model or go with AMD if they deliver the same performance with 16GB.
 
  • Like
Reactions: Tlh97 and moonbogg

StinkyPinky

Diamond Member
Jul 6, 2002
6,559
482
126
Having Fury X flashbacks with the 3080 10GB. I always keep my cards for 2 or 3 years, so I am not regressing from the 11GB of the 1080 Ti that I use at 4K.

I will wait for the 20GB model or go with AMD if they deliver the same performance with 16GB.
Me too. Exciting cards for sure, but I don't feel comfortable regressing in vram.
 

UNCjigga

Lifer
Dec 12, 2000
23,008
5,469
136
Sounds like 12-16GB will be the sweet spot for the next several years--heck the 3090 has 24GB??

Microsoft is promising to bring some of that Xbox NVMe magic to PCs with DirectStorage, so maybe we'll see smarter use of caching for those ultra-res game assets and hopefully that will help us folks sticking with $300 midrange cards.

 

moonbogg

Lifer
Jan 8, 2011
10,116
1,905
126
Yeah, the 10GB is really getting me down to be honest. I must admit though, I feel like quite an odd creature because 2 days ago I felt my 11GB was still overkill but suddenly 10GB is trash. That is kind of odd. However, game requirements continue to increase and this new card is supposed to be for the next 2-3 years or so. 10GB feels a little on the light side.
It should also be said that game developers would have to be out of their minds to require more VRAM than the current higher-end offerings have. Most people will still be on 8GB cards, so for a developer to just smash through 10GB to max out their game sounds a little odd on their part as well.
 
  • Like
Reactions: Mopetar and guachi

Saylick

Golden Member
Sep 10, 2012
1,007
861
136
Yeah, the 10GB is really getting me down to be honest. I must admit though, I feel like quite an odd creature because 2 days ago I felt my 11GB was still overkill but suddenly 10GB is trash. That is kind of odd. However, game requirements continue to increase and this new card is supposed to be for the next 2-3 years or so. 10GB feels a little on the light side.
It should also be said that game developers would have to be out of their minds to require more VRAM than the current higher-end offerings have. Most people will still be on 8GB cards, so for a developer to just smash through 10GB to max out their game sounds a little odd on their part as well.
How urgent are you in needing a new flagship GPU? I have no doubt in my mind that there will be a 20 GB version of the 3080, Nvidia shouldn't have wafer supply issues this time around, and Big Navi will come out before the end of the year. By that time, Cyberpunk 2077 will have released, which hopefully is a good representation of your average AAA title for the near future, and we'd also know if a 3080 with 10 GB runs fine on it.
 

moonbogg

Lifer
Jan 8, 2011
10,116
1,905
126
I'm in no hurry. I can easily wait all next year for a proper Ti replacement if necessary. 10GB is just baffling to be honest. They aren't fooling anyone when they call it their flagship GPU. It's the 2080 replacement, not the Ti replacement. Ti buyers should just wait IMO if you want the real deal or go for the 3090.
 

blckgrffn

Diamond Member
May 1, 2003
7,937
1,125
126
www.teamjuchems.com
How urgent are you in needing a new flagship GPU? I have no doubt in my mind that there will be a 20 GB version of the 3080, Nvidia shouldn't have wafer supply issues this time around, and Big Navi will come out before the end of the year. By that time, Cyberpunk 2077 will have released, which hopefully is a good representation of your average AAA title for the near future, and we'd also know if a 3080 with 10 GB runs fine on it.
I think it's fair to wonder, in this case as in the last couple of launches, whether you are going to get burned by bad availability and big markups if you don't buy on day one. I can see people being torn between waiting for the card with better legs and being stuck waiting weeks or months, or paying hundreds over MSRP, because of that hesitation.

I am no fan of the 10GB, but if I was hot for a 3080 I'd hit it as soon as it drops. It should be an easy flip whether supplies are good or bad; worst case you're out tax and $50? If you're spending $700+tax on a GPU then it seems like an easy gamble to me :)
 

blckgrffn

Diamond Member
May 1, 2003
7,937
1,125
126
www.teamjuchems.com
Yeah, I should just retract everything I said. They showed drives behind a RAID controller, but even a PCIe RAID controller doesn't make the drives NVMe, which seems to be a requirement for this to work.

Which honestly is a real PITA. I have been looking at replacing my boot drive with an NVMe one, but I had planned on keeping the 2TB of SATA storage I already have to use as a Steam and games drive. Now to use RTX IO I either need to drop another couple hundred bucks on a 2TB main drive, or deal with moving games back and forth between active and storage drives.
Ha, well, they make it sound promising!

The hardware profile required is so specific, and I would argue so low in market share, that it seems unlikely many titles in the near future (2021?) would even bother supporting it, even if it is baked into Unreal/Unity. It's just another thing to implement and support, and probably get a bunch of Twitter and review turds thrown your way. Better just to turn it off if only single-digit percentages of the install base could use it.

I see that crazy IO power being baked into gameplay to really make it sing. Have you watched the Ratchet & Clank trailer for PS5? It's the best reason yet to think MS is making a mistake with no first-party Series X titles.

Spoiler alert - it uses portals like in Portal, but they go to completely different huge levels. There is just a "bam" pause, then it's fully loaded. Quickly, in succession. You can't build game logic/gameplay like that without everyone being on board. "Normal" PCs that are sold with HDD secondary storage, or only HDDs, wouldn't even be playable. Slower SSDs either. Can you imagine it being about 1 second for the guy with the hot NVMe drive, 10 seconds for mere mortals, and like a minute for the plebs each time the game jumped like that? It just wouldn't work.

Heck, I went from a very solid large Intel SATA SSD to the E12-powered solid TLC flash NVMe drive in my sig, and in Borderlands 3 I think my load times on fast travel are maybe 20% of what they were before? It's awesome. Thinking on that is what makes me certain the above scenario would be a big mess.
 

sze5003

Lifer
Aug 18, 2012
13,453
334
126
I'm in no hurry. I can easily wait all next year for a proper Ti replacement if necessary. 10GB is just baffling to be honest. They aren't fooling anyone when they call it their flagship GPU. It's the 2080 replacement, not the Ti replacement. Ti buyers should just wait IMO if you want the real deal or go for the 3090.
I've decided I'm going to go for the 3090. I'd love to spend only as much as I did for the 1080 Ti this year, but I also don't want to go backwards on VRAM. I use my VR headset quite a bit and I've seen my card's memory get used up quite a bit.

I skipped the 20 series completely because I was sure there would be a 3080ti this time around. I guess I just have crap luck! With my luck I'll get a 3090 and shortly after they screw us and announce a 3080ti for close performance and less money but I highly doubt it will be a thing.

I used to upgrade every year. But lately I've been fine not doing so because I've been able to run just about everything maxed out. Well, up until recently of course. I really wish reviews could be out sooner.
 
  • Like
Reactions: Tlh97 and blckgrffn

moonbogg

Lifer
Jan 8, 2011
10,116
1,905
126
The 3080 in terms of GPU power is already sort of close to the 3090; it's about what, 30% slower? Maybe less. I'm pretty sure a Ti is coming that will basically tie the 3090 and have less ram for a good chunk less money. That could take a year though. I'd expect to see 20GB variants of the 3080 well before a Ti drops. The only way a Ti gets released sooner is if AMD brings anything to the table, and some suggest they'd be lucky to compete with the 3070, let alone the 3080 and above. So yeah, the 3090 sounds like the only call to make if you want a proper Ti replacement now. It's going to cost a lot more than $1500 I suspect unless you grab one from the Nvidia site, but hey, you gotta pay to play, right?
Something tells me I'll still smash F5 on that 3080 on release day though, lol. Let's be honest; I want that thing haha.
 

mohit9206

Golden Member
Jul 2, 2013
1,371
491
136
AMD launched the R9 Fury X in 2015 at $649 with 4GB VRAM, so 10GB is still 2.5x that. The 780 Ti was also 3GB in 2013. Those who bought the 2080 Ti with 11GB for $1200 don't really have a right to complain about a faster card at $699 with 1GB less RAM.
 
  • Like
Reactions: ozzy702

Saylick

Golden Member
Sep 10, 2012
1,007
861
136
Something tells me I'll still smash F5 on that 3080 on release day though, lol. Let's be honest; I want that thing haha.
Hahaha. Just buy the damn thing! The poll asked if 10 GB was good for 3 years and most said no. If you rephrased the question to 1 or 2 years, I'm sure it would be closer to 50:50, if not more skewed towards "yes". Something tells me that next-generation GPUs, i.e. Hopper and RDNA3, are going to make all prior monolithic GPUs look feeble. 5nm + chiplets is such a paradigm shift in mm2/$ that you're probably going to see a higher-than-average performance increase per price tier in the next generation. You're going to be so incentivized to upgrade in 2 years or less anyway, and by that time there will be opportunities to buy a GPU with more VRAM. And like blckgrffn said, you're not bound to lose much resale if you decide to upgrade from your 3080 within 6 months.
 
  • Like
Reactions: Tlh97 and moonbogg

MrTeal

Diamond Member
Dec 7, 2003
3,104
775
136
AMD launched the R9 Fury X in 2015 at $649 with 4GB VRAM, so 10GB is still 2.5x that. The 780 Ti was also 3GB in 2013. Those who bought the 2080 Ti with 11GB for $1200 don't really have a right to complain about a faster card at $699 with 1GB less RAM.
Most of the ones here with hesitations aren't the 2080 Ti owners, they're the ones who bought the 1080 Ti 3.5 years ago for $699 moving to a faster $699 card with less RAM.
 

mohit9206

Golden Member
Jul 2, 2013
1,371
491
136
Most of the ones here with hesitations aren't the 2080 Ti owners, they're the ones who bought the 1080 Ti 3.5 years ago for $699 moving to a faster $699 card with less RAM.
Then they should wait for the 20GB model. Nvidia isn't forcing anyone to get the 10GB model.
 

moonbogg

Lifer
Jan 8, 2011
10,116
1,905
126
"As we can see, the game’s Ultra textures used 10.5GB of VRAM, whereas High textures used 8GB of VRAM. Thus, it will be interesting to see whether the NVIDIA RTX3080 will be able to handle this game in 4K with its 10GB VRAM."


That doesn't inspire confidence. A game that just came out using an HD texture pack blows through 10.5GB of VRAM. I guess RTX 3080 owners could always just lower settings to stay within the tight VRAM budget... on a "flagship" graphics card that hasn't even been released yet.
 
  • Like
Reactions: ryan20fun

DeathReborn

Platinum Member
Oct 11, 2005
2,329
232
106
To quote Nvidia themselves (and not GPU-Z or EVGA precision with their dodgy VRAM consumption figures):

Why only 10 GB of memory for RTX 3080? How was that determined to be a sufficient number, when it is stagnant from the previous generation?
[Justin Walker] We’re constantly analyzing memory requirements of the latest games and regularly review with game developers to understand their memory needs for current and upcoming games. The goal of 3080 is to give you great performance at up to 4k resolution with all the settings maxed out at the best possible price.
In order to do this, you need a very powerful GPU with high speed memory and enough memory to meet the needs of the games. A few examples - if you look at Shadow of the Tomb Raider, Assassin’s Creed Odyssey, Metro Exodus, Wolfenstein Youngblood, Gears of War 5, Borderlands 3 and Red Dead Redemption 2 running on a 3080 at 4k with Max settings (including any applicable high res texture packs) and RTX On, when the game supports it, you get in the range of 60-100fps and use anywhere from 4GB to 6GB of memory.
Extra memory is always nice to have but it would increase the price of the graphics card, so we need to find the right balance.
 
  • Like
Reactions: ryan20fun

linkgoron

Platinum Member
Mar 9, 2005
2,034
446
136
It should also be said that game developers would have to be out of their minds to require more VRAM than the current higher-end offerings have. Most people will still be on 8GB cards, so for a developer to just smash through 10GB to max out their game sounds a little odd on their part as well.
We're talking about 4K ultra settings, not "most people". For "most people" you can look at the top of the Steam survey - it's filled with 1X50/1X60/1X70 cards.
We're also talking about cards that are getting released just before two quite powerful consoles with a lot of memory as well.
 
  • Like
Reactions: Tlh97
