Question: Is 10GB of VRAM enough for 4K gaming for the next 3 years?


moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
The question is simple: will 10GB be enough moving forward for the next 3 years? We all know what happens when the VRAM limit gets breached: skips, stutters, and chugging.
 

sze5003

Lifer
Aug 18, 2012
14,177
622
126
I guess the question is if you have a 1080ti and haven't upgraded in 3 years, is the 3090 worth the price and will it sustain that same timeframe?

I'm tempted to avoid trying to order on launch day just to see reviews.
 

anthrax

Senior member
Feb 8, 2000
695
3
81

Agreed, waiting to see what AMD comes out with and actual benchmarks might be the best play here.

I think there will be something between a 3080 and 3090. However, I doubt it will be called the Ti. Traditionally, the performance difference between Ti and non-Ti is about 30%. A hypothetical card that sits between the 3080 and 3090 would likely be only 10% faster. Maybe it will be called the 3080 Super.
 

sze5003

Lifer
Aug 18, 2012
14,177
622
126
Yea I'm willing to wait just for reviews but don't think I'll want to wait 6 months or more. I know I want to get a new card in my case by the time cyberpunk launches.
 

anthrax

Senior member
Feb 8, 2000
695
3
81

If Cyberpunk is the game you want to play, Nvidia has surely got your wallet with DLSS already :).
 

sze5003

Lifer
Aug 18, 2012
14,177
622
126
I can assure you I'm not the only one wanting to play Cyberpunk. I just told myself I should probably upgrade before it comes out, because there's no way I'll be able to play it at max settings with my current card.
 

JustMe21

Senior member
Sep 8, 2011
324
49
91
I'm actually waiting to see the people who jumped on the 8 GB 3070s & 10 GB 3080s regret their purchase when AMD announces 16 GB cards and Nvidia counters with 16 GB 3070s and 20 GB 3080s. To me, the next gen card purchase for 4K gaming needs to have 16 GB or more.
 
  • Like
Reactions: moonbogg and the901

brianmanahan

Lifer
Sep 2, 2006
24,203
5,608
136

yeah i really want a 3080 right now, but i can't bring myself to buy one because i know people will be making fun of me in like 6 months tops
 
  • Haha
Reactions: Saylick

sze5003

Lifer
Aug 18, 2012
14,177
622
126
It all depends on the 3080 Ti price. If it comes out at $1000, then the 3090 would not have been worth it. But if it's $1200, which is where I think it will be, then the 3090 is worth getting now.

I'm seeing some aftermarket 3080s getting listed for $800 or more. Whatever fills the gap between a 3080 and 3090 will cost less than $1500 and probably be no more than 20% slower than the 3090.

Hell, they could just say "here, have a 20GB 3080" in response to AMD. But then again I don't see it costing $1200 unless you're talking custom models.
 

amenx

Diamond Member
Dec 17, 2004
3,851
2,019
136
I'm waiting to see the 3070 8GB beat the 2080 Ti 11GB in all 4K titles without a hitch for its 2-year run before Hopper/RDNA3 arrive. I have no doubt VRAM requirements will increase, but right now capacities are set more for marketing purposes and because of the either/or limits of the memory bus. The 3090 has 24GB instead of 12GB just to be "more than" Big Navi's 16GB. Big Navi has 16GB because they can't sell an 8GB flagship against Nvidia's 11GB cards. Consoles have 16GB because they are in for the long haul (6-7 years), which is a good design choice for their life span. I could be wrong that 8GB won't last the 2 years before 2022 hardware arrives, but we'll see. In the meantime we will also have a good chance to test the VRAM theories with the 8GB/16GB and 10GB/20GB versions of the 3070 and 3080, as that can only be done with otherwise identical cards.
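The "either/or" limitation above is just bus-width arithmetic: each 32-bit GDDR channel carries one memory chip (or two in clamshell mode), and GDDR6/6X chips shipped in 1GB density at the time, so each bus width allows exactly two capacities. A minimal sketch of that arithmetic, using the 3070/3080/3090 bus widths (256/320/384-bit) as the only inputs:

```cpp
#include <cstdio>

// Capacity options for a GDDR card: one chip per 32-bit channel, or two
// chips per channel in clamshell mode. Chip density is assumed to be
// 1 GB (8 Gb), which is what GDDR6/6X shipped in at the time.
int main()
{
    const int busWidthsBits[] = {256, 320, 384}; // 3070 / 3080 / 3090
    const int chipGB = 1;

    for (int bus : busWidthsBits) {
        int channels = bus / 32;
        std::printf("%3d-bit bus: %2d GB normal, %2d GB clamshell\n",
                    bus, channels * chipGB, channels * chipGB * 2);
    }
    return 0;
}
```

That prints 8/16, 10/20 and 12/24 GB, which is why those are the only capacities being rumored and nothing in between.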
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
The growth in raster performance requirements seems to have slowed lately. Anything more powerful than a 2080 Ti is likely to last a long time, unless it only has 8 or 10GB of VRAM, of course.
 

aleader

Senior member
Oct 28, 2013
502
150
116
I don't see these posted anywhere, and my common sense and gut instinct tell me this is true...we don't NEED all this VRAM, even at 4K:

[Embedded video, timestamped at 24:22]

[Embedded video, timestamped at 26:00]

It's the 20 blade razor syndrome...more is better. Unless you're seeing stuttering, you have enough. What would be really helpful is if Afterburner showed us how much of the RAM (system and VRAM) is actually just being cached and not used, but it doesn't.

Also, if you're upgrading your video card every 2-3 years, who cares about 'future-proofing'? To me a 10 year card needs to be future-proofed, although that is way less important now than it used to be. The only caveat is that if you do upgrade all the time, you may want the extra VRAM just because everyone else 'believes' that you need it and it may hurt resale value.
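Windows doesn't expose the cached-versus-actually-needed split either, but the closest thing a tool (or your own code) can query is the per-process video memory budget versus what the process has committed, via IDXGIAdapter3. A minimal Windows-only sketch, assuming Windows 10 and MSVC; it reports its own process's numbers, so to watch a game you'd run the same query from inside the game or an overlay:

```cpp
// Build (MSVC): cl /EHsc vram_budget.cpp dxgi.lib
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory1> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc{};
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the software (WARP) adapter

        // IDXGIAdapter3 (Windows 10+) exposes the OS-managed VRAM budget.
        ComPtr<IDXGIAdapter3> adapter3;
        if (FAILED(adapter.As(&adapter3)))
            continue;

        DXGI_QUERY_VIDEO_MEMORY_INFO info{};
        if (SUCCEEDED(adapter3->QueryVideoMemoryInfo(
                0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) {
            // Budget: how much VRAM the OS will currently let this process use.
            // CurrentUsage: how much this process has actually committed.
            std::wprintf(L"%s\n  budget %llu MiB, committed by this process %llu MiB\n",
                         desc.Description,
                         info.Budget >> 20, info.CurrentUsage >> 20);
        }
    }
    return 0;
}
```

Even that only shows committed memory, not how much of it a game touches every frame, which is the number that would really settle these arguments.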
 

cytg111

Lifer
Mar 17, 2008
23,049
12,719
136
I'm wondering about that 10 GB as well. My trusty 1070 gave me 5 years and it would be nice if the 3080 would do the same. I've been back and forth on waiting for AMD but I will probably try to buy a FE on launch. I game at 1440p/165 Hz anyway and need to play Cyberpunk 2077

There's always lowering settings .. I often use mostly low/medium settings on newer games with my 1070 and don't mind.
That's right, all the new shiny things come out just in time for assembly before Cyberpunk!
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
I still see it like this: if you're not playing at 4K, you don't need more than 8GB of VRAM. The 3070 8GB will be an absolute barn burner for 1440p. The 3080 10GB will also be more than enough for 1440p, and will let you play at 4K with no issues at anything but the absolute highest texture quality settings. I wouldn't hold out for a 16GB 3070 Ti if I wasn't going to do anything in 4K, unless I had a specific game that seemed to just chug through VRAM in obscene amounts (I'm looking at you, MSFS 2020).

Remember, it's also possible, though unlikely, that NVidia will release a 3090 SE or a 3080 Super that is essentially the 3090 chip configuration with 12GB of RAM instead of 24, but with lower clocks all around. 24 GB of GDDR6X in clamshell can't be cheap...
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
^ Well, 11GB of non-clamshell cost $1200 last time, so I suppose the $1500 for 24GB of clamshell is pretty worth it then? Sounds like a bargain to me, and that 3080 is basically free in comparison, being only 10GB of non-clamshell and all.
 
  • Haha
Reactions: Tlh97 and Elfear

richaron

Golden Member
Mar 27, 2012
1,357
329
136
It's a chicken-and-egg situation with game settings and hardware requirements. Devs could design a game which runs at 3 fps on the best GPU and/or needs terabytes of memory.

Following on from that, for the next 3-5 years some settings on some games are going to be "using" more than 10GB of VRAM simply because cards like that exist. The devs might actually take advantage of the extra RAM... but even if "using" more RAM does nothing for performance, they know some customers like to think they're taking full advantage of their stupidly expensive flagship card.

Correspondingly, the vast majority of games will work fine at "4K" with 8 or 10GB of VRAM, because there will be plenty of 4K-capable GPUs out there in this range.
 

zlatan

Senior member
Mar 15, 2011
580
291
136
This is more or less a question that depends on the actual memory management. Things got rough with Vulkan and D3D12, but AMD provides two middlewares, and they are a huge help. With VMA and D3D12MA the memory management problems are not as bad as they were two or three years ago. Normally these middlewares just work out of the box, a lot of next-gen engines use them, and they are also a very good stepping stone for creating a custom management solution. For example, I'm working on an unannounced project, and in the engine we use VMA. We modified the original code to create a very custom streaming solution, so our game won't have a texture quality option anymore. When it first loads the map, it tries to load the high-quality assets, and if there is not enough VRAM, the streaming algorithm just swaps some of the loaded textures to medium or low resolution; there is also a defragmentation pass for the VRAM. With this concept it is not a problem if a GPU has only 4-6-8-10 GB of VRAM. The performance will always be good, but the picture quality will be different. Obviously there is a target for us, and in our tests 16 GB of VRAM is needed to always load the high-quality assets. That is guaranteed; we checked it with a Radeon VII. An RTX 3080 works very well with 10 GB of VRAM, but some loaded assets will be medium or low quality, because there is not enough memory to load the high-quality versions. Still, we are very happy with the performance we see, and it is hard to notice the poorer picture quality. If you don't search intentionally for the low-quality assets, you will probably miss them. :)

If it's important to get the absolute best quality, then sure, go for a 16 GB card, but 10 GB won't limit the performance. We know that these cards exist, and we try to solve their limitations in clever and user-friendly ways, not just with some popup message in the options menu that warns the user to lower the texture quality. :)

As for DirectStorage ... this is not a good thing for cards with lower VRAM capacity. This API solves the decompression problem on the CPU side, but to do that, it increases the memory footprint inside the VRAM. So with DirectStorage the VRAM pressure will be much, much worse. It also has several other issues, like the GPU caches having to be flushed after every read from the SSD. On a marketing slide this might look like a good feature, but it won't work well on today's GPUs. It will work on the Xbox Series S/X and the PS5, but those consoles are designed around this feature, and that is a huge difference.
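To make that streaming fallback concrete, here is a minimal standalone sketch of the tier-selection step. It is not the engine code described above (which sits on top of VMA and handles residency changes and defragmentation at runtime); the asset names and sizes are invented, and the budget is passed in rather than queried from the allocator. It only shows the core decision: walk the assets and pick the best quality tier that still fits in the VRAM budget.

```cpp
#include <cstdint>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical streamed texture set: three pre-built quality tiers, each
// with a known VRAM footprint in bytes.
struct StreamedTexture {
    std::string name;
    uint64_t sizeBytes[3]; // [0] = high, [1] = medium, [2] = low
    int chosenTier = 0;
};

// Pick, per texture, the highest quality tier that still fits in the
// remaining VRAM budget. A real engine would re-run something like this as
// the player moves and assets stream in and out.
void assignQualityTiers(std::vector<StreamedTexture>& textures, uint64_t vramBudget)
{
    uint64_t used = 0;
    for (auto& tex : textures) {
        tex.chosenTier = 2; // the low tier is sized so it always fits
        for (int tier = 0; tier < 3; ++tier) {
            if (used + tex.sizeBytes[tier] <= vramBudget) {
                tex.chosenTier = tier;
                break;
            }
        }
        used += tex.sizeBytes[tex.chosenTier];
    }
    std::cout << "used " << (used >> 20) << " MiB of "
              << (vramBudget >> 20) << " MiB budget\n";
}

int main()
{
    const uint64_t MiB = 1024ull * 1024ull;
    // Invented asset sizes, purely for illustration.
    std::vector<StreamedTexture> level = {
        {"terrain",  {3200 * MiB, 800 * MiB, 200 * MiB}},
        {"city_a",   {2800 * MiB, 700 * MiB, 180 * MiB}},
        {"city_b",   {2600 * MiB, 650 * MiB, 160 * MiB}},
        {"vehicles", {2400 * MiB, 600 * MiB, 150 * MiB}},
        {"props",    {2200 * MiB, 550 * MiB, 140 * MiB}},
    };

    // With 16 GiB everything stays on the high tier; with 10 GiB the later
    // assets silently drop to medium, as described in the post above.
    for (uint64_t budgetGiB : {16ull, 10ull}) {
        std::cout << budgetGiB << " GiB card: ";
        assignQualityTiers(level, budgetGiB * 1024 * MiB);
        for (const auto& t : level)
            std::cout << "  " << t.name << " -> tier " << t.chosenTier << "\n";
    }
    return 0;
}
```

With these made-up numbers, the 10 GiB run shows "vehicles" and "props" falling back to the medium tier while everything else stays high: same performance, slightly lower texture quality where the budget runs out, which is exactly the behaviour described.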
 

aleader

Senior member
Oct 28, 2013
502
150
116
Yah, and this is what I'm getting at. The high end GPU market (4K, 3070+) is tiny...like super-tiny. If the majority of the cards out there (over 70%) are 8GB, that's going to be the target. They will not develop games that limit their sales to 1% of the market...unless they are complete business noobs.
 

brianmanahan

Lifer
Sep 2, 2006
24,203
5,608
136

however, they might make the ultra settings use more VRAM than the majority of users own
 

DJinPrime

Member
Sep 9, 2020
87
89
51
Wouldn't the overhead of DirectStorage just be hundred(s) of MB? You only need a buffer big enough to decompress one whole asset at a time, right? Going back to your project, if you had DirectStorage, would you still need to load all the assets of the level? You're doing that currently because of the performance hit during gameplay if you have to load new assets, right? But if DirectStorage takes out the CPU/RAM path, loading assets should have minimal impact on gameplay. So instead of loading all the assets at the start of the level, you could load a quarter (for example) and have a thread stream in new assets as needed. The PS5 Ratchet and Clank demo shows the ability to load completely different levels seamlessly. To me, this indicates that VRAM usage will be much more efficient and smarter going forward.
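The "load part of the level up front, stream the rest from a worker" idea doesn't need DirectStorage specifically; here is a generic, self-contained sketch of it. No real I/O API is used (a sleep stands in for the read plus decompress), all names and timings are made up, and only one asset is ever in flight, so the staging buffer stays at single-asset size:

```cpp
#include <chrono>
#include <condition_variable>
#include <iostream>
#include <mutex>
#include <queue>
#include <string>
#include <thread>
#include <vector>

// Placeholder for the fast I/O path (DirectStorage, or plain NVMe reads).
// Here it just sleeps to stand in for reading and decompressing one asset.
static void loadAssetFromDisk(const std::string& name)
{
    std::this_thread::sleep_for(std::chrono::milliseconds(20));
    std::cout << "  streamed in: " << name << "\n";
}

class AssetStreamer {
public:
    AssetStreamer() : worker_(&AssetStreamer::run, this) {}
    ~AssetStreamer()
    {
        {
            std::lock_guard<std::mutex> lk(m_);
            done_ = true;
        }
        cv_.notify_one();
        worker_.join();
    }

    // Called from the game thread when the player approaches new content.
    void request(std::string assetName)
    {
        {
            std::lock_guard<std::mutex> lk(m_);
            pending_.push(std::move(assetName));
        }
        cv_.notify_one();
    }

private:
    void run()
    {
        for (;;) {
            std::string next;
            {
                std::unique_lock<std::mutex> lk(m_);
                cv_.wait(lk, [&] { return done_ || !pending_.empty(); });
                if (pending_.empty()) return; // shutting down, queue drained
                next = std::move(pending_.front());
                pending_.pop();
            }
            // One asset in flight at a time, so the staging/decompression
            // buffer only ever needs to hold a single asset, not the level.
            loadAssetFromDisk(next);
        }
    }

    std::mutex m_;
    std::condition_variable cv_;
    std::queue<std::string> pending_;
    bool done_ = false;
    std::thread worker_;
};

int main()
{
    // Pretend the level has 8 asset bundles; load a quarter up front and
    // stream the rest in the background as the player moves.
    std::vector<std::string> bundles = {"zone0", "zone1", "zone2", "zone3",
                                        "zone4", "zone5", "zone6", "zone7"};
    for (size_t i = 0; i < 2; ++i) loadAssetFromDisk(bundles[i]); // upfront

    AssetStreamer streamer;
    for (size_t i = 2; i < bundles.size(); ++i) streamer.request(bundles[i]);

    // A real game loop would keep rendering here while streaming happens.
    std::this_thread::sleep_for(std::chrono::milliseconds(300));
    return 0;
}
```

Whether that works in practice comes down to how fast the I/O path really is, which is exactly the point of contention above.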