Question: Is 10GB of VRAM enough for 4K gaming for the next 3 years?

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
The question is simple. Will 10GB be enough moving forward for the next 3 years? We all know what happens when VRAM gets breached: skips, stutters, and chugging.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
At what settings?
Usually there are many graphics presets in games namely Low, Medium, High, Very High and Ultra.
Since you are referring to the 3080, I would imagine that anyone who buys that card would prefer to play at either Very High or Ultra. I am fairly confident that 10GB will be enough for both of those settings at 4K for the next 3 years.
As for the 3070, one would have to make do with Very High or High toward the tail end of its life, as that one "only" has 8GB.
 

alcoholbob

Diamond Member
May 24, 2005
6,271
323
126
Well, you can look at it this way: console assets will be 4K native now, so your high or ultra texture quality should be the same between console and PC, and the Xbox Series X has 10GB of video-optimized memory, so that should be the cut-off point. It's enough for 4K. If there are higher ultra-quality PC-only options, however, you probably want more than 10GB, but that will likely be a very small subset of games.
 

Mopetar

Diamond Member
Jan 31, 2011
7,797
5,899
136
Having only a yes and no option seems too limiting. My answer would be, "Yes for 99% of games, but there will be one or two before that point where 10 GB is a limiting factor."

How big of a deal that is will probably depend on the game. I don't think it will be a major issue unless someone keeps the card for 5+ years, and at that point people are likely running lower settings anyhow.

Even with consoles moving to 16 GB of memory now it's still going to take a while before games use that as a baseline.
 
  • Like
Reactions: Kuiva maa

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
> Having only a yes and no option seems too limiting. My answer would be, "Yes for 99% of games, but there will be one or two before that point where 10 GB is a limiting factor."
>
> How big of a deal that is will probably depend on the game. I don't think it will be a major issue unless someone keeps the card for 5+ years, and at that point people are likely running lower settings anyhow.
>
> Even with consoles moving to 16 GB of memory now it's still going to take a while before games use that as a baseline.

Choose the best answer and discuss the nuance in the comments.
 

Tup3x

Senior member
Dec 31, 2016
944
925
136
> Having only a yes and no option seems too limiting. My answer would be, "Yes for 99% of games, but there will be one or two before that point where 10 GB is a limiting factor."
>
> How big of a deal that is will probably depend on the game. I don't think it will be a major issue unless someone keeps the card for 5+ years, and at that point people are likely running lower settings anyhow.
>
> Even with consoles moving to 16 GB of memory now it's still going to take a while before games use that as a baseline.
Xbox will have 10 GB of fast memory dedicated to the GPU. Games get to use 13.5 GB max and the rest is for the OS and other background stuff. 10 GB will probably be fine for quite some time.
 
  • Like
Reactions: loki1944

Saylick

Diamond Member
Sep 10, 2012
3,084
6,184
136
I said "No" because I believe that in 3 years, we'll have console ports that leverage most, if not all, of the console's available RAM (~10-13 GB), and knowing how PCs are less optimized than consoles, those games will require more VRAM than they do on consoles.

The future of video games is to compress as much of the game assets as possible, and then stream in the compressed data as fast as realistically possible, ideally in real time, right before it's needed. Any data that needs to arrive faster than real time will need to be moved into RAM ahead of when it's actually needed, and given how PCs don't have dedicated compression and decompression hardware, I think it's likely that console-to-PC ports will take up a larger memory footprint than they do on consoles. In other words, games that are not designed to leverage SSDs and just-in-time streaming of assets will need to keep more assets in RAM at any given time if they are to maintain the same asset quality as games that do leverage JIT streaming.
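The trade-off being described can be modeled with a toy calculation (the model and every number in it are illustrative assumptions, not measurements): the slower the storage, the further ahead of use assets have to already be resident in memory.

```python
# Toy model of the argument above: slower storage forces assets to be
# resident further ahead of use, inflating the memory footprint.
# All figures are invented for illustration, not measurements.

def resident_set_mb(asset_mb_per_frame, storage_mb_per_s, fps=60, safety_frames=2):
    """Assets must sit in memory 'lead_frames' before they are used, where
    the lead covers the time storage needs to deliver one frame's worth."""
    frame_budget_s = 1.0 / fps
    delivery_s = asset_mb_per_frame / storage_mb_per_s
    lead_frames = max(1, round(delivery_s / frame_budget_s)) + safety_frames
    return asset_mb_per_frame * lead_frames

fast_nvme = resident_set_mb(200, 5000)  # console-class NVMe: short lead
sata_ssd = resident_set_mb(200, 500)
hdd = resident_set_mb(200, 150)         # spinning rust: long lead
print(fast_nvme, sata_ssd, hdd)         # footprint grows as storage slows
```

Under these made-up numbers the resident set grows by an order of magnitude going from fast NVMe to a hard drive, which is the shape of the JIT-streaming argument.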
 

CastleBravo

Member
Dec 6, 2019
119
271
96
> I said "No" because I believe that in 3 years, we'll have console ports that leverage most, if not all, of the console's available RAM (~10-13 GB), and knowing how PCs are less optimized than consoles, those games will require more VRAM than they do on consoles.
>
> The future of video games is to compress as much of the game assets as possible, and then stream in the compressed data as fast as realistically possible, ideally in real time, right before it's needed. Any data that needs to arrive faster than real time will need to be moved into RAM ahead of when it's actually needed, and given how PCs don't have dedicated compression and decompression hardware, I think it's likely that console-to-PC ports will take up a larger memory footprint than they do on consoles. In other words, games that are not designed to leverage SSDs and just-in-time streaming of assets will need to keep more assets in RAM at any given time if they are to maintain the same asset quality as games that do leverage JIT streaming.

We know Ampere will support Microsoft DirectStorage, which is supposedly the JIT streaming tech in the Xbox Series X. I will be very disappointed in AMD if they don't include the same tech in RDNA2.
 
  • Like
Reactions: kurosaki

Hitman928

Diamond Member
Apr 15, 2012
5,177
7,628
136
I chose no simply because if I were buying a 3080, I would be doing so to play at 4K, no compromises (as advertised) and I don't think that will be possible with 10 GB of RAM over the next even 2 years, not for at least a handful of AAA games. Now, I'm sure you could turn down some settings here and there and probably hardly notice a difference and be fine but that's not what I would be buying the "flagship" model for.
 

Saylick

Diamond Member
Sep 10, 2012
3,084
6,184
136
> We know Ampere will support Microsoft DirectStorage, which is supposedly the JIT streaming tech in the Xbox Series X. I will be very disappointed in AMD if they don't include the same tech in RDNA2.
I wonder how games will manage then if you don't have a GPU or SSD capable of supporting JIT streaming. Will certain settings, e.g. high-res textures, be simply blocked off to you if the game detects you don't have a system capable of streaming in the assets fast enough without bogging down the game? I suppose the alternative is that you get texture pop, but that can be alleviated by simply having more RAM, if I'm not mistaken, which goes against the idea of needing less RAM moving forward.
 
  • Like
Reactions: Tlh97 and blckgrffn

blckgrffn

Diamond Member
May 1, 2003
9,110
3,028
136
www.teamjuchems.com
> I wonder how games will manage then if you don't have a GPU or SSD capable of supporting JIT streaming. Will certain settings, e.g. high-res textures, be simply blocked off to you if the game detects you don't have a system capable of streaming in the assets fast enough without bogging down the game? I suppose the alternative is that you get texture pop, but that can be alleviated by simply having more RAM, if I'm not mistaken, which goes against the idea of needing less RAM moving forward.

Exactly. That API works when you have hardware that meets a performance baseline. Is the game going to run a storage benchmark before it lets you flip that on? I mean, flipping that on when you have all your games on a rust drive is likely to have very little impact.

Even SATA SSDs are unlikely to have the raw throughput that games built around this API/console storage subsystems are expecting. To get similar performance, you'll need a solid SSD (for reads, anyway) on a PCIe interface for huge burst transfers with low latency.

That makes the number of normal gamer PCs that could really leverage this technology laughably low.

I bought a 2TB NVMe drive because I was tired of having any SATA connections at all in my rig, but I think the more fashionable setup right now is an NVMe boot drive backed by one or more larger, cheaper SATA SSDs.

Practically speaking, most computers can really only have one NVME drive at full bandwidth atm?

(To be clear, I feel like the inclusion of this API is nearly a throwaway at this point, to create some illusion of parity with the consoles. I could be wrong.)
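For rough scale, the throughput gap can be put in numbers (approximate public figures, not benchmarks; the Series X raw SSD figure is from Microsoft's announced spec):

```python
# Back-of-envelope throughput gap (approximate public figures, not benchmarks).
sata_iii_mb_s = 550        # practical ceiling of a SATA III SSD
xsx_raw_mb_s = 2400        # Xbox Series X announced raw SSD throughput
shortfall = xsx_raw_mb_s / sata_iii_mb_s
print(round(shortfall, 1))  # SATA falls roughly 4x short of the console's raw rate
```

And that gap widens further once the console's hardware decompression is counted, which is the point about SATA drives not keeping up.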
 

CastleBravo

Member
Dec 6, 2019
119
271
96
> Exactly. That API works when you have hardware that meets a performance baseline. Is the game going to run a storage benchmark before it lets you flip that on? I mean, flipping that on when you have all your games on a rust drive is likely to have very little impact.
>
> Even SATA SSDs are unlikely to have the raw throughput that games built around this API/console storage subsystems are expecting. To get similar performance, you'll need a solid SSD (for reads, anyway) on a PCIe interface for huge burst transfers with low latency.
>
> That makes the number of normal gamer PCs that could really leverage this technology laughably low.
>
> I bought a 2TB NVMe drive because I was tired of having any SATA connections at all in my rig, but I think the more fashionable setup right now is an NVMe boot drive backed by one or more larger, cheaper SATA SSDs.
>
> Practically speaking, most computers can really only have one NVMe drive at full bandwidth atm?

I don't have details on how the DirectStorage stuff works, but they might be able to leverage multiple tiers of storage, including system RAM, an SSD cache, and the actual storage (SSD or HDD) that the game is installed to, all of it compressed until it gets to the GPU. This way you would have uncompressed assets needed right now in the GPU's VRAM, compressed assets that might be needed very soon in system RAM, compressed assets that might be needed soonish on an SSD cache drive, and the rest of it compressed on spinning rust. Done right, PCs with 16+ GB of system RAM in addition to 8+ GB of VRAM might have a significant advantage over consoles that have less (V)RAM but maybe faster storage.
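A minimal sketch of the tiering being suggested, with made-up tier names and contents (everything here is hypothetical, not DirectStorage's actual design):

```python
# Hypothetical tiered asset lookup: VRAM -> system RAM -> SSD cache -> install disk.
# Tier names and contents are invented to illustrate the layering, nothing more.
TIERS = [
    ("vram", {"hero_texture"}),        # uncompressed, needed right now
    ("ram", {"nearby_texture"}),       # compressed, needed very soon
    ("ssd_cache", {"district_pack"}),  # compressed, needed soonish
    ("disk", {"world_pack"}),          # compressed, everything else
]

def fetch_tier(asset):
    """Walk tiers fastest-first and report where the asset lives.
    A real system would also promote hot assets into faster tiers."""
    for name, contents in TIERS:
        if asset in contents:
            return name
    raise KeyError(asset)

print(fetch_tier("nearby_texture"))  # served from system RAM, disk untouched
```

The payoff of the extra PC tiers is exactly this: the more often a lookup resolves in RAM or the SSD cache, the less the slow install disk matters.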
 
  • Like
Reactions: psolord

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,248
136
> I don't have details on how the DirectStorage stuff works, but they might be able to leverage multiple tiers of storage, including system RAM, an SSD cache, and the actual storage (SSD or HDD) that the game is installed to, all of it compressed until it gets to the GPU. This way you would have uncompressed assets needed right now in the GPU's VRAM, compressed assets that might be needed very soon in system RAM, compressed assets that might be needed soonish on an SSD cache drive, and the rest of it compressed on spinning rust. Done right, PCs with 16+ GB of system RAM in addition to 8+ GB of VRAM might have a significant advantage over consoles that have less (V)RAM but maybe faster storage.

I didn't read the whole thing, but I think it's NVMe only?

> Why NVMe?
> NVMe devices are not only extremely high bandwidth SSD based devices, but they also have hardware data access pipes called NVMe queues which are particularly suited to gaming workloads. To get data off the drive, an OS submits a request to the drive and data is delivered to the app via these queues. An NVMe device can have multiple queues and each queue can contain many requests at a time. This is a perfect match to the parallel and batched nature of modern gaming workloads. The DirectStorage programming model essentially gives developers direct control over that highly optimized hardware.


I'd assume Nvidia's will just piggyback on Microsoft's.
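The queueing idea in the quoted passage can be sketched with a toy latency model (numbers are invented; real NVMe behavior is far more involved): with many requests in flight per queue, latency is paid per wave rather than per request.

```python
# Toy latency model of deep queues vs. one request in flight at a time.
# Numbers are invented for illustration; real NVMe behavior is more complex.
PER_REQUEST_LATENCY_MS = 0.1

def serial_time_ms(n):
    """One request in flight at a time: latencies simply add up."""
    return n * PER_REQUEST_LATENCY_MS

def batched_time_ms(n, queue_depth=32):
    """queue_depth requests overlap, so latency is paid once per wave."""
    waves = -(-n // queue_depth)  # ceiling division
    return waves * PER_REQUEST_LATENCY_MS

print(serial_time_ms(64), batched_time_ms(64))  # batching wins by ~queue_depth
```

That overlap is why the quoted docs call deep queues "a perfect match" for the many small, parallel reads a game issues per frame.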
 
  • Like
Reactions: Tlh97 and kurosaki

StinkyPinky

Diamond Member
Jul 6, 2002
6,761
777
126
I would not feel comfortable with 10GB of VRAM on a flagship-level card with new 4K consoles about to be released. That's just me though. 8GB for the 3070 is just crazy imo. I'd definitely get the 16GB version of that model.

Kinda reminds me of the 3GB vs 6GB debate, and we all know how that panned out (spoiler alert: the 3GB version aged like milk).
 

sze5003

Lifer
Aug 18, 2012
14,177
622
126
Gonna say no. I can tell MS Flight Sim 2020 is making my 1080 Ti chug. I've seen some other games like Resident Evil 2 and Call of Duty use up my VRAM pretty quickly if I want to turn up the settings like I used to.

I would prefer more, especially for 4K, but I currently game at 3440x1440 and in VR sometimes too.
 
  • Like
Reactions: lobz and Leeea

StinkyPinky

Diamond Member
Jul 6, 2002
6,761
777
126
> Gonna say no. I can tell MS Flight Sim 2020 is making my 1080 Ti chug. I've seen some other games like Resident Evil 2 and Call of Duty use up my VRAM pretty quickly if I want to turn up the settings like I used to.
>
> I would prefer more, especially for 4K, but I currently game at 3440x1440 and in VR sometimes too.

Even worse if you consider that the 3070 is pitched as a 4K card as well, but the limit on the VRAM really makes it a cautious choice.
 
  • Like
Reactions: Tlh97 and kurosaki

MrTeal

Diamond Member
Dec 7, 2003
3,554
1,658
136
> I didn't read the whole thing, but I think it's NVMe only?
>
> I'd assume Nvidia's will just piggyback on Microsoft's.
The Nvidia guy in the Reddit AMA said it wasn't NVMe exclusive, which is nice.
 

linkgoron

Platinum Member
Mar 9, 2005
2,286
810
136
I'd wait for reviews of it before I'd trust the Nvidia guy. Sure, it might somewhat work, but NVMe might be way faster?

MrTeal

Diamond Member
Dec 7, 2003
3,554
1,658
136
> I'd wait for reviews of it before I'd trust the Nvidia guy. Sure, it might somewhat work, but NVMe might be way faster?
Yeah, I might actually have misunderstood that. The storage might need to be PCIe connected, which doesn't necessarily preclude SATA SSDs, but it might not be as simple.

This is the page he linked about GPUDirect storage

So, how does RTX IO work for NVMe drives that are connected directly to a CPU x4 interface? The data would still need to be shuttled through the CPU, obviously, but it would still be a benefit, as it can move in compressed form without the CPU having to process it beyond routing it to the x16 GPU interface?
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136

IMO no.

So for at least one game, it appears the new 8GB and 10GB cards are obsolete right out of the gate. I'm wondering if the game is just "caching" instead of requiring that framebuffer amount.
 

MrTeal

Diamond Member
Dec 7, 2003
3,554
1,658
136
Not having minimums sucks, but the 5700 XT gets 20 FPS at 4K Ultra, while the Radeon VII gets 24. It doesn't seem like it's chugging against the 8GB limit.
 

repoman0

Diamond Member
Jun 17, 2010
4,442
3,280
136
I'm wondering about that 10 GB as well. My trusty 1070 gave me 5 years and it would be nice if the 3080 would do the same. I've been back and forth on waiting for AMD, but I will probably try to buy an FE on launch. I game at 1440p/165 Hz anyway and need to play Cyberpunk 2077.

There's always lowering settings... I often use mostly low/medium settings on newer games with my 1070 and don't mind.