Question: Is 10GB of VRAM enough for 4K gaming for the next 3 years?


moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
The question is simple. Will 10GB be enough for the next 3 years? We all know what happens when VRAM gets breached: skips, stutters, and chugging.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Nvidia might have lost me 2 generations in a row (good thing they don't care). I've been waiting for over 3 years for an upgrade and I'm not appreciative of the BS games they are playing with the release cycle. If a 3080Ti or 20GB version is coming, it should have released first or have been announced so people know what to expect and can plan their purchase in an informed way. I'm not aware of AMD playing these stupidass games with their releases. Nvidia feels like Intel to me. I'm afraid it might be too late for me. I feel myself turning...
 

Saylick

Diamond Member
Sep 10, 2012
3,084
6,184
136
moonbogg said:
Nvidia might have lost me 2 generations in a row (good thing they don't care). [...]
Just try to buy the FE 3080 10 GB on launch, then re-evaluate in a few months. If there's really a need for 20 GB, then you can probably re-sell the 3080 for a small loss, or maybe even a small gain if the rumors about artificially limited supply are to be believed.
 

brianmanahan

Lifer
Sep 2, 2006
24,203
5,608
136
moonbogg said:
Nvidia might have lost me 2 generations in a row (good thing they don't care). [...]

i was AMD/ATI for 15 years, but the C2D CPUs and GTX GPUs were too good to ignore, so i switched

next CPU might be AMD but i figure my 8700k should last me another 5 years

what'll be happening in 2025 is anyone's guess

we might be living in caves again by that point...

(actually since that's the case, i think i'm going to buy the 3090 after all)
 

zlatan

Senior member
Mar 15, 2011
580
291
136
Wouldn't the overhead of DirectStorage just be hundred(s) of MB? You only need the buffer big enough to decompress one whole asset at minimum, right? Going back to your project, if you had DirectStorage technology, would you still need to load all the assets of the level? You're doing that currently because of the performance hit during gameplay if you have to load new assets, right? But if DS takes out the CPU/RAM path, loading assets should have minimal impact on gameplay. So, instead of loading all the assets at the start of the level, you could load 1/4 (for example) and have a thread stream in new assets as needed. The PS5 Ratchet & Clank demo shows the ability to load completely different levels seamlessly. To me, this indicates that VRAM usage will be much more efficient and smarter going forward.
I think there is a lot of misunderstanding; maybe the source of this is how Sony and Microsoft detailed the new technology. Yes, consoles are designed to manage data much more efficiently, and this is done by several more or less independent components:

The IO API (DirectStorage, for example) is just one technology. With it an NVMe drive can be managed much more efficiently. It is now possible to generate 30k IO requests, or even more; I haven't really tested the limits on the PS5, but the current IO APIs on PC are limited to 250-300 IO requests, and even that number is hard to reach because of HDDs, so a typical scenario is designed around 50-60 IO requests. The jump here is incredible, and that is where the problem comes in, because nowadays assets are shipped in compressed form. With 50-60 IO requests this is not an issue, since nearly every modern CPU can handle the decompression, but 30k IO requests are a different level, so there is a need for an accelerator to offload this workload. This can be a fixed-function block, like what the consoles are using, or the GPU. On a PC the new IO API will allow generating far more IO requests using the new NVMe features, and the GPU can copy the content to its VRAM with direct memory access. These are not new technologies! NVMe has been around for several years, and on the GPU side the requirement is at least two DMA units (there should be more if performance matters) plus a GPU memory access technology that allows direct memory transfers between PCI devices. This is also not new; search for NVIDIA GPUDirect or AMD DirectGMA.
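To make the request-count point concrete, here is a minimal C++ sketch. It is not the actual DirectStorage API; the request struct and the "decompress" worker are made-up stand-ins for the real queue and the GPU/fixed-function decompressor. The only idea it shows is that requests are issued as a large batch and the game thread never does per-request decompression itself:

```cpp
#include <cstdint>
#include <cstdio>
#include <future>
#include <vector>

// Made-up stand-in for one small IO request (not the real DirectStorage API).
struct ReadRequest {
    std::uint64_t offset;   // byte offset of the compressed chunk on disk
    std::uint32_t size;     // compressed size in bytes
};

// Stand-in "decompression accelerator": on console this is fixed-function
// hardware, on PC it could be the GPU; here it is just another thread.
std::vector<std::uint8_t> decompress(ReadRequest req) {
    return std::vector<std::uint8_t>(req.size * 2, 0);  // placeholder output
}

int main() {
    // Old model: ~50-60 synchronous requests per scenario, CPU decompression.
    // New model: thousands of tiny requests per frame, decompression offloaded.
    // (Only 64 requests here so the demo stays lightweight; the real target is ~30k.)
    std::vector<ReadRequest> batch;
    for (std::uint64_t i = 0; i < 64; ++i)
        batch.push_back({i * 65536, 65536});             // 64 KiB chunks

    std::vector<std::future<std::vector<std::uint8_t>>> inflight;
    for (const auto& req : batch)                        // issue the whole batch at once
        inflight.push_back(std::async(std::launch::async, decompress, req));

    std::size_t total = 0;
    for (auto& f : inflight) total += f.get().size();    // wait once, at the end
    std::printf("%zu requests -> %zu decompressed bytes\n", batch.size(), total);
}
```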

Now the IO API is just one component, and both consoles use a second technology to allow fine-grained control over data. With only DirectStorage, a PC engine still needs to manage the allocations in VRAM, but with a newer memory/cache controller the data management can be hardware-based, and it becomes possible to move only the memory pages that are needed. It is also important to manage coherency, because flushing the GPU caches on every read is not very efficient.
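For a feel of what a software fallback of that fine-grained management might look like, here is a toy sketch (my own illustration, not anything shipping on PC or console): a residency tracker that evicts at page granularity instead of whole-asset granularity, so only the pages recent frames actually touched need to stay resident.

```cpp
#include <cstdint>
#include <cstdio>
#include <unordered_map>

// Hypothetical page-granular residency tracker (illustration only).
// On the consoles this kind of bookkeeping is handled by the memory/cache controller.
class PageResidency {
public:
    explicit PageResidency(std::size_t budgetPages) : budget_(budgetPages) {}

    // Mark a page as touched this frame; "load" it if it is not resident yet.
    void touch(std::uint64_t pageId, std::uint64_t frame) {
        lastUse_[pageId] = frame;
        if (lastUse_.size() > budget_) evictOldest();
    }

    std::size_t residentPages() const { return lastUse_.size(); }

private:
    void evictOldest() {
        auto victim = lastUse_.begin();
        for (auto it = lastUse_.begin(); it != lastUse_.end(); ++it)
            if (it->second < victim->second) victim = it;
        lastUse_.erase(victim);   // evict a single page, not the whole asset
    }

    std::size_t budget_;
    std::unordered_map<std::uint64_t, std::uint64_t> lastUse_; // page -> last frame used
};

int main() {
    PageResidency vram(4);              // tiny budget to make eviction visible
    for (std::uint64_t frame = 0; frame < 3; ++frame)
        for (std::uint64_t page = frame; page < frame + 4; ++page)
            vram.touch(page, frame);    // sliding window of touched pages
    std::printf("resident pages: %zu\n", vram.residentPages());
}
```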

The reason the Ratchet & Clank demo can load different levels seamlessly is the IO API, the fine-grained data management, and the coherency engines. All three components are needed, and DirectStorage only provides the first; the second and third are not available on PC yet.
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
moonbogg said:
Nvidia might have lost me 2 generations in a row (good thing they don't care). [...]
You can wait for RDNA 3. I'm hearing it will be very good. If not, there's always RDNA 4
 

samboy

Senior member
Aug 17, 2002
217
77
101
Yeah... I've decided to skip trying to nab a 3080 this week:

1. 10GB is fine for now, but I don't think it will be in 5 years' time (how long I would keep it). 4GB seemed to be about right for 1080p, so 16GB would be a good fit for 4K (quick scaling math at the end of this post).
2. Memory prices have gone down, so why hasn't Nvidia significantly increased VRAM for the 3080? It's the same as last generation. I suspect Nvidia is holding this back so they can respond to AMD's upcoming cards. They will launch a 20GB 3080, but at what price? That will likely depend on AMD.
3. AMD is likely going to offer a 16GB card; this seems a much more reasonable amount for a new top-end card.
4. The new Nvidia power requirements are a problem in my mind. I've read the theory that the 3080 FE cards being released this week have really good coolers that are expensive to make. However, one has to ask why all of that is required. Hot, power-hungry video cards are simply going to be less reliable and have been problematic in the past. I'm really interested in how AMD will compete on this front.
5. This is the first time in a while that AMD may have something competitive. Likely not faster than the 3080, but maybe within 20%. We need to let competition kick in and rein in the price creep we've had in graphics cards. This is going to require patience from everyone, and some time.

I'm resigned to the fact that I probably won't be getting any next-generation card in 2020; availability will be low. January 2021 is more likely, and by then one will be in a position to compare products, see how stable the drivers are, and with a bit of luck see some price competition.
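The scaling arithmetic behind point 1, purely back-of-the-envelope; it assumes VRAM need scales roughly with pixel count, which is an oversimplification (geometry, shaders, and driver overhead don't scale that way):

```cpp
#include <cstdio>

int main() {
    // Back-of-the-envelope: scale a known-comfortable VRAM amount by pixel count.
    const double px1080p    = 1920.0 * 1080.0;
    const double px4k       = 3840.0 * 2160.0;
    const double baselineGB = 4.0;                 // "about right" for 1080p

    const double scale = px4k / px1080p;           // = 4.0
    std::printf("4K has %.1fx the pixels of 1080p -> ~%.0f GB by this rule\n",
                scale, baselineGB * scale);        // ~16 GB
}
```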
 
  • Like
Reactions: Tlh97

sze5003

Lifer
Aug 18, 2012
14,177
622
126
When is AMD supposed to announce and release their cards anyway? I'm going to try to go for a 3090, but if I don't get one I will just have to wait and see how the competition plays out. I won't be going AMD, but if I can't get one it would make sense to wait and see whether a Ti model or a higher-memory 3080 ever comes out.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,248
136
sze5003 said:
When is AMD supposed to announce and release their cards anyway? [...]

Zen 3 is on the 8th. Maybe they'll drop some Big Navi info? Otherwise it's October 28th for Big Navi.
 
  • Like
Reactions: Tlh97 and Elfear

Dave2150

Senior member
Jan 20, 2015
639
178
116
samboy said:
Yeah... I've decided to skip trying to nab a 3080 this week: 10GB is fine for now, but I don't think it will be in 5 years' time (how long I would keep it). [...]

5 years? That's a very long time in GPU terms, by which point all current GPUs will be worthless potatoes.
 

blckgrffn

Diamond Member
May 1, 2003
9,111
3,029
136
www.teamjuchems.com
My 980ti begs to differ. I'm not scared of 10GB; worst case scenario, I drop down to high textures from ultra in a few years.

But you had 6GB way before it was cool! :)

My 290X had a similar run, 2014-2019. In retrospect, I don't think I ever regretted getting only the 4GB version, but I did pick up a 2560*1080 monitor on purpose, as that resolution is way easier to push than 2560*1440.

It lives on in my cycling simulation/backup PC rig, mostly because I can't part with my LGA 2011 rig, so I am keeping it all together until the end times or until it's the equivalent of a first-gen Phenom.
 

CastleBravo

Member
Dec 6, 2019
119
271
96
blckgrffn said:
But you had 6GB way before it was cool! :) ... It lives on in my cycling simulation/backup PC rig, mostly because I can't part with my LGA 2011 rig. [...]

I'm still on a 2011v3 with a 5820k. Current plan is to try and snag a 3080 FE tomorrow morning if the price really is $700, and maybe stick it in a Zen3 build next month.
 
  • Like
Reactions: Tlh97 and blckgrffn

blckgrffn

Diamond Member
May 1, 2003
9,111
3,029
136
www.teamjuchems.com
CastleBravo said:
I'm still on a 2011v3 with a 5820k. [...]

I wanted to last until DDR5 with my 3930k but the thought of going into a pandemic stuck with "old" hardware and tough performance in some modern games had me standing in a socially distanced line at Microcenter in March. Sigh.

My Dad is doing the same as you - 5820k to Zen 3. Should be a really solid jump too.

Enjoy the 3080 should you pull the trigger! :)
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Steve Burke @ Gamers Nexus comments on the 10GB VRAM of the 3080 (at the 3-minute mark)


HUB demonstrated that a 2080 starts to tank at 4K in Doom vs a 1080 Ti because of the lack of VRAM. That's with 8GB at 4K right now. This wasn't him just looking at some software reading: the performance was ahead at 1440p, then it tanked at 4K due to a lack of RAM. How long before 10GB gets rekt at even lower resolutions? All it takes is one good AAA game to break the 10GB buffer. I'd be pissed.
There's no way this much GPU power is a good fit for only 10GB of RAM. It's ridiculous. New models will be out after AMD drops, and everyone will look back at the 10GB card and wonder why the hell it was even released. It's the skimped-out starter card to make Nvidia look good. It only looks good right now (to some people at least) because it's the first release of a new generation for both parties. I'm not buying it.
 

Capt Caveman

Lifer
Jan 30, 2005
34,547
651
126

GeForce RTX 3080: Is 10GB VRAM Enough?
In the past few weeks since the RTX 30-series announcement, there have been quite a few discussions about whether the 3080 has enough memory. Take a look at the previous generation with 11GB, or the RTX 3090 with 24GB, and 10GB seems like it's maybe too little. Let's clear up a few things.

There are ways to exceed 10GB of VRAM usage, but it's mostly via mods and questionably coded games — or running a 5K or 8K display. The problem is that a lot of gamers use utilities that measure allocated memory rather than actively used memory (e.g., MSI Afterburner), and they see all of their VRAM being sucked up and think they need more memory. Even some games (Resident Evil 3 remake) do this, informing gamers that it 'needs' 12GB or more to run the ultra settings properly. (Hint: It doesn't.)

Using all of your GPU's VRAM to basically cache textures and data that might be needed isn't a bad idea. Call of Duty Modern Warfare does this, for example, and Windows does this with system RAM to a certain extent. If the memory is just sitting around doing nothing, why not put it to potential use? Data can sit in memory until either it is needed or the memory is needed for something else, but it's not really going to hurt anything. So, even if you look at a utility that shows a game using all of your VRAM, that doesn't mean you're actually swapping data to system RAM and killing performance.
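(A toy illustration of that caching point, not from the quoted article; the cache class and the numbers are made up.) The behavior being described is essentially an LRU cache: keep everything resident until the budget is hit, and only evict under pressure, so "allocated" stays near the limit even when little of it is actively needed:

```cpp
#include <cstdint>
#include <cstdio>
#include <list>
#include <unordered_map>

// Toy VRAM texture cache: keep textures resident until the budget is hit,
// then evict the least-recently-used entry. Allocated != actively used.
class TextureCache {
public:
    explicit TextureCache(std::size_t budgetMB) : budgetMB_(budgetMB) {}

    void use(int textureId, std::size_t sizeMB) {
        if (auto it = index_.find(textureId); it != index_.end()) {
            lru_.splice(lru_.begin(), lru_, it->second);    // mark as recently used
            return;
        }
        while (usedMB_ + sizeMB > budgetMB_ && !lru_.empty()) {
            usedMB_ -= lru_.back().second;                  // evict only under pressure
            index_.erase(lru_.back().first);
            lru_.pop_back();
        }
        lru_.emplace_front(textureId, sizeMB);
        index_[textureId] = lru_.begin();
        usedMB_ += sizeMB;
    }

    std::size_t allocatedMB() const { return usedMB_; }

private:
    std::size_t budgetMB_, usedMB_ = 0;
    std::list<std::pair<int, std::size_t>> lru_;            // front = most recent
    std::unordered_map<int, std::list<std::pair<int, std::size_t>>::iterator> index_;
};

int main() {
    TextureCache vram(10000);                        // pretend 10 GB budget, in MB
    for (int i = 0; i < 300; ++i) vram.use(i, 50);   // stream 300 x 50 MB textures
    std::printf("allocated: %zu MB (near the budget, but not all actively used)\n",
                vram.allocatedMB());
}
```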

You'll notice when data actually starts getting swapped out to system memory because it causes a substantial drop in performance. Even PCIe Gen4 x16 only has 31.5 GBps of bandwidth available. That's less than 5% of the RTX 3080's 760 GBps of bandwidth. If a game really exceeds the GPU's internal VRAM capacity, performance will tank hard.
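(Again not from the article, just the arithmetic from the paragraph above put into a runnable form.) The gap is easy to put in numbers:

```cpp
#include <cstdio>

int main() {
    const double pcie4x16GBps = 31.5;   // PCIe Gen4 x16, from the paragraph above
    const double gddr6xGBps   = 760.0;  // RTX 3080 VRAM bandwidth

    // Anything spilled out of VRAM is fetched at roughly 4% of the speed,
    // which is why exceeding the buffer shows up as a hard performance cliff.
    std::printf("PCIe is %.1f%% of VRAM bandwidth (about %.0fx slower)\n",
                100.0 * pcie4x16GBps / gddr6xGBps, gddr6xGBps / pcie4x16GBps);
}
```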

If you're worried about 10GB of memory not being enough, my advice is to just stop. Ultra settings often end up being a placebo effect compared to high settings — 4K textures are mostly useful on 4K displays, and 8K textures are either used for virtual texturing (meaning, parts of the texture are used rather than the whole thing at once) or not used at all. We might see games in the next few years where a 16GB card could perform better than a 10GB card, at which point dropping texture quality a notch will cut VRAM use in half and look nearly indistinguishable.

There's no indication that games are set to start using substantially more memory, and the Xbox Series X also has 10GB of GPU memory, so an RTX 3080 should be good for many years, at least. And when it's not quite managing, maybe then it will be time to upgrade to a 16GB or even 32GB GPU.

If you're in the small group of users who actually need more than 10GB, by all means, wait for the RTX 3090 reviews and launch next week. It's over twice the cost for at best 20% more performance, which basically makes it yet another Titan card, just with a better price than the Titan RTX (but worse than the Titan Xp and 2080 Ti). And with 24GB, it should have more than enough RAM for just about anything, including scientific and content creation workloads.
 

amenx

Diamond Member
Dec 17, 2004
3,851
2,019
136
moonbogg said:
HUB demonstrated that a 2080 starts to tank at 4K in Doom vs a 1080 Ti because of the lack of VRAM. [...]
HUB's results differ from TPU, GameGPU, KitGuru, and other sites; Steve Walton/HUB is the only one reporting this result. He also made an argument about 2080 VRAM performance weeks or months ago with Doom Eternal at 4K vs the Radeon VII and 1080 Ti.

TPU 4K Doom Eternal with Ultra Nightmare settings:

[TPU 4K benchmark chart: performance-3840-2160.png]

This is the game with the highest VRAM 'usage' reported. GameGPU's results are almost identical to TPU's, as are KitGuru's. Steve Walton/HUB is the lone standout with his anomalous results.
 
Last edited:
  • Like
Reactions: DeathReborn

amenx

Diamond Member
Dec 17, 2004
3,851
2,019
136
When the 3070 8GB and 16GB versions are released, I think we will have sufficient data to put the VRAM argument to rest for the next couple of years at least.