Question: Is 10GB of VRAM enough for 4K gaming for the next 3 years?


moonbogg

Lifer
Jan 8, 2011
The question is simple: will 10GB be enough moving forward for the next 3 years? We all know what happens when VRAM gets breached: skips, stutters, and chugging.
 

Leeea

Diamond Member
Apr 3, 2020
Godfall uses 14.7 GB of VRAM at 4K and 13.9 GB at 1440p with max settings and ray tracing on a 6800 XT. I wonder how much of that is actually needed, though. I guess we will see when they let Nvidia cards use ray tracing later.

< 60 FPS :( in an action genre.
That is a fail in my book. Yeah, the 3080 would fail too.

I saw it was on ultra* settings. I wonder if it could get 80 fps with acceptable graphics settings? The idea being that ray tracing might be more valuable than some imperceptible ultra setting.

To show that >10 GB of VRAM really matters, the RX 6000 series needs to show a playable framerate where the 3080 flops. I wonder if it would achieve that in Godfall with RT off + ultra?
 

Mopetar

Diamond Member
Jan 31, 2011
Godfall uses 14.7 GB of VRAM at 4K and 13.9 GB at 1440p with max settings and ray tracing on a 6800 XT. I wonder how much of that is actually needed, though. I guess we will see when they let Nvidia cards use ray tracing later.

Does it really need it, though? I'm pretty sure the benchmarks released so far have shown the 3080 doing fine, so it may not actually need that much memory as long as it can swap in the new assets it needs far enough in advance.
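A minimal sketch of that kind of asset-streaming heuristic, assuming a simple distance-based prefetch against a fixed budget (all names and thresholds below are hypothetical, purely to illustrate the idea):

// Toy sketch of "swap assets in ahead of time": each frame, assets predicted
// to be needed soon are prefetched and cold ones evicted, so the resident set
// stays under a VRAM budget. Real engines use far more elaborate heuristics
// (player-motion prediction, per-mip streaming, priority classes, ...).
#include <cstdio>
#include <string>
#include <vector>

struct Asset {
    std::string name;
    float  distance;         // current distance from the camera
    size_t bytes;            // size of the full-resolution asset
    bool   resident = false;
};

// Keep everything within prefetchRadius resident, evict the rest,
// and never exceed budgetBytes.
void streamAssets(std::vector<Asset>& assets, float prefetchRadius, size_t budgetBytes) {
    size_t used = 0;
    for (Asset& a : assets) {
        const bool wanted = a.distance < prefetchRadius;
        if (wanted && used + a.bytes <= budgetBytes) {
            if (!a.resident) std::printf("prefetch %s\n", a.name.c_str());
            a.resident = true;
            used += a.bytes;
        } else {
            if (a.resident) std::printf("evict    %s\n", a.name.c_str());
            a.resident = false;
        }
    }
}

int main() {
    std::vector<Asset> assets = {
        {"castle_wall", 12.0f, size_t{600} << 20},   // 600 MiB, nearby
        {"valley_rock", 45.0f, size_t{300} << 20},   // 300 MiB, mid-range
        {"far_citadel", 900.0f, size_t{500} << 20},  // 500 MiB, far away
    };
    streamAssets(assets, 100.0f, size_t{1} << 30);   // 1 GiB budget
}

As long as the prefetch distance is generous enough that new assets arrive before the player can see them, a 10 GB card behaves like a much larger one; the stutter described in the opening post is what happens when the prediction misses.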
 

Hitman928

Diamond Member
Apr 15, 2012
Does it really need it, though? I'm pretty sure the benchmarks released so far have shown the 3080 doing fine, so it may not actually need that much memory as long as it can swap in the new assets it needs far enough in advance.

With RT off, it stays under 10 GB of VRAM. With RT on it goes above 10 GB of VRAM, but there is no RT support on Nvidia yet to test. We'll have to wait until the RT patch for Nvidia comes out to see how much VRAM is really needed.
 

Hitman928

Diamond Member
Apr 15, 2012
< 60 FPS :( in an action genre.
That is a fail in my book. Yeah, the 3080 would fail too.

I saw it was on ultra* settings. I wonder if it could get 80 fps with acceptable graphics settings? The idea being that ray tracing might be more valuable than some imperceptible ultra setting.

To show that >10 GB of VRAM really matters, the RX 6000 series needs to show a playable framerate where the 3080 flops. I wonder if it would achieve that in Godfall with RT off + ultra?

You can watch the video and see where he uses 4K Epic settings with RT off. The FPS still dips into the 40s and is consistently under 60 fps. Even a 3090 can't maintain 60 fps at 4K max settings (RT off).

Edit: According to the video posted, the 6800XT can do 60+ FPS at 1440p Epic with RT on. If even at 1440p with RT on it needs more than 10 GB of VRAM, that would obviously be a bad look for the 3080 but, again, we'll have to wait for the NV RT patch to see how much is really needed.
 

Leeea

Diamond Member
Apr 3, 2020
According to the video posted, the 6800XT can do 60+ FPS at 1440p Epic with RT on.

That is very interesting!

It leaves a person with a choice:
either flip off ray tracing (perhaps it would not really be all that noticeable?),
or scale up from 1440p (which is not that awful; I do it frequently on my Vega 56 with AMD's old-gen scaling),
or, on a 3080, scale up from 1080p (which is still not that awful).

This would yield an interesting flame war: which is better, AMD scaling from 1440p or DLSS from 1080p?
 

zlatan

Senior member
Mar 15, 2011
The VRAM usage in ray tracing might be massive. This is because the standard PC implementations don't support flexible LOD systems. Every object must be loaded into memory at really high resolution, no matter how far it is from the camera. The LOD must be selected before shooting the rays, and there is no standardized way to change it afterward. The only workable solution is to drastically limit the length of the rays. If a program doesn't do this, the RT effect will be a memory hog.
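To put rough, illustrative numbers on that (assumed figures, not measurements from any particular game): a 4096x4096 BC7 texture occupies about 16 MB, while its 1024x1024 mip is about 1 MB. If the LOD fixed at ray launch forces, say, 200 distant objects to keep full-resolution textures resident where a flexible LOD system would have dropped them to quarter resolution, that is roughly 200 x 15 MB ≈ 3 GB of extra VRAM for the RT effect alone.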

I honestly think that the 3080 will have enough VRAM for 4K in most situations, but not for ray tracing. If this matters, then don't buy a VGA with less than 16 GB of memory. To be honest, I would go for the 3090 with 24 GB of VRAM. It sounds like overkill, but not for ray tracing: 16 GB is the absolute minimum for 4K, and 12 GB would be OK for 1080p-1440p.
 

coercitiv

Diamond Member
Jan 24, 2014
I honestly think that the 3080 will have enough VRAM for 4K in most situations, but not for ray tracing. If this matters, then don't buy a VGA with less than 16 GB of memory. To be honest, I would go for the 3090 with 24 GB of VRAM. It sounds like overkill, but not for ray tracing: 16 GB is the absolute minimum for 4K, and 12 GB would be OK for 1080p-1440p.
The intended usage for the 3080, as communicated by Nvidia in their marketing materials, is 4K DLSS with RT. In an optimistic scenario, with some optimizations from developers aiming for Ampere cards with lower memory pools, do you see 10GB being enough for 1440p with RT?
 

Stuka87

Diamond Member
Dec 10, 2010
The intended usage for the 3080, as communicated by Nvidia in their marketing materials, is 4K DLSS with RT. In an optimistic scenario, with some optimizations from developers aiming for Ampere cards with lower memory pools, do you see 10GB being enough for 1440p with RT?

Are you suggesting that, for the first time in ages, console versions of games will look better than their PC counterparts, in a bid to appease Nvidia and avoid memory issues on their cards that don't have enough memory? Because that would be disappointing if so.
 

coercitiv

Diamond Member
Jan 24, 2014
Are you suggesting that, for the first time in ages, console versions of games will look better than their PC counterparts, in a bid to appease Nvidia and avoid memory issues on their cards that don't have enough memory? Because that would be disappointing if so.
I'm not suggesting anything; I merely took Nvidia's marketing proposition and asked @zlatan for an opinion. Think of it as trying to understand Nvidia's point of view; understanding does not require agreeing.
 

zlatan

Senior member
Mar 15, 2011
The intended usage for the 3080, as communicated by Nvidia in their marketing materials, is 4K DLSS with RT. In an optimistic scenario, with some optimizations from developers aiming for Ampere cards with lower memory pools, do you see 10GB being enough for 1440p with RT?
Don't believe marketing materials. Their job is to sell the products, nothing else.

There are several ways to lower memory consumption. The easiest and most useful is to limit the length of the rays. This is possible in the standard APIs, and it helps a lot, but the quality will decrease. Most RT effects should offer some quality options for ray length; low settings could be optimized for memory usage.
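As a minimal sketch of what such a quality option could look like: DXR's HLSL RayDesc does carry a TMin/TMax pair, and capping TMax is how ray length is limited; the C++ mirror below and its quality tiers are illustrative assumptions, not any game's actual code.

// Illustrative only: a host-side mirror of the HLSL RayDesc fields, plus a
// hypothetical quality knob. Nothing beyond tMax can be hit, so assets
// farther away never need to be resident for this ray's sake.
#include <cstdio>

struct RayDesc {      // mirrors the fields of the DXR HLSL RayDesc structure
    float origin[3];
    float tMin;
    float dir[3];
    float tMax;
};

enum class RtQuality { Low, Medium, High };

// Shorter rays -> fewer resident assets, at the cost of reflections and
// shadows "ending" closer to the camera.
RayDesc makeReflectionRay(const float (&o)[3], const float (&d)[3], RtQuality q) {
    const float maxLen = (q == RtQuality::Low)    ?   25.0f
                       : (q == RtQuality::Medium) ?  100.0f
                                                  : 1000.0f;
    return {{o[0], o[1], o[2]}, 0.001f, {d[0], d[1], d[2]}, maxLen};
}

int main() {
    const float origin[3] = {0.0f, 1.8f, 0.0f}, dir[3] = {0.0f, 0.0f, 1.0f};
    const RayDesc r = makeReflectionRay(origin, dir, RtQuality::Low);
    std::printf("low-quality RT ray capped at tMax = %.1f\n", r.tMax);
}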

A more elegant way to solve this problem is a new API that allows programmable traversal. The PS5, for example, supports this, and it is possible to change the LOD after shooting the rays. This can lead to very flexible LOD management, which lowers memory requirements by a lot without decreasing quality. All RT-capable hardware can support programmability in the BVH traversal stage, so a standard API might come for this, but not in the near future.
 

zlatan

Senior member
Mar 15, 2011
Are you suggesting that, for the first time in ages, console versions of games will look better than their PC counterparts, in a bid to appease Nvidia and avoid memory issues on their cards that don't have enough memory? Because that would be disappointing if so.
Might be, but not because of the hardware. I'm only familiar with the PS5, and Sony's IO/RT APIs are miles ahead of the PC options. What makes me sad is that this is purely a software-based advantage. Nothing stops the newer PC GPUs from executing a custom BVH traversal via compute shader; the hardware is there, but the API is missing, and this leads to very suboptimal execution and huge memory requirements.
 

Gideon

Golden Member
Nov 27, 2007
Might be, but not because of the hardware. I'm only familiar with the PS5, and Sony's IO/RT APIs are miles ahead of the PC options. What makes me sad is that this is purely a software-based advantage. Nothing stops the newer PC GPUs from executing a custom BVH traversal via compute shader; the hardware is there, but the API is missing, and this leads to very suboptimal execution and huge memory requirements.
I was under the impression that DXR 1.1 allows you to call TraceRay() from any shader type (including compute) at any shader stage:


The video actually suggests using it from the compute queue, and it has optimization flags available (SKIP_PROCEDURAL_PRIMITIVES, FORCE_OPAQUE).

Could you explain how the PS5 implementation is more capable? (I'm not saying it isn't, just really interested to know)
 

DJinPrime

Member
Sep 9, 2020
You can watch the video and see where he uses 4K Epic settings with RT off. The FPS still dips into the 40s and is consistently under 60 fps. Even a 3090 can't maintain 60 fps at 4K max settings (RT off).

Edit: According to the video posted, the 6800XT can do 60+ FPS at 1440p Epic with RT on. If even at 1440p with RT on it needs more than 10 GB of VRAM, that would obviously be a bad look for the 3080 but, again, we'll have to wait for the NV RT patch to see how much is really needed.
I've been playing with this game on my 3070. I'm running it at 3440x1440 Epic, and in the first area I'm getting good frames but huge variance, from 70-90. The in-game benchmark gives 53 fps, which, from reading other people's reviews, seems to be a more accurate indication of later levels. Also, you can turn on RTX by changing GameUserSettings.ini in your AppData\Local\Aperion\Saved\Config\WindowsNoEditor folder. I actually gain 3 fps with it on in the benchmark. But just like with the AMD cards, I'm not sure I like RT on. It really changes the way shadows look, and I'm not sure it's in a good way. Also, the game is very boring and the mechanics are very basic... I really don't get why the game has such low fps; the models aren't that complex and there's actually not a whole lot going on in the game. And turning the game settings down to medium, which is ugly, only got me 69 fps with no RT. Lol.
 

Hitman928

Diamond Member
Apr 15, 2012
I've been playing with this game on my 3070. I'm running it at 3440x1440 Epic, and in the first area I'm getting good frames but huge variance, from 70-90. The in-game benchmark gives 53 fps, which, from reading other people's reviews, seems to be a more accurate indication of later levels. Also, you can turn on RTX by changing GameUserSettings.ini in your AppData\Local\Aperion\Saved\Config\WindowsNoEditor folder. I actually gain 3 fps with it on in the benchmark. But just like with the AMD cards, I'm not sure I like RT on. It really changes the way shadows look, and I'm not sure it's in a good way. Also, the game is very boring and the mechanics are very basic... I really don't get why the game has such low fps; the models aren't that complex and there's actually not a whole lot going on in the game. And turning the game settings down to medium, which is ugly, only got me 69 fps with no RT. Lol.

Since you are on an Nvidia card and getting 3 fps higher results in the benchmark, I have to think that trying to force RT on through the config file isn't working properly and RT isn't actually being used.
 

DJinPrime

Member
Sep 9, 2020
Since you are on an Nvidia card and getting 3 fps higher results in the benchmark, I have to think that trying to force RT on through the config file isn't working properly and RT isn't actually being used.
The benchmark is bugged with RT on. If I run the benchmark back to back, the 2nd time is significantly worse, the 3rd even worse, etc. But in-game it seems to be okay. I made 2 videos comparing RT on and off on my 3070 using Shadowplay; there does appear to be some performance hit when I played in-game with it on. Please watch in HDR if your monitor supports it; if YT rendered it correctly, you should get at least 1440p. HDR makes much more of an impact visually than RT.

This is without RT.
This is with RT turned on via the config file change.

You can compare this to someone with an RX 6800 XT to confirm the RT setting is working.
https://www.youtube.com/watch?v=7QwdbyVEJhY

It seems that with RT, you lose a lot of the "fake" shadows? There's just a lot less for some reason. Maybe it's more realistic? Or the game is not using a lot of rays, or they're very limited; perhaps that's why the performance hit is not that significant. But in any event, I don't like it; I prefer the "fake" shadows. Unfortunately, this game is unplayable for me at Epic settings; I have to drop to High to get 60 fps. It really is somewhat of a bait and switch: the game was performing really well in the tutorial section, but once you start the real story, fps drops significantly. Getting back on topic, VRAM does not appear to be the problem, or I haven't run into it yet. The game just performs really badly at high resolution; it really needs DLSS even without RT.
I just noticed something in my videos and the 6800XT one: system RAM usage is significantly higher with RT on. Another mystery with this game. I think I'm going to hold off playing more of this game until the developers can tune it; it really needs a 30% improvement.
 

coercitiv

Diamond Member
Jan 24, 2014
It seems that with RT, you lose a lot of the "fake" shadows?
It's probably the same situation we saw with some Metro footage in terms of global illumination: adding RT effects at the end, as opposed to making them part of the entire product development cycle, can alter the experience in unexpected ways. Sometimes it looks better; sometimes it deviates from the artistic direction of the game, one for which the designers tweaked the scenes without RT effects enabled. Strong light rays casting shadows through a set of columns and then bouncing off different surfaces will always look better with RT on, but a dark and moody scene in which various tricks were used to get a cinematic feeling... that might fall apart and lose its charm.

We may see the reverse of this situation as RT effects become ubiquitous: RT-enabled scenes will be in sync with the artistic direction of the game, and old techniques for illumination and shadows will make a game feel awkward, lacking in some way or another.
 

zlatan

Senior member
Mar 15, 2011
I was under the impression that DXR 1.1 allows you to call TraceRay() from any shader type (including compute) at any shader stage:

The video actually suggests using it from the compute queue, and it has optimization flags available (SKIP_PROCEDURAL_PRIMITIVES, FORCE_OPAQUE).

Could you explain how the PS5 implementation is more capable? (I'm not saying it isn't, just really interested to know)

This is true, but that's not the problem. The issue is that when you shoot the ray, you need to set the LOD, and that level is applied for the full length of the ray. This means some objects far from the camera might be loaded at the highest resolution, when in theory it would be enough to load just a low-resolution LOD for them. But this is not possible in DXR 1.0 and 1.1.

The PS5 has a solution. A custom BVH traversal lets you control the full mechanism, so even if you set the highest LOD level at the launch of the ray, you can switch LODs as much as you want after the ray has passed some distance from the camera. You can save a lot of memory with this, multiple gigabytes.
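A toy, CPU-side sketch of the difference; every name below is hypothetical (no public PC API exposes programmable traversal today), and it only illustrates the idea of re-choosing the LOD mid-traversal:

// Toy BVH traversal in which the LOD is re-evaluated per node from the
// distance the ray has already travelled, instead of being frozen at ray
// launch as under DXR 1.0/1.1. Hypothetical names; heavily simplified.
#include <cstdio>
#include <vector>

struct Ray  { float tMax; };               // capped ray length, as discussed above
struct Node {
    float dist;                             // node distance from the ray origin (precomputed in this toy)
    int   firstChild = 0, childCount = 0;   // childCount == 0 means leaf
};

// Hypothetical LOD policy: far hits only ever touch coarse geometry/textures.
int lodForDistance(float t) {
    if (t < 10.0f) return 0;                // full resolution
    if (t < 50.0f) return 1;                // half resolution
    return 2;                               // quarter resolution
}

void traverse(const std::vector<Node>& nodes, const Ray& ray) {
    std::vector<int> stack = {0};           // start at the root
    while (!stack.empty()) {
        const Node n = nodes[stack.back()];
        stack.pop_back();
        if (n.dist > ray.tMax) continue;    // the ray-length cap prunes far nodes outright
        if (n.childCount == 0) {
            // Leaf: intersect against the LOD chosen *now*; under DXR 1.0/1.1
            // this choice would already have been frozen when the ray was shot.
            std::printf("leaf at t=%5.1f -> test against LOD %d\n",
                        n.dist, lodForDistance(n.dist));
        } else {
            for (int c = 0; c < n.childCount; ++c)
                stack.push_back(n.firstChild + c);
        }
    }
}

int main() {
    // A root with three leaves at increasing distance from the camera.
    const std::vector<Node> nodes = {{0.0f, 1, 3}, {5.0f}, {30.0f}, {120.0f}};
    traverse(nodes, Ray{100.0f});           // the leaf at t=120 is never tested
}

The memory saving comes from the leaves: distant geometry is only ever intersected at coarse LODs, so its full-resolution assets never have to be resident.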
 

Gideon

Golden Member
Nov 27, 2007
This is true, but that's not the problem. The issue is that when you shoot the ray, you need to set the LOD, and that level is applied for the full length of the ray. This means some objects far from the camera might be loaded at the highest resolution, when in theory it would be enough to load just a low-resolution LOD for them. But this is not possible in DXR 1.0 and 1.1.

The PS5 has a solution. A custom BVH traversal lets you control the full mechanism, so even if you set the highest LOD level at the launch of the ray, you can switch LODs as much as you want after the ray has passed some distance from the camera. You can save a lot of memory with this, multiple gigabytes.
Thanks for the info! Hopefully this will be remedied in new Vulkan and DXR versions. I'm not holding my breath, however; my hunch is that DXR didn't implement it because Nvidia's fixed-function BVH traversal units probably can't support it.
 

Hitman928

Diamond Member
Apr 15, 2012
The benchmark is bugged with RT on. If I run the benchmark back to back, the 2nd time is significantly worse, the 3rd even worse, etc. But in-game it seems to be okay. I made 2 videos comparing RT on and off on my 3070 using Shadowplay; there does appear to be some performance hit when I played in-game with it on. Please watch in HDR if your monitor supports it; if YT rendered it correctly, you should get at least 1440p. HDR makes much more of an impact visually than RT.

This is without RT.
This is with RT turned on via the config file change.

You can compare this to someone with an RX 6800 XT to confirm the RT setting is working.
https://www.youtube.com/watch?v=7QwdbyVEJhY

It seems that with RT, you lose a lot of the "fake" shadows? There's just a lot less for some reason. Maybe it's more realistic? Or the game is not using a lot of rays, or they're very limited; perhaps that's why the performance hit is not that significant. But in any event, I don't like it; I prefer the "fake" shadows. Unfortunately, this game is unplayable for me at Epic settings; I have to drop to High to get 60 fps. It really is somewhat of a bait and switch: the game was performing really well in the tutorial section, but once you start the real story, fps drops significantly. Getting back on topic, VRAM does not appear to be the problem, or I haven't run into it yet. The game just performs really badly at high resolution; it really needs DLSS even without RT.
I just noticed something in my videos and the 6800XT one: system RAM usage is significantly higher with RT on. Another mystery with this game. I think I'm going to hold off playing more of this game until the developers can tune it; it really needs a 30% improvement.

Can you record the benchmark with RT on for comparison?
 

zlatan

Senior member
Mar 15, 2011
Thanks for the info! Hopefully this will be remedied in new Vulkan and DXR versions. I'm not holding my breath, however; my hunch is that DXR didn't implement it because Nvidia's fixed-function BVH traversal units probably can't support it.
Programmable traversal can be supported by NVIDIA via shaders. A fixed-function BVH traversal unit will never be able to support this kind of functionality.
 

Mopetar

Diamond Member
Jan 31, 2011
We may see the reverse of this situation as RT effects become ubiquitous: RT-enabled scenes will be in sync with the artistic direction of the game, and old techniques for illumination and shadows will make a game feel awkward, lacking in some way or another.

I'm not necessarily sure of that. With actual cinema, the set can be constructed around an end goal, with controlled artificial lighting and fixed camera positions to achieve the desired effect. If you're using real ray tracing, you can't fake it like that unless you're building a cutscene, because once the player has free control it will become obvious that the world doesn't look right, or you'll need to construct the world around the lighting.

It's more realistic, but it also limits you to the rules of reality. But we're probably a decade off from RT being powerful enough that it can be used like that, so perhaps developers will be able to adapt to it over time and figure out how to harness it without those kinds of problems. Worlds have to be designed around gameplay as well and sometimes that's going to come into conflict with how they could be naturally lit.
 

coercitiv

Diamond Member
Jan 24, 2014
I'm not necessarily sure of that. With actual cinema...
I wasn't talking about actual cinema, but rather about artistic direction. Games have a visual style; levels have a certain mood to them.

If lights gain perceived intensity with RT on due to reflections in the environment, then you can see why a level designer might have added more (or stronger) lights in a room to achieve the same effect with "traditional" lighting. The experience was calibrated for a different technique, so enabling RT will likely brighten the room considerably.

When reviewing RTX in Metro Exodus, Hardware Canucks had lots of praise for RT but also a few warnings: darker environments that could previously be navigated now required a flashlight, yet some outdoor scenes received so much (bounced) lighting that they almost became washed out with RT on.
 

Mopetar

Diamond Member
Jan 31, 2011
I was just using it to illustrate the point that in the real world, where you have to obey the laws of physics, we have to jump through a lot of hoops to achieve a desired effect. Video games can just fake it as much as they want without it really needing to make sense. The dungeon can be as brightly lit as you want, regardless of whether there are any light sources or not.

Realistic lighting doesn't necessarily add to the gameplay unless you want the focus to be on having light to see what's going on around you. Even if that were the focus of a game, it could always set a minimal amount of visibility that exists no matter what, and most people would completely accept that because they realize they're playing a game. If you're doing everything with pure RT, then you get no visibility when there's no light source. Faking it after making it behave more realistically is going to be far more grating.

Someone in another thread posted a video of CoD on the consoles that showed the difference with and without RT. They were just using it for subtle shadow effects, and I think that's probably how it should be used. It isn't going to kill the frame rate, but it does provide a nice visual bump over not having it.
 

coercitiv

Diamond Member
Jan 24, 2014
I was just using it to illustrate the point that in the real world, where you have to obey the laws of physics, we have to jump through a lot of hoops to achieve a desired effect. Video games can just fake it as much as they want without it really needing to make sense.
On this we fully agree, I didn't get the message right the first time.
 

DJinPrime

Member
Sep 9, 2020
Can you record the benchmark with RT on for comparison?
Here you go.

I also found something unexpected with RT and Godfall: there is a serious drop in FPS when I use a two-handed weapon. The game generates a smoke effect with this type of weapon, and it causes massive fps drops. I think the dust is impacting the ray calculations; kind of nuts if rays are being processed bouncing off the dust, lol. This drop does not happen with RT off.

The VRAM usage in ray tracing might be massive.
That doesn't seem to be the case with Godfall and Wolfenstein: Youngblood. The difference in allocated VRAM between RT on and off is not very noticeable. With Godfall you see a massive system RAM increase, though; there's no RAM increase in Wolfenstein. Also, Cyberpunk's requirement for RT is just a 2060 with 6GB, so I don't think RT (at least in its current form) uses a massive amount of VRAM.
 