Question: Is 10GB of VRAM enough for 4K gaming for the next 3 years?

Page 16

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
The question is simple: will 10GB be enough for the next 3 years? We all know what happens when VRAM gets breached: skips, stutters, and chugging.
 

zlatan

Senior member
Mar 15, 2011
580
291
136
That doesn't seem to be the case with Godfall and Wolfenstein: Youngblood. The difference in allocated VRAM between RT on and off is not very noticeable. With Godfall you see a massive system RAM increase, though; there's no RAM increase in Wolfenstein. Also, Cyberpunk's requirement for RT is just a 2060 with 6GB, so I don't think RT (at least in its current form) uses a massive amount of VRAM.

The RT in Godfall is a real memory hog. You just don't see it, because it can't be activated on a GeForce in the current build. No matter how hard you try, the service library is missing on your system. As you can see in the benchmark, those are not RT shadows; the overall quality is too low.

Godfall works in a very unusual way. It uses DXR 1.1 to start the RT, then passes the work to Radeon Rays 4.0. But on your system the program can't do that, because the service library for the hit token is missing. So in the end you get nothing back, and every shader inside DXR ends up as a miss shader. There's really no point turning this on, because the effect in its current form is not implemented in a standard way. A standard RT shadow effect in Godfall would be too memory heavy. Your system just does some extra calculation for nothing.

Memory usage depends on the implementation: if the length of the rays is minimal, the effect won't consume a lot of memory; otherwise it will.
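Just to sketch what I mean (this is generic DXR host-side code, not Godfall's, and the function and variable names are mine): a D3D12 renderer can ask the driver up front how much VRAM the acceleration structure over a scene will need, which is where a bigger scene reached by longer rays shows up as a bigger allocation.

    #include <windows.h>
    #include <d3d12.h>
    #include <cstdio>

    // Sketch only: assumes 'device' is an already-created ID3D12Device5 and
    // 'geometryDescs' an already-filled array describing the scene geometry.
    void PrintBlasMemoryRequirement(ID3D12Device5* device,
                                    const D3D12_RAYTRACING_GEOMETRY_DESC* geometryDescs,
                                    UINT geometryCount)
    {
        D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_INPUTS inputs = {};
        inputs.Type = D3D12_RAYTRACING_ACCELERATION_STRUCTURE_TYPE_BOTTOM_LEVEL;
        inputs.Flags = D3D12_RAYTRACING_ACCELERATION_STRUCTURE_BUILD_FLAG_PREFER_FAST_TRACE;
        inputs.DescsLayout = D3D12_ELEMENTS_LAYOUT_ARRAY;
        inputs.NumDescs = geometryCount;
        inputs.pGeometryDescs = geometryDescs;

        // The driver reports how much VRAM the acceleration structure and its
        // build scratch buffer will need before anything is allocated.
        D3D12_RAYTRACING_ACCELERATION_STRUCTURE_PREBUILD_INFO info = {};
        device->GetRaytracingAccelerationStructurePrebuildInfo(&inputs, &info);

        std::printf("BLAS: %llu bytes, scratch: %llu bytes\n",
                    (unsigned long long)info.ResultDataMaxSizeInBytes,
                    (unsigned long long)info.ScratchDataSizeInBytes);
    }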
 
  • Like
Reactions: Tlh97 and Leeea

DJinPrime

Member
Sep 9, 2020
87
89
51
The RT in Godfall is a real memory hog. You just don't see it, because it can't be activated on a GeForce in the current build. No matter how hard you try, the service library is missing on your system. As you can see in the benchmark, those are not RT shadows; the overall quality is too low.

Godfall works in a very unusual way. It uses DXR 1.1 to start the RT, then passes the work to Radeon Rays 4.0. But on your system the program can't do that, because the service library for the hit token is missing. So in the end you get nothing back, and every shader inside DXR ends up as a miss shader. There's really no point turning this on, because the effect in its current form is not implemented in a standard way. A standard RT shadow effect in Godfall would be too memory heavy. Your system just does some extra calculation for nothing.

Memory usage depends on the implementation: if the length of the rays is minimal, the effect won't consume a lot of memory; otherwise it will.
Godfall RT does work on NV cards; you can compare my benchmark video with the 6800 XT benchmark. They look the same, and I can tell the shadows are there when they're there. If something is written using a DX API, then any hardware that supports it will work; that's why the 6800s can run existing RT games (minus a few bugs). Why doesn't the developer enable it for NV cards? I don't know; it might be some bug I haven't hit, or, since it's an AMD-sponsored game, they're giving AMD exclusivity for now. Since it's an Unreal Engine game, the configuration file can be changed to enable it. Also, the fact that all current RT titles, including the heavy RT in Control, can run on a 6GB 2060 says RT doesn't use a massive amount of VRAM.
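For what it's worth, the only capability DXR itself exposes is the raytracing tier, which is why any card reporting the right tier can run the API path. A rough sketch (assuming an already-created device; this is not taken from the game):

    #include <windows.h>
    #include <d3d12.h>

    // Sketch only: returns true if the device reports DXR 1.1 support.
    bool SupportsDxr11(ID3D12Device* device)
    {
        D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
        if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                               &options5, sizeof(options5))))
            return false;
        // Both Turing/Ampere and RDNA2 report D3D12_RAYTRACING_TIER_1_1 here.
        return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1;
    }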
 

zlatan

Senior member
Mar 15, 2011
580
291
136
Godfall RT does work on NV cards; you can compare my benchmark video with the 6800 XT benchmark. They look the same, and I can tell the shadows are there when they're there. If something is written using a DX API, then any hardware that supports it will work; that's why the 6800s can run existing RT games (minus a few bugs). Why doesn't the developer enable it for NV cards? I don't know; it might be some bug I haven't hit, or, since it's an AMD-sponsored game, they're giving AMD exclusivity for now. Since it's an Unreal Engine game, the configuration file can be changed to enable it. Also, the fact that all current RT titles, including the heavy RT in Control, can run on a 6GB 2060 says RT doesn't use a massive amount of VRAM.

I saw it, and the RT shadows are missing from the game itself.

The problem is that the RT in Godfall is not implemented inside the DXR API. It's an interop implementation that does the job outside DirectX. You can check the library file inside the game folder: amdrtshadow.dll.

I wrote this earlier: a standard DXR 1.1 implementation would need much more VRAM, 18-20 GB even at 1440p. So there is no point enabling it on NVIDIA when most of their cards don't have that much memory. There will be some optimization later, or maybe NVIDIA will allow the hit token via NVAPI and an RT implementation could be possible in OptiX, but not today.

You can change the configuration file, but every hit token goes outside DXR; that's why an amdrtshadow.dll file is used inside the game folder. NVIDIA can't use it, so while the shaders will start, they don't get any results back, and every job ends up in the miss shader. That's why there are no RT shadows in your video.
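As a loose illustration of what a missing service library means in practice (this is not Godfall's actual code; only the dll name comes from the game folder), an engine could probe for the interop library at startup and fall back to raster shadows when it isn't there:

    #include <windows.h>
    #include <cstdio>

    int main()
    {
        // Probe for the vendor interop library the RT shadow path relies on.
        // Without it, every traced ray would only ever resolve to the miss
        // shader, so the sensible fallback is the ordinary raster shadows.
        HMODULE rtLib = LoadLibraryA("amdrtshadow.dll");
        if (rtLib == nullptr) {
            std::printf("amdrtshadow.dll not found: keeping raster shadows\n");
            return 0;
        }
        std::printf("interop library present: RT shadow path available\n");
        FreeLibrary(rtLib);
        return 0;
    }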

RT implementations can differ. Older games configured a very limited length for the rays. Godfall configured its RT effect for much bigger scenes, so the rays are longer, and this causes much higher memory usage.
 

DJinPrime

Member
Sep 9, 2020
87
89
51
I saw it, and the RT shadows are missing from the game itself.

The problem is that the RT in Godfall is not implemented inside the DXR API. It's an interop implementation that does the job outside DirectX. You can check the library file inside the game folder: amdrtshadow.dll.

I wrote this earlier: a standard DXR 1.1 implementation would need much more VRAM, 18-20 GB even at 1440p. So there is no point enabling it on NVIDIA when most of their cards don't have that much memory. There will be some optimization later, or maybe NVIDIA will allow the hit token via NVAPI and an RT implementation could be possible in OptiX, but not today.

You can change the configuration file, but every hit token goes outside DXR; that's why an amdrtshadow.dll file is used inside the game folder. NVIDIA can't use it, so while the shaders will start, they don't get any results back, and every job ends up in the miss shader. That's why there are no RT shadows in your video.

RT implementations can differ. Older games configured a very limited length for the rays. Godfall configured its RT effect for much bigger scenes, so the rays are longer, and this causes much higher memory usage.
See where my character is standing at the start of my video and compare it against the 6800 XT gameplay video: it's the same. I have also compared more areas and angles in game, and it's the same. Also, even on the 6800 XT the allocated memory is not maxed out. You're wrong about how many rays Godfall uses; it really doesn't use that much, and it's not very impressive at all. Cyberpunk will be coming out in less than two weeks, and hopefully (I think it will) it will be a better representation of next-gen graphics than Godfall.
 
  • Like
Reactions: Leeea

CastleBravo

Member
Dec 6, 2019
119
271
96
See where my character is standing at the start of my video and compare it against the 6800 XT gameplay video: it's the same. I have also compared more areas and angles in game, and it's the same. Also, even on the 6800 XT the allocated memory is not maxed out. You're wrong about how many rays Godfall uses; it really doesn't use that much, and it's not very impressive at all. Cyberpunk will be coming out in less than two weeks, and hopefully (I think it will) it will be a better representation of next-gen graphics than Godfall.

CP2077 was developed as a last-gen title and was intended to be released long ago. I'm sure the graphics look great, but I don't think it will be a good representation of what true next-gen game engines written to take full advantage of PS5, XSX, RDNA2, or even Ampere will be capable of.
 

Hitman928

Diamond Member
Apr 15, 2012
5,244
7,793
136
I'm developing in C++ under Ubuntu. My application has a lot of threads (more than 25).
My questions are:
  1. If I run my app on the graphics card (which is basically a processor with a lot of cores), will I see an improvement in performance?
  2. I know that graphics cards have more cores than an i7 (for example), so my app will have fewer context switches and thus better performance. Is that true?
  3. Do the cores of a graphics card generally have better performance than CPU cores? I know it depends on the graphics card and CPU, but is there a general answer?

Not at all the right place to post this. I suggest you start a thread in the programming forum:

 

DJinPrime

Member
Sep 9, 2020
87
89
51
CP2077 was developed as a last-gen title and was intended to be released long ago. I'm sure the graphics look great, but I don't think it will be a good representation of what true next-gen game engines written to take full advantage of PS5, XSX, RDNA2, or even Ampere will be capable of.
And you don't think it would take years to develop this "next-gen" engine and then the game? All that matters is how good the game looks. The advantage of Ampere over Turing is basically performance; there's nothing new in terms of features. So as long as the engine scales with the hardware (more rays, DLSS enabled, variable rate shading), it's "next-gen". RDNA2 on the PC side didn't add much either; it doesn't have the geometry engine of the PS5. SAM is hit or miss, and even when it's on it doesn't add much. So I don't see anything revolutionary coming out anytime soon. I think the consoles were way overhyped; when people dig into them, like DF do with their game reviews, they see it's just okay compared to what PC gaming is like. Only the new Horizon game looks impressive, but at the end of the DF breakdown he cautioned that we might just be seeing cinematics with no clue what actual gameplay will look like. Happy to be wrong, but I think we got fooled with this "next-gen".
 
  • Like
Reactions: Mopetar and Leeea

zlatan

Senior member
Mar 15, 2011
580
291
136
See the video where my character is standing at the start of the video. Compare it against the gameplay video of the 6800xt, it's the same. And I have compared it more in game in different areas and angles. It's the same. Also, even in the 6800xt the allocated memory is not maxed out. You're wrong about how much rays Godfall uses, it really don't use that much. It's not very impressive at all. Cyberpunk will be coming out in less than 2 weeks, hopefully (and I think it will) it will be a better representation of next gen graphics than Godfall.
I still don't see it. You have the normal shadows.

Certainly, the allocated memory is not all needed. But this is the problem with the PC platform: memory management is inefficient, and memory usage is much higher than the theoretical need. Even if you don't need a lot of the data, it has to be loaded into memory. In the end you need more memory, because otherwise the program won't run, and there is no way on PC to get around the OS limitations. On the consoles this is much better, because the OS won't touch the app's memory, so the program can do anything.

Godfall uses one ray per pixel, with one secondary ray. That is normal or slightly above normal. The problem is the length of the rays: Godfall is very heavy on this parameter compared to other titles, and this is why it needs a lot of VRAM. Limiting the ray length would save a lot of memory.
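A quick back-of-the-envelope check on those per-pixel numbers (the 4K resolution is just for illustration): the ray count itself is modest, which is exactly why the cost sits in the ray length rather than in the number of rays.

    #include <cstdint>
    #include <cstdio>

    int main()
    {
        // One primary ray per pixel plus one secondary ray, as described above.
        const std::uint64_t width = 3840, height = 2160;    // 4K frame
        const std::uint64_t raysPerPixel = 1 + 1;           // primary + secondary
        const std::uint64_t raysPerFrame = width * height * raysPerPixel;

        std::printf("rays per 4K frame: %llu (about %.1f million)\n",
                    static_cast<unsigned long long>(raysPerFrame),
                    raysPerFrame / 1e6);
        return 0;
    }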
 
  • Like
Reactions: Tlh97 and Leeea

DJinPrime

Member
Sep 9, 2020
87
89
51
I still don't see it. You have the normal shadows.

Certainly, the allocated memory is not all needed. But this is the problem with the PC platform: memory management is inefficient, and memory usage is much higher than the theoretical need. Even if you don't need a lot of the data, it has to be loaded into memory. In the end you need more memory, because otherwise the program won't run, and there is no way on PC to get around the OS limitations. On the consoles this is much better, because the OS won't touch the app's memory, so the program can do anything.

Godfall uses one ray per pixel, with one secondary ray. That is normal or slightly above normal. The problem is the length of the rays: Godfall is very heavy on this parameter compared to other titles, and this is why it needs a lot of VRAM. Limiting the ray length would save a lot of memory.
I'm not sure what you're looking at. Here's my original post.
The benchmark is bugged with RT on. If I run the benchmark back to back, the second run is significantly worse, the third is even worse, and so on. In game it seems to be okay. I made two videos comparing RT on and off on my 3070 using ShadowPlay; there does appear to be some performance hit when I played in game with this on. Please watch in HDR if your monitor supports it; if YT rendered it correctly, you should get at least 1440p. HDR makes much more of a visual impact than RT.

This is without RT.
This is with RT turned on via the config file change.

You can compare this to someone with an RX 6800 XT to confirm the RT setting is working.
https://www.youtube.com/watch?v=7QwdbyVEJhY

It seems that with RT you lose a lot of the "fake" shadows? There are just a lot fewer for some reason. Maybe it's more realistic? Or the game isn't using a lot of rays, or they're very limited; perhaps that's why the performance hit isn't that significant. In any event, I don't like it; I prefer the "fake" shadows. Unfortunately this game is unplayable for me at Epic settings; I have to drop to High to get 60 fps. It really is somewhat of a bait and switch: the game was performing really well in the tutorial section, but once you start the real story, fps drops significantly. Getting back on topic, VRAM does not appear to be the problem, or I haven't run into it yet. The game just performs really badly at high resolution; it really needs DLSS even without RT.
I just noticed something in my videos and the 6800 XT one: system RAM usage is significantly higher with RT on. Another mystery with this game. I think I'm going to hold off on playing more of it until the developers can tune it; it really needs a 30% improvement.

In the top video you can see that there are shadows along the left side of the path, and the circular arches also have shadows of the trees. Both of those are missing or significantly reduced when I turned RT on; the two videos are completely different. If you think it's just something wrong with my NV card, then look at the third link, of a 6800 XT running with RT: it has the exact same issue as my video. You really overstate Godfall's RT; it's really not impressive. VRAM is not that significant either: at 4K only 10GB is allocated in the Jansk Benchmark video, compared to 9GB with RT off.

It has the same geometry engine. This unit is also identical on Xbox Series S and X. The APIs are different, but the underlying hardware is the same.
You're the only person who says that. It's widely known that the PS5 has this custom hardware, and RDNA3 is expected to get it.
 

zlatan

Senior member
Mar 15, 2011
580
291
136
I'm not sure what you're looking at. Here's my original post.
I see your post, but there are no RT shadows in the video.

You're the only person who says that. It's widely known that the PS5 has this custom hardware, and RDNA3 is expected to get it.
I saw a lot of PS5s in the last year. I had almost every early sample, and I can safely say that RedGamingTech has no clue what they're talking about.

Both consoles use custom Zen and custom RDNA based on Zen 2 and RDNA 2. The geometry engine is the same in both consoles. The only difference is that Sony uses the name "primitive shaders" for mesh shading, but it's the same thing. Also, this can't be "accelerated": it's a programmable shader stage, so the code has to execute on the ALUs. The PS5 supports the same functionality that sampler feedback provides; they just don't use a fancy name for it, but it's already in the API.

The PS5 has some additions compared to the base RDNA 2 uarch, like the cache scrubbers, which are awesome. There are also some missing instructions.
 

Hitman928

Diamond Member
Apr 15, 2012
5,244
7,793
136
While not exactly on topic, it's a similar one: Watch Dogs Legion with RT on at Ultra settings needs just over 8 GB of VRAM at 1080p. The 3070 basically dies trying to run it at 1080p. Upping the resolution to 4K doesn't increase the VRAM needed much, so it looks like textures and RT are a huge chunk of the required VRAM.

 
  • Like
Reactions: lightmanek

Leeea

Diamond Member
Apr 3, 2020
3,617
5,363
136
Anyone care to speculate on the MSRP of a 3080 Ti with >10 GB of VRAM?

(We all know it is coming, sometime next summer. It is not like we will be able to buy anything until next summer anyway.)
 

sze5003

Lifer
Aug 18, 2012
14,182
625
126
Anyone care to speculate on the MSRP of a 3080 Ti with >10 GB of VRAM?

(We all know it is coming, sometime next summer. It is not like we will be able to buy anything until next summer anyway.)


I heard via rumors that it would be announced in January, but we will see. Probably $999.




Nvidia recommends 10GB VRAM for max settings in Cyberpunk. We've already hit the limit.

[Attachment: Nvidia's recommended settings table for Cyberpunk 2077]

They should note in that table that those settings target 30 fps. Even the in-game settings default the fps target to 30.
 
  • Like
Reactions: Tlh97 and Leeea

sze5003

Lifer
Aug 18, 2012
14,182
625
126
Was finally able to score an EVGA 3080 FTW3 Ultra off Amazon last night. The Discord Notification bot finally worked for me. Not going to wait for the 3080ti anymore.
I'm on that Discord bot too, yet I didn't see any notification for Amazon. I saw some Newegg alerts for the combos.

Do you have the link handy?

I'm prepared to wait only until January to see if a 3080 Ti announcement shows up.
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
Was finally able to score an EVGA 3080 FTW3 Ultra off Amazon last night. The Discord Notification bot finally worked for me. Not going to wait for the 3080ti anymore.

Desperate times regarding many consumer products. Seriously, it's almost impossible to find many products in totally unrelated hobbies. It makes me feel like the world might be coming to an end, but we must ask ourselves if it's worth living in a world where an $800 GPU only has 10GB of RAM. I think not. Revolution.
 

CP5670

Diamond Member
Jun 24, 2004
5,510
588
126
The 3080 Ti is supposedly pushed to Feb 11-17 now. I don't think it will be easily available until the summer, the way things are going with these cards.
 

sze5003

Lifer
Aug 18, 2012
14,182
625
126
The 3080 Ti is supposedly pushed to Feb 11-17 now. I don't think it will be easily available until the summer, the way things are going with these cards.
Yeah, that sucks; I was hoping it would be sooner. I want to upgrade at some point, and it looks like I don't want to wait that long. I've already been waiting for the past two years.
 

maddie

Diamond Member
Jul 18, 2010
4,738
4,667
136
Desperate times regarding many consumer products. Seriously, it's almost impossible to find many products in totally unrelated hobbies. It makes me feel like the world might be coming to an end, but we must ask ourselves if it's worth living in a world where an $800 GPU only has 10GB of RAM. I think not. Revolution.
Did you collaborate on the Tom's Hardware article, "Ray-tracing and end-of-life regrets"? :D
 

moonbogg

Lifer
Jan 8, 2011
10,635
3,095
136
I have to be perfectly honest here: I'm still totally content with my 1080 Ti. The only reason to upgrade is what, Cyberpunk? By the time these GPUs are available, maybe that game will finally be ready as well. I figure by the time Battlefield 6 comes out, maybe GPUs will be available. Although by then the next-gen GPUs will be right around the corner, so it looks like this entire generation is a huge dud.
 

CP5670

Diamond Member
Jun 24, 2004
5,510
588
126
Same here. There are only a few games I want a new card for (Cyberpunk, Control, and FS2020), and I have more than enough older games in my play queue. I actually want HDMI 2.1 more than anything else, so I can use VRR and 120 Hz on the desktop, which a 3080 would do just as well. I wouldn't even mind buying a 3090 (if I could actually find one), but I see no practical benefit in it over the cheaper cards.
 

undertaker101

Banned
Apr 9, 2006
301
195
116
Same here. There are only a few games I want a new card for (Cyberpunk, Control, and FS2020), and I have more than enough older games in my play queue. I actually want HDMI 2.1 more than anything else, so I can use VRR and 120 Hz on the desktop, which a 3080 would do just as well. I wouldn't even mind buying a 3090 (if I could actually find one), but I see no practical benefit in it over the cheaper cards.
You need to get on a Discord and keep an eye out; I have been able to order multiple 3070s/3080s/3090s that way. I canceled most of them, since I'm not a scalper and, thankfully, my day job makes trying to resell for a small profit not worth the hassle. Waiting on the 3080 Ti release, though; hopefully I can snag one at launch, and then my 3080 will go in the FS section near cost.