8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
22,709
2,972
126
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super despite usually competing with the 2070, and the 3060 has better 1% lows than the 4060 and 4060 Ti 8GB.
Resident Evil Village: the 3060 Ti/3070 tank at 4K and are slower than the 3060/6700 XT when ray tracing.
Company of Heroes: the 3060 has a higher minimum than the 3070 Ti.

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others please let me know and I'll add them to the list. Cheers!
 

Ranulf

Platinum Member
Jul 18, 2001
2,356
1,176
136
To be fair, the game is unoptimized in the test. Then again, it doesn't really look good graphically. There are already comparison videos of it against the first two games from 2006 and 2013, and the washed-out look isn't great. As far as I could tell, I was getting 60 fps (Steam's meter) most of the time on my 4790K and 2060 Super at 1080p (with 2 AI players). The CPU minimum requirement is a 6th-gen i5 too, and the GPU minimum is a GTX 950 / R9 370, lol. The game is also due to hit consoles months after the PC release, apparently.

The RX 6600 scores are the most concerning to me.
 
  • Like
Reactions: Leeea

gamervivek

Senior member
Jan 17, 2011
490
53
91
The whole thing is a chicken and egg situation since the game creators reduce the VRAM usage if the mainstream cards don't have enough VRAM.

Cyberpunk reduces texture quality severely at 1080p to keep it within 8GB. Far Cry 6 went ahead with far more VRAM usage, but even that was patched to reduce the LoD. Reviewers also don't test for long and don't have anything else open or a second screen. You lose almost 1GB of VRAM just watching videos on YouTube.

And that is before texture mods come into the picture, which have long been a staple of PC gaming. Cyberpunk has a few texture mods that improve the blurry textures the game uses in some scenarios, and they can easily push total VRAM usage over 16GB at 4K.
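If you want to see how much headroom the background stuff actually eats, a quick and dirty way (assuming an Nvidia card with nvidia-smi on the PATH; the tool and its query flags are standard, the script itself is just a sketch) is to poll total VRAM usage while YouTube or the second screen is active:

```python
# Minimal sketch: poll total VRAM usage while a browser/video plays in the
# background, to see how much headroom a game actually has left.
# Assumes an Nvidia GPU with nvidia-smi on the PATH.
import subprocess
import time

def vram_used_mb() -> tuple[int, int]:
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = out.strip().splitlines()[0].split(", ")
    return int(used), int(total)

if __name__ == "__main__":
    for _ in range(10):                 # sample for ~10 seconds
        used, total = vram_used_mb()
        print(f"{used} MiB / {total} MiB in use")
        time.sleep(1)
```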
 

Golgatha

Lifer
Jul 18, 2003
12,651
1,514
126
The whole thing is a chicken and egg situation since the game creators reduce the VRAM usage if the mainstream cards don't have enough VRAM.

Cyberpunk reduces texture quality severely at 1080p to keep it within 8GB. Far Cry 6 went ahead with far more VRAM usage, but even that was patched to reduce the LoD. Reviewers also don't test for long and don't have anything else open or a second screen. You lose almost 1GB of VRAM just watching videos on YouTube.

And that is before texture mods come into the picture, which have long been a staple of PC gaming. Cyberpunk has a few texture mods that improve the blurry textures the game uses in some scenarios, and they can easily push total VRAM usage over 16GB at 4K.

I like that Far Cry 6 has an optional install for higher resolution textures which use extra VRAM. I wish more games had this option.
 

blckgrffn

Diamond Member
May 1, 2003
9,128
3,069
136
www.teamjuchems.com
We released the game with downgraded textures because not everyone will notice them and fewer people will complain about the game not running on their low-VRAM cards. By the way, that will be $15. Thank you.

And a resolution above 1080p, ultrawide monitor support, and a host of other features you assumed would be in the base game! Need to invert the mouse? Use a gamepad? This is the day-one DLC pack you are looking for!

And an exclusive multiplayer skin so everyone knows you got the real version of the game!

/s smh :/
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
The video card manufacturers should add a slot that would allow users to add RAM if they want... wouldn't it be great to be able to drop in 16 or 32GB of VRAM?

Not an option. It would significantly impact performance to have the memory in slots. Trace length is a major issue with GPUs. Plus, there is the issue of memory addressing. You basically could not have an empty slot: all slots would have to be filled in order to retain the memory bus width, and all memory would have to match, with no mixing and matching capacities or speeds. It's a big mess.

It's an issue with CPUs too; it's the reason Apple has moved to integrated memory, which has huge performance gains.

But there is a reason no GPU made in the last 25 years has had upgradable memory.
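To put some rough numbers on the bus-width point: each GDDR6 package has a 32-bit interface, so the bus is just 32 bits times the number of populated chips, and every position has to carry a matching chip. A back-of-the-envelope sketch (the chip density and pin speed are common GDDR6 values picked purely for illustration):

```python
# Back-of-the-envelope sketch of why GPU memory comes in fixed, matched sets.
# Each GDDR6 package has a 32-bit interface, so bus width is simply 32 bits
# times the number of populated chips; leave one position empty and the bus
# narrows, taking bandwidth and capacity with it.
CHIP_BUS_BITS = 32

def gpu_memory_config(num_chips: int, gb_per_chip: int, gbps_per_pin: float):
    bus_bits = num_chips * CHIP_BUS_BITS
    capacity_gb = num_chips * gb_per_chip
    bandwidth_gbs = bus_bits * gbps_per_pin / 8   # bits -> bytes
    return bus_bits, capacity_gb, bandwidth_gbs

# A fully populated 256-bit card vs. the same card with one chip "missing":
for chips in (8, 7):
    bus, cap, bw = gpu_memory_config(chips, gb_per_chip=2, gbps_per_pin=16.0)
    print(f"{chips} chips: {bus}-bit bus, {cap} GB, {bw:.0f} GB/s")
# 8 chips: 256-bit bus, 16 GB, 512 GB/s
# 7 chips: 224-bit bus, 14 GB, 448 GB/s
```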
 

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
it can easily push total VRAM usage over 16GB at 4K.
My RX 6900 XT is already limited :(

That was faster than expected.

The video card manufacturers should add a slot that would allow users to add RAM if they want... wouldn't it be great to be able to drop in 16 or 32GB of VRAM?
I suspect that would not play nice with the memory timings.

This used to be a feature but fell out of favor:

Not an option. It would significantly impact performance to have the memory in slots. Trace length is a major issue with GPUs. Plus, there is the issue of memory addressing. You basically could not have an empty slot: all slots would have to be filled in order to retain the memory bus width, and all memory would have to match, with no mixing and matching capacities or speeds. It's a big mess.

It's an issue with CPUs too; it's the reason Apple has moved to integrated memory, which has huge performance gains.

But there is a reason no GPU made in the last 25 years has had upgradable memory.
It's been done:

and is not a new thing:


Plus, there is the issue of memory addressing.
The memory-upgraded cards can see and address the additional memory; the issues are more on the stability end of things.
 
  • Wow
Reactions: igor_kavinski

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
My RX 6900 XT is already limited :(

That was faster than expected.


I suspect that would not play nice with the memory timings.

This used to be a feature but fell out of favor:


It's been done:

and is not a new thing:



The memory-upgraded cards can see and address the additional memory; the issues are more on the stability end of things.

That's a far cry from user-upgradable memory. And this sort of thing has been done on cards for a long time. Replacing the soldered-on chips is super cool, but it is not what the person above was referencing.
 

BoomerD

No Lifer
Feb 26, 2006
62,915
11,306
136
Not an option. It would significantly impact performance to have the memory in slots. Trace length is a major issue with GPUs. Plus, there is the issue of memory addressing. You basically could not have an empty slot: all slots would have to be filled in order to retain the memory bus width, and all memory would have to match, with no mixing and matching capacities or speeds. It's a big mess.

It's an issue with CPUs too; it's the reason Apple has moved to integrated memory, which has huge performance gains.

But there is a reason no GPU made in the last 25 years has had upgradable memory.

Sounds like something that COULD be solved... IF there was enough demand for it. Sell GPUs with a single slot, filled with 8 or 16GB of VRAM, and have a VERY limited QVL for VRAM for that particular GPU. Let folks upgrade the VRAM if/as they want.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Sounds like something that COULD be solved... IF there was enough demand for it. Sell GPUs with a single slot, filled with 8 or 16GB of VRAM, and have a VERY limited QVL for VRAM for that particular GPU. Let folks upgrade the VRAM if/as they want.

One slot would result in traces that are too long. All the memory needs equal-length traces, and the traces have to be as short as possible. Notice how all the memory is located around the GPU; we would need a way to keep the memory in those positions.

But really, it's not going to happen because, as time goes on, memory located outside the die is most likely going to go away. Apple has already moved this way for the CPU and GPU with unified memory. No maker is going to sacrifice performance to allow end users to upgrade the memory, especially since Nvidia uses memory to make cards become obsolete faster than they otherwise would.
 
  • Like
Reactions: Leeea

q52

Member
Jan 18, 2023
68
36
51
I've been playing Cyberpunk on an Nvidia RTX A4500. It has 20GB of memory but still struggles to get a decent framerate at 4K Ultra settings with ray tracing; I get like 15 FPS at best. Had to drop it down to 1440p and low ray tracing to keep 60 FPS (my monitor is 60Hz anyway).

After reading this thread I guess I should do some more formal testing on it, though, to see if I can't find a way to bring back ray tracing at 4K and maintain 60 FPS.
 
  • Wow
Reactions: igor_kavinski

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I've been playing Cyberpunk on an Nvidia RTX A4500. It has 20GB of memory but still struggles to get a decent framerate at 4K Ultra settings with ray tracing; I get like 15 FPS at best. Had to drop it down to 1440p and low ray tracing to keep 60 FPS (my monitor is 60Hz anyway).

After reading this thread I guess I should do some more formal testing on it, though, to see if I can't find a way to bring back ray tracing at 4K and maintain 60 FPS.

You can always enable DLSS if you are OK with the visual trade-offs.
 
  • Like
Reactions: igor_kavinski

TESKATLIPOKA

Platinum Member
May 1, 2020
2,362
2,854
106
I've been playing Cyberpunk on an Nvidia RTX A4500. It has 20GB of memory but still struggles to get a decent framerate at 4K Ultra settings with ray tracing; I get like 15 FPS at best. Had to drop it down to 1440p and low ray tracing to keep 60 FPS (my monitor is 60Hz anyway).

After reading this thread I guess I should do some more formal testing on it, though, to see if I can't find a way to bring back ray tracing at 4K and maintain 60 FPS.
20GB of VRAM is not the problem. You won't get 60 FPS even with an RTX 4090; you have to lower the settings.
 

q52

Member
Jan 18, 2023
68
36
51
You can always enable DLSS if you are OK with the visual trade-offs.
Oh, I forgot I do have it enabled.
I'll do a better breakdown of the settings vs. performance later.
Wasn't sure what to expect with this card; I got it for Stable Diffusion, but being able to play some games is a plus. Didn't have space in my mITX build for a 3090 or similar.
 

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
I've been playing Cyberpunk on an Nvidia RTX A4500. It has 20GB of memory but still struggles to get a decent framerate at 4K Ultra settings with ray tracing; I get like 15 FPS at best. Had to drop it down to 1440p and low ray tracing to keep 60 FPS (my monitor is 60Hz anyway).

After reading this thread I guess I should do some more formal testing on it, though, to see if I can't find a way to bring back ray tracing at 4K and maintain 60 FPS.
Turn ray tracing off. (Outside that one room in the game, can you really tell the difference?)

Revert to 4K.

Enjoy.
 

ZGR

Platinum Member
Oct 26, 2012
2,052
656
136
I’m CPU bound in Cyberpunk getting around 100 fps at 1440p due to all the extra AI/traffic and scripts. Only see around 6GB of VRAM usage. The game runs fantastic and mods make the game actually decent to play.

The mods I see for upscaling textures have not looked that good imo, but maybe I have missed the good ones.

I don’t think RT is worth it in 2077 at all. Same for FSR and DLSS. Combining AMD Fidelity FX and sharpening filter makes the TAA blur acceptable.

It is interesting how different our experiences in the same game are.
 
  • Like
Reactions: Leeea

q52

Member
Jan 18, 2023
68
36
51
What does it take to be "CPU bound" in Cyberpunk? I'm using a Ryzen 3950X and the highest usage I ever see is the occasional spike to 40%, but most of the time it's like 20%, sometimes less. That's with Eco Mode on as well.
 

Leeea

Diamond Member
Apr 3, 2020
3,625
5,368
136
What does it take to be "CPU bound" in Cyberpunk? I'm using a Ryzen 3950X and the highest usage I ever see is the occasional spike to 40%, but most of the time it's like 20%, sometimes less. That's with Eco Mode on as well.
You're CPU bound, most likely.

So your CPU is capable of processing 32 threads across its 16 cores, but the game is never going to use that many.

The typical game is only going to use six or eight threads. Interestingly, it looks like Cyberpunk is using 13 threads. Windows will assign one thread per physical core for performance reasons, so 13 of your 16 cores are being used. Since usage is reported across all 32 logical processors, 13 busy threads shows up as roughly 40%, and even with every physical core loaded you would only see about 50%.

While a core can run two threads, it gives up roughly 30% of individual thread performance to do so. Good for productivity apps, but not ideal for gaming.


The primary thread is the most important one. It will spin off secondary threads to speed things up, but those secondary threads return their work to the primary thread for assembly into the game's state. The frame will not be sent to the GPU until the primary thread finishes its work.

Slow single-threaded performance on the primary thread, or not enough cores to handle the secondary threads, will leave a game CPU bound.

You're most likely CPU bound on the primary thread; a toy sketch of the pattern is below.
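This is not any engine's real code, just an illustration of a primary thread fanning work out to secondary threads and having to wait for all of it before the frame can go to the GPU:

```python
# Toy sketch of the pattern described above; not any engine's actual code.
# The primary (game) thread fans work out to a handful of worker threads,
# but the frame cannot be handed to the GPU until every result has come
# back and been folded into the game state, so a slow primary thread (or
# one slow system) stalls the whole frame.
from concurrent.futures import ThreadPoolExecutor
import time

def simulate(system: str, ms: float) -> str:
    time.sleep(ms / 1000)          # stand-in for AI, physics, audio, ...
    return f"{system} done"

def game_loop(frames: int = 3) -> None:
    systems = {"ai": 4.0, "physics": 3.0, "audio": 1.0, "streaming": 2.0}
    with ThreadPoolExecutor(max_workers=len(systems)) as pool:
        for frame in range(frames):
            start = time.perf_counter()
            # Primary thread spins off secondary work...
            futures = [pool.submit(simulate, name, ms)
                       for name, ms in systems.items()]
            results = [f.result() for f in futures]   # ...and waits for all of it
            # Primary thread assembles the game state, then "submits" the frame.
            frame_ms = (time.perf_counter() - start) * 1000
            print(f"frame {frame}: {len(results)} systems, {frame_ms:.1f} ms")

if __name__ == "__main__":
    game_loop()
```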

--------------------------------------------

The real question is: do you even care if you're CPU bound? If your frame rate is good enough, turn up the GPU quality settings in-game and enjoy.

If you care, go buy a 5800X3D and enjoy. Personally, I would keep the 3950X if I were you; that is a nice CPU.


Yes, a 5800X3D is 16%* faster than a 3950X in Cyberpunk at 1080p with an RTX 3090, but odds are you're not running 1080p or an RTX 3090. So in reality it's going to be more like 8% faster for the $330 it costs to buy a 5800X3D. Doesn't seem worth it to me.


*Yes, the 5800X3D pulls away a lot further in other games, but not so much in Cyberpunk.
 

Mopetar

Diamond Member
Jan 31, 2011
7,848
6,014
136
Some games probably could use 16 cores, but most wouldn't need even half as many. Even if they can use 16 cores, most won't be loaded in most cases.

A lot of game development is driven by console hardware and they're going to be at 8 cores for a while. Even the next generation is likely to leave that unchanged with any deviation being the addition of some OS specific cores that games can't touch.