8GB VRAM not enough (and 10 / 12)


BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126
8GB
Horizon Forbidden West: the 3060 is faster than the 2080 Super, despite usually competing with the 2070. The 3060 also has a better 1% low than the 4060 and 4060Ti 8GB (see the sketch below for how 1% lows are computed).
[Horizon Forbidden West benchmark chart]
Resident Evil Village: the 3060TI/3070 tank at 4K and are slower than the 3060/6700XT with ray tracing enabled:
RE.jpg
Company of Heroes: the 3060 has a higher minimum than the 3070TI:
CH.jpg
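For anyone unfamiliar with the metric, here's a minimal sketch of one common way a "1% low" figure is derived from a frametime log; the exact method differs between review outlets, and the numbers below are purely illustrative:

```python
# One common way to compute a "1% low" from captured frame times (reviewers' exact methods vary).
def one_percent_low(frametimes_ms):
    """Average the slowest 1% of frames and convert back to FPS."""
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    count = max(1, len(worst) // 100)             # the worst 1% of samples
    avg_worst_ms = sum(worst[:count]) / count
    return 1000.0 / avg_worst_ms

# Toy data: a mostly smooth run with a handful of VRAM-induced stutters.
frametimes = [16.7] * 990 + [50.0] * 10           # milliseconds per frame
print(round(one_percent_low(frametimes), 1))      # 20.0 FPS, despite a ~59 FPS average
```

A card that runs out of VRAM can keep a respectable average while its worst frames collapse, which is why the 1% low column tends to expose 8GB cards before the average FPS column does.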

10GB / 12GB

Reasons why still shipping 8GB cards since 2014 isn't NV's fault:
  1. It's the player's fault.
  2. It's the reviewer's fault.
  3. It's the developer's fault.
  4. It's AMD's fault.
  5. It's the game's fault.
  6. It's the driver's fault.
  7. It's a system configuration issue.
  8. Wrong settings were tested.
  9. Wrong area was tested.
  10. Wrong games were tested.
  11. 4K is irrelevant.
  12. Texture quality is irrelevant as long as it matches a console's.
  13. Detail levels are irrelevant as long as they match a console's.
  14. There's no reason a game should use more than 8GB, because a random forum user said so.
  15. It's completely acceptable for the more expensive 3070/3070TI/3080 to turn down settings while the cheaper 3060/6700XT has no issue.
  16. It's an anomaly.
  17. It's a console port.
  18. It's a conspiracy against NV.
  19. 8GB cards aren't meant for 4K / 1440p / 1080p / 720p gaming.
  20. It's completely acceptable to disable ray tracing on NV while AMD has no issue.
  21. Polls, hardware market share, and game title count are evidence 8GB is enough, but are totally ignored when they don't suit the ray tracing agenda.
According to some people here, 8GB is neeeevaaaaah NV's fault and objective evidence "doesn't count" because of reasons(tm). If you have others, please let me know and I'll add them to the list. Cheers!
 
Jul 27, 2020
15,738
9,806
106
What does it take to be "CPU bound" in Cyberpunk? I'm using a Ryzen 3950X and the highest usage I ever see is the occasional spike to 40%, but most of the time it's around 20%, sometimes less. That's with Eco Mode on as well.
You could try turning one CCX off to see if that makes any difference. It might help the core running the primary thread boost higher.
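One way to sanity-check this while the game is running is to look at per-core rather than total usage: on a 16C/32T 3950X, 20-40% aggregate load can still hide one logical core pegged near 100%. A minimal sketch, assuming the third-party psutil package is installed:

```python
# Sample per-core load while the game is running (uses the third-party psutil package).
import psutil

per_core = psutil.cpu_percent(interval=2.0, percpu=True)   # % load of each logical CPU over 2s
print(f"overall: {sum(per_core) / len(per_core):.0f}%   busiest core: {max(per_core):.0f}%")
# If the busiest core sits near 100% while the overall number stays at 20-40%,
# the game's main thread is the bottleneck, i.e. you are effectively CPU bound.
```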
 

BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126
Spells.... smells danger!
Interesting to see the 3060 have a higher minimum than the 3070TI at 1080p.

Back in the day nVidia and Digital Foundry were truly & deeply (tm) concerned about uneven frames and pacing on AMD cards, so I expect they'll quickly be all over nVidia's situation to expose the problem in great detail. o_O

As an aside I'm amazed how many times I see people on other forums say "oh, it's no big deal, just drop the texture quality on the 3070TI", as if it's somehow acceptable that a cheaper 3060 has no such issue.
 
Jul 27, 2020
15,738
9,806
106
As an aside I'm amazed how many times I see people on other forums say "oh, it's no big deal, just drop the texture quality on the 3070TI", as if it's somehow acceptable that a cheaper 3060 has no such issue.
It's not like Nvidia has to pay for the extra VRAM out of their own pockets. It's simply pure greed, to push customers towards their higher-end offerings. The 3060 12GB is an anomaly and a boon to cost-conscious gamers whose prayers SOMEHOW got answered. I hope AMD releases the 7500 XT with 12GB of VRAM to put the final nail in the coffin of 8GB cards.
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,329
2,811
106
It's not like Nvidia has to pay for the extra VRAM out of their own pockets. It's simply pure greed, to push customers towards their higher-end offerings. The 3060 12GB is an anomaly and a boon to cost-conscious gamers whose prayers SOMEHOW got answered. I hope AMD releases the 7500 XT with 12GB of VRAM to put the final nail in the coffin of 8GB cards.
N33 has a 128-bit bus, so you can't have 12GB of VRAM with it. It would need a 96-bit bus and then use six memory modules in clamshell mode.
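A rough back-of-the-envelope of that constraint, assuming standard 32-bit-wide, 2GB (16Gb) GDDR6 modules:

```python
# VRAM capacity options for a given bus width, assuming 32-bit-wide, 2GB (16Gb) GDDR6 modules.
MODULE_BUS_WIDTH = 32    # bits per module
MODULE_CAPACITY_GB = 2   # 16Gb density

def vram_options(bus_width_bits):
    """Return (normal, clamshell) capacities in GB; clamshell doubles the modules per channel."""
    channels = bus_width_bits // MODULE_BUS_WIDTH
    return channels * MODULE_CAPACITY_GB, 2 * channels * MODULE_CAPACITY_GB

print(vram_options(128))  # (8, 16)  -> no 12GB option on a 128-bit bus
print(vram_options(96))   # (6, 12)  -> 12GB needs 96-bit plus six modules in clamshell
```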
 

q52

Member
Jan 18, 2023
68
36
51
I have been playing with the settings in Cyberpunk on my RTX A4500 and it seems like there is a "wall" around 50-60 FPS at both 1440p and 4K. I've been fiddling with the settings, including DLSS and ray tracing on/off and various other configs, and I actually get pretty similar FPS at both resolutions with RT turned off; DLSS seems to make no difference either. It always ends up around 55 FPS average. It's not really a big deal because the game plays completely fine, but it's kind of a head-scratcher that going from 1440p to 4K makes barely any difference in performance. Thoughts?
 

In2Photos

Golden Member
Mar 21, 2007
1,597
1,636
136
I have been playing with the settings in Cyberpunk on my RTX A4500 and it seems like there is a "wall" around 50-60 FPS at both 1440p and 4K. I've been fiddling with the settings, including DLSS and ray tracing on/off and various other configs, and I actually get pretty similar FPS at both resolutions with RT turned off; DLSS seems to make no difference either. It always ends up around 55 FPS average. It's not really a big deal because the game plays completely fine, but it's kind of a head-scratcher that going from 1440p to 4K makes barely any difference in performance. Thoughts?
My first thought is some sort of frame cap in the Nvidia Control Panel, if you don't have one enabled in-game. AA perhaps? Do other games go above 60 fps? Or do you have vsync enabled with a 60Hz monitor?
 

q52

Member
Jan 18, 2023
68
36
51
There is definitely something weird going on with the settings. It's not a hard cap at 60 fps: with vsync on for 60Hz it peaked at 74 fps, and with vsync off it peaked at 68 fps.

Just did another test: I changed the fullscreen resolution to 1024x768, and the loading screen animations ran at 500+ fps, but when I run the graphics benchmark in the settings menu it runs at 15 fps.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Yeah, sounds like vsync is on. Which is fine. If you have a 60Hz display, there is no point in putting out higher FPS than that in a single-player game like Cyberpunk.
 

Leeea

Diamond Member
Apr 3, 2020
3,599
5,340
106
I have been playing with the settings in Cyberpunk on my RTX A4500 and it seems like there is a "wall" around 50-60 FPS at both 1440p and 4K. I've been fiddling with the settings, including DLSS and ray tracing on/off...
You're CPU bound.

There is definitely something weird going on with the settings. It's not a hard cap at 60 fps: with vsync on for 60Hz it peaked at 74 fps, and with vsync off it peaked at 68 fps.

Just did another test: I changed the fullscreen resolution to 1024x768, and the loading screen animations ran at 500+ fps, but when I run the graphics benchmark in the settings menu it runs at 15 fps.
Classic CPU-bound behavior.

See:



It's not really a big deal because the game plays completely fine, but it's kind of a head-scratcher that going from 1440p to 4K makes barely any difference in performance. Thoughts?
You're just CPU bound.

Your CPU cannot run the game any faster than that. Some scenes it can, but most scenes it cannot.


All of the settings you're changing only affect GPU load.
Change a setting that affects CPU load, and watch your frame rate change.

Tip: the only Cyberpunk setting that affects CPU load is "Crowd Density" under Gameplay. It does not have a big effect, but it reduces the number of people in the crowd.
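To illustrate the point, a toy model of that interaction (the numbers are invented; only the min() relationship matters):

```python
# Toy model: every frame needs CPU work and GPU work, so the slower side sets the frame rate.
def effective_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

cpu_limit = 55  # hypothetical: frames per second the CPU can prepare in a busy scene
for res, gpu_limit in [("1080p", 160), ("1440p", 110), ("4K", 60)]:
    print(res, effective_fps(cpu_limit, gpu_limit))
# 1080p 55 / 1440p 55 / 4K 55 -> resolution (a GPU-side cost) never becomes the limit,
# which is exactly the "wall" described above; only reducing CPU work moves the number.
```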
 

BFG10K

Lifer
Aug 14, 2000
22,672
2,817
126
Hogwarts Legacy

At 1080p with ray tracing, the 3060, 6750XT and even Arc are faster than the 3070, 3070TI and 3080. Imagine paying $2500 for a 3080 in 2021, then finding out 18 months later that it's obsolete.

HL2.jpg

Comedy gold. I wonder where the "RTX ready" 2060 6GB would rank? Getting upscaled from 320x240 with "aye eye" TV motion interpolation, no doubt.


Oh, and it looks like when a 4xxx card is installed into the system, DLSS 3.0 "accidentally" turns itself on despite the setting being grayed out. You can clearly see the end-game of scum-fraudulent nVidia, trying to silently redefine what a frame is.
 

blckgrffn

Diamond Member
May 1, 2003
9,110
3,028
136
www.teamjuchems.com
I have a 6800XT MB sitting around that I was planning on selling, but sometimes I wonder if I should sell my 3080 10GB FTW3 and swap in the 6800XT. If it wasn't plumbed into my WC loop, I probably would.

Does your block support a 3090? Given you could upgrade for maybe $200 or less depending on your local market, it might be the safe "don't have to mess with it for a long time" play. I'm not sure it's exciting, but the 3090 will probably be relevant for a long time; everything is great about it except how thirsty it can be.

I say that because locally 3080s go for $500-$600 and 3090s for $700-$800. I mean, if you played that right, the difference might be $100.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,323
4,904
136
I have a 6800XT MB sitting around that I was planning on selling, but sometimes I wonder if I should sell my 3080 10GB FTW3 and swap in the 6800XT. If it wasn't plumbed into my WC loop, I probably would.

I'm in the same situation. I have a 6800XT (Aorus Master) and a 3080 10GB FTW3 Ultra. I was planning to sell one or the other. This most recent data more firmly nudges me in the direction of selling the 3080.
 

blckgrffn

Diamond Member
May 1, 2003
9,110
3,028
136
www.teamjuchems.com
It would be prudent to act quickly. Gamers all over the world may come to the same realization, leading to a flood of 3080s suddenly being available on the used market, which would mean a price crash.

The used market for the 3060 Ti, 3070 and 3070 Ti is already super crowded, as there is so little real differentiation in performance and feature set. The 3060 firmly sits around $300, and then those three cards, topped by the 3080, crowd in from there to $500. It's sorta nuts, and Nvidia is very pleased that they are all showing their age right about now, so they can come in and sell new cards that are only more efficient and have larger frame buffers but really won't offer much beyond that - but there will still be a reason to upgrade!

They couldn't care less if all the used cards out there dagger the sub-$300 value market, as they seem to have moved on from there anyway, and OEMs are still gonna sell 1650s into 2025 or so.

TBH, I thought the line would hold at 10GB a lot longer because of the Series X fast pool/3080 prevalence but I guess I was wrong.
 

solidsnake1298

Senior member
Aug 7, 2009
302
168
116
nVidia really needs to fix their driver overhead. In CPU-limited situations it's remarkably clear which vendor has less overhead:
View attachment 76230
When AMD said last year (the year before?) that they were going to rework their drivers from the ground up, I didn't really believe they could deliver any tangible improvement. But they delivered and man, how the tables have turned.
 

GodisanAtheist

Diamond Member
Nov 16, 2006
6,715
7,004
136
nVidia really needs to fix their driver overhead. In CPU-limited situations it's remarkably clear which vendor has less overhead:
View attachment 76230

- That's pretty startling and actually a good gut check, because I was still on the "AMD drivers have more overhead" narrative from back in the Maxwell/GCN3 days.

However, I don't think NV has a game-ready driver for HWL; maybe they need to whitelist ReBAR or tweak a few things and the graphs will make more sense again.

It's very unlike NV not to have a game-ready driver ready to drop. WB must have really held the game close to their chest to make sure there was no risk of leakage from either AMD or NV (although that doesn't explain Intel dropping a driver).
 

TESKATLIPOKA

Platinum Member
May 1, 2020
2,329
2,811
106
Hogwarts Legacy

At 1080p with ray tracing, the 3060, 6750XT and even Arc are faster than the 3070, 3070TI and 3080. Imagine paying $2500 for a 3080 in 2021, then finding out 18 months later that it's obsolete.

HL2.jpg
I find these results pretty weird.
The RTX 3060 12GB is a bit faster than the RX 6750XT 12GB, but the RX 6650XT 8GB is significantly faster than the RTX 3080 10GB, and it's almost 2x as fast as the RTX 3070 8GB.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,323
4,904
136
- That's pretty startling and actually a good gut check, because I was still on the "AMD drivers have more overhead" narrative from back in the Maxwell/GCN3 days.

However, I don't think NV has a game-ready driver for HWL; maybe they need to whitelist ReBAR or tweak a few things and the graphs will make more sense again.

It's very unlike NV not to have a game-ready driver ready to drop. WB must have really held the game close to their chest to make sure there was no risk of leakage from either AMD or NV (although that doesn't explain Intel dropping a driver).

Steve says in the final thoughts section of the HWUB video that both NV and Intel have HWL-ready drivers, which were used in the benchmarking, while AMD won't have official HWL-ready drivers until their next release...
 