The "8GB not enough" thread


igor_kavinski

Diamond Member
Jul 27, 2020
8,898
5,245
106
What does it take to be "CPU bound" in Cyberpunk? I'm using a Ryzen 3950X and the highest usage I ever see is an occasional spike to 40%, but most of the time it's around 20%, sometimes less. That's with Eco Mode on as well.
You could try turning one CCX off to see if that makes any difference. It might help the core running the primary thread boost higher.
 

BFG10K

Lifer
Aug 14, 2000
22,184
1,494
126
Spells.... smells danger!
Interesting to see the 3060 have a higher minimum than the 3070TI at 1080p.

Back in the day nVidia and Digital Foundry were truly & deeply (tm) concerned about uneven frames and pacing on AMD cards, so I expect they'll quickly be all over nVidia's situation to expose the problem in great detail. o_O

As an aside I'm amazed how many times I see people on other forums say "oh, it's no big deal, just drop the texture quality on the 3070TI", as if it's somehow acceptable that a cheaper 3060 has no such issue.
 

igor_kavinski

Diamond Member
Jul 27, 2020
8,898
5,245
106
As an aside I'm amazed how many times I see people on other forums say "oh, it's no big deal, just drop the texture quality on the 3070TI", as if it's somehow acceptable that a cheaper 3060 has no such issue.
It's not like Nvidia has to pay from their own pockets for the extra VRAM. It's simply pure greed, to push customers towards their higher end offerings. The 3060 12GB is an anomaly and a boon to the cost conscious gamers whose prayers SOMEHOW got answered. I hope AMD releases the 7500 XT with 12GB VRAM to put the final nail in the coffin for 8GB cards.
 

TESKATLIPOKA

Golden Member
May 1, 2020
1,346
1,636
106
It's not like Nvidia has to pay from their own pockets for the extra VRAM. It's simply pure greed, to push customers towards their higher end offerings. The 3060 12GB is an anomaly and a boon to the cost conscious gamers whose prayers SOMEHOW got answered. I hope AMD releases the 7500 XT with 12GB VRAM to put the final nail in the coffin for 8GB cards.
N33 has a 128-bit bus, so you can't have 12GB of VRAM with it. It would need a 96-bit bus and then use 6 memory modules in clamshell.
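For anyone who wants to sanity-check that, here's a rough back-of-the-envelope sketch. The 32-bit channel per chip and 2GB-per-chip figures are assumptions based on typical GDDR6, not anything AMD has announced:

    # Back-of-the-envelope VRAM math (assumes typical GDDR6:
    # one 32-bit channel per memory chip, 2 GB per chip).
    CHANNEL_BITS = 32
    GB_PER_CHIP = 2

    def vram_options(bus_width_bits):
        chips = bus_width_bits // CHANNEL_BITS
        normal = chips * GB_PER_CHIP           # one chip per channel
        clamshell = 2 * chips * GB_PER_CHIP    # two chips share each channel
        return normal, clamshell

    print(vram_options(128))  # (8, 16)  -> a 128-bit N33 card is 8GB or 16GB
    print(vram_options(96))   # (6, 12)  -> 12GB needs a 96-bit bus, 6 chips in clamshell

So on a 128-bit N33 the realistic options are 8GB or a 16GB clamshell card, which is why a 12GB card would need a cut-down bus.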
 

q52

Member
Jan 18, 2023
57
29
46
I have been playing with the settings in Cyberpunk on my RTX A4500 and it seems like there is a "wall" around 50-60 FPS at both 1440p and 4K. I've been fiddling with the settings, including DLSS and ray tracing on/off and various other configs, and I actually get pretty similar FPS at both resolutions with RT turned off; DLSS seems to make no difference either. It also ends up around 55 FPS average. It's not really a big deal because the game plays completely fine, but it's kind of a head-scratcher how I can go from 1440p to 4K with barely a difference in performance. Thoughts?
 
  • Like
Reactions: Leeea

In2Photos

Golden Member
Mar 21, 2007
1,062
1,088
136
I have been playing with the settings in Cyberpunk on my RTX A4500 and it seems like there is a "wall" around 50-60 FPS at both 1440p and 4K. I've been fiddling with the settings, including DLSS and ray tracing on/off and various other configs, and I actually get pretty similar FPS at both resolutions with RT turned off; DLSS seems to make no difference either. It also ends up around 55 FPS average. It's not really a big deal because the game plays completely fine, but it's kind of a head-scratcher how I can go from 1440p to 4K with barely a difference in performance. Thoughts?
My first thought is some sort of frame cap in the Nvidia control panel, if you don't have one enabled in game. AA perhaps? Do other games go above 60 fps? Or do you have v-sync enabled with a 60Hz monitor?
 

q52

Member
Jan 18, 2023
57
29
46
There is definitely something weird going on with the settings. It's not a hard cap at 60 fps: with vsync on for 60Hz it peaked at 74 fps, and with vsync off it peaked at 68 fps.

Just did another test: I changed the fullscreen resolution to 1024x768, and the loading screen animations ran at 500+ fps, but when I run the graphics benchmark in the settings menu it runs at 15 fps.
 
  • Like
Reactions: Tlh97 and Leeea

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,553
136
Yeah, sounds like vsync is on. Which is fine. If you have a 60Hz display, there is no point in putting out higher FPS than that in a single-player game like Cyberpunk.
 

Leeea

Platinum Member
Apr 3, 2020
2,820
4,277
106
I have been playing with the settings in Cyberpunk on my RTX A4500 and it seems like there is a "wall" around 50-60 FPS at both 1440p and 4K. I've been fiddling with the settings, including DLSS and ray tracing on/off,
You're CPU bound.

There is definitely something weird going on with the settings. It's not a hard cap at 60 fps: with vsync on for 60Hz it peaked at 74 fps, and with vsync off it peaked at 68 fps.

Just did another test: I changed the fullscreen resolution to 1024x768, and the loading screen animations ran at 500+ fps, but when I run the graphics benchmark in the settings menu it runs at 15 fps.
Classic CPU-bound behavior.




It's not really a big deal because the game plays completely fine, but it's kind of a head-scratcher how I can go from 1440p to 4K with barely a difference in performance. Thoughts?
You're just CPU bound.

Your CPU cannot render the game faster than that. In some scenes it can, but in most scenes it cannot.


All of the settings you're changing only affect GPU load.
Change a setting that affects CPU load, and watch your frame rate change.

Tip: the only Cyberpunk setting that affects CPU load is "crowd density" under gameplay. It does not have a big effect, but it reduces the number of people in the crowd.
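If it helps, here's a tiny illustrative sketch of that rule of thumb. The function name, threshold and exact numbers are made up for illustration (q52 reported roughly the same ~55 FPS average at 1440p and 4K); it just encodes "if dropping the resolution barely changes the frame rate, the GPU wasn't the limit":

    # Illustrative only: a rule-of-thumb bottleneck check with made-up numbers.
    # If lowering the resolution barely changes the frame rate, the GPU was not
    # the limit, i.e. the scene is CPU bound.

    def likely_bottleneck(fps_high_res, fps_low_res, tolerance=0.10):
        """Compare average FPS at two resolutions with otherwise identical settings."""
        if fps_low_res <= fps_high_res * (1 + tolerance):
            return "CPU bound: resolution change made little difference"
        return "GPU bound: lower resolution raised the frame rate"

    # q52 saw roughly the same ~55 FPS average at 4K and at 1440p.
    print(likely_bottleneck(fps_high_res=55, fps_low_res=57))  # -> CPU bound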
 

BFG10K

Lifer
Aug 14, 2000
22,184
1,494
126
Hogwarts Legacy

At 1080p with ray tracing, the 3060, 6750XT and even Arc are faster than the 3070, 3070TI and the 3080. Imagine paying $2500 for a 3080 in 2021, then finding 18 months later that it's obsolete.


Comedy gold. I wonder where the "RTX ready" 2060 6GB would rank? Getting upscaled from 320x240 with "aye eye" TV motion interpolation, no doubt.


Oh, and it looks like when a 4xxx card is installed into the system, DLSS 3.0 "accidentally" turns itself on despite the setting being grayed out. You can clearly see the end-game of scum-fraudulent nVidia, trying to silently redefine what a frame is.
 

DAPUNISHER

Super Moderator and Elite Member
Moderator
Aug 22, 2001
25,668
11,835
146
It really is comical seeing the 3060 merk its big bros when they run out of vram.

FG turning on like that is super sus, for sure.

If AMD's driver next week brings performance gains in this game, things will get really exciting.
 

blckgrffn

Diamond Member
May 1, 2003
8,785
2,428
136
www.teamjuchems.com
I have a 6800XT MB sitting around I was planning on selling, but sometimes I wonder if I should sell my 3080 10GB FTW3 and swap in the 6800XT. If it wasn't plumbed into my WC loop I probably would.
Does your block support a 3090? Given you could upgrade for maybe $200 or less depending on your local market, it might be the safe “don’t have to mess with it for a long time” play. I am not sure it's exciting, but the 3090 will probably be relevant for a long time; everything is great about it except how thirsty it can be.

I say that because locally 3080s go for $500-$600 and 3090s for $700-$800. I mean, if you played that right, the difference might be $100.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,182
4,427
136
I have a 6800XT MB sitting around I was planning on selling, but sometimes I wonder if I should sell my 3080 10GB FTW3 and swap in the 6800XT. If it wasn't plumbed into my WC loop I probably would.
I'm in the same situation. I have a 6800XT (Aorus Master) and a 3080 10GB FTW3 Ultra. I was planning to sell one or the other. This most recent data more firmly nudges me in the direction of selling the 3080.
 

blckgrffn

Diamond Member
May 1, 2003
8,785
2,428
136
www.teamjuchems.com
It would be prudent to act quickly. Gamers all over the world may come to the same realization, leading to a flood of 3080s suddenly being available on the used market which would mean a price crash.
The used market for the 3060ti, 3070 and 3070ti is already super crowded, as there is so little real differentiation in performance and feature set. The 3060 firmly sits around $300, and then those three cards, topped by the 3080, crowd in from there to $500. It's sorta nuts, and Nvidia is very pleased that they are all showing their age right about now, so it can come in and sell new cards that are only more efficient and have larger frame buffers but really won't offer much past that - but there will still be a reason to upgrade!

They couldn't care less if all the used cards out there dagger the sub-$300 value market, as they seem to have moved on from there anyway and OEMs are still gonna sell 1650s into 2025 or so.

TBH, I thought the line would hold at 10GB a lot longer because of the Series X fast pool/3080 prevalence but I guess I was wrong.
 

solidsnake1298

Senior member
Aug 7, 2009
302
167
116
nVidia really needs to fix their driver overhead. In CPU-limited situations it's remarkably clear which vendor has less overhead:
View attachment 76230
When AMD said last year (the year before?) that they were going to rework their drivers from the ground up, I didn't really believe they could deliver any tangible improvement. But they delivered and man, how the tables have turned.
 

MrTeal

Diamond Member
Dec 7, 2003
3,265
1,126
136
The Alphacool block I have would fit the FTW3 3090 and 3080 12GB/Ti, but I don't see any available for sale anywhere near me unfortunately.
 
  • Like
Reactions: blckgrffn

GodisanAtheist

Diamond Member
Nov 16, 2006
4,785
4,112
136
nVidia really needs to fix their driver overhead. In CPU-limited situations it's remarkably clear which vendor has less overhead:
View attachment 76230
- That's pretty startling and actually a good gut check, because I was still on the "AMD drivers have more overhead" narrative that dates back to the Maxwell/GCN3 days.

However, I don't think NV has a game-ready driver for HWL; maybe they need to whitelist ReBAR or tweak a few things and the graphs will make more sense again.

It's very unlike NV not to have a game-ready driver ready to drop; WB must have really held the game close to their chest to make sure there was no risk of leakage from either AMD or NV (although that doesn't explain Intel dropping a driver).
 

TESKATLIPOKA

Golden Member
May 1, 2020
1,346
1,636
106
Hogwarts Legacy

At 1080p with ray tracing, the 3060, 6750XT and even Arc are faster than the 3070, 3070TI and the 3080. Imagine paying $2500 for a 3080 in 2021, then finding 18 months later that it's obsolete.

I find these results pretty weird.
The RTX 3060 12GB is a bit faster than the RX 6750XT 12GB, but the RX 6650XT 8GB is significantly faster than the RTX 3080 10GB, and it's almost 2x as fast as the RTX 3070 8GB.
 

IEC

Elite Member
Super Moderator
Jun 10, 2004
14,182
4,427
136
- That's pretty startling and actually a good gut check, because I was still on the "AMD drivers have more overhead" narrative that dates back to the Maxwell/GCN3 days.

However, I don't think NV has a game-ready driver for HWL; maybe they need to whitelist ReBAR or tweak a few things and the graphs will make more sense again.

It's very unlike NV not to have a game-ready driver ready to drop; WB must have really held the game close to their chest to make sure there was no risk of leakage from either AMD or NV (although that doesn't explain Intel dropping a driver).
Steve says in the final thoughts section of the HWUB video that both nV and Intel have HWL-ready drivers, which were used in the benchmarking, while AMD won't have official HWL-ready drivers until their next release...
 
  • Like
Reactions: Cableman
