Info 64MB V-Cache on 5XXX Zen3 Average +15% in Games

Page 147 - AnandTech Forums

Kedas

Senior member
Dec 6, 2018
355
339
136
Well, we now know how they will bridge the long wait to Zen 4 on AM5 in Q4 2022.
Production start for V-Cache is the end of this year, which is too early for Zen 4, so this is certainly coming to AM4.
The +15%, Lisa said, is "like an entire architectural generation."
 
Last edited:
  • Like
Reactions: Tlh97 and Gideon

ZGR

Platinum Member
Oct 26, 2012
2,052
656
136
...the beta BIOS I was running for a similar MSI board already kicked the CPU in the pants when I mistakenly enabled the CPU boost option in the firmware, but this was not stable.

Which BIOS are you using for your Tomahawk? I am still on 1.2.0.6c for the B550 Tomahawk, since I keep fTPM disabled and found a reliable way for the X3D single-core boost to work. I just checked the X570 Tomahawk page and we have the same BIOS releases, unsurprisingly.

Looks like Tomahawk owners are kinda left in the dust when it comes to newer beta BIOSes from MSI, but it isn't a big deal unless a real 5800X3D unlock arrives (doubtful).


If the board has the game mode option in the UEFI, you should try it. The B450 Tomahawk set my 5600G to 4.7 GHz all-core, but all power-saving features were still active. Max voltage was lower than out-of-box PBO.

I've been googling it and people said it was bad, with voltages still too high. I doubt enabling it on the X3D will do anything but increase voltages, but I am curious.

Enabling PBO in the BIOS with a scalar of 2x does increase core clocks by a little bit, effectively overclocking the X3D by a minuscule amount. With my BCLK OC, enabling PBO scalar 2x creates boot instability, which I find kinda interesting. Maybe I need a bit more voltage, which I can increase with MSI's tools.

We can even turn on Game Boost with Dragon Center. Not brave enough to enable it right now.

[Image attachment: boost.jpg]
 

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,274
19,921
146
Oh, I assumed you meant the 5800X, because game mode should definitely overclock it! I am willing to be a guinea pig for the X3D game mode soon, though, haha.


For science. :D
 

blckgrffn

Diamond Member
May 1, 2003
9,110
3,029
136
www.teamjuchems.com
Which BIOS are you using for your Tomahawk? I am still on 1.2.0.6c for the B550 Tomahawk, since I keep fTPM disabled and found a reliable way for the X3D single-core boost to work. I just checked the X570 Tomahawk page and we have the same BIOS releases, unsurprisingly.

Looks like Tomahawk owners are kinda left in the dust when it comes to newer beta BIOSes from MSI, but it isn't a big deal unless a real 5800X3D unlock arrives (doubtful).

I've been googling it and people said it was bad, with voltages still too high. I doubt enabling it on the X3D will do anything but increase voltages, but I am curious.

Enabling PBO in the BIOS with a scalar of 2x does increase core clocks by a little bit, effectively overclocking the X3D by a minuscule amount. With my BCLK OC, enabling PBO scalar 2x creates boot instability, which I find kinda interesting. Maybe I need a bit more voltage, which I can increase with MSI's tools.

We can even turn on Game Boost with Dragon Center. Not brave enough to enable it right now.

So, embarrassingly, I was still running the stock BIOS. I had done several builds lately, including a 5800X3D on an X570S MSI board, and somehow I thought I had already updated it. Nope. So on the earlier BIOS release (the earliest listed on the downloads page) it just ran at 3.4 GHz, no boost, etc.

I had been so pleased with the temps and lack of noise from my PC :D Turns out that was all a mistake on my part.

OK, so I got the firmware updated to the 7/17/22 drop, but boost was just not happening. Actually, it was worse: it was taking forever to spin up during benchmarks, Novabench being my go-to for a two-minute sanity-check bench. Performance dropped 20% from my B450 board. WHAT IS HAPPENING???

Check the power plan: somehow it is set to power saving? What the crap? Change that. Bother to activate Windows. Enable Above 4G Decoding, etc.
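For anyone else chasing the same symptom, the power-plan check can be done from the command line with Windows' built-in powercfg. The GUIDs below are the stock Balanced and High Performance plans; vendor plans like AMD's Ryzen Balanced will have their own GUIDs, which /list shows:

```shell
REM Show which power plan is currently active
powercfg /getactivescheme

REM List all installed plans with their GUIDs
powercfg /list

REM Switch to the stock High Performance plan by its well-known GUID
powercfg /setactive 8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c

REM Or back to stock Balanced
powercfg /setactive 381b4222-f694-41f0-9685-ff5bb260df2e
```

If the board shipped with a power-saving plan active, this is a faster sanity check than digging through the Settings app.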

So now I am at the latest (non-beta now! woo!) bios with official X3D support. Performance is up across the board and I am *done* tuning my main rig for the foreseeable future.

With a Frost Commander (enormous) cooler, I am only getting boost to ~4.4 GHz or so, but temps are OK. It's fine, I am not going to fight it at this point. It does run memory at 3600 with zero issues, and after getting it to run like it's supposed to and seeing solid gains over my previous B450 setup, I am going to wait a bit now.

FWIW, I enabled Game Mode by mistake on a different MSI board (X570S Edge Max WiFi running the beta BIOS for X3D released in May) with an X3D CPU. I must have clicked it; in "basic" UEFI mode all it does is glow a little when it is on. It ran the CPU at ~4.7 GHz all-core, but it wasn't stable, and temps led me to believe there was way too much juice. @Schmide having lost one, I immediately powered off, checked that setting, confirmed it was on, and turned it off.

I am guilty of spending way more time tweaking builds I am selling and then with my own PC I just put it together and expect it to work. I am not sure why that is ¯\_(ツ)_/¯
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
I am guilty of spending way more time tweaking builds I am selling and then with my own PC I just put it together and expect it to work. I am not sure why that is ¯\_(ツ)_/¯

If you screw up someone else's build then at least you aren't stuck with it, maybe? Though one would hope you'd fix problems/replace smoked components before selling the units to customers.
 

blckgrffn

Diamond Member
May 1, 2003
9,110
3,029
136
www.teamjuchems.com
If you screw up someone else's build then at least you aren't stuck with it, maybe? Though one would hope you'd fix problems/replace smoked components before selling the units to customers.

You take my meaning wrong.

I don’t mess or experiment with builds I sell, I test and tune them. Updating firmware, disabling stupid performance settings, setting power limits relative to the cooling capacity of the whole system, setting fan curves, ensuring performance is good with benchmarks, overnight stress testing, etc.

I get a lot of positive feedback for building quiet and stable computers, and I'd like to keep it that way.

And then for my own PC I stuff it in a case in the minimum time and call it good. Only twice can I recall spending as much time on my own machine as I spend on client PCs, and those were long-held builds.
 

blckgrffn

Diamond Member
May 1, 2003
9,110
3,029
136
www.teamjuchems.com
This is the part I was getting at. Tuning doesn't always work out, especially if you are overclocking.

I am not, especially by Intel's definition.

If anything, I tune them to run more within the specifications and not the other way around.

For the folks I build for, they are either happy to pay for parts that are supposed to run faster, or are going to be completely oblivious to single-digit percentage performance gains relative to "stock" performance. Modern overclocking is nothing like it was when you could get massive gains by bumping the FSB, or get an extra all-core GHz by changing a multiplier without even adjusting the voltage. It's not worth my time, but I understand why some bother anyway.
 
Last edited:

ZGR

Platinum Member
Oct 26, 2012
2,052
656
136
With a Frost Commander (enormous) cooler, I am only getting boost to ~4.4 GHz or so, but temps are OK. It's fine, I am not going to fight it at this point.

I am seeing single-core boost happen more often using the AMD Ryzen™ Balanced LowPower plan posted on overclock.net.

I highly recommend downloading PBO2 Tuner, configuring -30 mV per core, and limiting PPT/TDC/EDC, with a Task Scheduler task so the settings apply on Windows login.

This provided a big drop in voltage and temps with near-zero performance loss.
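The Task Scheduler part can be set up from the command line with Windows' built-in schtasks. The install path and the argument format for PBO2 Tuner below are assumptions for illustration; check the tool's thread for the exact CLI syntax it accepts:

```shell
REM Create a logon task that reapplies the undervolt at every login.
REM /RL HIGHEST runs it elevated, which per-core offsets require.
REM NOTE: the path and the "-30 ... -30" per-core argument list are
REM assumed here, not confirmed against PBO2 Tuner's documentation.
schtasks /Create /TN "PBO2Tuner" /SC ONLOGON /RL HIGHEST ^
  /TR "\"C:\Tools\PBO2Tuner\PBO2 tuner.exe\" -30 -30 -30 -30 -30 -30 -30 -30"
```

Doing it as a scheduled task rather than a Startup-folder shortcut avoids the UAC prompt on every boot.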
 

inf64

Diamond Member
Mar 11, 2011
3,685
3,957
136
  • Like
Reactions: lightmanek

Ranulf

Platinum Member
Jul 18, 2001
2,331
1,139
136

Four years brings us to 100 fps on average across all major desktop CPU models with a 3090 Ti at 1080p. Although at least it is with RTX on and "very high" graphics settings for that game. Better than a PS4 or PS5, I guess. More proof that for gaming, anything more than a $200 CPU is pointless except for some edge cases.
 
  • Like
Reactions: Leeea

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
ZEN4D vs RPL looks like murder more than anything else.
Curious to see how it will really be.
Eh, something to think about: Spider-Man keeps the 8 E-cores busy while the game is running by using them to compile shaders in real time. RPL will have twice as many E-cores at several SKU levels while running them faster. It'll be interesting to see how that helps. On console, it uses precompiled shaders.
 
  • Like
Reactions: KompuKare and Leeea

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
Eh, something to think about: Spider-Man keeps the 8 E-cores busy while the game is running by using them to compile shaders in real time. RPL will have twice as many E-cores at several SKU levels while running them faster. It'll be interesting to see how that helps. On console, it uses precompiled shaders.

Several PC games precompile shaders. All the Call of Duty PC ports do this.
 
  • Like
Reactions: Leeea and Tlh97

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,478
14,434
136
Eh, it's been that way for months. The 5900X has been at $400 since maybe April? The big price drops were in June/July, with sales down to $330 on Prime Day and a few other places in the $320-340 range.
Yup, supply and demand. The 5800X3D in many cases (especially gaming) is superior to the 5900X. If they want more cores, they go for the 5950X.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
While the 5800X3D may be better for gaming, I can't help but think that the 5900X will have greater longevity due to the extra cores. I concede that the Broadwell high-cache processors stayed relevant for a very long time, but the comparable quad cores had substantially less cache than what the 5900X has.