Info: 64 MB V-Cache on 5XXX Zen 3, Average +15% in Games


Kedas

Senior member
Dec 6, 2018
355
339
136
Well, we now know how they will bridge the long wait for Zen 4 on AM5 in Q4 2022.
Production start for V-Cache is the end of this year, which is too early for Zen 4, so this is certainly coming to AM4.
The +15%, Lisa said, is "like an entire architectural generation".
 
  • Like
Reactions: Tlh97 and Gideon
Nov 26, 2005
15,166
390
126
I won't be able to tell from reviews whether it'll help with Unreal Tournament 4, lol, I'll just have to see for myself, and that'll be hard because most servers are capped at 240 fps while the one I play on is 250, so minimums are what I'll be looking at. I do play CS:GO, and if it gives me 5900X-to-12900K performance then that's cool. I'll probably jump on one as soon as they're in stock at Micro Center, even before reviews. I'll have my 5800X to fall back on if need be, or if the 3D shines then I'll build a media PC with the spare 5800X, hmm, or sell it.
 

DrMrLordX

Lifer
Apr 27, 2000
22,065
11,693
136
$450 is a pretty steep price for an 8 core CPU in 2022. My fears (expectations?) have been confirmed for the 5800X3D re: pricing.

I would be most worried for people who are streamers. A 16c Zen3D would be the go-to for that role (until Genoa arrives, and maybe even after) but 8c ain't gonna cut it.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
$450 is a pretty steep price for an 8 core CPU in 2022. My fears (expectations?) have been confirmed for the 5800X3D re: pricing.

I would have jumped on it as an upgrade from a 3600 had it been more reasonably priced. As it stands, I'll probably just get the $199 5600 and call it a day for AM4. I game at 1440p, so it's not like I'll see the full 15% uplift in games; I'll be surprised if there is much more than a 5-10% difference at 1440p between a 5600 and a 5800X3D. That would also apply to other Zen 3 (or ADL) CPUs that are already enough to max out current-gen GPUs at 1440p.

Honestly, the 5800X3D fills a very small niche: basically 1080p (or competitive) gamers with high-end GPUs who don't mind the inflated price tag.

If you're building a new gaming PC from scratch and don't have unlimited funds (or just want to maximise bang for buck), then a better way to allocate your budget would be to invest in a ~$200 CPU like a 12400 or the upcoming 5600 and put that $250 towards a faster GPU. $250 is enough to go up a GPU tier, and I would bet that in the vast majority of games a 5600/12400 + faster GPU would beat a 5800X3D + slower GPU combo.
Honestly, this chip was probably never meant for you, and either the 5600 or the 12400 would serve you very, very well until your next comprehensive PC upgrade.
 
  • Like
Reactions: Tlh97 and Mopetar

Mopetar

Diamond Member
Jan 31, 2011
8,113
6,768
136
Most gamers are going to be GPU-bound and won't see much if any uplift from a CPU upgrade, especially those who have an AMD or Intel CPU from within the last 3 years.

This is a CPU that should only be considered by those who have top-end GPUs or are running a very high-end GPU at lower resolutions where the bottleneck will shift to the CPU.

The only other exceptions are any titles that have a smaller memory footprint where the increased L3 actually enables most of that memory to fit in the cache and benefit from the decreased latency to access it.
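To make that concrete, here's a minimal sketch (Python/numpy; the sizes and access counts are illustrative only, not from any benchmark) of how random-access time jumps once the working set outgrows the L3:

```python
# Rough sketch: time random reads over working sets of different sizes.
# Once the working set outgrows the L3, each access is far more likely
# to miss cache and pay a trip to RAM. Sizes here are illustrative.
import time
import numpy as np

def ns_per_access(size_bytes, accesses=2_000_000):
    """Approximate nanoseconds per random 8-byte read (numpy overhead included)."""
    n = size_bytes // 8
    data = np.arange(n, dtype=np.int64)
    idx = np.random.randint(0, n, size=accesses)   # random access pattern
    start = time.perf_counter()
    data[idx].sum()                                # gather forces the reads
    return (time.perf_counter() - start) / accesses * 1e9

for mb in (4, 16, 64, 96, 256):                    # around typical L3 sizes
    print(f"{mb:>4} MB working set: ~{ns_per_access(mb * 2**20):.1f} ns/access")
```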
 
  • Like
Reactions: Tlh97 and lobz

Makaveli

Diamond Member
Feb 8, 2002
4,810
1,277
136
Most gamers are going to be GPU-bound and won't see much if any uplift from a CPU upgrade, especially those who have an AMD or Intel CPU from within the last 3 years.

This is a CPU that should only be considered by those who have top-end GPUs or are running a very high-end GPU at lower resolutions where the bottleneck will shift to the CPU.

The only other exceptions are any titles that have a smaller memory footprint where the increased L3 actually enables most of that memory to fit in the cache and benefit from the decreased latency to access it.

True.

The extra cache should affect 0.1% lows the most, I would think, which is something you feel more than the average fps. So many questions around the release of this CPU. I don't believe we have ever had a CPU with this much cache available on the consumer market.
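For reference, here is one common way of computing those lows from a captured frame-time log (a sketch; reviewers differ on the exact definition, and this version averages the slowest N% of frames):

```python
# One common way to compute 1% / 0.1% lows from a frame-time capture
# (e.g. a PresentMon/CapFrameX log). Definitions vary between reviewers;
# this version averages the slowest N% of frames.
import numpy as np

def low_fps(frametimes_ms, pct):
    """Average FPS over the slowest pct% of frames."""
    ft = np.sort(np.asarray(frametimes_ms, dtype=float))[::-1]  # slowest first
    n = max(1, int(len(ft) * pct / 100))
    return 1000.0 / ft[:n].mean()

frametimes = [6.9, 7.1, 7.0, 8.2, 25.0, 7.3, 7.0, 30.5, 7.2, 7.1]  # toy data
print(f"average:  {1000.0 / np.mean(frametimes):.0f} fps")
print(f"1% low:   {low_fps(frametimes, 1):.0f} fps")
print(f"0.1% low: {low_fps(frametimes, 0.1):.0f} fps")
```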
 

cytg111

Lifer
Mar 17, 2008
24,026
13,536
136
Most gamers are going to be GPU-bound and won't see much if any uplift from a CPU upgrade, especially those who have an AMD or Intel CPU from within the last 3 years.

This is a CPU that should only be considered by those who have top-end GPUs or are running a very high-end GPU at lower resolutions where the bottleneck will shift to the CPU.

The only other exceptions are any titles that have a smaller memory footprint where the increased L3 actually enables most of that memory to fit in the cache and benefit from the decreased latency to access it.

PUBG, 5800X, 3080 Ti, 1440p <- 100% CPU bound.
 

MadRat

Lifer
Oct 14, 1999
11,946
265
126
If it becomes popular, then perhaps it's more than a niche.

Would this impact cryptology at all? It seems like a workload that favors cache speed over a large working memory.
 

eek2121

Diamond Member
Aug 2, 2005
3,100
4,398
136
PUBG, 5800X, 3080 Ti, 1440p <- 100% CPU bound.
When is the last time you played PUBG? My 3090 is the bottleneck on my system. I have a 5950X.

EDIT: CPU usage is 30%. A LONG time ago your statement was true; however, a while back they did a decent overhaul of the engine, and it is now GPU bound on my system. I play with G-Sync/FreeSync and I hit the 120 Hz cap I have set. Haven't tried benchmarking the highest framerate.

EDIT: In case you think it is the cap, a friend of mine has a 3080 and a 5900X and plays uncapped, and he's also GPU bound. He also plays at 1440p where I play at 5120x1440.
 
  • Like
Reactions: Tlh97 and Mopetar

cytg111

Lifer
Mar 17, 2008
24,026
13,536
136
When is the last time you played PUBG? My 3090 is the bottleneck on my system. I have a 5950X.

EDIT: CPU usage is 30%. A LONG time ago your statement was true; however, a while back they did a decent overhaul of the engine, and it is now GPU bound on my system. I play with G-Sync/FreeSync and I hit the 120 Hz cap I have set. Haven't tried benchmarking the highest framerate.

EDIT: In case you think it is the cap, a friend of mine has a 3080 and a 5900X and plays uncapped, and he's also GPU bound. He also plays at 1440p where I play at 5120x1440.

Well, I got a 240 Hz display with G-Sync. It's true they did an update that put the GPU at ~9x%, but I think they rolled it back due to crashes... I am running that 25-30% CPU and like 70% GPU right now (120-160 fps normally), and I guesstimate the 30% translates into at least one core at 100%, which is the bottleneck.
(I can get dips lower and hit 240 as well on rare occasions... never 240 when it matters.)
 

Kedas

Senior member
Dec 6, 2018
355
339
136
We may not be able to overclock the V-Cache version.
Would this be a heat issue, or some timing issue that AMD fears?

Seems a bit strange; if it's heat, wouldn't we detect the overclock making it too hot? Maybe it can break easily (mechanically) when it gets hot.
 

StefanR5R

Elite Member
Dec 10, 2016
5,926
8,863
136
I am running that 25-30% CPU [...]
This means little on a multi-thread/multi-core/multi-core-complex CPU. It just says that the workload does not saturate the CPU, but it doesn't say whether or not the CPU bottlenecks the workload.

First you need to look at CPU usage per program thread. Next, if this indicates that one or more program threads may each be fully using a hardware thread, one would have to figure out whether this is limited by the CPU's execution performance or its I/O performance. If it is the latter, a larger level-3 cache will help. To what extent it will help depends on the program's memory access patterns, of course.
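For example, a quick way to eyeball the per-thread side of that is something like the following sketch, which assumes the psutil Python library and a hypothetical process id, and reports how busy the hottest threads were over a sampling window:

```python
# Sketch: sample per-thread CPU time of a running process with the
# psutil library, to spot a single thread pinning one core while
# overall CPU usage looks low. PID is a hypothetical process id.
import time
import psutil

PID = 12345          # hypothetical: replace with the game's process id
WINDOW = 5.0         # sampling window in seconds

proc = psutil.Process(PID)
before = {t.id: t.user_time + t.system_time for t in proc.threads()}
time.sleep(WINDOW)
after = {t.id: t.user_time + t.system_time for t in proc.threads()}

busiest = sorted(after, key=lambda tid: after[tid] - before.get(tid, 0.0), reverse=True)
for tid in busiest[:5]:
    pct = (after[tid] - before.get(tid, 0.0)) / WINDOW * 100
    print(f"thread {tid}: ~{pct:.0f}% of one core")
```

A single thread sitting near 100% there is the hint that one core is the bottleneck even when overall usage reads 25-30%.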

We may not be able to overclock the V-Cache version.
Would this be a heat issue, or some timing issue that AMD fears?

Seems a bit strange; if it's heat, wouldn't we detect the overclock making it too hot? Maybe it can break easily (mechanically) when it gets hot.
A CPU which is not overclocked may get hot too. Plus, all recent desktop-tier CPUs are already factory-overclocked.
 

cytg111

Lifer
Mar 17, 2008
24,026
13,536
136
This means little on a multi-thread/multi-core/multi-core-complex CPU. It just says that the workload does not saturate the CPU, but it doesn't say whether or not the CPU bottlenecks the workload.

First you need to look at CPU usage per program thread. Next, if this indicates that one or more program threads may each be fully using a hardware thread, one would have to figure out whether this is limited by the CPU's execution performance or its I/O performance. If it is the latter, a larger level-3 cache will help. To what extent it will help depends on the program's memory access patterns, of course.


A CPU which is not overclocked may get hot too. Plus, all recent desktop-tier CPUs are already factory-overclocked.
While you are right that it's not definitive proof, it is a pretty big, super big, maybe even borderline gigantic hint that this is the case. Kernel times? I suppose, but that would be super bad coding (sort of idling the rest of a core on I/O or whatever… no), I can't imagine anyone shipping such a thing.
 

StefanR5R

Elite Member
Dec 10, 2016
5,926
8,863
136
[...] super bad coding (sort of idling the rest of a core on I/O or whatever… no), I can't imagine anyone shipping such a thing.
By "I/O performance" I was actually thinking of the processor's RAM I/O performance mostly, not so much about PCIe I/O, and not at all about peripheral I/O.

On a related note, I don't know about game engines, but GPGPU applications often have a polling thread which feeds the GPU. GPGPU performance tends to depend to a degree on the speed of the core that hosts this polling thread.
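As a toy illustration of that pattern (a pure Python stand-in with made-up names, no real GPU work): a feeder thread that busy-polls a work queue will peg one core even while overall CPU usage stays low.

```python
# Toy stand-in for a GPGPU feeder thread: it busy-polls a work queue
# ("submitting" items to a pretend device), which pegs one core even
# though overall CPU usage across all cores stays low. No real GPU here.
import queue
import threading
import time

work = queue.Queue()
done = threading.Event()

def feeder():
    while not done.is_set():
        try:
            item = work.get_nowait()   # poll without blocking
        except queue.Empty:
            continue                   # spin: this is what burns the core
        # a real feeder would submit `item` to the GPU here

t = threading.Thread(target=feeder, daemon=True)
t.start()
for i in range(100):                   # produce some work
    work.put(i)
    time.sleep(0.01)
done.set()
t.join()
```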
 

Mopetar

Diamond Member
Jan 31, 2011
8,113
6,768
136
We may not be able to overclock the V-Cache version.
Would this be a heat issue, or some timing issue that AMD fears?

Seems a bit strange; if it's heat, wouldn't we detect the overclock making it too hot? Maybe it can break easily (mechanically) when it gets hot.

You can't really overclock Zen 3 all that well to begin with. Sure you can feed it a lot more power, but the gains are minimal and not really worth the effort.

AMD has already given some actual results, and we know that the 15% performance figure already accounts for the lower clock speed.
 

Hans Gruber

Platinum Member
Dec 23, 2006
2,305
1,218
136
You can't really overclock Zen 3 all that well to begin with. Sure you can feed it a lot more power, but the gains are minimal and not really worth the effort.

AMD has already given some actual results, and we know that the 15% performance figure already accounts for the lower clock speed.
But is the 15% performance gain at 1080p only, or does it improve 1440p and 4K performance as well? Once you get above 1080p, the newest CPUs are all much the same. The other point of contention is what Intel will release to counter any Zen 3 offerings.
 

Mopetar

Diamond Member
Jan 31, 2011
8,113
6,768
136
But is the 15% performance gain at 1080p only, or does it improve 1440p and 4K performance as well? Once you get above 1080p, the newest CPUs are all much the same. The other point of contention is what Intel will release to counter any Zen 3 offerings.

At 4K you'll almost certainly be GPU bound, possibly to the point where a 5600X or even a four-core Alder Lake won't perform any worse. 1440p is likely to be GPU bound in most titles as well.

The only thing Intel will have is the i9 model with even higher clock speeds that's been talked about before.

Anyone buying either is doing it just to have the fastest possible CPUs or because they play competitively and want any extra advantage they can get.
 
Jul 27, 2020
20,040
13,737
146
Intel bent over backwards for Apple and gave them 128 MB of eDRAM as recently as the 8th-gen Core series, but they refuse to remove the E-cores and replace them with eDRAM that would actually help gamers. Intel is all about forcing on everyone what they think is the right way.