Info: 64MB V-Cache on 5XXX Zen 3, Average +15% in Games

Page 106

Kedas

Senior member
Dec 6, 2018
355
339
136
Well, we now know how they will bridge the long wait for Zen 4 on AM5 in Q4 2022.
Production start for V-cache is the end of this year, too early for Zen 4, so this is certainly coming to AM4.
The +15%, Lisa said, is "like an entire architectural generation"
 
Last edited:
  • Like
Reactions: Tlh97 and Gideon

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,730
136
So they are the outlier then, since nowhere else uses RT for CPU tests. It would be good if other places did, to corroborate the tests, but without that an outlier result is an outlier result.
OK if you don't like CP2077, look at Troy: A Total War Saga. Most people in the benchmarking business have no idea what settings and scenes are CPU or GPU bound because they do not play the games, they're just making content.
 

Timorous

Golden Member
Oct 27, 2008
1,969
3,850
136
OK if you don't like CP2077, look at Troy: A Total War Saga. Most people in the benchmarking business have no idea what settings and scenes are CPU or GPU bound because they do not play the games, they're just making content.

Unfortunately I cannot find any Troy memory-scaling benchmarks. I see from that YouTuber that it seems to like fast RAM, but these kinds of games don't get tested all that much, and that is something that really should change.

EDIT: Of course, if it likes fast RAM it will probably like the extra L3 cache, since that runs at ~2 TB/s, so ya know, plenty of room for improvement over the standard 5800X.
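A back-of-envelope average memory access time (AMAT) calculation illustrates why a much larger L3 can act like faster RAM. All numbers below are hypothetical assumptions for illustration, not measurements of any actual chip:

```python
# Sketch: AMAT = L3 hit time + L3 miss rate * DRAM penalty.
# Every number here is an illustrative assumption, not a measured value.

def amat(l3_hit_ns, l3_hit_rate, dram_ns):
    """Average memory access time for a simple two-level model."""
    return l3_hit_ns + (1.0 - l3_hit_rate) * dram_ns

# Hypothetical: a 32 MB L3 catches 60% of last-level accesses for a
# cache-hungry game working set, a 96 MB L3 catches 80%.
base = amat(l3_hit_ns=10.0, l3_hit_rate=0.60, dram_ns=70.0)    # plain Zen 3
vcache = amat(l3_hit_ns=11.0, l3_hit_rate=0.80, dram_ns=70.0)  # V-cache-like

print(f"baseline AMAT: {base:.1f} ns")   # 10 + 0.4 * 70 = 38.0 ns
print(f"v-cache AMAT:  {vcache:.1f} ns") # 11 + 0.2 * 70 = 25.0 ns
```

Even with a slightly slower hit time for the bigger cache, the reduced miss rate dominates, which is the same mechanism that makes these games respond to faster RAM.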
 

eek2121

Diamond Member
Aug 2, 2005
3,387
5,014
136
I suspect part of it will be used by the new parts out of the gate. Intel pushing their chips beyond 200W let them benchmark a bit better. AMD doesn't need to go nearly that far, but an extra 50W will let them stretch the bars a bit more.

It also gives some extra room for all core boost on the two chiplet parts. Those were constrained a fair bit because when 16 cores all want to use that TDP and there's an IO die to consider as well, 125W doesn't go quite as far as one would think.

There's also the rumors that all of the Zen 4 CPUs will have a small amount of integrated graphics on the IO die this time around. That will add to the power draw if it's being used and they probably want extra room for it.

Eventually they will release Zen 4D parts and the v-cache is going to need to be powered. With the 5800X3D they lowered the boost clocks in part because the TDP needed to stay the same.

So I don't think there's just one reason for the 170W TDP, but a lot of little reasons. We may not even see the initial Zen 4 CPUs use all of that 170W either, but it will be there in case they need it later and it will ensure that boards will support future parts that actually will draw that full amount of power.
I suspect we will see at least one product with 170W, more thoughts on this below.

I find 170W regrettable but if your competitor is doing it you have to play that game. Really, AMD shouldn't follow but they will because halo parts work with irrational consumer behavior.

But it'll be really neat to see if the 5800X3D is as competitive as AMD claims while not using more power. On a 4 year old process. On an old platform. With old memory.

Intel will do whatever is required to get to number one in DIY/OEM. 400W? 500W? The 12900KS already uses 500W+.

The 5800X3D is showing some decent gains in gaming and a few other areas. Reviews should drop in two days, so we'll see how overall performance is. It will be interesting to see how widespread the lead is over the 12900K/KF.

Regarding AMD and the higher power limits, I suspect AMD doesn't NEED to go to a higher limit, but if Intel is leading the way, adding more breathing room for multi-chiplet products will only help strengthen their lead. I suspect they hope to push Intel up even higher.

Zen 3 was power limited in a number of scenarios, so raising that power limit will help.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
If those results are true, the 5800X3D is a killer gaming CPU at half the price of the 12900K. If DDR5 is used, the perf/$ gets even worse for Alder Lake.
 

nicalandia

Diamond Member
Jan 10, 2019
3,331
5,282
136
If those results are true, the 5800X3D is a killer gaming CPU at half the price of the 12900K. If DDR5 is used, the perf/$ gets even worse for Alder Lake.
Tuned DDR5 worth its salt is very expensive, motherboards are very expensive, and the 12900KS is very expensive. Can someone do a MB + DDR5 + 12900KS + 3090 Ti build against a decent AM4 + DDR4 + 5800X3D + 3080 Ti build and see what the actual gaming/price difference is?
 

Saylick

Diamond Member
Sep 10, 2012
3,943
9,195
136
Tuned DDR5 worth its salt is very expensive, motherboards are very expensive, and the 12900KS is very expensive. Can someone do a MB + DDR5 + 12900KS + 3090 Ti build against a decent AM4 + DDR4 + 5800X3D + 3080 Ti build and see what the actual gaming/price difference is?
That's easily a $500 difference, even if you ignored the GPU.
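A quick tally makes the platform gap concrete. Every price below is a hypothetical early-2022 street price (USD) for illustration only; the point is the delta, which is budget freed up for a bigger GPU:

```python
# Rough platform-cost comparison. All prices are made-up placeholders
# standing in for early-2022 street prices; swap in real quotes to use.

intel_build = {"12900KS": 740, "Z690 DDR5 board": 350, "32GB DDR5 kit": 400}
amd_build   = {"5800X3D": 450, "B550 board": 150, "32GB DDR4-3600 kit": 130}

intel_total = sum(intel_build.values())  # 1490
amd_total = sum(amd_build.values())      # 730

# The difference is what the AM4 buyer can put toward a faster GPU.
print(f"platform delta: ${intel_total - amd_total}")  # $760 with these numbers
```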
 

Det0x

Golden Member
Sep 11, 2014
1,461
4,985
136
i2Hard tests with RT enabled. Unless you enable RT, CP2077 is mostly GPU-bound.
Maxed-out tweaked regular Zen 3:
Cyberpunk 2077, 1080p low = ~252 average fps
Maxed-out tweaked Alder Lake @ ~5.4 GHz with DDR5 @ 7100 MT/s CL30 gets 304 average fps at the same settings.

And for those interested, Troy at the same DDR5 7100 MT/s CL30:
[screenshot: Troy benchmark result]

Alder Lake benchmarks done by "Mumriken" from a Norwegian forum.
 
Last edited:

nicalandia

Diamond Member
Jan 10, 2019
3,331
5,282
136
I believe you have those backwards; the 5800X3D + DDR4 will allow you to buy a faster GPU for the same price vs the 12900KS + DDR5.
Why not both? I mean, if buying a 5800X3D would let you game the same with a 6900XT as you would with a 3090 Ti paired with a 12900K, then why not?
 
  • Like
Reactions: Tlh97 and AtenRa

Mopetar

Diamond Member
Jan 31, 2011
8,447
7,649
136
If those results are true, 5800X3D is a killer Gaming CPU at half the price of 12900K. If DDR5 is used the perf/$ is getting even worse for the AlderLake.

I'm not sure the price matters to anyone who just wants to buy top performance. If the 12900K with DDR5 is on top, having to spend an extra $800 or more won't matter to those buyers. They probably already spent at least that much on getting a slightly better GPU than the next best option.

The most compelling case is the other one that has been brought up: if you have a specific budget, going with Zen 3D/DDR4 lets you shift a lot more of that budget to the GPU.

But at the end of the day it will be a halo product to get AMD more mindshare with the general public. Most gamers are going to end up choosing between a 5600 and a 12400, but maybe these top-end parts help sway average consumers a little bit. Unless you're buying a high-end GPU like a 3080 or a 6800XT, you're not going to be held back by a mid-range CPU. The same goes if you're gaming at 4K or even 1440p in most games. The KS and 3D are both wasted there.

Frankly it just makes me even more excited to see what a Zen 4D is capable of doing or if AMD could create an insane APU with v-cache that's accessible by the GPU.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,865
3,730
136

It's basically a tie with the 12900K. The biggest gain is in Borderlands 3, and there is a regression in CS:GO (as expected).

Well it's a good CPU, but not something that can claim to be the best gaming CPU.
 
  • Like
Reactions: Saylick

nicalandia

Diamond Member
Jan 10, 2019
3,331
5,282
136
It's basically a tie with the 12900K. The biggest gain is in Borderlands 3, and there is a regression in CS:GO (as expected).

Well it's a good CPU, but not something that can claim to be the best gaming CPU.
Are you grasping at straws again? The 5800X3D will be hailed as the new gaming king...! Until the next new CPU is released. It's going to be a short reign.
 

jpiniero

Lifer
Oct 1, 2010
16,568
7,071
136
It's basically a tie with the 12900K. The biggest gain is in Borderlands 3, and there is a regression in CS:GO (as expected).

Well it's a good CPU, but not something that can claim to be the best gaming CPU.

Might need a faster GPU than a 3080 ;)

Far Cry 5 is also a big improvement. There just might not be enough games where the cache makes a difference. Especially when it must be pretty conservative on clocks.
 
Jul 27, 2020
26,538
18,253
146
Especially when it must be pretty conservative on clocks.
The lower clocks are really hurting it. In future iterations, maybe they can put smaller V-cache dies together with dark silicon in between to reduce the thermal impact and keep frequency high.
 

Saylick

Diamond Member
Sep 10, 2012
3,943
9,195
136
The lower clocks are really hurting it. In future iterations, maybe they can put smaller V-cache dies together with dark silicon in between to reduce the thermal impact and keep frequency high.
I got a feeling that the next iteration of V-cache will allow overclocking so the issue of lower clocks is just teething pains associated with implementing V-cache for the first time. Once they get more data/experience with V-cache, I think this will eventually sort itself out. Smaller V-caches won't be necessary.
 

nicalandia

Diamond Member
Jan 10, 2019
3,331
5,282
136
Is there something weird going on with the TPU gaming numbers where at 1440p and 4K the older 10900K and 12400K are often at the top of the chart? GPU bottleneck?

[screenshot: TPU relative gaming performance chart]
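A toy model makes the GPU-bottleneck explanation concrete: delivered frame rate is roughly the minimum of what the CPU and the GPU can each sustain, so once the GPU's ceiling at 1440p or 4K falls below every CPU's throughput, the chart flattens and CPU ordering becomes noise. All fps figures here are invented for illustration:

```python
# Sketch: fps delivered to the player is capped by the slower of the
# CPU-limited and GPU-limited rates. All numbers are made up.

def delivered_fps(cpu_fps, gpu_fps):
    """Simple bottleneck model: the slower component sets the frame rate."""
    return min(cpu_fps, gpu_fps)

# Hypothetical GPU-limited fps for one game at three resolutions:
gpu_limit = {"1080p": 240, "1440p": 160, "4K": 90}

for cpu_name, cpu_fps in {"fast CPU": 220, "midrange CPU": 180}.items():
    for res, gpu_fps in gpu_limit.items():
        # At 4K both CPUs land on the same 90 fps GPU ceiling.
        print(f"{cpu_name} @ {res}: {delivered_fps(cpu_fps, gpu_fps)} fps")
```

With these assumed numbers, both CPUs tie at 1440p and 4K, which is exactly the pattern of old and midrange chips "topping" high-resolution charts.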