Info: 64 MB V-Cache on 5XXX Zen 3, Average +15% in Games


Kedas

Senior member
Dec 6, 2018
355
339
136
Well, we now know how they will bridge the long wait until Zen 4 on AM5 in Q4 2022.
Production start for V-Cache is at the end of this year, which is too early for Zen 4, so this is certainly coming to AM4.
The +15%, Lisa said, is "like an entire architectural generation".
 
  • Like
Reactions: Tlh97 and Gideon

jamescox

Senior member
Nov 11, 2009
637
1,103
136
Are there really enough partially busted V-Cache dies to make a product out of them? That's the part that would be a bit surprising.
It could also be that something went wrong in the hybrid bonding such that not all of the V-Cache can be used, even if it was a known-good die before stacking.
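As a rough back-of-the-envelope on the salvage-volume question (a minimal sketch; every yield and volume number below is an assumption I made up for illustration, not anything AMD has disclosed):

/* Toy estimate of how many salvage candidates a stacking line could throw off.
 * All yields and volumes are invented for illustration only. */
#include <stdio.h>

int main(void) {
    double ccds_per_month = 1000000.0; /* assumed CCDs earmarked for stacking */
    double ccd_good       = 0.95;      /* assumed CCD wafer-probe pass rate */
    double cache_good     = 0.98;      /* assumed KGD rate for the SRAM chiplet */
    double bond_good      = 0.97;      /* assumed hybrid-bonding success rate */

    double fully_working = ccds_per_month * ccd_good * cache_good * bond_good;
    double salvage       = ccds_per_month * ccd_good - fully_working;

    printf("fully working stacks per month: %.0f\n", fully_working);
    printf("salvage candidates per month:   %.0f\n", salvage);
    return 0;
}

Even with optimistic per-step yields you only lose a few percent of otherwise-good CCDs to the stacking step, so whether that trickle is enough volume to feed a retail SKU is exactly the open question.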
 

biostud

Lifer
Feb 27, 2003
18,241
4,755
136
If AMD has capacity in the 7nm/V-Cache production line and can still sell the CPUs with reasonable margins, there is no reason not to put the 5600X3D and 5900X3D on the market.
 

tomatosummit

Member
Mar 21, 2019
184
177
116
Presumably the new models are about getting rid of the partially busted V-Cache chips.
I don't think it's about broken cache. I'd expect there to be an Epyc or even Threadripper (low volume as it is) SKU in the offering if there were enough broken parts.
My bet is that the 5600X3D with less cache is more of an artificial segmentation to price it below the 5800X3D and above/around the 5700X.
 
  • Like
Reactions: Tlh97 and coercitiv

biostud

Lifer
Feb 27, 2003
18,241
4,755
136
I don't think it's about broken cache. I'd expect there to be an Epyc or even Threadripper (low volume as it is) SKU in the offering if there were enough broken parts.
My bet is that the 5600X3D with less cache is more of an artificial segmentation to price it below the 5800X3D and above/around the 5700X.
Yeah, I would think it is to battle the 12600K in the gaming budget segment.
 

Timorous

Golden Member
Oct 27, 2008
1,608
2,753
136
I don't think it's about broken cache. I'd expect there to be an Epyc or even Threadripper (low volume as it is) SKU in the offering if there were enough broken parts.
My bet is that the 5600X3D with less cache is more of an artificial segmentation to price it below the 5800X3D and above/around the 5700X.

Might be infrequent enough that putting 8 of them in a 48-core Milan-X SKU simply makes it too low volume, whereas using those dies to make eight 5600X3Ds gives high enough volume to make it viable.
 

Dave3000

Golden Member
Jan 10, 2011
1,351
91
91
If I wanted to upgrade to a faster CPU for gaming, and to get a better motherboard at this point, would it make more sense to wait for the Ryzen 7800X, X670 motherboards, and DDR5 than to upgrade right now to a Ryzen 5800X3D and an X570S-based motherboard? I currently have a Ryzen 5800X and an Asus B550-F (Wi-Fi) motherboard, and even though my motherboard supports the 5800X3D, I want a better board. I'm just finding it hard to justify purchasing the Ryzen 5800X3D, especially at its current price, when the Ryzen 7800X will be released a few months from now, but I keep thinking about the 5800X3D.
 

ZGR

Platinum Member
Oct 26, 2012
2,052
656
136
I would ask yourself which games you play, and how many benefit from the bigger cache. The games I play do, so it was an easy decision.

I still wanna wait for much faster, lower latency DDR5 to come out before hopping on the Ryzen 7000 train.

If I were to get a 7800X on release, I know I would be upgrading the memory later on because DDR5 is improving quickly. Then I would really want to upgrade to a 7800X3D. And we don't even know how Ryzen 7000's IMC will handle DDR5-7000+.

I want to avoid double dipping like I did in the DDR3 generation.
 

eek2121

Platinum Member
Aug 2, 2005
2,930
4,026
136
Most applications don't benefit from the extra v-cache. It's usually a case of no gains at all or really big gains, with fewer results in between. For most applications the additional clock speed of a 5800X is worth more than the extra cache.

It probably helps out with general system performance if a lot of applications are running because more of their data can stay in the cache, but that's harder to measure. But most people wouldn't want a 5900X3D over a 5900X unless they're primarily using it for one of those applications that does get a big boost.
I am curious if the non-gaming performance scenario changes at all with an even larger cache.

I suspect AMD will be tweaking their CPU designs in the future to help the extra cache provide more performance uplift. It isn't just about bolting cache on; you need to optimize for it.

I am curious as to whether AMD will launch a Zen 4 part with cache on the IO die as well. A unified last-level cache could do wonders for multi chiplet designs.

Good point. I am no expert on this. I have read some articles about wafer testing (probing) and known-good-die (KGD) schemes, which state that these are crucial for chip stacking to work at production scale, and a key benefit of chiplet design. However, these schemes probably do not provide 100% test coverage, so as you say, there may be a lack of certainty and hence a statistical factor involved. In particular, wafer probing may still be limited in what it can do, I guess. I presume AMD is using a die-on-wafer bonding approach, so they may have more certain knowledge about the V-Cache chiplet (which has already been diced and tested to KGD standards, I presume) than about the CCD wafer (which has not yet been diced, and hence can only be probed). But I would think AMD has pretty good knowledge of the functional status of the chiplets before stacking them together.

That said, as @nicalandia pointed out, AMD's V-Cache prototype, shown at Computex last year, was indeed a 12-core dual-CCD chip, with fully functional 64 MB V-Cache on each 6-core (salvaged) CCD, for a full 192 MB total L3 cache. However, this was a prototype, which has not, so far, turned into full scale production. Limiting the demo to 12-core may have been down to package power and thermal limits, I guess, since 16-core is already constrained in the AM4 socket.


IIRC the demo chip was limited to 4 GHz.
I don't think it's about broken cache. I'd expect there to be an Epyc or even Threadripper (low volume as it is) SKU in the offering if there were enough broken parts.
My bet is that the 5600X3D with less cache is more of an artificial segmentation to price it below the 5800X3D and above/around the 5700X.

It is likely due to the way the cache and chiplets are connected.

Speaking of Threadripper, AMD should really release a 5995wx3d. 🤣
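
To put the working-set point above in concrete terms, here is a minimal pointer-chasing sketch (my own illustration, not from any review; the sizes and step counts are arbitrary): latency per dependent load stays flat while the randomly traversed buffer fits in L3 and jumps once it spills to DRAM, which is why workloads tend to see either a big gain from 96 MB of L3 or almost none.

/* Pointer-chase latency sketch: shows the cliff when the working set
 * spills out of the last-level cache. Illustration only. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static unsigned long long rng_state = 88172645463325252ULL;
static unsigned long long xorshift64(void) {      /* tiny PRNG, avoids RAND_MAX limits */
    rng_state ^= rng_state << 13;
    rng_state ^= rng_state >> 7;
    rng_state ^= rng_state << 17;
    return rng_state;
}

static double ns_per_load(size_t *buf, size_t n, size_t steps) {
    /* Sattolo's algorithm: one full random cycle, so the prefetcher can't help. */
    for (size_t i = 0; i < n; i++) buf[i] = i;
    for (size_t i = n - 1; i > 0; i--) {
        size_t j = (size_t)(xorshift64() % i);
        size_t t = buf[i]; buf[i] = buf[j]; buf[j] = t;
    }
    size_t idx = 0;
    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t s = 0; s < steps; s++) idx = buf[idx];  /* dependent loads */
    clock_gettime(CLOCK_MONOTONIC, &t1);
    if (idx == (size_t)-1) puts("unreachable");         /* keep idx live */
    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (double)(t1.tv_nsec - t0.tv_nsec);
    return ns / (double)steps;
}

int main(void) {
    /* Working sets straddling a 32 MB vs. 96 MB L3 (sizes are illustrative). */
    size_t mib[] = {16, 32, 64, 96, 128, 256};
    for (size_t k = 0; k < sizeof(mib) / sizeof(mib[0]); k++) {
        size_t bytes = mib[k] * 1024 * 1024;
        size_t n = bytes / sizeof(size_t);
        size_t *buf = malloc(bytes);
        if (!buf) return 1;
        printf("%4zu MiB working set: %.1f ns per load\n",
               mib[k], ns_per_load(buf, n, 20u * 1000u * 1000u));
        free(buf);
    }
    return 0;
}

On an actual 5800X3D-class part the jump would sit somewhere past the L3 capacity, so a workload whose hot data lands roughly between 32 MB and 96 MB is exactly the kind that responds to V-Cache, while everything smaller or much larger barely moves.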
 

Makaveli

Diamond Member
Feb 8, 2002
4,717
1,051
136

Look at the 1080p, 1440p and 4K performance summary graphs. Are those games and the performance increases worth the expense to you?

As someone who is on a 5800X right now at 3440x1440, the gains are not worth it for me. So unless you are on a 1080p monitor, I wouldn't consider it.
 

Makaveli

Diamond Member
Feb 8, 2002
4,717
1,051
136
I'm on a 1080p 60Hz monitor and have a Ryzen 5800X.

If that were a high-refresh-rate monitor at 1080p I would have leaned that way, but at 60 Hz it's still a no. The extra frame rate you would be getting will never be seen on a 60 Hz screen.

Unless you are one of those "I don't use vsync" guys and like playing with screen tearing.

What GPU are you using?
 

Dave3000

Golden Member
Jan 10, 2011
1,351
91
91
I'm on a 1080p 60Hz monitor and have a Ryzen 5800X.

Well, in X-Plane 11 at 1080p it gets as low as the low 20s at LAX on my 5800X, but that is with a whole bunch of add-ons, including the Felis 747-200 and Traffic Global, that bring performance down significantly compared to the stock X-Plane 11 installation. I'm hoping a CPU upgrade can turn those low 20s into at least 30 (I cap my frame rate at 30) in the big busy areas. In Flight Simulator, in many huge cities with photogrammetry, I'm in the upper 20s with a terrain LOD setting of 300 at 1080p. These are the two games I want to see performance gains in.
 

Dave3000

Golden Member
Jan 10, 2011
1,351
91
91
I'd absolutely get a 1440p144 freesync panel instead of upgrading the CPU if you live in USA. 1440p Freesync is often below $250.

It won't make a difference in X-Plane 11 and Flight Simulator, as I cap my fps to 30 in these games since I'm almost always below 60 fps (when fps is not capped).
 


Dave3000

Golden Member
Jan 10, 2011
1,351
91
91
What makes you think the problem is the CPU and not your GPU?

In X-Plane 11 you can turn on the frame-times display, which tells you whether the CPU or the GPU is the bottleneck; the component with the higher number is the bottleneck for the situation, which is the CPU in my case at LAX, especially at 1080p. I doubt a GPU upgrade will increase the frame rates at 1080p in this game over what I have now, which is a GTX 1080 Ti, because when I upgraded from a GTX 780 Ti to a GTX 1080 Ti I did not get any performance increase at 1080p in this game. However, I plan on upgrading to a 4K HDR IPS monitor (I just haven't found one locally that I wanted) and an RTX 3090 FE, or the upcoming 4080 to 4080 Ti if it gets too close to the RTX 4000 release by the time I can get a 3090 FE at MSRP.
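
For what it's worth, the arithmetic behind that readout is just this (a minimal sketch; the millisecond figures are made up to mimic a CPU-limited scene like LAX):

/* Whichever stage takes longer per frame caps the frame rate.
 * Frame times below are invented for illustration. */
#include <stdio.h>

int main(void) {
    double cpu_ms = 40.0; /* assumed CPU time per frame in a dense scene */
    double gpu_ms = 14.0; /* assumed GPU time per frame at 1080p */
    double limit  = cpu_ms > gpu_ms ? cpu_ms : gpu_ms;
    printf("bottleneck: %s, frame-rate ceiling ~%.0f fps\n",
           cpu_ms > gpu_ms ? "CPU" : "GPU", 1000.0 / limit);
    /* -> bottleneck: CPU, frame-rate ceiling ~25 fps. A faster GPU leaves the
     * ceiling unchanged; only cutting the CPU time per frame raises it. */
    return 0;
}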
 
Jul 27, 2020
16,165
10,240
106
bottleneck for the situation, which is the CPU in my case at LAX, especially at 1080p.

I think you are in that thread (Dave34). It doesn't seem like the X3D will be that much of an improvement for you. Wait for Zen 4 X3D and upgrade the other components in the meantime.
 
  • Like
Reactions: Tlh97 and Makaveli