Info 64MB V-Cache on 5XXX Zen3 Average +15% in Games


Kedas

Senior member
Dec 6, 2018
Well, we now know how they will bridge the long wait until Zen 4 on AM5 in Q4 2022.
Production start for V-Cache is the end of this year, which is too early for Zen 4, so this is certainly coming to AM4.
The +15%, as Lisa said, is "like an entire architectural generation."
 

DrMrLordX

Lifer
Apr 27, 2000
Why should AMD plan to have a half-generation desktop part when it would be replaced by Zen 4 as little as five months later?

Keeping AM4 + DDR4 alive a little longer in the middle of a DDR5 shortage isn't a terrible idea. If they can sell everything they make then it helps them with desktop market share, too. Let's be honest, they can probably sell them all. Only issue is that they can try taking server market share instead with Milan-X, so . . . choices, choices.

Why did it take so long for Intel to fire back when Zen 3 came out? It was a year before they had anything to answer AMD with. Why weren't they, a company many times the size of AMD with far more resources at their disposal, able to bend the laws of reality to excite you? If AMD not having something to beat Intel until sometime this spring is unbearable, I can't imagine your anguish at a year without excitement. However were you able to survive?

Intel's execution has been pretty poor for a while. Few people are genuinely excited about their lineup. And they still won't sell consumers more than 8 P cores, which is a massive disappointment.

Fine. AMD can do no wrong. Happy?

AMD can do plenty wrong. The main truth that people must accept is that the enthusiast PC market isn't their top priority anymore. I'll call myself out for predicting (correctly) price increases on products back during the XT launch and acting like it was a big deal, when the reality is that:

1). AMD did raise prices
2). AMD will continue to raise prices
3). AMD will serve other markets first despite the increased sales volume AND ASPs from the DIY market

What's wrong for us may be right for the shareholders.
 

Kocicak

Golden Member
Jan 17, 2019
Silly me expected all the CPUs from the 5600X to the 5950X to get a V-Cache variant, with those being just a little more expensive and the original processors getting a noticeable discount.
Now I do not see much sense in putting this tech on "incomplete" chiplets (in the 5600X and 5900X); it may not even be possible at all. Does a 5950X3D have any use besides gaming?

A discount in response to the Alder Lake offerings is inevitable. At least here the lower-end Alder Lake parts are already available and the price difference between them and the Ryzens is pretty obvious, so how long is AMD going to wait before they lower prices?
 
Jul 27, 2020
In a significant number of people's eyes this is called "an act of desperation."
The 12900K was an act of desperation against Zen 3, yet it is selling and it has the highest IPC so far. Where is AMD's response? The 12900KS will likely come before Zen 4. People who want the best ST and gaming performance NOW will go with Intel. Those are lost sales for AMD.

Intel's website has a whole list of gaming benchmarks where the 12900K is beating 5950X. Go to AMD's benchmarks page and what are they showing?

[attached screenshot: AMD's benchmark slide]

Benchmarks of an unreleased CPU. So what is the gamer to do? Wait. How many are gonna wait? You know gamers and drug addicts. Both want their fix ASAP.
 

eek2121

Diamond Member
Aug 2, 2005
Don't see the point in that.

5800X3D @ 105 W TDP, PPT 142 W
5900X3D @ 125 W TDP, PPT 142 W

You would have to raise PPT to 162 W. These platforms are designed to specifications, and if the base specification called for 105 W, that would be a problem.
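For what it's worth, here is a minimal arithmetic sketch of that point, assuming the commonly cited AM4 default relation of PPT ≈ 1.35 × TDP; the multiplier is an assumption on my part rather than a spec quote, but it reproduces the 142 W figure for 105 W parts and lands near the ~162 W estimate above for a hypothetical 125 W part.

```python
# Rough sketch of the AM4 socket power math, assuming the commonly cited
# default relation PPT ~= 1.35 * TDP (an assumption, not an official spec quote).

def default_ppt(tdp_watts: float, multiplier: float = 1.35) -> int:
    """Estimate the default Package Power Tracking (PPT) limit from TDP."""
    return round(tdp_watts * multiplier)

print(default_ppt(105))  # -> 142 W, the stock socket limit a 105 W part works within
print(default_ppt(125))  # -> ~169 W, in the ballpark of the ~162 W figure above,
                         #    i.e. well beyond the 142 W base platform specification
```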
Now I do not see much sense in putting this tech on "incomplete" chiplets (in the 5600X and 5900X); it may not even be possible at all. Does a 5950X3D have any use besides gaming?

A discount in response to the Alder Lake offerings is inevitable. At least here the lower-end Alder Lake parts are already available and the price difference between them and the Ryzens is pretty obvious, so how long is AMD going to wait before they lower prices?
Depends on whether other workloads receive a decent uplift. AMD says they won’t, but we will see.
The 12900K was an act of desperation against Zen 3, yet it is selling and it has the highest IPC so far. Where is AMD's response? The 12900KS will likely come before Zen 4. People who want the best ST and gaming performance NOW will go with Intel. Those are lost sales for AMD.

Intel's website has a whole list of gaming benchmarks where the 12900K is beating 5950X. Go to AMD's benchmarks page and what are they showing?

[attached screenshot: AMD's benchmark slide]

Benchmarks of an unreleased CPU. So what is the gamer to do? Wait. How many are gonna wait? You know gamers and drug addicts. Both want their fix ASAP.

Yet Zen 3 is still significantly outselling Alder Lake. There is no competitive action by AMD required. They are still “winning”.

Just because you are bitter AMD didn’t launch a shiny new chip does not mean AMD should launch one. AMD has no reason to do anything right now.
 
Jul 27, 2020
Just because you are bitter AMD didn’t launch a shiny new chip does not mean AMD should launch one. AMD has no reason to do anything right now.
It's not about me. I'm probably not gonna upgrade until Nova Lake. This is about AMD being complacent. I will shut up about that if Zen 4 leaves Intel a whole year without a competitive response. But Intel posting an ST record despite a process disadvantage should be taken as a threat by AMD. They should be springing into action. At the very least, put the 5800X3D into the hands of reviewers to reassure gamers that AMD is still the best.
 

leoneazzurro

Golden Member
Jul 26, 2016
If they had infinite capacity, you would quite probably have the whole Zen 3D lineup launched, plenty of GPU availability, and so on. Sadly they don't, so they have to prioritize the most lucrative markets first, and that was clear from the first signs of the shortage.
 

biostud

Lifer
Feb 27, 2003
What is the sales ratio of the 12900K(S) and the upcoming 5800X3D compared to the more value-oriented processors?

The main reason for having the performance crown is not giving the best gaming experience, but being able to say Intel or AMD is best for gaming. And as it was posted somewhere on these forums, when a non-tech-savvy person has heard that "Intel is the best", then they go to the store and get a 10400 or 11400 processor, or maybe even a 12700K, and if "AMD is the best" they get a 3800X, or if they are lucky a 5800X.

And besides benchmarks, if you took a 5800X, 5800X3D, 5900X, 12700K or 12900K(S) and matched it with a 3080 and played at 1440p or 4K you would get exactly the same gaming experience. So really any of these processors can do what they're supposed to do at their price bracket.
 

moinmoin

Diamond Member
Jun 1, 2017
I would be excited if AMD had a desktop 6nm Zen3+ die (with V-cache and RDNA2 iGPU) by end of January.
It's one thing to speculate about far-off possibilities; it's another to literally ask for an oxymoron to become true as a prerequisite to getting excited.
  • N6 Zen 3+ (which is what the cancelled Warhol likely was) likely wouldn't have made a significant enough performance difference to make a launch worthwhile this close to Zen 4.
  • X3D makes a significant enough performance difference, so it does launch.
  • From what we know, X3D V-Cache is only possible on N7, not N6.
  • There is no MCM iGPU solution on AM4; by all indications the first such is coming with Zen 4 on AM5 later this year.
  • And you only get excited if you see all of that at once by the end of January...
 

epsilon84

Golden Member
Aug 29, 2010
What is the sales ratio of the 12900K(S) and the upcoming 5800X3D compared to the more value-oriented processors?

The main reason for having the performance crown is not giving the best gaming experience, but being able to say Intel or AMD is best for gaming. And as it was posted somewhere on these forums, when a non-tech-savvy person has heard that "Intel is the best", then they go to the store and get a 10400 or 11400 processor, or maybe even a 12700K, and if "AMD is the best" they get a 3800X, or if they are lucky a 5800X.

And besides benchmarks, if you took a 5800X, 5800X3D, 5900X, 12700K or 12900K(S) and matched it with a 3080 and played at 1440p or 4K you would get exactly the same gaming experience. So really any of these processors can do what they're supposed to do at their price bracket.

So are you saying the 5800X3D is designed as a 'mindshare' CPU? I'm not sure how well that will work, considering it's a single SKU and essentially a stopgap for a very particular niche until Zen 4 launches.

I honestly don't believe it will be that well received when it actually launches, mainly due to its relative lack of cores compared to similarly priced (or cheaper) CPUs from both Intel and AMD themselves. Think a $350 12700F here, or perhaps a discounted 5900X if AMD drops prices to compete.

As I said earlier, without a top-to-bottom stack of Zen3D chips, the 5800X3D finds itself in an odd position in the processor hierarchy. It's not truly 'flagship' because it only has 8 cores, so will its claim to fame be as a fast gaming CPU that, in all likelihood, as you mentioned yourself, wouldn't even be noticeable in 'real world' gaming?
 

Ajay

Lifer
Jan 8, 2001
Keeping AM4 + DDR4 alive a little longer in the middle of a DDR5 shortage isn't a terrible idea. If they can sell everything they make then it helps them with desktop market share, too. Let's be honest, they can probably sell them all. Only issue is that they can try taking server market share instead with Milan-X, so . . . choices, choices.

Yes, it seems like it would have been a good idea. AMD were working on N6 for the recently announced monolithic APUs, and spreading the engineering costs over CPU and APU seems like it would have made sense. Perhaps both Intel and AMD expected better availability of DDR5. So, maybe, a bit of an opportunity lost.

Intel's execution has been pretty poor for a while. Few people are genuinely excited about their lineup. And they still won't sell consumers more than 8 P cores, which is a massive disappointment.

The better YouTube influencers had to work harder to praise Alder Lake, and even then the regrets came after issues started popping up. Personally, it took a while for me to consider AMD a better option than Intel. Intel would have to come out with something special to make me reconsider them whenever my next full system build comes due.

AMD can do plenty wrong. The main truth that people must accept is that the enthusiast PC market isn't their top priority anymore. I'll call myself out for predicting (correctly) price increases on products back during the XT launch and acting like it was a big deal, when the reality is that:

1). AMD did raise prices
2). AMD will continue to raise prices
3). AMD will serve other markets first despite the increased sales volume AND ASPs from the DIY market

What's wrong for us may be right for the shareholders.
Absolutely! In particular, AMD is capacity constrained, so moving product to the most profitable segments is even more critical. I would think mobile APUs offer the most $$/wafer (I don't have the numbers, so...), but the advantages of bulk shipping server CPUs and chipsets to hyperscalers grant tremendous economies of scale. AMD needs a healthy desktop market to sell the lower-quality CCDs into, though OEM contracts provide more predictable income through lower-friction sales channels. The fact that AMD does give some importance to DIY is a business decision that, in part I would think, recognizes the PR value** of increasing AMD's brand presence across a broader audience. Intel does the same thing.


** an actual calculable value, not just 'good' feelings. The nod to shareholders shows that at the end of the day, AMD, and all companies, are focused on dollar amounts versus anything else: how do we make the most money off of the resources we are able to afford?
 

Ajay

Lifer
Jan 8, 2001
N6 Zen 3+ (which is what the cancelled Warhol likely was) likely wouldn't have made a significant enough performance difference to make a launch worthwhile this close to Zen 4.
The only thing we know for sure is that the payoff on the engineering spent on an N6 Zen3+ CCD wasn't there; otherwise AMD would have done it. Since AMD can't really 'tweak' CCDs that go into server CPUs (because of re-validation costs and time), there would be a loss in value for the N7 Zen3 CCDs destined for desktops. The profit from Zen3+, on a less expensive node, apparently wasn't worth the engineering effort in the given timeframe. Neither Intel nor AMD could have predicted that there would be a shortage of components for making DDR5 DIMMs.

I think that the value of Zen 3 + V-Cache included PR value** (as a 'halo' product) and the value it provided as an opportunity to refine the 3D packaging process that will be used in future AMD products.

** again, an actual calculable value, not just 'good' feelings. The 'good' feelings help boost AMD's brand value, resulting in higher future sales or profits, etc.
 

Hitman928

Diamond Member
Apr 15, 2012
An N7 cache die stacked on top of an N6 CCD cannot communicate? Is there a plausible technical reason for that?

As far as we know, TSMC has stated that stacking is only available on N7. Now, N6 is part of the "N7 family", so it's unclear if it would still be possible on N6, but at face value it isn't. It could simply come down to N6 being relatively new, so TSMC hasn't had time to develop and validate a stacking flow for that process, while N7 has been around for a while now, so that's what they have ready to use for stacking.
 

biostud

Lifer
Feb 27, 2003
So are you saying the 5800X3D is designed as a 'mindshare' CPU? I'm not sure how well that will work, considering it's a single SKU and essentially a stopgap for a very particular niche until Zen 4 launches.

I honestly don't believe it will be that well received when it actually launches, mainly due to its relative lack of cores compared to similarly priced (or cheaper) CPUs from both Intel and AMD themselves. Think a $350 12700F here, or perhaps a discounted 5900X if AMD drops prices to compete.

As I said earlier, without a top-to-bottom stack of Zen3D chips, the 5800X3D finds itself in an odd position in the processor hierarchy. It's not truly 'flagship' because it only has 8 cores, so will its claim to fame be as a fast gaming CPU that, in all likelihood, as you mentioned yourself, wouldn't even be noticeable in 'real world' gaming?

It exists because it is not too much trouble for AMD, as they are already producing the chiplets, and it lets them steal some of Intel's thunder. AMD is not going to earn buckets of gold on the 5800X3D, and they wouldn't on a 5900X3D or 5950X3D either. Sure, they would be nice to have for connoisseurs like ourselves, but financially they're not that interesting for AMD this close to a Zen 4 launch.

The 5800X3D is a chip for those who would buy a 12900K(S) purely for gaming, to have the best possible no matter what, which in itself is a relatively small market. And we don't know the cost of the CPU yet, or whether AMD plans to cut some prices on their CPUs when it launches. So let's see what happens at launch. :)
 

Makaveli

Diamond Member
Feb 8, 2002
And besides benchmarks, if you took a 5800X, 5800X3D, 5900X, 12700K or 12900K(S) and matched it with a 3080 and played at 1440p or 4K you would get exactly the same gaming experience. So really any of these processors can do what they're supposed to do at their price bracket.

This is an important point a lot of people forget. You will see people going back and forth on forums for days over Intel vs. AMD on the CPU side, and NV vs. AMD (and soon Intel) on the GPU side, over 5 or 10 fps differences, when the most important thing is playable versus non-playable performance.
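A minimal sketch of that reasoning, with entirely made-up numbers: at GPU-bound settings the delivered frame rate is roughly the minimum of the CPU's and GPU's limits, so any CPU that exceeds the GPU limit produces the same experience.

```python
# Hypothetical illustration of the GPU-bound argument: delivered FPS is
# roughly capped by the slower of what the CPU can feed and what the GPU
# can render. All numbers are invented for illustration, not benchmark results.

cpu_fps_limit = {          # frame rates each CPU could drive if unconstrained
    "5800X": 150,
    "5800X3D": 175,
    "5900X": 155,
    "12700K": 160,
    "12900K": 170,
}
gpu_fps_limit_4k = 90      # what a hypothetical 3080-class GPU manages at 4K here

for cpu, cpu_fps in cpu_fps_limit.items():
    delivered = min(cpu_fps, gpu_fps_limit_4k)
    print(f"{cpu}: ~{delivered} fps at 4K")   # every CPU lands on the same ~90 fps
```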
 

eek2121

Diamond Member
Aug 2, 2005
It's not about me. I'm probably not gonna upgrade until Nova Lake. This is about AMD being complacent. I will shut up about that if Zen 4 leaves Intel a whole year without a competitive response. But Intel posting an ST record despite a process disadvantage should be taken as a threat by AMD. They should be springing into action. At the very least, put the 5800X3D into the hands of reviewers to reassure gamers that AMD is still the best.

Zen 4 is literally going to do just that. Around the time Raptor Lake launches Zen 4 will launch, and all current leaks indicate Zen 4 will be faster than Raptor Lake. Alder Lake clearly is not a threat to AMD, and Intel is really struggling to deliver in the server markets as well.

Personally, I try to be neutral about both companies. If Intel released a competitive product I would buy it. However, 241W for “almost” 5950X performance when the 5950X uses 100W less isn’t cutting it, especially when you factor in the price difference.
 

DisEnchantment

Golden Member
Mar 3, 2017
ADL is the better performer in lots of these use cases and is very attractive, especially when starting out on a new platform.

However, being capacity constrained, it won't be a smart move for AMD to price-match and undercut ADL offerings when they still have demand for their chips elsewhere and can get a higher premium too.
They will operate by the book, which is to satisfy demand where they can command higher premiums, let the marketing momentum carry them for a couple of quarters, and launch a proper response.
They are playing a long game, trading a few percentage points of share for much more substantial financial gains through proper product positioning.

They just need to drip-feed some Zen 4 hype from time to time and the average consumer will already be having second thoughts about going with the competition. Their marketing apparatus will start building the hype pretty soon; it is very predictable.
It is easier to regain consumer market share than server market share anyway. Consumers buy the current best thing; server customers buy a roadmap.
 
Jul 27, 2020
146
As far as we know, TSMC has stated that stacking is only available on N7. Now, N6 is part of the "N7 family", so it's unclear if it would still be possible on N6, but at face value it isn't. It could simply come down to N6 being relatively new, so TSMC hasn't had time to develop and validate a stacking flow for that process, while N7 has been around for a while now, so that's what they have ready to use for stacking.
I get it now. I was totally not factoring in the TSMC aspect. AMD's hands are tied because they can't force TSMC to do whatever they need; they can't be TSMC's boss. They have to be nice and professional with them and can only coax them to do their best. Intel has the upper hand here since they have direct control over their foundries and can dedicate the required resources to whatever the CEO/CTO dictates.

Hopefully, the server business boom will allow AMD to buy their own fab, hopefully dedicated for gamers :D
 

Hitman928

Diamond Member
Apr 15, 2012
Hopefully, the server business boom will allow AMD to buy their own fab, hopefully dedicated for gamers :D

Not gonna happen. At most, AMD will move to an Apple-like position and work even more closely with a foundry, more or less co-developing a process specific to their needs. AMD already does this, but they are behind Apple in priority, so they have to wait their turn or accept whatever node specs Apple is co-developing with TSMC.

Having your own fab is great when things are going well. It has benefited Intel tremendously through the years to have not just a world-class fab, but the leading foundry for decades. The problem with fabs is that when things aren't going well, they become a very large anchor on your whole business and can eventually sink the whole thing. If Intel doesn't get their next few nodes right, and in a competitive time frame, we may see a major shakeup at Intel in terms of their business model. Maybe if they can get enough outside customers to buy wafers, they can afford to keep financing their R&D after some more missteps, but that's a big if. There's also the government grants being lobbied for, but we'll see how much of that pie Intel gets versus it being spread out between multiple companies.