Info: 64MB V-Cache on 5XXX Zen 3, Average +15% in Games


Kedas

Senior member
Dec 6, 2018
Well, we now know how they will bridge the long wait to Zen 4 on AM5 in Q4 2022.
Production of V-Cache parts starts at the end of this year, which is too early for Zen 4, so this is certainly coming to AM4.
The +15%, Lisa said, is "like an entire architectural generation".
 

Mopetar

Diamond Member
Jan 31, 2011
I'm more inclined to think that a 5900X3D or 5950X3D with a larger V-cache may pose a threat to Zen 4 since that won't have the benefit of V-cache for some time to come.

Only for workloads that benefit from the extra cache. Zen 4 is doubling the L2 cache, which will also cut into the advantage that V-Cache offers to some degree. Between that, the other IPC increases, and the clock speed increases that Zen 4 will see, there probably aren't many cases where a hypothetical 5950X3D would wind up winning, particularly if AMD would still need to limit the voltages.

The dual-chiplet Zen 3D CPUs don't have enough of a reason to exist. There are absolutely cases where they outperform a similar Zen 3 CPU by more than enough to justify the extra cost, but a lot of those applications are even better suited to HEDT or servers, due to the massive core counts available there.

The only other area where Zen 3D justified a consumer part was gaming, and there isn't really a need for more than 8 cores there. AMD can sell twice as many 5800X3D CPUs, and given the gains that exist in specific titles, there will be demand. Anyone already on AM4 who mainly plays Flight Simulator would have a hard time turning down the kind of performance gains that Zen 3D brings to the table.
 

A///

Diamond Member
Feb 24, 2017
Sure, that was just supposing, in some alternative universe, Intel hadn't released a competitive product. And while we spend a lot of time speculating about unreleased products, I think those working in the business have more knowledge about what their competitors will release than we do.
It isn't about releasing a competitive product, it's about releasing a product. Nobody ever got in trouble for buying Intel, to borrow from IBM's old ads. Their half-baked 11th gen was abysmal while Zen 3 was around, but it was still a product release and it still sold well. Intel isn't going to sit around for 2-3 years between product launches. 13th and 14th gen could be terrible, and they'd still sell, because there are still a lot of loyalists who'll buy Intel even if the product is terrible compared to AMD's, because it's Intel, and more "stable."
 

yuri69

Senior member
Jul 16, 2013
Intel isn't going to sit around for 2-3 years between product launches. 13th and 14th gen could be terrible, and they'd still sell, because there are still a lot of loyalists who'll buy Intel even if the product is terrible compared to AMD's, because it's Intel, and more "stable."
Even if Intel kept refreshing their 14nm Skylake clones, they would sell like hot cakes (except in DIY). Why? Because they would be available in the millions, unlike TSMC-bound AMD products.
 

lobz

Platinum Member
Feb 10, 2017
Even if Intel kept refreshing their 14nm Skylake clones, they would sell like hot cakes (except in DIY). Why? Because they would be available in the millions, unlike TSMC-bound AMD products.
AMD's supply is ever-increasing though, most prominently in the server space. A VERY good decision on their part: it provides stable ground for times when one or two products might not enjoy the greatest success, so they can bounce back with the next product without getting into the downward spiral of missing a massive chunk of income.
 

A///

Diamond Member
Feb 24, 2017
Even if Intel kept refreshing their 14nm Skylake clones, they would sell like hot cakes (except in DIY). Why? Because they would be available in the millions, unlike TSMC-bound AMD products.
Kind of my point. I didn't think it needed to be said, to be honest. As the user after you pointed out, AMD's share is on a slow rise. Should Intel's ambitions of running fabs for other companies actually come to fruition, there will be less competition at TSMC for wafers. I could see AMD making grabs.

Let's be honest, Intel could always serve up a modern turd, a Prescott 2.0, and lose to AMD severely. They'll still sell tons of processors. Even when presented with facts not plucked out of thin air, loyalists will buy any old junk.

Speaking as a former Intel loyalist.
 

Mopetar

Diamond Member
Jan 31, 2011

Here's HUB's 40-game analysis of the 5800X3D and 12900K. Both are using some of the best memory available.

Expanding to 40 games didn't really change the overall average much, but it does make a more compelling case for buying one over the other if you have a particular title or a small set of games that prefers one CPU, as there are still a fair number of cases where the Alder Lake CPU has a large advantage.
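(To put numbers on that, a toy example with invented figures, not HUB's data: the overall mean can sit near zero while individual titles swing hard both ways, which is why the per-title breakdown is the part worth reading.)

```python
# Toy numbers, NOT HUB's data: invented per-title % FPS leads for the
# 5800X3D over the 12900K. The mean lands near zero even though
# individual titles swing hard in both directions.
import statistics

leads = [32, 21, 14, 8, 5, 2, 0, -3, -6, -11, -18, -24]

print(f"mean lead: {statistics.mean(leads):+.1f}%")     # ~+1.7%, looks like a tie
print(f"spread: {min(leads):+d}% to {max(leads):+d}%")  # -24% to +32% per title
```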
 

ondma

Diamond Member
Mar 18, 2018

Here's HUB's 40-game analysis of the 5800X3D and 12900K. Both are using some of the best memory available.

Expanding to 40 games didn't really change the overall average much, but it does make a more compelling case for buying one over the other if you have a particular title or a small set of games that prefers one CPU, as there are still a fair number of cases where the Alder Lake CPU has a large advantage.
Amazing difference from title to title.
 

coercitiv

Diamond Member
Jan 24, 2014
Expanding to 40 games didn't really change the overall average much, but it does make a more compelling case for buying one over the other if you have a particular title or a small set of games that prefers one CPU, as there are still a fair number of cases where the Alder Lake CPU has a large advantage.
It comes down to what other workloads you're running: AMD has gaming covered with the 5800X3D and productivity heavy-lifting with the 5900X/5950X, while Intel offers a hybrid approach (pun intended) with the 12700K/12900K, a different mix of gaming/productivity/value. If I were to build a rig with a heavy gaming focus I'd go 5800X3D; better all-round value would mean a 12700K w/ fast DDR4; strong(er) productivity would be the 5900X/5950X; and if I were to build with other people's money... definitely the 12900K w/ DDR5-6400 CL32!

So there you go Pixel Pat, you're still in the game (pun intended)... for a few more months. Can't wait for the RPL/Zen4 showdown.
 

Vattila

Senior member
Oct 22, 2004
Amazing difference from title to title.

Apart from the obvious effect of cache, the corollary of this is that there are things programmers can do that will greatly affect performance on one architecture or the other. Given that no one, presumably, ships game software without considering how it will run on an Intel processor, I conclude that there is still neglect in optimising and testing software on AMD hardware.
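As a minimal sketch of one such knob (sizes invented, no particular engine assumed): whether a hot data structure's working set fits in cache. Random lookups that stay inside L3 run far faster than ones that spill to DRAM, and that access pattern is exactly what decides how much a 96MB L3 is worth in a given title:

```python
# Minimal sketch (sizes invented for illustration): throughput of random
# lookups collapses once the working set outgrows the CPU's caches,
# which is the kind of behaviour a 96MB stacked L3 hides.
import time
import numpy as np

def gather_rate(n_elems, n_lookups=2_000_000):
    """Million random lookups per second from an array of n_elems float64s."""
    data = np.random.rand(n_elems)
    idx = np.random.randint(0, n_elems, size=n_lookups)
    t0 = time.perf_counter()
    data[idx].sum()                       # cache-unfriendly random gather
    return n_lookups / (time.perf_counter() - t0) / 1e6

for size_mb in (4, 32, 96, 256):          # straddles a 96MB L3
    n_elems = size_mb * 1024 * 1024 // 8  # float64 = 8 bytes each
    print(f"{size_mb:>4}MB working set: {gather_rate(n_elems):6.1f}M lookups/s")
```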
 

Shamrock

Golden Member
Oct 11, 1999

Here's HUB's 40-game analysis of the 5800X3D and 12900K. Both are using some of the best memory available.

Expanding to 40 games didn't really change the overall average much, but it does make a more compelling case for buying one over the other if you have a particular title or a small set of games that prefers one CPU, as there are still a fair number of cases where the Alder Lake CPU has a large advantage.

Did someone in here say that the 5800X3D falls on its face at 4K? I think it holds its own, while drawing 100 watts less and costing $350 less.
 

biostud

Lifer
Feb 27, 2003
Would it be possible to compile some kind of data on where Intel or AMD shines, based on game engine, API or game release date, to qualify where the CPU limit lies? (A rough sketch of how that could be done is below.)

Also, I was wondering: when we move from being CPU-limited at 1080p to GPU-limited at 4K, is the load on the CPU the same, or does it change?
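(A rough sketch of the kind of tabulation being asked for, assuming a hand-built CSV of per-game results; the file name and column names are invented for illustration:)

```python
# Rough sketch (file and column names invented): group per-game benchmark
# results by engine, API and release year to see where each CPU leads.
import pandas as pd

# expected columns: game, engine, api, year, fps_5800x3d, fps_12900k
df = pd.read_csv("results.csv")
df["x3d_lead_pct"] = (df["fps_5800x3d"] / df["fps_12900k"] - 1) * 100

for col in ("engine", "api", "year"):
    summary = df.groupby(col)["x3d_lead_pct"].agg(["mean", "count"])
    print(summary.sort_values("mean", ascending=False), "\n")
```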
 

biostud

Lifer
Feb 27, 2003
Review: Ryzen 7 5800X3D is an interesting tech demo that’s hard to recommend | Ars Technica

Have to agree with him. While it's great for games, if you play games AND use your PC for other CPU-intensive stuff, this might not be the best bang for your buck.
And has anything else ever been said about the 5800X3D?

What percentage of desktop users actually does a lot of work in their spare time where they can benefit from more cores?

Those who can actually benefit from all the cores on a 5950X or 12900K know that the 5800X3D is not for them, but for the rest of the PC users who game, it is a nice finish for the AM4 platform. If I were forced to choose today it would be either the ADL 12700K or the 5800X3D, and I would probably be happy either way. As I'm still rocking a computer from 2014, in the end I would probably go with AMD, because the price/performance/power usage is better.
 

MadRat

Lifer
Oct 14, 1999
Why not say gaming at 2160 rather than 4K? Seems like a bit of a stretch when not long ago it was 1024 versus 1280, and then we suddenly started talking in television terms. Monitor manufacturers are trying to scale things up way too fast for the video card hardware's limitations. 4K monitors can display 2K, so that makes more sense to me.

Even though people insist you cannot see the difference between 60fps and 120fps, that's not true. Our brains all sync to different key frames, and different people have completely different abilities to process what they see. People used to use zoom to get a 360 FOV in games, and it drastically reduced how much they needed to process visually. The eyeball can only really see a small focal point in detail, and those details fall off in the periphery. So regardless of whether it's 1080p or 8K, your eyes see the same lack of detail outside the focal cone.
 

Insert_Nickname

Diamond Member
May 6, 2012
Those who can actually benefit from all the cores on a 5950X or 12900K know that the 5800X3D is not for them, but for the rest of the PC users who game, it is a nice finish for the AM4 platform. If I were forced to choose today it would be either the ADL 12700K or the 5800X3D, and I would probably be happy either way. As I'm still rocking a computer from 2014, in the end I would probably go with AMD, because the price/performance/power usage is better.

What I feel is lacking in coverage is the upgrade angle. If you already have a whole AM4 system, the X3D* is just a drop-in upgrade. You don't even need to reinstall Windows or anything (unless you want to, of course).

Pitched that way, it's a very interesting proposition compared with buying a complete Alder Lake system with similar performance. As a new build it's pretty meh; I'd wait for Zen 4, also to get into the AM5 ecosystem. Who knows? You might be able to do a similar drop-in upgrade to the Ryzen 9000-series or whatever in 5 years' time.

*Along with the 5700X. If you're coming from Zen 1(+), it's just a massive upgrade. Real value.
 
Jul 27, 2020
When Zen 4 launches, it would be sensible for reviewers to just benchmark the 5800X3D for gaming and the 5950X for productivity. Benchmarking anything else would just increase the time required to post a review.
 