This will probably sound like a bit of a 'rant', but I do have a point to make here.
It's mostly about reviews (and their authors) and how they seemingly never want to touch resolutions above 1080p for CPU benchmarks. Are they being told by AMD (or Intel, when applicable) not to do it because the 'bigger numbers' don't show up as much beyond 1080p, for fear that potential buyers would perceive the chip as "not as good as they would otherwise think it is"?
I do get that most reviews focus their game benchmarks at 1080p, since games are more CPU-dependent at that resolution, and obviously if you do 1080p benchmarks with the latest CPU tech then - amazingly, lo and behold - big numbers show up! Incredible, isn't it? But more and more PC gamers are buying 2K / 1440p monitors, and it's unfortunately a resolution range that's often missing from CPU benchmarks (for GPU benchmarks it's always there, of course); that crowd should be just as able to form a proper opinion on the pros and cons at such resolutions. Not 'everyone' games at 1080p anymore; it's 2022, not 2014.
The irony is that the YouTube review above actually has 1440p resolution available, of all things.
The very few 5800X3D reviews I've seen that did some tests at 1440p (and above) show improvements in some specific games, and very minimal gains in others. In the best cases at 1440p, it's about +30 FPS in the games where this can be observed (and here I mean +30 FPS over the regular 5800X, not over the best of Intel's CPUs). In a game like Far Cry 6, the extra cache apparently does show good improvements. So yes, even at 1440p it seems a good bump in performance can happen in some games.
But here's my problem: in the reviews where that sort of performance is seen, they're testing with an RTX Freakin' 3090. And I don't think that card has even reached 1% of gamers to this day, so many months after its release. What's the point of bumping your performance by +30 FPS with a 5800X3D if you already had 120+ FPS anyway? Amazing, now you have 150 FPS. Gone is the slide-show you had to endure previously! Praise 150 FPS! There's another review where the 5800X3D was not even compared to the regular 5800X at all, lol; it's just put up against the top Intel line and the best of Ryzen, and that's it.
Anyway, honestly, I don't see a lot of viable reasons to buy something like a 5800X3D if someone is gaming at 1080p, which to me is the biggest irony of all. I wouldn't say something like "Hey, if you want good performance at 1080p, just buy a 5800X3D and forget about a good GPU; with this new extra cache, a good CPU alone is enough!". Sincerely, anyone with the money to buy this new CPU should have the cash for all the rest too. Just thinking out loud here, but don't tell me someone out there will work hard for weeks if not months to save for a brand new shiny 5800X3D only to resort to using a GTX 970 with it. Please.
Instead, I'd recommend something like a Ryzen 5 5600X, which is more than plenty for 1080p gaming, and a decent GPU to accompany it. Certainly not a CPU that costs $450 (current MSRP). To me that's like saying that with a Ferrari you could now drive at 200 miles per hour on your local boulevard to go do your groceries, but no one needs a Ferrari to drive at that speed for such a mundane everyday task (yes, sorry, but 1080p gaming in 2022 is mundane, common, and does not require this kind of power).
The long and short of it is differences in CPUs become extremely small at resolutions higher than 1080p.
If you're like me, on an old 6600K processor with a 1440p/144hz monitor, the real news story here is how many folks are unloading their 5600x or 5800x processors for firesale prices on the second hand market right now (either for Intel's Alderlake or for the 5800x3D).
The smart play if you're looking to save a buck at the moment is to go scoop up a used high end processor from the "prior gen" on the second hand market.
Concerning the used market: I agree; buying gear that is only a year or 2 old is an excellent choice.
But all the different reasons people give for not wanting a 3D obfuscate what it actually is: a rare beast. The fact that the 3D is as fast as it is for gaming while being a power sipper is where the value is at. When was the last time you could use a $30 cooler on a flagship gaming CPU and get full performance? That it won't be blasting out up to 150W of heat in some games would be appreciated this summer, too. Nor do you have to worry about whether your board's power delivery is up to the task; it is. And you can throw it in a board that is 4yrs old. It is a truly rare beast IMO.
In response to the reviews rant: I agree, and I have been ranting about them myself for years now. I don't care how sound the methodology is; it's inadequate, and given far too much weight in PC tech forums/discussions. Influencers indeed. All the reasons for why they do it, and how they do it, while valid, can in the final analysis be encapsulated in a single word - money. Got to strike while the proverbial iron is hot. And when they do follow-ups, it is just more games, but no gaming; the ironing is delicious.
Most of our old, crusty crowd are utterly indoctrinated into accepting and repeating the entrenched dogma. Thinking that those bar graphs and text articles are all they need? Utterly laughable from my perspective. You do you; I climbed off that bandwagon a long time ago.
Instead of building one killer system, I use a bunch of different hardware. I don't buy parts, I rent them. Preferably low to mid range parts. The result is that my experiences sometimes diverge from the dogma that is incessantly repeated. Dogma based on the same bar graphs and methodologies that are almost as old and tired as our crowd/membership is.
I daily drive the stuff; I don't throw it on the bench, run my bot-script-level routine, and compile data. I will give you a great example of an inadequacy that has nothing to do with gaming performance: the 5700 XT reviews. I won't call him out by name, but one of the big guys was implying all the issues being reported were PICNIC (problem in chair, not in computer). I preordered my Sapphire reference model, so I had it in hand on release day. It was literally hot garbage. The hardware video acceleration was broken in everything from YouTube to VLC. It was hot, loud, and the drivers were super flaky. It had a weird hitching issue in Fallout 4, like dancing under a strobe light, that I'd never seen before. But hey, it turned in some epic 3DMark scores. Before it locked itself to 800MHz, that is. Which even a fresh Windows install did not resolve.

It took months before any big reviewer, especially turd blossom, acknowledged there were actually issues. But he never had any, so he did his best surprised Pikachu when the issues turned out to be real. And of course, being so far up his own rear end that he couldn't find his way out with a map and a flashlight, no apology was ever issued. But as to his claim of no issues? Of course he didn't have any. You have to use the card normally, expect it to play videos and old games that are not in your testing suite. You know, the games the IHV was smart enough to optimize for when sending you the review sample. They know which games you will all test, and you know that they know. But do you or they know that we know? Because I know that they know that you know that they know you know.
As to the 3D for gaming, to post completely on topic again: it is a rare beast, as I opined above. No amount of influencer propaganda is going to change my mind about that. Nor will gamers that already have high-end gear and ask silly questions like, and I quote, "Who is this CPU for?" If you are going to smooth-brain it that hard, sit down and don't raise your hand to ask another question, because you are ill-equipped to understand the answers.
If we could dump the decades of programming we run as DIYers, we could have more nuanced discussions, instead of throwing up bar graphs based on canned bench runs or running around for 25-60 seconds in what they claim is a demanding area. Bud, I play the games. I do my best to be the god of chaos in the ones that allow it. And hear me now, believe me tomorrow, but remember this yesterday - your charts don't mean squat to me sometimes. Your recommended hardware isn't even close to what I end up needing for flawless gameplay. You will never know that, because you will never play the whole game and get to the really intense stuff.
Some here have been complaining that they don't test late game in Anno or Planet Coaster or something. They never will. You'd have to spend many hours playing to reach that CPU-destroying stage, then save at that point and use it for testing. They will have churned out multiple articles/videos by then. Baby needs a new pair of shoes.
Ones of people read this forum. This was more of a, my BBQ isn't quite ready to come out yet rant.
For the 3 people that read my post, odds are one of the 3 of you will be ready to make a pedantic reply about how you can get XX percent of the performance of whatever CPU and keep the power and heat under control. Or even worse, fit in a humble brag about your own silicon lottery winner gear. Don't bother. The 3D can do its tricks PnP; yours doesn't. That means the board need not even support those tweaks in the UEFI to get full performance, extending how ubiquitous a flagship PC gaming CPU is capable of being. Which is what no one seems to talk about. Speaking for myself, it is its best trick. And I have trouble thinking of the last time a flagship gaming CPU could do that.
Honestly, the 5800X3D sounds good to me, and you make some good points.
I'm going to make a negative point now.
This is the very beginning of having to choose between the best CPU for gaming versus the best all-around for productivity.
It used to be that CPUs were compared holistically: gaming, productivity/content creation, encryption, etc.
Now, we have to make choices - points into SEPARATE skill-trees, if you will.
It was bound to happen, I guess.
If we can see these benefits with CPUs, what about APUs? We've seen the performance benefits of RDNA2 GPUs with Infinity Cache; what will APUs be able to accomplish by allocating some/all of the 3D V-Cache to the iGPU?
No my friend, it is not the beginning. It happened perforce, due to Zen showing up over 5yrs ago. For quite a few years, if someone posted for help with a build, no matter what it was for, you could just reply "i7". Lower budget was where the fight was. FX vs. locked i5 and i3 was much less cut and dried, particularly if you needed some productivity and gaming on a tight budget.
With the advent of Zen, that changed. Zen became the dominant choice for one, Intel for the other. Now we have so much parity that we are back to pricing, platform features, power, and cooling being the nitpicks. You can't choose wrong either way for a balanced build. Only highly competitive gamers, those who aspire to be, and those with a particular workflow need to be picky.