If reviews are not run with the 3D V-Cache Performance Optimizer, then they are leaving performance on the table.
Also, the TPU review used an X570 motherboard which, according to some posters on the TPU forums, can't do much BCLK overclocking (capped at 101 MHz).
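For a rough sense of how little headroom a 101 MHz BCLK cap leaves, core clock is just BCLK × multiplier. A quick sketch (the multiplier and clocks below are assumptions for illustration, not figures from the review):

```python
# Hypothetical illustration: how little a 101 MHz BCLK cap buys you.
# Effective core clock = BCLK * multiplier. The multiplier here is an
# assumed boost multiplier, not a measured value from the TPU review.
base_bclk = 100.0    # MHz, stock BCLK
capped_bclk = 101.0  # MHz, the reported X570 cap
multiplier = 45      # assumed boost multiplier (~4.5 GHz)

stock_clock = base_bclk * multiplier / 1000    # GHz
capped_clock = capped_bclk * multiplier / 1000 # GHz

gain_pct = (capped_clock / stock_clock - 1) * 100
print(f"{stock_clock:.3f} GHz -> {capped_clock:.3f} GHz ({gain_pct:.1f}% gain)")
```

So even at the cap, BCLK overclocking on that board is worth about 1% of clock speed, which is within benchmark run-to-run noise.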
It would seem that other vendors are implementing some form of performance enhancer for the 5800X3D in their BIOSes. MSI's changelog does not mention any Performance Optimizer.
> The BIOS TPU used: BIOS 7C37vAG2, AGESA 1.2.0.6c (5800X3D)

This early tester says to stay away from that:
PSA: Stay away from AGESA 1.2.0.6c, AMD restricting VCORE to 1.2v with BIOS PBO EDC above 140
> Thought it was bad enough that prior AGESA started capping VCORE at 1.425v if you ran above 140? How about 1.2v ([https://i.imgur.com/Zh2iZk6.png](https://i.imgur.com/Zh2iZk6.png)). I've run a CB23 above, booted a game for a few minutes, opened/closed a web browser and so on. Replicated this across... (www.fpshub.com)
> To be fair, the crowd that AMD is trying to target this at aren't in it for the 'bang for buck'. That's not what the 'halo gaming' market is about. Think 3090 Ti owners here. They just spent $1500 on their GPU to gain 5% over a vanilla 3090, so spending a bit more (or less) on a CPU doesn't really matter that much as long as it is the 'fastest'. Trust me, I've seen enough 3090 Ti owners out there to see that most run a 12900K or 5950X; CPU pricing isn't a concern for them. I do wonder if most 12900K/5950X owners would be tempted to 'sidegrade' to a 5800X3D just to get a few more fps though? Personally, I wouldn't, but I'm not just a gamer and use my PC for work as well, so I'll take 'moar corez' thanks 😉

Few people will pair the R5-5600 or i5-12400 with a high-end GPU. But if AMD is successful in portraying the 5800X3D as the world's best gaming CPU, you can bet people will buy it over the 12900K/KS. Less heat, no E-core issues, no RAM tuning, no expensive RAM. So many headaches gone.
I don't think it is underperforming. It ends up around the 12900K. That's about what was expected.
It's just much later than we had hoped. Still a money saver for folks with previous AM4 boards.
I wasn't really comparing it to the 12900K, more to its Zen 3 counterparts. AMD claimed on average a 15% increase at 1080p over a 5900X, which I thought was a reasonable and realistic claim. The TPU results show a ~7.5% increase, which is actually a fair bit lower than what I expected, and as I said earlier, literally half of what AMD estimated the gains to be.
The fact that it can't truly beat the 12900K (according to TPU, anyway) also means it is somewhat 'underperforming', since that is the whole point of the CPU: for AMD to regain the gaming performance crown. Not to draw with the 12900K, but to outright beat it.
I think I'll wait for more reviews with bigger game sample sizes before calling AMD out here for their 15% claim - I still think it's possible, but for all we know TPU might have just chosen a bunch of games that don't necessarily scale that well with larger caches.
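For what it's worth, "average % uplift" claims like AMD's 15% are usually a geometric mean of per-game FPS ratios, so game selection matters a lot. A minimal sketch of how that average is computed (the FPS numbers below are made up for illustration only; they are not TPU's or AMD's data):

```python
# Sketch: checking an "average % uplift" claim against per-game results.
# All FPS values below are hypothetical placeholders, NOT real review data.
from statistics import geometric_mean

fps_5900x   = [120.0, 95.0, 160.0, 80.0]  # hypothetical 1080p results
fps_5800x3d = [130.0, 108.0, 166.0, 90.0] # hypothetical 1080p results

# Per-game speedup ratios, then their geometric mean (the usual way
# reviewers aggregate relative performance across a game suite).
ratios = [new / old for new, old in zip(fps_5800x3d, fps_5900x)]
uplift_pct = (geometric_mean(ratios) - 1) * 100
print(f"average uplift: {uplift_pct:.1f}%")
```

With a small suite like this, swapping one or two cache-insensitive games in or out can easily move the average by several percentage points, which is why a 7.5% vs 15% gap could come down to game selection.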
> To be fair, the crowd that AMD is trying to target this at aren't in it for the 'bang for buck'. That's not what the 'halo gaming' market is about. Think 3090 Ti owners here. They just spent $1500 on their GPU to gain 5% over a vanilla 3090, so spending a bit more (or less) on a CPU doesn't really matter that much as long as it is the 'fastest'. Trust me, I've seen enough 3090 Ti owners out there to see that most run a 12900K or 5950X; CPU pricing isn't a concern for them. I do wonder if most 12900K/5950X owners would be tempted to 'sidegrade' to a 5800X3D just to get a few more fps though? Personally, I wouldn't, but I'm not just a gamer and use my PC for work as well, so I'll take 'moar corez' thanks 😉

Yea, but if price is no object, why not just wait for Zen 4? Get a new platform, faster I/O, DDR5, and unless the gains are far less than expected, you could probably get gaming performance equal to or better than the 5800X3D without the limitation of 8 cores.
Also, I don't wish to derail this thread, but as a 12900K owner, those 'headaches' with the 12900K aren't really true: for gaming, heat is a non-issue, and I actually prefer having the E-cores available for background tasks while I game. RAM tuning? I think most people just set XMP and that's it (until recently I was in that camp too; the gains are small even with tuned subtimings). You may be exaggerating for effect, but honestly, I encountered none of those issues with my 12900K with respect to gaming.
FWIW, DDR4 can still be expensive if you go for the high speed / low latency stuff. For my 12900K, I went with Trident Z Neo 'B-Die' 3600 C16 which was approaching the price of DDR5 kits (albeit entry level), and my DDR4 kit isn't even considered truly premium like those DDR4-4000 kits. Yes, when comparing fast DDR5 vs fast DDR4, DDR5 is obviously more expensive still, but I do see the prices trending down slowly for DDR5.
> Yea, but if price is no object, why not just wait for Zen 4? Get a new platform, faster I/O, DDR5, and unless the gains are far less than expected, you could probably get gaming performance equal to or better than the 5800X3D without the limitation of 8 cores.

Yeah, I'd agree with that. Zen 4 is close enough that it makes sense to wait, plus DDR5 should (hopefully) be ready for prime time then as well.
> He said the 12900KS would be faster than the 5800X3D with the exception of edge cases like SCII. Then he proceeded to make a point by showing what we can probably consider an outlier by now: a CP2077 benchmark where ADL-S was up to 50% faster.

As much as you and others would like the CP2077 result to be an outlier, it really isn't. That's the norm in many RT-heavy games when you enable ray tracing. It's just another aspect that reviewers are oblivious to.
The argument was meant to be disarming; that's why I'm asking the question now, after getting proper DDR5 benchmarks in a variety of games (including CP2077, obviously a different run than the one presented in the other video).


> What's the point?

The point is to test marketing claims. Isn't that what tech journalists are supposed to do?
It's too bad that Hardware Unboxed is already trying to paint the 5800X3D in a favourable light in their upcoming review by testing with DDR4-3200:
In the quest to test which is the fastest gaming CPU, as per AMD's claims, crippling the competition only reinforces the prevalent notions some people have about their channel.
> The point is to test marketing claims. Isn't that what tech journalists are supposed to do?

Not sure. I was asking what's the point of such "outlier" discussion. As the TPU benches show, there are also scenarios where Zen 3D outperforms GC even when Alder Lake is equipped with memory more than twice as expensive. There's bound to be plenty of examples on each side. Soon you'll be able to throw selected graphs at each other all day, but why bother? Consider a more complete picture.
The BIOS he is using is without the 3D V-Cache Optimizer driver and runs the unoptimized AGESA 1.2.0.6c.
> As much as you and others would like

What annoys me the most on this forum is people telling me what I think. That must be the meta in terms of "forumsplaining". If you're so worried about my opinion on ADL-S vs the 5800X3D, why don't you just ask me? I made my prediction (and bet) long ago when I upgraded to a 12700K.
🤦🤦🤦🤦🤦

i2Hard tests with RT enabled. Unless you enable RT, CP2077 is mostly GPU-bound.