> I might buy two. I currently have 2 5800X desktops, one game and one daily.

Why not a single 5950X that you could game and daily on?
> The only part of this post I disagree with is the gimmick part, if I understand you correctly. I expect the test results to show a 15% improvement in games, and not much else changes, so Zen 3 can take the gaming crown back from ADL. With the multi-threaded crown already in their court, and power efficiency in their court, this leaves Zen 3 winning in all but a few single-threaded benchmarks. I don't think that's a gimmick; overall it's only a small boost for Zen, but all they need at the moment. My personal testing of ADL vs Zen 3 on DC apps has also confirmed this power efficiency and multi-threaded advantage, by 50% for both. (Details in the DC forum.)

If it would only be games, Milan-X would not exist. It's rather the other way around: it has just turned out to be great for most games as a maybe not fully unexpected, but also not necessarily intended, benefit. That plays perfectly into the somewhat extended release cadence by rivaling ADL in games with a uArch that has measurably lower IPC and a product that has _significantly_ lower clock speed.
> If it would only be games, Milan-X would not exist. [...]

I won't argue that, I just have no real experience with Milan-X, even though I know they rule the server world in performance and performance/watt.
> It seems AMD is pushing it as a gaming chip, so we already know there's a market for it. I can't see too many others buying it though. Extra L3 cache isn't going to compensate for another 8 cores in a 5950X for most workloads.

In gaming, I bet it wins most games. Otherwise they would not be selling it. In apps, its benefits will be limited.
> $450 is a pretty steep price for an 8 core CPU in 2022. My fears (expectations?) have been confirmed for the 5800X3D re: pricing.

Depends on where the cache gives the most uplift; in CS:GO it is 0% compared to the 5900X and 12900K. So if you're building for competitive gaming, you will look at the specific game and which CPU is best for it.
I would have jumped on it as an upgrade from a 3600 had it been more reasonably priced. As it stands, I'll probably just get the $199 5600 and call it a day for AM4. I game at 1440P so it's not like I'll see the full 15% uplift in games, I'll be surprised if there is much more than a 5-10% difference at 1440P between a 5600 and 5800X3D. That would also apply to other Zen 3 (or ADL) CPUs that are already enough to max out current gen GPUs at 1440P.
Honestly the 5800X3D fills a very small niche. Basically, 1080P (or competitive) gamers with high end GPUs who don't mind the inflated pricetag.
If you're building a new gaming PC from scratch and don't have unlimited funds (or just want to maximise bang for buck), a better way to allocate your budget would be to invest in a ~$200 CPU like a 12400 or the upcoming 5600 and put the $250 you save towards a faster GPU. $250 is enough to go up a GPU tier, and I would bet that in the vast majority of games a 5600/12400 + faster GPU would beat out a 5800X3D + slower GPU combo.
> $450 is a pretty steep price for an 8 core CPU in 2022. My fears (expectations?) have been confirmed for the 5800X3D re: pricing.

Honestly, this chip was probably never meant for you, and both the 5600 and the 12400 would/will serve you very, very well until your next comprehensive PC update.
Most gamers are going to be GPU-bound and won't see much if any uplift from a CPU upgrade, especially those who have an AMD or Intel CPU from within the last 3 years.
This is a CPU that should only be considered by those who have top-end GPUs, or are running a very high-end GPU at lower resolutions, where the bottleneck will shift to the CPU.
The only other exceptions are any titles that have a smaller memory footprint where the increased L3 actually enables most of that memory to fit in the cache and benefit from the decreased latency to access it.
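That cache-fit effect can be sketched with a toy pointer chase. This is a hedged illustration, not a benchmark of any real game: the element counts are arbitrary, and Python's interpreter overhead dampens the gap a C version would show starkly. The point is the shape of the effect: dependent loads through a working set that fits in cache complete much faster per hop than ones that spill to RAM.

```python
# Toy pointer chase: dependent loads through a small vs. large working set.
# Sizes are illustrative only; real game working sets vary wildly.
import random
import time

def make_cycle(n):
    """Build a random single-cycle permutation: nxt[i] is i's successor."""
    order = list(range(n))
    random.shuffle(order)
    nxt = [0] * n
    for a, b in zip(order, order[1:] + order[:1]):
        nxt[a] = b
    return nxt

def chase(nxt, steps):
    """Follow the chain; every hop is a dependent load the CPU cannot hide."""
    i = 0
    for _ in range(steps):
        i = nxt[i]
    return i

# Small working set (roughly cache-resident) vs. large one (spills to RAM).
for n in (1_000, 1_000_000):
    nxt = make_cycle(n)
    t0 = time.perf_counter()
    chase(nxt, 1_000_000)
    dt = time.perf_counter() - t0
    print(f"{n:>9} elements: {dt:.3f}s for 1M dependent loads")
```

A C version of the same chase is the classic way to measure actual cache-level latencies; in Python the absolute numbers mean little, only the relative gap does.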
> Pubg 5800x, 3080ti, 1440p <- 100% CPU bound.

If I am right, the whole reason this CPU exists is to win the gaming crown from the Alder Lake 12900K at 241 watts. No idea how that relates to what you posted, but it's all about benchmarks of 30 games or so.
> Pubg 5800x, 3080ti, 1440p <- 100% CPU bound.
When is the last time you played PUBG? My 3090 is the bottleneck on my system. I have a 5950x.
EDIT: CPU usage is 30%. A LONG time ago your statement was true; however, a while back they did a decent overhaul of the engine, and it is now GPU bound on my system. I play with G-SYNC/Freesync and I hit the 120Hz cap I have set. Haven't tried benchmarking the highest framerate.
EDIT: In case you think it is the cap, a friend of mine has a 3080 and a 5900x and plays uncapped, and he's also GPU bound. He also plays at 1440p where I play at 5120x1440.
> I am running that 25-30% CPU [...]

This means little on a multi-thread / multi-core / multi-core-complex CPU. It just says that the workload does not saturate the CPU, but it doesn't say whether or not the CPU bottlenecks the workload.
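This is easy to demonstrate: one fully saturated thread reads as roughly 100/N percent overall on a machine with N logical CPUs, so on a 16-thread 5950X a hard single-thread bottleneck shows up as only about 6% "CPU usage". A small sketch (the exact percentage will vary with your machine and background load):

```python
# One fully CPU-bound thread still reads as a tiny overall CPU percentage.
import os
import time

def busy(seconds):
    """Peg the current thread for roughly `seconds` of wall time."""
    end = time.perf_counter() + seconds
    x = 0
    while time.perf_counter() < end:
        x += 1
    return x

n_logical = os.cpu_count() or 1
wall = 0.5
cpu_before = time.process_time()
busy(wall)
cpu_used = time.process_time() - cpu_before

# Whole-machine usage, the way a task manager would report it:
overall_pct = 100.0 * cpu_used / (wall * n_logical)
print(f"{n_logical} logical CPUs -> overall usage ~{overall_pct:.1f}% "
      f"while the workload is 100% CPU-bound")
```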
> We may not be able to overclock the V-cache version.

A CPU which is not overclocked may get hot too. Plus, all recent desktop-tier CPUs are already factory-overclocked.
Would this be a heat issue, or some timing issue that AMD fears?

Seems a bit strange; if it's heat, wouldn't we detect that the overclock makes it too hot? Maybe it can break easily (mechanically) when getting hot.
> This means little on a multi-thread / multi-core / multi-core-complex CPU. It just says that the workload does not saturate the CPU, but it doesn't say whether or not the CPU bottlenecks the workload.

While you are right it's not definitive proof, it is a pretty big, super big, maybe even borderline gigantic hint that this is the case. Kernel times? I suppose, but that would be super bad coding (sort of idling the rest of a core on io or whatever… no), I can't imagine anyone shipping such a thing.
First you need to look at CPU usage per program thread. Next, if this indicates that one or more program threads may each be using a hardware thread fully, one would have to figure out whether this is limited by the CPU's execution performance or its I/O performance. If it is the latter, a larger level-3 cache will help. To what extent it helps depends on the program's memory access patterns, of course.
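A minimal sketch of that per-thread check (assuming Python 3.7+ for `time.thread_time`, which reports CPU time for the calling thread only): compare each thread's CPU time against its wall time. A ratio near 1.0 means the thread is execution-bound; near 0 means it is mostly waiting.

```python
# Per-thread CPU-time / wall-time ratio: ~1.0 = execution-bound, ~0 = waiting.
import threading
import time

def measure(work, results, key, seconds=0.5):
    """Run `work` and record this thread's CPU-time / wall-time ratio."""
    w0, c0 = time.perf_counter(), time.thread_time()
    work(seconds)
    wall = time.perf_counter() - w0
    cpu = time.thread_time() - c0
    results[key] = cpu / wall

def spin(seconds):          # compute-heavy: should look CPU-bound
    end = time.perf_counter() + seconds
    while time.perf_counter() < end:
        pass

def wait(seconds):          # blocking, I/O-style: should not
    time.sleep(seconds)

results = {}
threads = [
    threading.Thread(target=measure, args=(spin, results, "spin")),
    threading.Thread(target=measure, args=(wait, results, "wait")),
]
for t in threads: t.start()
for t in threads: t.join()
print(results)  # spin close to 1.0, wait close to 0.0
```

This only tells you a thread is execution-bound, not whether the limit is raw compute or memory latency; for the latter distinction you need hardware performance counters (cache-miss rates and the like), which is where the extra L3 question is actually settled.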
> [...] super bad coding (sort of idling the rest of a core on io or whatever… no), I cant imagine anyone shipping such a thing.

By "I/O performance" I was actually thinking of the processor's RAM I/O performance mostly, not so much about PCIe I/O, and not at all about peripheral I/O.