So 5800X3D is 2% faster than 12900KS? Yay?
If you can overclock that 12900KS by even 10%, it will blow away the 5800X3D. Yes, power consumption is a huge problem for Intel, and overclocking would make it that much worse, but given all the reviews I see for 1000-watt power supplies and the coming of 1500-watt power supplies, I thought hardcore gamers didn't care about details like electric bills 🙂
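For what it's worth, the electric-bill side of this is easy to sketch. The wattages, hours per day, and price per kWh below are illustrative assumptions, not measurements of any real system:

```python
# Rough yearly electricity cost of a gaming PC.
# All figures are assumptions for illustration, not measured values.

def annual_cost_usd(system_watts, hours_per_day, usd_per_kwh=0.15):
    """Yearly electricity cost of a PC drawing `system_watts` while gaming."""
    kwh_per_year = system_watts / 1000 * hours_per_day * 365
    return kwh_per_year * usd_per_kwh

# Hypothetical heavily overclocked rig vs. a stock one, 4 h of gaming a day.
print(round(annual_cost_usd(700, 4), 2))  # heavy OC rig
print(round(annual_cost_usd(500, 4), 2))  # stock rig
```

Even with those generous assumptions, the gap is a few dollars a month, which is probably why the electric bill rarely decides the argument on its own.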
Probably never. Raphael would be a better choice.
Especially if undervolting is something that can be applied widely. Idle power wastage of Ryzen desktop CPUs is just too high; Raphael makes way more sense. But then again, there have been so many senseless DTR-class laptops with ridiculous wattages that a 5800X3D would work wonders in one of them. Given how well it performs with horribad memory, it would be a match made in heaven.
When will Alienware put a 5800X3D into a laptop? Not Alienware, but Schenker did put a 3950X into a laptop, the XMG APEX 15. I'd think a 5800X3D would fit fine in there as well (though I don't see any BIOS updates on their site).
That is the one thing I like about my current rig: it doesn't heat up the room the computer is used in at all.
They may not care about electric bills, but heat getting dumped into the room can be an issue. A 12900K and a 3090 will drop a tremendous amount of heat into your room in extended sessions.
The system may dump a lot of heat, but for gaming, the majority of it will come from the GPU. TDP for a 3090 is 350 watts or more, while the CPU will probably use only 150 watts (or less) in gaming.
I'd be surprised if AMD didn't customize an 8000-series X3D chip for the laptop gaming market. It may not make sense for casual laptops, but it wouldn't be a terrible idea in the near term. What's the highest-priced laptop CPU out there ATM?
Interesting sales figures: 360 R7 5800X3D CPUs sold vs. 350 R7 5800X processors. Oh my, how far we have come. AMD leads in ASP and sales volume.
When you have a better product for quite a while, that happens. Even with Alder Lake, I hear nothing but excuses. "Oh, so what if they use twice the power and heat, gamers don't care." And "Nobody uses these desktop chips for productivity." The list goes on. And servers? Not even excuses, just "they are coming."
And why would you do that for a slower CPU in gaming?
And the wattage I see is insane. 300-400 watts for servers? How many of you have actually been in an enterprise data center? Believe me, heat, power usage, and the AC to deal with it can make or break a data center.
Can confirm. Had a 1950X in a bedroom-sized office. The office would heat up in the summertime to 95°F. Had to run a portable AC unit to keep it cool. Note that the 12900K puts out MORE heat than the 1950X.
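The room-heating effect is simple to quantify: essentially all of a PC's electrical draw ends up as heat in the room, and 1 W of continuous draw is about 3.412 BTU/h. The 650 W figure below is an assumed combined draw for a high-end CPU + GPU rig, not a measurement:

```python
# Convert a PC's power draw into the heat load an AC must remove.
# Essentially all electrical power a PC consumes becomes heat in the room.
BTU_PER_WATT = 3.412  # 1 W of continuous draw = 3.412 BTU/h

def heat_load_btu_per_hour(system_watts):
    return system_watts * BTU_PER_WATT

# Assumed ~650 W gaming draw for a 12900K + 3090 class system.
print(round(heat_load_btu_per_hour(650)))  # compare: a small window AC is ~5000 BTU/h
```

So a big gaming rig under load is roughly half a space heater, which lines up with needing a portable AC unit in a small office.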
Not really. As long as you have your Windows power plan set up properly, Ryzen uses very little power at idle. Ryzen mobile chips actually have lower idle power than Intel chips.
Well let's be honest, power/socket is going up for both Intel and AMD in the server room. The question is: what do you get for all the extra power consumption? In cache-sensitive workloads, Milan-X offers a lot of bang/watt so to speak. It's difficult to compare it to Sapphire Rapids since availability is still so low that it hasn't been publicly benchmarked.
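The bang/watt point can be made concrete with a toy comparison. The scores and wattages below are placeholders purely to illustrate the metric, not benchmark results for any real CPU:

```python
# Toy performance-per-watt comparison. Scores and wattages are
# placeholders, not real benchmark data.

def perf_per_watt(score, watts):
    return score / watts

cpus = {
    "CPU A (64 cores, 280 W)": perf_per_watt(100.0, 280),
    "CPU B (40 cores, 350 W)": perf_per_watt(70.0, 350),
}
# Rank from most to least efficient.
for name, ppw in sorted(cpus.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {ppw:.3f} score/W")
```

The point being: a 300 W+ socket isn't automatically bad; what matters is how much work comes out per watt, which is exactly where a higher-wattage part can still lose.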
I think your post messed up. That's me you quoted. Post 3,137
But your post says "nicalandia said:" Nah, I quoted you quite deliberately. The point being, just because a single server CPU can pull 300 W or more doesn't necessarily make it a bad CPU. If it has enough cores and/or does enough work through ILP, then the expenditure could be worth it, which is quite often the case with both Milan and Milan-X. Intel has already gone overboard with Ice Lake-SP: the 8380 already pulls 270 W, and it's not even a match for older Rome CPUs with similar TDP. Sapphire Rapids would seem to be delivering more of the same. Or at least it would, if anyone could actually buy one.
Oh, I have no idea how that happened. I think the forum code bugged out. I just wanted credit for the post. No matter what Intel says, their servers can't touch Rome/Milan in performance OR efficiency. Genoa will probably be a massacre.
Anyway, it's not exactly the wattage that matters, but 28 cores compared to 64 at the same wattage? Or worse. I would have to go look it up, but Intel is offering half the cores with half the performance per core for the same TDP.
They are comparing very old AMD chips (ignoring Milan-X, which already beats Intel's current chips) vs. Intel's current lineup.
Way to completely miss the point and analysis of these articles. No problem though, always refreshing to see real colors shining through 😂
He did say "if" 😛 But let's think about what you said for a second. Do you know what it actually means to overclock the 12900KS by a measly 10%?
The math is not hard; even I can do it. So I wonder: have you thought about what you say before you click the blue button?
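The math in question can be sketched like this: dynamic CPU power scales roughly with frequency times voltage squared, and pushing clocks 10% past the factory-binned KS limit usually needs a voltage bump too. The baseline 250 W draw and the +8% voltage figure below are illustrative assumptions, not measured 12900KS numbers:

```python
# Dynamic power scales roughly as P ~ f * V^2.
# base_watts and the voltage multiplier are assumptions for illustration;
# real overclocks near the limit often need even bigger voltage bumps.

def scaled_power(base_watts, freq_mult, volt_mult):
    return base_watts * freq_mult * volt_mult ** 2

base = 250.0  # assumed heavy-load draw of a stock 12900KS, watts
oc = scaled_power(base, 1.10, 1.08)  # +10% clocks, +8% voltage
print(round(oc))
```

Under those assumptions, a "measly" 10% overclock costs roughly 28% more power on a chip that is already the hottest-running part in its class, which is the point the reply is making.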