i9 9900K Official Reviews from AnandTech, Tom's Hardware. Add your own links to others!


jpiniero

Lifer
Oct 1, 2010
14,585
5,209
136
Fair enough, but that is beside the point entirely; you are never going to use the iGPU on a 9900K, so it adds next to zero value.

That's why I mentioned EMIB. Until products arrive that support it, chips are going to be monolithic, hence the need to include the IGP.
 
Aug 11, 2008
10,451
642
126
https://www.computerbase.de/2018-10/intel-core-i9-9900k-i7-9700k-cpu-test/2/

Well, computerbase.de actually tested the 9900K at the 95W TDP limit as well. Overall application performance goes down 8%. I actually think there is merit in running the 9900K at this setting if you are worried about thermals. An 8% reduction in performance for a ~50% drop in power consumption is a good result in terms of performance/watt.

Anyhow, performance figures with the 9900K (stock) normalised to 100%:

9900K (stock) 100%
9900K @95W 92%
2700X (stock) 82%
9700K (stock) 81%

So the 9900K still has a sizeable gap over the 2700X and 9700K, even when limited to the 95W power setting.
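The performance/watt trade-off described above can be sanity-checked with a quick sketch. Only the 92% figure comes from the review; the ~190W stock package power is an illustrative assumption:

```python
# Rough performance-per-watt comparison: 9900K at stock vs. a 95W limit.
# Power figures are illustrative assumptions, not measurements; performance
# is normalised so that stock = 1.00.
stock = {"perf": 1.00, "power_w": 190}
limited = {"perf": 0.92, "power_w": 95}

ppw_stock = stock["perf"] / stock["power_w"]
ppw_limited = limited["perf"] / limited["power_w"]

print(f"perf/W gain at the 95W limit: {ppw_limited / ppw_stock:.2f}x")
# An ~8% performance loss for ~50% less power roughly doubles perf/watt.
```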

The 'good news' for gamers is that gaming performance is virtually untouched at the 95W limit, going down 1% overall. This is probably the setting I would suggest for people without strong cooling (let's say a 100W-rated HSF) who are using the 9900K predominantly as a gaming CPU but with the occasional productivity workload thrown in.

If that data is accurate, and holds true for the majority of CPUs, it is clear Intel screwed up the marketing. I have even said this a couple of times before. They need to release a 9900 non-K, with clocks in the region where efficiency is much better, and at a price that at least reasonably competes with the 2700X. Or hell, even make the TDP 100 or 110 watts to allow a bit more performance; there is nothing magical about 95 watts. Then there would be much less reason to criticize the power, temps, and price of the 9900K. The user would have a choice: either max performance with the associated trade-offs, or a reasonably priced alternative still with performance on par with or (most likely, especially in gaming) better than AMD's best.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
The only scenario where the 9900K power is at insane levels is P95 AVX2 stress testing - it's basically the 'Furmark' for CPUs. I don't know of any 'real world' applications that can come close to replicating the power levels shown in P95 though - do you?

See, just a few years ago, even with Prime95 it couldn't reach TDP most of the time. It needed a very intensive AVX workload like the Intel Linpack test.

Do all the power measurement tests report sustained power consumption after running for an hour or so, rather than peak? Because that would change the story a lot. I'm assuming it's sustained, since it's desktop testing, but just to be sure.

I know you are a very pro-Intel person from your posts, but try to dial it back a notch or two.

I am "pro-Intel" too, but I'm rooting for them to be better, not lying all over the place and pretending they are better. You are just not being realistic if you bury your head in the sand and imagine it's doing better.

This applies to stockholders too. Long-term stock performance will be better if the company doesn't hide behind murky tactics to obscure what it's really doing.
 

jpiniero

Lifer
Oct 1, 2010
14,585
5,209
136
If that data is accurate, and holds true for the majority of CPUs, it is clear Intel screwed up the marketing. I have even said this a couple of times before. They need to release a 9900 non-K, with clocks in the region where efficiency is much better, and at a price that at least reasonably competes with the 2700X. Or hell, even make the TDP 100 or 110 watts to allow a bit more performance; there is nothing magical about 95 watts. Then there would be much less reason to criticize the power, temps, and price of the 9900K. The user would have a choice: either max performance with the associated trade-offs, or a reasonably priced alternative still with performance on par with or (most likely, especially in gaming) better than AMD's best.

You pretty much described the 9700K.

BTW, I don't expect Intel to release any 8C16T desktop parts, at least for Coffee Lake Refresh. Laptop parts yes. Maybe with the Refresh Refresh.
 
Mar 10, 2006
11,715
2,012
126
You pretty much described the 9700K.

BTW, I don't expect Intel to release any 8C16T desktop parts, at least for Coffee Lake Refresh. Laptop parts yes. Maybe with the Refresh Refresh.

This leaked roadmap seems to disagree with you:
Intel-Roadmap-1-Large.jpg
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
See, just a few years ago, even with Prime95 it couldn't reach TDP most of the time. It needed a very intensive AVX workload like the Intel Linpack test.

Do all the power measurement tests report sustained power consumption after running for an hour or so, rather than peak? Because that would change the story a lot. I'm assuming it's sustained, since it's desktop testing, but just to be sure.

I am "pro-Intel" too, but I'm rooting for them to be better, not lying all over the place and pretending they are better. You are just not being realistic if you bury your head in the sand and imagine it's doing better.

This applies to stockholders too. Long-term stock performance will be better if the company doesn't hide behind murky tactics to obscure what it's really doing.
There are "pro-AMD" posters all over this thread making outright false claims about this chip and blatantly bashing it, and they are not being called out. Ironically, it's the "pro-Intel" poster who gets called out for refuting their claims.

There's nothing about the power consumption of this chip that is not already deducible from the 8700K. It's at least 33% more power at 4.3GHz, with another 400MHz's worth added on top; and this doesn't even consider the extra cache voltage. It actually consumes less per core in some instances compared to the 8700K. Shocking. As for Prime95, the behavior of locked Intel chips is very different from that of unlocked chips. Your argument would stand if you were upset about a locked chip sustaining power far above the advertised figures. Most locked Intel chips do not come near TDP in stress tests.

In this case, however, you are talking about an unlocked chip. Unlocked. I don't understand how someone would run an AVX-based power-virus app at 5GHz on an unlocked chip and then turn around and complain that their chip is consuming too much power and ask why it isn't downclocking itself. That discretion is left to the user. You can configure a lot on modern processors, including power consumption. Cooling is another crucial variable in all this. People actually buy these kinds of chips for that control. If your problem is that 4.7GHz ACT is too high and results in power consumption way above the advertised TDP, then are you going to be happier if the chip downclocks to 3.6GHz and consumes 80 watts over a longer task duration? If that is what you want, get a locked chip and turn off Turbo Boost.
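The back-of-the-envelope argument above (two extra cores is ~33% more, with frequency and voltage pushing power further) follows the classic dynamic-power approximation P ∝ cores · f · V². A rough sketch; every number below is an illustrative assumption, not a measurement:

```python
# Dynamic-power scaling sketch: P ~ cores * f * V^2.
# Scales a baseline package power to a different core count, clock, and voltage.
def scaled_power(base_w: float, base_cores: int, cores: int,
                 base_f: float, f: float, base_v: float, v: float) -> float:
    return base_w * (cores / base_cores) * (f / base_f) * (v / base_v) ** 2

# e.g. a ~95W 6-core at 4.3GHz/1.20V scaled to 8 cores at 4.7GHz/1.28V
# (all hypothetical figures):
est = scaled_power(95, 6, 8, 4.3, 4.7, 1.20, 1.28)
print(f"estimated package power: {est:.0f}W")  # ~158W with these inputs
```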
 
Last edited:

Guru

Senior member
May 5, 2017
830
361
106
Not impressed at all. At $550+ it's only about 10% faster than the 8700K in games and basically equal to the 2700X across a range of workstation tests.

In office workloads it seems to be about 10% faster than the 2700X, and in older single-threaded apps about 15% faster, but in encoding, compression, and editing it seems to be trading blows with the 2700X, roughly -5% to +10%.
 
Mar 10, 2006
11,715
2,012
126
Interesting. I don't know if a locked i9 really makes sense, but since Intel will likely release a Xeon E, I guess it does.

Of course it makes sense. It's an upsell opportunity for PC OEMs in markets outside of gaming, which is a huge percentage of the market. A Core i9 vPro should be quite lucrative for Intel.
 
Aug 11, 2008
10,451
642
126
You pretty much described the 9700K.

BTW, I don't expect Intel to release any 8C16T desktop parts, at least for Coffee Lake Refresh. Laptop parts yes. Maybe with the Refresh Refresh.

Not really. It is a K CPU, so I assume you have to add a cooler. Plus it only has eight threads, so to get multithreaded performance approaching the 2700X you need to overclock it to 5+ GHz, and even then it will still trail due to the lack of threads. Now for a gaming chip, that is another story; it could very well be the sweet spot for that. I know it is the "mainstream" platform, but I don't really consider the K chips "mainstream", as in chips that will be put into OEM systems. I still say they need an eight-core 9900, analogous to the 8700 vs the 8700K.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
If that data is accurate, and holds true for the majority of CPUs, it is clear Intel screwed up the marketing. I have even said this a couple of times before. They need to release a 9900 non-K, with clocks in the region where efficiency is much better, and at a price that at least reasonably competes with the 2700X. Or hell, even make the TDP 100 or 110 watts to allow a bit more performance; there is nothing magical about 95 watts. Then there would be much less reason to criticize the power, temps, and price of the 9900K. The user would have a choice: either max performance with the associated trade-offs, or a reasonably priced alternative still with performance on par with or (most likely, especially in gaming) better than AMD's best.

I agree, a 9900 non-K at, say, 8700K clocks (4.3/4.7) at approximately $400 would be a good option for better efficiency. It would still be a good ~10% ahead of a 2700X at more reasonable power levels.

I can understand why Intel clocked the 9900K so high: they know this is it for 14nm and want the biggest performance lead possible for at least the next 6 months. They left themselves open to criticism over the power consumption, but to be fair, it's actually in line with the 2700X in terms of performance/watt; numerous reviews have shown this already.

I find the TPU review interesting. For whatever reason, their 9900K wasn't holding the full 4.7GHz ACT (it was around ~4.5GHz), but look at what that did for power efficiency: https://www.techpowerup.com/reviews/Intel/Core_i9_9900K/16.html

efficiency-multithread.png


Discounting the 2950X, which is an HEDT chip, the 9900K actually has the best power efficiency of *any* desktop CPU just by running 200MHz lower than the stated ACT.

That is what a 9900 non K should be like IMO.
 

french toast

Senior member
Feb 22, 2017
988
825
136
I agree, a 9900 non-K at, say, 8700K clocks (4.3/4.7) at approximately $400 would be a good option for better efficiency. It would still be a good ~10% ahead of a 2700X at more reasonable power levels.

I can understand why Intel clocked the 9900K so high: they know this is it for 14nm and want the biggest performance lead possible for at least the next 6 months. They left themselves open to criticism over the power consumption, but to be fair, it's actually in line with the 2700X in terms of performance/watt; numerous reviews have shown this already.

I find the TPU review interesting. For whatever reason, their 9900K wasn't holding the full 4.7GHz ACT (it was around ~4.5GHz), but look at what that did for power efficiency: https://www.techpowerup.com/reviews/Intel/Core_i9_9900K/16.html

efficiency-multithread.png


Discounting the 2950X, which is an HEDT chip, the 9900K actually has the best power efficiency of *any* desktop CPU just by running 200MHz lower than the stated ACT.

That is what a 9900 non K should be like IMO.
Yes, it's a theme AMD fans have known for years: chips clocked way past their efficiency curve for a minor performance gain, just to compete, destroying the processor's efficiency for little benefit. And opening themselves up to a panning on the tech forums from unforgiving Intel/Nvidia shills.

You are going to see more of this from Intel from now on, and AMD fans giving it back to them now that the shoe is on the other foot.
Welcome to crazy town!
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,830
136
An 8% reduction in performance for a ~50% drop in power consumption is a good result in terms of performance/watt.

We've never seen that from Intel's Coffee Lake CPUs before, though. Something is really weird about how these chips handle power usage via PL2 settings. I would rather see the 9900K run at fixed clocks/voltages across the clockspeed range to see what's really going on here.

Because some reviewers are comparing the 9900K to first-gen TR chips like the 1920X, and the overall performance is comparable? It's not like the 2950X wasn't used in the comparison as well, but at $900 I don't think it's directly comparable to the 9900K.

The 2920X, which is supposed to be $650 and thus the most comparable 2nd gen TR chip to the 9900K, hasn't actually been officially launched.

The 2920X is out later this month, though they could probably simulate one with a 2950X if they wanted to. Broad market availability of the 2920X will probably be higher than the 9900K's for a while, too.

but encoding, compression, editing it seems to be trading blows with the 2700x, about -5% to +10%

Just a nitpick, but in x265 the 9900K's lead may be much higher than 10 or 15%.
 
Aug 11, 2008
10,451
642
126
Yes, it's a theme AMD fans have known for years: chips clocked way past their efficiency curve for a minor performance gain, just to compete, destroying the processor's efficiency for little benefit. And opening themselves up to a panning on the tech forums from unforgiving Intel/Nvidia shills.

You are going to see more of this from Intel from now on, and AMD fans giving it back to them now that the shoe is on the other foot.
Welcome to crazy town!
Aw, those poor abused AMD fans. There is a big difference, though. AMD clocked FX way past the efficiency curve and still wasn't competitive in performance. In fact, no matter how you clocked it, it wasn't competitive in efficiency. With the 9900K, you get superior performance for the high clocks and power usage, and there are other chips that *are* competitive in performance per watt. Personally, I think Intel should have set the stock clocks lower to mitigate some of this FUD about CL being a hot, hungry power hog, and let those who wanted to push the envelope do it by overclocking, but in the end the usual suspects would have bashed it anyway.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Yes, it's a theme AMD fans have known for years: chips clocked way past their efficiency curve for a minor performance gain, just to compete, destroying the processor's efficiency for little benefit. And opening themselves up to a panning on the tech forums from unforgiving Intel/Nvidia shills.

You are going to see more of this from Intel from now on, and AMD fans giving it back to them now that the shoe is on the other foot.
Welcome to crazy town!

The difference is performance, imho. We've seen these eras:

Athlon XP vs P4 Willamette.

Total AMD domination in price, performance, and upgrade path. Socket 423 was shocking in all the wrong ways.

Athlon XP vs P4 Northwood.

Intel leading in performance, particularly once in the sweet spot at 3.0-3.2GHz. Athlon still very viable, with amazing OCs from the cheap 1700+, etc.

Socket 940 Opteron / A64 vs P4s. Suddenly AMD has top performance, with superior efficiency. A 3000+ and a P4B/C @ 3.2 have nearly identical performance, but AMD's platform is just superior here. Intel released the pointless and overall bad Prescott. Netburst scaling was just poor past ~3.4GHz in any case.

A64 X2 vs Pentium D. AMD domination continued, and the Intel S775 Netburst dual cores just looked ridiculous compared to the hugely more efficient X2 Athlons. AMD figured out they were dominating and raised prices to ~$1k for flagship models. $300 got you a 3800 X2, though.

A64 vs Conroe. A holy-crap moment for sure. Overnight, both Netburst and AMD's existing platforms were obsoleted. A 2:1 IPC advantage over Netburst, and 1.5:1 over the best X2/Opteron SKUs. E.g., a lowly 1.6GHz Conroe was equal to or better than a 2.4GHz A64 X2 or a 3.2GHz Pentium D, while sipping power. Once clocked much beyond 2GHz, it was basically untouchable by all previous products.

AMD released a bunch of disappointing comeback products that never really turned the corner. The closest came in the Phenom II X4-X6 era, where they were truly competitive with the best Intel products (I went Ph II here to save $). The timing was bad, though, as soon after:

The Core i7 920, followed by socket 1156, denied Phenom II much time as a relevant enthusiast product. Then Sandy came and made a mockery of everything. AMD released more laughable products like the FX series: huge power consumption on the top models and OCed figures, but the crucial difference was that it didn't correspond with top-tier performance, making it especially ludicrous.

Intel got fat and lazy over the next 6 years, giving us tiny increases in IPC, tiny increases in features, edging prices up, and generally being greedy. If you own a 2700K @ 4.8, even a 7700K is mostly better only for the platform improvements; the actual CPU performance hardly moved at all.

Ryzen truly came and rocked the world. Almost a Conroe moment: untouchable multicore performance on a single socket, great prices, and only let down minimally by gaming/ST, where it certainly wasn't bad, just not better than existing products. For general use and heavy MT, it vaulted to #1 with a bullet. And it moved the consumer market beyond quad cores instantly.

Make no mistake, Ryzen is the only reason we were given hex cores with Coffee Lake, and why we are now seeing an octa-core halo product.

It's thirsty, yes. It's also a pretty bad economic proposition, honestly. I'm not going to buy one. But it's nowhere near as embarrassing as Willamette, Prescott, Phenom I, or FX. It actually delivers halo performance for halo pricing, with a lot of heat and a fair thirst. I just think it's poor value. Still pretty impressive for 14nm.

CPUs are fun again, and much of that (most of that) is due to AMD being competitive at many levels again.
 

french toast

Senior member
Feb 22, 2017
988
825
136
Aw, those poor abused AMD fans. There is a big difference, though. AMD clocked FX way past the efficiency curve and still wasn't competitive in performance. In fact, no matter how you clocked it, it wasn't competitive in efficiency. With the 9900K, you get superior performance for the high clocks and power usage, and there are other chips that *are* competitive in performance per watt. Personally, I think Intel should have set the stock clocks lower to mitigate some of this FUD about CL being a hot, hungry power hog, and let those who wanted to push the envelope do it by overclocking, but in the end the usual suspects would have bashed it anyway.
Well, if you had been reading my posts over the last few months, I was praising Intel for this processor and recommending it as a great solution for years to come, even going as far as saying Zen 2 wouldn't embarrass it.
Only the reviews, the stupid fake/paid benchmarks, the TDP FUD... have changed my opinion on the 9900K. Intel has itself to blame for that.
As for the FX 9590, it wasn't a great processor, no, and the 9900K is better than that was; but at least AMD was upfront about its power consumption and bundled an appropriate cooler with it.
 

french toast

Senior member
Feb 22, 2017
988
825
136
The difference is performance, imho. We've seen these eras:

Athlon XP vs P4 Willamette.

Total AMD domination in price, performance, and upgrade path. Socket 423 was shocking in all the wrong ways.

Athlon XP vs P4 Northwood.

Intel leading in performance, particularly once in the sweet spot at 3.0-3.2GHz. Athlon still very viable, with amazing OCs from the cheap 1700+, etc.

Socket 940 Opteron / A64 vs P4s. Suddenly AMD has top performance, with superior efficiency. A 3000+ and a P4B/C @ 3.2 have nearly identical performance, but AMD's platform is just superior here. Intel released the pointless and overall bad Prescott. Netburst scaling was just poor past ~3.4GHz in any case.

A64 X2 vs Pentium D. AMD domination continued, and the Intel S775 Netburst dual cores just looked ridiculous compared to the hugely more efficient X2 Athlons. AMD figured out they were dominating and raised prices to ~$1k for flagship models. $300 got you a 3800 X2, though.

A64 vs Conroe. A holy-crap moment for sure. Overnight, both Netburst and AMD's existing platforms were obsoleted. A 2:1 IPC advantage over Netburst, and 1.5:1 over the best X2/Opteron SKUs. E.g., a lowly 1.6GHz Conroe was equal to or better than a 2.4GHz A64 X2 or a 3.2GHz Pentium D, while sipping power. Once clocked much beyond 2GHz, it was basically untouchable by all previous products.

AMD released a bunch of disappointing comeback products that never really turned the corner. The closest came in the Phenom II X4-X6 era, where they were truly competitive with the best Intel products (I went Ph II here to save $). The timing was bad, though, as soon after:

The Core i7 920, followed by socket 1156, denied Phenom II much time as a relevant enthusiast product. Then Sandy came and made a mockery of everything. AMD released more laughable products like the FX series: huge power consumption on the top models and OCed figures, but the crucial difference was that it didn't correspond with top-tier performance, making it especially ludicrous.

Intel got fat and lazy over the next 6 years, giving us tiny increases in IPC, tiny increases in features, edging prices up, and generally being greedy. If you own a 2700K @ 4.8, even a 7700K is mostly better only for the platform improvements; the actual CPU performance hardly moved at all.

Ryzen truly came and rocked the world. Almost a Conroe moment: untouchable multicore performance on a single socket, great prices, and only let down minimally by gaming/ST, where it certainly wasn't bad, just not better than existing products. For general use and heavy MT, it vaulted to #1 with a bullet. And it moved the consumer market beyond quad cores instantly.

Make no mistake, Ryzen is the only reason we were given hex cores with Coffee Lake, and why we are now seeing an octa-core halo product.

It's thirsty, yes. It's also a pretty bad economic proposition, honestly. I'm not going to buy one. But it's nowhere near as embarrassing as Willamette, Prescott, Phenom I, or FX. It actually delivers halo performance for halo pricing, with a lot of heat and a fair thirst. I just think it's poor value. Still pretty impressive for 14nm.

CPUs are fun again, and much of that (most of that) is due to AMD being competitive at many levels again.
Great post! I agree with all of that.
The 9900K is better than an FX 9590, sure... BUT this is as close to an FX 9590 moment for Intel as I can remember since the Pentium 4.
If they had just clocked it 300MHz lower, stopped this TDP nonsense and labelled it properly, stopped the benchmark crap in the lead-up to release, and priced it around $400... I would suggest it would be one of the best Intel products in years (which is what I was saying in the lead-up).

Alas, it was not to be.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
https://www.computerbase.de/2018-10/intel-core-i9-9900k-i7-9700k-cpu-test/2/

Well, computerbase.de actually tested the 9900K at the 95W TDP limit as well. Overall application performance goes down 8%. I actually think there is merit in running the 9900K at this setting if you are worried about thermals. An 8% reduction in performance for a ~50% drop in power consumption is a good result in terms of performance/watt.

Anyhow, performance figures with the 9900K (stock) normalised to 100%:

9900K (stock) 100%
9900K @95W 92%
2700X (stock) 82%
9700K (stock) 81%

So the 9900K still has a sizeable gap over the 2700X and 9700K, even when limited to the 95W power setting.

Again, it is application dependent.

When the application can utilize all threads, the performance difference between PL2 at 210W and PL2 at 95W is in the double digits.

Examples from golem.de
https://www.golem.de/news/core-i9-9...te-5-ghz-kerne-sind-extrem-1810-136974-5.html

Lightroom = PL2 at 210W finishes the benchmark in 13% less time than at 95W
mRIc1l6.png


Blender = PL2 at 210W finishes the benchmark in 15% less time than at 95W
topPvnM.png


x265 = PL2 at 210W finishes the benchmark in 17% less time than at 95W
ogO7O8R.png


Hell, the funny thing is that even with PL2 at 95W the 9900K is still faster than the 2700X at 110W in those apps. PL2 at 210W was a mistake by Intel TODAY, but they had to do it in order for the 9900K to at least try to compete against 7nm Zen 2 for the whole of 2019.
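For clarity, the "X% less time" figures above come straight from the benchmark completion times; a minimal sketch (the 85s/100s times below are made up for illustration):

```python
# Convert two benchmark completion times into an "X% less time" figure,
# as used for the PL2 210W vs. 95W comparisons above.
def percent_less_time(time_fast_s: float, time_slow_s: float) -> float:
    """How much less time (in %) the faster run took vs. the slower run."""
    return (1 - time_fast_s / time_slow_s) * 100

print(f"{percent_less_time(85.0, 100.0):.0f}% less time")  # 85s vs 100s -> 15%
```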
 
Last edited:

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Great post! I agree with all of that.
The 9900K is better than an FX 9590, sure... BUT this is as close to an FX 9590 moment for Intel as I can remember since the Pentium 4.
If they had just clocked it 300MHz lower, stopped this TDP nonsense and labelled it properly, stopped the benchmark crap in the lead-up to release, and priced it around $400... I would suggest it would be one of the best Intel products in years (which is what I was saying in the lead-up).

Alas, it was not to be.

True. I personally think the 3.73GHz Prescott EE was Intel's 9590 moment. Thirsty, hot, overpriced, and not particularly fast or good value.
 

arandomguy

Senior member
Sep 3, 2013
556
183
116
If and when games can fully load an 8C/16T CPU at 100% usage (or close to it), then we might start seeing some of the thermal struggles exhibited in other MT apps. That's probably not going to happen within the useful lifespan of the 9900K.

Utilization will go up as software becomes more thread-parallel, but for something like gaming it's not going to skyrocket to match tasks like Blender. Gaming simply can't feed the CPU a sustained workload.

https://www.computerbase.de/2018-10/intel-core-i9-9900k-i7-9700k-cpu-test/2/

The 'good news' for gamers is that gaming performance is virtually untouched at the 95W limit, going down 1% overall. This is probably the setting I would suggest for people without strong cooling (let's say a 100W-rated HSF) who are using the 9900K predominantly as a gaming CPU but with the occasional productivity workload thrown in.

So this shows part of the reason why I suspect Intel still wants such a high power limit, even for gaming. You'll notice the 99th-percentile numbers suffer a bigger drop, and I wouldn't be surprised if 99.9th-percentile or minimum fps numbers suffered even more. There might be "bursts" in which the CPU does in fact shoot past 95W.
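For reference, 99th-percentile results like those are derived from per-frame times rather than averages; a minimal sketch with made-up frame times:

```python
# Derive average fps and a 99th-percentile ("1% low") fps figure from
# per-frame render times. The frame times below are invented for illustration.
frame_times_ms = [16.7] * 95 + [25.0] * 5  # mostly smooth, a few slow frames

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
p99_frame_ms = sorted(frame_times_ms)[int(len(frame_times_ms) * 0.99)]
p99_fps = 1000 / p99_frame_ms

print(f"avg: {avg_fps:.0f} fps, 99th percentile: {p99_fps:.0f} fps")
# A few slow frames barely move the average but dominate the percentile figure.
```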

See, just a few years ago, even with Prime95 it couldn't reach TDP most of the time. It needed a very intensive AVX workload like the Intel Linpack test.

Prime95 has changed since "a few years ago" (not sure exactly what timeframe is meant) to heavily leverage FMA3. This first started gaining attention when people ran Prime95 on Haswell systems and wondered why temperatures would skyrocket compared to SB/IVB or other stress tests.
 

Pandamonia

Senior member
Jun 13, 2013
433
49
91
I'm failing to see why anyone with Skylake should even consider these new CPUs.

Games are held back by the GPU, there are no IPC benefits, the chips run hot and thirsty for zero benefit, and the extra cores do nothing.

What exactly am I missing?

Sent from my SM-N960F using Tapatalk
 

beginner99

Diamond Member
Jun 2, 2009
5,210
1,580
136
I'm failing to see why anyone with Skylake should even consider these new CPUs.

Games are held back by the GPU, there are no IPC benefits, the chips run hot and thirsty for zero benefit, and the extra cores do nothing.

What exactly am I missing?

Sent from my SM-N960F using Tapatalk

Not much. The main thing you are missing is that the 6700K and 7700K are only quad-cores, so if you want more cores you need to upgrade. And what most gaming benchmarks don't show is that more than 4 cores bring significant benefits in certain online multiplayer games, notably the Battlefield series. If you are gaming on 64-player maps, a 6-core Ryzen/i7 will beat a 7700K.

But mostly you are right.