Discussion i7-11700K preliminary results

Page 8 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

tamz_msc

Diamond Member
Jan 5, 2017
3,821
3,643
136
The important part of the AVX-512 test is the accompanying graphs. If you look, you'll see that as soon as the AVX-512 load starts, power and temperature spike immediately. That shows that if you're running a more mixed workload where only part is AVX-512 enhanced, you're still going to hit these temps and power numbers as soon as it starts to use those 512-bit features.
Yup this is exactly what happens when I run Handbrake x265 on my Tiger Lake laptop with ASM:AVX512. It immediately shoots up to PL2 before tau kicks in and brings it down to PL1, with gradual downclocking till the encode is finished.
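The PL2 → tau → PL1 dance described here can be sketched as a toy step function (the real controller tracks a moving average over the tau window; every number below is an illustrative assumption, not Tiger Lake's actual limits):

```python
# Toy sketch of the PL1/PL2/tau turbo behaviour described above.
# All values are illustrative assumptions, not Tiger Lake's actual limits.
PL1 = 28.0   # sustained power limit, watts (assumed)
PL2 = 64.0   # short-term boost limit, watts (assumed)
TAU = 28.0   # boost window, seconds (assumed)

def allowed_power(elapsed_s):
    """Package may draw PL2 until the tau window expires, then drops to PL1."""
    return PL2 if elapsed_s < TAU else PL1

# An AVX-512 encode pegs the package at its limit from the first second:
trace = [allowed_power(t) for t in range(0, 60, 10)]
print(trace)
```

The point of the post above is that an AVX-512 workload sits at the PL2 line from the very first instant, so the spike arrives immediately rather than ramping up.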
 

Dave2150

Senior member
Jan 20, 2015
639
178
116
I don't really see that happening based on the AT review of the 11700K. There isn't enough clock speed headroom left and the increased latency seems to be a bigger hit to overall gaming performance than the increased IPC is able to compensate for. If you compare the results it looks like this:

Title / CPU            | 5800X | 11700K | 10700K | 9900KS
Deus Ex MD (600p)      | 269.8 | 217.4  | 211.8  | 214.5
FF XIV (768p)          | 315.0 | 212.1  | 216.1  | 235.2
FF XV (720p)           | 220.3 | 199.0  | 179.9  | 186.8
World of Tanks (768p)  | 733.8 | 692.4  | 707.0  | 697.7
Borderlands 3 (360p)   | 214.9 | 172.6  | 163.6  | 175.9
F1 2019 (768p)         | 384.7 | 291.6  | 291.6  | 316.5
Far Cry 5 (720p)       | 188.3 | 178.3  | 169.8  | 181.5
Gears Tactics (720p)   | 389.2 | 310.9  | 309.9  | 306.2
GTA 5 (720p)           | 180.8 | 176.2  | 175.4  | 176.7
RDR 2 (384p)           | 190.7 | 149.8  | 157.4  | 167.1
Strange Brigade (720p) | 637.2 | 435.5  | 463.1  | 513.3
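Crunching a few rows of that table shows how large the gaps are in relative terms (a quick sketch; the figures are taken straight from the table):

```python
# Percentage gap of the 11700K vs the 5800X, using three rows from the table.
fps = {
    "Deus Ex MD (600p)":      (269.8, 217.4, 211.8, 214.5),
    "RDR 2 (384p)":           (190.7, 149.8, 157.4, 167.1),
    "Strange Brigade (720p)": (637.2, 435.5, 463.1, 513.3),
}
for title, (r5800x, i11700k, _i10700k, _i9900ks) in fps.items():
    gap = (i11700k / r5800x - 1) * 100
    print(f"{title}: 11700K is {gap:+.1f}% vs 5800X")
```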

Moving to 1080p or beyond leads to a GPU bottleneck in most titles, in which case there isn't much of a gap between any of the CPUs, and there are even a few cases where one of the Intel CPUs winds up on top, but those are almost always within the margin of error. The 1080p max-quality benchmarks have the CPUs clumped together in almost every title, to the point where it makes little difference which you go with, but in most cases the 11700K is still at the bottom.

The AVX results suggest that the i9 probably has some room left if it could draw up to, say, 275W in non-AVX workloads, but this thing is already pretty close to the edge, and the increased clocks really only get it to where the 9900KS already is. It's impressive that Intel has been able to push 14nm as far as they have, but I don't think an extra ~3% clock speed or any firmware tweaks are going to change what we're seeing now in a substantial way.

At stock, we already know the i9 will retake the gaming performance crown:

[image: a4ANX3Z.jpg]
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
The important part of the AVX-512 test is the accompanying graphs. If you look, you'll see that as soon as the AVX-512 load starts, power and temperature spike immediately. That shows that if you're running a more mixed workload where only part is AVX-512 enhanced, you're still going to hit these temps and power numbers as soon as it starts to use those 512-bit features.
Of course?
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
You're not seeing the forest for the tree here. Is there a relationship between that "300W" and that 3DPM result? Hint: POVRay is 225W (peak).
Meanwhile, the forest of AVX-512, where RKL seems to be king, consists of six scrawny 3-4 year old trees. It's easy *not* to see that garden when some actual trees are in the way.
 

maddogmcgee

Senior member
Apr 20, 2015
384
303
136
At stock, we already know the i9 will retake the gaming performance crown:

[image: a4ANX3Z.jpg]

Based on Intel's recent benchmarking efforts, I assume the Ryzen was running a single stick of RAM from the back of the cupboard and a hard drive they took from their wife's laptop after it started to BSOD, while the 11900K was running under liquid nitrogen in a cool room.
 

coercitiv

Diamond Member
Jan 24, 2014
6,214
11,959
136
Intel is shooting themselves in the foot with these outright stupid power limits. There's no reason they need to let the silicon draw over 200W to win some AVX benchmarks that are niche for consumers anyway. Cap PL2 at 200W or even 175W and get sane stock power figures, with the upside of enabling lots more performance if the user is willing to go ballistic. (pun intended)

Unaware users won't be blindsided by 300W spikes and 100C+ temps, while enthusiasts will still be able to unlock a lot more power (pun again) with the switch of a button.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
At stock, we already know the i9 will retake the gaming performance crown:

[image: a4ANX3Z.jpg]
Hate to burst your bubble, but let me tell you how current Intel CPUs perform in these games:
Far Cry: 10900K +3% vs 5900X
Watch Dogs: 10900K +2% vs 5900X
Total War: 10900K -5% vs 5900X
Metro Exodus: 10900K -3% vs 5900X
Gears of War: 10900K = 5900X

Intel chose for this picture only games where a) they were already in the lead by a couple of percent, or b) they could find some improvement, thus overtaking AMD by a couple of percent.

Where are the games they loved so much to brag about? Where are Tomb Raider & co.? Oh wait. Zen 3 absolutely trounces Intel in those games, so that's where they are: not on any official Intel marketing material, for sure.
 

majord

Senior member
Jul 26, 2015
433
523
136
Hate to burst your bubble, but let me tell you how current Intel CPUs perform in these games:
Far Cry: 10900K +3% vs 5900X
Watch Dogs: 10900K +2% vs 5900X
Total War: 10900K -5% vs 5900X
Metro Exodus: 10900K -3% vs 5900X
Gears of War: 10900K = 5900X

Intel chose for this picture only games where a) they were already in the lead by a couple of percent, or b) they could find some improvement, thus overtaking AMD by a couple of percent.

Where are the games they loved so much to brag about? Where are Tomb Raider & co.? Oh wait. Zen 3 absolutely trounces Intel in those games, so that's where they are: not on any official Intel marketing material, for sure.

They've also failed to provide the test setup details on the URL referenced in that slide.
 

Dave2150

Senior member
Jan 20, 2015
639
178
116
Hate to burst your bubble, but let me tell you how current Intel CPUs perform in these games:
Far Cry: 10900K +3% vs 5900X
Watch Dogs: 10900K +2% vs 5900X
Total War: 10900K -5% vs 5900X
Metro Exodus: 10900K -3% vs 5900X
Gears of War: 10900K = 5900X

Intel chose for this picture only games where a) they were already in the lead by a couple of percent, or b) they could find some improvement, thus overtaking AMD by a couple of percent.

Where are the games they loved so much to brag about? Where are Tomb Raider & co.? Oh wait. Zen 3 absolutely trounces Intel in those games, so that's where they are: not on any official Intel marketing material, for sure.

My post was actually being sarcastic, though I appreciate this may not have been obvious. I couldn't care less about 1080p game FPS, as in my eyes anything below 4K is now an obsolete, retro-only experience.

I game at 4k, on a 6700k, 3080, CX48 system, where I'd not be able to tell the difference between a 5800x, 10700k, 11700k or 11900k as games are mostly GPU limited at 4k.

I found a few games where my 6700K quad core, which I've had since August 2015, no longer copes, and I get FPS spikes from >100 fps down to 20 or 30. Shadow of the Tomb Raider's town scenes are one example: so many NPCs running around. I've not even tried Cyberpunk yet, due to waiting for a good CPU to be in stock and for all the bugs to be fixed.

My original plan was to get a 5900x, but in the UK they've not been in stock even once.

Sitting here with motherboard and new cooler ready to go, just need that 11900k to arrive :p

[image: tZSJpu0.jpg]
 

TheELF

Diamond Member
Dec 22, 2012
3,973
731
126
Unaware users won't be blindsided by 300W spikes and 100C+ temps, while enthusiasts will still be able to unlock a lot more power (pun again) with the switch of a button.
Which unaware user will ever run the one AVX-512 synthetic benchmark app that can reach 300W?
Even by mistake, it's extremely unlikely to happen.
Hey, look at the upside, if you really want top gaming performance you can still stick a 10900K in that board. :p
MCE fixes all cores to the same multiplier; it's just like disabling PBO on Ryzen when overclocking, so you get lower performance in certain things.
The 11900K, when set up correctly, will always be faster than the 10900K, even if it's just by a bit.
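On the "which unaware user" point: whether a chip even advertises AVX-512 is easy to check. A rough sketch (Linux-only, reads /proc/cpuinfo; a hypothetical helper, not something from any post above):

```python
# Rough sketch: does this CPU advertise AVX-512 Foundation (avx512f)?
# Linux-only; returns False instead of raising on other OSes.
def has_avx512():
    try:
        with open("/proc/cpuinfo") as f:
            return any("avx512f" in line for line in f if line.startswith("flags"))
    except OSError:
        return False

print(has_avx512())
```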
 

Dave2150

Senior member
Jan 20, 2015
639
178
116
Can't tell if sarcastic or extremely optimistic.

An NH-D15 was sufficient for 5GHz+ all-core on the 10900K.

No reason to suspect 5.3 won't be possible on the 11900K, considering it has two fewer cores. Stock is 5.3GHz on one core.

Obviously not talking about AVX-512 or even AVX2. I assume RKL will have an AVX offset for this purpose.
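An AVX offset is just a multiplier reduction the board applies when AVX code is detected. A sketch of the arithmetic, with made-up offset values (not Intel-published figures):

```python
# Illustrative AVX offset arithmetic; the offsets here are made-up examples.
BCLK = 100          # MHz base clock
MULT = 53           # 5.3 GHz single-core multiplier (from the post above)
AVX2_OFFSET = 3     # assumed offset bins, not Intel-published figures
AVX512_OFFSET = 5

def effective_mhz(mult, offset):
    """Clock the core actually runs at when the offset kicks in."""
    return (mult - offset) * BCLK

print(effective_mhz(MULT, 0))              # 5300 MHz, no AVX
print(effective_mhz(MULT, AVX2_OFFSET))    # 5000 MHz under AVX2
print(effective_mhz(MULT, AVX512_OFFSET))  # 4800 MHz under AVX-512
```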
 

dmens

Platinum Member
Mar 18, 2005
2,271
917
136
An NH-D15 was sufficient for 5GHz+ all-core on the 10900K.

No reason to suspect 5.3 won't be possible on the 11900K, considering it has two fewer cores. Stock is 5.3GHz on one core.

Obviously not talking about AVX-512 or even AVX2. I assume RKL will have an AVX offset for this purpose.

No reason at all, other than a different core architecture. That won't change anything at all.
 

podspi

Golden Member
Jan 11, 2011
1,965
71
91
Yup this is exactly what happens when I run Handbrake x265 on my Tiger Lake laptop with ASM:AVX512. It immediately shoots up to PL2 before tau kicks in and brings it down to PL1, with gradual downclocking till the encode is finished.
I too can provide a hint: I play RPCS3, which I believe uses AVX-512. Performance is TERRIBLE on a 1065G7. Did some investigating: it throttles immediately. Underclocking to <2GHz results in poor, but at least consistent, performance.

Sent from my KFMAWI using Tapatalk
 

tamz_msc

Diamond Member
Jan 5, 2017
3,821
3,643
136
I too can provide a hint: I play RPCS3, which I believe uses AVX-512. Performance is TERRIBLE on a 1065G7. Did some investigating: it throttles immediately. Underclocking to <2GHz results in poor, but at least consistent, performance.
Yeah, that's the problem with running AVX instructions in power-limited scenarios, though I suspect that if you're running off the iGPU, then that is where most of the power budget is being allocated. The Tiger Lake CPU in my laptop also downclocks to less than 2 GHz while playing games on the iGPU.
 

uzzi38

Platinum Member
Oct 16, 2019
2,636
5,985
146
I too can provide a hint: I play RPCS3, which I believe uses AVX-512. Performance is TERRIBLE on a 1065G7. Did some investigating: it throttles immediately. Underclocking to <2GHz results in poor, but at least consistent, performance.
Not sure it does, but RPCS3 can happily push all 8 threads you have available, with a relatively heavy load too. That's what's really hurting clocks for you more than any AVX-512 utilisation - just having to use all the cores at the same time.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
I was actually hoping to see a performance-per-watt improvement, albeit at lower clocks. From what we know, Intel's 10nm SuperFin doesn't clock higher than 14nm on the products we've seen thus far. That doesn't bode well for Alder Lake.

The main question is whether AMD can improve the supply of Ryzen 5000s. We are at least starting to see the Ryzen 5600X and 5800X staying in stock at major retailers, which does signal that supply is finally catching up to demand.
 

Mopetar

Diamond Member
Jan 31, 2011
7,848
6,015
136
I'm not really sure if they can improve the supply, at least not if both Sony and Microsoft have any kind of options for wafer priority. Their demand is through the roof, so they obviously want to manufacture as many consoles as possible, and the reports were that in the lead-up to launch they were estimated to be collectively using somewhere around 75% of the wafers that AMD had at TSMC.

The 5800X has generally been in stock the whole time because it's the least desirable of the CPUs. For gamers it doesn't offer that much more than the 5600X for its price, and for anyone who wants more cores, the 5900X isn't that much of a step up in price for what you get. Even if these newest chips from Intel aren't the best, they're still going to sell out just because they're good enough and actually available.
 

swilli89

Golden Member
Mar 23, 2010
1,558
1,181
136
I'm not really sure if they can improve the supply, at least not if both Sony and Microsoft have any kind of options for wafer priority. Their demand is through the roof, so they obviously want to manufacture as many consoles as possible, and the reports were that in the lead-up to launch they were estimated to be collectively using somewhere around 75% of the wafers that AMD had at TSMC.

The 5800X has generally been in stock the whole time because it's the least desirable of the CPUs. For gamers it doesn't offer that much more than the 5600X for its price, and for anyone who wants more cores, the 5900X isn't that much of a step up in price for what you get. Even if these newest chips from Intel aren't the best, they're still going to sell out just because they're good enough and actually available.

Console mix is definitely dropping off in favor of Zen chiplets and even some big Navi chips. Consoles are still eating up a large percentage, but I'm thinking AMD had contractual obligations front-loaded to the first several months of production.

I've been tracking shipments to MicroCenter specifically, and indeed it's gone from 2 or 3 Radeon 6000-series cards per week to 15+, and with Ryzen we're starting to see way, way more 5900X and 5950X in the last month compared to Q4. There were enough 5900Xs at my local MC to stay in stock for walk-in customers for almost two full days, which is saying something.