
Discussion i7-11700K preliminary results


podspi

Golden Member
Jan 11, 2011
1,954
53
91
We may not even need to go below 1080p if Nvidia keeps their hardware & driver model for their 4000 series.

Again, I'm not trying to rain on anyone's parade here, these discounted CPUs are a fine purchase as long as they come with proper expectations. There is obviously a price where even I would bite the bullet and get an unlocked 10 core Skylake, but I wouldn't do so hoping to get relevant flagship performance 2 years from now.
Hopefully the days of buying a flagship CPU and having it still deliver flagship performance 2+ years after purchase are over. I'd rather have rapid progress. I agree that the Zen 2 cores in the PS5 and Series X/S (not to mention the potatoes in the Nintendo Switch) will keep contemporary high-core-count CPUs relevant for a while.
 

Mopetar

Diamond Member
Jan 31, 2011
5,651
2,412
136
The performance may be good for the money NOW, but a lot of people keep the same computer for years. After, say, three years, they will be running 8-year-old technology. That is prehistoric in the computer world. If they waited for Alder Lake, their computer would not be a historical artifact after 3 years.
In this particular case we're talking about a 10850K. That's a 10-core CPU that can hit 5 GHz. It's not going to be an artifact after 3 years. At worst it might bottleneck a few games at 1080p but that's assuming it's being paired with a top end GPU.

If we were talking about picking up something like a 10300 maybe you'd have a point, but almost any quad core is eventually going to start feeling dated outside of lighter workloads like web browsing and casual office work.

Anyone who is really concerned about that could just get a Zen 3 CPU right now, but only if they're actually that worried about future performance. Besides, Skylake and its descendants are one of the most prolific architectures of all time and the massive user base likely means software will continue to support it or even optimize for it for some time to come.
 

DrMrLordX

Lifer
Apr 27, 2000
17,031
6,001
136
How soon do we really expect Alder Lake to reach significant levels of retail availability? Intel just released Rocket Lake; it's probably going to be their top-end CPU for at least 9 months.
 

Dave2150

Senior member
Jan 20, 2015
635
174
116
In this particular case we're talking about a 10850K. That's a 10-core CPU that can hit 5 GHz. It's not going to be an artifact after 3 years. At worst it might bottleneck a few games at 1080p but that's assuming it's being paired with a top end GPU.

If we were talking about picking up something like a 10300 maybe you'd have a point, but almost any quad core is eventually going to start feeling dated outside of lighter workloads like web browsing and casual office work.

Anyone who is really concerned about that could just get a Zen 3 CPU right now, but only if they're actually that worried about future performance. Besides, Skylake and its descendants are one of the most prolific architectures of all time and the massive user base likely means software will continue to support it or even optimize for it for some time to come.
A 10850K won't be useless after 3 years, but it's already old hat. It makes no sense, IMO, to buy into a PCIe Gen 3 platform on a very old architecture just because it's the lowest-common-denominator market configuration.

I believe the majority see it this way, hence why prices on Comet Lake are so low right now. Some will see that as a bargain, and that's perfectly fine. Others see how much use their tech equipment gets these days (covid era) and would prefer to enjoy the new architectures, technologies, faster transfers, more convenience, etc.
 

moinmoin

Platinum Member
Jun 1, 2017
2,369
2,941
106
I am impressed by Intel's ability to keep it relevant this long.
You're impressed that Intel squandered its half-decade process and design lead by continuously launching refreshes with a couple more MHz and cores? I mean, I'm impressed as well, but not in a good way. Now Intel is on somewhat equal footing at best (AMD) or has quite a gap to close at worst (Apple).
 

gdansk

Senior member
Feb 8, 2011
612
288
136
You're impressed that Intel squandered its half-decade process and design lead by continuously launching refreshes with a couple more MHz and cores? I mean, I'm impressed as well, but not in a good way. Now Intel is on somewhat equal footing at best (AMD) or has quite a gap to close at worst (Apple).
No, I'm more impressed by the fact that 14nm scaled up such that Skylake was competitive until Zen 3 launched. I don't describe their 10nm failures as impressive, but I thought the following sentence made it clear what I was talking about:
But then again, other companies wouldn't be in this situation, because they're fabless.
 

Hulk

Platinum Member
Oct 9, 1999
2,974
413
126
The performance may be good for the money NOW, but a lot of people keep the same computer for years. After, say, three years, they will be running 8-year-old technology. That is prehistoric in the computer world. If they waited for Alder Lake, their computer would not be a historical artifact after 3 years.
Thing is, the apps most people use (Word, Excel, PowerPoint, web browsing, Photoshop) aren't changing much, or at all, in terms of compute. They are pretty much topped out. It's not like 25 years ago, when the compute required for snappy software performance was far ahead of the hardware compute available.

SSDs helped a great deal as well, of course.
 

Dave2150

Senior member
Jan 20, 2015
635
174
116
Thing is, the apps most people use (Word, Excel, PowerPoint, web browsing, Photoshop) aren't changing much, or at all, in terms of compute. They are pretty much topped out. It's not like 25 years ago, when the compute required for snappy software performance was far ahead of the hardware compute available.

SSDs helped a great deal as well, of course.
I think you're a bit out of touch with reality if you believe those apps you mentioned are the most popular, or that people are buying i7s/i9s to power such apps... They're not.

The applications for which people still buy high-end CPUs are very much capable of using their horsepower/connectivity.
 

blckgrffn

Diamond Member
May 1, 2003
7,653
876
126
www.teamjuchems.com
I think you're a bit out of touch with reality if you believe those apps you mentioned are the most popular, or that people are buying i7s/i9s to power such apps... They're not.

The applications for which people still buy high-end CPUs are very much capable of using their horsepower/connectivity.
You're right. Chrome spanks the crap out of any CPU I use it on! :D

Really, I read this and laughed. Plenty of people are buying current i7 and i9 CPUs for "gaming" with tiny GPUs and probably spending way more time in a web browser than any other app, hands down. Many of them will use integrated graphics for their entire life spent plugged in and doing whatever.

If you are buying an i7/i9 CPU for any scientific or heavy lifting, non-gaming purpose then you are already doing it wrong. But most consumers don't read benchmarks, they buy what's available. And cheap. Cheap helps a lot.
 

Mopetar

Diamond Member
Jan 31, 2011
5,651
2,412
136
If it's an AVX workload then Intel may be the better option, but I trust anyone who falls into that niche to be aware of what they should be buying.
 

Dave2150

Senior member
Jan 20, 2015
635
174
116
You're right. Chrome spanks the crap out of any CPU I use it on! :D

Really, I read this and laughed. Plenty of people are buying current i7 and i9 CPUs for "gaming" with tiny GPUs and probably spending way more time in a web browser than any other app, hands down. Many of them will use integrated graphics for their entire life spent plugged in and doing whatever.

If you are buying an i7/i9 CPU for any scientific or heavy lifting, non-gaming purpose then you are already doing it wrong. But most consumers don't read benchmarks, they buy what's available. And cheap. Cheap helps a lot.
Instead of laughing out loud, I suggest you brush up on your English comprehension. 'Gaming' wasn't mentioned by the poster I quoted; they mentioned MS Office applications, web browsing and Photoshop.

Gaming is obviously one of the most popular use cases for a high-end CPU, as some games can actually benefit from a high-end CPU compared to a mid-range one.

You mention "scientific or heavy lifting, non-gaming purpose" software. I agree it's obvious, even to a small child, that such tasks are best suited either to cloud compute or to a Xeon/Threadripper-based workstation.

Again, brush up on your reading comprehension; it's a skill you'll find useful for keeping yourself out of embarrassing situations.
 

Hulk

Platinum Member
Oct 9, 1999
2,974
413
126
You're right. Chrome spanks the crap out of any CPU I use it on! :D

Really, I read this and laughed. Plenty of people are buying current i7 and i9 CPUs for "gaming" with tiny GPUs and probably spending way more time in a web browser than any other app, hands down. Many of them will use integrated graphics for their entire life spent plugged in and doing whatever.

If you are buying an i7/i9 CPU for any scientific or heavy lifting, non-gaming purpose then you are already doing it wrong. But most consumers don't read benchmarks, they buy what's available. And cheap. Cheap helps a lot.
Hi Dave,

I think "out of touch with reality" is a little harsh. I mean I can function in society, take care of my kids, etc.. I'm not the sharpest tool in the shed but I'm not out of touch with reality.

Anyway, what apps do you think most people are using? And of those, which wouldn't be suitable on an 8-core granddaddy Skylake CPU?

As far as games go, it's mostly GPU once you're over 4 cores.
 

blckgrffn

Diamond Member
May 1, 2003
7,653
876
126
www.teamjuchems.com
Instead of laughing out loud, I suggest you brush up on your English comprehension. 'Gaming' wasn't mentioned by the poster I quoted; they mentioned MS Office applications, web browsing and Photoshop.
Haha, OK. These CPUs will still be sold by the truckload to those users, especially the i7s.

I am not saying you have to like it. Two minutes on Google showed me models from major OEMs like Dell and HP selling 10700 PCs in their mainstream consumer product lines with no dedicated graphics card. Woo!

I still agree with @Hulk that the vast majority of these things are under-tasked.
 

TheELF

Diamond Member
Dec 22, 2012
3,238
384
126
This thing is almost literally a 10700K with more power draw. If they try to charge $400+ for this thing, I'll definitely be delivering more lolz to this thread than Intel delivers 11700k's to consumers.
It's about 7% faster in "productivity" with 100 MHz lower clocks and 10 W less power draw; it's about what one would expect from Intel.
Prices have been released and the 11700F is going to be $298, and with MCE enabled and the PL1/PL2 limits tweaked you would get higher performance than the stock 11700K that GN shows, while power would still be under control with the PL1/PL2 tweaks in place.
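For anyone curious what a PL1/PL2 tweak looks like in concrete numbers, here's a minimal sketch. It only converts chosen limits into the microwatt values the Linux intel-rapl powercap sysfs interface expects and prints the corresponding commands; the 125 W / 200 W figures and the `intel-rapl:0` path are illustrative assumptions, not Intel specs, and on desktop boards these limits are normally set in the BIOS instead.

```python
# Sketch: express hypothetical PL1 (sustained) and PL2 (burst) limits
# in the microwatt units used by Linux's intel-rapl powercap interface.

def rapl_microwatts(watts):
    """Convert a power limit in watts to the integer microwatt value
    that the powercap sysfs files expect."""
    return int(watts * 1_000_000)

# Hypothetical tuned limits for an 11700F: 125 W sustained, 200 W burst.
pl1_uw = rapl_microwatts(125)
pl2_uw = rapl_microwatts(200)

# constraint_0 is conventionally the long-term (PL1) limit,
# constraint_1 the short-term (PL2) limit.
print(f"echo {pl1_uw} > /sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw")
print(f"echo {pl2_uw} > /sys/class/powercap/intel-rapl:0/constraint_1_power_limit_uw")
```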
 

JoeRambo

Golden Member
Jun 13, 2013
1,049
837
136
Intel outdid themselves this time. They managed to increase performance where it does not matter (Cinebench and CPU-Z synthetics) and decrease performance where it does: gaming and general use like web browsing. It's not a coincidence that the tasks showing great gains are not memory-sensitive at all, while those that regress are.

A shame, really, to regress so much in L3 and memory latency. Did they forget that, unlike a certain other vendor, they have just 16MB of L3, and that ~4MB of it already holds inclusive copies of the lower cache levels of all cores on the chip? You can't afford AMD-like latencies without an AMD-sized cache.
One has to wonder how that happened. Actual memory speed increased from 2933 to 3200. How did the IMC get limited to ~3733 in 1:1 mode? Lots of questions that have nothing to do with the 14nm process or backporting, and much more to do with Intel's failures.
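To make the ~3733 ceiling concrete, a quick arithmetic sketch of what 1:1 (Gear 1) mode implies for the memory controller clock. This is just the DDR bookkeeping (two transfers per memory clock; Gear 2 halves the IMC clock again), not a statement about any specific chip's limits beyond the figures quoted above.

```python
# Sketch: DDR4 transfer rate vs. the IMC clock in Gear 1 / Gear 2.

def imc_clock_mhz(transfer_rate_mts, gear):
    """IMC clock in MHz for a given DDR4 transfer rate (MT/s) and gear mode."""
    memclk = transfer_rate_mts / 2  # DDR: two transfers per memory clock
    return memclk / gear            # Gear 2 runs the IMC at half the memory clock

# Rocket Lake's reported ~3733 MT/s Gear 1 ceiling puts the IMC near 1866 MHz,
# while stock DDR4-3200 in Gear 1 only needs 1600 MHz.
print(imc_clock_mhz(3733, 1))  # 1866.5
print(imc_clock_mhz(3200, 1))  # 1600.0
print(imc_clock_mhz(4000, 2))  # 1000.0 - Gear 2 relaxes the IMC requirement
```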
 

tamz_msc

Platinum Member
Jan 5, 2017
2,837
2,585
136
Prices have been released and the 11700F is going to be $298.
Prices at retail will be closer to $350 because of the current situation, and because $298 is the 1K-unit price.
with MCE enabled and the PL1/PL2 limits tweaked you would get higher performance than the stock 11700K that GN shows, while power would still be under control with the PL1/PL2 tweaks in place.
You sure about that? Consider the 10700F in the same situation:

- CPU tests: +10%
- Game tests at 720p: +2%
- Power consumption (Cinebench MT): +59%
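A back-of-the-envelope read of those 10700F deltas, as a sketch (illustrative arithmetic only, taking the quoted +10% CPU-test performance against +59% Cinebench MT power at face value):

```python
# Sketch: performance-per-watt implied by a simultaneous performance
# gain and power increase, both expressed as fractions.

def perf_per_watt_change(perf_gain, power_gain):
    """Relative performance-per-watt after both changes (1.0 = unchanged)."""
    return (1 + perf_gain) / (1 + power_gain)

ratio = perf_per_watt_change(0.10, 0.59)
print(f"{ratio:.2f}")  # 0.69 - roughly a 31% efficiency loss with limits removed
```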

 

jpiniero

Diamond Member
Oct 1, 2010
9,047
1,743
126
One has to wonder how that happened. Actual memory speed increased from 2933 to 3200. How did the IMC get limited to ~3733 in 1:1 mode? Lots of questions that have nothing to do with the 14nm process or backporting, and much more to do with Intel's failures.
Sunny Cove only supports LPDDR4-3733, so it's possible that Intel did something to the IMC.
 

coercitiv

Diamond Member
Jan 24, 2014
4,262
5,396
136
So it seems Rocket Lake benefits from an unlocked TDP even in gaming, where more power allows it to sit slightly above the competition. I would have liked to see power consumption data from a game, but as long as they benchmarked with the same BIOS and different power settings... power is the only variable. This is good news for the 11900K, since it shows there's potential here for a better-binned chip. It's also quite the warning sign for those hoping to see low power consumption in gaming.

PS: @Racan - forum rules require us to add a minimal comment when posting videos/pics/links etc
 

tamz_msc

Platinum Member
Jan 5, 2017
2,837
2,585
136
So it seems Rocket Lake benefits from an unlocked TDP even in gaming, where more power allows it to sit slightly above the competition. I would have liked to see power consumption data from a game, but as long as they benchmarked with the same BIOS and different power settings... power is the only variable. This is good news for the 11900K, since it shows there's potential here for a better-binned chip. It's also quite the warning sign for those hoping to see low power consumption in gaming.

PS: @Racan - forum rules require us to add a minimal comment when posting videos/pics/links etc
What would have been interesting to see is the i7-10700K with power limits disabled, as it has a 100 MHz higher all-core turbo than the i7-11700K. Of course, we haven't even gotten to memory overclocking, where preliminary accounts suggest Rocket Lake is at a definite disadvantage compared to Comet Lake.
 

IntelUser2000

Elite Member
Oct 14, 2003
7,337
1,994
136
I mean how long did it take from inception to a product? 2 years?

That's pretty much a straight backport with no optimization.

It's a backport to an older, worse process, and they rushed it. No wonder it's so bad. It also tells us how important process is, and it'll only get more important as shrinks get harder.

They had a presentation about Rocket Lake and the team that made it happen, and it sounded like they were pretty proud of it. Not sure they should have bothered, because there's nothing to be proud of in this product.

Yeah, sure, you got it done, but it's questionable whether it's better. It's a "we did it because we can" chip.

Good thing it didn't end up worse. Imagine this chip on mobile.
 
