Coffeelake thread, benchmarks, reviews, input, everything.


MarkPost

Senior member
Mar 1, 2017
378
794
136
Are you sure that you want to publicly post that 7.5% is not within 5.7% to 17.5%?

Single core: 8700K is 4.7 GHz turbo; 7800X is 4.0 GHz turbo. 17.5% faster.
Dual core: 8700K is 4.6 GHz turbo; 7800X is 4.0 GHz turbo. 15.0% faster.
Quad core: 8700K is 4.4 GHz turbo; 7800X is 4.0 GHz turbo. 10.0% faster.
You already gave the 6-core difference.
Base guaranteed speed: 8700K is 3.7 GHz; 7800X is 3.5 GHz. 5.7% faster.
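Those ratios check out; a quick way to verify them, as a Python sketch using the clocks listed above:

```python
# Verify the turbo-ratio percentages quoted above (clocks as listed).
turbo = {              # active cores -> (8700K GHz, 7800X GHz)
    "1 core":  (4.7, 4.0),
    "2 cores": (4.6, 4.0),
    "4 cores": (4.4, 4.0),
    "base":    (3.7, 3.5),
}
for label, (cfl, skx) in turbo.items():
    print(f"{label}: {(cfl / skx - 1) * 100:.1f}% faster")
# 1 core: 17.5%, 2 cores: 15.0%, 4 cores: 10.0%, base: 5.7%
```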

Do you have proof from Intel that there will be no more chips that can run on Z370?
I overclock my CPUs (not extremely, but I do OC), so yes, the frequency difference between the two once overclocked (about 5-7%) is too small to be a factor.

And about other chips, as far as I know, the reports say no. That's all we know, just like what happened with the 7700K.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Do you use a lot of mobile or low-power devices? I use a lot of HTPCs that use direct streams, which means my Plex server doesn't have to do much repackaging or re-encoding. I have been thinking about rebuilding my unRAID server with more potent hardware (it's an i5-4690K right now) to better support Plex, but I don't think I'd use it.

Yep. Rokus, Fire TV, iPads, Phones, Fire tablets, etc.
 

JimKiler

Diamond Member
Oct 10, 2002
3,561
206
106
Home software, for the most part, has gone backwards in requirements. It seems the hardware required to send 140 characters, scroll through photos, overlay a dog's nose on a photo, or toss birds at blocks is not that great.

Proper voice recognition is about the only thing most people could use that has a substantial hardware requirement, and Apple and Google generally handle that by sending your voice to a server and having the server do the crunching. Does Amazon do its voice recognition on a server as well (I haven't researched the Echo and its variants)?

Video editing or ripping is very CPU intensive. But other things did go backwards: Windows' requirements were reduced with Windows 7. My ACDSee photo editing software can use my GPU, but editing RAW photos is still intensive.
 

TheLycan

Member
Mar 8, 2017
34
11
36
I didn't confuse TDP with power (which is why I used the word "average"). CPUs can vastly exceed TDP for a short period of time. But on average, they cannot exceed TDP with the appropriate cooling and voltages.

With your link, don't confuse short-term maximum power use in one benchmark (what the link reported) with typical or average power use under load.
Why buy an i3 if it's kept at idle? Nobody cares about idle. But performance/watt under load matters, because we have one planet... and because of cooling, heat, noise, the electricity bill, and overclocking...
 

dullard

Elite Member
May 21, 2001
26,042
4,688
126
But performance/watt under load matters, because we have one planet... and because of cooling, heat, noise, the electricity bill, and overclocking...
Performance per watt under load IS exactly what matters; I agree with you there, and for the reasons you post.

What I disagree with is that the link graphs short-term maximum peak power under load. That is a pretty meaningless number. What you want is the AVERAGE power used under load and the amount of work done on average.
 

dullard

Elite Member
May 21, 2001
26,042
4,688
126
Video editing or ripping is very CPU intensive. But other things did go backwards: Windows' requirements were reduced with Windows 7. My ACDSee photo editing software can use my GPU, but editing RAW photos is still intensive.
Yes, but I still don't think that many people do video editing or ripping.
 

DrMrLordX

Lifer
Apr 27, 2000
22,937
13,021
136
AMD clearly is investing more in CPUs, and it makes sense: server CPUs have higher margins than GPUs. Especially with how they make EPYC, yields must be great. Given the lack of availability of Vega, either yields suck or AMD prefers to keep its fab capacity loaded producing Zeppelin dies.

I'm not so sure. You and I don't see the truckloads of GPU dice being tagged and sold for compute functions. Stop thinking about gaming and start thinking about direct mass-sales to AI research firms, neural net firms, etc.

A thread like the builders thread is where the exclusivity should be clearly enforced. This is a "benchmarks, reviews.......everything" thread, so if anything, leave those alone and do the comparisons here. My two cents.

Definitely not Intel's bidding; that goes without saying. Also, if you're not going to discuss the competition in a review thread, where is the appropriate place to do it? Just look at all the reviews. How many featured Coffee Lake alongside the competing chips?

Look, go view the carnage from too much discussion of "comparing the reviews". Or read them yourself and understand its place. Because we all love arguments that break out over whether such and such reviewer used the right speed of RAM on Ryzen . . . in a Coffeelake thread.

It missed the mark by two cores and a proper motherboard. If it had those two things, it would be the new 2600K. It's probably more like a 2500K: a good chip, but it will run out of grunt in 5 years.

I don't know that Z370 is really improper. Though Z390, today, would have been better in terms of feature set. The extra two cores probably won't make THAT big of a difference. I would only argue that Intel should have made this move with Broadwell or Skylake.

This still isn't an AMD thread.

Yes!

Start a comparison thread if you would like. I don't want the infighting in this thread. There's already advocates trying to derail it.

Gee, you noticed?
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
27,259
16,117
136
Look, go view the carnage from too much discussion of "comparing the reviews". Or read them yourself and understand its place. Because we all love arguments that break out over whether such and such reviewer used the right speed of RAM on Ryzen . . . in a Coffeelake thread.

If the thread is a review thread, and if it compares AMD to Intel CPUs, anything directly related to that review IS on-topic, even if it is about EITHER of the CPUs being compared. Granted, it seems that sometimes it goes beyond that, which is not good.
 
  • Like
Reactions: CatMerc and Drazick

TheLycan

Member
Mar 8, 2017
34
11
36
Performance per watt under load IS exactly what matters; I agree with you there, and for the reasons you post.

What I disagree with is that the link graphs short-term maximum peak power under load. That is a pretty meaningless number. What you want is the AVERAGE power used under load and the amount of work done on average.
Or gaming. Yes, I looked again and on average it's pretty much the same. I guess this is due to uneven utilization of all 6 cores, but anyway, it's good.
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
Outside of games, what interesting consumer software has made it to the PC in the last decade?

Sadly, I can't think of much.

Even with games, how many of the best games over the last 10 years have needed massive CPU power to run?

It strikes me that the advances have overwhelmingly been due to GPU improvements.

Since Half-Life 2, the only FPS that has grabbed my attention was the new Doom, and that wasn't particularly demanding on one's CPU.

BTW, I am not suggesting that a CPU from 10 years ago would be able to run everything that one could reasonably want today, but a 4-core Sandy Bridge at 3.5+ GHz most likely could.
 
Mar 10, 2006
11,715
2,012
126
Even with games, how many of the best games over the last 10 years have needed massive CPU power to run?

It strikes me that the advances have overwhelmingly been due to GPU improvements.

Since Half-Life 2, the only FPS that has grabbed my attention was the new Doom, and that wasn't particularly demanding on one's CPU.

BTW, I am not suggesting that a CPU from 10 years ago would be able to run everything that one could reasonably want today, but a 4-core Sandy Bridge at 3.5+ GHz most likely could.

Well if you game at a high refresh rate a fast CPU definitely helps, but if you're at 60 Hz, the case for a new CPU for most games just isn't really there if you're on SNB or newer.

I will freely admit that I buy new CPUs out of want, not need.
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
'Someone' posed a question about a year back asking what a mainstream consumer could do with a 16-core chip, or so. I remember thinking long and hard and couldn't provide a good answer. The moar-cores argument is approaching a hard wall at lightning speed.

I think once Intel releases an 8-core for the mainstream, the Core Wars will take a break for a while, and single-thread performance and performance/watt will become the main focus again.
 
  • Like
Reactions: CatMerc
Mar 10, 2006
11,715
2,012
126
I think once Intel releases an 8-core for the mainstream, the Core Wars will take a break for a while, and single-thread performance and performance/watt will become the main focus again.

Agreed. After 8 cores, it's just getting to the point of silliness.
 
  • Like
Reactions: CatMerc

majord

Senior member
Jul 26, 2015
509
711
136
The paper launch was a bad move on Intel's part. But you can't honestly come in here and say that up to 50% more processing capability (6 cores vs. 4) with just 4.4% more power used on average (95 W vs. 91 W) is not a spectacular engineering achievement. Especially considering that it was based on a fairly power-efficient mobile-based core to begin with.


Well, the reality is a bit different from the paper specs. Power consumption quite consistently appears to be a lot higher than the 7700K's in reviews. No one has done a proper analysis comparing KBL and CFL core power readings AFAIK, but I suspect you'll find the per-core difference is far, far less than 50%. More like 15 to 20% tops.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
Why buy an i3 if it's kept at idle? Nobody cares about idle. But performance/watt under load matters, because we have one planet... and because of cooling, heat, noise, the electricity bill, and overclocking...

Performance / Watt matters in real-world loads for actual tasks like playing a game, not so much in programs designed specifically to max out the CPU in artificial ways that don't match real work.
Well if you game at a high refresh rate a fast CPU definitely helps, but if you're at 60 Hz, the case for a new CPU for most games just isn't really there if you're on SNB or newer.

I will freely admit that I buy new CPUs out of want, not need.

I think it was Borderlands 1 that made me upgrade from my fast Core 2 Duo to a quad-core i5 2500 in 2011, but since I'm fine with 1080p and 60 FPS (and don't play "64 player Battlefield") it's still "good enough."

I'll be buying an i7-8700 out of want, but I could keep using the i5 for a bit longer if I needed to.
 
  • Like
Reactions: CatMerc and psolord

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
I think once Intel releases an 8-core for the mainstream, the Core Wars will take a break for a while, and single-thread performance and performance/watt will become the main focus again.

Yeah, probably for a year before 12C/24T Zen 2 lands. ST perf, MT perf, and perf/watt will remain important, but the weighting depends on individual consumer use cases. Looking forward to 2020, the next-gen console CPU core count will decide how many cores are sufficient.

Agreed. After 8 cores, it's just getting to the point of silliness.

Not really. You never know what the next-gen consoles are going to look like. 7nm Zen 2, coming in 2019, is likely to be a 12C/24T CPU at 95 W, running base speeds close to 4 GHz with a die size around 150-170 mm². Clock that CPU down to 2.4-2.5 GHz and it could draw just 35 W, leaving roughly 65 W for the GPU. I think it's possible we might see next-gen consoles with 12 Zen 2 cores clocked around 2.4 GHz. If that turns out to be the case, next-gen console game engines will be designed to scale to 12C/24T.

Anyway, I don't think the CPU core-count race is slowing down. In fact, I think improving ST perf significantly is going to get very hard, especially as the biggest challenge going forward at leading-edge nodes is performance. CFL might have the highest ST perf for more than one generation, depending on how well Icelake clocks, as Intel's process execution at 10nm has been very poor and there is no sign the issues are fixed at 10nm+.
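As a rough sanity check on the 35 W figure, the usual dynamic-power rule of thumb (power scaling with V^2 × f) lands in the same ballpark; a sketch with assumed, unsourced voltages:

```python
# All values are illustrative assumptions, not AMD specifications.
full_w, full_f, full_v = 95.0, 4.0, 1.20  # speculative 12C Zen 2 desktop point
cons_f, cons_v = 2.4, 0.90                # downclocked console operating point

# Dynamic CMOS power scales roughly with V^2 * f.
cons_w = full_w * (cons_v / full_v) ** 2 * (cons_f / full_f)
print(f"estimated console CPU power: {cons_w:.0f} W")  # ~32 W, near the 35 W claim
```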
 
Mar 10, 2006
11,715
2,012
126
Not really. You never know what the next-gen consoles are going to look like. 7nm Zen 2, coming in 2019, is likely to be a 12C/24T CPU at 95 W, running base speeds close to 4 GHz with a die size around 150-170 mm². Clock that CPU down to 2.4-2.5 GHz and it could draw just 35 W, leaving roughly 65 W for the GPU. I think it's possible we might see next-gen consoles with 12 Zen 2 cores clocked around 2.4 GHz. If that turns out to be the case, next-gen console game engines will be designed to scale to 12C/24T.

Anyway, I don't think the CPU core-count race is slowing down. In fact, I think improving ST perf significantly is going to get very hard, especially as the biggest challenge going forward at leading-edge nodes is performance. CFL might have the highest ST perf for more than one generation, depending on how well Icelake clocks, as Intel's process execution at 10nm has been very poor and there is no sign the issues are fixed at 10nm+.

Current-gen consoles have eight cores, and still, to this day, eight cores for gaming on PC is mostly useless. The 7700K still smashes all the HEDT chips from both Intel and AMD in many, if not most, games.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,476
136
Current-gen consoles have eight cores, and still, to this day, eight cores for gaming on PC is mostly useless. The 7700K still smashes all the HEDT chips from both Intel and AMD in many, if not most, games.

The key here is that current-gen consoles have only 8 threads, as Jaguar does not support SMT. Incidentally, the 7700K (like every mainstream quad-core i7 since the 2600K) supports 8 threads. Moreover, Jaguar is a very small core, comparable only to Atom. The current-gen consoles were designed when AMD had only Bulldozer and Jaguar to offer, and it was a no-brainer to avoid Bulldozer like the plague.

The next-gen consoles will probably sport a Zen 2 core with SMT. How many cores and what clocks is still not known, but if Sony or Microsoft wanted, they could get 12 Zen 2 cores clocked at 2.2-2.4 GHz within 35 W on GF 7LP or TSMC N7. There is no comparison here with Jaguar; the next-gen consoles will be extremely powerful in terms of CPU horsepower. One thing is sure: the next-gen consoles will provide a gameplay experience rivaling high-end PCs, as they will not be CPU limited and will run close-to-metal APIs and extremely optimized OS/driver stacks.
 

Dufus

Senior member
Sep 20, 2010
675
119
101
But you can't honestly come in here and say that up to 50% more processing capability (6 cores vs. 4) with just 4.4% more power used on average (95 W vs. 91 W) is not a spectacular engineering achievement.


Go to Intel's specifications on ARK and click on the little question mark next to "Processor Base Frequency". There you will get Intel's definition of TDP:
Processor Base Frequency
Processor Base Frequency describes the rate at which the processor's transistors open and close. The processor base frequency is the operating point where TDP is defined. Frequency is measured in gigahertz (GHz), or billion cycles per second.
I've bolded the pertinent part.

The 8700K has a base frequency of 3.7 GHz vs. the 7700K's 4.2 GHz, so it should be obvious that you should not compare the two the way you have done.

If we were to estimate dynamic power using a 25 mV step per 100 MHz bin and, for simplicity's sake, 3.7 GHz at 1.00 V, then the increase in dynamic power at 4.2 GHz would be (1.125/1.00)^2 * 4.2/3.7 = 1.436.

95 W * 1.436 / 91 W = 1.5, or a 50% increase in power. Of course, real values would be required for a proper comparison, and static power should also be taken into consideration, but it should be obvious it's nowhere near 4.4%.
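A minimal sketch of that back-of-the-envelope calculation, keeping the same assumptions (1.00 V at 3.7 GHz, 25 mV per 100 MHz bin; illustrative values, not measurements):

```python
# Dynamic CMOS power scales roughly with V^2 * f.
base_f, base_v = 3.7, 1.00                # 8700K base clock, assumed voltage
target_f = 4.2                            # 7700K base clock
bins = round((target_f - base_f) / 0.1)   # 5 bins of 100 MHz
target_v = base_v + bins * 0.025          # 1.125 V at 25 mV per bin

scale = (target_v / base_v) ** 2 * (target_f / base_f)
print(f"dynamic power scale: {scale:.3f}")            # ~1.44, the post's 1.436

# Put the 95 W (8700K) and 91 W (7700K) TDPs on an equal-frequency footing:
print(f"effective increase: {95 * scale / 91:.2f}x")  # ~1.50, i.e. ~50%
```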
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
All these new builds that everyone has been doing have inspired me to do a mid-life update to my Ivy Bridge.

8GB of DDR3 RAM (giving me 16GB in total), a 500GB Samsung 850 Evo SSD (new OS drive) & a 6TB Hitachi hard drive will be added to my system in the weeks ahead. :p
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
Well, the reality is a bit different from the paper specs. Power consumption quite consistently appears to be a lot higher than the 7700K's in reviews. No one has done a proper analysis comparing KBL and CFL core power readings AFAIK, but I suspect you'll find the per-core difference is far, far less than 50%. More like 15 to 20% tops.
GamersNexus looked at CPU power draw:
[Image: 8700k-power-draw-blender.png]

It's definitely a bit more efficient than the 7700K per core, though nowhere near 50%, as you said. If power scaled linearly with cores, then a hypothetical 6C 7700K would draw approximately 110 W, compared to 95 W on the 8700K.
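Working that estimate through (a sketch: the 7700K's own Blender reading isn't restated here, so the ~73 W below is back-solved from the quoted 110 W figure and should be treated as an assumption):

```python
# Linear per-core scaling estimate from the GamersNexus Blender numbers.
kbl_4c_w = 73.0              # assumed 7700K (4C) draw, back-solved as 110 * 4/6
kbl_6c_w = kbl_4c_w * 6 / 4  # hypothetical 6-core Kaby Lake, scaled linearly
cfl_6c_w = 95.0              # 8700K (6C) figure quoted above

print(f"hypothetical 6C KBL: {kbl_6c_w:.0f} W vs 8700K: {cfl_6c_w:.0f} W")
print(f"per-core efficiency edge: {(1 - cfl_6c_w / kbl_6c_w) * 100:.0f}%")  # ~13%
```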
 
  • Like
Reactions: Arachnotronic

coercitiv

Diamond Member
Jan 24, 2014
7,374
17,480
136
It's definitely a bit more efficient than the 7700K per core, though nowhere near 50%, as you said. If power scaled linearly with cores, then a hypothetical 6C 7700K would draw approximately 110 W, compared to 95 W on the 8700K.
If we apply the same reasoning to the CB15 power test from the same review, the 8700K does 91 W while the hypothetical 6C/12T KBL would do 95 W. Shave 100 MHz off the KBL to drop from 4.4 to 4.3 GHz and they would be even.

[Image: 8700k-power-draw-cinebench-nt.png]
 
  • Like
Reactions: Gikaseixas

stockwiz

Senior member
Sep 8, 2013
403
15
81
All these new builds that everyone has been doing have inspired me to do a mid-life update to my Ivy Bridge.

8GB of DDR3 RAM (giving me 16GB in total), a 500GB Samsung 850 Evo SSD (new OS drive) & a 6TB Hitachi hard drive will be added to my system in the weeks ahead. :p
I've decided to keep my 2600K @ 4.5 rig for a while longer. Combined with sky-high DDR4 prices that have almost doubled, and benches that show I don't really need to upgrade yet, why bother... spending $350 on RAM kind of kills any savings that come from the cheaper mainstream platform. I'll wait for the 8-core variant to be released on the mainstream platform, and hopefully by then RAM prices will have come back to earth. I don't want to upgrade the video card until Volta anyway. I admit I'm tempted, and this is as good a time as any for Sandy and Ivy Bridge users to upgrade, however...
 
  • Like
Reactions: CHADBOGA

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
If we apply the same reasoning to the CB15 power test from the same review, the 8700K does 91 W while the hypothetical 6C/12T KBL would do 95 W. Shave 100 MHz off the KBL to drop from 4.4 to 4.3 GHz and they would be even.

[Image: 8700k-power-draw-cinebench-nt.png]
Fair point. I used Blender because it draws the most power and seems to stress the cores the most, as evidenced by their MCE-enabled 8700K @ 4.7 GHz being unstable in Blender whilst being stable in everything else.