Intel Skylake / Kaby Lake

BigDaveX

Senior member
Jun 12, 2014
440
216
116
Overall power use is probably not that bad considering that it gets the job done faster and returns to idle sooner.

Tech Report's review looked at that; while SKL-X's power usage is high during the benchmark itself, in power efficiency terms it's actually middle of the pack.
 

coercitiv

Diamond Member
Jan 24, 2014
6,151
11,684
136
Still, if the 7900X is really up by that much, then there's something very wrong.
There's nothing "very wrong" with the power consumption: if you calculate the power delta between idle and load and take losses into consideration, you still end up close to TDP.

What's strange is that, compared with previous-generation SKUs, it shows Intel allowed SKL-X to draw more power by clocking higher, which they probably could have done with Broadwell as well.
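
To spell out the napkin math, here's a rough sketch of that idle-to-load delta calculation; the wall readings and efficiency figures in it are made-up assumptions for illustration, not numbers from any review:

```python
# Back-of-the-envelope estimate of CPU package power from wall readings:
# take the idle-to-load delta at the socket and back out conversion losses.
# The efficiency figures and wall readings below are assumptions for
# illustration, not measurements from any review.

def estimate_package_power(wall_idle_w, wall_load_w,
                           psu_efficiency=0.90, vrm_efficiency=0.90):
    """Estimate the extra DC power the CPU pulls under load."""
    delta_at_wall = wall_load_w - wall_idle_w        # extra AC power drawn
    after_psu = delta_at_wall * psu_efficiency       # strip PSU losses
    return after_psu * vrm_efficiency                # strip board/VRM losses

# Hypothetical wall readings: 60 W idle, 245 W during Cinebench.
print(round(estimate_package_power(60, 245), 1))     # ~149.9 W, i.e. close to TDP
```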
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
coercitiv said:
There's nothing "very wrong" with the power consumption: if you calculate the power delta between idle and load and take losses into consideration, you still end up close to TDP.

What's strange is that, compared with previous-generation SKUs, it shows Intel allowed SKL-X to draw more power by clocking higher, which they probably could have done with Broadwell as well.
So the 6900K with 8 cores consuming more than the 6950X with 10 cores in CB R15 is normal?
 

R0H1T

Platinum Member
Jan 12, 2013
2,582
162
106
BigDaveX said:
Tech Report's review looked at that; while SKL-X's power usage is high during the benchmark itself, in power efficiency terms it's actually middle of the pack.
Funnily enough, the 1700 doesn't feature in it; somehow tech sites are afraid to show how efficient it is, unlike in the past ~
[Chart: http://media.bestofmicro.com/O/P/450601/original/efficiency.png]

[Chart: taskenergy.png]
 

krumme

Diamond Member
Oct 9, 2009
5,952
1,585
136
The packaging simply doesn't match the power consumption. Period. It's screaming.

I will eat my hat if Intel didn't opt to get ~150MHz extra out of those CPUs vs what was planned. IMO they did wrong. It's not the way to go with a valuable brand. It's something AMD does (P10) and it's short-sighted.
 

ManyThreads

Member
Mar 6, 2017
99
29
51
For a 7820X, what do you guys think is the go-to ASUS mobo? I will be OC'ing, but I don't care about dual LAN, PCIe RAID, or LED lights. I am going to buy Asus again as I like their boards.

All 3 of these are roughly the same price for me in Canada:

TUF Mk 1
Strix Gaming
Prime (Not Deluxe)

I looked at the TUF Mk 2, but it seems to have a lower-phase-count (7-phase) VRM than the Mk 1.

Any obvious choice there, perhaps in terms of VRM quality and overall component quality? I am using the comparison tools on Asus' website, but I am pre-ordering, so I need to buy mostly off the spec sheet. TIA
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
These reviews don't look good for Intel at all. Poor performance in games, worse performance/$ compared to AMD, bad thermal performance, and much higher power consumption.

Who is this for, then? Is there a large market for people who don't play games, don't care about the price, and don't care about power consumption?
Didn't the reviewers say gaming performance was lower due to BIOS problems?
 
  • Like
Reactions: Drazick and Phynaz

Edrick

Golden Member
Feb 18, 2010
1,939
230
106
ManyThreads said:
For a 7820X, what do you guys think is the go-to ASUS mobo? I will be OC'ing, but I don't care about dual LAN, PCIe RAID, or LED lights. I am going to buy Asus again as I like their boards.

All 3 of these are roughly the same price for me in Canada:

TUF Mk 1
Strix Gaming
Prime (Not Deluxe)

I looked at the TUF Mk 2, but it seems to have a lower-phase-count (7-phase) VRM than the Mk 1.

Any obvious choice there, perhaps in terms of VRM quality and overall component quality? I am using the comparison tools on Asus' website, but I am pre-ordering, so I need to buy mostly off the spec sheet. TIA

I'm looking at the TUF Mk 1 myself. Strange that Gigabyte only has 3 gaming boards for X299 currently.
 
  • Like
Reactions: ManyThreads

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
It's Cinebench R15 and X299 + 7900X is using around 50% more power than X99 + 6950X for around 16% more performance.

I can't help but feel we could have had quite a bit more performance on Broadwell-E as well if Intel had been just as relaxed on power consumption.


You mean a real world renderer is something people don't use?
If it renders a lot faster, then overall it's not using that much more power, is it? Tom's touched on that with the sledgehammer line.

If I use 150 watts for two hours to get a job done, that's more energy (300 Wh) than using 200 watts for one hour (200 Wh), even though the 200W number looks bad next to the 150W number on the spec sheet.
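
Put differently, what matters is the energy for the whole job, not the peak wattage. A quick bit of arithmetic with those same made-up numbers:

```python
# Energy for a fixed job is power multiplied by time, so a higher peak
# wattage can still mean less total energy if the job finishes sooner.
# The 150 W / 200 W figures are the hypothetical ones from the post above.

def task_energy_wh(power_w, hours):
    """Total energy consumed for the task, in watt-hours."""
    return power_w * hours

slower_chip = task_energy_wh(power_w=150, hours=2.0)   # 300 Wh
faster_chip = task_energy_wh(power_w=200, hours=1.0)   # 200 Wh

print(slower_chip, faster_chip)   # 300.0 200.0 -> the "hungrier" chip uses less energy
```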
 
  • Like
Reactions: Drazick

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
Significant difference depending on motherboard, BIOS + RAM kit used:

TweakTown said:
Since many of you asked, I have upgraded my GPU from the GTX 980 to a GTX 1080 Ti. I have three motherboards on hand and one new memory kit. The motherboards were all used: one for overclocking, one for out-of-the-box performance, and one for Intel-optimized performance (correctly specified, with Turbo Boost Max 3.0 and 2666MHz memory).

...Intel Optimized is according to Intel spec and not considered overclocking; it's what you will score near out of the box. The other results are from earlier BIOS versions.

Read more: http://www.tweaktown.com/reviews/8225/intel-core-i9-7900x-series-skylake-cpu-review/index4.html

[Charts: 8225_38, 8225_39, 8225_40 - intel-core-i9-7900x-series-skylake-cpu-review.png]


http://www.tweaktown.com/reviews/8225/intel-core-i9-7900x-series-skylake-cpu-review/index3.html
 
Last edited:
  • Like
Reactions: vissarix
Mar 10, 2006
11,715
2,012
126
1000 bucks + at least another 250-300 for an X299 board... what a deal!

Agreed, just pre-ordered mine this morning. Solid value -- no real sacrifice in ST perf compared to my 7700K and I get a fun new platform with lots of cores and upgradeability.

Not for everyone, but it'll make me happy and I'm sure some others, too.
 
  • Like
Reactions: vissarix and Sweepr

tamz_msc

Diamond Member
Jan 5, 2017
3,708
3,554
136
Would like to see power consumption numbers while doing more real-world tests like compiling and encoding.

BTW it was a really crappy move from Intel to not supply European websites with review samples.
 
Last edited:
  • Like
Reactions: Drazick

coercitiv

Diamond Member
Jan 24, 2014
6,151
11,684
136
LTC8K6 said:
If it renders a lot faster, then overall it's not using that much more power, is it? Tom's touched on that with the sledgehammer line.

Please read my entire post. I'm not complaining about the absolute power consumption of SKL-X; I'm observing the power delta between SKL-X and BDW-E and concluding Intel was "conservative" with previous gen turbo clocks.

As far as I can tell, using Tom's Hardware numbers from the OC test, 7900X should be within TDP numbers at stock clocks.

[Chart: http://media.bestofmicro.com/I/R/684963/original/10-Cinebench-Multi-Core.png]
 

pantsaregood

Senior member
Feb 13, 2011
993
37
91
No, I don't think seeing 14nm+ hitting 5.0 GHz is "a bit of a stretch." 14nm+ has been able to hit 5.0 GHz fairly frequently with four cores. These CPUs will run at higher temperatures due to having more cores/higher power consumption, but they're primarily being limited by the fact Intel didn't solder them.

Take note that 1.3V is running CPUs directly into a thermal ceiling with Skylake-X; they're not brickwalling like Broadwell-E did, they're just limited by thermals. If anyone recalls, Skylake and Kaby Lake both scale quite well all the way up to 1.45V.
 

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
LTC8K6 said:
Didn't the reviewers say gaming performance was lower due to BIOS problems?

It could be due to a BIOS problem or the different cache structure. Who knows? It's kind of telling that most sites don't even show any gaming charts. Looks like they weren't allowed to.
 

Zucker2k

Golden Member
Feb 15, 2006
1,810
1,159
136
According to the OC3D guy, Intel got pissed about Hexus and BitTech breaking NDA and therefore did not send out chips to most reviewers.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
AT have updated with power numbers. They exceed the TDP. And notice the difference versus BDW-E.

"But for the 140W Skylake-X parts, we recorded nearly 150W power consumption. Intel announced that the socket is suitable up to 165W, so it’s clear that they are pushing the frequencies here and it is going to be telling what might happen with the higher core count silicon."

Those power numbers mean the HCC SKUs (especially the 16- and 18-core SKUs) are going to have some real difficulty overclocking to 4 GHz across all cores, given that you are going to hit thermal and power limits.
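
To put some very rough numbers on that, here's a sketch using the textbook dynamic-power approximation (power scaling with core count, frequency and the square of voltage); every baseline figure below is an assumption for illustration, not a measurement:

```python
# Crude scaling of CPU power with core count, frequency and voltage using the
# dynamic-power approximation P ~ n_cores * f * V^2. All baseline numbers are
# assumptions for illustration only, not measurements.

def scale_power(base_power_w, base_cores, base_ghz, base_volts,
                cores, ghz, volts):
    """Scale a known power figure to a hypothetical core/clock/voltage config."""
    return (base_power_w
            * (cores / base_cores)
            * (ghz / base_ghz)
            * (volts / base_volts) ** 2)

# Assumed baseline: 10-core 7900X at ~150 W, 4.0 GHz all-core, ~1.00 V.
# Hypothetical: an 18-core part pushed to 4.0 GHz all-core at ~1.10 V.
print(round(scale_power(150, 10, 4.0, 1.00, 18, 4.0, 1.10)))   # ~327 W
```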
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
LTC8K6 said:
Didn't the reviewers say gaming performance was lower due to BIOS problems?

Yes, according to PCGamesHardware.de. It's one of the reasons they didn't include gaming benchmarks in their preview:

PCGamesHardware.de said:
Shortly before the NDA lifted, the motherboard manufacturers sent us new BIOS versions which improve performance significantly. Before that, the results were far from consistent, which showed up, among other things, in poorer gaming performance from the Core i9-7900X compared to the previous Core i7-6950X.
 
Mar 10, 2006
11,715
2,012
126
coercitiv said:
Please read my entire post. I'm not complaining about the absolute power consumption of SKL-X; I'm observing the power delta between SKL-X and BDW-E and concluding Intel was "conservative" with previous gen turbo clocks.

As far as I can tell, using Tom's Hardware numbers from the OC test, 7900X should be within TDP numbers at stock clocks.

[Chart: http://media.bestofmicro.com/I/R/684963/original/10-Cinebench-Multi-Core.png]

coercitiv, they were conservative with BDW-E clocks out of the box because they didn't have any reason to bin more aggressively. Binning for higher out-of-the-box frequencies means that fewer chips will pass validation, lowering the effective yield and thus increasing chip manufacturing cost.

With Ryzen/AMD around, Intel is necessarily more aggressive with binning because it needs to be to stay competitive and keep its average selling prices afloat (they are already down quite a lot core-for-core from BDW-E to SKL-X, thanks to Ryzen).
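
As a toy illustration of the binning point (all pass rates and costs below are made up, and in reality dies that miss a bin get sold as lower SKUs rather than scrapped, but the direction of the cost effect is the same):

```python
# Toy model of the binning argument: if fewer dies meet a more aggressive
# frequency bin, the wafer cost is spread over fewer sellable chips.
# All numbers are made up for illustration.

def cost_per_binned_die(wafer_cost, dies_per_wafer, bin_pass_rate):
    """Wafer cost divided across the dies that actually meet the target bin."""
    return wafer_cost / (dies_per_wafer * bin_pass_rate)

relaxed_bin = cost_per_binned_die(8000, 200, 0.80)      # conservative clocks
aggressive_bin = cost_per_binned_die(8000, 200, 0.60)   # pushed clocks

print(round(relaxed_bin, 2), round(aggressive_bin, 2))  # 50.0 vs 66.67
```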
 
  • Like
Reactions: coercitiv