Intel Skylake / Kaby Lake


Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
For the first part, Anandtech recorded and correlated the following data: power state distribution, frequency distribution, and run-queue depth - which is an exact representation of the number of threads running through the system. Not only did Anandtech measure the right thing, they did so in a comprehensive manner to give us a detailed representation of how the big.LITTLE SOC behaves under Android. Please take your time and read through the article, it really is worth the read.
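(As an aside, the run-queue depth AT measured is the same quantity Linux itself exposes. A minimal sketch, assuming a Linux/Android system — the fourth field of /proc/loadavg is "runnable/total" scheduler entities; the sample line below is hypothetical:)

```python
# Minimal sketch (not AT's actual tooling): parse run-queue depth from a
# /proc/loadavg line, whose fourth field is "runnable/total" entities.

def parse_runqueue(loadavg_line: str):
    """Return (runnable, total) thread counts from a /proc/loadavg line."""
    runnable, total = loadavg_line.split()[3].split("/")
    return int(runnable), int(total)

# Example with a made-up loadavg line:
print(parse_runqueue("0.52 0.41 0.30 2/1234 5678"))  # (2, 1234)
```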

They measured whether the system doing a given task could use those resources, but they never measured whether it was advantageous to use those resources in the way they were used. (The premise of big.LITTLE is that it is, but this has NOT been demonstrated.) In other words, there is no data to show that those cores are doing anything useful.

For instance, loading a webpage http://www.anandtech.com/show/9518/the-mobile-cpu-corecount-debate/2

What is interesting to see here is that even though it's mostly just 1 large thread that requires performance on the big cores, most of the other cores still have some sort of activity on them which causes them to not be able to fall back into their power-collapse state. As a result, we see them stay within the low-residency clock-gated state.

In other words, the other high-power cores are not being shut off despite there being little need for them. Is this power and performance efficient? We don't know - the article never discusses it. Multiple big cores are also used while scrolling. Is this needed? Probably not, much less multiple big cores - you could power off the big cores entirely and simply use a couple of the small cores for tasks like scrolling.

AT's article demonstrates that multiple cores are used for everyday tasks but they have not demonstrated that they are in any way, shape, or form needed.

There is no control group.

I'm not saying this article is useless. AT did disprove the idea that in big.LITTLE CPUs the extra cores sit completely idle, but they did not show that big.LITTLE (or even having that many cores) was in any way advantageous.

For the second part, it seems to me you imply that optimised multithreaded software running on many cores needs proof of efficiency (perf, power, or both). No offense, but for you to claim that in a scenario where the browser can use up to 6-8 threads, using the additional available cluster of power-optimised cores does not yield additional efficiency gains is a bit much. You're entitled to your opinion though, maybe we'll get the chance to compare results in a test with the little cores disabled, since that's the only way we can maintain data consistency (keep software and CPU arch/process the same).

Many programs will use quite a few threads - that doesn't mean that many cores are needed. A game like Battlefield will use dozens of threads, but because most of those threads are very low usage there is no gain to using a CPU with more than about 6-8 cores. All those low-usage threads can be thrown onto a single core, and even then that core is barely active.
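To put a rough number on that point - a toy sketch with invented per-thread loads (nothing here is measured data): total CPU demand is just the sum of per-thread utilizations, so dozens of mostly-idle threads collapse onto a handful of cores.

```python
import math

# Hypothetical per-thread CPU utilization (fractions of one core, made up
# for illustration): a few busy game threads plus 30 near-idle workers.
thread_load = [0.9, 0.8, 0.7, 0.6] + [0.02] * 30

threads = len(thread_load)            # 34 threads in flight
total_demand = sum(thread_load)       # ~3.6 core-equivalents of actual work
min_cores = math.ceil(total_demand)   # 4 cores cover it (ignoring latency)

print(f"{threads} threads need only ~{min_cores} cores")
```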
 

Maxima1

Diamond Member
Jan 15, 2013
3,515
756
146
Upcoming games will have far better MT support due to DX12/Vulkan, making very good use of all available cores, and thus lowering CPU power consumption (more cores at lower frequency) and allowing more TDP budget for the GPU. More GPU TDP -> more performance in the same form factor.
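For what it's worth, the "more cores at lower frequency" claim follows from the usual dynamic-power scaling, P ~ cores · f · V², with voltage rising with frequency. A toy sketch - the voltage/frequency curve below is invented, not any real chip's:

```python
# Toy dynamic-power model, P ~ cores * f * V^2, with an invented linear
# voltage/frequency curve. Not real silicon data - just the scaling argument.

def power(cores, f_ghz):
    v = 0.6 + 0.2 * f_ghz         # hypothetical V/f curve (volts)
    return cores * f_ghz * v * v  # arbitrary units

# Two configs with equal throughput (cores x GHz = 8):
p_few_fast = power(2, 4.0)   # 2 cores at 4.0 GHz
p_many_slow = power(8, 1.0)  # 8 cores at 1.0 GHz

print(f"2C@4GHz: {p_few_fast:.2f}  8C@1GHz: {p_many_slow:.2f}")
```

Same throughput, roughly a third of the power in the many-slow-cores configuration - which is the TDP headroom the post is talking about.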

I believe the 1060 mobile's TDP is only ~5 W or so more than the 965M's. So it's still the status quo basically, and in some ways lower TDP (i.e. AMD's 35 W TDP Polaris). The 1050 is basically the new 960M. The entry gaming laptops use standard-voltage CPUs, but with the 1050. Even with the 1060, the 6C doesn't bring much. And it's amusing looking at the power of the laptop CPUs vs. the new PS4 given the power of their GPUs.

As for the GPU upgrade problem in a notebook, it's a weak argument: notebooks come with their own unique set of compromises; they always have. Whether you think a gaming notebook is worth the investment or not is not the topic.

No, you're ignoring the obvious. It's better to have a weaker CPU and a stronger GPU in a laptop. The only difference the 6C would make is possibly in some strategy games. Thinking as an OEM, it's far better to mismatch and upsell the 1060 GPU in models with the 6C i7. They get more money from the consumer even though it would become obsolete just as fast. Many features in laptops are segmented like this even though the cost difference can be so little.
 


Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
Skylake performance on some of the latest titles:

[benchmark charts: o_proz.jpg, dv2_proz.png, dex_proz_2.jpg]



Kaby Lake 35W shows up at SiSoftware:


Intel(R) Core(TM) i7-7700T CPU @ 2.90GHz (4C 8T 801MHz/3.8GHz, 800MHz IMC/3.5GHz, 4x 256kB L2, 8MB L3)

Intel(R) Core(TM) i5-7400T CPU @ 2.40GHz (4C 3GHz, 2.7GHz IMC, 4x 256kB L2, 6MB L3)


Dell updates XPS 13 with Kaby Lake, 22 hour battery life

The biggest upgrade to the new XPS 13 is, as you might expect, Intel's Kaby Lake processors. The XPS 13 will come in three variants, packing either the Core i3-7100U, Core i5-7200U, or Core i7-7500U depending on the price point. All of these CPUs are 15W low-power parts with respectable performance and efficiency gains over Skylake.

With the upgrade to Kaby Lake and a slightly larger 60 Wh battery, the 2016 XPS 13 boasts up to 22 hours of battery life if you opt for the 1080p panel. As with previous models, the QHD+ variant comes with a significant battery penalty: Dell only claims 13 hours of life for the high-resolution model.

http://www.techspot.com/news/66349-dell-updates-xps-13-kaby-lake.html


HP launches Kaby Lake powered ProBook 400 G4 series business laptops for $499 and up

The model that interests me most is the HP ProBook 430 G4, which is a 3.3 pound laptop with a 13.3 inch display. This model has a starting price of $599 with a Core i3-7100U processor and a 1366 x 768 pixel display, but the laptop is also available with up to a Core i7-7500U processor and an optional 1920 x 1080 pixel display (or an optional touchscreen).

http://liliputing.com/2016/09/hp-launches-probook-400-g4-series-business-laptops-for-499-and-up.html
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
I think we are down to tiny variances in a GPU limited case. A few more runs may have changed it.

Perhaps. Obduction produced some . . . peculiar results. The 6600 just sort of jumped out at me, especially with the min fps.
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
Good point from the comments on Dell XPS 13: No Iris graphics. I wonder if iPhone 7 now has 2x or 3x or 4x better graphics performance than those very expensive ultrabooks.
 
Mar 10, 2006
11,715
2,012
126
Good point from the comments on Dell XPS 13: No Iris graphics. I wonder if iPhone 7 now has 2x or 3x or 4x better graphics performance than those very expensive ultrabooks.

people play more games on phones than they do on ultrabooks :p
 

Maxima1

Diamond Member
Jan 15, 2013
3,515
756
146
people play more games on phones than they do on ultrabooks :p

You might as well say you don't need a 15 W TDP i5 or i7 either.

I've seen a lot of people complain they can't play _____ game because of the laptop their mommy bought, btw. It's also pretty crappy for a 4K screen, and an actual upgrade for the iGPU is much better than the lame advertising Intel is doing with Kaby Lake: "better battery life watching 4k youtube videos111!!!1"
 
Mar 10, 2006
11,715
2,012
126
You might as well say you don't need a 15 W TDP i5 or i7 either.

I've seen a lot of people complain they can't play _____ game because of the laptop their mommy bought, btw. It's also pretty crappy for a 4K screen, and an actual upgrade for the iGPU is much better than the lame advertising Intel is doing with Kaby Lake: "better battery life watching 4k youtube videos111!!!1"

sure, Coffee Lake should do the trick for you I guess. up to 4 cores +GT3e in a low power envelope. Cannon Lake cores and Gen10 GPU no less :)
 

jpiniero

Lifer
Oct 1, 2010
14,510
5,159
136
sure, Coffee Lake should do the trick for you I guess. up to 4 cores +GT3e in a low power envelope. Cannon Lake cores and Gen10 GPU no less :)

A laptop with that 15/28 W 4+3e U chip is going to be very difficult to find, based upon how few 2+3e U Skylake machines are out there. I don't know why Intel's bothering. It'd be pretty nice though.
 

Maxima1

Diamond Member
Jan 15, 2013
3,515
756
146
sure, Coffee Lake should do the trick for you I guess. up to 4 cores +GT3e in a low power envelope. Cannon Lake cores and Gen10 GPU no less :)

I probably won't even bother combining a dGPU with a laptop anymore. lmao

I thought the recent slide said GT2? Or are you talking about the U CPUs? If so, I seriously doubt they'll be used in any laptop but maybe a few expensive models released late in the cycle. What they'll really use is whatever is supposed to be the successor of the 7200/7500.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
Good point from the comments on Dell XPS 13: No Iris graphics. I wonder if iPhone 7 now has 2x or 3x or 4x better graphics performance than those very expensive ultrabooks.

Not comparable:

Futuremark’s 3DMark is available on all platforms, although when throwing Windows into the mix we always have to take a bit more caution as the level of rendering precision is not always equal. This is due to the fact that lower precision rendering modes - widely available and regularly used on Android and iOS to boost performance and save on power consumption - are not commonly available on Windows PCs, which forces them to use high (full) precision rendering most of the time.

On the tablet comparisons, I’ve installed the OpenGL version of GFXBench. Once again the Surface Pro 4 outperforms everything, although in this test the margin is not quite as high. As with 3DMark, on Windows PCs, GFXBench runs at high precision only due to limitations in OpenGL versus OpenGL ES.

www.anandtech.com/show/9727/the-microsoft-surface-pro-4-review-raising-the-bar/5

But technically Kaby Lake-U GT2 is still ahead.

- HD 620
T-Rex: 133.5 FPS
Manhattan: 65.8 FPS

- iPhone 7+
T-Rex: 125.4 FPS
Manhattan: 63.8 FPS
 

witeken

Diamond Member
Dec 25, 2013
3,899
193
106
But technically Kaby Lake-U GT2 is still ahead.

- HD 620
T-Rex: 133.5 FPS
Manhattan: 65.8 FPS

- iPhone 7+
T-Rex: 125.4 FPS
Manhattan: 63.8 FPS
Fair enough, but the power envelope ----> clock speed is a lot higher of course.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
Intel ULV/ULT CPUs (15-18W TDP) Comparison in Cinebench 11.5

Core i7-620UM (32nm Arrandale, 2C/4T @ 1.06 GHz - 2010)
- Single-Core: ?
- Multi-Core: 1.10

Core i7-2617M (32nm Sandy Bridge, 2C/4T @ 1.5 GHz - 2011)
- Single-Core: ?
- Multi-Core: 2.11

Core i7-3517U (22nm Ivy Bridge, 2C/4T @ 1.9 GHz - 2012)
- Single-Core: 1.2
- Multi-Core: 2.8

Core i7-4500U (22nm Haswell, 2C/4T @ 1.8 GHz - 2013)
- Single-Core: 1.3
- Multi-Core: 2.85

Core i7-5500U (14nm Broadwell, 2C/4T @ 2.4 GHz - 2015)
- Single-Core: 1.4
- Multi-Core: 3.2

Core i7-6500U (14nm Skylake, 2C/4T @ 2.5 GHz - 2015)
- Single-Core: 1.5
- Multi-Core: 3.5

Core i7-7500U (14nm Kaby Lake, 2C/4T @ 2.7 GHz - 2016)
- Single-Core: 1.6-1.68
- Multi-Core: 4.03

Median scores from NotebookCheck.
Next in line: 2017 2C/4T Cannon Lake, 4C/8T Kaby Lake and 2018 4C/8T Coffee Lake.
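Running the multi-core numbers from the table above through a quick script shows where the gains came from (scores copied verbatim from the table; everything else is just arithmetic):

```python
# Multi-core Cinebench 11.5 medians from the table above (NotebookCheck).
mt = [
    ("Arrandale i7-620UM",    2010, 1.10),
    ("Sandy Bridge i7-2617M", 2011, 2.11),
    ("Ivy Bridge i7-3517U",   2012, 2.80),
    ("Haswell i7-4500U",      2013, 2.85),
    ("Broadwell i7-5500U",    2015, 3.20),
    ("Skylake i7-6500U",      2015, 3.50),
    ("Kaby Lake i7-7500U",    2016, 4.03),
]

# Generation-over-generation multi-core gain:
for (pname, _, prev), (name, year, cur) in zip(mt, mt[1:]):
    print(f"{name} ({year}): +{(cur / prev - 1) * 100:.0f}% over {pname}")

overall = mt[-1][2] / mt[0][2]
print(f"Overall 2010 -> 2016: {overall:.2f}x")  # ~3.66x
```

The big jumps are Sandy Bridge (+92%) and Ivy Bridge (+33%); everything since is single-digit to mid-teens per generation.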
 
Aug 11, 2008
10,451
642
126
A laptop with that 15/28 W 4+3e U chip is going to be very difficult to find, based upon how few 2+3e U Skylake machines are out there. I don't know why Intel's bothering. It'd be pretty nice though.
I would argue just the opposite. They should make Iris standard in mobile, at least in anything above 15 W TDP. At 15 W and below, I'm not sure how much it can improve performance due to TDP limitations. But OTOH, when you pay $1000+ for an ultrabook, it seems like you should get eDRAM as well.
 

jpiniero

Lifer
Oct 1, 2010
14,510
5,159
136
I would argue just the opposite. They should make iris standard in mobile, at least in anything above 15 watt TDP. At 15 watt and below, not sure how much it can improve performance due to TDP limitations. But OTOH, when you pay 1000+ for an ultrabook, seems like you should get edram as well.

If anything it looks like Intel is giving up on Iris. OEMs don't care.