Intel Skylake / Kaby Lake

Aug 11, 2008
10,451
642
126
I don't understand: are you saying they use the Iris Pro chips with a discrete GPU, or the standard chips with a discrete GPU? Iris Pro with a discrete GPU would be the most expensive solution of all.
 

coercitiv

Diamond Member
Jan 24, 2014
6,199
11,891
136
I don't understand: are you saying they use the Iris Pro chips with a discrete GPU, or the standard chips with a discrete GPU? Iris Pro with a discrete GPU would be the most expensive solution of all.
They use Iris Pro chips with discrete GPUs. Funny, right? :)

PS: it seems this only happens in some regions of the world; so far I've only got results from Eastern Europe and... Australia. Since I cannot find a proper link from Asus, here's a link from a local shop in my country.

PPS: here's a link from Amazon UK.
 
Last edited:
Aug 11, 2008
10,451
642
126
Yeah, seems to defeat the purpose, although it does allow the CPU to run at the full TDP instead of sharing it with the iGPU. But then, why not just use a cheaper non-Iris CPU? AMD Carrizo laptops seem to have the same strange configuration as well, but at least there it is not a more expensive upgraded iGPU.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Have to imagine they must be getting them for at least a similar price to the standard parts? Intel struggling to move them or something.

They did definitely seem to struggle to get them into laptops. Maybe these new ones (which are much improved) will help with that.

Should definitely motivate NV to do something about updating the 940 to 16nm fairly soon.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Yeah, seems to defeat the purpose, although it does allow the CPU to run at the full TDP instead of sharing it with the iGPU. But then, why not just use a cheaper non-Iris CPU? AMD Carrizo laptops seem to have the same strange configuration as well, but at least there it is not a more expensive upgraded iGPU.

Not really. Quite a lot of those GPUs are significantly faster than the current Iris Pro 6200 chips. And by the time Iris Pro 580 is out, 16nm parts are going to make the Iris Pro 580 look even worse, relatively, than the 6200 does now.

The issue was raised in the first Iris Pro review at AnandTech: these parts were not perf/$ competitive with comparable discrete GPUs! Which is funny, because that's supposedly the strength of iGPUs, but Intel's intentions for these parts are clear: to make money. Also, when the actual Iris Pro 5200 laptops came out, their idle power characteristics were no better than discrete; they were about the same. So you get uncompetitive perf/watt, battery life, and perf/$, with the stigma that comes with Intel drivers.

Unless Intel puts serious effort into improving driver UI and game support, and brings a part that is top of its class at every level (perf/watt, battery life, perf/$), no manufacturer or consumer is realistically going to consider one.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
+1 :thumbsup:

The lower-end dGPU segment is dying fast. Good thing higher-end gaming revenue is up significantly (GTX 970 and up).

Core i3 Laptops cost $550 US on Amazon.com.

http://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Daps&field-keywords=Core+i3+laptop+6th+generation&rh=i%3Aaps%2Ck%3ACore+i3+laptop+6th+generation

A Core i3 + Iris Pro 580 for $600 would live up to what the price sheet is saying.

http://www.amazon.com/s/ref=nb_sb_n...earch-alias=aps&field-keywords=Core+i3+laptop

Core i5 6200 laptops cost $600; what about the same part with Iris Pro 580 for $650?

Unfortunately, Ark prices only apply when we are buying individual chips as computer builders. For laptops we are at the mercy of Intel and the manufacturers to offer them without treating them as "halo" devices (which means $100-150 for Iris rather than $50).

For desktops, a Core i3 6100 + Iris Pro 580 should be $160 and available on Newegg/NCIX/Microcenter/whatever your favorite store is. The fact that we haven't seen such efforts from Intel *and* manufacturers suggests that they want to use "Iris" as a halo brand, and for Intel specifically that means getting you to spend more on CPUs. They want you to buy the $350 i7 K's, which wouldn't happen if a $160 Core i3 6100 with Iris Pro were available.
 
Aug 11, 2008
10,451
642
126
You totally missed the point. The point was that it doesn't make sense to use a more expensive Iris Pro CPU when you are adding a discrete card.
 

SuperJaw

Junior Member
Jan 10, 2016
20
0
6
Iris Pro makes the most sense in cheaper CPUs like the i3. If you can afford an i5/i7, you can probably afford a better-than-Iris GPU... I don't get Intel.
I have an i3 6100 and a GTX 960 that is way overpowered for what I use it for.
I would have gladly given Intel an extra $30-40 for an Iris Pro chip, but the total cost has to make sense.

Intel has $50 less of my money and Nvidia has $200 extra because of Intel's refusal to offer socketed i3 chips with top-spec iGPUs.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
Not really. Quite a lot of those GPUs are significantly faster than the current Iris Pro 6200 chips. And by the time Iris Pro 580 is out, 16nm parts are going to make the Iris Pro 580 look even worse, relatively, than the 6200 does now.

These are not going to hit the market until mid-2016 at the earliest, while Iris Pro 580 is technically out right now. I am willing to bet that Skylake GT4e is much more competitive against current mainstream 28nm dGPUs than Haswell GT3e was back in the day. Apple knows this; they're probably waiting for Pascal/Polaris to refresh their 15'' rMBP, since a new, faster/more efficient part than the GeForce GTX 950M/960M is needed to justify a more complex design with a dGPU.
 
Last edited:

Glo.

Diamond Member
Apr 25, 2015
5,707
4,551
136
Those chips also have the same base clock as their GT2 counterparts. Hmm, interesting.

They didn't publish the MSRP yet though, so we need to brace for a twist. I hope they will actually be priced the same as in the sheet you posted earlier. Even then, that's just the MSRP, not the real price for OEMs. Who would sell GT4e at the same price as GT2?




This Iris Pro should be a tad slower than the 950M (3000+ Fire Strike scores with i7 quads) if we follow Intel's prediction (1.5x the ~1500 of Iris 6200 is ~2250). I could only see the Iris Pro 580 replacing i7 quads + 940M/945M, or ASUS might just use these chips like you said, because they can.
Nope. The Intel Iris 540 is on par in gaming performance with the GT 940M, and it is in a 15W ULV CPU.

So the HD 580 will be in the range of 950M performance.
 

Rngwn

Member
Dec 17, 2015
143
24
36
Nope. The Intel Iris 540 is on par in gaming performance with the GT 940M, and it is in a 15W ULV CPU.

So the HD 580 will be in the range of 950M performance.

Remember that the Iris 580 only has 1.5x the EUs of the Iris 540/550, and the 950M is about 2x faster than the 940M. The Iris 540/550 have 2x the EUs of the HD 520 yet only outperform it by 60-70%.

Assuming the same performance scaling, the Iris 580 would only bring a 30-35% improvement. Having it paired with quad-core CPUs would improve that a bit, but still, I must remain skeptical that the Iris 580 will actually be on par with the 950M.
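For the curious, here's the back-of-the-envelope scaling behind that number (just a sketch; the 1.65x midpoint and the power-law fit are my own assumptions, not benchmarks):

```python
import math

# Observed: doubling EUs (HD 520 -> Iris 540/550) only gives roughly +60-70%.
observed_eu_ratio = 2.0
observed_speedup = 1.65  # midpoint of the 60-70% range (assumption)

# Fit a simple power law: speedup = eu_ratio ** k
k = math.log(observed_speedup) / math.log(observed_eu_ratio)  # ~0.72

# Apply the same exponent to the Iris 580's 1.5x EU advantage over the 540/550.
estimated_speedup = 1.5 ** k
print(f"Estimated Iris 580 gain over Iris 540/550: +{(estimated_speedup - 1) * 100:.0f}%")
# -> roughly +34%, which is where the 30-35% figure above comes from
```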
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
21,629
10,841
136
Call me crazy, but I'd rather have Iris Pro on any Intel CPU so long as that means getting some of that sweet, sweet eDRAM. It'll help, dGPU or no. The extra EUs are just the icing on the cake.
 

coercitiv

Diamond Member
Jan 24, 2014
6,199
11,891
136
You totally missed the point. The point was that it doesn't make sense to use a more expensive Iris Pro CPU when you are adding a discrete card.
Actually, the bigger iGPU doesn't make sense, but the eDRAM is a perfect fit for a thermally constrained gaming laptop. We've seen how Broadwell "desktop" parts managed to match the performance of their significantly higher-clocked Haswell counterparts.

If you can find a way to downclock a CPU inside a laptop by 400MHz or more and still get equal or better performance in games, you just got yourself 5-10W worth of TDP budget to spend on dGPU power. (This varies with CPU load of course, but you get the idea.)

This is why I got so frustrated with Intel when I realized what Broadwell with eDRAM can do in certain applications. Also, keep in mind that in the mobile environment you don't usually get the chance to compensate with faster memory, so the effects may be even more pronounced. For most OEMs the norm is DDR3L-1600 CL11, and I imagine DDR4 will stay at an equally conservative level.

Call me crazy, but I'd rather have Iris Pro on any Intel CPU so long as that means getting some of that sweet, sweet eDRAM. It'll help, dGPU or no. The extra EUs are just the icing on the cake.
This is also my sentiment.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
You totally missed the point. The point was that it doesn't make sense to use a more expensive Iris Pro CPU when you are adding a discrete card.

The world does not run completely on logic; that's why certain marketing tactics work. That's why you see HD 520 setups (and previous HD Graphics generations) bundled with discrete cards that are arguably worse than the iGPU in performance. But Nvidia's and AMD's reputations as graphics companies, and Intel's lack of one (plus its bad drivers), drive OEMs to use them. There's also a stigma attached to iGPUs, which is probably why even Carrizo is offered with dGPUs.

There are a couple of reasons to use an Iris Pro enabled CPU:
-Halo factor
-eDRAM does help the CPU a little bit

Remember that the Iris 580 only has 1.5x the EUs of the Iris 540/550, and the 950M is about 2x faster than the 940M. The Iris 540/550 have 2x the EUs of the HD 520 yet only outperform it by 60-70%.
You know the 1.5x number people are using is probably based on Intel's own slides, right? That's over their quad-core, 45W Iris Pro 6200 setup. Quad-core iGPUs are noticeably faster than the dual-core ones because they have more TDP headroom.

There's about a 30% difference between the 15W Iris 540 and the 28W Iris 550 (in real games, not synthetics that run for a minute and never run into power throttling). I believe an additional 20% exists from a 28W Iris 550 to a theoretical 45W Iris "555".

Of course, Intel's expectation of a "50% gain" is, according to the footnotes, roughly a "best case". But 50% isn't unrealistic, because the architectural enhancements of Gen 9 are somewhat cancelled out by the front end of the iGPU in a GT4 setup not really growing over a GT3 one.

-30% as observed from Gen 8 to Gen 9
-50% more slices, which brings in less than 50% (how much??)
-Front end nearly identical (well, in SKL the GT4 front end clocks slightly lower than GT3)
-Memory bandwidth identical?
-Effect of changing from victim-cache eDRAM to the new setup in SKL
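Stacking those factors up very roughly, with every individual number being a guess on my part rather than a measurement, Intel's figure doesn't look crazy:

```python
# Back-of-the-envelope stack-up of the bullets above (all values are guesses).
gen8_to_gen9_arch = 1.30  # ~30% observed going from Gen 8 to Gen 9 at the same EU count
slice_scaling     = 1.25  # 50% more slices assumed to deliver well under +50%
front_end_factor  = 0.95  # GT4 front end clocks slightly lower than GT3
bandwidth_factor  = 1.00  # memory bandwidth assumed identical
edram_factor      = 1.00  # effect of the new (non victim-cache) eDRAM setup assumed neutral

estimate = gen8_to_gen9_arch * slice_scaling * front_end_factor * bandwidth_factor * edram_factor
print(f"Rough GT4e vs Iris Pro 6200 estimate: +{(estimate - 1) * 100:.0f}%")
# -> about +54% with these guesses, i.e. a ~50% claim isn't unrealistic
```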

Glo, the Iris 540 is not competitive with a 940M. It barely is before its 15W power headroom runs out, and then it runs noticeably slower. It's ~30% faster than the HD 520 according to longer-term SP4 tests. The 28W Iris 550, on the other hand (new VAIO, and the 13" MBP), would be.

These are not going to hit the market until mid-2016 at the earliest, while Iris Pro 580 is technically out right now.
All the Iris Pro 580 parts are supposed to be "H1". I am guessing April. That's like 2-3 months away from a Polaris part. Iris Pro 580 is 50% better than Iris Pro 6200, while Polaris and Pascal are 2x over their predecessors. That means comparatively it'll be worse off than Iris Pro 6200 was. Of course, that's if the 2x is uniform; AMD/Nvidia will probably capitalize on it for more margins at the low end where the Iris chips are competing. Intel needed Iris Pro 580 last summer.
 
Last edited:

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
All the Iris Pro 580 parts are supposed to be "H1". I am guessing April. That's like 2-3 months away from a Polaris part.

Again, the chips are available, so it's more like a 6+ month lead. Some design wins (HP comes to mind) were announced last year; technically they could be launched now.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,339
10,044
126
But Nvidia's and AMD's reputations as graphics companies, and Intel's lack of one (plus its bad drivers), drive OEMs to use them.

I don't think that this can be overstated. Intel's iGPU drivers SUCK, straight up. I finally had to break down and install a lower-end Nvidia GPU (a GT 740, I just happened to have one handy), as I was having nothing but problems with my i3-6100's HD 530, and HDMI handshake issues with my 24" HDTV monitor.
 

Rngwn

Member
Dec 17, 2015
143
24
36
Call me crazy, but I'd rather have Iris Pro on any Intel CPU so long as that means getting some of that sweet, sweet eDRAM. It'll help, dGPU or no. The extra EUs are just the icing on the cake.

Only if those parts are not too ridiculously expensive would I want one as well. For now, the Iris Pro 580 is the only way to do any real 3D in Windows VMs on Linux laptops.

For the uninformed, Intel GPUs since Haswell can be passed through to VMs without separate monitors thanks to GVT-g. This technology is currently implemented on Linux as XenGT and KVMGT.
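For anyone curious what the plumbing looks like, here's a rough sketch of the KVMGT path using the mediated-device (mdev) sysfs interface. Treat the paths, the vGPU type name and the QEMU flags as illustrative only; the exact steps depend on whether you run XenGT or KVMGT and which kernel/patch set you're on:

```python
#!/usr/bin/env python3
"""Rough sketch: create a GVT-g virtual GPU via the mdev sysfs interface
and hand it to a QEMU/KVM guest. Needs root, i915.enable_gvt=1 and a
GVT-g capable kernel; the type name below is an example -- list
mdev_supported_types/ on your machine to see what is really offered."""

import subprocess
import uuid
from pathlib import Path

IGD = Path("/sys/bus/pci/devices/0000:00:02.0")  # integrated GPU, typical PCI address
MDEV_TYPE = "i915-GVTg_V5_4"                     # example vGPU type (assumption)

def create_vgpu() -> str:
    """Create one mediated vGPU instance and return its UUID."""
    vgpu_uuid = str(uuid.uuid4())
    (IGD / "mdev_supported_types" / MDEV_TYPE / "create").write_text(vgpu_uuid)
    return vgpu_uuid

def launch_vm(vgpu_uuid: str) -> None:
    """Start a guest with the vGPU attached through VFIO."""
    subprocess.run([
        "qemu-system-x86_64",
        "-enable-kvm", "-m", "4096",
        "-device", f"vfio-pci,sysfsdev=/sys/bus/mdev/devices/{vgpu_uuid}",
        "-drive", "file=windows.qcow2,format=qcow2",  # hypothetical disk image
    ], check=True)

if __name__ == "__main__":
    launch_vm(create_vgpu())
```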
 

Glo.

Diamond Member
Apr 25, 2015
5,707
4,551
136
Remember that the Iris 580 only have 1.5x the EUs of the Iris 540/550 and the 950M is about 2x faster than 940M. The Iris 540/550 have 2x the EUs of the HD520 yet only outperform it by 60%~70%.

Assuming the same performance scaling, the Iris 580 would only bring 30%~35% improvement. Having it paired with quad-core CPUs would improve that a bit, but still, I must remain skeptical that the Iris 580 will actually be on par with the 950M.

So far, what I see is that the 950M is around 70% faster than the GT 940M in the same games.

However... I think the HD 580 will be in the range of the GT 945M and the 850M, which are the same chip as the 950M, only with slightly lower core clocks. The 945M is 15% slower than the 950M, and the 850M is 10% slower.

But I still genuinely believe that at this stage, if you want something "low profile" you should go for integrated GPUs. Intel has done an absolutely great job.
 

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
Out of curiosity, are those performance estimates for synthetics (like 3DMark) or actual games? Because Intel iGPUs perform very differently in actual games than in synthetics.
 

Glo.

Diamond Member
Apr 25, 2015
5,707
4,551
136
Well, if we look at performance figures from games and compare the HD 6100, which was supposed to be a very fast GPU in Broadwell CPUs, with the HD 530, which is supposed to be a mediocre GPU in the Skylake line, it becomes apparent that Skylake GPUs are much faster.

The Skylake GPU with 24 EUs is 15% faster than the top-of-the-line GPU from the Broadwell line, and that is without eDRAM. And the HD 6100 has twice the number of EUs. It looks like Intel improved graphics immensely.

But it should look like this: Kaby Lake is rumored to have 80% more power than the HD 580 (Iris Pro, obviously), and Cannonlake GPUs should have 50% more power than Kaby Lake. That would put them in the range of GTX 970/R9 390 performance.
 

mikk

Diamond Member
May 15, 2012
4,140
2,154
136
Of course, Intel's expectation of a "50% gain" is, according to the footnotes, roughly a "best case".


It isn't a best case.

Out of curiosity, are those performance estimates for synthetics (like 3DMark) or actual games? Because Intel iGPUs perform very differently in actual games than in synthetics.

Compared to AMD, the difference is bigger in favour of Intel because AMD does better in 3DMark. Compared to Intel itself, the difference in gaming should be bigger for the faster SKUs with more TDP headroom.
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
Well, if we look at performance figures from games and compare the HD 6100, which was supposed to be a very fast GPU in Broadwell CPUs, with the HD 530, which is supposed to be a mediocre GPU in the Skylake line, it becomes apparent that Skylake GPUs are much faster.

The Skylake GPU with 24 EUs is 15% faster than the top-of-the-line GPU from the Broadwell line, and that is without eDRAM. And the HD 6100 has twice the number of EUs. It looks like Intel improved graphics immensely.

If I'm not mistaken, the HD 6100 was only found in 15W TDP SKUs, while the HD 530 can be found in parts from 35W TDP up to 91W TDP desktop chips. There is no way the HD 530 is faster than the HD 6100 at the same 35W TDP or above.

But it should look like this: Kaby Lake is rumored to have 80% more power than the HD 580 (Iris Pro, obviously), and Cannonlake GPUs should have 50% more power than Kaby Lake. That would put them in the range of GTX 970/R9 390 performance.

Not sure if you are serious or I'm missing your sarcasm.
 

Glo.

Diamond Member
Apr 25, 2015
5,707
4,551
136
If I'm not mistaken, the HD 6100 was only found in 15W TDP SKUs, while the HD 530 can be found in parts from 35W TDP up to 91W TDP desktop chips. There is no way the HD 530 is faster than the HD 6100 at the same 35W TDP or above.
Nope. The HD 6100 was in 28W CPUs.

Not sure if you are serious or I'm missing your sarcasm.
I am completely serious. If Kaby Lake really is 80% faster than Skylake Iris Pro, that will bring it to the performance level of a GTX 960. Add another 50% over that (Cannonlake) and you end up in the range of the 970 and R9 390.
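Here's the compounding I'm doing, taking the rumored gains at face value (a sketch, not a prediction of actual benchmarks):

```python
# Compounding the rumored generational gains; baseline is Skylake Iris Pro 580 = 1.0x.
iris_pro_580 = 1.0
kaby_lake    = iris_pro_580 * 1.8  # rumored +80%
cannonlake   = kaby_lake * 1.5     # rumored +50% on top of that

print(f"Kaby Lake:  {kaby_lake:.2f}x Iris Pro 580")   # 1.80x
print(f"Cannonlake: {cannonlake:.2f}x Iris Pro 580")  # 2.70x
# Whether 2.7x an Iris Pro 580 actually lands in GTX 970 / R9 390 territory
# is the part that's open to debate.
```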
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
If I'm not mistaken, the HD 6100 was only found in 15W TDP SKUs, while the HD 530 can be found in parts from 35W TDP up to 91W TDP desktop chips. There is no way the HD 530 is faster than the HD 6100 at the same 35W TDP or above.

The HD 520 is ~35% faster than the HD 5500 in Tomb Raider, BioShock Infinite and Battlefield 4 according to NotebookCheck. Same number of EUs, same process (14nm) and same TDP (15W). Intel is gradually improving their graphics performance each generation.

Iris 6100 is found in 28W SKUs.

www.notebookcheck.com/Test-HP-EliteBook-745-G3-Notebook.157955.0.html
 
Last edited:

AtenRa

Lifer
Feb 2, 2009
14,001
3,357
136
Nope. The HD 6100 was in 28W CPUs.

Yes, there are three 28W TDP HD 6100 SKUs. Can you find and post any benchmarks comparing the 28W TDP HD 6100 and the 35W TDP HD 530?


I am completely serious. If Kaby Lake really is 80% faster than Skylake Iris Pro, that will bring it to the performance level of a GTX 960. Add another 50% over that (Cannonlake) and you end up in the range of the 970 and R9 390.

Ehm, where does that 80% over Skylake Iris Pro come from? And which Skylake are we talking about? GT3e?
 
Last edited: