Intel Skylake / Kaby Lake


PPB

Golden Member
Jul 5, 2013
1,118
168
106
Because people want to bring back the fun of budget CPU overclocking to Intel systems. Looks like a more sophisticated solution than before, and I'm glad ASRock is still committed to this. :)

It is not sophisticated at all. Intel even recommends not using their own clock generator on Z170 boards, and motherboard OEMs come up with their own clock generators. ASRock just took the third-party clock generator used on its own Z170 boards and put it on the other chipsets' boards.

Hopefully other OEMs back ASRock up on this bold decision. Someone needs to give Intel the F sign over the absurd artificial limitations they use to segment their products.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,325
10,034
126
Well, however they are accomplishing it, I hope that it sticks around, and can't be easily nerfed by Intel.

(Edit: And bring on the OCable Celeron SKL CPUs, Intel!)

I'm cautiously optimistic about these new boards. If they stick around, they could replace the G3258 combos as go-to budget (mild) OC boards for customers. (Especially if they get the onboard video working with BCLK OC.)
 
Last edited:

VirtualLarry

No Lifer
Aug 25, 2001
56,325
10,034
126
Just buy a mobo with the LSPcon. I don't think you are ever going to see HDMI 2.0 without one. HDMI 1.4 already comes from a DP 1.2 conversion. In other words, I think Intel only supports DP out of the CPU, now and in the future.

Is that one reason why I have so much trouble with HDMI audio handshakes on my Asus H110M-A board and my i3-6100?
 
Aug 11, 2008
10,451
642
126
Because people want to bring back the fun of budget CPU overclocking to Intel systems. Looks like a more sophisticated solution than before, and I'm glad ASRock is still committed to this. :)

People want a lot of things, but that does not make it so. It is great that a motherboard maker is "committed" to this, but honestly, I don't see how anyone can make a purchase decision based on something not officially sanctioned by the CPU maker.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
IP6200 is probably faster than a GT 730, since it's sometimes close to a GTX 750.

http://media.bestofmicro.com/T/J/497431/original/17-IGP-GTA-V.png

Iris Pro 6200 performs close to a GTX 750 only in Tom's Hardware's contrived benchmark: low resolution, outdated games. Iris Pro 6200 is slightly worse than a GT 740 GDDR5.

Based on the 3DMark 11 score and the specs, I'd say it'd be on par with Iris 550 (not the power-limited Iris 540). But yeah, Iris Pro 6200 is faster.

Would be fun to see a monstrous iGPU with lower clocks, probably more efficient than the current ~1GHz Iris Pro models.
I don't really care which route they take to get there. *If* they want to be competitive with Polaris/Pascal parts that will bring 2x performance across the generation, they need a 2.5 TFLOP, 70 GTexel/s equivalent GPU by that time. Otherwise, forget about it. It's possible that Kaby Lake's highest GPU is coming in mid-2017, meaning it has to go against AMD's Zen APU with HBM.
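
Rough back-of-envelope on what a 2.5 TFLOP iGPU implies (just a sketch; I'm assuming ~16 FP32 FLOPs per clock per Gen9 EU, and the EU counts and clocks below are illustrative, not anything Intel has announced):

# FP32 throughput ~= EUs * FLOPs_per_clock * clock (GHz) -> GFLOPS
def gflops(eus, clock_ghz, flops_per_clock=16):
    return eus * flops_per_clock * clock_ghz

print(gflops(72, 1.0))   # ~1150 GFLOPS, roughly a 72-EU Iris Pro class part at ~1 GHz
print(gflops(72, 2.2))   # ~2530 GFLOPS, the same EU count would need ~2.2 GHz
print(gflops(144, 1.1))  # ~2530 GFLOPS, or roughly double the EUs at ~1.1 GHz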

(I am not sure how effective it is to decrease clocks. You only get better efficiency if you get a voltage reduction to go with it. It may be that even at the 14nm generation it isn't worth going much lower on voltage, because you lose so much frequency that it cancels out any perf/watt gain from the lower voltage. They'd have to make it more efficient architecturally.)
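
To put the clocks-vs-voltage point in numbers (a sketch using the usual dynamic-power relation P ~ C*V^2*f; the voltage and frequency figures below are made up for illustration):

# Perf/W only improves with lower clocks if voltage drops too: P_dynamic ~ C * V^2 * f
def rel_perf_per_watt(v, f, v0=1.0, f0=1.0):
    perf = f / f0                     # performance scales roughly linearly with frequency
    power = (v / v0) ** 2 * (f / f0)  # dynamic power scales with V^2 * f
    return perf / power               # perf/W relative to the baseline

print(rel_perf_per_watt(1.00, 1.0))  # 1.00 - baseline
print(rel_perf_per_watt(1.00, 0.7))  # 1.00 - 30% lower clock at the same voltage: no perf/W gain
print(rel_perf_per_watt(0.85, 0.7))  # ~1.38 - it's the voltage drop that buys the efficiency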
 
Last edited:

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
GTA 5 was released in April of 2015 for the PC, and the Tom's review of IP6200 was in June of 2015, so I think your criticism of that chart is unfair. Also, these IGPs are meant for low res games, so that's what I expect to see in a comparison.
 

Auric

Diamond Member
Oct 11, 1999
9,596
2
71
Any benchmarks of the 6200 overclocked? The 4600's default is about the same, at 1150 MHz, but it happily runs at 1600 MHz.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
GTA 5 was released in April of 2015 for the PC, and the Tom's review of IP6200 was in June of 2015, so I think your criticism of that chart is unfair. Also, these IGPs are meant for low res games, so that's what I expect to see in a comparison.

Not at all. The Iris Pro 6200 gets 124 fps and the GTX 750 gets 143. Also, it's running at minimum settings at 1280x720. Really old games like HL2 run at high, but then HL2 wasn't demanding anyway; with the HW VS drivers you could run it playably even on a GMA X3000.

You start emphasizing the CPU at that point. Look at the setup for the non-Intel GPUs: "Athlon X4 860K + entry-level cards."

Really? Do you think that compares at all with the 5775C? How do you think a GTX 750 paired with a 5775C would do?

If I had that GPU I would jack the GTA V settings way up and aim for 40-50 fps. I am pretty sure the GTX 750 would do better at those settings too. Every other site aims for 40-50 fps except this one. A foreign site that compared it with the GT 740/GTX 750 showed that it's on par with the GT 740. The GTX 750 is another 50% faster (meaning you need Skylake's Iris Pro 580 to equal a GTX 750).
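
As a rough sanity check on that chain (a sketch; normalizing the GT 740 to 1.0 and using naive EU scaling from the 48-EU GT3e to the 72-EU GT4e as my own assumption):

gt740 = 1.0                       # treat GT 740 / Iris Pro 6200 performance as the baseline
gtx750 = 1.5 * gt740              # "another 50% faster"
iris_pro_580 = (72 / 48) * gt740  # naive EU scaling from 48-EU GT3e to 72-EU GT4e
print(gtx750, round(iris_pro_580, 2))  # 1.5 1.5 - in the same ballpark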

Misleading comparisons are always subtle.
 
Last edited:

Sweepr

Diamond Member
May 12, 2006
5,148
1,142
131
AnandTech reviewed the Core i5-6260U-based NUC with Iris 540. Too bad they missed the opportunity to test some games.


AnandTech: The Intel NUC6i5SYK Skylake UCFF PC Review

[Image: Intel NUC platform overview]


Intel has been pushing the performance per watt aspect and GPU performance heavily in the last few generations, making each successive NUC generation more attractive than the one before. We have already looked at multiple Broadwell NUCs. The Skylake NUCs currently come in two varieties - one based on the Core i3-6100U and another based on the Core i5-6260U. The i5 version is marketed with the Iris tag, as it sports Intel Iris Graphics 540 with 64MB of eDRAM.

The benchmark numbers show that it is a toss-up between the Broadwell-U Iris Core i7-5557U in the NUC5i7RYH and the Core i5-6260U in the NUC6i5SYK. The former is a 28W TDP part and can sustain higher clocks. Despite that, the performance of the two are comparable for day-to-day usage activities (such as web browsing and spreadsheet editing), as tested by PCMark 8.

[Benchmark charts: PCMark 8 Home (OpenCL), PCMark 7, 3DMark 11 Entry, Cinebench R15 OpenGL, x264 Pass 2, Dolphin emulator]


The thermal design continues to be good, and the default BIOS configuration ensures that the Core i5-6260U can sustain higher operational power levels than what is suggested by its TDP of 15W. This is particularly interesting, since the processor doesn't officially have a configurable higher TDP. The Skylake GPU has also shown tremendous improvement compared to Broadwell and previous generations, and this is evident in the 3D benchmarks. The NUC6i5SYK also sports an Iris GPU with 64MB of eDRAM that helps improve performance for various workloads.

www.anandtech.com/show/10121/intel-nuc6i5syk-skylake-ucff-pc-review
 

coercitiv

Diamond Member
Jan 24, 2014
6,184
11,838
136
Well, the NUC has eDRAM and all :p

If they also leave out games on the i7 NUC, which has both Iris Pro and external GPU support, then it's truly a massive fail.
I didn't even know they started using eDRAM in non-"Iris Pro" parts. Congrats Intel, you managed to completely mutilate what was left of your GPU naming scheme.

Then yes, I agree: massive fail in the review. Massive fail in product naming scheme as well.
 

Rngwn

Member
Dec 17, 2015
143
24
36
Massive fail in product naming scheme as well.

Pretty much, but this should be directed more at the Haswell and Broadwell era, where the non-eDRAM GT3 was split into HD 5000/Iris 5100 and HD 6000/Iris 6100 respectively, with the only difference being whether it was a 15W or 28W chip.

Although I think the naming convention for Skylake is reasonable now, since the "Iris" moniker reflects the eDRAM and the GT3/GT4 iGPU.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Just buy a mobo with the LSPcon. I don't think you are ever going to see HDMI 2.0 without one. HDMI 1.4 already comes from a DP 1.2 conversion. In other words, I think Intel only supports DP out of the CPU, now and in the future.

Intel will have native HDMI 2.0 eventually. They have to.
 
Last edited:

nerp

Diamond Member
Dec 31, 2005
9,866
105
106
Pretty poor from them; no game tests is a massive fail.

It's a review of NUCs. You can't even put in a discrete graphics card. They are made to replace the home computers of people who don't play anything beyond very light casual games.
 

nvgpu

Senior member
Sep 12, 2014
629
202
81
Alpine Ridge is not cheap; very few companies actually ship their motherboards with it because it increases BOM cost.

You won't see HDMI 2.0/2.0a mass adoption until Kaby Lake, which ships with it natively.
Skylake also lacks the full fixed-function HEVC Main10 decoding that Kaby Lake supports natively, which limits its appeal for 4K HTPC use.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
The question is how viable HDMI is in the long run. DP is slowly getting into TVs. And I for one wouldn't mind a single standard for everything.
 

DidelisDiskas

Senior member
Dec 27, 2015
233
21
81
It's a review of NUCs. You can't even put in a discrete graphics card. They are made to replace people's home computers who don't play games beyond very light casual games.

The Iris 6100 that I had (no eDRAM) could play CS:GO and TF2 at 1080p on medium to medium-high settings, so the new Iris with eDRAM can certainly handle more than very light casual games, and it would be reasonable to look into that in a review.
 

nvgpu

Senior member
Sep 12, 2014
629
202
81
HDMI is entrenched in the consumer electronics (CE) space and is the de facto standard there. Every AV receiver has HDMI inputs and outputs; there's no receiver with DP inputs and outputs. All current gaming consoles use HDMI; not a single one has a DP output.

DP is the de facto standard for PC monitors.

DP came too late and will never displace HDMI in CE space no matter what you think.
 
Last edited: