
Intel Skylake / Kaby Lake

Page 100

mikk

Platinum Member
May 15, 2012
2,957
769
136
I think it was good to move to a 3 digit numbering system for the iGPUs, so it becomes easier to distinguish from the 4 digit CPU numbers. The old 4 digit iGPU numbers looked awfully similar to the CPU numbers for those not in the know, e.g. 4670 CPU and 4600 iGPU.

Not only that. AMD already used a similar numbering scheme with HD 6000, HD 7000, etc. Without Intel's numbering change, Skylake would have been HD 7000 as well.
 

B-Riz

Golden Member
Feb 15, 2011
1,232
307
136
Don't worry IDC, Intel's misdirection marketing mavens have already succeeded.

Pages and pages of this thread have seen (former?) overclocking enthusiasts debate the average-to-magnificent performance gains of Intel's stunning 14nm graphics technology. Performance so impressive that entry level 28nm dGPUs will be demolished. Again.

The brilliance of releasing globally (in a local sort of way) the up-sized Broadwell mobile CPU on the desktop - with disappointing overclocking but a way better iGPU - a few months before the next 14nm power-sipping powerhouse softens up the resistance to the next generation of overclocking disappointment and 3-7% IPC gains.

The masters of die manipulation have already sold us on the vital importance of perf/watt - for CPUs. For graphics processing... well... that's what 10nm is for. Moar iGPU, because today's desktop enthusiasts are obsessed with iGPU performance and the frame-spitting magic of eDRAM.

Seven versions of Iris!!! This is the stuff enthusiast... uh... shareholder dreams are made of.

This new era of dismal IPC gains and moar iGPUs is so exciting my decoder ring finger is getting twitchy.
At this point, this is the new normal.

The Sandy K series seems to be an anomaly compared to what Intel has pushed out post Sandy Bridge.

Intel gave us a bone to chew on with the 1155 K series chips; the lol-tastic integrated graphics did not matter, since most people buying K series chips were going to have a fancy dedicated GPU anyway.

But now, integrated on chip graphics is a BFD (thanks Apple, and maybe a little thanks to AMD too).

The X79 / X99 platforms and chips are where we are supposed to go now (and have been since SB K was replaced with IB K), with "real" TIM between the die and heat-spreader, for the OC enthusiasts.

For a hot minute we had enthusiast-level chips on the mainstream platform, but Intel changed that: server-lite or bust for chips w/o integrated graphics and with good TIM.
 

Sweepr

Diamond Member
May 12, 2006
5,151
1,127
131
Something in between GT1 and GT2, nothing new here. The SKL naming scheme is old stuff, known for several months.
Some websites mentioned 18 EUs - is that possible/real (same as the leaked Apollo Lake-I)? Or maybe just a higher-clocked 12 EU (GT1) part?

JoeRambo said:
Yeah... I feel like my haswell will last me for quite a time (hopefully as much as 920 lasted ). After build up of IPC hype, Skylake turned out to be business as usual IPC increase.
Not enough data to talk about IPC yet, but I agree - people expecting more than a Haswell or Sandy Bridge clock-for-clock improvement should keep their expectations in check. Most hardware forum users are only concerned with top-SKU generational improvements, but let's not forget some people out there do care about platform advancements, better performance at lower TDPs (for the first time we have a solid quad-core 'T' 35W lineup), and improved iGPU performance. Power users who need >4C/8T will slowly move to LGA201x. I only wish Intel didn't take so long to release their enthusiast lineup - just pull another 'Broadwell-K' with Broadwell-E and launch Skylake-E next year (instead of 2017).

Tovarisc said:
@Sweepr you have no idea how much I have been getting on Asus' nerves the past couple of days via Facebook messages lol
Me: Showed them links to other companies' board images and info on the web. When are you going to show yours?

Asus: In the next couple of days.

Me: Can you at least tell me how many fan headers are on the Z170 Deluxe?

Asus: Please contact our Asus friends in your country for more details.
Only 1 day to go. :)
 
Last edited:

Tovarisc

Member
Jun 12, 2015
50
0
0

imaheadcase

Diamond Member
May 9, 2005
3,850
7
76
I wonder if it will be worth upgrading to this from my i7-2600 with DDR3. Guess the benchmarks will let me know in the next week or so.
 

RussianSensation

Elite Member
Sep 5, 2003
19,460
743
126
I wonder if it will be worth upgrading to this from my i7-2600 with DDR3. Guess the benchmarks will let me know in the next week or so.
Logically no, unless you already have 980Ti SLI/Fury X CF and a 4K FreeSync/GSync monitor and a PCIe 3.0 x4 SSD and/or high quality headphones/speakers. Any of those is a way better buy for actual experience than moving from a 2600K OC to Skylake K quad OC. If you need video encoding/decoding, I bet GTX 950 will be better than Skylake too.

For productivity, a 5820K-5960X OC will beat Skylake, and for modern games that use 6 cores X99 will once again win. In other gaming scenarios a 2600K @ 4.4GHz+ will be GPU-limited anyway, and upgrading to even a 5GHz Skylake will hardly matter. I suppose if you are a hardcore strategy/WoW player where IPC is king, it could make sense. For i5 2500K users, Skylake makes a bit more sense.

Imo, a better upgrade path from a 2600K OC is Skylake-E, or waiting until the next-gen Icelake 10nm architecture in 2018. Most games do not scale linearly with higher IPC over Sandy. At least moving to 6 cores provides a benefit in productivity/some modern AAA games (but this assumes you have 980 Ti SLI or faster).

This could change with newer games and 16nm GPUs but at that point Skylake-E could be out or Kaby Lake refresh. However, if you want to upgrade for fun to play with new toys and can resell the i7 2600k platform for a decent chunk of money, why not :)
 
Last edited:

B-Riz

Golden Member
Feb 15, 2011
1,232
307
136
I wonder if it will be worth upgrading to this from my i7-2600 with DDR3. Guess the benchmarks will let me know in the next week or so.
I don't think it is truly going to be worth the money and hassle; maybe if it was 6 cores no HT.

After looking over the leaked stuff, and wanting 32 GB of RAM if going DDR4, I just decided (coming from 2700k / Z77), "eff it, 4790K and OC is the last hurrah of DDR3, will take the extra USB 3.0 and SATA 6.0 Gb/s ports now".

Will maybe plunk down for DDR4 when Skylake-E drops.
 

mikk

Platinum Member
May 15, 2012
2,957
769
136
Some websites mentioned 18 EUs, is that possible/real (same as the leaked Apollo Lake-I)? Maybe just a higher-clocked 12 EUs (GT1) part?


18 EUs would mean GT 1.5, which is unchanged from Broadwell.

Nevertheless, the "some websites" stuff is unreliable. GT2 is missing for Skylake-Y, and Iris Graphics 540 is a GT3e part, not GT3. Also, GT 1.5 is named HD Graphics 510, not HD Graphics 520, according to Intel. So many errors - a report to forget. For a more reliable naming list, people should take a look at the igdlh64.inf from a 15.40 driver.
 

DrMrLordX

Lifer
Apr 27, 2000
16,632
5,639
136
So much iGPU bashing! Some of us happen to like modern iGPUs. One of these days, devs are going to wake up and realize that Intel has rolled out millions upon millions of OpenCL 2.0-compliant iGPUs with a driver stack that actually works, and it's gonna be on like Donkey Kong. Not to speak of what engine developers will do with them via DX12/Vulkan. People holding on to older chips will have reason to upgrade. Those on LGA2011 v3 will not be happy.

I say, the more Iris Pro the better. Besides, that eDRAM L4 is great for lots of non-OpenCL/DX12/Vulkan stuff, when you can get it . . . stupid 5775C. Grumble grumble.
 

Blue_Max

Diamond Member
Jul 7, 2011
4,227
152
106
So much iGPU bashing! Some of us happen to like modern iGPUs. One of these days, devs are going to wake up and realize that Intel has rolled out millions upon millions of OpenCL 2.0-compliant iGPUs with a driver stack that actually works, and it's gonna be on like Donkey Kong. Not to speak of what engine developers will do with them via DX12/Vulkan. People holding on to older chips will have reason to upgrade. Those on LGA2011 v3 will not be happy.

I say, the more Iris Pro the better. Besides, that eDRAM l4 is great for lots of non-OpenCL/DX12/Vulkan stuff, when you can get it . . . stupid 5775C. Grumble grumble.
Totally agreed. I have a project in mind where an Intel NUC box would be PERFECT! The more CPUs that have a decent IGP, the better! Iris-level graphics are required, though.

I'm glad they seem to have made it smarter, too... if you're using a dedicated video card the iGPU won't be wasted; the system will still use a portion of it to speed things up. (Did I read that right?) Damned clever!
 

Brunnis

Senior member
Nov 15, 2004
499
54
91
i7-6700K review: http://iyd.kr/758

Haven't looked through it all, but they've actually tested the 4790K, 5775C and 6700K at 3.6GHz to compare IPC. Looks pretty decent, actually... Around 13% higher IPC than Haswell.

EDIT: Only 9% faster than the 4790K with both at stock, though. Guess that's due to the higher boost frequency of the 4790K.

Something funky going on with the gaming results, though. They're actually down...
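The gap between the ~13% equal-clock gain and the ~9% stock result mostly follows from the clock difference. A rough back-of-the-envelope sketch (the turbo clocks below are Intel's rated single-core figures; sustained stock clocks in the review may differ):

```python
# Not from the review: at stock, the 6700K's lower max turbo eats into
# its equal-clock IPC advantage over the 4790K.
ipc_gain = 1.13      # Skylake vs. Haswell at the same clock (review's ~13%)
turbo_6700k = 4.2    # GHz, i7-6700K rated max turbo
turbo_4790k = 4.4    # GHz, i7-4790K rated max turbo

stock_ratio = ipc_gain * (turbo_6700k / turbo_4790k)
print(f"expected stock-vs-stock gain: {stock_ratio - 1:.1%}")  # ~7.9%
```

That lands close to the ~9% the review reports; the remainder presumably comes down to workloads that don't sustain max turbo, or DDR4 bandwidth.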
 
Last edited:

crashtech

Diamond Member
Jan 4, 2013
9,532
1,447
126
So much iGPU bashing! Some of us happen to like modern iGPUs. One of these days, devs are going to wake up and realize that Intel has rolled out millions upon millions of OpenCL 2.0-compliant iGPUs with a driver stack that actually works, and it's gonna be on like Donkey Kong. Not to speak of what engine developers will do with them via DX12/Vulkan. People holding on to older chips will have reason to upgrade. Those on LGA2011 v3 will not be happy.

I say, the more Iris Pro the better. Besides, that eDRAM l4 is great for lots of non-OpenCL/DX12/Vulkan stuff, when you can get it . . . stupid 5775C. Grumble grumble.
It would be nice to use the iGPU for PhysX, etc.; this new crop should actually be useful for such things.
 

Sweepr

Diamond Member
May 12, 2006
5,151
1,127
131
i7-6700K review: http://iyd.kr/758

Haven't looked through it all, but they've actually tested 4790K, 5775C and the 6700K at the 3.6GHz to compare IPC. Looks pretty decent, actually...
Nice. :)
For IPC comparisons, remember the Core i7 6700K and Core i7 4790K are not running at equal clocks.
3.6GHz Skylake-S was 13.25% faster than 3.6GHz Haswell on average (17 benchmarks). About what you would expect from an Intel tock.
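As an aside on how an "average over 17 benchmarks" figure like that is built: per-benchmark speedups are combined with a mean. A minimal sketch with made-up numbers (not the review's actual results):

```python
from math import prod

# Hypothetical equal-clock per-benchmark speedups (Skylake vs. Haswell);
# the review's real 17 data points are not reproduced here.
speedups = [1.20, 1.08, 1.15, 1.11, 1.13]

arith = sum(speedups) / len(speedups)          # arithmetic mean of ratios
geo = prod(speedups) ** (1 / len(speedups))    # geometric mean of ratios
print(f"arithmetic: +{arith - 1:.1%}, geometric: +{geo - 1:.1%}")
```

For ratios, the geometric mean is the more defensible choice, since a single outlier benchmark skews it less than a straight arithmetic average.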
 

mscrivo

Member
Mar 22, 2007
57
0
61
Nice. :)
For IPC comparisons, remember the Core i7 6700K and Core i7 4790K are not running at equal clocks.
3.6GHz Skylake-S was 13.25% faster than 3.6GHz Haswell on average (17 benchmarks). About what you would expect from an Intel tock.
The gaming scores being lower than a 4790k are a bit worrying though. Wonder what's up with that given the 9% overall increase at stock clocks in non-gaming scenarios.
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,395
0
76
EDIT: Only 9% faster than the 4790K with both at stock, though. Guess that's due to the higher boost frequency of the 4790K.
So 9% performance increase for two CPU generations (Haswell->Broadwell->Skylake). Quite crappy I'd say... :(
 

Sweepr

Diamond Member
May 12, 2006
5,151
1,127
131
So 9% performance increase for two CPU generations (Haswell->Broadwell->Skylake). Quite crappy I'd say... :(
Devil's Canyon was a generational bump itself; there was no 4C+GT2 Broadwell chip. Compared to the Core i7 4770K we're talking 20-22% faster at stock, based on this particular review.

Core i7 2600K was 20.1% faster than Core i7 880 and Core i7 4770K was 19.4% faster than Core i7 2600K according to (AMD-biased) Hardware.fr, did you find them crappy too?
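Generational gains like these compound multiplicatively, not additively. A quick sketch chaining the figures above (the last step uses ~21% as an assumed midpoint of the 20-22% range for 4770K → 6700K):

```python
# Illustration only: cumulative gain across i7-880 -> 2600K -> 4770K -> 6700K.
gains = [1.201, 1.194, 1.21]  # +20.1%, +19.4%, ~+21% (assumed midpoint)

total = 1.0
for g in gains:
    total *= g  # speedups multiply, they don't add
print(f"cumulative speedup: +{total - 1:.0%}")  # ~+74%
```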
 

ioni

Senior member
Aug 3, 2009
617
8
81
i7-6700K review: http://iyd.kr/758

Haven't looked through it all, but they've actually tested 4790K, 5775C and the 6700K at the 3.6GHz to compare IPC. Looks pretty decent, actually... Around 13% higher IPC than Haswell.

EDIT: Only 9% faster than the 4790K with both at stock, though. Guess that's due to the higher boost frequency of the 4790K.

Something funky going on with the gaming results, though. They're actually down...
The start of that review makes it look like they are comparing a 4.4GHz 4790k to a 4.2GHz 6700k. Am I interpreting that right?
 
