Intel Skylake / Kaby Lake

Page 560

gx_saurav

Senior member
Dec 5, 2012
247
61
101
about.me
I had a feeling that Coffee Lake wouldn't be compatible with current chipsets, which would mean I would have needed to buy a new motherboard + new CPU if I waited for Coffee Lake. Now that things are clear, and I can actually see Deus Ex: Mankind Divided only utilizing 70% to 80% of my CPU, I guess I can stick with it easily for a few more years.

Even if my Core i7 7700 becomes the new Core i3 8300, I will still stick with it, because other than the brand name change, it will still have plenty of power for 1080p gaming with my RX 470 for the next few years.

But then, look at the bright side: developers will be forced to optimize their games for 4C-8T and 6C-6T over the next 3 years, as those are the most mainstream CPUs now.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
But then, look at the bright side: developers will be forced to optimize their games for 4C-8T and 6C-6T over the next 3 years, as those are the most mainstream CPUs now.

I wish they wouldn't. They have limited resources and time, like everything in life. Only in semiconductors can you have your cake and eat it too (Moore's Law). And that's going to slow to a crawl.

I like developers like Valve and Blizzard. They don't focus on performance or cutting-edge graphics, but gameplay. I look at Steam now and see that most games are pretty much crap. That's supposed to be the state of the art in game offerings. It's the reason games focus ever more on extremes (like increasing violence/gore/horror, for example): because it's the easy way to go.

Awesome graphics are fine, but you get used to them. And by then it becomes "meh". Good gameplay lasts. That doesn't mean they shouldn't focus on graphics, but it should be secondary.
 
  • Like
Reactions: Pilum

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I wish they wouldn't. They have limited resources and time, like everything in life. Only in semiconductors can you have your cake and eat it too (Moore's Law). And that's going to slow to a crawl.

I like developers like Valve and Blizzard. They don't focus on performance or cutting-edge graphics, but gameplay. I look at Steam now and see that most games are pretty much crap. That's supposed to be the state of the art in game offerings. It's the reason games focus ever more on extremes (like increasing violence/gore/horror, for example): because it's the easy way to go.

Awesome graphics are fine, but you get used to them. And by then it becomes "meh". Good gameplay lasts. That doesn't mean they shouldn't focus on graphics, but it should be secondary.


I have felt for years that there are games that mainly get purchased as benchmarks by a lot of people to show how powerful their systems are. Ashes of the Singularity seems to show up in every benchmark run, but seems questionable as a fun game. Then there is the ever popular: "Does it run Crysis?".
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I have felt for years that there are games that mainly get purchased as benchmarks by a lot of people to show how powerful their systems are. Ashes of the Singularity seems to show up in every benchmark run, but seems questionable as a fun game. Then there is the ever popular: "Does it run Crysis?".

I think the most ridiculous (IMO) is regarding anti-aliasing. I read replies on the forum about changed AA algorithms in World of Warcraft. They linked a YouTube video because movement shows shimmering effects and such.

A few days ago I was playing Portal 2. OK, there I noticed the lack of AA was pretty bad. But you know when I stopped caring? When I started moving around and playing. I'd think that if you have a bit of a competitive nature and want to play better, your attention shifts elsewhere, meaning AA isn't really noticed.

I play on a 1280x1024 monitor, by the way. I thought it was a bit jarring, since it's a 5:4 monitor, and a pretty old, used one at that. But hey, it was $40 CDN. No complaints. Three days in, I don't notice the difference anymore.

What about World of Warcraft? I watched the YouTube video. I had to watch closely to see any difference between no AA and 3-4 other AA methods. They were saying it's due to compression, etc., but again, if you are raiding or doing PvP, I doubt it should bother you.

AA would be fine, if it weren't for the massive performance loss.
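For context on where that loss comes from: brute-force supersampling (SSAA), the oldest AA approach, renders the scene at a multiple of the output resolution and downsamples, so shading work scales with the sample factor. A minimal cost sketch, using the 1280x1024 monitor mentioned above (the 4x factor is just an example, not a measurement of any particular game):

```python
# Rough cost model for supersampling AA (SSAA): the scene is rendered
# at a higher internal sample count per pixel and then downsampled,
# so shading work scales with the factor. Figures are illustrative only.

def ssaa_shaded_pixels(width, height, factor):
    """Shaded samples when rendering at `factor` samples per pixel."""
    return width * height * factor

native = ssaa_shaded_pixels(1280, 1024, 1)   # no AA
with_4x = ssaa_shaded_pixels(1280, 1024, 4)  # 4x SSAA

print(with_4x // native)  # → 4: roughly four times the shading work
```

Cheaper methods (MSAA, post-process AA like FXAA) exist precisely to avoid paying this full multiple.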

Sorry for going off topic.
 

Timmah!

Golden Member
Jul 24, 2010
1,395
602
136
I am waiting for 7920x related news like for a second coming, but nothing ever happens.... aaaah. Just Coffee Lake news galore.
 

Malogeek

Golden Member
Mar 5, 2017
1,390
778
136
yaktribe.org
What about World of Warcraft? I watched the YouTube video. I had to watch closely to see any difference between no AA and 3-4 other AA methods. They were saying it's due to compression, etc., but again, if you are raiding or doing PvP, I doubt it should bother you.
Having recently returned to WoW, I've found it to have some of the best AA around. There are enough options that there is no shimmer, no shader aliasing, and excellent edge detection for alpha foliage, etc. It's a very clean image. YouTube is always a terrible way to show differences, though, as mentioned, due to compression.

In contrast, I tried Black Desert Online, and the shimmer and aliasing were so bad, with no way to mitigate them, that I couldn't play it.

After decades of PC gaming, rendering artifacts stand out to me so much they completely detract from the gameplay. My wife doesn't notice unless I specifically point it out to her, then she still doesn't care. It's very subjective.
 

dullard

Elite Member
May 21, 2001
24,998
3,326
126
I am waiting for 7920x related news like for a second coming, but nothing ever happens.... aaaah. Just Coffee Lake news galore.
Why the 7920x? From the data that has been rumored, it seems like the ugly stepchild of that family.

If the rumors are correct, it has 14% lower base clocks for only 20% more cores than the 7900x. Taking into account the efficiency most software loses as core counts rise, it will end up not much faster than the 7900x in multithreaded work and slower in single threads. On the other side, the 7940x is rumored to have more cores and higher clocks than the 7920x for not much more money.
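The clock/core trade-off above can be sketched with Amdahl's law. A back-of-the-envelope model, using only the rumored relative figures from this post (14% lower clock, 20% more cores); the parallel fraction `p` is a made-up illustration, not a measurement:

```python
# Back-of-the-envelope model of the rumored 7920x vs 7900x trade-off,
# using Amdahl's law. The 14%-lower-clock / 20%-more-cores figures are
# the rumored ones; the parallel fraction p is purely illustrative.

def relative_throughput(cores, clock, p):
    """Clock multiplied by the Amdahl's-law speedup for `cores` cores."""
    speedup = 1.0 / ((1.0 - p) + p / cores)
    return clock * speedup

p = 0.9  # hypothetical: 90% of the workload parallelizes
t_7900x = relative_throughput(10, 1.00, p)  # baseline clock
t_7920x = relative_throughput(12, 0.86, p)  # 14% lower base clock

# Even with two extra cores, the lower clock wins at p = 0.9:
print(round(t_7920x / t_7900x, 2))  # → 0.93
```

Only as the workload approaches perfectly parallel (p → 1.0) do the two extra cores outweigh the clock deficit, and even then by just a few percent, which is the "not much faster in multithreaded work" point above.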

That is, unless you have newer rumors of better clocks than the original 7920x rumors. My whole post could be wrong if Intel comes out with better specs than was originally rumored.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I think the most ridiculous (IMO) is regarding anti-aliasing. I read replies on the forum about changed AA algorithms in World of Warcraft. They linked a YouTube video because movement shows shimmering effects and such.

A few days ago I was playing Portal 2. OK, there I noticed the lack of AA was pretty bad. But you know when I stopped caring? When I started moving around and playing. I'd think that if you have a bit of a competitive nature and want to play better, your attention shifts elsewhere, meaning AA isn't really noticed.

I play on a 1280x1024 monitor, by the way. I thought it was a bit jarring, since it's a 5:4 monitor, and a pretty old, used one at that. But hey, it was $40 CDN. No complaints. Three days in, I don't notice the difference anymore.

What about World of Warcraft? I watched the YouTube video. I had to watch closely to see any difference between no AA and 3-4 other AA methods. They were saying it's due to compression, etc., but again, if you are raiding or doing PvP, I doubt it should bother you.

AA would be fine, if it weren't for the massive performance loss.

Sorry for going off topic.

I am playing games (from GOG) on a 9-year-old computer with an 8800GT GPU, so I am well off the modern standard, though AA is something I really aim to have. Jaggies bother me. Even the lowest 2x setting is usually enough to get me by.

The thing that gets me is the obsession with "Ultra" settings. I think a GTX 1060 would be all the GPU I need (to play something modern like The Witcher 3), because Medium settings look perfectly fine to me. Heck, on side-by-side videos I can't even tell the difference most of the time.
 

TheF34RChannel

Senior member
May 18, 2017
786
309
136
The leaker (royalk) seems to think CNL PCH also supports future Cannon Lake-S CPUs, but we know this SKU was canned a long time ago. I wonder if he got confused with Ice Lake-S. I speculated about ICL-S compatibility for socket LGA 1151 with CNL PCH back in June.

Ice Lake, and then by default also Tiger Lake, compatibility would be interesting! However I seriously doubt that. @Sweepr So what is Cannonlake-PCH, or what will be replacing it...?
 
Last edited:

gx_saurav

Senior member
Dec 5, 2012
247
61
101
about.me
I wish they wouldn't. They have limited resources and time, like everything in life.

While I do agree that gameplay is more important (which is why I am still playing Watch Dogs 2 even after completing the story missions), graphics matter to me too.

Most people play one or two games repeatedly, such as Battlefield or Call of Duty in a multiplayer scenario, and playing with a bunch of friends at home is always a better experience, because then no one cares about graphics.

Extracting more performance from your system is always a good thing. For a decade I have felt that hardware is way ahead of software when it comes to games, and we need to give developers time to optimize their games.
 
  • Like
Reactions: Kuosimodo

Timmah!

Golden Member
Jul 24, 2010
1,395
602
136
Why the 7920x? From the data that has been rumored, it seems like the ugly stepchild of that family.

If the rumors are correct, it has 14% lower base clocks for only 20% more cores than the 7900x. Taking into account the efficiency most software loses as core counts rise, it will end up not much faster than the 7900x in multithreaded work and slower in single threads. On the other side, the 7940x is rumored to have more cores and higher clocks than the 7920x for not much more money.

That is, unless you have newer rumors of better clocks than the original 7920x rumors. My whole post could be wrong if Intel comes out with better specs than was originally rumored.

I would not call it an ugly stepchild :-D It interests me as much as the 7900x, which however has its well-known downsides, and I am curious whether the 7920x won't by chance turn out better. For all we know, it could be soldered, or even if not, it could have somewhat better thermal properties, given that it's going to be based on another, bigger die than the 7900x...

Additionally, given that I own just a 240mm Corsair AIO, I would not be able to OC a 7900x higher than 4.3-4.4 GHz anyway. And since the 7920x can already Turbo Boost 2.0 to 4.3 GHz out of the box (granted, on two or just one core), I would hazard a guess it will be capable of an all-core overclock like that, and I strongly believe those additional 2 cores won't make much difference.

The 7940x is just way too much money, I don't want to wait until October, and the single-core speed disadvantage compared to the 7900x is going to be even more pronounced. The 7920x is already expensive enough, but if I find a way to not pay VAT, then it will be the same amount of money as the 7900x with VAT :p

I wish Intel would drop their prices to match Threadripper: the 7900x to the 1920x price level, the 7920x to the 1950x price.
 

John Carmack

Member
Sep 10, 2016
153
246
116
I would bet 99%+ of motherboards only ever have one CPU in them. It's one of those complaints to have something to complain about, and will be utterly forgotten a couple of months after launch of the new socket standard.

Only because in recent years a certain company with control of most of the consumer market has deigned us only worthy of having a socket or chipset for a year or two. I have extra Slot 1 and Socket A processors lying around, and it wasn't because they stopped working.

This is one of those very temporary blips that cause gnashing of teeth on forums, but have no actual effect in the real world since the VAST majority of people never upgrade their CPUs in the first place.

The vast majority of people don't custom-build PCs or overclock either; can't wait to see those disappear for good and hear the gnashing of teeth on the forums, which has no effect on the real world.
 
  • Like
Reactions: Kuosimodo

formulav8

Diamond Member
Sep 18, 2000
7,004
522
126
If I'm reading that right, why would they do that? Only compatible with KL-X? I was thinking KL-X only made the most sense as a stepping-stone purchase to HEDT? :/
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Updated roadmaps out guys! Some interesting tidbits I got (first look):

Intel 2017-2018 DT Roadmaps

[Four roadmap slide images]

http://bbs.pceva.com.cn/thread-140322-1-1.html


- There will be 6 consumer Coffee Lake-S SKUs at launch (including both 4C+GT2 and 6C+GT2); we already know the specs for four of them. Considering the 6C+GT2 die covers both the Core i5 and Core i7 lineups, two Core i3 models make sense, and both should be 4C+GT2 (the Core i3-8300 above and another one?)
- Production window for CFL-S 4+2 and 6+2 is the same, ww34-41 2017 - which means August 21 to October 9
- Up to 24 PCIe 3.0 lanes
- 95W (enthusiasts), 65W (corporate / mainstream) and 35W (low power) TDP SKUs, as expected
- KBL-R PCH = Z370 = high-end chipset, which launches first and apparently won't get replaced by a Z390 in early 2018 as Dr.MOLA indicated
- CNL PCH = the rest of the 300 series, including a programmable quad-core audio DSP, USB 3.1 Gen 2, Wi-Fi AC, SDXC 3.0, and Thunderbolt with DisplayPort 1.4 - even H110 will be replaced

Typo on first slide, 4+2 WW01'17.
 

coffeeblues

Member
Jun 23, 2017
49
18
36
Updated roadmaps out guys! Some interesting tidbits I got (first look):

..

- KBL-R PCH = Z370 = high-end chipset, which launches first and apparently won't get replaced by a Z390 in early 2018 as Dr.MOLA indicated
- CNL PCH = the rest of the 300 series, including a programmable quad-core audio DSP, USB 3.1 Gen 2, Wi-Fi AC, SDXC 3.0, and Thunderbolt with DisplayPort 1.4 - even H110 will be replaced

I wonder what is going on:

Z370 looks to be neither forward nor backward compatible, and the high-end overclockable CNL PCH could be delayed to extend the shelf life of the above?
 

Ajay

Lifer
Jan 8, 2001
15,332
7,792
136
So, is Cascade Lake being released Q3 2018?
Rather than higher clocks, I'm hoping for something like a 20% decrease in power @ ISO clocks. And, hopefully, a further price cut.
 
  • Like
Reactions: Pick2

Ajay

Lifer
Jan 8, 2001
15,332
7,792
136
More like a very close sibling, since the socket pin-out differs slightly, whereas a pure rebadge I would regard as identical, with just a new name slapped on.

What does the chipset have to do with the socket on modern processor SOCs? The connection is mainly (solely?) through the 4 DMI 3.0 links. Maybe I'm just daft today, but I don't see the relationship.
 
  • Like
Reactions: Pick2

TheF34RChannel

Senior member
May 18, 2017
786
309
136
What does the chipset have to do with the socket on modern processor SOCs? The connection is mainly (solely?) through the 4 DMI 3.0 links. Maybe I'm just daft today, but I don't see the relationship.

Lol I'm good for nothing today ha ha ha; everything's going wrong so why not this :D (been up since 3:30 and it's 20:20 now). Indeed, you're absolutely right of course. You should disregard that wonky post and in fact, I'll remove it.
 
  • Like
Reactions: lobz