Intel Skylake / Kaby Lake

Aug 11, 2008
10,451
642
126
Here is a test of the 4770R with more normal settings than the strange hodgepodge that Anand uses. Anand must have used some heavy AA in his tests, because Tomb Raider at 1080p gives much higher results than he got. It also gives results in a few other games and a comparison to Kaveri and the HD 4600. Iris Pro test.
 

MisterLilBig

Senior member
Apr 15, 2014
291
0
76
The Brix Pro numbers are for GT3e, not GT4e.

How are they relevant?

Why did you make up a number for GT4e?
Quote:
Skylake GT4e = Tomb Raider 1080p15 Avg FPS = "Acceptable"

It's really bad when a person starts commenting without reading.

I didn't give any expectation of what GT4e is. I gave a metric of what GT4e has to be to accomplish something. You know, just like every company posts that product X is Y percent better than product Z.

People here are expecting Skylake to have a 50% improvement over Broadwell on the iGPU. Broadwell has about a 30% increase. Yet Intel needs a 200% improvement over 65W Haswell GT3e to hit 15 Avg FPS on those settings I showed for Tomb Raider.



I showed all the examples that Anandtech showed. Dirt Showdown, for instance, would need something like a 240% improvement to hit 60 Avg FPS on those settings.
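To make the arithmetic behind those figures explicit, here is a minimal sketch (Python, purely for illustration) of how a "required improvement" percentage like the 200% or 240% above is derived; the baseline FPS values are hypothetical, back-calculated from those percentages rather than taken from any benchmark:

```python
def required_improvement(baseline_fps: float, target_fps: float) -> float:
    """Percentage uplift needed to go from baseline_fps to target_fps."""
    return (target_fps / baseline_fps - 1.0) * 100.0

# Hypothetical baselines implied by the percentages above (not measured data):
# ~5 FPS for 65W Haswell GT3e in Tomb Raider -> 200% needed to reach 15 Avg FPS
print(required_improvement(5.0, 15.0))    # 200.0
# ~17.6 FPS in Dirt Showdown -> roughly 240% needed to reach 60 Avg FPS
print(required_improvement(17.6, 60.0))   # ~240.9
```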

Here is a test of the 4770R with more normal settings than the strange hodgepodge that Anand uses. Anand must have used some heavy AA in his tests, because Tomb Raider at 1080p gives much higher results than he got.

What is "normal" settings? Oh and what are the systems specs...is there a link there with them? And of course it would give higher results if the settings are lower.
 

applepipe419

Junior Member
May 11, 2015
1
0
0
I'm betting that Apple uses this architecture for the next 15" MacBook, and that they reduce the bezel size and the footprint by about half a pound and a few cm at the thickest point. Just saying. Hopefully this will be announced at WWDC 15.
 

MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
It's really bad when a person starts commenting without reading.

I didn't give any expectation of what GT4e is. I gave a metric of what GT4e has to be to accomplish something. You know, just like every company posts that product X is Y percent better than product Z.

People here are expecting Skylake to have a 50% improvement over Broadwell on the iGPU. Broadwell has about a 30% increase. Yet Intel needs a 200% improvement over 65W Haswell GT3e to hit 15 Avg FPS on those settings I showed for Tomb Raider.




I showed all the examples that Anandtech showed. Dirt Showdown, for instance, would need something like a 240% improvement to hit 60 Avg FPS on those settings.



What is "normal" settings? Oh and what are the systems specs...is there a link there with them? And of course it would give higher results if the settings are lower.

1080p60 and 1080p60 with extreme settings are two entirely different things. I'm not sure why you keep pointing to 1080p benchmarks with all the eyecandy turned on and then expect 60fps as the minimum acceptable framerate. Anandtech didn't hit 60fps with max settings on a 230W TDP GTX770 discrete card and a hex core extreme CPU in Tomb Raider or Sleeping Dogs.
[benchmark charts: 67866.png, 67865.png]
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Honestly, I care much more about 60fps 10-bit HEVC decoding than gaming. I hope the final full decoder isn't restricted to the high end.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
The demo was interesting, but what does it tell us, really?

This demo was a bit different because the Skylake CPU was mounted inside of a laptop chassis, not just in an early development motherboard. One component of the demo was a 3DMark graphics benchmark, which may be a hint at the focus of Skylake's performance improvements. Skaugen also claimed the system can decode and display 4K video in real time.

http://techreport.com/news/27028/intel-demos-skylake-silicon-production-expected-in-2h-2015
 

Dave2150

Senior member
Jan 20, 2015
639
178
116
1080p60 and 1080p60 with extreme settings are two entirely different things. I'm not sure why you keep pointing to 1080p benchmarks with all the eyecandy turned on and then expect 60fps as the minimum acceptable framerate. Anandtech didn't hit 60fps with max settings on a 230W TDP GTX770 discrete card and a hex core extreme CPU in Tomb Raider or Sleeping Dogs.
[benchmark charts: 67866.png, 67865.png]

Anandtech has been using very old GPUs for testing $1000 CPUs for quite a while now.

Testing with high-end cards would yield more accurate results; who the hell would buy a 5960X and couple it with a budget GTX 770 for gaming?...
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
Anandtech has been using very old GPUs for testing $1000 CPUs for quite a while now.

Testing with high-end cards would yield more accurate results; who the hell would buy a 5960X and couple it with a budget GTX 770 for gaming?...

I'm not sure what that has to do with expecting Skylake's iGPU to hit high fps levels...

Do you mean we should expect GTX980 performance? :D
 

MisterLilBig

Senior member
Apr 15, 2014
291
0
76
1080p60 and 1080p60 with extreme settings are two entirely different things. I'm not sure why you keep pointing to 1080p benchmarks with all the eyecandy turned on and then expect 60fps as the minimum acceptable framerate.

Because that is the tone of gaming in these forums for everything else.
 
Aug 11, 2008
10,451
642
126

So what is the footnote to LGA (Skylake-S) 4+4e? You must have cut that off. It seems to me that there was a qualifier in that footnote that introduced some uncertainty about whether it would come to desktop. In any case, that seems like an outdated roadmap, since it was showing Broadwell-H for 2014, and we are well into Q2 '15 and haven't seen it yet.

Intel really needs to get this roadmap rolling. We have all these projections, but no quad-core Broadwell laptops, no unlocked Broadwell for desktop, no firm release date for Skylake at all.
Basically no quad-core 14nm after all this time.
 

MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
Because that is the tone of gaming in these forums for everything else.
I would say most people in these forums have a much more reasonable view of acceptable performance from an iGPU, but you are free to have your own opinion. No iGPU will hit 1080p60 in a demanding modern game at high settings though, regardless of vendor.

You mention laptops. Like, the slide. I am just pointing out how far away Intel is, in its best-case scenario (65W TDP), from getting to 1080p 60FPS. Meaning, you would have to increase my results by quite a bit for a laptop form factor. Intel's iGPU is so far behind and they aren't doing much to improve upon it.
Probably the most efficient low-power, slot-powered dGPU is the Maxwell-based 750 Ti, which pulls 66W under Furmark according to TechPowerUp. Even it doesn't come close to 60FPS in Tomb Raider (with TressFX disabled) if you crank things up.
[image: tombraider_1920_1080.gif]


Intel's iGPU might be substandard compared to those made by AMD, but nothing you can fit into a 65W envelope will let you play a game like Tomb Raider with maxed settings at 60FPS. That's not what they're for. If you want to max settings at 1080p and still get 60FPS, just do what everyone else does and pony up the money and power budget for a GTX 980 or R9 290X.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
Yeah, these are totally fake - Intel wouldn't release the 6700 being slower than the 4790K, which is two generations old.

I agree that the benches are very suspect.

I would think a 3.4 GHz 6700 should be about tied with a 4.0 GHz 4790K, with a 4.0 GHz 6700K being noticeably faster.

The ST CB15 jump from the 3.4 GHz 6700 at 158 to the 4.0 GHz 6700K at 165 seems pretty small for 600 more MHz. It only gets you 7 more points.

The 400 MHz jump from the 4790 to the 4790K is 145 to 159, or 14 points.

600 MHz on Skylake got 7 more points, whereas 400 MHz on Haswell got 14 more points.

Doesn't seem to fit.
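To spell out the scaling being compared, here is a minimal sketch (Python, purely illustrative) of the points-per-100-MHz arithmetic using the scores and clocks quoted above:

```python
def points_per_100mhz(score_low: float, score_high: float, mhz_delta: float) -> float:
    """Cinebench 15 single-thread points gained per 100 MHz of extra clock."""
    return (score_high - score_low) / (mhz_delta / 100.0)

# Skylake (quoted figures): 6700 at 3.4 GHz scores 158, 6700K at 4.0 GHz scores 165
print(points_per_100mhz(158, 165, 600))   # ~1.17 points per 100 MHz
# Haswell: 4790 scores 145, 4790K scores 159, a 400 MHz jump
print(points_per_100mhz(145, 159, 400))   # 3.5 points per 100 MHz
```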
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
The 5960X scores 177 at that site. It scores 140 on Anandtech. A Xeon peaks at 193.

The first 4790K scores 159, while on Anandtech it gets 181.

[image: 67034.png]
 

Serandur

Member
Apr 8, 2015
38
0
6
I would say most people in these forums have a much more reasonable view of acceptable performance from an iGPU, but you are free to have your own opinion. No iGPU will hit 1080p60 in a demanding modern game at high settings though, regardless of vendor.


Probably the most efficient low-power, slot-powered dGPU is the Maxwell-based 750 Ti, which pulls 66W under Furmark according to TechPowerUp. Even it doesn't come close to 60FPS in Tomb Raider (with TressFX disabled) if you crank things up.
[image: tombraider_1920_1080.gif]


Intel's iGPU might be substandard compared to those made by AMD, but nothing you can fit into a 65W envelope will let you play a game like Tomb Raider with maxed settings at 60FPS. That's not what they're for. If you want to max settings at 1080p and still get 60FPS, just do what everyone else does and pony up the money and power budget for a GTX 980 or R9 290X.
I'm not taking any sides in any debate or anything, but the only AA options Tomb Raider has are FXAA, 2xSSAA, and 4xSSAA. That test is putting the 750 Ti through 4xSSAA (and forgetting the "SS" in the label), which is basically rendering at 4K and downsampling to 1920x1080, so...

Just saying, that is not an accurate measure of the 750 Ti's capabilities unless you're expecting good 4K performance.
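For anyone who wants to check that equivalence, here is a minimal sketch (Python, just the arithmetic) showing that 4x supersampling at 1080p shades the same number of pixels as a native 4K UHD frame:

```python
def shaded_pixels(width: int, height: int, ssaa_factor: int = 1) -> int:
    """Pixels actually rendered per frame; SSAA renders at a higher internal
    resolution (ssaa_factor times the pixel count) and downsamples to the output."""
    return width * height * ssaa_factor

native_1080p = shaded_pixels(1920, 1080)        # 2,073,600 pixels
ssaa4x_1080p = shaded_pixels(1920, 1080, 4)     # 8,294,400 pixels
native_uhd   = shaded_pixels(3840, 2160)        # 8,294,400 pixels

print(ssaa4x_1080p == native_uhd)  # True: 4xSSAA at 1080p is roughly a 4K workload
```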
 

MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
I'm not taking any sides in any debate or anything, but the only AA options Tomb Raider has are FXAA, 2xSSAA, and 4xSSAA. That test is putting the 750 Ti through 4xSSAA (and forgetting the "SS" in the label), which is basically rendering at 4K and downsampling to 1920x1080, so...

Just saying, that is not an accurate measure of the 750 Ti's capabilities unless you're expecting good 4K performance.

I wouldn't expect the 750 Ti to do well under those settings, just like I wouldn't expect any iGPU to do well. I wouldn't try to play the more demanding games at 1080p with all the graphics effects on even on a 960-class card. That was kind of the point.
 

mikk

Diamond Member
May 15, 2012
4,311
2,395
136
Yeah, these are totally fake - Intel wouldn't release the 6700 being slower than the 4790K, which is two generations old.



They allow this because the much lower clocked non-K isn't a competitor to the 4790K; the 6700K will do the job as the replacement for that model. On this page the 6700K scores 10.58 in CB11.5, which is in line with this test: http://www.pcfrm.com/intel-i7-6700k-vs-i7-4790k/

Of course that can't prove all the scores on this page.
 
Mar 10, 2006
11,715
2,012
126
So what is the footnote to LGA (Skylake-S) 4+4e? You must have cut that off. It seems to me that there was a qualifier in that footnote that introduced some uncertainty about whether it would come to desktop. In any case, that seems like an outdated roadmap, since it was showing Broadwell-H for 2014, and we are well into Q2 '15 and haven't seen it yet.

Intel really needs to get this roadmap rolling. We have all these projections, but no quad-core Broadwell laptops, no unlocked Broadwell for desktop, no firm release date for Skylake at all.
Basically no quad-core 14nm after all this time.

It's a "conditional SKU" per this slide from mikk:

[image: txsog9it.png]


http://forums.anandtech.com/showthread.php?t=2406498

If I had to guess, I'd say it depends on how well BDW-C does.