
Intel Skylake / Kaby Lake


escrow4

Diamond Member
Feb 4, 2013
3,333
113
106
Why wouldn't he? The i7 really doesn't warrant the extra $100 unless you're running stock. And they are K models, after all. Who is running stock :/

Granted this is from a gamer view.
GTA V comes to mind for one. Most AAA games over the past year show gains with an i7 - see gamegpu. Those threads will come into use over the next few years. It doesn't make sense to spend that much and not spend an extra $100 on Hyper-Threading.
 

crashtech

Diamond Member
Jan 4, 2013
9,638
1,520
126
GTA V comes to mind for one. Most AAA games over the past year show gains with an i7 - see gamegpu. Those threads will come into use over the next few years. It doesn't make sense to spend that much and not spend an extra $100 on Hyper-Threading.
For a given budget, a 6600K + 980 Ti makes more sense than a similarly priced 6700K + 980 in nearly all scenarios. This shouldn't even be arguable. It's presumptuous to say he should have spent more; the money available was well allocated, imo.
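For illustration, the equal-budget argument above can be sketched numerically. A minimal Python sketch; all prices and relative-performance figures are rough illustrative assumptions (approximate 2015 US street prices, with the 980 Ti taken as the 100% baseline), not measured data:

```python
# Back-of-the-envelope comparison of the two builds discussed above.
# Prices and the 980's relative performance are hypothetical placeholders.

builds = {
    "6600K + 980 Ti": {"cpu": 243, "gpu": 650, "gpu_perf": 100},  # 980 Ti = 100% baseline
    "6700K + 980":    {"cpu": 350, "gpu": 500, "gpu_perf": 75},   # assume 980 ~ 75% of a 980 Ti
}

for name, b in builds.items():
    total = b["cpu"] + b["gpu"]
    # In a GPU-bound game, frame rate tracks GPU throughput almost 1:1,
    # so relative performance per dollar is roughly gpu_perf / total cost.
    print(f"{name}: ${total} total, {b['gpu_perf']} rel. GPU perf, "
          f"{b['gpu_perf'] / total * 1000:.1f} perf per $1000")
```

Under these assumptions the 6600K + 980 Ti build delivers more GPU-bound performance per dollar; the conclusion only flips if CPU-bound titles dominate the workload.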
 

tential

Diamond Member
May 13, 2008
7,363
641
121
GTA V comes to mind for one. Most AAA games over the past year show gains with an i7 - see gamegpu. Those threads will come into use over the next few years. It doesn't make sense to spend that much and not spend an extra $100 on Hyper-Threading.
Or the 5820K. The incremental cost-to-benefit ratio of going up from the i5 to the i7 range is too great to ignore when you're in the high-end GPU bracket.

Or we could just be real and say that i5s are for peasants =p jk jk.
 

pandemonium

Golden Member
Mar 17, 2011
1,775
75
91
Not getting an i7 in that rig...
That is a very poor decision. Your CPU choice is going to deliver top-of-the-line CPU performance for the next 5-6 years. The GTX 980 Ti will come and go in GPU land; in 2 years, it'll be a midrange GPU. Your i7 will still be a top-of-the-line CPU in 2 years, and don't forget, it'll still be top of the line in 5 years. It's not going anywhere. The GTX 980 Ti will be a nobody in 5 years. And this is not to say I don't love the GTX 980 Ti - you'll see me hail that product all day; it's the high-end card that actually feels like good value.

Either way, people don't always make the best decisions. Have a great time with your rig, but not getting an i7 in the high-end market is really just about the most suboptimal thing possible. I could write paragraphs about it, but meh, I hope you're at least happy with the GTX 980 Ti. As you can tell, I think Nvidia did a bang-up job with that card, especially now that I see the competition and have a comparison point.



I budgeted and realized that an i7 would be a better choice in the long run. I saw how relevant Sandy Bridge is even today. I realized that a good mobo/CPU was going to last me 5 years, and I knew that 3 years later I'd be wanting a GPU upgrade, since a node shrink was bound to happen. But hey, I'm looking forward to skipping 28nm if possible.

If I get the MSI GTX 980 Ti, it's being sold the second the new AMD/Nvidia GPUs come out with a significant performance improvement - knowing I'd be losing $100-200 on the sale. But most likely I'll wait: everyone has a massive backlog of games to play, most aren't graphically intense, so I can pick the ones that aren't and wait for the GTX 980 Ti / Fury X successor to blow me away and get me to that lovely 4K point I want on the games I really want to play.

If DirectX 12 mGPU works on the games I actually want to play, well... I think it'll be hard for me to even get a card, considering how much high-end gamers will invest in mGPU setups if it works as well as DirectX 12 promises.
This.

Saving $100 today on an underpowered CPU paired to one of the most powerful GPUs on the market is silly to me. If the budget absolutely couldn't accommodate the additional cost, I would've waited for sales or until the budget allowed the additional $100, or sold blood or something.

Think more long term. Even with technology as fast-paced and ever-changing as it is, considering the financial longevity of your purchases is still the wisest move.
 

witeken

Diamond Member
Dec 25, 2013
3,876
154
106
Just read the original 2012 Haswell deep dive and a few battery and platform tidbits from the Haswell MacBook Air review. Anand really is an irreplaceable loss. It can't be overstated; he was great.

But Apple pays more, so who cares, right?

Edit: What I'm getting at is that the Skylake deep dive won't be the same. (Not that Ian or Smith don't do good work - I like the Broadwell-Y coverage - but it won't be the same.)
 

escrow4

Diamond Member
Feb 4, 2013
3,333
113
106
For a given budget, a 6600K + 980 Ti makes more sense than a similarly priced 6700K + 980 in nearly all scenarios. This shouldn't even be arguable. It's presumptuous to say he should have spent more; the money available was well allocated, imo.
The point was/is that a 6700K + 980 Ti would only be an extra 100/110. In terms of overall cost, it's trivial.
 

Tovarisc

Member
Jun 12, 2015
50
0
0
Kinda regretting that I posted anything about my upcoming rig; I didn't realize how much I would get criticized for getting an i5 over an i7. That said, the last time I built was early 2010, and I took an i5 ;)

Edit: Also, I know how ridiculous this may sound, but going even that lousy 100€ over budget wasn't an option.

Not getting an i7 in that rig...
That is a very poor decision. Your CPU choice is going to deliver top-of-the-line CPU performance for the next 5-6 years. The GTX 980 Ti will come and go in GPU land; in 2 years, it'll be a midrange GPU. Your i7 will still be a top-of-the-line CPU in 2 years, and don't forget, it'll still be top of the line in 5 years. It's not going anywhere. The GTX 980 Ti will be a nobody in 5 years. And this is not to say I don't love the GTX 980 Ti - you'll see me hail that product all day; it's the high-end card that actually feels like good value.

Either way, people don't always make the best decisions. Have a great time with your rig, but not getting an i7 in the high-end market is really just about the most suboptimal thing possible. I could write paragraphs about it, but meh, I hope you're at least happy with the GTX 980 Ti. As you can tell, I think Nvidia did a bang-up job with that card, especially now that I see the competition and have a comparison point.
Oh wow, condescending much?
 
Last edited:

Nothingness

Platinum Member
Jul 3, 2013
2,158
406
126
Just read the original 2012 Haswell deep dive and a few battery and platform tidbits from the Haswell MacBook Air review. Anand really is an irreplaceable loss. It can't be overstated; he was great.

But Apple pays more, so who cares, right?

Edit: What I'm getting at is that the Skylake deep dive won't be the same. (Not that Ian or Smith don't do good work - I like the Broadwell-Y coverage - but it won't be the same.)
You can partly put the blame on Intel: they haven't provided us with the same level of micro-arch details since Haswell :(
 

witeken

Diamond Member
Dec 25, 2013
3,876
154
106
You can partly put the blame on Intel: they haven't provided us with the same level of micro-arch details since Haswell :(
We got quite a few details about Broadwell, Gen8, and 14nm. This is the first time we don't get an architecture disclosure (long) before release.

Still better than Qualcomm or Apple, though.
 

ShintaiDK

Lifer
Apr 22, 2012
20,395
128
106
Kinda regretting that I posted anything about my upcoming rig, didn't realize how much I would get criticized for getting i5 over i7. That said last time I build was early 2010 and took i5 ;)

Edit: Also I know how ridiculous this may sound, but going even that lousy 100€ over budget wasn't option.
Nothing wrong with an i5 at all. If it weren't for the stock speed of the i7, I would pick one myself.
 

witeken

Diamond Member
Dec 25, 2013
3,876
154
106
Can you point me to Intel literature about the Broadwell micro-architecture?


Definitely :biggrin:

http://www.anandtech.com/show/8355/intel-broadwell-architecture-preview

Remember Broadwell is a tick, so the changes can be summed up in one slide / one page of the article, apparently.



Edit: Isn't it interesting that Skylake is already the second generation of Intel's second-generation FinFET?

Only Samsung has one product that is on its first-generation FinFET.
 
Last edited:

CakeMonster

Senior member
Nov 22, 2012
998
85
91
Kinda regretting that I posted anything about my upcoming rig; I didn't realize how much I would get criticized for getting an i5 over an i7.
I know the feeling... In a similar example, I got hammered on another forum for saying that I wanted a 6700K over a 5820K. I just think the guaranteed stock clockspeed with better IPC is more useful to me than the extra cores. It's not that I don't want the extra cores; it's just that the alternatives are what they are: a hotter, much lower-clocked HW-E with lower nominal OC potential, versus a newer Skylake architecture that delivers at stock.
 

CakeMonster

Senior member
Nov 22, 2012
998
85
91
Does anyone have any idea what it will take to fix that discrete graphics "bug" on Skylake platforms, as described in the AT review? It's hardly a big issue, since it takes really close examination of several benchmarks to even identify it, but I'm wondering if this can be fixed with driver/software updates or BIOS updates, or if it will take a new motherboard/CPU to sort it out.

Any theories?

http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/10

There’s no easy way to write this.

Discrete graphics card performance decreases on Skylake over Haswell.

This doesn’t particularly make much sense at first glance. Here we have a processor with a higher IPC than Haswell but it performs worse in both DDR3 and DDR4 modes. The amount by which it performs worse is actually relatively minor, usually -3% with the odd benchmark (GRID on R7 240) going as low as -5%. Why does this happen at all?

So we passed our results on to Intel, as well as a few respected colleagues in the industry, all of whom were quite surprised. During a benchmark, the CPU performs tasks and directs memory transfers through the PCIe bus and vice versa. Technically, the CPU tasks should complete quicker due to the IPC and the improved threading topology, so that only leaves the PCIe to DRAM via CPU transfers.

Our best guess, until we get to IDF to analyze what has been changed or a direct explanation from Intel, is that part of the FIFO buffer arrangement between the CPU and PCIe might have changed with a hint of additional latency. That being said, a minor increase in PCIe overhead (or a decrease in latency/bandwidth) should be masked by the workload, so there might be something more fundamental at play, such as bus requests being accidentally duplicated or resent due to signal breakdown. There might also be a tertiary answer of an internal bus not running at full speed. To be sure, we retested some benchmarks on a different i7-6700K and a different motherboard, but saw the same effect. We’ll see how this plays out on the full-speed tests.
 

flopper

Senior member
Dec 16, 2005
739
19
76
Kinda regretting that I posted anything about my upcoming rig; I didn't realize how much I would get criticized for getting an i5 over an i7. That said, the last time I built was early 2010, and I took an i5 ;)

Edit: Also, I know how ridiculous this may sound, but going even that lousy 100€ over budget wasn't an option.



Oh wow, condescending much?
You actually listen to what people write on the net?
:D
 

coercitiv

Diamond Member
Jan 24, 2014
4,166
5,066
136
didn't realize how much I would get criticized for getting an i5 over an i7. That said, the last time I built was early 2010, and I took an i5
Out of pure curiosity, which games would perform significantly better on an i7 2600K + GTX 980 versus an i5 2500K + GTX 980 Ti?
 

LTC8K6

Lifer
Mar 10, 2004
28,523
1,570
126
GTA V comes to mind for one. Most AAA games over the past year show gains with an i7 - see gamegpu. Those threads will come in use over the next few years. Doesn't make sense you spend that much and didn't spend an extra 100 on hyperthreading.
I'm not seeing any significant gains with an i7 over an i5 in GTA V.

I have seen benches where the HT makes the frame rate slightly worse.
 

LTC8K6

Lifer
Mar 10, 2004
28,523
1,570
126
Kinda regretting that I posted anything about my upcoming rig; I didn't realize how much I would get criticized for getting an i5 over an i7. That said, the last time I built was early 2010, and I took an i5 ;)
Don't give it another thought.

You made the right CPU decision.
 

dahorns

Senior member
Sep 13, 2013
550
83
91
Does anyone have any idea what it will take to fix that discrete graphics "bug" on Skylake platforms, as described in the AT review? It's hardly a big issue, since it takes really close examination of several benchmarks to even identify it, but I'm wondering if this can be fixed with driver/software updates or BIOS updates, or if it will take a new motherboard/CPU to sort it out.

Any theories?

http://www.anandtech.com/show/9483/intel-skylake-review-6700k-6600k-ddr4-ddr3-ipc-6th-generation/10
I thought we had figured it out: Skylake is hampered by the poor latency of the lower-clocked DDR4 parts. Other reviews have shown ~10% gains when running faster DDR4 memory.
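The latency theory is easy to sanity-check with the standard first-word-latency formula: CAS cycles divided by the memory clock, where the memory clock is half the transfer rate. A small sketch; the specific kits are illustrative examples, not the ones used in any particular review:

```python
# First-word latency for a DRAM kit:
#   latency_ns = cas_cycles / memory_clock_ghz
# and the memory clock (MHz) is half the transfer rate (MT/s), so:
#   latency_ns = cas * 2000 / transfer_rate_mts

def first_word_latency_ns(cas: int, transfer_rate_mts: int) -> float:
    """Time for the first word after a read command, in nanoseconds."""
    return cas * 2000 / transfer_rate_mts

kits = [
    ("DDR3-1600 CL9",  9, 1600),   # typical Haswell-era DDR3 kit
    ("DDR4-2133 CL15", 15, 2133),  # common launch-spec Skylake DDR4 kit
    ("DDR4-3000 CL15", 15, 3000),  # faster DDR4 kit from later reviews
]

for name, cas, rate in kits:
    print(f"{name}: {first_word_latency_ns(cas, rate):.2f} ns")
```

So baseline DDR4-2133 CL15 is actually slower to the first word (~14.1 ns) than typical DDR3-1600 CL9 (~11.25 ns), which is consistent with faster DDR4 kits recovering the lost performance.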
 
Aug 11, 2008
10,457
641
126
Nothing wrong with an i5 at all. If it weren't for the stock speed of the i7, I would pick one myself.
Well, there are two issues here. First, no one can reasonably argue right now that a 6600K plus 980 Ti will not perform better in all or nearly all games than an i7 with a hundred-dollar-cheaper GPU.

The 6600K is a great processor. However, although they stated their case somewhat... over-enthusiastically, I agree with the posters who were arguing for the i7 in this price bracket. With games becoming more multithreaded, DX12 coming, and a new generation of 14/16nm dGPUs (probably with HBM) coming, I would allocate more resources toward the CPU now, even at the cost of a less optimal current system, for one that is likely to be better long term. This is especially true since Intel has delayed 10nm, and there is not really much new on the horizon (except Zen??!!!) on the CPU front.
 
Last edited:
