[Techno-Kitchen] i5-6400 3.1 GHz vs. i5-6400 4.5 GHz w/ GTX 1060 @ 2 GHz => CPU bottlenecking explored

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
i5-6400 vs. i5-6400 @ 4.5 GHz w/ GTX 1060 6GB @ 2 GHz
> The CPU bottleneck in BF1 is tremendous, with a stock i5 Skylake severely bottlenecking the GTX1060.

0:21 min = 48-51 fps vs. 78-81 fps
0:32 min = 49 fps vs. 77 fps
0:51-0:53 min = 46-48 fps vs. 68-69 fps
https://www.youtube.com/watch?v=IdV7zhDfA4Y
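For scale, here is a rough calculation of the uplift in those three fps pairs (a sketch using midpoints for the quoted ranges):

```python
# Percent gain of the 4.5 GHz run over stock, from the fps pairs quoted above.
# Midpoints are used where a range was given (e.g. 48-51 -> 49.5).
pairs = {
    "0:21": (49.5, 79.5),
    "0:32": (49.0, 77.0),
    "0:51-0:53": (47.0, 68.5),
}
for timestamp, (stock_fps, oc_fps) in pairs.items():
    gain = (oc_fps / stock_fps - 1) * 100
    print(f"{timestamp}: +{gain:.0f}%")
```

That works out to roughly +61%, +57%, and +46%, which tracks the ~45% clock increase (3.1 to 4.5 GHz) you'd expect in a near-fully CPU-bound scene.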

i5-6400 (3.1 GHz) vs. i5-6400 @ 4.5 GHz in all new 2016 games (GTX 1060 @ 2 GHz)
https://www.youtube.com/watch?v=lOl21O_8_RI

For anyone who doesn't have the funds for a K-series i5, look into pairing an ASRock Z170-series board with the cheapest i5-6400 and overclocking it via BCLK to 4.4-4.6 GHz.
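As a back-of-the-envelope sketch (assuming the i5-6400's 27x maximum multiplier, i.e. 2.7 GHz at the stock 100 MHz base clock; multiplier-locked chips can only scale through BCLK):

```python
# BCLK needed to hit a target core clock on a multiplier-locked Skylake chip.
# Assumes the i5-6400's maximum multiplier of 27x (2.7 GHz at 100 MHz BCLK).
MULTIPLIER = 27

def bclk_for_target(target_mhz, multiplier=MULTIPLIER):
    """Return the base clock (MHz) required for the target core clock (MHz)."""
    return target_mhz / multiplier

for target in (4400, 4500, 4600):
    print(f"{target} MHz core -> BCLK ~ {bclk_for_target(target):.1f} MHz")
```

So the 4.4-4.6 GHz range implies running BCLK at roughly 163-170 MHz, which also raises everything tied to the base clock; that is why this trick is generally limited to boards (like the ASRock Z170 ones mentioned) with firmware support for it.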

The era when an i5 was sufficient for gaming is long past. In today's games, an i5 must be overclocked to extract most of the performance of a modern GPU. Skylake is a fairly modern CPU architecture, and most i5-6400/6500/6600/6600K owners can reasonably be expected to keep their CPU for 3-4 years; as GPUs get even more advanced and next-gen games get even more demanding, the CPU bottleneck may become even more severe.
 
Last edited:

PontiacGTX

Senior member
Oct 16, 2013
383
25
91
Since multithreading in games has become a trend (and DX11 has become CPU-bound with modern high-end GPUs), nowadays an i7/Xeon with HT is required for proper performance in some recent games.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
I run my 5930K at 3.7 GHz, and while a new $1K Kaby Lake part would be fun, it still seems rather pointless. Besides, why didn't they test at 3.5 GHz for comparison, as that is the standard speed of less gimped Intel CPUs? The i5-6400 is a reject CPU anyway.
 

kondziowy

Senior member
Feb 19, 2016
212
188
116
Yeah, stock i5-6400-level performance (~i5-4460) has been bottlenecking ultra details for a while now, a good year. But High is still OK for 60 fps in almost all games.

Soon it will drop to medium, and that will mean visual quality has to be sacrificed = trash CPU :p
But Zen is ready, and 6-core mainstream is incoming.
 
Last edited:

pcslookout

Lifer
Mar 18, 2007
11,926
146
106
Yeah, stock i5-6400-level performance (~i5-4460) has been bottlenecking ultra details for a while now, a good year. But High is still OK for 60 fps in almost all games.

Soon it will drop to medium, and that will mean visual quality has to be sacrificed = trash CPU :p
But Zen is ready, and 6-core mainstream is incoming.

If 6-core mainstream is incoming, then why is everyone telling me to get an Intel i7-7700K when it comes out? Also, Zen is 8 cores, right?
 

kondziowy

Senior member
Feb 19, 2016
212
188
116
6-core in 2018; the i7-7700K will give you ultra details until that happens :) Still, it looks like a clock-boosted Skylake, so I don't like this CPU.

In 2018, when the i5-6400 barely runs medium details, we peasants will get 6-cores :p Also, the i7-5820K will be a cheapo used grandpa chip up for grabs.
 

pcslookout

Lifer
Mar 18, 2007
11,926
146
106
6-core in 2018; the i7-7700K will give you ultra details until that happens :) Still, it looks like a clock-boosted Skylake, so I don't like this CPU.

In 2018, when the i5-6400 barely runs medium details, we peasants will get 6-cores :p Also, the i7-5820K will be a cheapo used grandpa chip up for grabs.

What about AMD Zen? Won't it give us not just 6 but 8 cores, early and for cheap? It could be a game changer.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
6-core in 2018; the i7-7700K will give you ultra details until that happens :) Still, it looks like a clock-boosted Skylake, so I don't like this CPU.

In 2018, when the i5-6400 barely runs medium details, we peasants will get 6-cores :p Also, the i7-5820K will be a cheapo used grandpa chip up for grabs.
The i5-6400's stock base clock is a pretty slow 2.7 GHz. If you were building a gaming computer, you probably wouldn't pick the 6400.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,226
9,990
126
I just ordered an i5-6400 last night for $180 based on RS's suggestion to get that one and BCLK OC it. (I do have an ASRock Z170 board, as well as two of their B150 "Hyper" boards.)

So, I hope it overclocks really well.

I tried boosting my G4400 higher than 4.45 GHz @ 1.300 V last night too, but wasn't entirely successful.

I could get it to boot Windows 10 and run the OCCT:CPU test for 10 minutes at 4.62 GHz @ 1.400 V, but then I would get "WATCHDOG" timeout errors. Not sure what those are from.

I don't think I have control over the uncore/ring multiplier, so I think the uncore is running at the same clocks, and that might be the problem.
 

SteveGrabowski

Diamond Member
Oct 20, 2014
6,805
5,759
136
How do the quad-core hyperthreaded i7/Xeons hold up in this game? I have a Xeon E3-1231 v3 that runs at 3.6 GHz on all cores, and I was always at 99% GPU usage with my GTX 970 at 1080p in the open beta with vsync off. Not that I care much about this game, since I don't play multiplayer FPS (I was bored to death by the open beta), but I wonder if hyperthreading is starting to outperform clock speed on quad-cores now, like it has on dual-cores (e.g. Haswell i3 vs. OC'd G3258).
 

nurturedhate

Golden Member
Aug 27, 2011
1,738
652
136
On a side note, a 1070 at 2.05/9.4 (core/memory) is slightly bottlenecked by a 4670K at 4.7 GHz in BF1. The death of the quad-cores is approaching.
 
Aug 11, 2008
10,451
642
126
Thanks OP. And for those curious how the older i5-2500K @ 4.5 GHz compares to the i5-6400: I'm sure many folks are still running a Sandy Bridge chip; I know I still have one in a secondary system.

https://www.youtube.com/watch?v=mLW_3aNfZ4M

Granted it is Sandy Bridge, but those Techno Kitchen results are much different from the results you posted. The video you linked shows very little difference in most games between the 6400 and the 4.5 GHz 2500K, but the TK results show cases of 50% or more in favor of the overclocked 6400, which would have to mean Skylake is 50% or more faster than SB at the same clock. Quite a bit more difference than I would have expected. It would be nice to see a comparison from a well-known test site instead of just YouTube videos.

Edit: I would expect a stock 6400 to bottleneck a more powerful card, but not so much a mid-range card like the 1060.
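A quick sketch of that inference (the normalization and the 50% figure are the thread's claims, not measured values):

```python
# If (a) the linked video shows a stock i5-6400 roughly matching a 2500K
# @ 4.5 GHz, and (b) the TK results show the 6400 @ 4.5 GHz running ~50%
# ahead of stock, the implied same-clock Skylake-vs-Sandy gap follows.
stock_6400 = 1.0                 # normalize stock i5-6400 performance to 1.0
sandy_at_45 = stock_6400         # premise (a): 2500K @ 4.5 GHz ~ stock 6400
oc_6400 = 1.5 * stock_6400       # premise (b): +50% from the BCLK overclock

implied_gap = oc_6400 / sandy_at_45   # both now at 4.5 GHz
print(f"Implied Skylake advantage over Sandy Bridge at 4.5 GHz: +{implied_gap - 1:.0%}")
```

A +50% same-clock gap is well above typical Skylake-over-Sandy IPC estimates, which is exactly why premise (a), premise (b), or both deserve scrutiny from a proper test site.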
 

Trumpstyle

Member
Jul 18, 2015
76
27
91
I got a Haswell i5 at 3.4 GHz, non-overclocked; it works just fine for a stable 60 fps. But I play on high settings, not ultra. I might buy the 3.6 GHz Kaby Lake i7 when it comes out, as I don't want the 4.2 GHz one because of bad energy efficiency. Or maybe I'll wait for Zen, but the rumors I read point towards Haswell IPC with a 3 GHz max clock.
 

vissarix

Senior member
Jun 12, 2015
297
96
101
I was going to do an i5-6400 build and OC it to around 4.5 GHz, but I found a used i7-4790K build which cost me less and is faster... I will stick to this Haswell i7 till Coffee Lake 6-cores come out... Even the i7-6700K is not enough for BF1 if you want very high fps... For example, next year 240 Hz monitors come out, and the i7-6700K can't give you more than 180 fps even if you game on low settings.
 

coercitiv

Diamond Member
Jan 24, 2014
6,151
11,686
136
The era when an i5 was sufficient for gaming is long past.
On a side note, a 1070 at 2.05/9.4 (core/memory) is slightly bottlenecked by a 4670K at 4.7 GHz in BF1. The death of the quad-cores is approaching.
So, after ~3 years of service, the 4670K still manages to feed a modern graphics card in a seemingly CPU-demanding AAA title. Yet we tend to recommend against buying 4C/4T, even with clear signs that 2017 and, most importantly, 2018 will bring a uniform increase in thread count on both Intel and AMD CPUs, making the i7 of today the i5 of tomorrow: total cost of ownership will be all over the place.

On top of that, the i5-6400's stock clocks are so gimped that even an i3 offers more consistent performance on a GTX 1080 FTW, yet we somehow hear the quad-core swansong in budget builds based on a GTX 1060. Can't wait to see that i3-7350K overclocked to 4.7 GHz+ in gaming benchmarks; I can already see the hoops.

[attached image: i3 vs. i5 benchmark chart]


Yes Joe, your i3 is better than an i5, but only for the purpose of proving 4C/4T is no longer enough for gaming; otherwise the i3 sucks even more. Don't be confused, be an enthusiast.
 

Majcric

Golden Member
May 3, 2011
1,369
37
91
Granted it is Sandy Bridge, but those Techno Kitchen results are much different from the results you posted. The video you linked shows very little difference in most games between the 6400 and the 4.5 GHz 2500K, but the TK results show cases of 50% or more in favor of the overclocked 6400, which would have to mean Skylake is 50% or more faster than SB at the same clock. Quite a bit more difference than I would have expected. It would be nice to see a comparison from a well-known test site instead of just YouTube videos.

Edit: I would expect a stock 6400 to bottleneck a more powerful card, but not so much a mid-range card like the 1060.


I would hate to even try to guess the percentage gain going from Sandy Bridge to Skylake in gaming, but I will say the difference can be quite significant under the right circumstances. Take Watch Dogs, for example: my i5-2500K @ 4.5 GHz would dip all the way into the 30s, while my i7-6700K @ 4.5 GHz never goes below 60 (doesn't matter whether hyperthreading is disabled or not).
 

MajinCry

Platinum Member
Jul 28, 2015
2,495
571
136
[attached chart: gains over Sandy Bridge]


Outside of emulators (did we ever discover what gave Haswell that jump?), the difference isn't anything to write home about.

Me game plan is to somehow scrounge up enough dosh, nab a second-hand i5-2500K for £70 and a £100 motherboard from Kikatek, and everything's dandy. Biggest diff will be not having to deal with AMD's draw-call deficit.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
On the BF games, some settings reduce CPU load (terrain detail or something, I think) and the visual difference is hard to notice, so it makes sense if you are saving money.

Also, did the test explore DX12?
 

Dave2150

Senior member
Jan 20, 2015
639
178
116
[attached chart: gains over Sandy Bridge]


Outside of emulators (did we ever discover what gave Haswell that jump?), the difference isn't anything to write home about.

Me game plan is to somehow scrounge up enough dosh, nab a second-hand i5-2500K for £70 and a £100 motherboard from Kikatek, and everything's dandy. Biggest diff will be not having to deal with AMD's draw-call deficit.

Err, Skylake can take advantage of DDR4 and Sandy Bridge cannot, making your graph totally pointless. High-speed DDR4 nets a good performance advantage on top of Skylake's increased IPC.

Real world performance is what counts.
 
Last edited:

Dave2150

Senior member
Jan 20, 2015
639
178
116
So, after ~3 years of service, the 4670K still manages to feed a modern graphics card in a seemingly CPU-demanding AAA title. Yet we tend to recommend against buying 4C/4T, even with clear signs that 2017 and, most importantly, 2018 will bring a uniform increase in thread count on both Intel and AMD CPUs, making the i7 of today the i5 of tomorrow: total cost of ownership will be all over the place.

On top of that, the i5-6400's stock clocks are so gimped that even an i3 offers more consistent performance on a GTX 1080 FTW, yet we somehow hear the quad-core swansong in budget builds based on a GTX 1060. Can't wait to see that i3-7350K overclocked to 4.7 GHz+ in gaming benchmarks; I can already see the hoops.

[attached image: i3 vs. i5 benchmark chart]


Yes Joe, your i3 is better than an i5, but only for the purpose of proving 4C/4T is no longer enough for gaming; otherwise the i3 sucks even more. Don't be confused, be an enthusiast.

BF1 is one of the only games where a hex-core can keep up with the higher-frequency/IPC Skylake. 99% of other games perform better on a 6700K (which easily clocks to 4.6/4.7 GHz on air).

Kaby Lake will further increase this gap. I expect i7 quads will still be the best gaming CPUs until 2020 or so. By then, a large enough percentage of gamers should have Coffee Lake hex-cores that game developers are confident enough to spend money multithreading beyond 4 cores.
 

IllogicalGlory

Senior member
Mar 8, 2013
934
346
136
Err, Skylake can take advantage of DDR4 and Sandy Bridge cannot, making your graph totally pointless. High-speed DDR4 nets a good performance advantage on top of Skylake's increased IPC.

Real world performance is what counts.
Which tests have demonstrated this?