Nice article on the 6600K vs. 6700K vs. 6900K for gaming.

Page 2 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
I don't want to influence an interesting discussion too much, but I've already published an analysis of HT versus cache here:

http://techbuyersguru.com/best-gaming-cpus-pentium-vs-core-i3-vs-core-i5-vs-core-i7

In short, it depends on the game. The Unreal Engine 3 was particularly bad with HT, but it has since been superseded by Unreal Engine 4, and most modern game engines can use HT to positive effect when they are CPU-limited.

Also, moonbogg, I could have tested with my Titan XP, but it's useless for CPU testing: it constantly throttles due to temperature constraints that can't be effectively controlled without liquid cooling, which adds too much noise to the data. SLI has the same problem for different reasons - it isn't consistent or repeatable enough to serve as a basis for CPU testing.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136

I see what you are saying, but the card was clearly a major bottleneck. I understand that for the vast majority of gamers, any modern CPU will be fine with a single GPU, and this testing reaffirms that. They all perform similarly when the GPU is getting wrecked at 1440p. It would be interesting to see what would have happened if the test had been done at 1080p for the 120Hz+ crowd.
I think more tech sites should test what enthusiasts care about, not just what matters to mainstream gamers - enthusiast stuff like how to get the most out of your SLI setup and 144Hz monitor by choosing the right CPU and overclocking it. Showing the difference an OC'd CPU can make for 1080 SLI @ 1440p/144Hz would be a useful test for people who may not have their CPU overclocked.
I'm just saying what I personally care about and what other enthusiasts probably care about too: the high-performance stuff, the kind of performance that isn't available off the shelf. You have to overclock to get there.
What's best for a high-end SLI setup running 144Hz or more? Do we know the answer? I honestly don't, because no one has tested it from what I can tell. Is it an OC'd Skylake quad, or a Broadwell 6- or 8-core? Maybe a 10-core? What's best for SLI at high FPS?
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com

Well, I've already tested something like that too - 1070 SLI at 4K on Z170 and X99:

http://techbuyersguru.com/taking-4k-challenge-gtx-1070-sli-z170-and-x99

And take a closer look at the graphs in the 6600/6700/6900 article. Many of the individual games were way over 120fps, so a GPU limitation really isn't the issue. You want 800x600 charts? HardOCP had them. ;)

But a 1080 SLI vs. Titan XP shootout is in the works. Which settings do you honestly think would be best: 1080p, 1440p, or 4K? Keep in mind that this testing takes a lot of time, so "all of the above" isn't realistic. I assume you want a lower resolution because you're actually interested in platform performance, not the GPUs, but these setups are overkill for 1440p, let alone 1080p.

Playing BF4 last night, my Titan XP was at 180fps at 1440p/ultra. I actually needed to vsync it, as it was well beyond the G-Sync range. And of course it was running at 83C to hit 180fps, which means it was bouncing off the thermal limiter.
 

moonbogg

Lifer
Jan 8, 2011
10,731
3,440
136
For a 1080 SLI vs. Titan X shootout, the best approach really is to test the setups at a variety of resolutions. IMO there should be three resolutions tested: 1440p, 3440x1440, and 4K. It's a lot of work, but those are the three resolutions people will actually be using these setups for, and 3440x1440 is gaining popularity pretty quickly in the enthusiast space. If I had to choose just one to see which setup is better, it would have to be 4K, and the reader could estimate performance at the lower resolutions.
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com

All right, thanks for the input!

I'll check out of the conversation for now!
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
What card can you buy now that is faster?? It is "high end" right now, simply because there is nothing else as fast.
Uhhh, the Titan XP? You know, the one that is 50% bigger and better in every way?

What do you call something that has something 50% bigger, better, and faster above it, and something 50% slower and smaller below it? That would be the "middle", right?
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
If this is too much thread-jacking, give me a heads up.

Here we have what looks like evidence that i7 > i5 performance is due to the larger cache and NOT Hyper-Threading:

https://www.techpowerup.com/forums/...rks-core-i7-6700k-hyperthreading-test.219417/

Two caveats with that test, though:

- 15-second benchmarks may not be long enough to fully capture or dismiss the benefits of HT.
- Not including per-core/thread CPU usage alongside the performance numbers can hide a scenario where the overclocked Core i5 is already at 90%+ CPU utilization, leaving little headroom for next-gen AAA games and/or future GPU upgrades, when we will inevitably get cards 2X faster than the GTX 1080 in the $300 range. This comparison of a stock i7 6700 vs. an i5 6600K @ 4.6GHz highlights the point very well:

https://www.youtube.com/watch?v=f9cVxka2fns

I would probably pick an i7 6700 and overclock it via BCLK rather than purchase the i5 6600K:
http://overclocking.guide/intel-skylake-non-k-overclocking-bios-list/

A BCLK-overclocked i7 6700 is good for up to 4.9GHz on a solid non-K chip:
https://www.youtube.com/watch?v=B0sDNbVgn3g

The extra $60-70 spent on the i7 6700 buys peace of mind that this CPU platform will handle the next 2-3 GPU generations without major CPU bottlenecks all the way to 2020+. This is especially true since the next true new CPU architecture from Intel isn't due until 2018 (at the earliest) with Ice Lake. Considering that many PC gamers skip at least one Intel CPU architecture (e.g., SB i7 2600K -> skip Haswell -> i7 6700K, or Haswell i7 4770K/4790K -> skip Skylake -> 2018 Ice Lake), a Skylake i7 6700/K user can realistically skip 2018 Ice Lake and wait all the way until 2020-2021.

After seeing the i7 2600K OC last 5 years, I would rather recommend that a PC gamer cut costs on the motherboard than buy an i5, if the intention is to keep the gaming PC for 4-5 years. Many gamers think you need a $200-250 Z170 board, but frankly, if the budget is tight, it's better to buy an i7 6700K + Asus Z170-E than an i5 6600K + Gigabyte Z170X-Gaming 7. Another thing I see is gamers who pair an i5 6600K with a $70-100 CPU cooler - super high-end air coolers like the Noctua NH-D14/D15, or AIOs like the Corsair H80/H100/H105/H110. An i7 6700K with a $20-30 cooler will smash that i5 6600K system.

---

As for the BW-E generation, it's a dud of a die shrink. Not only does it cost more than Haswell-E (e.g., the i7 5820K is only $320 at MC vs. $400 for the i7 6800K, and the i7 6950X carries a $700 premium over the i7 5960X), but it also overclocks worse on average.

MicroCenter pricing:

i7 6700K = $310
Premium board after $30 off combo and rebates = Asrock Z170 Extreme6 = $100
Total: $410

For gamers who don't care about AVX instructions, it's possible to save another $30 and go with the i7 6700, making things even worse for the ageing X99 platform.

i7 6800K = $400
'Equivalent' X99 board after $30 off combo and rebates = Asrock X99 Extreme4 USB 3.1 = $150
Total: $550

But you'd need to add another $20-30 for a much beefier CPU cooler for the 6800K to reach 4.4GHz, compared to 4.6-4.8GHz on the 6700K, an overclock that can be achieved with a $20-25 cooler.

Overall, the i7 6700K path yields close to $150 in savings that can be put towards a 960 EVO/Pro PCIe SSD, or towards "CPU future-proofing" by simply setting that money aside for 2018 Ice Lake or a 2019 Ice Lake-X 6-8 core.
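The build math above can be sketched in a few lines. The two cooler prices are assumed midpoints (the post quotes $20-25 for the 6700K's cooler and "another $20-30" on top of that for the 6800K's), so the result lands near the post's "close to $150" estimate rather than exactly on it:

```python
# Platform-cost comparison using the MicroCenter prices quoted above.
# Cooler prices are assumed midpoints, not figures taken from the post.

z170_build = {
    "i7 6700K": 310,
    "ASRock Z170 Extreme6 (after combo/rebates)": 100,
    "basic cooler (enough for 4.6-4.8GHz)": 25,   # post: $20-25
}

x99_build = {
    "i7 6800K": 400,
    "ASRock X99 Extreme4 USB 3.1 (after combo/rebates)": 150,
    "beefier cooler (needed for 4.4GHz)": 50,     # assumed: $25 + extra $25
}

z170_total = sum(z170_build.values())   # 435
x99_total = sum(x99_build.values())     # 600
savings = x99_total - z170_total        # 165, i.e. "close to $150"

print(f"Z170: ${z170_total}, X99: ${x99_total}, savings: ${savings}")
```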

I'd love to read a follow-up 2017 article with Kaby Lake i7 7700K max overclocked against 6-10 core BW-E chips but with GTX1080Ti SLI.

A $650 card is midrange? :smiley::pizza:

Yes, it absolutely is. :) By definition, the middle chip in the stack is mid-range, and it always was for NV until they decided to double prices starting with Kepler in 2012.

Gx07 = budget
Gx06 = low-end
Gx04 = mid-range & upper-mid-range: historically the ~$199-249 market segment.
Gx02 = high-end
Gx00/110 = ultra high-end


Next time, I think you should overclock Skylake to the max. I know what you were trying to do by comparing them clock-for-clock, but in the real world, if someone can hit 4.4GHz on the 6800K, surely that same type of PC user/gamer will go for broke and try for 4.7-4.8GHz on the 6700K. Considering the i7 6700K can achieve 4.8GHz at 1.40V, and Intel charges $30 to cover a replacement over 3 years via its tuning plan, it's a valid scenario to test max overclocks on both and not limit the 6700K artificially.

http://click.intel.com/tuningplan/purchase-a-plan
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
I purchased this 5930K back at the end of 2014, and I doubt I'll be upgrading the system at all for a long while yet. Being lazy, I've just set MCE to 3.7GHz across all cores, and at 1200p it's more than enough for this 1070. I could build a smaller, quieter 7700K system, but in terms of performance there's no real point.
 
Aug 11, 2008
10,451
642
126
You can buy a Titan X. It was released a while ago - you didn't know? The 1080 Ti will be very similar, and the mid-range-ness of the 1080 will become very apparent.

A niche product. In any case, according to Webster's, "high-end" is defined as "higher in price and of better quality than most others." So by that definition, the 1080 qualifies right now.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
My ten year plan for my i5 3570K @ 4.0Ghz is still alive.

I have a similar plan for my 3770K @ 4.2GHz. I typically keep my board/CPU for 5-6 years before I do a new build; I'm at 4.5 years with this one so far. I'll probably upgrade the GPU one more time, and that'll be it for her.
 

cytg111

Lifer
Mar 17, 2008
25,987
15,441
136

I hear what you are saying. I have one objection, though - let's break down the numbers.

Percentages (taken at a random frame, so this conjecture might be off right from the start):

cpu1-4 (no HT): 73, 71, 74, 76
cpu1-8 (with HT): 55, 47, 62, 49, 57, 39, 53 (only seven readings listed)

Totals:

cpu1-4: 294%
cpu1-8: 362%

That's two-thirds of a core's worth of extra work being done for the same result, or about 25% more, for the same job.
I don't trust the readings on these hyperthreads; they may be stalling and competing for the same resources on a core, giving a misleading picture of utilization.
Notable, of course, is that there appear to be at least 8 workable threads. Also notable: albeit with 8 workable threads, the 6900 does not pull ahead. It takes me back to school - linear algebra, where we were tasked with finding a vector perpendicular to another vector in n-dimensional space. Just because it is threaded doesn't mean it is actually concurrent. Just thoughts.
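The arithmetic above can be checked in a few lines. The per-thread percentages are the ones quoted in the post (the 8-thread list keeps only the seven readings actually given):

```python
# Utilization arithmetic from the post: total "percent-cores" of work
# done by each CPU for the same workload, and the relative overhead.

quad_core = [73, 71, 74, 76]                 # 4 threads, no HT
eight_thread = [55, 47, 62, 49, 57, 39, 53]  # HT CPU, readings as given

quad_total = sum(quad_core)       # 294
ht_total = sum(eight_thread)      # 362

extra = ht_total - quad_total     # 68 -> roughly two-thirds of a core
overhead = extra / quad_total     # ~0.23, i.e. about 25% more work

print(f"{quad_total}% vs {ht_total}%: +{extra} points (~{overhead:.0%})")
```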
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136

Yep, I will do at least one GPU upgrade before the 10 years is up.