i5-6600 vs i7-6700: +40% in price but what about performance?

swapjim

Member
Nov 16, 2015
113
2
81
I'm comparing the specs of i7-6700 with i5-6600 and I see that the i7 has the following bonuses:


  • Hyper-Threading
  • +2MB of cache
  • +0.2GHz in clock speed
  • +0.1GHz in clock speed while in Turbo mode
  • +2 more threads, which I assume is due to Hyper-Threading

And these bonuses will cost me 42% more in price, based on the prices listed on Intel's website. At local stores, it's +46%.
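To put numbers on the premium (the list prices below are approximate Intel recommended customer prices at the time of this thread, so treat the exact figures as assumptions):

```python
# Price premium of the i7-6700 over the i5-6600.
# The list prices are approximate Intel "recommended customer prices"
# at the time of the thread -- treat the exact figures as assumptions.
i5_price = 213.0  # i5-6600, USD
i7_price = 303.0  # i7-6700, USD

premium = (i7_price - i5_price) / i5_price * 100
print(f"i7 premium: {premium:.0f}%")  # roughly 42%
```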

I have two questions.

1. Is there anything else about the two CPUs that I'm not seeing in those links?

2. What increase in performance can I expect, and in what situations?

I've read this benchmark test, and the i7 is 23-25% faster in multi-core tests. I'm assuming that's Hyper-Threading in action. In what real-world situations would this increase in speed become apparent?

This link says that the i7 performs better at memory-intensive tasks, which I assume is because of the extra 2MB of cache. Examples of such tasks are 7-Zip and WinRAR. I imagine that video encoding (like x264) will also benefit - am I right? What other real-world examples are there of memory-intensive tasks?

And finally, this video shows, on average, a 30% increase for games.

Is that it? +30% in games, +25% in multi-threaded applications, and +39% in memory-intensive applications?
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
i7 has 8 threads vs 4 on the i5.

In some cases, the extra threads are not very helpful, and in others, the difference is very large. Take this case:

[benchmark chart: 65063.png]


Or this one:

[benchmark chart: 65067.png]



A simplified explanation of hyperthreading (as I understand it) is that it allows unutilized parts of the core to do something else, rather than sit idle, so the benefits are largest when you're trying to do two different types of things at once. In these cases, the benefit can be a doubling in performance. Whether or not an 8 thread CPU will be worth the extra money for your use cases will depend entirely on your use cases.
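As a rough mental model of that uplift (the ~25% figure comes from the multi-core benchmark linked earlier in the thread; the "effective cores" framing is purely illustrative):

```python
# Back-of-the-envelope model of the Hyper-Threading benefit.
# The ~25% multi-core uplift is the figure reported in the thread's
# linked benchmark; "effective cores" is just an illustrative framing.
physical_cores = 4
ht_gain = 0.25  # ~25% faster with 8 threads vs 4

effective_cores = physical_cores * (1 + ht_gain)
per_sibling_yield = (effective_cores - physical_cores) / physical_cores

print(f"effective cores: {effective_cores}")                       # 5.0
print(f"each HT sibling adds ~{per_sibling_yield:.0%} of a core")  # ~25%
```

In other words, in a well-threaded workload each HT sibling behaves like roughly a quarter of an extra core; in a poorly-threaded one it adds nothing.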

Also, consider that when you look at the cost of the entire computer, you're getting that extra performance for only 5-15% extra total cost.
 
Last edited:

mysticjbyrd

Golden Member
Oct 6, 2015
1,363
3
0
So, what are you going to do with this PC?

If only playing games, then it helps to know what type of games do you play? What video card do you have, and do you plan to upgrade it later?

For instance, the games in that video were all open-world first-person shooter games. Those types of games are very demanding of the CPU, which CAN be a limiting factor IF you have an insane video card like the 980 Ti.
 
Aug 11, 2008
10,451
642
126
Yes, I think the OP is looking at the question in the wrong way. Instead of trying to envision every possible use case and evaluate hyperthreading, he should decide what he is going to use the cpu for and evaluate in the context of those apps.

Edit: I also tend to evaluate the price of a component in the context of the whole system. For instance, one cpu may cost 40% more than another cpu, but in the context of the cost of the whole system, it is probably only 5 to 10%.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Yes, I think the OP is looking at the question in the wrong way. Instead of trying to envision every possible use case and evaluate hyperthreading, he should decide what he is going to use the cpu for and evaluate in the context of those apps.

Edit: I also tend to evaluate the price of a component in the context of the whole system. For instance, one cpu may cost 40% more than another cpu, but in the context of the cost of the whole system, it is probably only 5 to 10%.

Exactly. :thumbsup:
 

swapjim

Member
Nov 16, 2015
113
2
81
For instance, the games in that video were all open-world first-person shooter games. Those types of games are very demanding of the CPU, which CAN be a limiting factor IF you have an insane video card like the 980 Ti.

Nope. Nothing like the 980 Ti. I'll be settling for a $200-230 VGA.

What I was about to say. $100 in the context of a $2500 system is just a 4% increase in price for a 20-40% performance gain.

By a rough estimation, the system without the display will cost me around $1200. And that includes a case, a PSU, and an HDD.

Yes, I think the OP is looking at the question in the wrong way. Instead of trying to envision every possible use case and evaluate hyperthreading, he should decide what he is going to use the cpu for and evaluate in the context of those apps.

Lightroom will be the application I'll be working in A LOT. In all likelihood I'll be exploring other RAW workflow programs to replace it, because Lightroom is so heavy. The information I have read on the web is not definitive about which CPU will make Lightroom the happiest. I'm still not sure it'll benefit from Hyper-Threading. "Not definitive" means that I have seen people in forums say that Lightroom will benefit from HT, but I'm not sure I should trust those posts.

What I can trust, though, is that LR will not get much faster beyond 4 cores when working with photos (core count helps in exporting, for example), and it benefits more from the speed of cores, rather than the number. Overclocked CPUs run LR faster.

Lightroom is a nasty app. It has an amazingly organized and intuitive UI (which is its great appeal) but a heavy engine that underutilizes even high-end CPUs. Adobe has given photographers one hell of a dilemma.

Photoshop is one more program I'll be using. I haven't researched how the latest Photoshop version behaves, but throughout the years, Photoshop has been a lot more reasonable to work with. Just get a recent system and you're okay.

Sony Vegas is what I'll be using for video editing, and I'll mostly be working with still images (which is less intensive than moving picture) to compose moving picture.

I'll be running a few VMs in VirtualBox (but might explore options from VMware) from modern.ie to test websites in various versions of Internet Explorer. I'll have at least one IE VM open, and odds are I'll have a second one too. Plus the VMs I'll set up for my own amusement.
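For what it's worth, the number of virtual CPUs each VM gets is set per-VM in VirtualBox, which is where the extra i7 threads would come in. A hypothetical setup for a modern.ie image might look like this (the VM name is a placeholder):

```shell
# Give an IE test VM 2 virtual CPUs and 2GB of RAM
# ("IE11 - Win7" is a placeholder name; use your imported VM's name).
VBoxManage modifyvm "IE11 - Win7" --cpus 2 --memory 2048

# Verify what was set.
VBoxManage showvminfo "IE11 - Win7" | grep -E "Number of CPUs|Memory size"
```

With two such VMs plus the host, an 8-thread CPU leaves more headroom than a 4-thread one.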

I think that just about covers it for software.

Now about games... There are a few games I'd like to play. The new Unreal Tournament (which is being made as a crowdsourced project and is in beta right now). The new Doom, coming this spring (I know it's better to wait until the game is out, but my current computer is very limiting and I need something NOW). Counter-Strike: Global Offensive is one of the games I'll play. The latest Grid racing game. I'll probably explore other racing games from the last 5 years or so. StarCraft 2 (this one is lightweight). The latest Serious Sam games. I still haven't played the Crysis series of games; I'd love to explore them. The Call of Duty series doesn't appeal to me a lot. I haven't decided if I want to play the latest Grand Theft Auto games.

My list of games is incomplete. Apparently I have a thing for FPS games. I think I'm an opportunist in the way I play games. I haven't checked what kind of GPUs are out there. I was planning to set a limit of $200-230 for one and see what best I can get, and then select the games that'll work okay on it.

I saw my CPU decision as something separate from games because I read that the i5 is very, very close to the i7 in terms of gaming, and the extra $100 is better invested in the GPU. So I focused on the programs I intended to use: Lightroom, other RAW workflow software, Photoshop, Sony Vegas, and VirtualBox.

That's the method of thinking I'm using to decide between the i5 and the i7, and it hasn't provided a definite answer yet. Comments are very welcome on everything I wrote in this post, and my original one.
 
Last edited:
Aug 11, 2008
10,451
642
126
There are a couple of really nice articles from Puget Systems about the CPU and GPU requirements: Lightroom hardware.

With a $1200 budget, I would go for something like a 4790K/6700K and a GTX 960/970. For heavy productivity use like this, I think an i7 is definitely worth the cost.

You might even consider the 5820K, but you will have to get a 4GHz+ overclock (usually doable) to get the needed single-thread performance. If it is for professional work, you might want to go with a quad-core i7 and not risk overclocking.
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
I was planning to set a limit of $200-230 for one and see what best I can get, and then select the games that'll work okay on it.

I saw my CPU decision as something separate from games because I read that the i5 is very, very close to the i7 in terms of gaming, and the extra $100 is better invested in the GPU.

This is true. Some of the games you listed are very GPU demanding. Try to price out a Core i7 4790K rig as it's possible to put one together for the price of an i5-6600K. That gives you much better performance than the 6600K in multi-threaded apps. If budget is very tight, there is no reason not to look at heavily discounted DDR3, i7 4790K over the Skylake i5s.

As far as the GPU selection, try to squeeze $10-20 to get into the R9 290/390/970 territory. Right now all of the GPUs in the $180-230 range present bad value compared to these 3 cards I listed. If you set a strict $200-230 budget on a GPU and end up with a 960 4GB/R9 380X, you are standing to lose a huge chunk of performance by not spending just a small fraction above that. Be on the look-out for hot deals and various discounts on Newegg such as using $25 off $200 with AMEX, etc.

My vote is for the 6700K.

I would say get a 6700k with 390/970. And of course 16-32GB RAM.

For a productivity and overall gaming setup over 4-5 years, an i7 5820K @ 4.4-4.5GHz is way better than the i7 6700K. Why do you guys keep recommending the i7 6700K to him? If multi-threaded performance is key, a 5820K OC will crush the 6700K. Also, the X99 platform has support for 8/10-core Broadwell-E and 18-core Xeons. Z170 is a dead end as far as big performance gains go.

While 6700K OC is barely faster than the i7 4790K despite a big price increase, the i7 5820K OC actually smashes the 6700K in multi-threaded apps while delivering most of the gaming performance. With current pricing, the i7 6700K sits in no-man's land imo.
 
Last edited:

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
For a productivity and overall gaming setup over 4-5 years, an i7 5820K @ 4.4-4.5GHz is way better than the i7 6700K. Why do you guys keep recommending the i7 6700K to him? If multi-threaded performance is key, a 5820K OC will crush the 6700K. Also, the X99 platform has support for 8/10-core Broadwell-E and 18-core Xeons. Z170 is a dead end as far as big performance gains go.

While 6700K OC is barely faster than the i7 4790K despite a big price increase, the i7 5820K OC actually smashes the 6700K in multi-threaded apps while delivering most of the gaming performance. With current pricing, the i7 6700K sits in no-man's land imo.

This:

What I can trust, though, is that LR will not get much faster beyond 4 cores when working with photos (core count helps in exporting, for example), and it benefits more from the speed of cores, rather than the number. Overclocked CPUs run LR faster.
 

dark zero

Platinum Member
Jun 2, 2015
2,655
138
106
If you are using VMs, only the i7 or the FX 8-core series are your options... Even more, the FX does better than the i7 in that aspect...

However in the rest of tests, the i7 crushes the FX.
 
Aug 11, 2008
10,451
642
126

But is highly multithreaded performance what the op is looking for? From the articles I looked at, performance in Lightroom seems to plateau at about 4 cores. You are also assuming the OP is willing/able to get a good overclock on the 5820k.

Edit: Yuriman, sorry, thought because you quoted Russian you were agreeing with him in recommending HW-E. I think I misread your post.
 
Last edited:

tential

Diamond Member
May 13, 2008
7,355
642
121
I can never recommend an i5 with how long Intel processors last. I don't think any person that does is a nice person. Especially with the deals you can get on i7s at Microcenter or Fry's. The deals are so good I'll upgrade to the HEDT platform just because it's that good. I dunno when, though. My 4770K works fine for everything I do, and I really can't imagine a time within the next 3 years when it will need an upgrade; and if it's lasting that long, then of course I should get an i7.

In fact, my build will see 3 different gpus before any other major part is changed....
 

coercitiv

Diamond Member
Jan 24, 2014
6,205
11,916
136
I can never recommend an i5 with how long Intel processors last. I don't think any person that does is a nice person.

  • Number of threads Photoshop uses when saving a 1GB+ file: 1.
  • Most important component when loading one large file or many files at once: SSD speed.
  • Biggest Photoshop limitation when working with large/many files: RAM.
  • Critical component when doing photo editing: Display.
  • Necessary component when working on multimedia projects: more displays.
Achievement unlocked - Successful professional: i7 CPU and latest iPhone. (both latest model cuz great value).

Yes, I'm not a nice person.
 

dazelord

Member
Apr 21, 2012
46
2
71
Lightroom will be the application I'll be working in A LOT. In all likelihood I'll be exploring other RAW workflow programs to replace it, because Lightroom is so heavy. The information I have read on the web is not definitive about which CPU will make Lightroom the happiest. I'm still not sure it'll benefit from Hyper-Threading. "Not definitive" means that I have seen people in forums say that Lightroom will benefit from HT, but I'm not sure I should trust those posts.

What I can trust, though, is that LR will not get much faster beyond 4 cores when working with photos (core count helps in exporting, for example), and it benefits more from the speed of cores, rather than the number. Overclocked CPUs run LR faster.

Lightroom is a nasty app. It has an amazingly organized and intuitive UI (which is its great appeal) but a heavy engine that underutilizes even high-end CPUs. Adobe has given photographers one hell of a dilemma.

I use LR a lot (on a 4770K @ 4.4) and cannot believe how slow it is. It's the only software that makes my computer feel slow. On average, the pixel count in DSLRs has only risen 2-3x during the last ten years. Still, Adobe is incapable of creating software that runs well on modern i7 CPUs.

According to this test, LR is insensitive to Hyper-Threading and seems to scale reasonably well up to six physical cores. I've seen evidence that HT sometimes hurts performance.

http://www.sweclockers.com/test/20862-intel-core-i7-6700k-och-i5-6600k-skylake/9#content
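That plateau is consistent with Amdahl's law. A quick sketch (the 0.85 parallel fraction is an illustrative assumption, not a measured Lightroom figure):

```python
# Amdahl's-law sketch of why Lightroom gains taper off around 4-6 cores,
# as described above. The 0.85 parallel fraction is an illustrative
# assumption, not a measured Lightroom figure.
def speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup with `cores` workers."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

p = 0.85
for n in (1, 2, 4, 6, 8, 12):
    print(f"{n:2d} cores -> {speedup(p, n):.2f}x")
# Diminishing returns: going from 4 to 6 cores gains far less than 1 to 2 did.
```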
 
Last edited:

swapjim

Member
Nov 16, 2015
113
2
81
There are a couple of really nice articles from Puget Systems about the CPU and GPU requirements: Lightroom hardware.

That link confirms what I have seen in other articles about Lightroom performance.

With a $1200 budget, I would go for something like a 4790K/6700K and a GTX 960/970. For heavy productivity use like this, I think an i7 is definitely worth the cost.

If it's an i7, and it'll probably be, why go for the 4790? The 6700 is a newer processor (benchmarks say it's more powerful) and uses a newer socket (LGA 1151). It's more future-"proof" -- meaning it'll be easier to find a cheap, beefed-up processor in a couple of years if I decide I want an upgrade. Unless the 4790 has something the 6700 doesn't.

As far as the GPU selection, try to squeeze $10-20 to get into the R9 290/390/970 territory. Right now all of the GPUs in the $180-230 range present bad value compared to these 3 cards I listed. If you set a strict $200-230 budget on a GPU and end up with a 960 4GB/R9 380X, you are standing to lose a huge chunk of performance by not spending just a small fraction above that. Be on the look-out for hot deals and various discounts on Newegg such as using $25 off $200 with AMEX, etc.

I understand what you're saying, and I can price up to get something with such a huge difference. Both the i5 and the i7 have integrated GPUs, so I can grab one and start working, and postpone selecting a VGA for a while (a few weeks or a couple of months).

try to squeeze $10-20 to get into the R9 290/390/970 territory.

I have really not looked at VGAs at all, so I'm not sure which models you're mentioning there. Can you please confirm that these are the ones?:

AMD Radeon R9 290 4GB RAM
AMD Radeon R9 390 8GB RAM
Nvidia GeForce GTX 970 4GB

All of these cards fall into the 350-380 EUR price range (I live inside the Euro zone) at local stores.

I mentioned USD in my earlier posts because a CPU that costs 275 USD in the US costs about 275 EUR in the Euro zone, and this trend holds for most PC components.

With VGAs: an Nvidia GTX 960 with 2GB RAM starts at 200 EUR; 230 EUR if you want 4GB. An Nvidia GTX 970 4GB starts at 350 EUR. But then I need to buy from a store that has good RMA, and those carry Asus, MSI, and Gigabyte GTX 970s, which start from 420 EUR. Damn!

An EVGA Nvidia GTX 970 4GB costs around 300 USD on Newegg, but Newegg doesn't ship to where I live. Amazon does, though. The cheapest GTX 970 I can see right now is around 360 USD, plus shipping costs. And if it breaks, I have to ship it abroad. Damn. Maybe it's best if I postpone buying a VGA for a while.

For a productivity and overall gaming setup over 4-5 years, an i7 5820K @ 4.4-4.5GHz is way better than the i7 6700K. Why do you guys keep recommending the i7 6700K to him? If multi-threaded performance is key, a 5820K OC will crush the 6700K. Also, the X99 platform has support for 8/10-core Broadwell-E and 18-core Xeons. Z170 is a dead end as far as big performance gains go.

While 6700K OC is barely faster than the i7 4790K despite a big price increase, the i7 5820K OC actually smashes the 6700K in multi-threaded apps while delivering most of the gaming performance. With current pricing, the i7 6700K sits in no-man's land imo.

I never, ever, ever, ever overclock my machines. I don't like the idea of stressing the machine and upping the heat.

The extra heat is also the reason I'm hesitant to go with a 'K' processor. The i7-6700K has a 95W TDP; the i7-6700 has a 65W TDP.

I still remember the day I unplugged my AMD Athlon XP from my motherboard and saw that the CPU socket had a slight rotation because the temperature had made the plastic soft! That day I swore I would never again have a hot CPU in my system. It makes me feel (it's not a rational decision) unsafe and uneasy. The stuff I do on my computer is important, and anything that might even slightly undermine it is something I'd really like to avoid. OC has come a long way since the late 90s. It's gone so mainstream that today -- hell, even 10 years ago! -- it's pretty trivial and safe to OC your CPU. I'll definitely need to look into OC at some point in the future.

I use LR a lot (on a 4770K @ 4.4) and cannot believe how slow it is. It's the only software that makes my computer feel slow. On average, the pixel count in DSLRs has only risen 2-3x during the last ten years. Still, Adobe is incapable of creating software that runs well on modern i7 CPUs.

According to this test, LR is insensitive to Hyper-Threading and seems to scale reasonably well up to six physical cores. I've seen evidence that HT sometimes hurts performance.

http://www.sweclockers.com/test/20862-intel-core-i7-6700k-och-i5-6600k-skylake/9#content

Is it also slow in the Develop module? When switching from one photo to the next? When opening a new subfolder? When selecting all photos that have a keyword? Do you use the CC version? (I'm on an old 3.x version.)

I think Adobe has gotten lazy with LR. We're going to need some competition if we're to see a faster LR.

I'd be interested to see what other RAW workflow programs are out there that have at least some of the Lightroom facilities I like: nested keywords, not touching the RAW files and writing everything to XMP files, adjustment snapshots, renaming based on date when importing (there are probably more).

This topic deserves its own thread. If you're inclined, open one and send me a PM to point me to it. Or if you have tried any other RAW workflow program, send me a PM mentioning it. I'll definitely come back to this.
 
Aug 11, 2008
10,451
642
126
4790K vs 6700K is simply a matter of price. There are some good deals floating around on the 4790K right now, while the 6700K tends to be overpriced. For instance, at the Microcenter near me, the 4790K is $250 and the 6700K is $360. Yes, the 6700K is faster, but some very unusual gaming data from FO4 aside, only by about 10% or less. To me that is not worth it. At those prices, I would go 4790K. The 6700K is a newer platform and maybe has an upgrade path, but I am not expecting anything on that socket to be a big jump in CPU performance.
 

Bearmann

Member
Sep 14, 2008
167
2
81
I use LR a lot (on a 4770K @ 4.4) and cannot believe how slow it is. It's the only software that makes my computer feel slow. On average, the pixel count in DSLRs has only risen 2-3x during the last ten years. Still, Adobe is incapable of creating software that runs well on modern i7 CPUs.

According to this test, LR is insensitive to Hyper-Threading and seems to scale reasonably well up to six physical cores. I've seen evidence that HT sometimes hurts performance.

http://www.sweclockers.com/test/20862-intel-core-i7-6700k-och-i5-6600k-skylake/9#content

I looked at the test, but I'm not sure why you conclude that hyperthreading is no benefit in Lightroom. Could you please elaborate. Thanks.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
You have the video cards right. Given those prices, a GTX960 seems a reasonable choice. Also, given your preference for cooler, lower power components, I would recommend avoiding AMD's video cards. They offer a lot of performance per $, but draw significantly more power.

If a 6700K is no more than 10-15% more expensive, I'd go for it. If it's significantly more than a 4790K, I'd chalk it up to retailers overpricing it so they can get rid of old stock, wait for it to come down a little, and then still buy the 6700K.

I think you'll find that, even though it's a higher-TDP part, the 6700K is still a relatively efficient chip. Anandtech's review of the 6700K shows that despite a higher TDP, the 6700K actually draws less power than the 4790K, and probably stands to draw significantly less if you undervolt. Their sample motherboard seemed to apply far more voltage than was necessary. You also have the option of reducing clocks by 100-200MHz and lowering voltage even more. I'd love to own a Skylake system, so I could plot some voltage vs frequency vs power charts.
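The intuition behind undervolting can be sketched with the usual dynamic-power approximation, P ~ C * V^2 * f (the voltage and frequency points below are illustrative assumptions, not measured Skylake values):

```python
# Rough dynamic-power model, P ~ C * V^2 * f, for the undervolting
# discussion above. The voltage and frequency points are illustrative
# assumptions, not measured Skylake values.
def relative_power(v: float, f: float, v0: float, f0: float) -> float:
    """Dynamic power relative to a (v0, f0) baseline; capacitance cancels out."""
    return (v / v0) ** 2 * (f / f0)

# Dropping 200 MHz and 0.10 V from a hypothetical 4.0 GHz / 1.25 V point:
print(f"{relative_power(1.15, 3.8, 1.25, 4.0):.0%} of baseline power")  # ~80%
```

Because power scales with the square of voltage but only linearly with frequency, a small voltage drop buys a disproportionately large power (and heat) reduction.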
 

swapjim

Member
Nov 16, 2015
113
2
81
4790k vs 6700k is simply a matter of price. [...] For instance at the microcenter near me, 4790k is 250 dollars and 6700k is 360.00.

The market is very different here. The 4790 costs 341 EUR. The 6700 costs 350 EUR. I have no reason to go with the 4790 (the 'K' versions of the CPUs have a similarly small difference in price).

I think you'll find that, even though it's a higher-TDP part, the 6700K is still a relatively efficient chip. Anandtech's review of the 6700K shows that despite a higher TDP, the 6700K actually draws less power than the 4790K, and probably stands to draw significantly less if you undervolt. Their sample motherboard seemed to apply far more voltage than was necessary. You also have the option of reducing clocks by 100-200MHz and lowering voltage even more. I'd love to own a Skylake system, so I could plot some voltage vs frequency vs power charts.

My dilemma is between the 6700 and the 6700K. That's what I'm researching right now. But you said something that sounds like a game changer. Can I get the 6700K, undervolt/underclock it, and get better performance than the 6700 at the same temperatures?
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
This is correct. I run my Ivy Bridge CPU slightly underclocked and heavily undervolted for these same reasons. I don't much need that last 10% performance at the expense of 50% extra power used.
 

swapjim

Member
Nov 16, 2015
113
2
81
Wait, I need to confirm this: an underclocked 6700K will perform faster and at lower temperatures than the 6700?