How Many CPU Cores Do You Need?

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
How Many CPU Cores Do You Need?

Which begs the question: how many CPU cores are right for me? Is a triple-core processor good enough for gaming, or should you splurge on a quad-core chip? Is a dual-core CPU good enough for the average user, or do more cores really make a difference? Which applications are optimized for multiple cores and which ones react only to specifications like frequency or cache size?

http://www.tomshardware.com/re...lti-core-cpu,2280.html

I thought the methodology trick for downconverting your rig into a lesser cored computer was pretty slick.

Definitely makes for an easy way to probe the performance differences with fewer cores while keeping all other hardware the same. It does suffer from a slight convolution of the data from the fact that the cache/core ratio changes when you disable cores. But pretty slick nonetheless.
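(For anyone wanting to repeat the trick at home: if memory serves, on Vista/Win7 the setting lives under msconfig -> Boot -> Advanced options -> "Number of processors"; the exact location may differ by Windows version, so take that as a rough pointer rather than gospel. After the reboot, a quick sanity check that the OS really only sees the reduced core count, e.g. in Python:

# quick sanity check after the reboot: how many logical processors does the OS expose?
# (should match the "Number of processors" value set in msconfig)
import multiprocessing
print(multiprocessing.cpu_count())
)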

Processor(s) Intel Core 2 Duo Q6600 (Kentsfield), 2.7 GHz, FSB-1200, 8 MB L2 Cache

http://www.tomshardware.com/re...i-core-cpu,2280-3.html

Can anyone tell me what the heck this is supposed to mean? 2.7GHz Q6600 with 400MHz FSB? What?

Would have been cool if they had done a comparison with modern 2009 CPUs: a Phenom II and an i7, plus a Q9650 for good measure.

edit:

Tom's expanded on their original article to correct a few testing-methodology gaffes as well as add some new "real world" user scenarios.

Part 2: How Many CPU Cores Do You Need?

Primarily, there was a concern that part one might have been flawed technically, as the Core 2 Quad Q6600 we used in our testing does not share all 8 MB of its L2 cache between its four CPU cores. Intel's Q6600 instead has two separate 4 MB cache repositories, each shared between one pair of CPU cores.

This means the quad- and triple-core results would have demonstrated the CPUs utilizing 8 MB of total cache, while the dual- and single-core results show that they were likely benefiting from 4 MB. Indeed, the benchmarks may have been reflecting the difference in L2 cache availability more than performance attributable to enabled processing cores.

A few readers were also interested in simulating a scenario where multiple applications are running at the same time, in order to gauge the benefit of additional CPU cores while multitasking. We therefore ran a new test to analyze this type of scenario, too.

http://www.tomshardware.com/re...-performance,2373.html
 

classy

Lifer
Oct 12, 1999
15,219
1
81
I have to admit, I work in IT and have been building computers, even servers, for over 10 years. And I have used msconfig hundreds, literally hundreds, of times and never saw that, lol. ;)
 

classy

Lifer
Oct 12, 1999
15,219
1
81
Ok, now I have finished reading the article :). Good article, really cool. But there is one caveat. He should not have used a Core 2 based quad. It's not a true quad; it's two dual-core processors with a bridge. The problem with that is the 8 MB cache is split 4 MB to each dual core, which together gives a total of 8 MB. Now I may be wrong, but I believe that's how it is. So by disabling cores, you start to disable the cache as well, if I am correct. So his results may be slightly flawed, because cache size is important, especially to Core 2 processors in certain programs. He should have used an AMD processor, which is a true quad core where the L3 cache is shared by all the cores. This would give a better or more true result. But even there the cache could come into play, because each core has 512 KB of L2 tied to it. So by disabling three cores on an AMD quad, for example, you would also take away 1.5 MB of L2 cache. But at least the L3 cache is not split up among the cores. An i7 would have been a better choice as well, for the same reasons. Good article though, nice job.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
I could use a couple thousand cores. With 3d rendering you can give it 10K cores and the software acts like, "Is that it ? just 10K cores ? bah"
 

apoppin

Lifer
Mar 9, 2000
34,890
1
0
alienbabeltech.com
His game benchmarks totally suck. The rest of it is based on the premise that cache size is not impacted by disabling cores.

All low details - what gamer gives a crap?
We ran these games at a low 1024x768 resolution with low details to minimize the impact of the graphics card so we could really see how CPU-core limited these game titles might be.

So what? We already know for a fact that some games simply *require* Quad core. Most don't.

 

jandlecack

Senior member
Apr 25, 2009
244
0
0
Originally posted by: apoppin
All low details - what gamer gives a crap?
We ran these games at a low 1024x768 resolution with low details to minimize the impact of the graphics card so we could really see how CPU-core limited these game titles might be.

So what? We already know for a fact that some games simply *require* Quad core. Most don't.
Wow, someone knows what he's talking about.
 

ardeegee

Junior Member
Mar 29, 2009
7
0
0
Originally posted by: Modelworks
I could use a couple thousand cores. With 3d rendering you can give it 10K cores and the software acts like, "Is that it ? just 10K cores ? bah"

Yeah, that's what I was thinking when I saw the subject line. I like to tinker with CG-- I won't have enough cores until I can run arbitrarily complex frames in arbitrarily high resolutions in realtime or better. So, a large render farm on my desk might start to head in that direction.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: dguy6789
IDC, I am pretty sure he just ran the Q6600 at 300FSB instead of 266.

Yeah, that makes sense. 9x300 then. I couldn't get my brain around the text "Q6600, 2.7GHz"...uh, what? Had they said "we overclocked it" I would have realized; of course, in retrospect I see it was rather self-evident. Thanks for helping me out there :thumbsup:

Originally posted by: classy
Ok, now I have finished reading the article :). Good article, really cool. But there is one caveat. He should not have used a Core 2 based quad. It's not a true quad; it's two dual-core processors with a bridge. The problem with that is the 8 MB cache is split 4 MB to each dual core, which together gives a total of 8 MB. Now I may be wrong, but I believe that's how it is. So by disabling cores, you start to disable the cache as well, if I am correct. So his results may be slightly flawed, because cache size is important, especially to Core 2 processors in certain programs. He should have used an AMD processor, which is a true quad core where the L3 cache is shared by all the cores. This would give a better or more true result. But even there the cache could come into play, because each core has 512 KB of L2 tied to it. So by disabling three cores on an AMD quad, for example, you would also take away 1.5 MB of L2 cache. But at least the L3 cache is not split up among the cores. An i7 would have been a better choice as well, for the same reasons. Good article though, nice job.

You are right, the fact that they used a 2x2-core chip is no doubt what caused some of the scaling havoc between the two- and three-core benches.

Yeah, if they had used a monolithic chip architecture like that of Phenom II or i7, they wouldn't have had any of the cache-loss weirdness going on while they ratcheted through the disabling of cores.

I'm not so much impressed with the review itself; the choice of CPU and the benches themselves are flawed enough to make the results inapplicable for many users. But I did like the spirit of the review and the technique for disabling cores.

I'm more hoping someone (AT, techreport, etc) picks up on the notion and decides to do a more rigorous and robust assessment of the impact of cores on performance for gaming and other real-world apps.
 
Dec 30, 2004
12,553
2
76
Originally posted by: Idontcare
How Many CPU Cores Do You Need?

Which begs the question: how many CPU cores are right for me? Is a triple-core processor good enough for gaming, or should you splurge on a quad-core chip? Is a dual-core CPU good enough for the average user, or do more cores really make a difference? Which applications are optimized for multiple cores and which ones react only to specifications like frequency or cache size?

http://www.tomshardware.com/re...lti-core-cpu,2280.html

I thought the methodology trick for downconverting your rig into a lesser cored computer was pretty slick.

Definitely makes for an easy way to probe the performance differences with fewer cores while keeping all other hardware the same. It does suffer from a slight convolution of the data from the fact that the cache/core ratio changes when you disable cores. But pretty slick nonetheless.

Processor(s) Intel Core 2 Duo Q6600 (Kentsfield), 2.7 GHz, FSB-1200, 8 MB L2 Cache

http://www.tomshardware.com/re...i-core-cpu,2280-3.html

Can anyone tell me what the heck this is supposed to mean? 2.7GHz Q6600 with 400MHz FSB? What?

Would have been cool if they had done a comparison with modern 2009 CPUs: a Phenom II and an i7, plus a Q9650 for good measure.

300 MHz FSB x multiplier of 9 --> 2.7 GHz. I'm assuming they're assuming we can all reach 2.7 GHz on stock cooling (a legitimate assumption).

edit: oops, late to the game.

I'm more hoping someone (AT, techreport, etc) picks up on the notion and decides to do a more rigorous and robust assessment of the impact of cores on performance for gaming and other real-world apps.
 

CTho9305

Elite Member
Jul 26, 2000
9,214
1
81
Originally posted by: Idontcare
Originally posted by: classy
Ok, now I have finished reading the article :). Good article, really cool. But there is one caveat. He should not have used a Core 2 based quad. It's not a true quad; it's two dual-core processors with a bridge. The problem with that is the 8 MB cache is split 4 MB to each dual core, which together gives a total of 8 MB. Now I may be wrong, but I believe that's how it is. So by disabling cores, you start to disable the cache as well, if I am correct. So his results may be slightly flawed, because cache size is important, especially to Core 2 processors in certain programs. He should have used an AMD processor, which is a true quad core where the L3 cache is shared by all the cores. This would give a better or more true result. But even there the cache could come into play, because each core has 512 KB of L2 tied to it. So by disabling three cores on an AMD quad, for example, you would also take away 1.5 MB of L2 cache. But at least the L3 cache is not split up among the cores. An i7 would have been a better choice as well, for the same reasons. Good article though, nice job.

You are right, the fact that they used a 2x2-core chip is no doubt what caused some of the scaling havoc between the two- and three-core benches.

Yeah, if they had used a monolithic chip architecture like that of Phenom II or i7, they wouldn't have had any of the cache-loss weirdness going on while they ratcheted through the disabling of cores.

I'm not so much impressed with the review itself; the choice of CPU and the benches themselves are flawed enough to make the results inapplicable for many users. But I did like the spirit of the review and the technique for disabling cores.

I'm more hoping someone (AT, techreport, etc) picks up on the notion and decides to do a more rigorous and robust assessment of the impact of cores on performance for gaming and other real-world apps.

It really bugs me when reviewers get weird results and don't bother questioning them, let alone explaining them. I'm also not a huge fan of the graph on the conclusion page. For some things, presenting the numbers that way makes sense, but for a lot of tasks you actually want the inverse (i.e. 2x as fast = a bar half as long). If stitching a panorama photo goes from 8 minutes to 4 minutes to 2.6 minutes to 2 minutes, the 4th core saved you less than a minute. If you look at the first graph on page 7, a 4th core saves you under 1 minute when compared to 3 cores. That chart would be really interesting with estimated prices added.
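To make that concrete, here's a quick back-of-the-envelope sketch (the 8/4/2.6/2-minute figures are just the illustrative numbers above, not the article's actual data):

# minutes to stitch the panorama with 1, 2, 3 and 4 cores (illustrative numbers from above)
times = [8.0, 4.0, 2.6, 2.0]

for cores in range(2, len(times) + 1):
    saved = times[cores - 2] - times[cores - 1]   # minutes saved by adding this core
    speedup = times[0] / times[cores - 1]         # total speedup vs. one core
    print("core %d saves %.1f min (total speedup %.1fx)" % (cores, saved, speedup))

Plotted as times (or minutes saved) instead of raw scores, the fourth core's 0.6-minute contribution is a lot harder to get excited about.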
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: CTho9305
It really bugs me when reviewers get weird results and don't bother questioning them, let alone explaining them. I'm also not a huge fan of the graph on the conclusion page. For some things, presenting the numbers that way makes sense, but for a lot of tasks you actually want the inverse (i.e. 2x as fast = a bar half as long). If stitching a panorama photo goes from 8 minutes to 4 minutes to 2.6 minutes to 2 minutes, the 4th core saved you less than a minute. If you look at the first graph on page 7, a 4th core saves you under 1 minute when compared to 3 cores. That chart would be really interesting with estimated prices added.

It is backward, and in more than one way. For one they have the abscissa and the ordinate reversed.

The controlled variable is the number of cores, ergo that should be the x-axis (abscissa).

Amdahl's law

Ordinarily when doing a scaling analysis you would plot cores (or processors) versus speedup.

The Tom's guys kinda tried to do that with that last graph, but they got their x and y axes inverted and they plotted speedup as a percentage instead of just the standard speedup numbers.

The premise of the article, though, is no different from the ageless debate in single-core days over "how much clockspeed do you need?"...plotting performance scaling as a function of CPU GHz always created the misleading impression of diminishing returns as the clockspeed scaled.

There is value to the data if it is normalized properly and presented in a way that the reader knows how to leverage when they attempt to answer that question for themselves in their situation.
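Something like this is what I have in mind, a minimal sketch of the conventional presentation (cores on the abscissa, speedup on the ordinate); the 0.90 parallel fraction is just an assumed number for illustration, not anything measured in the article:

# Amdahl's-law sketch: cores on the x-axis (the controlled variable), speedup on the y-axis
# the parallelizable fraction p = 0.90 is an assumption, not a measured value
import matplotlib.pyplot as plt

p = 0.90
cores = list(range(1, 9))
speedup = [1.0 / ((1.0 - p) + p / n) for n in cores]

plt.plot(cores, speedup, marker="o")
plt.xlabel("CPU cores")
plt.ylabel("speedup vs. 1 core")
plt.title("Amdahl's law, p = 0.90")
plt.show()

Present the data that way and the reader can eyeball the parallel fraction for their own workload instead of being handed a percentage bar chart.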
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Part 2: How Many CPU Cores Do You Need?

Primarily, there was a concern that part one might have been flawed technically, as the Core 2 Quad Q6600 we used in our testing does not share all 8 MB of its L2 cache between its four CPU cores. Intel's Q6600 instead has two separate 4 MB cache repositories, each shared between one pair of CPU cores.

This means the quad- and triple-core results would have demonstrated the CPUs utilizing 8 MB of total cache, while the dual- and single-core results show that they were likely benefiting from 4 MB. Indeed, the benchmarks may have been reflecting the difference in L2 cache availability more than performance attributable to enabled processing cores.

A few readers were also interested in simulating a scenario where multiple applications are running at the same time, in order to gauge the benefit of additional CPU cores while multitasking. We therefore ran a new test to analyze this type of scenario, too.

http://www.tomshardware.com/re...-performance,2373.html

Tom's expanded on their original article to correct a few testing-methodology gaffes as well as add some new "real world" user scenarios.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Very well written article.

This here supports something I've been saying for quite some time about how using a quad has advantages even for people who don't heavily multitask.
 

drizek

Golden Member
Jul 7, 2005
1,410
0
71
There is also a max-memory setting in msconfig that I used to get around some bug with Google Earth (it lags if you have more than 4 GB of RAM or something like that; it seems to be fixed now). I was surprised that I didn't see it used in a triple/dual-channel memory benchmark. I think the reviewer (Anand?) had 4 GB of RAM in the dual-channel system and 6 GB in the triple-channel one.