CPU Core Count Mania. What do you need more cores for?


What is your most important use case[s] for more CPU cores?

  • Gaming

    Votes: 32 25.0%
  • Video Encoding

    Votes: 38 29.7%
  • 3D rendering

    Votes: 10 7.8%
  • Virtualization (VMware and similar)

    Votes: 31 24.2%
  • HPC and Scientific computing

    Votes: 18 14.1%
  • Other (detail below)

    Votes: 18 14.1%
  • Software Compilation

    Votes: 16 12.5%
  • e-peen

    Votes: 13 10.2%
  • I don't need more cores

    Votes: 17 13.3%

  • Total voters
    128

eek2121

Platinum Member
Aug 2, 2005
2,930
4,025
136
You would need to use a different ISA instead of x86, and software that can use that many cores. Something like that will be far more suitable for data centers, supercomputers, and the like.
Google Xeon Phi.
 

DaveSimmons

Elite Member
Aug 12, 2001
40,730
670
126
6 fast Intel cores are plenty for me for gaming and Visual Studio. I have no interest in upgrading my i7-8700 to the octos when they come out.
 

Eug

Lifer
Mar 11, 2000
23,586
1,000
126
I need six cores just to multitask with Netflix (1080p not 4K), office apps, and web browsing. And yes I’m being serious.

Why? Cuz the three desktop CPU cores I had before were not enough. It was OK most of the time, but lagged occasionally when multitasking.

The caveat though is the CPUs I'm talking about are 8 years old. I upgraded from a 2.9 GHz Athlon II X3 435 to a 2.8 GHz Phenom II 1055T. Made a noticeable difference. Saved me from having to buy a new machine just to run a few business apps. A 4.5 GHz triple-core AM3 chip would have worked too but they don't exist so hex-core it is.

Oh and I have multi-core in my phone too!
 

NTMBK

Lifer
Nov 14, 2011
10,232
5,012
136
I need six cores just to multitask with Netflix (1080p not 4K), office apps, and web browsing. And yes I’m being serious.

Why? Cuz the three desktop CPU cores I had before were not enough. It was OK most of the time, but lagged occasionally when multitasking.

The caveat though is the CPUs I'm talking about are 8 years old. I upgraded from a 2.9 GHz Athlon II X3 435 to a 2.8 GHz Phenom II 1055T. Made a noticeable difference. Saved me from having to buy a new machine just to run a few business apps. A 4.5 GHz triple-core AM3 chip would have worked too but they don't exist so hex-core it is.

Oh and I have multi-core in my phone too!

Are you sure it's the cores that helped, and not the L3 cache?
 

Eug

Lifer
Mar 11, 2000
23,586
1,000
126
Are you sure it's the cores that helped, and not the L3 cache?
Probably both. It’s really just a matter of performance, and the 3-core 2.9 GHz Athlon is just slow by current standards, noticeably slower than my Core m3 Kaby Lake Y, and without turbo.
 

Asterox

Golden Member
May 15, 2012
1,026
1,775
136
I need six cores just to multitask with Netflix (1080p not 4K), office apps, and web browsing. And yes I’m being serious.

Why? Cuz the three desktop CPU cores I had before were not enough. It was OK most of the time, but lagged occasionally when multitasking.

The caveat though is the CPUs I'm talking about are 8 years old. I upgraded from a 2.9 GHz Athlon II X3 435 to a 2.8 GHz Phenom II 1055T. Made a noticeable difference. Saved me from having to buy a new machine just to run a few business apps. A 4.5 GHz triple-core AM3 chip would have worked too but they don't exist so hex-core it is.

Oh and I have multi-core in my phone too!

I have the same CPU, a Phenom II 1055T running at 3.3 GHz. But in reality the 4C/4T R3 2200G is a faster CPU than the Phenom II X6.

The old Phenom II X6 1055T was priced at $200 when new; for $200 today you get a Ryzen R7 1700's performance.

https://youtu.be/feABiH0ggb8?t=1h18m1s

https://www.newegg.com/Product/Product.aspx?Item=N82E16819113428
 

eek2121

Platinum Member
Aug 2, 2005
2,930
4,025
136
I should add that going from my quad-core 2600K to a Threadripper 1950X was rather amazing. :p
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
I am noticing that the least popular category (3D rendering) is the one that houses the most popular benchmark to quote (Cinebench).
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,541
14,495
136
I am noticing that the least popular category (3D rendering) is the one that houses the most popular benchmark to quote (Cinebench).
If there were other well-known free benchmarks, maybe there would be more of those used.
 

TheELF

Diamond Member
Dec 22, 2012
3,973
730
126
There are boatloads of very well-known free benchmarks, but they are all paid off and manipulated by Intel, just like Cinebench was before AMD built their arch around it (not only, but also).
 

mattiasnyc

Senior member
Mar 30, 2017
356
337
136
I make use of more cores in audio post production (e.g. TV, film, etc.).

1. It allows for more tracks playing simultaneously in realtime with more plugins/processing. For an average sports broadcast, for example, we're probably looking at 40 to 50 tracks with audio on them, and another 20 that are groups/auxes, for a total of around 80 to 100 channels (more for surround sound). Most have plugins on them, some quite heavy. All of that while playing video in sync in at least HD (1920x1080) and receiving control data from a control surface.

2. It allows for faster encoding of video that is sent out for approval. Just 5 years ago it was a pain in the neck for shorter jobs like commercials, where the ad agency clients suddenly spring on you that they want to send out the 60 second version, the 30 second version and the 15 second version to clients with the recorded announcer and the mix of it all... plus the two alternate versions for each length... etc. Today with faster computers (i.e. more cores for encoding) it's much faster to do that on short notice (the clients typically sit right behind you). A rough sketch of that kind of parallel batch encode follows after this post.

3. More cores arguably also allow for more multi-tasking, or for more stable multi-tasking. With older systems I never dared kick off a render of something and then check my email at the same time, or download something or whatever, but with newer, more powerful systems I feel more comfortable doing that at times.

More cores definitely have a use.
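As a rough illustration of point 2 above, here is a minimal sketch of kicking off the 60/30/15-second encodes in parallel so they finish together. The file names, durations, and ffmpeg settings are hypothetical stand-ins, not anyone's actual workflow:

```python
# Sketch only: independent deliverable encodes run side by side, one job per
# spare core. File names and ffmpeg options are hypothetical illustrations.
import subprocess
from concurrent.futures import ThreadPoolExecutor

VERSIONS = [("spot_60s.mp4", 60), ("spot_30s.mp4", 30), ("spot_15s.mp4", 15)]

def encode(job):
    out_name, seconds = job
    cmd = [
        "ffmpeg", "-y",
        "-i", "master_mix.mov",      # hypothetical source file
        "-t", str(seconds),          # trim to the requested length
        "-c:v", "libx264", "-preset", "medium",
        "-c:a", "aac",
        out_name,
    ]
    subprocess.run(cmd, check=True)
    return out_name

# Each ffmpeg process already uses several threads for x264; running the
# three jobs concurrently soaks up whatever cores are left over.
with ThreadPoolExecutor(max_workers=3) as pool:
    for name in pool.map(encode, VERSIONS):
        print("done:", name)
```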
 

StrangerGuy

Diamond Member
May 9, 2004
8,443
124
106
Usage: Gaming. Upgraded to an 8700K last year only because I was really bored with the PC landscape and there was an $80 discount on the mobo bundle.

Nothing has been interesting at all this year besides SSD prices collapsing, and even then there are huge diminishing returns beyond a big enough SATA3 SSD to use as an OS boot drive.
 

mattiasnyc

Senior member
Mar 30, 2017
356
337
136
Actually I forgot one thing: For a feature film you can pretty much add a zero after the numbers I gave, and then multiply by two.

StrangerGuy; obviously what interests us is personal, but I have to say that the recent rumors about Threadripper get me interested in where we're heading. I got the best deal (bang/buck) I could at the end of last year, a Ryzen 1700 based system, and it's been good for me. It overclocks without much effort at all to 3.7GHz on all cores on the included air cooling. Could probably go higher but I didn't bother. So for context and for what is interesting: A new Threadripper that has 4 times the cores and still overclocks to 4GHz on all of them on an included air cooler? That's nuts. I'm sure it won't be cheap, but that to me is insanely "interesting".
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
Actually I forgot one thing: For a feature film you can pretty much add a zero after the numbers I gave, and then multiply by two.

StrangerGuy; obviously what interests us is personal, but I have to say that the recent rumors about Threadripper get me interested in where we're heading. I got the best deal (bang/buck) I could at the end of last year, a Ryzen 1700 based system, and it's been good for me. It overclocks without much effort at all to 3.7GHz on all cores on the included air cooling. Could probably go higher but I didn't bother. So for context and for what is interesting: A new Threadripper that has 4 times the cores and still overclocks to 4GHz on all of them on an included air cooler? That's nuts. I'm sure it won't be cheap, but that to me is insanely "interesting".

Would you say there is a point of diminishing returns with your work with regard to how many cores you can effectively use?

By that I mean, does having more cores make your work 'snappier' without necessarily making you more 'productive', if that makes sense?
 

mattiasnyc

Senior member
Mar 30, 2017
356
337
136
Would you say there is a point of diminishing returns with your work with regard to how many cores you can effectively use?

By that I mean, does having more cores make your work 'snappier' without necessarily making you more 'productive', if that makes sense?

It's a somewhat difficult question to answer. All else being equal the answer is 'yes', there is a point of diminishing returns. You could look at things in stages: With too little CPU capacity I simply can't perform several important tasks at all; with sufficient but little capacity I can do all I need, but some tasks take long enough that it's bad for business; with a decent CPU I can do most things fast enough, and for some tasks I have to take a break - not ideal, but acceptable perhaps; with a very good CPU I've reached that point of diminishing returns in the sense that I can do all I need within the time I need it done; with an even more capable CPU I can cut processing time for certain tasks but it doesn't make a huge difference any longer.

I hope that makes sort of sense.

The issue though is that not all else is equal. To give a very specific but clear example: Back about a decade ago or so, a piece of software called "RX" came out from a company called iZotope. It was a suite of plugins for audio, including for example a de-clicker, which would help me cut out click sounds when people speak (often due to a dry mouth). It then got a de-noiser. It does what it says, cuts out noise. The de-noiser was heavier on the CPU and I used it offline in non-realtime. Ideally I'd use it in realtime because it saves me time. So with newer computers that was possible. Then they came out with a de-reverb plugin. Same deal. But now, because of the heavier plugins that I want to run live, I need a heftier CPU. And so it goes.

In addition to that, I predict that we'll see far more complicated processing for 3D audio, as well as mixing audio/video. Blackmagic Design's DaVinci Resolve is software that now does media management, video editing, audio post production, coloring, and finally rendering final media. So that's all that I do for audio post, plus all that video people do - all in one app.

So, 'yes', to a degree there's a point of diminishing returns for every individual case, but we appear to always end up upgrading software at some point and then the race is back on again. It applies to individuals like me too that sometimes act as sub-contractors to audio production studios and do work at home. So in that sense.... 'no'....

How's that for a too long answer!?
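A back-of-the-envelope sketch of the offline-versus-realtime point above: at a given sample rate and buffer size, the whole plugin chain has to finish inside each audio callback. The sample rate, buffer size, and per-plugin costs below are invented purely to show the arithmetic:

```python
# Back-of-the-envelope version of the "offline vs. realtime" trade-off.
# All numbers are made-up illustration values, not measurements.
sample_rate = 48_000          # Hz
buffer_size = 512             # samples per audio callback

deadline_ms = buffer_size / sample_rate * 1000   # ~10.7 ms per callback
print(f"callback deadline: {deadline_ms:.1f} ms")

# Hypothetical per-buffer costs for a heavier restoration chain, in ms:
chain = {"de-click": 1.5, "de-noise": 4.0, "de-reverb": 6.0}
total = sum(chain.values())

verdict = "fits realtime" if total < deadline_ms else "must go offline or spread across cores"
print(f"chain cost: {total:.1f} ms -> {verdict}")
```

With those invented numbers the chain misses the single-core deadline, which is exactly the case where a DAW spreading tracks and plugins across more cores lets the same processing run live.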
 

Headfoot

Diamond Member
Feb 28, 2008
4,444
641
126
It's a somewhat difficult question to answer. All else being equal the answer is 'yes', there is a point of diminishing returns. You could look at things in stages: With too little CPU capacity I simply can't perform several important tasks at all; with sufficient but little capacity I can do all I need, but some tasks take long enough that it's bad for business; with a decent CPU I can do most things fast enough, and for some tasks I have to take a break - not ideal, but acceptable perhaps; with a very good CPU I've reached that point of diminishing returns in the sense that I can do all I need within the time I need it done; with an even more capable CPU I can cut processing time for certain tasks but it doesn't make a huge difference any longer.

I hope that makes sort of sense.

The issue though is that not all else is equal. To give a very specific but clear example: Back about a decade ago or so, a piece of software called "RX" came out from a company called iZotope. It was a suite of plugins for audio, including for example a de-clicker, which would help me cut out click sounds when people speak (often due to a dry mouth). It then got a de-noiser. It does what it says, cuts out noise. The de-noiser was heavier on the CPU and I used it offline in non-realtime. Ideally I'd use it in realtime because it saves me time. So with newer computers that was possible. Then they came out with a de-reverb plugin. Same deal. But now, because of the heavier plugins that I want to run live, I need a heftier CPU. And so it goes.

In addition to that, I predict that we'll see far more complicated processing for 3D audio, as well as mixing audio/video. Blackmagic Design's DaVinci Resolve is software that now does media management, video editing, audio post production, coloring, and finally rendering final media. So that's all that I do for audio post, plus all that video people do - all in one app.

So, 'yes', to a degree there's a point of diminishing returns for every individual case, but we appear to always end up upgrading software at some point and then the race is back on again. It applies to individuals like me too that sometimes act as sub-contractors to audio production studios and do work at home. So in that sense.... 'no'....

How's that for a too long answer!?
To build on this response, on my side of the audio fence (music production) there have been immense breakthroughs in virtual instrument clarity and emulation that use computationally heavy algorithms that weren't previously possible (or economically possible). Things like the guitar FX chain simulators (Axe FX, Bias, Amplitube, etc.). So totally agreed. People come up with cool new things you can do with the processing power all the time.
 
Reactions: ub4ty

ub4ty

Senior member
Jun 21, 2017
749
898
96
To build on this response, on my side of the audio fence (music production) there have been immense breakthroughs in virtual instrument clarity and emulation that use computationally heavy algorithms that weren't previously possible (or economically possible). Things like the guitar FX chain simulators (Axe FX, Bias, Amplitube, etc.). So totally agreed. People come up with cool new things you can do with the processing power all the time.
Indeed. The higher core count rig I'm currently on, 8C/16T (64 PCIe lanes), has definitely inspired a new approach to my software development. A number of software packages have not been updated to utilize this kind of core count or to scale with the increasing trend, and a number of them aren't NUMA aware. I'm not too interested in the performance of such packages on Intel/AMD because I'm not reliant on them; I actually develop such software. In this, I find my home among Ryzen.

The prices dropped so low recently that I grabbed an 8-core Ryzen 1700 and built another rig to slap onto the network (distributed computing). I put the whole thing together for about $600. This breeds all new re-thinks. When I did my research to see what performed best in the HPC/scientific/engineering software space, it was actually Ryzen. When software is properly written, more cores definitely trump higher clocks; 2 more cores + 4 threads is a hard thing to beat. I peeked over at various media forums and the same holds true: stuttering on a 6-core, smooth performance on an 8-core.

Everyone ultimately needs to research their intended use case, and the performance therein, and make a smart purchase. There's a case for dual, quad, six, eight, 16, and 32 cores. I have dual/quad/8/16 rigs and use them all for various different purposes.
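A minimal sketch of the "properly written," core-count-aware pattern described above: size the worker pool from the machine instead of hard-coding it, so the same code spreads across a quad-core desktop or a 16-thread Ryzen. The workload itself is just a placeholder:

```python
# Minimal sketch: the pool is sized from the machine it lands on, so the same
# script scales with whatever core count it finds. The work is a stand-in.
import os
from multiprocessing import Pool

def crunch(chunk):
    # placeholder for real per-chunk work (filtering, simulation, etc.)
    return sum(x * x for x in chunk)

if __name__ == "__main__":
    data = list(range(2_000_000))
    workers = os.cpu_count() or 1                        # logical cores/threads
    chunks = [data[i::workers] for i in range(workers)]  # one slice per worker

    with Pool(processes=workers) as pool:
        results = pool.map(crunch, chunks)

    print(f"{workers} workers, total = {sum(results)}")
```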
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,541
14,495
136
For web browsers, gamers, and 95% of the population outside of an enthusiast tech forum, 4 to 6 cores is plenty. 8 cores is what you call future proof.

The other 5% need more cores.

That's about the gist of it.
I assume you are excluding servers. There are probably more servers in the world than PCs for personal use. And those servers can use all the cores they can get. And they are not all Xeons, even in the server world. (However, the non-Xeon processors are in a minority.)
 
Aug 11, 2008
10,451
642
126
This. I don't think people realize the real difference that multiple cores make. Sure, it may not affect the CPU usage % graph (which is generally out of all collective cores anyway) much, but it makes a real difference in responsiveness.

Try running your PC with a single-core CPU, and then try to do a task, while scanning for viruses / malware, or doing a heavy download, etc. while web browsing.

Then do the same thing on a dual-core PC. Then try a quad-core. See the difference?

I'm not really sure why some in the peanut gallery seem to think that this benefit stops at the quad-core mark, and doesn't extend to 6, 8, 10, MOAR CORES.

Yes, diminishing returns, Amdahl's Law, etc. we've heard the argument against multi-core.

Let me tell you this: Open Task Manager, click on CPU, look at the number of Threads. Until we have as many cores as there are threads to run, we'll see improvements in responsiveness. (Note Threads on my system are up in the 2000+ range.)
It depends on the task, and how much you do at once. Personally, I am from the old days, and my philosophy is to minimize the number of tasks being done concurrently. I am sure some users can use all the cores available, but you don't seem to be differentiating between those users and other users who don't need more than 4, or maybe even two, fast cores. I certainly don't think you need 10 cores to surf the web or use MS Office. As for your last statement, I wonder what the clock speed of that 2000-core CPU would be, and how well Windows would manage at assigning one thread per core.
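Since Amdahl's Law keeps getting invoked here, a minimal sketch of what it actually predicts, assuming a workload that is 90% parallel (the fraction is an illustrative guess, not a measurement):

```python
# Quick illustration of the Amdahl's Law diminishing-returns point:
# even a 90%-parallel workload (assumed figure) tops out far below linear scaling.
def amdahl_speedup(cores, parallel_fraction):
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

for cores in (2, 4, 6, 8, 16, 32):
    print(f"{cores:2d} cores -> {amdahl_speedup(cores, 0.90):.2f}x")
# 2 -> 1.82x, 4 -> 3.08x, 6 -> 4.00x, 8 -> 4.71x, 16 -> 6.40x, 32 -> 7.80x
```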
 

Cableman

Member
Dec 6, 2017
78
73
91
Scientific computing, mainly Bayesian modeling/MCMC simulations. Separate models run on separate threads, so more cores allow me to get the results significantly faster. I use both CPUs and GPUs for different tasks.
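For readers curious what "separate models on separate cores" looks like in practice, here is a toy sketch of independent chains run one per process; the target density, chain length, and chain count are purely illustrative:

```python
# Toy version of the pattern above: independent MCMC chains are embarrassingly
# parallel, so N cores run N chains in roughly the time of one.
import math
import random
from concurrent.futures import ProcessPoolExecutor

def log_target(x):
    return -0.5 * x * x        # unnormalised log-density of a standard normal

def run_chain(seed, steps=200_000, step_size=1.0):
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(steps):
        proposal = x + rng.gauss(0.0, step_size)
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal                        # accept the Metropolis proposal
        samples.append(x)
    return sum(samples) / len(samples)          # posterior mean estimate

if __name__ == "__main__":
    seeds = range(8)                            # e.g. one chain per core
    with ProcessPoolExecutor() as pool:
        means = list(pool.map(run_chain, seeds))
    print("per-chain means:", [f"{m:.3f}" for m in means])
```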
 
Reactions: Drazick

happy medium

Lifer
Jun 8, 2003
14,387
480
126
I assume you are excluding servers. There are probably more servers in the world than PCs for personal use. And those servers can use all the cores they can get. And they are not all Xeons, even in the server world. (However, the non-Xeon processors are in a minority.)
It should come as no surprise that Intel continues to dominate (>99%) the server market but is under enormous pressure on all fronts. Xeon and its evolution continue to be their compute vanguard. Xeon Phi (and now the addition of Nervana) make up their engines for high-performance computing...

https://www.google.com/amp/s/www.fo...17/01/10/server-cpu-predictions-for-2017/amp/

Well, I was not talking about the professional server market that Intel dominates, obviously.

I think that would need a new CPU section altogether.

Let's keep it on topic.