CPU Core Count Mania. What do you need more cores for?

Page 4 - AnandTech community forums

What is your most important use case(s) for more CPU cores?

  • Gaming

    Votes: 32 25.0%
  • Video Encoding

    Votes: 38 29.7%
  • 3D rendering

    Votes: 10 7.8%
  • Virtualization (VMware and similar)

    Votes: 31 24.2%
  • HPC and Scientific computing

    Votes: 18 14.1%
  • Other (detail below)

    Votes: 18 14.1%
  • Software Compilation

    Votes: 16 12.5%
  • e-peen

    Votes: 13 10.2%
  • I don't need more cores

    Votes: 17 13.3%

  • Total voters
    128

TheGiant

Senior member
Jun 12, 2017
410
68
86
#76
my problem with high core count is the energy output

with current internet connection speeds I think it will relatively soon end up in cloud computing, but we need more acceptable prices
current prices for a high-performance 16C with 256GB RAM in the cloud are astronomical....

the only thing you cannot buy, and will not anytime soon be able to buy, over the internet is latency-sensitive computing: gaming, artist work (pen input & interactive work), CAD drawing, etc., where you must feel the response imo
on the other hand, rendering, distributed computing, scientific computing (like me doing CFD), big Excel calculations, encoding, etc. can be done in the cloud imo

the other possibility for me is rebuilding the tech room in the house to use the ~1 kW output of the computer as thermal energy while mining/doing other stuff that makes money, since you need that hot water anyway :)
 

IEC

Super Moderator
Super Moderator
Jun 10, 2004
13,565
390
136
#77
It should come as no surprise that Intel continues to dominate (>99%) the server market but is under enormous pressure on all fronts. Xeon and its evolution continue to be their compute vanguard. Xeon-Phi (and now the addition of Nervana) make up their engines for high-performance computing /

https://www.google.com/amp/s/www.fo...17/01/10/server-cpu-predictions-for-2017/amp/

Well, I was not talking about the professional server market that Intel obviously dominates.

I think that would need a new CPU section altogether.

Let's keep it on topic.
Server architectures are perfectly on topic, as we are talking about CPU core counts and what we use them for. Many of us in the DC and scientific/HPC communities utilize servers with Xeon and Epyc processors. I've used 300+ Skylake-EP cores at a time, for instance.

That aside, the article you cite is from January 2017, prior to the launch of the Zen architecture.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
227
96
#78
Server architectures are perfectly on topic, as we are talking about CPU core counts and what we use them for. Many of us in the DC and scientific/HPC communities utilize servers with Xeon and Epyc processors. I've used 300+ Skylake-EP cores at a time, for instance.

That aside, the article you cite is from January 2017, prior to the launch of the Zen architecture.
I know what more cores are used for in work applications/settings.

My intent was finding out what people use them for at home. I am betting that you don't have 300 Skylake cores at home.
 

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
17,774
1,378
136
#80
I know what more cores are used for in work applications/settings.

My intent was finding out what people use them for at home. I am betting that you don't have 300 Skylake cores at home.
At home, I have 136 Haswell Xeon cores, and if you count threadripper as a "server" CPU I have 128 of those. I have another 100 or so cores on other Intel/Ryzen platforms.

And yes, they are doing DC, as in cancer research: BOINC for Rosetta@home, WCG, which is mostly cancer research, and F@H, which is all cancer research.
So not only am I totally on-topic with "what do you use them for at home"; since I use server CPUs, that's on-topic too.

Oh, and in case you didn't know, I am fighting cancer right now, if you wonder why I am so dedicated to that cause.
 

ub4ty

Senior member
Jun 21, 2017
749
321
96
#81
I know what more cores are used for in work applications/settings.

My intent was finding out what people use them for at home. I am betting that you don't have 300 Skylake cores at home.
The over-arching takeaway, then, is that people are using compute, and lots of it, at home in the same manner people use it in work applications and enterprise: video editing/virtualization/HPC computing/AI/simulations/compilation.

That's essentially what corporations are doing, just at different/similar/or even lesser scales. Tbqh, the hardware isn't all that different; neither are the architectures, by and large. It all centers on a many-core CPU with a bunch of PCIe lanes. Desktops now have NVMe/SSD just like the enterprise. Even networking gear is the same for some people, and even more capable in some cases. Computing hardware is now heavily commoditized and affordable. Entrepreneurship is soaring. I have never used cloud computing for my work and likely never will if I have a say in it. I have more capable hardware, I control and update the configuration, and I don't have to deal w/ bozo security issues. There are a number of cases where building out your own compute stack is far cheaper, especially at the higher end. There are tons of open-source packages for management/virtualization/etc. Enterprise is built on the same software stacks available to everyone via the open-source community.

Computing goes through phases and focus. What people call cloud computing was the mainframe/thin-client age of yesteryear. Hilariously, the bus architectures and many of the tidbits are still the same and borrow from what those pioneers did back in the 70s/80s. Cray is still around and does a lot of HPC work. What happened after was that computing power was packed into an affordable desktop, and thus came the PC age. The cloud computing era is long in the tooth and dead from where I sit. As always, the old guard and the mainstream are late to the game. One tries to milk an era for as long as one can (the cloud computing meme), and enterprise consumers don't like change and stick to things until they're dead.

My Threadripper rig can outperform a number of enterprise servers at various tasks. The game has changed. There's not much distinguishing enterprise from high-end desktop beyond redundant power supplies, PCIe switches (they've been milking this forever) to allow for increased scaling, hot-swap capability, and some other features that are hardly necessary unless you need 99.999% uptime. Shell out some cash and you can run high-speed fiber networking and a good chunk of all of the other 'enterprise' gimmicks. We're in the next phase, where the mainframe got packaged into the desktop. This is where you get a software and application explosion. Some got the memo. Some haven't.

I see the innovation in people taking the hardware and applying it in new ways. I don't see it in doing the same ol' crusty things they've been doing since the start of the cloud computing era... Cloud computing, btw, was a big scheme to achieve the holy grail of recurring revenue (aka leasing the same stuff to you over and over, beyond the price it would have cost to purchase it). I hear people say: "But I can run my software on 500 cores in the blink of an eye." Yeah, you could also write better software so you only need 10 cores. The innovation is going to occur with less, not more, IMO. We have some really crappy software out there due to how much compute power we have. We have some really crappy algorithms dominating computing due to insane compute power (meme learning).



Time for a change. The cloud era is over
 
Last edited:

ub4ty

Senior member
Jun 21, 2017
749
321
96
#82
At home, I have 136 Haswell Xeon cores, and if you count threadripper as a "server" CPU I have 128 of those. I have another 100 or so cores on other Intel/Ryzen platforms.

And yes, they are doing DC, as in cancer research: BOINC for Rosetta@home, WCG, which is mostly cancer research, and F@H, which is all cancer research.
So not only am I totally on-topic with "what do you use them for at home"; since I use server CPUs, that's on-topic too.

Oh, and in case you didn't know, I am fighting cancer right now, if you wonder why I am so dedicated to that cause.
136 Haswell Xeon cores
WEW LAD
This is news to me... I didn't know you had this in addition to your Thread-ripper rigs.
 

maddie

Platinum Member
Jul 18, 2010
2,586
489
136
#83
I know what more cores are used for in work applications/settings.

My intent was finding out what people use them for at home. I am betting that you don't have 300 Skylake cores at home.
Maybe you should again edit the OP to say
CPU Core Count Mania. What do you need more cores for, in non-work-related activities?
 

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
17,774
1,378
136
#84
136 Haswell Xeon cores
WEW LAD
This is news to me... I didn't know you had this in addition to your Thread-ripper rigs.
See sig
 

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
17,774
1,378
136
#85
Maybe you should again edit the OP to say
CPU Core Count Mania. What do you need more cores for, in non-work-related activities?
And all of the recent posts are about non-work related activities at home.
 

Sable

Golden Member
Jan 7, 2006
1,093
1
91
#87
Honestly? It would be wasted on me. But I'm tempted by the fun of it. The very fact that it's 16/32 is just :O when I've been happy (and am still happy) with my 2500K.

I've been toying with building a new system for so many years. The one I have works great, so the only reason to build a new one would be for the fun of it. And ludicrous speed (cores) is just FUN.
 

whm1974

Diamond Member
Jul 24, 2016
7,472
494
96
#88
Honestly? It would be wasted on me. But I'm tempted by the fun of it. The very fact that it's 16/32 is just :O when I've been happy (and am still happy) with my 2500K.

I've been toying with building a new system for so many years. The one I have works great, so the only reason to build a new one would be for the fun of it. And ludicrous speed (cores) is just FUN.
Me? I probably will go to 8c/16t by the time I build a new rig in three to five years.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
227
96
#89
At home, I have 136 Haswell Xeon cores, and if you count threadripper as a "server" CPU I have 128 of those. I have another 100 or so cores on other Intel/Ryzen platforms.

And yes, they are doing DC, as in cancer research: BOINC for Rosetta@home, WCG, which is mostly cancer research, and F@H, which is all cancer research.
So not only am I totally on-topic with "what do you use them for at home"; since I use server CPUs, that's on-topic too.

Oh, and in case you didn't know, I am fighting cancer right now, if you wonder why I am so dedicated to that cause.
I have been here a while and know you run a large amount of cores dedicated to research, and your situation. My response wasn't directed at you.

I was just pointing out that the poll was intended to track home use cases. My use cases are different at home and at work; I only put my home use cases in the poll.
 

maddie

Platinum Member
Jul 18, 2010
2,586
489
136
#90
And all of the recent posts are about non-work related activities at home.
Correct, and I think the survey isn't turning out as expected or desired.
 

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
17,774
1,378
136
#91
I have been here a while and know you run a large amount of cores dedicated to research, and your situation. My response wasn't directed at you.

I was just pointing out that the poll was intended to track home use cases. My use cases are different at home and at work; I only put my home use cases in the poll.
Yes, and I and all of our DC people are only putting our home use cases in as well. I think you don't realize how wide and varied the uses at home are, and how many people here use a lot of cores.

Now to be fair, if you are not an enthusiast, I am sure the poll would have worked out WAY different. Your "Average Joe" does not even know how many cores he is running.
 

moonbogg

Diamond Member
Jan 8, 2011
9,756
28
126
#92
I want a new 8/16 chip, but probably not a 14nm one. Why do I need it? Because I like to know that I've got 2 cores not being used that are just waiting to assist me while I play the next Battlefield game. These cores are in reserve, they are overkill, they are enthusiast, and they are nearly as excessive as my passion for the hobby. I need to see both Cinebench numbers scoring super high, single-thread and multi-thread. When I fire up a game, I remember those numbers and I think to myself:

"On the back of the box of this game, they recommend a quad core at 3.5 GHz. I've got 16 threads of 4.8 GHz whompass for them instead. Let's see what they think about that."

When my monitor maxes out at 100 Hz and I know my GPU has at least 30 more FPS in reserve for me, it makes me feel STRONG to realize that my CPU has enough reserve to run TWO INSTANCES of this game at 200 FPS each. That's enthusiast. That's what I want. That's what I'll get.
 

NTMBK

Diamond Member
Nov 14, 2011
8,300
280
126
#93
I want a new 8/16 chip, but probably not a 14nm one. Why do I need it? Because I like to know that I've got 2 cores not being used that are just waiting to assist me while I play the next Battlefield game. These cores are in reserve, they are overkill, they are enthusiast, and they are nearly as excessive as my passion for the hobby. I need to see both Cinebench numbers scoring super high, single-thread and multi-thread. When I fire up a game, I remember those numbers and I think to myself:

"On the back of the box of this game, they recommend a quad core at 3.5 GHz. I've got 16 threads of 4.8 GHz whompass for them instead. Let's see what they think about that."

When my monitor maxes out at 100 Hz and I know my GPU has at least 30 more FPS in reserve for me, it makes me feel STRONG to realize that my CPU has enough reserve to run TWO INSTANCES of this game at 200 FPS each. That's enthusiast. That's what I want. That's what I'll get.
Lol
 
Jun 15, 2001
33,854
185
126
#94
Rendering with POV-Ray. It easily murders my 16 cores and 128GB of memory. I'd love to get 256GB if the new boards support it (and the prices return to sanity).
 

Midwayman

Diamond Member
Jan 28, 2000
5,158
30
106
#95
Gaming, but mostly in the sense that I have enough cores to prevent system tasks from causing glitches in the frame rate. Even if games aren't utilizing all cores, it's nice to have a core available to handle background tasks.
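One way to make that "spare core for background tasks" idea explicit on Linux is CPU affinity: pin the heavy process to a subset of cores and leave the rest to the OS. A minimal sketch using only the standard library (`os.sched_setaffinity` is Linux-only, and reserving the highest-numbered core is an arbitrary choice for illustration):

```python
import os

all_cores = os.sched_getaffinity(0)             # cores this process may use now
reserved = set(sorted(all_cores)[-1:])          # hold the highest core "in reserve"
workload = (all_cores - reserved) or all_cores  # fall back if only one core exists
os.sched_setaffinity(0, workload)               # background tasks keep the reserved core
print("running on cores:", sorted(os.sched_getaffinity(0)))
```

Games don't usually do this themselves, but tools like `taskset` apply the same system call from the shell.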
 

Insert_Nickname

Diamond Member
May 6, 2012
3,559
113
126
#96
the other possibility is for me rebuilding the tech room in the house to use like~ 1KW output on the computer for thermal energy while mining/other stuff that makes money and you need that hot water anyway :)
I've frequently threatened to do a custom-loop-as-heat-source for a coffee machine. Unfortunately, with the current heat output of my main system, it'd take a while to make a brew... :D
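The physics of the joke is easy to check: heating water takes about 4186 J per kg per °C, so even a full kilowatt of captured waste heat needs a few minutes per litre. A quick back-of-envelope in Python (the 300 W figure is a guessed typical desktop load, not a measurement, and it generously assumes all heat is captured):

```python
SPECIFIC_HEAT = 4186   # J/(kg*K) for liquid water
MASS_KG = 1.0          # one litre of water
DELTA_T = 95 - 20      # 20 C tap water up to ~95 C brewing temperature
energy_j = SPECIFIC_HEAT * MASS_KG * DELTA_T  # joules needed per brew

for watts in (1000, 300):  # ~1 kW mining room vs a rough desktop figure
    minutes = energy_j / watts / 60
    print(f"{watts} W -> {minutes:.1f} min per brew")
```

So the 1 kW mining room gets the water there in a few minutes, while a normal desktop's waste heat takes more than three times as long, which matches the "it'd take a while" complaint.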
 

TheGiant

Senior member
Jun 12, 2017
410
68
86
#97
Time for a change. The cloud era is over
IMO the cloud era is on the move. I personally use Office 365 (personal and company) and people totally don't know what the benefits are. Their eyes just open when I tell them that their data are not on their computers, and that they don't need a mail server, file servers, backups.....
If they add a computing service like encoding or photo editing running in the cloud, it will be perfect.

IMO the personal-powerhouse-at-home era is over. It is just a matter of time, and it now lives on only with the enthusiasts, who unfortunately don't make up a majority of profits
 

Headfoot

Diamond Member
Feb 28, 2008
4,401
49
126
#98
Computing goes through phases and focus. What people call cloud computing was the mainframe/thin-client age of yesteryear. Hilariously, the bus architectures and many of the tidbits are still the same and borrow from what those pioneers did back in the 70s/80s. Cray is still around and does a lot of HPC work. What happened after was that computing power was packed into an affordable desktop, and thus came the PC age. The cloud computing era is long in the tooth and dead from where I sit. As always, the old guard and the mainstream are late to the game. One tries to milk an era for as long as one can (the cloud computing meme), and enterprise consumers don't like change and stick to things until they're dead.
I understand where you're coming from but nothing that came before is like Amazon AWS/Azure/Google Cloud. It enables fundamentally different architectures that can scale to degrees never before possible. If you haven't lately, go play around with AWS. The ease of use and scope of features is mind-boggling, especially if you're coming from a distributed systems background. Things like Lambda and Serverless enable a different mindset in software development altogether. No HPC system from years past could match the Google cloud's ability to deliver web search at that scale and speed.
 

ub4ty

Senior member
Jun 21, 2017
749
321
96
#99
IMO the cloud era is on the move. I personally use Office 365 (personal and company) and people totally don't know what the benefits are. Their eyes just open when I tell them that their data are not on their computers, and that they don't need a mail server, file servers, backups.....
If they add a computing service like encoding or photo editing running in the cloud, it will be perfect.

IMO the personal-powerhouse-at-home era is over. It is just a matter of time, and it now lives on only with the enthusiasts, who unfortunately don't make up a majority of profits
Thank you for your feedback. There's also Google Docs, etc., which a surprising number of youths use to accomplish basic work in grade school and higher ed. You don't really need much from such apps nowadays, and it's free, which is why they use it. Mail = Gmail. File server/backup are easily done at home, with 8TB drives widely available and affordable and 16TB drives coming next year. When I mention the 'cloud' I mean things related to compute, more so than these basic tasks. In that sense, I have indeed heard, and continue to hear, of people migrating to the cloud. However, I also hear of a number of people coming back out of the cloud to establish their own private compute centers again.
 

ub4ty

Senior member
Jun 21, 2017
749
321
96
I understand where you're coming from but nothing that came before is like Amazon AWS/Azure/Google Cloud. It enables fundamentally different architectures that can scale to degrees never before possible. If you haven't lately, go play around with AWS. The ease of use and scope of features is mind-boggling, especially if you're coming from a distributed systems background. Things like Lambda and Serverless enable a different mindset in software development altogether. No HPC system from years past could match the Google cloud's ability to deliver web search at that scale and speed.
While I hear this, the trend is towards a hybrid solution whereby you only put what you absolutely need to in somebody else's data center: https://www.infoworld.com/article/3...ng-why-some-are-exiting-the-public-cloud.html

This is far more sensible and will eventually hit a tipping point back to private instances. My sentiment, like that of those who left the cloud, comes down to security and a whole host of other problems with leasing someone else's equipment. There are a number of cases where doing so is far more expensive over the lifetime of the equipment than purchasing it yourself. I also note a huge chasm in understanding forming, due to people choosing this EZ-bake-oven approach to performance. A lot of the runtime performance of compute solutions has actually gone to the gutter because no one knows how to actually deal with the underlying hardware; they're just slapping things into containers or provisioning software.

Hardware is becoming more complex, not less. If I asked whether a server instance in the cloud was using NVMe drives, and if so what version, what do you think the answer would be? Do people know how to configure a server to optimally handle their workload? GPU or CPU? Memory-bound or CPU-bound? Yes, a lot of people love the idea of not having to know about any of these details, and they pay for this luxury in many ways. A lot wouldn't know the first thing about distributed computing or load balancing beyond clicking a configuration in software. This leads to a lot of crappy software and security issues. You can't have your cake and eat it too. I imagine you could cut compute requirements in half if people really understood the principles behind building such software themselves and speccing the hardware to their specific requirements.

Instead, we exist in an age where people would rather run 800 cores' worth of machines at full tilt, brute-forcing solutions using meme learning. There's a significant cost associated with this practice, which is why it's cyclical. Hardware is becoming complicated again and increasing substantially in capability. Mainframe/cloud eras go through a flux during such periods... namely because innovators begin writing new approaches to software now that they have an Amazon AWS rack's worth of computing in their room. New and widely varied hardware starts raining on the scale parade. Enterprise-level features begin trickling down to the consumer.
 

