What will happen when Intel brings core counts greater than four to mainstream?


Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Really, what mainstream chicken are you talking about?

Fairly sure he's referring to FX CPUs. 4c8t i7s haven't been prohibitively expensive either, when you consider the cost of the entire platform, and we've had GPUs capable of a wide variety of compute tasks for years, but very few are bothering to make use of it.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Really, what mainstream chicken are you talking about?

i7 920 for example. 8 threads.

The problem is simply serial code in software that can't be multithreaded. That's also why experiments like Mitosis were created, and later abandoned when performance/watt started to matter.

It's simply a complete delusion to think software will get magically fixed if we get more cores. Even servers are pretty terrible this way and depend on the number of concurrent users.
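Roughly the kind of thing being described, as a hypothetical Python sketch (the function names and workload are made up for illustration): a loop-carried dependency keeps code serial, while an independent element-wise map parallelizes trivially.

```python
# Illustrative only: why some code stays serial no matter how many cores exist.

def serial_chain(values):
    # Loop-carried dependency: each iteration needs the previous iteration's
    # result, so the iterations cannot simply be split across cores.
    state = 0.0
    for v in values:
        state = state * 0.9 + v
    return state

def independent_map(values):
    # No dependency between elements: this maps cleanly onto many cores
    # (e.g. via multiprocessing), unlike the chain above.
    return [v * v for v in values]
```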
 
Last edited:

Fjodor2001

Diamond Member
Feb 6, 2010
4,277
614
126
Fairly sure he's referring to FX CPUs. i7s aren't prohibitively expensive either, really, and we've had GPUs capable of a wide variety of compute tasks for years, but very few are bothering to make use of it.

FX CPUs are 4-module / 8-thread, not 8-core. And they do not have contemporary ST performance. I'm talking about pure 8-core CPUs with contemporary ST performance.

And 8 core Intel HEDT CPUs are not mainstream, especially considering the associated motherboard cost. The market penetration is far too small.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Speculative threading will be the only way to make use of your desired 8 cores beyond today's software, and that's in the far future. You've got a dream without reality.

mitosis1.jpg

IMG_2728.JPG
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
There is no solution to the serial code issue besides speculative threading or concurrent users.

The only "rambling" is to claim that more cores will magically solve software coding issues. Its an issue software developers have worked with for many many years already. And there is no solution in sight besides resource heavy alternatives like speculative threading. But that doesnt stand a chance in the performance/watt world.
 
Last edited:

Fjodor2001

Diamond Member
Feb 6, 2010
4,277
614
126
i7 920 for example. 8 threads.

The problem is simply serial code in software that can't be multithreaded. That's also why experiments like Mitosis were created, and later abandoned when performance/watt started to matter.

It's simply a complete delusion to think software will get magically fixed if we get more cores. Even servers are pretty terrible this way and depend on the number of concurrent users.

There is no solution to the serial code issue besides speculative threading or concurrent users.

The only "rambling" is to claim that more cores will magically solve software coding issues. Its an issue software developers have worked with for many many years already. And there is no solution in sight besides resource heavy alternatives like speculative threading. But that doesnt stand a chance in the performance/watt world.

So you're back to questioning whether it is possible to write parallelized SW?

As a SW developer I can tell you it is, and the incentive for doing so increases drastically with the average number of cores available among mainstream users.
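For what it's worth, the mechanics aren't exotic. A minimal, hypothetical Python sketch (the per-item task and data are invented for illustration) of spreading an independent, CPU-bound workload across however many cores the machine exposes:

```python
# Hypothetical example of data-parallel work spread across available cores.
from concurrent.futures import ProcessPoolExecutor
import os

def checksum(block: bytes) -> int:
    # Stand-in for any CPU-bound per-item task (filtering, encoding, hashing...).
    return sum(block) % 251

def process_all(blocks):
    workers = os.cpu_count() or 4   # scales with whatever core count is mainstream
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(checksum, blocks))

if __name__ == "__main__":
    data = [bytes([i % 256]) * 4096 for i in range(1000)]
    print(process_all(data)[:5])
```

The point is that the extra effort only pays off once most buyers actually have the cores to run it on.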
 

Dave2150

Senior member
Jan 20, 2015
639
178
116
It's quite clear that consumers want lower power and less/no fans. It makes sense considering that the average Core processor is so much of an overkill for what people use their computer for. Plus it enables smaller form factors for desktops and lighter laptops.

I just don't agree with this.

Who buys big desktop PCs these days? Not your average Joe. Average Joe uses his tablet, smartphone, and TV for all the tasks he used to do on his PC 10 years ago.

I'd say 90% of users buying desktop PCs with an i5/i7 (I mean mid-tower PCs, not tiny small-form-factor ones) are buying them for gaming, streaming while gaming, video encoding/rendering, etc.

Upgrading mainstream i7s to 6 or 8 cores would be a dramatic benefit.
 

jpiniero

Lifer
Oct 1, 2010
16,939
7,355
136
I just don't agree with this.

Who buys big desktop PCs these days? Not your average Joe. Average Joe uses his tablet, smartphone, and TV for all the tasks he used to do on his PC 10 years ago.

Corporate users. They buy desktops because the form factor makes sense to them in some cases. No, it doesn't make sense that they would buy an i5 or i7, but they do. Starting with Skylake, you are going to see the form factors start to shrink more aggressively on desktops. I'm sure the corporate users will appreciate it.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
It's quite clear that consumers want lower power and less/no fans. It makes sense considering that the average Core processor is so much of an overkill for what people use their computer for. Plus it enables smaller form factors for desktops and lighter laptops.

Yes, consumers want lower power and less/no fans for mobile and streaming devices.

But a desktop that serves those devices doesn't have to be fanless or very low power.

And these days, with the Xeon D octo-core die weighing in at a mere 160 mm² (including two 10 GbE LAN controllers), there isn't much reason for Intel to hold back the cores.

160 mm² is a mainstream die size.
 

Lil'John

Senior member
Dec 28, 2013
301
33
91
ShintaiDK is correct: the hardware has been available for a while for software developers to show they can make highly parallelizable software. Sandy-E, for example.

If you ignore CPU cores, LOTS of small cores are available with OpenCL and CUDA. Not many programs take advantage of these either.

If the software developers can't show more cores are better, the normal computer buyers will not worry about cores when buying.

So either the mainstream software can't be parallelized much or software developers can't pay for the hardware.

Until the software developers show consumers the NEED for more cores, I don't expect Intel to change their current core count because the consumers aren't demanding it.
 

jpiniero

Lifer
Oct 1, 2010
16,939
7,355
136
And these days, with the Xeon D octo-core die weighing in at a mere 160 mm² (including two 10 GbE LAN controllers), there isn't much reason for Intel to hold back the cores.

I would think Intel would like to keep the clock speeds higher than what Xeon D can deliver at this point, and eventually lower the TDP some more.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
ShintaiDK is correct: the hardware has been available for a while for software developers to show they can make highly parallelizable software.

His primary argument assumes a single user though.

But with Android L already allowing multiple users (and with Google not likely to be worried about charging server licensing fees like MS), I think the landscape is poised for change in the future.
 

stockwiz

Senior member
Sep 8, 2013
403
15
81
Intel would be stupid to lower prices. People on these forums (and others) act as though Intel cares about us little PC enthusiasts. We make up a tiny fraction of their business. If they start pricing unlocked hex- or octo-cores so the masses can afford them, they lose a lot of revenue from server chips, as has been mentioned here already.

Be happy you can get 6- and 8-core chips at all. I'll admit I'd like to see the 5820K drop to $300 though. I'd buy one if I could find a sale for $300 shipped, and I do look every once in a while.

I don't want to sidegrade to another quad core. My 2600K is good for now, until my mother says she's ready to buy my system (at a substantial discount)... if it weren't for her I'd keep it another 3 years.
 
Last edited:

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
833
136
More SW will be optimized for higher core counts. So it'll bring a huge performance boost compared to the ~5% yearly performance increase we've been seeing the last few years.
Not enough software is amenable to parallelism.

But I think AMD Zen is what will bring about this revolution, not Intel.
LOL. There is no revolution on the horizon.

Laptops & laptop sales guarantee this, in addition to the difficulties on the software front.
 
Apr 30, 2015
131
10
81
I wonder if Mediatek will develop a 12-core SoC for laptops, and smaller form-factor desktops. They could feature 2 X 4 ARM A53, 4 ARM A72 cores, plus 4 accelerators, to a total of 16 heterogeneous cores. This might be good for general-purpose computing, and some gaming, plus 4K graphics. The best cores for the job would be used, at any time, to save power. Price might be $50, at a guess. Marketed as a 12/16 core SoC, it might trouble Intel. How would they respond, I wonder? The PC market may look like a soft-target to some big chip-design companies.
 
Last edited:

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
So you're back to questioning whether it is possible to write parallelized SW?
I don't think he's questioning whether it's possible, simply pointing out that most code that isn't "embarrassingly parallel" (like video encoding or synthetic benchmarks) usually bumps up against Amdahl's Law, whilst highly parallelizable specialist programs are typically GPU-based (OpenCL / CUDA, etc.).
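For reference, Amdahl's Law itself is easy to work through; a small illustrative sketch (numbers chosen just for the example):

```python
# Amdahl's Law: if only a fraction p of a program parallelizes, n cores give
# a bounded speedup no matter how large n gets.

def amdahl_speedup(p: float, n: int) -> float:
    """speedup = 1 / ((1 - p) + p / n)"""
    return 1.0 / ((1.0 - p) + p / n)

# With 95% of the work parallel, 8 cores yield ~5.9x rather than 8x,
# and the ceiling as n grows is only 1 / (1 - p) = 20x.
for n in (2, 4, 8, 16):
    print(n, round(amdahl_speedup(0.95, n), 2))
```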

"MOAR CORES" for general usage (web, office, media playback, casual gaming, etc) has ended up just as much "pushing on a string" as "chicken vs egg". For media playback, we're talking 3w for H265 UHD via fixed function decoding hardware. CPU "software" decoding hasn't become less relevant than today. Even for video encoding we've reached the point where instead of power hungry 8x core laptop's, or 32x core desktops, people are just using Quicksync or Shadowplay using 1/4 of the power, and still ending up with video that's below the comparable threshold of what Youtube re-encodes will degrade it to anyway regardless of how pristine the source is. Same with "throwaway" video (web / video conferencing, Skype, etc), it's all about mobile / power efficiency these days. Given the sheer economies of scale for mobile devices, I can see more effort being put into improving quality of sub 10w fixed-function encoders than demanding 16x core CPU's for software X264/5 encodes.

Ripping a CD (audio encoding) is so fast that the bottleneck is the optical drive, not the CPU, even on a Celeron. On laptops and most "off the shelf" pre-built desktops, the bottleneck is the HDD, not the CPU. Office and web browsers run on Celerons (the bottleneck for opening/reading lots of 4 KB files in the web cache is again the mechanical HDD). Low-end "mainstream" photo editing is usually trivial enough (crop, resize, red-eye removal, etc.) that it's fast on even the slowest CPU (and done even on tablets). High-end professional photo editing can be GPU-accelerated faster than the fastest CPU can manage. Even AAA gaming still shows i3s beating FX-9590s in 2015, after the 8th annual "this will be the year 8-core FX chips render i5s obsolete" prediction failed... Casual / low-end indie gaming runs on a potato. What's left for the average person (that doesn't include CAD, professional / research applications, etc.)? Not much. You can buy an 18-core / 36-thread Intel CPU right now. But it's not "mainstream" for a reason.
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I wonder if Avago-Mediatek will develop a 12-core SoC for laptops, and smaller form-factor desktops. They could feature 2 X 4 ARM A53, 4 ARM A72 cores, plus 4 accelerators, to a total of 16 heterogeneous cores. This might be good for general-purpose computing, and some gaming, plus 4K graphics. The best cores for the job would be used, at any time, to save power. Price might be $50, at a guess. Marketed as a 12/16 core SoC, it might trouble Intel. How would they respond, I wonder? The PC market may look like a soft-target to some big chip-design companies.

The "famous" ARM is going to take on x86 anytime "soon" in traditional x86 segments?

ARM is already losing tablets. It's a one-way battle.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
Not enough software is amenable to parallelism.

Exactly. And it also shows the lack of understanding of the problem at hand. The blinkered mantra that it's all Intel's fault and that they should just give us cheap 8-cores, which would solve everything software-wise, is simply ignorant.
 

Fjodor2001

Diamond Member
Feb 6, 2010
4,277
614
126
ShintaiDK is correct: the hardware has been available for a while for software developers to show they can make highly parallelizable software. Sandy-E, for example.
That's not a mainstream CPU. The market penetration is very small.
If you ignore CPU cores, LOTS of small cores are available with OpenCL and CUDA. Not many programs take advantage of these either.
Those are not general CPU cores that can be used for all SW tasks, only specific ones.
If the software developers can't show more cores are better, the normal computer buyers will not worry about cores when buying.

So either the mainstream software can't be parallelized much or software developers can't pay for the hardware.

Until the software developers show consumers the NEED for more cores, I don't expect Intel to change their current core count because the consumers aren't demanding it.

We're back to the chicken and egg argument. SW developers will not spend the extra effort needed to produce highly parallelized SW until 8+ cores are mainstream (even 4 cores are not the minimum standard yet). Backtrack to this post and follow the discussion from there. We're looping. ;)
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
OpenCL, HSA, CUDA, whatever, would also solve all the world's problems. They didn't.

You assume all code can be parallelized, and it's simply wrong. Even servers with "endless money and cores" are plagued by massive amounts of serial code. And the only way to use more cores is more users. This is also why ARM chips are too slow for transaction servers and a lot of web frontends.
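To illustrate the "more users, not faster requests" point, a hypothetical Python sketch (the handler and workload are invented): each request here is serial, so extra cores never shorten a single request; they only raise throughput when independent users arrive at the same time.

```python
# Illustrative only: per-request work is serial; cores help via concurrency
# across users, not by speeding up any one user's request.
from concurrent.futures import ProcessPoolExecutor

def handle_request(user_id: int) -> str:
    # Serial per-user work: adding cores does not make this loop any faster.
    total = 0
    for i in range(100_000):
        total = (total + user_id * i) % 97
    return f"user {user_id}: {total}"

if __name__ == "__main__":
    # Throughput scales with cores only because many users are served at once.
    with ProcessPoolExecutor(max_workers=8) as pool:
        print(list(pool.map(handle_request, range(16))))
```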
 
Last edited:

Fjodor2001

Diamond Member
Feb 6, 2010
4,277
614
126
You assume all code can be parallelized, and it's simply wrong.

When have I said that?

I'm not saying all SW can be parallelized. But I'm saying much more SW has the potential to be parallelized than is actually being done today. It will take extra effort, though. So for it to happen, enough of the CPUs on the market must have a sufficient core count for there to be any point in doing so. And we're not there yet. We're waiting for the chicken.
 
Last edited: