You would need to use a different ISA instead of x86, and software that can use that many cores. Something like that would be far more suitable for data centers, supercomputers, and the like.
Google Xeon Phi.
Isn't that just a bunch of little x86 cores with a large iGPU?
I need six cores just to multitask with Netflix (1080p, not 4K), office apps, and web browsing. And yes, I'm being serious.
Why? Cuz the three desktop CPU cores I had before were not enough. It was OK most of the time, but lagged occasionally when multitasking.
The caveat, though, is that the CPUs I'm talking about are 8 years old. I upgraded from a 2.9 GHz Athlon II X3 435 to a 2.8 GHz Phenom II X6 1055T. It made a noticeable difference and saved me from having to buy a new machine just to run a few business apps. A 4.5 GHz triple-core AM3 chip would have worked too, but those don't exist, so hex-core it is.
Oh, and I have multi-core in my phone too!
My phone has more cores (8) than my PC (6).
Are you sure it's the cores that helped, and not the L3 cache?
Probably both. It's really just a matter of performance, and the 3-core 2.9 GHz Athlon is just slow by current standards, noticeably slower than my Core m3 Kaby Lake-Y, and without turbo.
I am noticing that the least popular category (3D rendering) is the one that houses the most popular benchmark to quote (Cinebench).
If there were other well-known free benchmarks, maybe more of those would be used.
Actually, I forgot one thing: for a feature film you can pretty much add a zero to the numbers I gave, and then multiply by two.
StrangerGuy: obviously what interests us is personal, but I have to say that the recent rumors about Threadripper get me interested in where we're heading. I got the best deal (bang/buck) I could at the end of last year, a Ryzen 1700-based system, and it's been good for me. It overclocks without much effort at all to 3.7 GHz on all cores on the included air cooler. It could probably go higher, but I didn't bother. So for context, and for what is interesting: a new Threadripper that has 4 times the cores and still overclocks to 4 GHz on all of them on an included air cooler? That's nuts. I'm sure it won't be cheap, but that to me is insanely "interesting".
Would you say there is a point of diminishing returns with your work with regard to how many cores you can effectively use?
By that I mean: does having more cores make your work 'snappier' without necessarily making you more 'productive', if that makes sense?
It's a somewhat difficult question to answer. All else being equal, the answer is 'yes': there is a point of diminishing returns. You could look at things in stages: with too little CPU capacity I simply can't perform several important tasks at all; with sufficient but limited capacity I can do all I need, but some tasks take long enough that it's bad for business; with a decent CPU I can do most things fast enough, and for some tasks I have to take a break - not ideal, but acceptable perhaps; with a very good CPU I've reached that point of diminishing returns, in the sense that I can do all I need within the time I need it done; with an even more capable CPU I can cut processing time for certain tasks, but it doesn't make a huge difference any longer.
I hope that makes sort of sense.
The issue, though, is that not all else is equal. To give a very specific but clear example: about a decade ago a software suite called "RX" came out from a company called iZotope. It was a suite of audio plugins including, for example, a de-clicker, which would help me cut out click sounds when people speak (often due to a dry mouth). It then got a de-noiser, which does what it says: cuts out noise. The de-noiser was heavier on the CPU, so I used it offline, in non-realtime. Ideally I'd use it in realtime, because that saves me time, and with newer computers that became possible. Then they came out with a de-reverb plugin. Same deal. But now, because of the heavier plugins that I want to run live, I need a heftier CPU. And so it goes.
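That realtime-versus-offline tipping point comes down to simple arithmetic: a plugin chain is realtime-capable when it can process an audio buffer faster than the buffer plays back. Here's a minimal sketch with made-up costs (the plugin names mirror the RX tools mentioned above, but none of the numbers are iZotope's):

```python
# Can this (hypothetical) plugin chain run in realtime?
SAMPLE_RATE = 48_000   # Hz
BUFFER_SIZE = 512      # samples per processing block

# Made-up per-buffer CPU costs in milliseconds - purely illustrative
plugin_cost_ms = {"de-clicker": 2.0, "de-noiser": 5.5, "de-reverb": 4.0}

buffer_ms = BUFFER_SIZE / SAMPLE_RATE * 1000   # ~10.7 ms of audio per block
chain_ms = sum(plugin_cost_ms.values())        # 11.5 ms of work per block

print(f"block holds {buffer_ms:.1f} ms of audio; chain needs {chain_ms:.1f} ms")
print("realtime OK" if chain_ms < buffer_ms else "offline only - need a faster CPU")
```

With these toy numbers the chain misses the deadline by under a millisecond, which is exactly the situation where a slightly faster CPU moves a plugin from offline-only to live use.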
In addition to that, I predict we'll see far more complicated processing for 3D audio, as well as for mixing audio and video. Blackmagic Design's DaVinci Resolve now does media management, video editing, audio post production, color grading, and finally rendering the final media. So that's everything I do for audio post, plus everything video people do - all in one app.
So, 'yes', to a degree there's a point of diminishing returns for every individual case, but we always seem to end up upgrading software at some point, and then the race is back on again. That applies to individuals like me too, who sometimes act as sub-contractors to audio production studios and do work at home. So in that sense... 'no'.
How's that for a too long answer!?
To build on this response, on my side of the audio fence (music production) there have been immense breakthroughs in virtual instrument clarity and emulation that use computationally heavy algorithms that weren't previously possible (or economically possible). Things like the guitar FX chain simulators (Axe FX, Bias, Amplitube, etc.). So totally agreed. People come up with cool new things you can do with the processing power all the time.
Indeed. The higher core count rigs I'm currently on - 8 cores/16 threads (64 PCIe lanes) - have definitely inspired a new approach to my software development. A number of software packages have not been updated to utilize this kind of core count or to scale with the increasing trend, and a number of them aren't NUMA-aware. I'm not too interested in the performance of such packages on Intel or AMD, because I'm not reliant on them: I actually develop such software. In this, I find my home with Ryzen. Prices dropped so low recently that I grabbed an 8-core Ryzen 1700 and built another rig to slap onto the network (distributed computing). I put the whole thing together for about $600. This breeds all-new rethinks. When I did my research to see what performed best in the HPC/scientific/engineering software space, it was actually Ryzen. When software is properly written, more cores definitely trump higher clocks. 2 more cores + 4 threads is a hard thing to beat. I peeked over at various media forums and the same holds true: stuttering on a 6-core, smooth performance on an 8-core.
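On the "properly written software scales with cores" point, here's the kind of quick probe one can run to see where a given box stops scaling. It's a minimal sketch with a toy CPU-bound kernel, not the commenter's actual distributed workload:

```python
import time
from multiprocessing import Pool, cpu_count

def burn(n: int) -> int:
    # Toy CPU-bound kernel standing in for real work
    total = 0
    for i in range(n):
        total += i * i
    return total

if __name__ == "__main__":
    jobs = [300_000] * 64          # fixed total work, split across workers
    baseline = None
    for workers in range(1, cpu_count() + 1):
        start = time.perf_counter()
        with Pool(workers) as pool:
            pool.map(burn, jobs)
        elapsed = time.perf_counter() - start
        baseline = baseline or elapsed   # first run is the 1-worker baseline
        print(f"{workers:2d} workers: {elapsed:6.3f}s ({baseline / elapsed:4.2f}x speedup)")
```

On well-parallelized work like this, the speedup column keeps climbing with core count; on poorly threaded software it flattens out after a few workers, which is the gap being described.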
For web browsers, gamers, and 95% of the population outside of an enthusiast tech forum, 4 to 6 cores is plenty. 8 cores is what you call future-proof.
The other 5% need more cores.
That's about the gist of it.
This. I don't think people realize the real difference that multiple cores make. Sure, it may not affect the CPU usage % graph much (which is generally aggregated across all cores anyway), but it makes a real difference in responsiveness.
Try running your PC with a single-core CPU, and then try to do a task - scanning for viruses/malware, say, or a heavy download - while web browsing.
Then do the same thing on a dual-core PC. Then try a quad-core. See the difference?
I'm not really sure why some in the peanut gallery seem to think that this benefit stops at the quad-core mark, and doesn't extend to 6, 8, 10, MOAR CORES.
Yes - diminishing returns, Amdahl's Law, etc. - we've heard the arguments against multi-core.
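For anyone who hasn't run the numbers, here's what that argument actually looks like - a tiny sketch using an illustrative 90% parallel fraction, not a measurement of any particular workload:

```python
# Amdahl's Law: speedup(n) = 1 / ((1 - p) + p / n),
# where p is the parallelizable fraction and n the core count.

def speedup(p: float, cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / cores)

# Hypothetical workload that is 90% parallelizable
for cores in (1, 2, 4, 6, 8, 16, 64):
    print(f"{cores:3d} cores -> {speedup(0.9, cores):5.2f}x")

# The ceiling is 1 / (1 - 0.9) = 10x no matter how many cores you add,
# which is the diminishing-returns effect in question.
```

The flip side, and the point being made here, is that a desktop isn't running one workload: it's running thousands of threads from many programs at once, and those scale across cores regardless of any single program's parallel fraction.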
Let me tell you this: open Task Manager, click on CPU, and look at the number of threads. Until we have as many cores as there are threads to run, we'll see improvements in responsiveness. (Note: threads on my system are up in the 2000+ range.)
It depends on the task, and how much you do at once. Personally, I am from the old days, and my philosophy is to minimize the number of tasks being done concurrently. I am sure some users can use all the cores available, but you don't seem to be differentiating between those users and the ones who don't need more than four, or maybe even two, fast cores. I certainly don't think you need 10 cores to surf the web or use MS Office. As for your last statement, I wonder what the clock speed of that 2000-core CPU would be, and how well Windows would manage assigning one thread per core.
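As a side note on the thread count mentioned above: it can also be tallied programmatically. A minimal sketch using the third-party psutil package (an assumption - it needs `pip install psutil`; Task Manager's own counter works just as well):

```python
import psutil  # third-party: pip install psutil

total_threads = 0
for proc in psutil.process_iter(attrs=["num_threads"]):
    # Processes can vanish mid-iteration; inaccessible values come back as None
    total_threads += proc.info["num_threads"] or 0

print(f"{total_threads} threads across {len(psutil.pids())} processes")
```

The count will jitter from run to run as processes start and exit, but on a typical Windows desktop it lands in the low thousands, consistent with the 2000+ figure quoted above.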
I assume you are excluding servers. There are probably more servers in the world than PCs for personal use, and those servers can use all the cores they can get. And they are not all Xeons, even in the server world (though non-Xeon processors are in the minority).
It should come as no surprise that Intel continues to dominate (>99%) the server market but is under enormous pressure on all fronts. Xeon and its evolution continue to be their compute vanguard. Xeon Phi (and now the addition of Nervana) make up their engines for high-performance computing.