When will the core wars stop?

Page 3 - AnandTech community forums.

Ares1214

Senior member
Sep 12, 2010
I think 4 things will change the race, or the "name" of the race. Those 4 things are graphene, quantum mechanics, ARM, and IBM.

Graphene, because graphene transistors have been demonstrated anywhere from 50 GHz to 150 GHz, with a theoretical maximum around 1 THz. Not only that, but it also has advantages in power and heat spreading, and in manufacturing.

Quantum mechanics, because the main reason we are adding more cores is almost purely node shrinks: going from 45nm to 32nm let Intel add 2 cores, and the same 45nm-to-32nm shrink let AMD add 4+ cores. When the returns on shrinking a node diminish, or quantum effects like electron tunneling make it entirely impossible to shrink any further, then the increase in core count will slow.

ARM, because it adds an entirely new instruction set into the mix. x86 has dominated for years now. ARM has proven dominant in efficiency in the mobile market, and to some extent in scalability and flexibility in the server market. Combine that with the fact that Windows will support ARM CPUs starting with Windows 8, and that Nvidia seems determined to make ARM CPUs mainstream, and you have something that could really shake the market up. If I'm not mistaken, their Kal-El CPU proved equal to or faster than a Core 2 Duo; granted, Kal-El had 4 cores and the Core 2 Duo was at 2 GHz, but still.

And lastly, IBM. By the time Intel and AMD stop adding cores, IBM will have been researching alternatives for years. IBM isn't going to let things stagnate in the tech world; if chips aren't gaining more cores, they will be using something that IBM is probably researching right now.
 

Lepton87

Platinum Member
Jul 28, 2009
Ares1214 said: If I'm not mistaken, their Kal-El CPU proved equal to or faster than a Core 2 Duo; granted, Kal-El had 4 cores and the Core 2 Duo was at 2 GHz, but still.

That was just marketing BS; they compared apples to oranges.
 

mosco

Senior member
Sep 24, 2002
More cores are great for systems that do a lot of unrelated processing simultaneously, but I have a feeling the consumer market will show less of a need for them. In many cases, consumer-oriented programs still don't take advantage of the threads and cores available on current machines.

That's why Apple developed Grand Central Dispatch (http://developer.apple.com/technologies/mac/snowleopard/gcd.html) and why they open-sourced it as libdispatch (http://libdispatch.macosforge.org/). There is room for multi-core optimization in the consumer space.
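For anyone who hasn't looked at GCD: the core idea is a runtime-managed thread pool that you submit small tasks to, instead of managing threads yourself. As a rough analogy only (not Apple's API), the same pattern using Python's standard library thread pool looks something like:

```python
# Sketch of the thread-pool / task-queue model that GCD (libdispatch)
# popularized -- this uses Python's stdlib as an analogy, not Apple's API.
from concurrent.futures import ThreadPoolExecutor

def process_chunk(chunk):
    # Stand-in for per-item work (e.g. filtering one image tile).
    return sum(x * x for x in chunk)

# Split 0..999 into ten chunks of 100 items each.
data = [list(range(i, i + 100)) for i in range(0, 1000, 100)]

# Submit each chunk to a shared pool; the runtime decides how many
# worker threads to use, much like GCD spreads queued tasks across cores.
with ThreadPoolExecutor() as pool:
    results = list(pool.map(process_chunk, data))

print(sum(results))
```

The point is that the application only describes independent units of work; how many cores actually run them is the runtime's problem, which is exactly the kind of thing consumer software needs before extra cores pay off.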
 

Rezist

Senior member
Jun 20, 2009
As much as people bring up IBM, they always seemed to me to be the loser that couldn't compete. The Cell processor was pretty much pointless for them; sure, they landed cores in the 360/PS3/Wii, but I was under the impression that was probably more due to their cheaper cost compared to anything AMD/Intel would have provided.
 

Cerb

Elite Member
Aug 26, 2000
Rezist said: As much as people bring up IBM, they always seemed to me to be the loser that couldn't compete. The Cell processor was pretty much pointless for them,
Pointless for most people, yes, but they had the good sense to kill off future versions and take what they learned from it. Some applications fly on the Cell, believe it or not. The large local memories, the DMA required to move data between those memories, no multitasking ability to speak of, and only moderate DRAM performance all combined to make a chip that most people have little desire for. Bits and pieces of it went into Power CPUs, though, and both its features and its lessons will show up in future console CPUs, too.
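To make the "local memories + DMA" point concrete: on the Cell's SPEs, the programmer, not a cache, had to stage data into a small fast local store, compute on it, and copy results back out. A loose illustrative sketch of that pattern (not real Cell SDK code, just the shape of it):

```python
# Illustrative sketch of the explicit "local store + DMA" model the
# Cell's SPEs used -- not real Cell SDK code, just the access pattern.
LOCAL_STORE_SIZE = 256  # pretend the fast local memory holds 256 items

def kernel(block):
    # Stand-in for the compute kernel run on the local copy.
    return [x * 2 for x in block]

def run_on_spe(main_memory):
    results = []
    # The programmer explicitly moves data in fixed-size chunks:
    # "DMA in" a block, compute on it, then "DMA out" the result.
    for start in range(0, len(main_memory), LOCAL_STORE_SIZE):
        local_store = main_memory[start:start + LOCAL_STORE_SIZE]  # DMA in
        results.extend(kernel(local_store))                        # compute
    return results                                                 # DMA out

print(run_on_spe(list(range(1000)))[:4])
```

Streaming workloads that fit this copy-compute-copy shape flew on Cell; everything else choked on it, which is exactly why most software wanted nothing to do with the chip.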
Rezist said: sure, they landed cores in the 360/PS3/Wii, but I was under the impression that was probably more due to their cheaper cost compared to anything AMD/Intel would have provided.
Well, yeah. It's not a very good business strategy to offer something very expensive and then try to convince people making a device that has to sell for a low price that your more expensive option should be the one they choose. IBM's real strength for the Wii and Xbox 360 (Sony screwed up with the PS3) is that they could make, for a fair price, a custom chip to meet the console maker's needs. In this area, Intel and AMD are a decade or more behind IBM.

Also, IBM rakes in dough like mad from R&D, and makes cross-licensing deals to keep from having to spend money on royalties all the time, too. They have completely forsaken the cut-throat world of consumer electronics, but they are still quite a force behind the scenes.