Intel: Why a 1,000-core chip is feasible

RavenSEAL

Diamond Member
Jan 4, 2010
8,661
3
0
Everyone knows the technology is about a decade ahead (if not more); market demand and production costs just don't justify the mainstream sale of products on such a large scale.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,402
8,574
126
What is a 'core', exactly? Is each of the shader processors in a graphics chip a 'core'? What components can be shared between 'cores', and which have to belong to a 'core' for a 'core' to be a 'core'?
 

frostedflakes

Diamond Member
Mar 1, 2005
7,925
1
81
Of course it's feasible, we already have processors from AMD with 1600 cores. :p

One might say it's comparing apples to oranges, but I'd imagine a lot of the higher-level design challenges (for example, how do you group and most effectively manage such a large number of processing units?) are fundamentally very similar.

I also liked this:

Would the process of fabricating 1,000 cores present problems in itself?
I came up with that 1,000 number by playing a Moore's Law doubling game. If the integration capacity doubles with each generation and a generation is nominally two years, then in four or five doublings from today's 48 cores, we're at 1,000. So this is really a question of how long do we think our fabs can keep up with Moore's Law. If I've learned anything in my 17-plus years at Intel, it's never bet against our fabs.
:)
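(Running the doubling game as a quick sketch of my own - not from the article - confirms the ballpark: four doublings from 48 gives 768 and five gives 1,536, so ~1,000 cores lands between the fourth and fifth generation.)

Code:
/* Moore's Law doubling game: start at 48 cores, double every ~2-year generation. */
#include <stdio.h>

int main(void)
{
    int cores = 48;
    for (int gen = 1; gen <= 5; gen++) {
        cores *= 2;
        printf("after %d doubling(s): %d cores\n", gen, cores);
    }
    /* Prints 96, 192, 384, 768, 1536 -- roughly 1,000 cores falls
     * between the fourth and fifth doubling (about 8-10 years out). */
    return 0;
}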
 

Voo

Golden Member
Feb 27, 2009
1,684
0
76
Of course it's feasible, we already have processors from AMD with 1600 cores. :p

One might say it's comparing apples to oranges, but
But GPUs don't have to worry about cache coherency protocols and the like, which makes scaling them almost trivial compared to modern CPUs.

And from reading that article, their current approach seems to be to give up cache coherency and use message passing. Now consider how well current multithreaded applications on SMPs with cache coherency work, and then think about how that'll turn out if the complexity factor doubles.
Certainly doable (there are enough HPC programs out there that prove that - message passing is nothing new), but I think that's a long way from being suitable for mainstream programmers.
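(To make the contrast concrete, here is a minimal message-passing sketch in the MPI style used by those HPC programs - my own illustration, not anything from the article. Each rank owns its data, and nothing moves except through an explicit send/receive, so there is no coherency protocol to scale:)

Code:
/* Minimal MPI sketch: rank 0 sends one value to rank 1.
 * No shared memory and no cache coherency traffic -- data moves
 * only through explicit messages between ranks.
 * Build with mpicc and run with e.g. mpirun -np 2 ./a.out */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    if (size >= 2) {
        if (rank == 0) {
            int payload = 42;
            MPI_Send(&payload, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            int payload;
            MPI_Recv(&payload, 1, MPI_INT, 0, 0, MPI_COMM_WORLD,
                     MPI_STATUS_IGNORE);
            printf("rank 1 got %d from rank 0\n", payload);
        }
    }

    MPI_Finalize();
    return 0;
}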
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,087
3,593
126
Everyone knows the technology is about a decade ahead (if not more); market demand and production costs just don't justify the mainstream sale of products on such a large scale.

a decade?

I'd say more if we look at military tech.

This was from a military computer.
The liquid element used was gallium... for desert operations.

GALLIUM!!! :biggrin:

And it dates quite a while back.
[image: allgallium4yz.jpg]


LOL.... we will never see this on the consumer end...
But dude.... I wouldn't doubt our military.
 

JFAMD

Senior member
May 16, 2009
565
0
0
Of course it's feasible, we already have processors from AMD with 1600 cores. :p

One might say it's comparing apples to oranges, but I'd imagine a lot of the higher-level design challenges (for example, how do you group and most effectively manage such a large number of processing units?) are fundamentally very similar.

I also liked this:


:)

That was a weak quote. The challenge in getting to 1,000 cores isn't a fab issue so much as a design and architecture issue, or even a programming issue.

If you can design it within the constructs of existing tools and processes, it should not be an issue. Getting a crossbar to talk to 1,000 cores, or trying to deal with cache coherency, is the real technical problem. Fabs just build whatever has been designed.

The world will be there eventually. And someone will still complain about single threaded performance.
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,087
3,593
126
The world will be there eventually. And someone will still complain about single threaded performance.

ROFL... this is almost sig worthy... :biggrin:
 
Nov 26, 2005
15,194
403
126
That was a weak quote. The challenge in getting to 1,000 cores isn't a fab issue so much as a design and architecture issue, or even a programming issue.

If you can design it within the constructs of existing tools and processes, it should not be an issue. Getting a crossbar to talk to 1,000 cores, or trying to deal with cache coherency, is the real technical problem. Fabs just build whatever has been designed.

The world will be there eventually. And someone will still complain about single threaded performance.


LOL so true!!! Yes Aigo, very sig worthy!
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Intel and AMD are about to get trumped in the CPU department. Windows 7 will be coming to the ARM platform next year, and that platform doesn't have the problems that x86 does with threading. I wonder how Intel feels about selling it off years ago?
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
What is a 'core', exactly? Is each of the shader processors in a graphics chip a 'core'? What components can be shared between 'cores', and which have to belong to a 'core' for a 'core' to be a 'core'?

A core is a unit of computational resources that is incapable of hardware integration, resource sharing, and load balancing with other similar units, such that software must be written to distribute computation between them in a rather inefficient manner, as if they were on completely separate dies.

Each CPU core contains multiple ALUs (arithmetic logic units) and other computational resources which it can use intelligently; it is far easier to just create several such "cores" than to expand a single core's capacity for intelligently shared resources.

GPU "cores" are a marketing gimmick that incorrectly borrows from the CPU trend of multi-core chips; the term is used to refer to individual ALUs that are all part of a single core. Meanwhile, modern CPUs are reintegrating their multiple cores so that they intelligently share resources to a greater degree with each iteration, up to a rather impressive amount of integration in Sandy Bridge, to the point where their definition as individual cores begins to get muddy.
 
Last edited:

Nintendesert

Diamond Member
Mar 28, 2010
7,761
5
0
a decade?

I'd say more if we look at military tech.

This was from a military computer.
The liquid element used was gallium... for desert operations.

GALLIUM!!! :biggrin:

And it dates quite a while back.
[image: allgallium4yz.jpg]


LOL.... we will never see this on the consumer end...
But dude.... I wouldn't doubt our military.



Yes, fear our Dells!
 

JFAMD

Senior member
May 16, 2009
565
0
0
Intel and AMD are about to get trumped in the CPU department. Windows 7 will be coming to the ARM platform next year, and that platform doesn't have the problems that x86 does with threading. I wonder how Intel feels about selling it off years ago?

I'll take the under on ARM performance on Windows. I'll also take the under on ARM performance on Windows 7, since all of the rumors are about ARM support in Windows 8. And I'll take the under on ARM application and feature support under Windows 8.

I can see Microsoft providing support for ARM because tablets and smartphones are growing quickly, and a scaled-back version of Windows 8 could possibly run on an ARM chip. But the client world is not about to jump to ARM as a replacement for current form factors. I would bet your "trumped" prediction is a bit overstated.

When you can get a dual-core 2.8GHz Athlon II for $42 at Newegg today, I see few people jumping off the bandwagon and onto ARM given what they would be giving up. Keep in mind that a version of Windows that runs on ARM will probably not be binary compatible with x86 Windows (much like the Alpha and Itanium versions). That might be acceptable for things like tablets, but it is going to leave the platform very limited.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Intel and AMD are about to get trumped in the CPU department. Windows 7 will be coming to the ARM platform next year, and that platform doesn't have the problems that x86 does with threading. I wonder how Intel feels about selling it off years ago?

Yep, because the eleven gazillion apps that run on x86 Windows will magically become ARM compatible.
o_O
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
a decade?

I'd say more if we look at military tech.

This was from a military computer.
The liquid element used was gallium... for desert operations.

GALLIUM!!! :biggrin:

And it dates quite a while back.
[image: allgallium4yz.jpg]


LOL.... we will never see this on the consumer end...
But dude.... I wouldn't doubt our military.

This got me curious. Funny, I couldn't find "computer coolant" as one of the uses of gallium on Wikipedia: http://en.wikipedia.org/wiki/Gallium

However, I did find this gem:

In a classic prank, scientists fashion gallium spoons and serve tea to unsuspecting guests. The spoons melt in the hot tea.

:):)
 
Last edited:

Voo

Golden Member
Feb 27, 2009
1,684
0
76
Yep, because the eleven gazillion apps that run on x86 Windows will magically become ARM compatible.
o_O
Every well-written app that doesn't need its own drivers (and that should be the minority) would just have to be recompiled, provided MS ports all the libraries (will they? I'm not so sure that's the best approach there). That doesn't mean we'll see thousands of ARM apps, because I assume there's more to it than just the technical aspects, but it really shouldn't be too hard.

Modelworks said:
and that platform doesn't have the problems that x86 does with threading
Huh, what? ARM has more or less the same "problems" as x86 when it comes to threading (and the problems of scaling to many cores apply to it just as well), since those are pretty architecture independent (sure, locks, atomic regions and the like can be implemented differently, but for the people using those libraries it's all the same). Care to elaborate?
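(For what it's worth, the threading code itself is identical on both; here's a minimal pthreads sketch - my own example, not Voo's - that compiles unchanged for x86 and ARM, with only the compiler-generated atomic and barrier instructions differing underneath:)

Code:
/* Minimal pthreads sketch: the same source builds for x86 and ARM;
 * the lock/unlock API is identical, only the generated instructions differ.
 * Build with: cc -pthread counter.c */
#include <pthread.h>
#include <stdio.h>

static long counter = 0;
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

static void *worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < 100000; i++) {
        pthread_mutex_lock(&lock);   /* same call on every architecture */
        counter++;
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void)
{
    pthread_t threads[4];
    for (int i = 0; i < 4; i++)
        pthread_create(&threads[i], NULL, worker, NULL);
    for (int i = 0; i < 4; i++)
        pthread_join(threads[i], NULL);
    printf("counter = %ld\n", counter);   /* always 400000 */
    return 0;
}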

JFAMD said:
The world will be there eventually. And someone will still complain about single threaded performance.
If I had to decide between 200 cores with twice the IPC or 1,000 cores for my own desktop, I'd always take the first option. And even for servers, throughput alone isn't everything (hi, Google paper).