Intel's answer to Fusion

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Inquirer Article

It's called Larrabee:

"the target is 16 cores in the early 2009 time frame, but that is not a fixed number. Due to the architecture, that can go down in an ATI x900/x600/x300 fashion, maybe 16/8/4 cores respectively, but technically speaking it can also go up by quite a bit.

What are those cores? They are not GPUs, they are x86 'mini-cores', basically small dumb in order cores with a staggeringly short pipeline. They also have four threads per core, so a total of 64 threads per "CGPU". To make this work as a GPU, you need instructions, vector instructions, so there is a hugely wide vector unit strapped on to it. The instruction set, an x86 extension for those paying attention, will have a lot of the functionality of a GPU.

What you end up with is a ton of threads running a super-wide vector unit with the controls in x86. You use the same tools to program the GPU as you do the CPU, using the same mnemonics, and the same everything. It also makes things a snap to use the GPU as an extension to the main CPU"
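
To get a feel for what "same tools, same mnemonics" would mean in practice, here's a rough C sketch. The article doesn't give Larrabee's actual vector width or instruction names, so this just uses today's 128-bit SSE intrinsics as a stand-in: ordinary x86 code, built with an ordinary compiler, doing a graphics-style job (scaling a block of pixel color values by a brightness factor).

/* Illustration only: SSE as a stand-in for whatever wide vector
 * extension Larrabee ends up exposing. Assumes count is a multiple of 4. */
#include <xmmintrin.h>  /* SSE intrinsics */

void scale_colors(float *colors, int count, float brightness)
{
    __m128 scale = _mm_set1_ps(brightness);   /* broadcast factor to 4 lanes */
    for (int i = 0; i < count; i += 4) {
        __m128 c = _mm_loadu_ps(&colors[i]);  /* load 4 floats */
        c = _mm_mul_ps(c, scale);             /* multiply all 4 at once */
        _mm_storeu_ps(&colors[i], c);         /* store back */
    }
}

The point being: if the "GPU" is just more x86 cores with wider vectors, this kind of code is all you'd write, no separate shader language or driver stack in between.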

"In any case, the whole idea of a GPU as a separate chip is a thing of the past. The first step is a GPU on a CPU like AMD's Fusion, but this is transitional. Both sides will pull the functionality into the core itself, and GPUs will cease to be. Now do you see why Nvidia is dead?"
 

crydee

Member
Jun 2, 2006
194
0
0
I don't think the GPU as a separate chip is ever going to be a thing of the past for gamers.
 

judasmachine

Diamond Member
Sep 15, 2002
8,515
3
81
crydee, you could be right. However, think of it this way: they make an entry-level, mainstream, performance, and nutcase-level CPU that has the graphics embedded in some of its many cores. That way you still get to pick your level of performance without having to buy a separate card. Kind of like the 16/8/4 split mentioned above, you could buy 4/8/12/16(?) core chips.

BTW, I actually have a poor understanding of how the technical end of this stuff works.