Larrabee comments from Intel

Idontcare

Elite Member
Oct 10, 1999
According to some preliminary data, the first Larrabee version will have 16 to 24 cores, each with a 32KB L1 cache. The shared L2 cache will be about 4-6MB. Individual IA cores will be connected via a ring bus like the one used in the Cell processor. The first Larrabee parts will be manufactured on a 45nm process, and the working frequencies of these processors are expected to be in the 1.7-2.5GHz range. The expected TDP should be around 150W; however, please keep in mind that these are all very preliminary specifications.
http://www.xbitlabs.com/news/v...e_Graphics_Market.html
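The ring-bus point is easy to picture with a toy model. This is purely an illustration of how a bidirectional ring scales (the `ring_hops` function is made up for this sketch, not anything from Intel's design): messages travel the shorter way around, so worst-case latency grows with core count as N/2 hops.

```python
def ring_hops(src: int, dst: int, n_stops: int) -> int:
    """Hop count between two stops on a bidirectional ring bus.

    Traffic takes whichever direction around the ring is shorter,
    so the worst case is n_stops // 2 hops.
    """
    forward = (dst - src) % n_stops
    return min(forward, n_stops - forward)

# With 24 cores on the ring, the farthest pair is 12 hops apart:
print(max(ring_hops(0, d, 24) for d in range(24)))  # 12
```

That worst-case distance growing linearly with core count is one reason ring interconnects get revisited as core counts climb.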

I can't tell for sure, but it sounds like they intend to put Larrabee in Havendale (Nehalem with integrated graphics)...did anyone else get that from the article?
 

CTho9305

Elite Member
Jul 26, 2000
The expected TDP should be around 150W, however, please keep in mind that these are all very preliminary specifications.
Nobody in their right mind would buy a chip that has 150W of GPU plus another few dozen watts of CPU. It'd be impractical to cool.

by 2009 integrated graphics will work 6 times faster, while by 2010 it will be 10 times faster than in 2006, when the performance standard was set by i965G chipset.
That's only relevant if integrated graphics (and system requirements) don't grow at the same rate.
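For what those quoted multipliers are worth, a quick back-of-envelope check (assuming the 2006 i965G baseline the article names) shows "6x by 2009" and "10x by 2010" imply nearly the same compound annual growth rate, roughly 1.8x per year:

```python
def implied_annual_growth(total_speedup: float, years: int) -> float:
    """Compound annual growth rate implied by a total speedup over N years."""
    return total_speedup ** (1.0 / years)

print(round(implied_annual_growth(6, 3), 2))   # 2006 -> 2009: ~1.82x/year
print(round(implied_annual_growth(10, 4), 2))  # 2006 -> 2010: ~1.78x/year
```

So the two claims are at least internally consistent; whether that pace keeps up with discrete GPUs and game requirements is the open question.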

This processor should allow game developers to use image-rendering techniques involving ray tracing.
I have a hard time believing game development houses would come up with two completely different render paths for the same game, so either Intel is playing up something just because they know it sounds good but is irrelevant, or everybody will be ray tracing just as well in that timeframe. Ray tracing was a "holy grail" for a long long time, but it doesn't solve many problems as well as the "cheats" that triangle pipelines used (the really good looking "ray traced" images all use radiosity and other techniques on top of a raytracer).
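To make the cost argument concrete: the core operation a ray tracer repeats for every pixel (and every bounce, shadow ray, etc.) is an intersection test. A minimal sketch of the classic ray-sphere test, purely illustrative and nothing to do with Intel's actual renderer:

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere hit, or None on a miss.

    Solves the quadratic |origin + t*direction - center|^2 = radius^2;
    `direction` must be unit-length.
    """
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

# A ray fired down +z from the origin hits a unit sphere at z=5 at t=4:
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

Multiply that by millions of pixels, many rays per pixel, and scenes far more complex than spheres, and it's clear why real-time ray tracing needs the kind of throughput Larrabee is pitching, and why triangle rasterization's "cheats" have been good enough for so long.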
 

Idontcare

Elite Member
Oct 10, 1999
Originally posted by: CTho9305
The expected TDP should be around 150W, however, please keep in mind that these are all very preliminary specifications.
Nobody in their right mind would buy a chip that has 150W of GPU plus another few dozen watts of CPU. It'd be impractical to cool.

I read that part to mean when they release a discrete GPU based on Larrabee it will be operating within a thermal envelope of 150W.

I don't think anyone at Intel would be unaware of the impracticality of cooling 150W+ beasts; NetBurst has been seared into the brains of all the relevant decision makers.

Originally posted by: CTho9305
by 2009 integrated graphics will work 6 times faster, while by 2010 it will be 10 times faster than in 2006, when the performance standard was set by i965G chipset.
That's only relevant if integrated graphics (and system requirements) don't grow at the same rate.

Completely agree. AMD's designs on Fusion, and even what they are doing now with the 780G (warning: INQ link), are far and away more aggressive than the plans Intel has spoken to. Which either means Intel is just sandbagging for whatever that gains them in the short term, or they really have no serious expectations of taking on Fusion with Larrabee.

Originally posted by: CTho9305
This processor should allow game developers to use image-rendering techniques involving ray tracing.
I have a hard time believing game development houses would come up with two completely different render paths for the same game, so either Intel is playing up something just because they know it sounds good but is irrelevant, or everybody will be ray tracing just as well in that timeframe. Ray tracing was a "holy grail" for a long long time, but it doesn't solve many problems as well as the "cheats" that triangle pipelines used (the really good looking "ray traced" images all use radiosity and other techniques on top of a raytracer).

This part has honestly baffled me since day 1 of this "real-time raytracing FTW" talk. I like the idea, it sounds cool, but supporting existing technology pathways (DX10, etc.) seems like it will require so much extra effort from game makers that going cold turkey to a ray-traced game engine will be way too much of a resource commitment for most game houses to jump onboard.

I mean, just look at where in-game physics went: nowhere, for all the same reasons migrating to real-time ray tracing is going to face an uphill battle. It requires hardware and software with no seamless transition pathway between the two.
 

BigMoosey74

Member
Dec 18, 2007
I have a hard time believing game development houses would come up with two completely different render paths for the same game, so either Intel is playing up something just because they know it sounds good but is irrelevant, or everybody will be ray tracing just as well in that timeframe. Ray tracing was a "holy grail" for a long long time, but it doesn't solve many problems as well as the "cheats" that triangle pipelines used (the really good looking "ray traced" images all use radiosity and other techniques on top of a raytracer).



Yeah, there is a huge bridge that Intel needs to cross with the ray-tracing caravan in order to bring it home. I definitely think Intel's path to perfecting ray tracing is worthwhile, but I agree with you there...there is a lot of hype on their behalf. To me, it feels like a lot of oversimplified BS is being thrown out there by Intel in order to gain momentum.


Who is the guy they recruited who did his PhD research on ray tracing? Didn't he pretty much have to overhaul Doom for his ray-tracing demos?
 

Kuzi

Senior member
Sep 16, 2007
Originally posted by: CTho9305
by 2009 integrated graphics will work 6 times faster, while by 2010 it will be 10 times faster than in 2006, when the performance standard was set by i965G chipset.
That's only relevant if integrated graphics (and system requirements) don't grow at the same rate.

These are actually very weak numbers to consider for 2009/2010. The new AMD 780G chipset already runs about 4-6 times faster than Intel's fastest chipset, the G35, draws less power, and does full HD video. You can check this nice review here.

Next year I'm sure AMD will integrate something more powerful into Fusion; add to that the fact that AMD has Hybrid CrossFire (see the review). Meaning once you add a dedicated graphics card, you will get extra performance from the integrated GPU.

They are going to introduce systems using an additional Larrabee processor that will enhance the potential of the integrated graphics cores.

So this means it will work in a similar way to Hybrid CrossFire, and Larrabee is supposed to be really fast for graphics. Let's wait and see, I guess.