Originally posted by: CTho9305
The expected TDP should be around 150W, however, please keep in mind that these are all very preliminary specifications.
Nobody in their right mind would buy a chip that has 150W of GPU plus another few dozen watts of CPU. It'd be impractical to cool.
I read that part to mean that when they release a discrete GPU based on Larrabee, it will operate within a 150W thermal envelope.
I don't think anyone at Intel would be unaware of the impracticality of cooling 150W+ beasts; Netburst has been seared into the brains of all the relevant decision makers.
Originally posted by: CTho9305
by 2009 integrated graphics will work 6 times faster, and by 2010 it will be 10 times faster than in 2006, when the performance standard was set by the i965G chipset.
That's only relevant if integrated graphics (and system requirements) don't grow at the same rate.
Completely agree. AMD's designs on Fusion, and even what they are doing now with the 780G (warning: INQ link), are far and away more aggressive than Intel's stated plans. Which either means Intel is sandbagging for whatever that gains them in the short term, or they really have no serious expectations of taking on Fusion with Larrabee.
Originally posted by: CTho9305
This processor should allow game developers to use image rendering techniques involving the ray-tracing method.
I have a hard time believing game development houses would come up with two completely different render paths for the same game, so either Intel is playing up something just because they know it sounds good but is irrelevant, or everybody will be ray tracing just as well in that timeframe. Ray tracing was a "holy grail" for a long long time, but it doesn't solve many problems as well as the "cheats" that triangle pipelines used (the really good looking "ray traced" images all use radiosity and other techniques on top of a raytracer).
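For anyone following along: the core idea that distinguishes ray tracing from the triangle pipeline is that each pixel's color comes from solving ray-geometry intersection equations rather than rasterizing projected triangles. A minimal sketch of that per-ray work, assuming the simplest possible primitive (a sphere; the helper name and the hard-coded scene are purely illustrative, not anything from Intel's Larrabee materials):

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    Solves |origin + t*direction - center|^2 = radius^2 for t,
    a quadratic in t; the smallest positive root is the visible hit.
    """
    oc = tuple(o - c for o, c in zip(origin, center))
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0 else None

# Ray from the origin straight down +z toward a unit sphere at z=5:
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # → 4.0
```

A real renderer runs millions of these intersection tests per frame against far more complex geometry, which is exactly why it has historically been offline-only, and why the post above is right that the impressive "ray traced" stills also layer radiosity and other global-illumination passes on top of the basic intersection math.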
This part has honestly baffled me since day 1 of this "real-time raytracing FTW" talk. I like the idea, it sounds cool, but it seems like it will require so much extra effort by game makers to support existing technology pathways (DX10, etc.) that going cold-turkey to a ray-tracer game engine will be way too much of a resource requirement for too many game houses to jump onboard.
I mean, just look where in-game physics went? Nowhere, for all the same reasons migrating to real-time raytracing is going to be an uphill battle. It requires hardware and software with no seamless transition pathway between the two.