You seem to have been colonized by a marketing infection. Larrabee won't usher in a raytracing revolution. To succeed, it will have to be capable of running contemporary software, and Intel knows this.
To quote Intel-
"First, graphics that we have all come to know and love today, I have news for you. It's coming to an end. Our multi-decade old 3D graphics rendering architecture that's based on a rasterization approach is no longer scalable and suitable for the demands of the future."
Of course they are backpedaling now; they realized how profoundly moronic that claim was.
If Cell were a better GPU than what Intel would want to put in the PS4, it would also be better than RSX, and Sony would simply have put two Cells in the PS3.
The question is whether Cell in '11 or '12 would be better than Larrabee; by Intel's most optimistic claims it will be, and easily so. That still won't be close to good enough to get the job done, but we'll get back to that.
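To put rough numbers on that, the standard peak-FLOPS arithmetic is below. The PS3 figures are the known ones (7 usable SPEs, 4-wide SIMD with FMA at 3.2GHz); the 32-SPE, 4GHz successor is purely my own what-if for a '11/'12 part, not anything Sony or IBM has announced.

```python
# Peak single-precision throughput for a Cell-style SPE array:
# cores x clock x SIMD lanes x 2 (fused multiply-add per cycle).

def peak_gflops(cores, clock_ghz, simd_width, flops_per_lane=2):
    """Peak GFLOPS = cores * clock * lanes * flops per lane per cycle."""
    return cores * clock_ghz * simd_width * flops_per_lane

ps3_spes = peak_gflops(cores=7, clock_ghz=3.2, simd_width=4)    # known PS3 config
what_if  = peak_gflops(cores=32, clock_ghz=4.0, simd_width=4)   # hypothetical '11/'12 part

print(f"PS3 Cell (7 SPEs):        {ps3_spes:.0f} GFLOPS")       # ~179
print(f"hypothetical 32-SPE Cell: {what_if:.0f} GFLOPS")        # ~1024
```

Plug in whatever Larrabee core count and clock you believe and the same formula gives you Intel's side of the comparison.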
Where's your source for that?
At 90nm with 234 million transistors, Cell was 221mm² with a redundant unit to increase yields. They are now running 45nm (the second die shrink since launch) at 114.5mm², with a redundant unit still built in to improve yields.
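For reference, here's the back-of-envelope on those die figures. The 4x "ideal" factor is just the textbook square-of-the-linear-shrink approximation, not a measured number; the point is how far short of it the real shrink falls.

```python
# Ideal vs. actual area scaling for the Cell shrink quoted above.

area_90nm = 221.0   # mm^2 at 90nm, 234M transistors
area_45nm = 114.5   # mm^2 at 45nm, second shrink since launch

ideal_factor  = (90 / 45) ** 2          # 4.0x if scaling were perfect
ideal_area    = area_90nm / ideal_factor
actual_factor = area_90nm / area_45nm

print(f"ideal 45nm area:  {ideal_area:.1f} mm^2")    # ~55 mm^2
print(f"actual 45nm area: {area_45nm:.1f} mm^2")
print(f"realized shrink:  {actual_factor:.2f}x vs an ideal {ideal_factor:.0f}x")
```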
How is Sony still losing money on the PS3 with a 7-core 3.2GHz Cell?
Couldn't tell you. Can't figure out why it would be more expensive than the 360.
You sound like Ken Kutaragi. How's life in 4-D?
I'm assuming English isn't your first language then. To quote myself-
Sony went with nV because they realized that their goals weren't going to work
KK failed to come close to what he wanted to do with Cell, by a long shot. The difference is that he was at least far more honest in his approach than Intel is being. He made no sacrifices; his design was tailor-made in every aspect to be the most powerful processor of its type per transistor, per watt, or whatever metric you want to use. He wasn't going to hang on to the front-end costs of converting an archaic ISA to uOps, he wasn't going to saddle his hardware with resources for dealing with OoO code, and he wasn't going to limit his capabilities by spending die space, and quite a bit of it, on anything other than functional units or the cache to keep them fed. That was it.

Intel is trying to do the exact same thing KK was, only they are making a ton of sacrifices to do it, and somehow they are supposed to come to a different conclusion than KK did? As egomaniacal as KK was, he had to admit he was horribly wrong and lose face going to an American company at the 11th hour to save his design. Intel is starting to backpedal now, realizing they have no chance of doing what they were claiming at first. They should have learned from the last pompous, unemployed windbag.
The entire point of Larrabee in the PS4 is that Intel would give it away for free.
You see Intel giving away hundreds of millions of dollars in hardware? Let's say I think it more likely that they would name KK CEO of Intel before that happened.
Sounds very plausible to me... nVidia's greed is upsetting Sony,
nVidia only collects a licensing fee from Sony, nothing else. I am perplexed as to what possible 'greed' issues there could be. They agreed upon an IP cost per unit and Sony fabs the chips themselves; what issue could there be with nV at this point? This is where I see the Inq's logic descending to its normal level of idiocy: there seems to be no grounding at all for that initial assertion, and they then base a ton of speculation on that misplaced assertion.
-design it themselves (too expensive)
They would do another GS before going to Larrabee; the costs would be considerably lower, and it is hard to imagine Sony making anything slower than what Larrabee is shaping up to be.
And I would be willing to guess that the first Larrabee may have some fixed-function hardware as well. They did license PowerVR, after all. PowerVR was always very promising, but in the past it needed a DSP-like front end to do the T&L.
If the tiling rumor is true, it would certainly be the final nail in the coffin of the Larrabee-in-a-console rumor. Geometric complexity is simply too high now; the amount of RAM required to handle the tiling end wouldn't work in the console world. Look at what most developers on PS3 or 360 list as their biggest issue- it isn't computational power.
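To illustrate why geometry kills a tiler in a console, here's a crude sizing of the binned-geometry buffer a tile-based renderer has to hold before it can shade anything. Every number in it is an assumption I picked for illustration, not a figure from any real part.

```python
# Rough parameter-buffer sizing for a tile-based renderer.
# All inputs are illustrative assumptions, not measured values.

tris_per_frame   = 2_000_000  # assumed scene complexity for this generation
bytes_per_tri    = 32         # assumed post-transform data kept per triangle
avg_tile_overlap = 1.5        # assumed: triangles straddle tile boundaries

bin_bytes = tris_per_frame * bytes_per_tri * avg_tile_overlap
print(f"parameter buffer: ~{bin_bytes / 2**20:.0f} MB per frame")  # ~92 MB
```

Even with those charitable numbers you're eating a huge slice of a console's total RAM just to hold binned geometry, which is exactly the resource developers already say they're short on.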
I don't know the clocks for Larrabee, but that seems possible. I am guessing here: maybe an improved 45nm process running at 2.5GHz? Intel does have a lot of experience now with 45nm high-k.
In the timeframe we are talking about, you need to think on the order of probably 10TFLOPS being the norm for GPUs. This isn't about Intel's first futile offerings; they need something much, much stronger than that.
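For a sense of scale, here's what that target means in Larrabee terms. The 16-wide vector unit is from Intel's own public descriptions; the 2.5GHz clock is the guess from the post above, and the 2 flops per lane per cycle (FMA) is my assumption.

```python
# How many Larrabee-style cores does a 10 TFLOPS target imply?

target_tflops = 10.0
clock_ghz     = 2.5   # the guess quoted above
vector_lanes  = 16    # per Intel's public Larrabee descriptions
flops_lane    = 2     # assumed fused multiply-add

gflops_per_core = clock_ghz * vector_lanes * flops_lane    # 80 GFLOPS/core
cores_needed    = target_tflops * 1000 / gflops_per_core   # 125 cores

print(f"{gflops_per_core:.0f} GFLOPS/core -> ~{cores_needed:.0f} cores for {target_tflops:.0f} TFLOPS")
```

That kind of core count is exactly why Intel's first offerings won't be anywhere close.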