Larrabee graphics chip delayed, launching only as 'kit'

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
If Intel is selling this thing as a development kit, they won't be competing in any meaningful way in the HPC market.

You may be correct. I'm not sure, but this development is surprising when you look at the total picture. So I believe the news, but I'm of the mind there is another important reason for this. It may be that Intel failed at this attempt, but when it comes to Intel/ATI my glass is always half full. It is possible Intel has made a breakthrough that put the present Larrabee on software-development-only duty, which is a good idea to have anyway. So who knows what Intel is up to; this may be a breakthrough, and not a failure, that led to this decision. 2010 is going to be a wonderful year. I have waited long. May the chips fall where they will.

It is possible, and it's really not reaching far at all. I am reasonably sure that Intel knows everything there is to know about Fermi and its problems. Intel likely decided to pass on 45nm and go to 32nm, which we all understand is possible for Intel.

Who knows how late Fermi will be. Maybe Intel has a good idea of that problem and knows the timeframe, so they said let's dump the prime part and do 32nm, while using 45nm for software development. Now if we see Fermi in Feb/March, who knows? But if Fermi pushes into May/June, as I believe it will, then Intel made a great decision to pull 45nm and go to the next gen. It's all in how you want to read the news. ALL the news.
 
Last edited:

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Actually dude, the "original" release timeframe was supposed to be in 2008. Says so right in the article.

"Larrabee, a chronically delayed chip, was originally expected to appear in 2008. It was slated to compete with discrete graphics chips from Nvidia and Advanced Micro Devices' ATI graphics unit."

So what is it we need to know?



Yes, disappointing. Would have been nice to see a third player. But it's no big change from the last decade or so. NV and ATI.


Actually I was looking for an official company statement different from the 2009/2010 period. Show me the link.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
From the same article:
"Justin Rattner (Intel Senior Fellow) demonstrated Larrabee hitting one teraflop, which is great but you could walk across the street and buy an ATI graphics board for a few hundred dollars that would do five teraflops." A teraflop is 1 trillion floating point operations per second, a key indicator of graphics chip performance.

What AMD SKU does 5 TFlops for $300?
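
For what it's worth, peak single-precision throughput is just shaders x flops-per-clock x clock, so the claim is easy to sanity-check. A quick sketch (the HD 5870 figures below are the commonly cited specs, not anything from the article, so treat them as illustrative):

```python
# Rough sanity check of the article's "five teraflops for a few hundred
# dollars" line, using peak throughput = shaders x flops-per-clock x clock.
# HD 5870: 1600 stream processors, 850 MHz, ~$379 at launch (commonly cited).

def peak_tflops(shaders, flops_per_clock, clock_ghz):
    """Theoretical peak single-precision throughput in TFLOPS."""
    return shaders * flops_per_clock * clock_ghz / 1000.0

print(f"HD 5870 peak: {peak_tflops(1600, 2, 0.85):.2f} TFLOPS")  # ~2.72, not 5
```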
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
If we believe the numbers from Intel's Siggraph 2008 paper, we need approximately 16 Larrabee cores running at 1GHz frequency to achieve 60 fps on FEAR at 1600x1200x4AA.

If they used nVidia's IP, that isn't going to be an option anymore; and given how well they were running Quake Wars, it also seems like they are off by an order of magnitude on their estimates, even using IP that they are losing the rights to.
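
For context, here's a back-of-envelope peak for that 16-core configuration, assuming the 16-wide vector unit with fused multiply-add described in the Siggraph paper (my assumption, and a theoretical number, not a benchmark):

```python
# Theoretical peak for the Siggraph 2008 configuration: 16 cores at 1 GHz,
# each with a 16-wide vector unit, assuming fused multiply-add
# (2 flops per lane per clock). A sketch, not a measured result.
cores, simd_width, flops_per_lane_clock, clock_ghz = 16, 16, 2, 1.0
peak_gflops = cores * simd_width * flops_per_lane_clock * clock_ghz
print(f"Peak: {peak_gflops:.0f} GFLOPS")  # 512 GFLOPS for that config
```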
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
I have always maintained that Larrabee was not gonna be anything special, for us gamers anyway. They just don't seem to do well at some things. They waste more $ on graphics R&D than AMD spends on R&D in total. If they want to be a player in graphics, they'll need to buy Nvidia.

doubtful if jhh offers them the same deal as last time...
 

DrMrLordX

Lifer
Apr 27, 2000
22,967
13,062
136
The thing about Larrabee that confuses me is that, despite any estimated performance deficit it might have had at launch versus offerings from AMD and Nvidia, it was a massively-parallel x86 co-processor device (more or less). Who cares what it would have been like as a graphics card? That would just have been the "Trojan horse" to get Larrabee products, presumably at rock-bottom prices, into as many OEM machines/netbooks/laptops as possible to make consumer-level GPGPU app support desirable to developers. Larrabee's appeal to the HPC crowd is obvious, but what about just dumping existing threads intended for the CPU onto Larrabee cores and/or aggressively multithreading existing apps for said purpose? Vector instruction set be damned, we're talking about a "video card" that seemed an awful lot like sixteen 1-2 GHz Atoms (minus SMT) crammed onto one PCI-e device.

Getting the OS to recognize the Larrabee card for what it was and getting the scheduler to assign threads to its cores would be all it would take . . . surely that would have been useful for SOMETHING. If those pudwhackers behind Killer NIC could get that silly thing to market, when all it did was take some of the TCP/IP overhead off the CPU (more or less), why couldn't Intel have just sold Larrabee as a multi-threaded helper card?

It wasn't supposed to be about beating AMD or Nvidia, it was supposed to be about getting the hardware out there so that devs could tinker with it and find useful little things to do with it. Or, at least, that was the way I was looking at it. Larrabee-accelerated Flash alone would have been awesome for budget machines that sometimes struggle with Flash ad-laden sites.

Since all the cores were essentially in-order x86 CPUs, devs SHOULD have had a much easier time developing software to utilize Larrabee cards than Nvidia and/or ATI video/HPC cards. Was this all down to a driver/OS issue? Just how much trouble was Intel having making Larrabee's cores available to thread schedulers? Were they even trying to do that? PCI-e bandwidth/latency issues might also have been a problem, so maybe Larrabee wouldn't have worked so well as a co-processor board for common x86 apps in that implementation, but give it a QPI link and solder it onto the board somewhere . . .
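
To make the scheduler idea concrete, here's a purely hypothetical sketch, assuming the OS had exposed Larrabee's in-order cores as ordinary logical CPUs (which it never did). The affinity call is a real Linux API; the core numbering and workload names are invented:

```python
# Hypothetical sketch of the "multi-threaded helper card" idea: if the OS
# exposed the card's 16 in-order cores as ordinary logical CPUs, existing
# affinity APIs would be most of what a scheduler-level approach needs.
# os.sched_setaffinity is a real Linux call; everything else is made up.
import os
import threading

LARRABEE_CORES = set(range(4, 20))  # hypothetical: card cores seen as CPUs 4-19

def run_on_larrabee(work, *args):
    """Run `work` on a thread pinned to the (hypothetical) Larrabee cores."""
    def pinned():
        os.sched_setaffinity(0, LARRABEE_CORES)  # pid 0 = the calling thread
        work(*args)
    t = threading.Thread(target=pinned)
    t.start()
    return t

# e.g. run_on_larrabee(flash_decode_job, frame_buffer)  # names invented
```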
 
Last edited:

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
doubtful if jhh offers them the same deal as last time...
Even if Jen-Hsun Huang is a little more down-to-earth in dealing with Intel (since Intel is not just an "AMD"), it is still doubtful, because the US government would probably not like it so much. Intel leads the overall GPU market thanks to IGP, and nVidia is the market leader in discrete GPUs. Merging the two would probably paint an anticompetitive picture to the government.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
The thing about Larrabee that confuses me is that, despite any estimated performance deficit it might have had at launch versus offerings from AMD and Nvidia, it was a massively-parallel x86 co-processor device (more or less). Who cares what it would have been like as a graphics card? That would just have been the "Trojan horse" to get Larrabee products, presumably at rock-bottom prices, into as many OEM machines/netbooks/laptops as possible to make consumer-level GPGPU app support desirable to developers. Larrabee's appeal to the HPC crowd is obvious, but what about just dumping existing threads intended for the CPU onto Larrabee cores and/or aggressively multithreading existing apps for said purpose? Vector instruction set be damned, we're talking about a "video card" that seemed an awful lot like sixteen 1-2 GHz Atoms (minus SMT) crammed onto one PCI-e device.

*snip*

Remember, with Intel the decision makers operate under an additional gross-margin minimum requirement that is above and beyond basically everyone else's in the industry.

It's not good enough for Intel to release Larrabee if the gross margins are the same as Nvidia's and AMD's. They just don't do that: the shareholders they care about (institutional holders) would get brutalized by the day-traders and speculators dumping their smaller percentage of the float onto the market on any rumor of gross-margin erosion, especially if gross margins are seen as trending below 50%.

Killer NIC guys could be selling those things at cost, just making enough to pay their own salaries and the electricity bill and that is good enough to be a viable business/product strategy. Intel's decision makers aren't operating with that luxury.

Whenever Larrabee does come to the consumer space, it won't be until its production cost structure (largely node-dependent) is sufficiently below the competition's (might not be till 22nm), or until the performance is finally high enough that the ASP can be large enough to drive a net gross margin that approaches or exceeds 50%.

If Larrabee were an AMD or Nvidia project, then the decision makers could have gotten away with pulling the trigger and releasing product when the gross margins were as low as 15% or thereabouts. They might have held off until margins were as high as 25%, but only if they were concerned about cannibalizing their other, slightly-higher gross-margin products at the time.
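
To put rough numbers on the dilution argument (every figure below is invented purely for illustration):

```python
# How a lower-margin graphics line drags a ~55% corporate gross margin
# toward the 50% mark. All numbers are hypothetical.
def gross_margin(revenue, cogs):
    return (revenue - cogs) / revenue

cpu_rev, cpu_cogs = 9_000, 4_000   # core business at ~55.6% GM (hypothetical)
lrb_rev, lrb_cogs = 1_000, 800     # a 20% GM graphics line (hypothetical)

print(f"CPU business alone: {gross_margin(cpu_rev, cpu_cogs):.1%}")  # 55.6%
print(f"Blended: {gross_margin(cpu_rev + lrb_rev, cpu_cogs + lrb_cogs):.1%}")  # 52.0%
```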
 

jrphoenix

Golden Member
Feb 29, 2004
1,295
2
81
Intel will, Intel must, be back in the graphics space and HPC space in some form. I think their future depends on it. So they can't afford to sit and watch AMD and NV move ahead. Too much at stake here. A lot of R&D already invested. I'm thinking they'll be back in 2011.

BTW, why do some of you "consumers" in here, who always say you wanna see good competition between the vid card makers, seem to be a little giddily happy over this news like a virgin getting her first?! Personally I think this is a devastating blow to consumers.

I agree they must. I think they are thinking an acquisition will be less expensive now than creating a chip from the ground up. If Fermi slips more... possibly time for Intel to use their cash??? Is that possible?
 

jvroig

Platinum Member
Nov 4, 2009
2,394
1
81
Remember with Intel their decision makers operate with an additional gross-margins minimum requirement that is above-and-beyond basically everyone else in the industry.
Not arguing with you, but where can I find this information? And for that matter, AMD's and nVidia's as well?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
jvroig, you have to follow the industry from a financial point of view for a while (5+ yrs, I guesstimate) to accumulate that perspective; I can't really just link to it and transfer the experience that is the basis of the insight. Not trying to be lazy or guarded or anything, I wish there was a resource that clearly laid out the situation but I don't know of any...guys like BenSkywalker and V8envy would be your best bet for someone having a set of links handy to post.

Intel walked down this path twice before in recent times. Both cases were very near and dear to my employer at the time (Texas Instruments) and involved efforts by Intel to get into the mobile phone and HDTV businesses. The parallels between those two failed ventures and Larrabee are pretty remarkable; the lessons of history don't stick for some folks, I guess.

Both the mobile phone and HDTV markets were sub-50% gross-margin markets at the time Intel made public their intentions to jump into them. A year or two later, if anything, gross margins in both had declined. TI wasn't held to the standard of hitting 50% GM to support their P/E ratio, so it was OK for our decision makers to continue to stay in that business segment (that has since changed though). Intel did not have that luxury. So when push came to shove and the choices were (1) lower prices to gain entry into the marketspace at the expense of having lower margins, or (2) abandon the effort as long-term gross-margin horizons were not going to stabilize above 50% with any degree of confidence...Intel chose to abandon both markets. They sold the mobile phone business to Marvell for $600M, and simply closed shop on the HDTV efforts.

Here's a post from long ago that contains some relevant links (if the links even still work): http://forums.anandtech.com/showpost.php?p=25630931&postcount=61

Originally posted by: Viditor
Originally posted by: Idontcare


And remember that AMD is beholden to its shareholders to merely stop losing money, Intel is beholden to not let gross margins erode below 55%. Desktop Nehalems won't be cheaper to produce than desktop wolfdales/yorkfields (they will be more expensive to manufacture). Further eroding my confidence in their desktop timeline being anything like the server timeline at all.

Just as a nitpick, Intel's GM is currently 53.8%...

I don't think that's a nitpick...that's actually my point. Intel stock takes a hit when the shareholders don't get their expected 55% or higher GM. You can bet they'd be priced about $2/share higher right now were their GM's in the 55-60 zone instead of the 50-55 zone.

From 2007: Intel: Gross Margins Aren't Heading Any Higher This Year

From 2008: Intel gross margin becomes NAND's latest victim

Back when Intel announced they were going to compete with TI over HDTV technology (CES 2004), we didn't even break a sweat of worry over it, because we knew what the gross margins were like in that industry (DLP) and we knew that as soon as the right people at Intel got a dose of the GM reality they'd be jumping out of that market segment like it was Netburst part deux.

Intel shows off giant screens
Intel delays first TV chip
Intel kills TV chip plans

What impressed me about Intel's foray into HDTV was that it seemed like someone didn't do the easy/obvious homework before they sunk a bunch of money into productizing the thing...in the end, coming to realize the "ROI" wasn't there is a silly thing to have as your hindsight and not your foresight.

Back to gross margins...Intel's executive team must defend any business action that takes Intel toward a product mix that reduces GMs below 55%. They divested themselves of their mobile phone ambitions for identical reasons: to gain market share would have required them to downgrade ASPs to an unacceptable (to Intel) gross-margin outcome.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Their TV and smartphone attempts aren't exactly dead though. They want the smartphone market with Moorestown, and TV chips with Sodaville. Well, actually Sodaville is more for set-top boxes, but still.

Their problem in graphics is they don't stick with the same basic architecture long enough for drivers to mature. Maybe fewer-core versions of Larrabee should have been used as a replacement for the X3000/X4500 IGP, just for development purposes.
 

DrMrLordX

Lifer
Apr 27, 2000
22,967
13,062
136
Remember with Intel their decision makers operate with an additional gross-margins minimum requirement that is above-and-beyond basically everyone else in the industry.

Its not good enough for Intel to release Larrabee if the gross-margins are the same as Nvidia's and AMD's. They just don't do that, the shareholders they care about (institutional holders) will get brutalized by the day-traders and speculators as they dump their smaller percentage of the float onto the market on any rumors of gross margins erosion. Especially if gross margins are seen as trending below 50%.

An interesting point, though I should refer you to Anand's reaction to the Larrabee cancellation (http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3686). Of primary interest:

It's not a huge financial loss to Intel. Intel still made tons of money all the while Larrabee's development was underway. Its 45nm fabs are old news and paid off. Intel wasn't going to make a lot of money off of Larrabee had it sold them on the market, definitely not enough to recoup the R&D investment, and as I just mentioned using Larrabee sales to pay off the fabs isn't necessary either. Financially it's not a problem, yet. If Larrabee never makes it to market, or fails to eventually be competitive, then it's a bigger problem. If heterogenous multicore is the future of desktop and mobile CPUs, Larrabee needs to succeed otherwise Intel's future will be in jeopardy. It's far too early to tell if that's worth worrying about.

Maybe Anand's analysis isn't spot-on here, but the gist of what he's saying seems to be that Intel had no real need to include R&D costs and fab costs in Larrabee's cost of production. Sure, they would have had to lowball to get the 32-core Larrabee Prime (or 16-core or whatever) onto shelves as a discrete GPU, but their profit margin would have been based on material costs alone. Intel is paying off fab costs and R&D costs anyway, so why hold that against Larrabee in profit-margin calculations?

Seen in that light, all Intel would have to do is set the retail price to twice the cost of raw production and . . . there's your 50%. How many Larrabees could they get off a wafer in a 45nm fab of theirs?

At 600mm^2 it's huge, but still . . .
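
As an aside, the "twice the cost of raw production" rule above is just the gross-margin definition rearranged; a trivial check:

```python
# GM = (price - cost) / price, so pricing at 2x cost yields exactly 50%.
cost = 100.0                    # hypothetical per-card raw production cost
price = 2 * cost
print((price - cost) / price)   # 0.5
```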
 

DrMrLordX

Lifer
Apr 27, 2000
22,967
13,062
136
700mm^2? Let's hope not. Not that it matters at this point, I suppose, but still . . . makes the prospect of an economic Larrabee at 32nm less likely.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
An interesting point, though I should refer you to Anand's reaction to the Larrabee cancellation (http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=3686). Of primary interest:



Maybe Anand's analysis isn't spot-on here, but the gist of what he's saying seems to be that Intel had no real need to include R&D costs and fab costs in Larrabee's cost of production. Sure, they would have had to lowball to get the 32-core Larrabee Prime (or 16-core or whatever) onto shelves as a discrete GPU, but their profit margin would have been based on material costs alone. Intel is paying off fab costs and R&D costs anyway, so why hold that against Larrabee in profit-margin calculations?

Seen in that light, all Intel would have to do is set the retail price to twice the cost of raw production and . . . there's your 50%. How many Larrabees could they get off a wafer in a 45nm fab of theirs?

At 600mm2 it's huge, but still . . .

Gross margins do not include R&D expenses, and neither does PFO.

Look at AMD's and Nvidia's gross margins on graphics products; if it were as simple as you contend, then their GM situation would beg further inquiry.

For Intel (and everyone else) the gross margins will include software/driver development/maintenance costs (pro-rated against the volume of Larrabee chips, of course) plus shipping/distribution. Plus the GPU alone isn't what makes a graphics card: someone has to assemble the card (PCB, power components, memory), package and ship it, plus cover warranty and support.

Do you think Intel's gross margins for their CPU's entail nothing more than merely doubling the silicon production cost to arrive at the retail price?
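
To extend the earlier arithmetic with the cost items just listed: doubling only the silicon cost gets nowhere near 50% once the rest of the card's COGS is counted. All numbers below are invented for illustration:

```python
# Fold the other COGS items named above into the "double the silicon cost"
# pricing. Every figure is hypothetical.
silicon = 100.0
board = 60.0        # PCB, power components, memory, assembly (hypothetical)
support = 20.0      # pro-rated drivers, shipping, warranty (hypothetical)

price = 2 * silicon                         # the naive doubling rule
cogs = silicon + board + support
print(f"GM: {(price - cogs) / price:.0%}")  # 10%, not 50%
```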
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Why? What did he base that figure on?

It was estimated to be something like 680mm^2 or thereabouts, based on some eyeballing of a 300mm wafer containing the Larrabee "extreme version" that was held up and shown by Gelsinger (or was it Rattner?) at an IDF a year ago.

Not to be confused with the hundreds of threads spawned nearly simultaneously from the same presentation, all claiming to have a photo of a different wafer which was mistaken as being the Larrabee wafer (that wafer/die turned out to be Jasper Forest).

So just beware that if you go digging you will come across threads claiming to have photos of a Larrabee wafer (they don't), discussing a die size that actually is about right based on eyewitness accounts from folks who were in the audience and saw both wafers.

For whatever reason, though, no one has ever publicized (to my knowledge) a photo of the Larrabee wafer itself.

Where the wheels fell off the speculation train, though, is that no one could come to an agreement on what the "extreme" comments meant in terms of core count. Was it 680mm^2 for a 32-core Larrabee? 48-core? 64-core?

With too many unknowns and too few knowns it was pretty much worthless to know that some version of Larrabee was nearly 700mm^2.
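
For perspective, here's the standard dies-per-wafer approximation applied to a ~680mm^2 die on a 300mm wafer (gross die candidates only, before yield or reticle limits):

```python
# Gross dies per wafer ~= pi*(d/2)^2/A - pi*d/sqrt(2*A), the usual
# approximation that accounts for edge losses; yield not included.
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    d, a = wafer_diameter_mm, die_area_mm2
    return math.pi * (d / 2) ** 2 / a - math.pi * d / math.sqrt(2 * a)

print(f"{dies_per_wafer(300, 680):.0f} gross candidates")  # ~78 per 300mm wafer
```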