Larrabee graphics chip delayed, launching only as 'kit'


DrMrLordX

Lifer
Apr 27, 2000
21,637
10,855
136
Gross margins do not include R&D expenses; neither does PFO.

They don't? That's . . . interesting. Learn something new every day.

Look at AMD's and Nvidia's gross margins on graphics products; if it were as simple as you contend, their GM situation would beg further inquiry.

For Intel (and everyone else), the gross margins will include software/driver development/maintenance costs (pro-rated against the volume of Larrabee chips, of course) plus shipping/distribution. Plus, the GPU alone isn't what makes a graphics card. Someone has to assemble the card (PCB, power components, memory), package and ship it, and cover warranty and support.

Do you think Intel's gross margins for their CPUs entail nothing more than doubling the silicon production cost to arrive at the retail price?

No, I was sort of wrapping up all the production costs, from the wafer to the packaging and shipping and so forth, into one number. It is interesting that you bring up driver development costs, which are easy enough to overlook. I would have assumed that initial driver development would have been covered under R&D, but that's assumptions for you.

That still doesn't bring us any closer to an actual cost-of-production/shipping/warranty support/etc. for Larrabee Prime. Something tells me that such data will not be readily available.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Larrabee is overhyped and I predict that it will never be a good product. There is simply no way a general purpose device can beat a special purpose device in performance all other things being equal. The advantage of the GP device is that it can run a lot of different stuff rather than just one thing. However, if people don't feel like running Microsoft Word on their video cards, Larrabee's general purpose abilities will be useless. GPUs are becoming more and more general purpose because of shaders, but they still do enough fixed-function computation to give GP devices trouble keeping up without the addition of specialized processing units (which Larrabee has).

Also, in modern large x86 CPUs, instruction translation overhead isn't so bad because the chips have such huge transistor budgets, but in a high-core-count CPU like Larrabee, x86 translation overhead is expensive. If Intel wants to produce a semi-viable Larrabee, they will need to ditch x86.

Larrabee is more useful for servers than consumer hardware in my opinion. There are many server loads that can benefit from such a high core count.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
The thing you're missing is that the end user doesn't need to opt into running applications on their "video card" to receive the benefits of larry b (* disclaimer: in the most perfect, theoretical world).

With proper OS and library support, already-compiled and available applications would become accelerated. Excel or Oracle calling sort() on a few hundred megs of data? No problemo. Offload sorting 1/30th of the data to each of your 30 larryb cores, then merge-sort on the CPU. Your perceived runtime just went from looking like O(N log N) to looking a lot more like O(N). A lot less waiting when waiting matters, in other words.
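
Purely as a back-of-the-envelope sketch of that fan-out-then-merge idea, using plain C++ threads as a stand-in for the hypothetical larryb cores and runtime library (parallel_sort and the chunk count are made up for illustration, not any real Intel API):

#include <algorithm>
#include <cstddef>
#include <thread>
#include <vector>

// Sort each 1/chunks slice on its own "core", then merge the sorted
// slices pairwise on the host CPU.
std::vector<int> parallel_sort(std::vector<int> data, unsigned chunks) {
    const std::size_t n = data.size();
    std::vector<std::thread> workers;
    for (unsigned i = 0; i < chunks; ++i) {
        std::size_t lo = n * i / chunks, hi = n * (i + 1) / chunks;
        workers.emplace_back([&data, lo, hi] {
            std::sort(data.begin() + lo, data.begin() + hi);  // the offloaded part
        });
    }
    for (auto& w : workers) w.join();
    // log2(chunks) merge passes on the CPU, each touching the data once.
    for (unsigned step = 1; step < chunks; step *= 2) {
        for (unsigned i = 0; i + step < chunks; i += 2 * step) {
            std::size_t lo  = n * i / chunks;
            std::size_t mid = n * (i + step) / chunks;
            std::size_t hi  = n * std::min(i + 2 * step, chunks) / chunks;
            std::inplace_merge(data.begin() + lo, data.begin() + mid,
                               data.begin() + hi);
        }
    }
    return data;
}

The per-slice sorts run concurrently, so the part the user actually waits on is dominated by the merge passes, which is where the "looks a lot more like O(N)" feeling comes from.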

Whatever piggish things Flash does could be done by multiple instances on multiple 'GPU' cores. All of a sudden a web site with multiple Flash ads becomes viewable without requiring adblock software.

That's the promise of GPGPU. In practice, OS vendors would have to buy into this and provide GPU-accelerated runtimes. Application vendors may choose to provide their own support as well. End users won't care, other than that *all* their apps run better when equipped with a GPGPU.
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
Getting good library and programmer buy-in to something like Larrabee is extremely difficult. It's just not gonna happen. Most programs still don't have basic multithreading. Adobe can't make a decent version of Flash even though Flash is merely 2D. Heck, even Acrobat's performance is poor even though its workload is intrinsically light. Do you think Adobe is gonna write a super parallel-processing version of Flash to take advantage of high-core-count GPUs? Not gonna happen.
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Getting good library and programmer buy-in to something like Larrabee is extremely difficult. It's just not gonna happen. Most programs still don't have basic multithreading. Adobe can't make a decent version of Flash even though Flash is merely 2D. Heck, even Acrobat's performance is poor even though its workload is intrinsically light. Do you think Adobe is gonna write a super parallel-processing version of Flash to take advantage of high-core-count GPUs? Not gonna happen.

Key word: "most" programs are not multithreaded. OSes and runtime libraries are not written by the "most" kind of programmer, Allah be praised. These are crucial building blocks upon which the entire OS platform is evaluated, and rightfully you choose the brighter of your bulbs to architect and implement them rather than pressing into service interns who were previously slinging VB code. TL;DR: it'll be possible to find the upper 1% of developer talent to do this if it makes sense from a $ standpoint.

Which brings us to the second point: the poster children for awful code belched forth by Adobe. Writing good software is hard (as you pointed out), but making mediocre software run well on a silly amount of hardware is a much easier task. Even if the guys at Adobe don't consider it worthwhile to complicate their product by re-implementing it on top of a decent threading model, they may find the task of profiling, finding hot spots, and executing identical code in parallel on a lot more hardware much, much easier. That second approach is precisely why J2EE exists and is as popular as it is.
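
A hedged sketch of what that second approach looks like in practice: the Ad/Frame types and render_ad() below are invented stand-ins for a profiled hot spot, not anything Adobe actually ships.

#include <future>
#include <vector>

struct Ad    { int id; };      // stand-in for one Flash-style ad's state
struct Frame { int pixels; };  // stand-in for one rendered result

// The existing, unmodified, single-threaded hot spot found by the profiler.
Frame render_ad(const Ad& ad) { return Frame{ad.id}; }

// Run the identical code once per ad, each instance on its own core,
// with no re-architecting of render_ad() itself.
std::vector<Frame> render_all(const std::vector<Ad>& ads) {
    std::vector<std::future<Frame>> jobs;
    for (const auto& ad : ads)
        jobs.emplace_back(std::async(std::launch::async, render_ad, std::cref(ad)));
    std::vector<Frame> frames;
    for (auto& j : jobs)
        frames.push_back(j.get());  // collect results in submission order
    return frames;
}

Nobody had to touch the inside of render_ad(); the win comes purely from having enough cores to run many copies of it at once, which is exactly the pitch for a many-core part like Larrabee.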

Same thing goes for consumer software and enterprise software. If the marketing team can put a bullet point of "Now! 1337% faster!" on their slides and the product box at a relatively low cost, a whole bunch of vendors will jump on the bandwagon.

TL;DR: Larry B exists to simplify parallelizing existing codebases without the need for brilliant developers.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Larrabee is overhyped and I predict that it will never be a good product. There is simply no way a general purpose device can beat a special purpose device in performance all other things being equal.

*snip*

I don't think that's the real value of Larrabee. The entire point is programmability.

In the short term, programmability isn't important to us consumers, and Larrabee doesn't look exciting to us either. But developers care, because they are the ones who actually have to write and optimize code for it.

People don't buy video cards for new features like DX11 if the new cards aren't faster than the previous gen (case in point: DX9 on GeForce FX 5800 GPUs). That prevents developers from making games that support DX11 code, since the user base doesn't care about it. The only reason new standards like DX11 progress is that the fastest video cards feature them, and they trickle down to the mainstream. Progress is slowed because developers are reluctant to adopt new DX standards, and consumers aren't willing to pay for them.

But if you have GPUs like Larrabee, older GPUs will support the new features, which means standards like DX11 will trickle down the market faster, because developers will be more willing to support them in their games. That will result in better graphics, and whatever code optimizations DX11 has over DX10.1 will also benefit the low-end GPUs.

Imagine if CPUs worked like GPUs: to support all the new features in the OS, you'd need to buy a new CPU. Who in their right mind would do that?

The real question, then, is of course whether all this graphics "progress" is really making the games any better. I'm not sure how to answer that.

I think if they can get Larrabee or its derivatives to perform similarly to the then-current competitor GPUs, it'll be a huge success. The problem is how close they can get a software renderer to a fixed-function renderer, but this is Intel we're talking about. What's to stop them from making a GPU that's that much bigger in order to bring FULL x86 compatibility yet performs similarly to fixed-function GPUs?
 

lopri

Elite Member
Jul 27, 2002
13,209
594
126
I find myself going back to this man's wisdom every time a "next-gen" is the topic du jour.

http://www.pcper.com/article.php?aid=532&type=expert&pid=1

I'm personally disappointed at the Larrabee news. I didn't expect it'd be competitive for 3D gaming (drivers would be half of it), but I imagined that if Intel could push out this new "general purpose" GPU, someone could find a way to use it. We all remember how Intel did much of the heavy lifting in the move toward multi-core CPUs.

I'd also like to note how maturely Intel acknowledged the difficulty, without much smoke or excuses. At this point my interest is in its next-gen CPUs that will have on-die GPUs (supposed to be Larrabee variants, but now it is not clear at all).
 

zsdersw

Lifer
Oct 29, 2003
10,560
2
0
LMAO. Nvidia should have some good cartoons next week if this is indeed true.

Nvidia is the last company who should be doing the whole schadenfreude thing with this.

Beaten to having DX11 parts on store shelves by AMD/ATI? Damn, that's gotta hurt.
 

DrMrLordX

Lifer
Apr 27, 2000
21,637
10,855
136
Whatever piggish things Flash does could be done by multiple instances on multiple 'GPU' cores. All of a sudden a web site with multiple Flash ads becomes viewable without requiring adblock software.

That's what I was thinking . . . and the prospect of Intel being forced to sell Larrabee products at prices of $100 or less just to break into the market would have given us consumers the ability to pick up a boatload of general-purpose computing power that we could just pop into a standard PCI-e slot. Hopefully.

That never came to fruition, so . . . oh well.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
Well, they didn't cancel the project, so presumably the plan is still for some version of Larrabee to eventually come to market.

Are any of these Larrabee "prime" or "1, 2, 3" names actual internal project codenames, or are we looking at another case of the silly naming speculation that surrounded Cypress before AMD let us know it was called Cypress and not R800, etc.?