Final Word on Larrabee

Zstream

Diamond Member
Oct 24, 2005
3,396
277
136
Cloud-based games will never become a reality as long as we are using copper.
 

Idontcare

Elite Member
Oct 10, 1999
21,118
58
91
ATI-Crossfire
Nvidia-SLI
Intel-Servers

One of them is doing it wrong. :D

Heh, one of them is doing it right if they convince gamers to pay for four servers packed with Knights Ferry cards ;)

Selling ice-boxes to Eskimos is the saying.

(what are Nvidia's and AMD's trailing 4Q net profit for graphics divisions?)
 

sandorski

No Lifer
Oct 10, 1999
70,100
5,640
126
Heh, one of them is doing it right if they convince gamers to pay for four servers packed with Knights Ferry cards ;)

Selling ice-boxes to Eskimos is the saying.

(what are Nvidia's and AMD's trailing 4Q net profit for graphics divisions?)

lol, that's a big "if", but you're right, if they managed it, it would be the biggest Marketing Win ever.
 

Viditor

Diamond Member
Oct 25, 1999
3,290
0
0
Heh, one of them is doing it right if they convince gamers to pay for four servers packed with Knights Ferry cards ;)

Selling ice-boxes to Eskimos is the saying.

(what are Nvidia's and AMD's trailing 4Q net profit for graphics divisions?)

AMD's revenue in graphics for Q4 09 was ~35% of their total revenue for the year in graphics ($427 Million in Q4, $1.206 Billion FY). Their Operating Gain in Q4 for graphics was $68 million.
I don't think Nvidia breaks out those numbers (or I just couldn't find them...).
 

zephyrprime

Diamond Member
Feb 18, 2001
7,512
2
81
They seem to gravitate towards the odd when it comes to video. Like the i740 (IIRC that's what it was called) and the low-RAM idea. I suspect these are just really smart people (I'm not always convinced of this, though) seeing a problem and addressing it, but not really understanding the fundamentals of the thing they're trying to fix.

I highly suspect that both ATI and NVidia were kinda biting their tongue during the whole attempt. Both knowing possibly numerous reasons why it was going to fail, but not wanting to help them.
I think it was pretty fricken obvious that the attempt would fail, and I said as much in one of my old posts. Specialized silicon has a HUGE performance advantage over software rendering. Just try using a software renderer and see how super slow it is. I'm talking 2-3 orders of magnitude. Making a more computationally dense part like Larrabee can close the gap by about one order of magnitude, but that's still not enough to close it completely. The idea was foolhardy to begin with, and Intel was foolish to try it.
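A toy sketch of where that gap comes from: a naive software renderer runs an inside-the-triangle test at every single pixel in a serial loop, which is exactly the work a GPU's fixed-function rasterizer spreads across thousands of parallel units. This is purely illustrative (not Larrabee or any real renderer's code), assuming standard pixel-center sampling and a signed edge-function test:

```python
# Illustrative naive CPU rasterizer: one scalar edge test per pixel.
# A GPU performs the same test for thousands of pixels in parallel,
# in dedicated hardware -- hence the orders-of-magnitude gap.

def edge(ax, ay, bx, by, px, py):
    """Signed area test: positive if (px, py) is left of edge a->b."""
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def rasterize_triangle(w, h, v0, v1, v2):
    """Fill a w x h buffer with 1 where the pixel center falls inside the triangle."""
    buf = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            px, py = x + 0.5, y + 0.5  # sample at the pixel center
            # A point is inside when it lies on the same side of all three edges.
            s0 = edge(*v0, *v1, px, py)
            s1 = edge(*v1, *v2, px, py)
            s2 = edge(*v2, *v0, px, py)
            if (s0 >= 0 and s1 >= 0 and s2 >= 0) or \
               (s0 <= 0 and s1 <= 0 and s2 <= 0):
                buf[y][x] = 1
    return buf
```

Even this flat fill is O(pixels) of scalar work per triangle, before texturing, shading, blending, or depth testing ever enter the picture; doing all of that in software is what eats the 2-3 orders of magnitude.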
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
The idea was foolhardy to begin with, and Intel was foolish to try it.

Using terms like "foolish" is a delicate matter. Just because a project didn't get seen through to a product doesn't make it a complete failure. I'm sure much knowledge was gained during this project. Yes, it did cost Intel's CTO his job, but was that his own fault for over-promising?

While I'm sure Intel would rather have a sellable product, a negative result is still valuable. After all, knowing what doesn't work to solve a problem is just as important as knowing what does.
 

epidemis

Senior member
Jun 6, 2007
796
0
0
Oh no Wreckage was right again.......

I personally think they will just buy Nvidia at some point and totally dominate the cpu/gpu on chip market.

If Intel bought Nvidia, Huang would've demanded that he be made the new CEO ^^
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Actually, I think that's exactly what he did, and that's what kept the deal from happening. JHH's company might be down right now, but he's a bad mofo. It'll be interesting to see if, sometime down the road, Intel doesn't regret that decision.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Actually, I think that's exactly what he did, and that's what kept the deal from happening. JHH's company might be down right now, but he's a bad mofo. It'll be interesting to see if, sometime down the road, Intel doesn't regret that decision.

Intel could always initiate a hostile takeover. I'd bet a lot of shareholders would tender their shares for a 40% premium. I'm not aware of Nvidia having adopted any poison-pill measures.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
If Intel extends their fixed-function units a bit over a few generations (less direct mapping of DX, but still fairly simple computational units--something halfway between SB's IGP and current Radeons), they'll have the flexibility there to tackle some corner cases better, and the many-core CPU can work well as leverage for branchy code. For all of its ills, Hyper-Threading ought to be excellent for hiding the small wait times between CPU and GPU when dealing with fine-grained work.

Combine that with having the expensive many-weak-core devices for the HPC market, and nVidia will end up even more screwed. They're in no short-term danger, IMO, but they're going to have to push down some of that pride and ego, and learn from their competitors.

As part of that, they will need to stop with the black boxes (same goes for you, Imagination!). Hide the secret sauce behind well-documented interfaces, because the ability for Linux kernel and Xorg hackers to fix bugs will come to matter more than all the performance you can bring. Also, outside of Windows at least, I doubt we're all that far away from FOSS compilers doing well with GPGPU (when/if Gallium3D becomes stable and supports a few FOSS-friendly GPUs, the capability will be there to exploit).