Intel is well ahead of nv/ati's 40nm, so should they release a 22nm Larrabee?

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
I think they should. Their drivers may need some work, but I think a discrete GPGPU from intel would have a lot to offer.

Whereas nvidia and ati won't be on 28nm until late next year, intel is within close reach (<5 months) of 22nm.
 

Borealis7

Platinum Member
Oct 19, 2006
2,914
205
106
I'm not sure how closely related the EUs on Sandy Bridge and the LRB cores are (probably not closely at all, as LRB was more like a packed group of 486s), but if Intel has a 12EU 22nm GPU attached to its CPU that performs at around Radeon HD 5600 levels, the most reasonable thing they could do is release a standalone 24EU-36EU card to accelerate graphics. But I don't see it happening soon, plus I'm not sure these EUs can function on their own.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
The EUs couldn't function on their own, but Larrabee also has texture units. The EUs would emulate depth/stencil units and blending/AA, and would also handle vertex processing and programmable shading.

What they could do, when they get to 16nm, is have something like 16 IB cores @ 4GHz and an appropriate number of texture units at 1GHz, along with a QPI-like interface for GDDR5 or DDR4 rather than DDR3. They could even make it an all-in-one (CPU/GPU) chip with, say, 2 cores for traditional CPU tasks, the remaining 14 cores for rasterization emulation, and the texture units for texturing.
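To make the "EUs emulate depth/stencil and blending" idea concrete, here's a minimal software sketch of one such per-fragment operation, the kind of fixed-function ROP work a Larrabee-style core would do in plain code. All names and the framebuffer layout are made up for illustration; this is not Larrabee's actual pipeline:

```python
# Hypothetical sketch: a depth test plus alpha blend done in plain software,
# the kind of fixed-function ROP work a Larrabee-style core would emulate.
# Data layout and names are illustrative, not Larrabee's actual pipeline.

def write_fragment(framebuffer, depthbuffer, x, y, frag_depth, src_rgba):
    """Depth-test a fragment, then alpha-blend it over the framebuffer."""
    if frag_depth >= depthbuffer[y][x]:   # fail: something nearer was drawn
        return
    depthbuffer[y][x] = frag_depth        # pass: record the new depth

    sr, sg, sb, sa = src_rgba             # incoming color, 0.0-1.0 floats
    dr, dg, db, da = framebuffer[y][x]    # existing color at that pixel
    # Standard "over" blend: src*alpha + dst*(1 - alpha)
    framebuffer[y][x] = (
        sr * sa + dr * (1.0 - sa),
        sg * sa + dg * (1.0 - sa),
        sb * sa + db * (1.0 - sa),
        sa + da * (1.0 - sa),
    )

# Tiny usage example on a 2x2 render target:
W = H = 2
fb = [[(0.0, 0.0, 0.0, 0.0)] * W for _ in range(H)]
zb = [[1.0] * W for _ in range(H)]
write_fragment(fb, zb, 0, 0, 0.5, (1.0, 0.0, 0.0, 0.5))  # half-transparent red
```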
 

Cogman

Lifer
Sep 19, 2000
10,277
125
106
There is more to performance than what process node you're on. The fact is, Larrabee must have sucked or been too expensive to produce, or intel would have released it.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Intel would only produce a discrete graphics card if they could take the highly profitable upper end of the market; they're not going to deal with low-margin parts.

Besides, the discrete graphics card market is shrinking; it's probably too late to get into it and still have an opportunity to recoup the investment.
 

Cogman

Lifer
Sep 19, 2000
10,277
125
106
Besides, the discrete graphics card market is shrinking; it's probably too late to get into it and still have an opportunity to recoup the investment.

This is sort of the sad thing to me. With intel being the largest CPU manufacturer out there, soon we will all be using crappy integrated intel GPUs.
 

Borealis7

Platinum Member
Oct 19, 2006
2,914
205
106
The PC graphics market may be shrinking, but consoles OTOH...

I bet Intel would kill to get their hands on an Xbox graphics contract instead of AMD.
 
Mar 11, 2004
23,077
5,558
146
The GPU companies will be on 28nm when Intel is on 22nm, so the gap isn't that big. Larrabee was not a competitive product, not even close, and that was against the GPUs back then, let alone newer stuff like Fermi.
 

Anarchist420

Diamond Member
Feb 13, 2010
8,645
0
76
www.facebook.com
There is more to performance than what process node you're on. The fact is, Larrabee must have sucked or been too expensive to produce, or intel would have released it.
True, but if intel is at 22nm while AMD/nvidia are at 32 or 40nm, then intel can pack in a lot more transistors while keeping wattage down. It's true that Larrabee would be a lot slower than an AMD/nvidia part on the same process, but nvidia/ati's advantage is much smaller if intel's process node is nearly 2x better. At 22nm, they could release something with close to the performance of a 40nm AMD/nvidia part at the same TDP (or the same performance at a close TDP), at only a slightly higher price. Then they could pay OEMs/etailers not to sell nvidia and amd parts (pretending we didn't have the tyrannical FTC), plus intel has brand-name recognition that AMD and nvidia don't have.
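For a rough sense of the numbers behind that claim: an ideal geometric shrink scales density with the square of the feature-size ratio, so 40nm to 22nm is about 3.3x. Real nodes never scale this perfectly, so treat these as upper bounds:

```python
# Back-of-envelope transistor-density scaling between process nodes.
# Assumes an ideal geometric shrink; real nodes fall short of these ratios.

def density_ratio(old_nm: float, new_nm: float) -> float:
    """Ideal density gain from a linear feature-size shrink."""
    return (old_nm / new_nm) ** 2

print(f"40nm -> 22nm: {density_ratio(40, 22):.2f}x")  # ~3.31x
print(f"40nm -> 28nm: {density_ratio(40, 28):.2f}x")  # ~2.04x
print(f"28nm -> 22nm: {density_ratio(28, 22):.2f}x")  # ~1.62x
```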
 

Cogman

Lifer
Sep 19, 2000
10,277
125
106
True, but if intel is at 22nm while AMD/nvidia are at 32 or 40nm, then intel can pack in a lot more transistors while keeping wattage down. It's true that Larrabee would be a lot slower than an AMD/nvidia part on the same process, but nvidia/ati's advantage is much smaller if intel's process node is nearly 2x better. At 22nm, they could release something with close to the performance of a 40nm AMD/nvidia part at the same TDP (or the same performance at a close TDP), at only a slightly higher price. Then they could pay OEMs/etailers not to sell nvidia and amd parts (pretending we didn't have the tyrannical FTC), plus intel has brand-name recognition that AMD and nvidia don't have.

You are making lots of assumptions. We have no idea how Larrabee performs. The fact that they canceled it means "not good". This is probably not something where you can just throw some more transistors at it and all of a sudden be up to snuff. Intel is nowhere near AMD and nVidia's performance now.

Intel is already the number 1 GPU seller in the world; they probably don't care to grab the rest of the market. Especially not with the engineering costs that would be involved.
 
Mar 11, 2004
23,077
5,558
146
We don't know that. They could be on 32 nm.

I might be mistaken, but I've seen it posted numerous times that they're skipping 32nm on GPUs. AMD has stated they've taped out more than one 28nm product as well, and since their CPUs are on 32nm, I don't know what those would be other than GPUs.

Plus, whatever Intel's process advantage, they are at a far bigger disadvantage in creating a product to compete with actual GPUs. Larrabee performed pretty poorly against the GPUs of that time (in fact, I believe the rumors said it performed even worse than the previous-gen GPUs), so it'd get slaughtered against something like revised Fermi, let alone whatever GPUs both companies will have out by the time Intel has 22nm going. Plus, they'd need time to design and map out a 22nm Larrabee, which would require considerably longer development, and by then we might be looking at what's after the next-gen GPUs.
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
True, but if intel is at 22nm while AMD/nvidia are at 32 or 40nm, then intel can pack in a lot more transistors while keeping wattage down. It's true that Larrabee would be a lot slower than an AMD/nvidia part on the same process, but nvidia/ati's advantage is much smaller if intel's process node is nearly 2x better. At 22nm, they could release something with close to the performance of a 40nm AMD/nvidia part at the same TDP (or the same performance at a close TDP), at only a slightly higher price. Then they could pay OEMs/etailers not to sell nvidia and amd parts (pretending we didn't have the tyrannical FTC), plus intel has brand-name recognition that AMD and nvidia don't have.

Larrabee was supposed to be massive: as large as Intel's server chips, which command several thousand dollars apiece. Even if Larrabee was the best single card on the market, that gives it a price cap of about $600. Intel would rather produce another Xeon than a Larrabee.

Larrabee is almost purely a reactionary play against CUDA, and until CUDA succeeds in threatening Intel's markets, Intel has no reason to make GPGPU a credible alternative to their CPUs.
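To put some toy numbers on the die-size argument, here's a rough dies-per-wafer comparison using the standard approximation. Every die area and price below is an invented placeholder, not an actual Larrabee or Xeon figure:

```python
import math

# Rough dies-per-wafer estimate (standard approximation, ignores yield).
# All die areas and prices below are invented placeholders, not real figures.

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    d = wafer_diameter_mm
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

WAFER = 300  # mm

gpu_dies = dies_per_wafer(WAFER, 600)  # a huge Larrabee-class die
cpu_dies = dies_per_wafer(WAFER, 220)  # a mainstream server-class die

print(f"~{gpu_dies} big dies/wafer -> ~${gpu_dies * 600:,} at a $600 card cap")
print(f"~{cpu_dies} CPU dies/wafer -> ~${cpu_dies * 2000:,} at $2,000 apiece")
```

Same wafer, several times the revenue, which is the whole "would rather produce another Xeon" point.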
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
INTC does not have unlimited 22nm capacity, and it may be more profitable to use it on higher-margin parts. I don't know what INTC's gross margins are on, say, flash memory or CPUs, but I suspect at least one of them is higher than the hypothetical margin they'd make on GPUs... INTC's CPU margins in particular are astounding.

This doesn't mean INTC can't take a short term lowering of profit to create more buzz for something... but if so, it'd be for Atom chips in the mobile space, not GPUs. INTC needs all the buzz it can get in the mobile space in order to fight ARM.
 

Red Hawk

Diamond Member
Jan 1, 2011
3,266
169
106
I'm not sure how closely related the EUs on Sandy Bridge and the LRB cores are (probably not closely at all, as LRB was more like a packed group of 486s), but if Intel has a 12EU 22nm GPU attached to its CPU that performs at around Radeon HD 5600 levels, the most reasonable thing they could do is release a standalone 24EU-36EU card to accelerate graphics. But I don't see it happening soon, plus I'm not sure these EUs can function on their own.

Sandy Bridge's IGP does not perform anywhere NEAR Radeon HD 5600 level. Didn't we have a big thread that went over this?
 

Dark Shroud

Golden Member
Mar 26, 2010
1,576
1
0
Intel's White Knight seems to be more of a competitor to CUDA than Larrabee. And it can run OpenCL and, I think, DirectCompute. But that isn't a big market right now, and it has competition from everyone, including IBM Cell clusters.
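For context on what "it can run OpenCL" means in practice: the same kernel source runs unchanged on any vendor's device that ships an OpenCL driver. A minimal vector-add sketch using the pyopencl bindings (array sizes and names are arbitrary):

```python
import numpy as np
import pyopencl as cl

# Minimal OpenCL vector add: the kernel source is vendor-neutral and runs
# on whatever device the driver exposes (Intel, AMD, nvidia, Cell, ...).
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

a = np.random.rand(1024).astype(np.float32)
b = np.random.rand(1024).astype(np.float32)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

prg = cl.Program(ctx, """
__kernel void add(__global const float *a,
                  __global const float *b,
                  __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

prg.add(queue, a.shape, None, a_buf, b_buf, out_buf)

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)
assert np.allclose(result, a + b)
```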
 

aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
20,846
3,190
126
because instead of dumping nearly the same required resources into a gpu, they'd rather pump out CPUs + SSDs for a higher margin.

Raw material cost on a gpu is way more than a cpu + SSD, yet those two can sell for more than any gpu as you go up the tiers.

And having their own fabs, they don't need to be at the mercy of TSMC.
If they're bored, they can roll out new silicon without waiting in line or depending on third-party R&D.

Lastly, intel makes a very small percentage from consumers like you and me.
They make the bulk of their money through enterprise.

Whatever you buy for 1 dollar on the consumer end, enterprise buys 1000 dollars' worth.

So the consumer market is mostly for us, because intel could support themselves on pure business machines like IBM if they wanted to, and we'd probably still end up buying Xeons at 200% markup.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Intel canned Larrabee, didn't they? As in, there's no final product to die shrink! It's kind of pointless for them to put out an inferior, dated product that at 22nm would quite likely not even be competitive with 40nm GPUs from ATI and Nvidia. Not to mention the waste of R&D and valuable 22nm node capacity - capacity that would be much better spent on low-power Atom processors or Ivy Bridge processors for notebooks/netbooks/tablets.
 

Cogman

Lifer
Sep 19, 2000
10,277
125
106
Intel canned Larrabee, didn't they? As in, there's no final product to die shrink! It's kind of pointless for them to put out an inferior, dated product that at 22nm would quite likely not even be competitive with 40nm GPUs from ATI and Nvidia. Not to mention the waste of R&D and valuable 22nm node capacity - capacity that would be much better spent on low-power Atom processors or Ivy Bridge processors for notebooks/netbooks/tablets.

My point exactly. A product that is vaporware isn't going to magically come out and beat today's top of the line.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
At some point intel will produce the fastest GPUs in the world. After all, they did express interest in fabbing out their 22nm capacity to others. It's just a matter of who strikes the deal with them.
 

Damascus

Golden Member
Jul 15, 2001
1,434
0
0
Intel would only produce a discrete graphics card if they could take the highly profitable upper end of the market; they're not going to deal with low-margin parts.

Besides, the discrete graphics card market is shrinking; it's probably too late to get into it and still have an opportunity to recoup the investment.

Yes, I don't get it either. Why dedicate capacity to a market with lower margins and more competition when you could use it for a higher-margin market where you're not chasing somebody else?
 

richardffw

Junior Member
May 18, 2011
3
0
0
Yes, I don't get it either. Why dedicate capacity to a market with lower margins and more competition when you could use it for a higher-margin market where you're not chasing somebody else?

Intel cares about entering the GPU market because GPUs are eating away at intel's high-end markets. Servers/HPC are never going to be 100% GPU, but nowadays they incorporate GPUs whenever possible. So instead of buying 100 CPUs, a company can buy 70 CPUs and 20 GPUs and get better performance for less money than 100 CPUs.

As GPUs get more general, they can eat away at work that was previously done by the CPU.
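As a toy illustration of that 70-CPU + 20-GPU math (every price and throughput number here is invented purely for the example):

```python
# Toy cluster cost/performance comparison. Every number below is invented
# to illustrate the "70 CPUs + 20 GPUs beats 100 CPUs" argument above.

CPU_PRICE, CPU_PERF = 2000, 1.0   # $ per socket, relative throughput
GPU_PRICE, GPU_PERF = 1500, 4.0   # assumes a GPU-friendly workload

def cluster(cpus, gpus=0):
    """Return (total cost, total relative throughput)."""
    cost = cpus * CPU_PRICE + gpus * GPU_PRICE
    perf = cpus * CPU_PERF + gpus * GPU_PERF
    return cost, perf

all_cpu = cluster(100)     # ($200,000, 100 units of work)
mixed = cluster(70, 20)    # ($170,000, 150 units of work)

print(f"100 CPUs:          ${all_cpu[0]:,} for {all_cpu[1]:.0f} units of work")
print(f"70 CPUs + 20 GPUs: ${mixed[0]:,} for {mixed[1]:.0f} units of work")
```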



On a different note, people are talking about intel drivers. With Larrabee, if intel has their way, there will be no drivers. Games will return to being run in software mode, and game engines will be coded to the metal instead of to some bytecode API that gets interpreted by drivers. Many game developers want this (Tim Sweeney of Unreal fame and Gabe Newell of Valve have said as much), because writing to drivers restricts what game developers can do: there will always be an overall pipeline and overall logic that has to be followed, no matter how general each step of the pipeline gets.

So the point of any Larrabee drivers would only be backwards compatibility, so that people could play the library of games they already have and not just the 1 or 2 games that would launch with Larrabee.

Taking all that into consideration, intel's best chance for Larrabee success is getting a console design win from msft/sony/nintendo, because developers are forced to start from scratch anyway. The problem is that intel doesn't want to license out their architecture, which may be a non-starter for the console companies since they couldn't control design iterations.

On the other hand, the console companies might make the sacrifice of letting intel control Larrabee, since a true GPGPU would greatly extend the lifetime of a console through the variety of game engines it would make possible.

Or intel might have changed their minds about licensing out their architecture since the original Xbox (intel had control of the CPU and wouldn't allow msft to make it on their own); since then, intel has allowed other fabs to make Atom CPUs.



And the reason gaming or a console design win benefits intel's high-end markets is that it would greatly accelerate the entire Larrabee ecosystem. There would be more people who know how to use Larrabee, more libraries and tools, etc. Designing server apps is totally different from games, but having general knowledge about Larrabee out in the wild is far better than having nothing at all. (For a grossly simplified example, a game developer might come up with some clever trick to boost performance, and although an HPC developer won't use the exact same trick, it will give him an idea of how to do something similar in his app.)
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
The GPU companies will be on 28nm when Intel is on 22nm, so the gap isn't that big. Larrabee was not a competitive product, not even close, and that was against the GPUs back then, let alone newer stuff like Fermi.

I think what got intel with Larrabee was that amd/nv have a stranglehold on GPU IP these days. Intel tried to reinvent the wheel, but by the time it was competitive, they would have been better off buying nvidia or buying the gpu division away from amd. The most likely scenario for a true high-end intel gpu is that they play a waiting game with nvidia until jhh relents/retires/is forced out.