Larrabee die shot shown at Visual Computing Institute presentation

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
I thought it would be interesting to put it here as we have discussions about how many cores Larrabee would have and such.

Guess not.

I'm out.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: IntelUser2000
I thought it would be interesting to put it here as we have discussions about how many cores Larrabee would have and such.

Guess not.

I'm out.

We know Intel has chips with 64 cores; there have been numerous reports on the scaling performance to 64 cores. Of course Otellini called them the "extreme" version, but they exist nonetheless. So a 32-core product is unsurprising, as would be a 48-core or 16-core product.

Of more interest to folks, I imagine, are the power consumption, performance, and price of any Larrabee configuration with any given number of cores at any given clockspeed and memory subsystem specs.

Until we have all three pieces of information (watts, price, performance), knowing just one or two of them really isn't helpful, and debating them with speculation has kind of tired out some of us (speaking personally, not for others).

Knowing things like core count and clockspeed helps speak to the performance side of things, but technical specs are a poor substitute for actual performance when the driver side of the equation is missing.

I think Azn is onto something: the hype is wearing us out. Somebody get JC an ES so he can start dribbling out his authorized unofficial leaks...
 

ilkhan

Golden Member
Jul 21, 2006
1,117
1
0
Still wondering about GT300 vs. Larrabee. It'll be weird to have an all-Intel gaming machine again.
 

Genx87

Lifer
Apr 8, 2002
41,091
513
126
It'll be interesting to see if they can deliver. Personally I don't think this will run games much past the mid-grade GPUs, if that. But I'm open to being proved wrong when it debuts.

 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Maybe. At 45nm, the die would be too big for 64 cores. 48 is possible, but with 64 cores, even if there were nothing on the die but the cores, each one would only be about 9.4mm². That's awfully small according to some reports.

64 cores: 2 TFLOPS DP / 4 TFLOPS SP at 2GHz

Mmmm... though honestly I am more interested in the integrated graphics. :p
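
For what it's worth, here's a minimal sketch of where numbers like those come from, assuming a roughly 600mm² die budget at 45nm and the 16-wide, FMA-capable vector units described elsewhere in the thread; neither input is a confirmed spec.

# Back-of-the-envelope check of the die-size and FLOPS numbers above.
# Assumptions: ~600mm^2 practical die budget at 45nm, 16-wide vector units,
# and a multiply-add counted as two flops per lane per clock.

DIE_BUDGET_MM2 = 600
VECTOR_WIDTH = 16
FLOPS_PER_LANE = 2

def area_per_core_mm2(cores, die_mm2=DIE_BUDGET_MM2):
    # Area each core would get if the die held nothing but cores.
    return die_mm2 / cores

def peak_tflops_sp(cores, clock_ghz):
    # Peak single-precision TFLOPS; double precision assumed to run at half rate.
    return cores * VECTOR_WIDTH * FLOPS_PER_LANE * clock_ghz / 1000.0

print(f"64 cores -> {area_per_core_mm2(64):.1f} mm^2 per core")  # ~9.4 mm^2
sp = peak_tflops_sp(64, 2.0)
print(f"64 cores @ 2GHz -> {sp:.1f} TFLOPS SP / {sp / 2:.1f} TFLOPS DP")  # ~4 / ~2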
 

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
Originally posted by: Hacp
Key is how much power does it consume?

Well, if Nvidia needs you to install a second card at 100+ watts just to run PhysX in a game, then anything under 400 watts should be about equal, maybe.

Note: I just went GTX 285 SLI, and people (and Corsair) think a TX850 won't cut it; a 1000-watter is listed on the Corsair website as the minimum for an i7 and 2x GTX 285. So Intel has a lot of room as far as power is concerned for the high-end cards.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: rgallant
Originally posted by: Hacp
Key is how much power does it consume?

Well, if Nvidia needs you to install a second card at 100+ watts just to run PhysX in a game, then anything under 400 watts should be about equal, maybe.

Note: I just went GTX 285 SLI, and people (and Corsair) think a TX850 won't cut it; a 1000-watter is listed on the Corsair website as the minimum for an i7 and 2x GTX 285. So Intel has a lot of room as far as power is concerned for the high-end cards.

Don't listen to people who recommend monster power supplies. I'm running 2 280s off of a 750TX, and I don't ever pull more than the mid/high 500s (from the wall) according to my Kill A Watt.
 

yusux

Banned
Aug 17, 2008
331
0
0
Originally posted by: OCguy
Originally posted by: rgallant
Originally posted by: Hacp
Key is how much power does it consume?

Well, if Nvidia needs you to install a second card at 100+ watts just to run PhysX in a game, then anything under 400 watts should be about equal, maybe.

Note: I just went GTX 285 SLI, and people (and Corsair) think a TX850 won't cut it; a 1000-watter is listed on the Corsair website as the minimum for an i7 and 2x GTX 285. So Intel has a lot of room as far as power is concerned for the high-end cards.

Don't listen to people who recommend monster power supplies. I'm running 2 280s off of a 750TX, and I don't ever pull more than the mid/high 500s (from the wall) according to my Kill A Watt.

You're not using an i7 and 285s; obviously the 850 is specced for higher-end hardware.
 

stipalgl

Member
Jul 17, 2008
118
0
0
Actually, OCguy is correct for the most part. Most wattage claims are indeed exaggerated a bit, probably for the sole purpose of getting gamers/enthusiasts to spend a little more on higher-priced models, since they constantly tend to upgrade hardware.

A 750W PSU for a couple of GTXs in SLI should be sufficient.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Micron enters graphics memory business

"Our upcoming 50-nanometer technology is very competitive when it comes to power consumption and performance," Robert Feurle, Micron's VP of DRAM marketing, said in a phone interview Thursday.

"I think it's a good point in time to begin discussions with big enablers Nvidia and AMD and get started with some design-ins," Feurle said.

Micron is making its debut with Double Data Rate 3 (DDR3) memory. This is the same type of memory used for the main memory of currently shipping PCs, which have gravitated from DDR2. In the future, Micron will look at making more proprietary graphics memory, referred to as GDDR3 and GDDR5. "No decision has been made yet but we're looking into that very seriously," Feurle said.

http://news.cnet.com/8301-1392...news&tag=2547-1_3-0-20

Given the cozy relationship Intel and Micron have, a la IM Flash, I wonder if this decision by Micron stems from their already being designed in on Larrabee discrete boards?

Could this be an indication that Larrabee will initially be paired with Micron DDR3? The timing of Micron's announcement seems too coincidental...
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Some speculation about that Larrabee die shot

Happily, with no more information than that, we can tentatively pretend to start handicapping this chip's possible graphics power. We know Larrabee cores have 16-wide vector processing units, so 32 of them would yield a total of 512 operations per clock. The RV770/790 has 160 five-wide execution units for 800 ops per clock, and the GT200/b has 240 scalar units, for 240 ops/clock. Of course, that's not the whole story. The GT200/b is designed to run at higher clock frequencies than the RV770/790, and its scalar execution units should be more fully utilized, to name two of several considerations. Also, Larrabee cores are dual-issue capable, with a separate scalar execution unit.

If I'm right about the identity of the texture and memory blocks, and if they are done in the usual way for today's GPUs (quite an assumption, I admit), then this chip should have eight texture units capable of filtering four texels per clock, for a total of 32 tpc, along with four 64-bit memory interfaces. I'd assume we're looking at GDDR5 memory, which would mean four transfers per clock over that 256-bit (aggregate) memory interface.

All of which brings us closer to some additional guessing about likely clock speeds. Today's GPUs range from around 700 to 1500MHz, if you count GT200/b shader clocks. G92 shader clocks range up to nearly 1.9GHz. But Larrabee is expected to be produced on Intel's 45nm fab process, which offers higher switching speeds than the usual 55/65nm TSMC process used by Nvidia and AMD. Penryn and Nehalem chips have made it to ~3.4GHz on Intel's 45nm tech. At the other end of the spectrum, the low-power Atom tends to run comfortably at 1.6GHz. I'd expect Larrabee to fall somewhere in between.

Where, exactly? Tough to say. I've got to think we're looking at somewhere between 1.5 and 2.5GHz. Assuming we were somehow magically right about everything, and counting on a MADD instruction to enable a peak of two FLOPS per clock, that would mean the Larrabee chip in this die shot could line up something like this:

http://techreport.com/discussions.x/16920
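
Just to line up the article's guesses side by side, here's a quick sketch; every number in it is the speculation quoted above, not a confirmed spec.

# Ops-per-clock comparison and a speculative Larrabee peak-FLOPS range,
# using only the guesses from the Tech Report piece quoted above.

chips_ops_per_clock = {
    "Larrabee (32 cores x 16-wide)": 32 * 16,  # 512
    "RV770/790 (160 x 5-wide)": 160 * 5,       # 800
    "GT200/b (240 scalar)": 240,               # 240
}
for name, ops in chips_ops_per_clock.items():
    print(f"{name}: {ops} ops/clock")

# Peak single-precision FLOPS, counting a MADD as two flops per lane,
# across the article's guessed 1.5-2.5GHz clock range.
for clock_ghz in (1.5, 2.0, 2.5):
    gflops = 32 * 16 * 2 * clock_ghz
    print(f"{clock_ghz}GHz -> {gflops:.0f} GFLOPS SP (peak)")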
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: yusux
Originally posted by: OCguy
Originally posted by: rgallant
Originally posted by: Hacp
Key is how much power does it consume?

Well, if Nvidia needs you to install a second card at 100+ watts just to run PhysX in a game, then anything under 400 watts should be about equal, maybe.

Note: I just went GTX 285 SLI, and people (and Corsair) think a TX850 won't cut it; a 1000-watter is listed on the Corsair website as the minimum for an i7 and 2x GTX 285. So Intel has a lot of room as far as power is concerned for the high-end cards.

Don't listen to people who recommend monster power supplies. I'm running 2 280s off of a 750TX, and I don't ever pull more than the mid/high 500s (from the wall) according to my Kill A Watt.

You're not using an i7 and 285s; obviously the 850 is specced for higher-end hardware.


I'm not sure how this applies. I have a highly overclocked CPU, and my overclocked 280s use 1x 8-pin and 1x 6-pin connectors. Don't the 285s only use 2x 6-pin each? It's a shrunken GT200 core (GT200b).
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: Idontcare
Originally posted by: IntelUser2000
I thought it would be interesting to put it here as we have discussions about how many cores Larrabee would have and such.

Guess not.

I'm out.

We know Intel has chips with 64 cores; there have been numerous reports on the scaling performance to 64 cores. Of course Otellini called them the "extreme" version, but they exist nonetheless. So a 32-core product is unsurprising, as would be a 48-core or 16-core product.

Of more interest to folks, I imagine, are the power consumption, performance, and price of any Larrabee configuration with any given number of cores at any given clockspeed and memory subsystem specs.

Until we have all three pieces of information (watts, price, performance), knowing just one or two of them really isn't helpful, and debating them with speculation has kind of tired out some of us (speaking personally, not for others).

Knowing things like core count and clockspeed helps speak to the performance side of things, but technical specs are a poor substitute for actual performance when the driver side of the equation is missing.

I think Azn is onto something: the hype is wearing us out. Somebody get JC an ES so he can start dribbling out his authorized unofficial leaks...

You know, when I counted the cores I couldn't help but chuckle. The way Intel is releasing info is hilarious. This is better than the ATI 4000 series marketing win.

One of the first things Intel tells us is that a lot of the leaked info was from this extreme chip they have. The next thing is this picture of the 32-core model. Come on, they've got to keep you guessing, be honest. Sure, they show pretty pics of graphs with 64 cores, but when they show the die pic it's of the midrange model? They have me puzzled. LOL! That said, I am a little puzzled on this driver thing. Does a software renderer require really, really good drivers? I'm not talking about today's shit games; I'm talking about games built for visual effects through software rendering. I think not. For present games, yes. You know, I like the idea of Larrabee. But I did read that Intel may not support DX9 games; I tried to find the source but couldn't. I like this approach: get the drivers working well on present DX10 games and future hardware-rendered games.

This is good. Intel will get spanked in DX10 games, but from there on out, nobody can say. Will better, easier graphics win the day, or more FPS? My choice is better graphics. Will games be easier to build with Intel's engine (Project Offset)? Nobody can say. But if the game they're working on brings what Intel envisions, look out: everyone will move in that direction en masse. Funny, but Intel's Project Offset game could be equal to Pong in importance. If Intel fails, I applaud the effort.

 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
There was an article at the Inquirer the other day trashing NV and their new G300 GPU. In it, Charlie said that DX11 is actually going to favor the Larrabee when it comes to shader code. He also said that AMD has essentially supported DX11 since the 2900XT and that NV is completely screwed with CUDA, PhysX, DX11, and the G300.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Originally posted by: OCguy
Originally posted by: rgallant
Originally posted by: Hacp
Key is how much power does it consume?

Well, if Nvidia needs you to install a second card at 100+ watts just to run PhysX in a game, then anything under 400 watts should be about equal, maybe.

Note: I just went GTX 285 SLI, and people (and Corsair) think a TX850 won't cut it; a 1000-watter is listed on the Corsair website as the minimum for an i7 and 2x GTX 285. So Intel has a lot of room as far as power is concerned for the high-end cards.

Don't listen to people who recommend monster power supplies. I'm running 2 280s off of a 750TX, and I don't ever pull more than the mid/high 500s (from the wall) according to my Kill A Watt.

Yep... 550-590 watts from the wall would be about 465-500 watts at the PSU, assuming a power efficiency of 85%.
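
A quick sketch of that wall-to-PSU conversion, assuming the flat 85% efficiency figure (real efficiency varies with load and isn't specified anywhere in this thread):

# Converts measured wall draw to the DC load the PSU actually delivers,
# assuming a flat 85% efficiency.
def dc_load_watts(wall_watts, efficiency=0.85):
    return wall_watts * efficiency

for wall in (550, 590):
    print(f"{wall}W at the wall -> ~{dc_load_watts(wall):.0f}W of DC load")
# 550W -> ~468W, 590W -> ~502W, well within a 750W unit's rating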
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: SickBeast
There was an article at the Inquirer the other day trashing NV and their new G300 GPU. In it, Charlie said that DX11 is actually going to favor the Larrabee when it comes to shader code. He also said that AMD has essentially supported DX11 since the 2900XT and that NV is completely screwed with CUDA, PhysX, DX11, and the G300.

Yeah, I read it. I'll play it safe: no comment.

 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
Originally posted by: SickBeast
There was an article at the Inquirer the other day trashing NV and their new G300 GPU. In it, Charlie said that DX11 is actually going to favor the Larrabee when it comes to shader code. He also said that AMD has essentially supported DX11 since the 2900XT and that NV is completely screwed with CUDA, PhysX, DX11, and the G300.

And I bet you actually believed it. :laugh:
 

SickBeast

Lifer
Jul 21, 2000
14,377
19
81
Originally posted by: OCguy
Originally posted by: SickBeast
There was an article at the Inquirer the other day trashing NV and their new G300 GPU. In it, Charlie said that DX11 is actually going to favor the Larrabee when it comes to shader code. He also said that AMD has essentially supported DX11 since the 2900XT and that NV is completely screwed with CUDA, PhysX, DX11, and the G300.

And I bet you actually believed it. :laugh:

Some of it made sense, yes. I personally don't see CUDA or PhysX as a complete waste of time, though, and I'm not going to write off the GT300 before it even comes out. I can see where they're coming from in that it may do very poorly in terms of performance per transistor, but I wouldn't write off NV given that they have had the fastest overall GPU for the past several generations.

The problem is, NV has not executed in terms of midrange derivatives of the GT200, and AMD will surely make more money on their next-gen part right off the bat due to their superior strategy.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: SickBeast
Originally posted by: OCguy
Originally posted by: SickBeast
There was an article at the Inquirer the other day trashing NV and their new G300 GPU. In it, Charlie said that DX11 is actually going to favor the Larrabee when it comes to shader code. He also said that AMD has essentially supported DX11 since the 2900XT and that NV is completely screwed with CUDA, PhysX, DX11, and the G300.

And I bet you actually believed it. :laugh:

Some of it made sense, yes. I personally don't see CUDA or PhysX as a complete waste of time, though, and I'm not going to write off the GT300 before it even comes out. I can see where they're coming from in that it may do very poorly in terms of performance per transistor, but I wouldn't write off NV given that they have had the fastest overall GPU for the past several generations.

The problem is, NV has not executed in terms of midrange derivatives of the GT200, and AMD will surely make more money on their next-gen part right off the bat due to their superior strategy.

Not if it's the same architecture they won't. IMHO, people will want the new core in GT300, simply because it's a complete change. New tech. MIMD. If AMD changes their architecture, then they have a good shot, but I don't think they're doing much more than doubling the shaders and adding ROPs. Sure, it'll perform great in games, but how will it perform in OpenCL, DirectX Compute in Windows 7, or Snow Leopard? Ah, but I'm getting ahead of things here. I know. Wait and see. ;)
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
ATI doesn't need to change architecture. The industry is coming to ATI. All those handheld devices, most of them are VLIW, and they're coded for C/C++. And OpenCL has opened up new horizons for ATI. You'll understand soon enough.