Intel Larrabee architecture revealed


Wreckage

Banned
Jul 1, 2005
5,529
0
0
If Larrabee is a success it will be the death of AMD/ATI. I really don't think they have the cash to fight Intel on 2 fronts.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: Wreckage
If Larrabee is a success it will be the death of AMD/ATI. I really don't think they have the cash to fight Intel on 2 fronts.



You forget AMD has a card left to play: THE lawsuit. If AMD waits to go to trial they may die. If they settle out of court and play their cards right, there is 1 or 2 billion to be taken and maybe some compiler help. ATI/Intel are still buddies. You really should find out what all those tech-sharing agreements between Intel and ATI were all about.

NV is the clear loser here.
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Nemesis 1

You forget AMD has a card left to play: THE lawsuit.

Lawsuits are the last desperate attempt of a dying company. See Rambus. I doubt they will get anything from it really. Intel is not about to admit any guilt or knuckle under pressure.

AMD has bled far too much cash these last 2 years and their stock price has taken a monumental hit because of it.

I would really hate to see AMD fail as I have been using their processors since the K6 days, but it's not looking good for them at all.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: SunnyD


Really? I thought that Intel was demoing 1280x720 with decent frame rates and better detail than traditional 3D renderers.

My bad, 1024x1024 on 8 cores. That doesn't really sound like a very poor attempt to me, especially when you're talking about things that most computer enthusiasts care about - games.

It looked like crap as far as quality is concerned.

Right now, a high-quality raytrace of a 720p scene can take 16 quad cores running at 3.0GHz about 1 minute per frame.

I don't think we will see that kind of performance in real time, at least 30 fps, in the next 2 years.

If they can do games in at least 720p with caustics and radiosity I'll be impressed.
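
Quick back-of-the-envelope on that gap, taking the numbers above at face value (16 quad cores = 64 cores, ~60 seconds per 720p frame, 30 fps target); this is a sketch of the arithmetic, not a benchmark:

```c
/* How far is "16 quad cores, ~1 minute per frame" from real-time 30 fps?
 * Figures are the rough ones quoted above, not measurements. */
#include <stdio.h>

int main(void) {
    const double cores = 16.0 * 4.0;     /* 16 quad-core CPUs */
    const double sec_per_frame = 60.0;   /* ~1 minute per 720p frame */
    const double target_fps = 30.0;

    double speedup = sec_per_frame * target_fps;   /* 60 s vs 1/30 s per frame */
    printf("Speedup needed over the 64-core rig: %.0fx\n", speedup);
    printf("Equivalent cores at the same per-core speed: %.0f\n", cores * speedup);
    return 0;
}
```

That works out to roughly an 1800x speedup over an already 64-core setup, which is why the real-time demos have to trade away so much quality.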
 

Janooo

Golden Member
Aug 22, 2005
1,067
13
81
Originally posted by: Wreckage
Originally posted by: Nemesis 1

You forget AMD has a card left to play: THE lawsuit.

Lawsuits are the last desperate attempt of a dying company. See Rambus. I doubt they will get anything from it really. Intel is not about to admit any guilt or knuckle under pressure.

AMD has bled far too much cash these last 2 years and their stock price has taken a monumental hit because of it.

I would really hate to see AMD fail as I have been using their processors since the K6 days, but it's not looking good for them at all.
Well, AMD's GPUs are doing fine and Deneb overclocks to 4GHz. They have some life in them left, don't worry.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: Wreckage
Originally posted by: Nemesis 1

You forget AMD has a card left to play: THE lawsuit.

Lawsuits are the last desperate attempt of a dying company. See Rambus. I doubt they will get anything from it really. Intel is not about to admit any guilt or knuckle under pressure.

AMD has bled far too much cash these last 2 years and their stock price has taken a monumental hit because of it.

I would really hate to see AMD fail as I have been using their processors since the K6 days, but it's not looking good for them at all.


Pure BS. Lawsuits can be bad or good; they're not a sign of weakness if harm can be proven. Intel and AMD have sued each other before. Is Intel weak?

But AMD's problem is this: it is the second time AMD has sued over the same thing. I would have to look up the dates, but that brings up an interesting point. AMD can only use evidence going back to that settlement date; I believe it was 2003/2004. So any wrongdoing by Intel had to have happened between those two dates: the old lawsuit's settlement or verdict, and when AMD filed the new lawsuit. Surely if Intel did wrong they would stop that activity after the lawsuit was filed. If AMD is hanging their case on rebates, it's hopeless for them.

NV was right: the future of the PC is the GPU/CPU. What NV didn't know, and ATI did, was that there was room for other tech in this new world, using an EPIC back end. That was AMD's door of opportunity. ATI was going down the road Intel was heading. ATI's problem, same as NV's present problem: they needed a CPU to marry and have good little EPIC kids, lots and lots of them. At 22nm, tens of thousands of the bastards. Now with that number of cores, EPIC running x86 on a compiler with lots of cache is going to be really fast no matter how ya cut it. It will be real time. I would like to know who called who. Did Dave call Hector, or did Hector call Dave? It makes a difference. Ya see, Dave knew. Hector didn't.

This is probably how it went. When Dave found out what Intel was planning, he probably asked Intel where this left them, seeing as ATI and Intel had a good relationship. Intel's reply was probably: without an x86 CPU, you're up a creek without a paddle. Dave probably asked what he should do. Intel, if honest, should have told him: tell Hector everything you know and see what happens.

This is no doubt why Intel executives laughed when they heard the ATI/AMD deal was made. Not because it was funny. The guys that run Intel are really smart guys, truly they are, but they make mistakes. No, they laughed not because it was funny, but because they found a place for the EPIC yoke. To be honest, AMD needed that one. Good for Intel; paybacks are a bitch. But it still left the door open for AMD. Believe it or not, ATI's GPUs are really good, with lots and lots of possibilities with shrinkage. Real good for raytracing. As I said, x86 apps can run on it with the right compiler; with enough cores and good cache the translation problem will be a non-factor. Tessellation is another avenue, but Intel is covering that also.





 

OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
Nemesis QUIT rambling in this thread. Goddamnit I can't follow the good discussion going on here because you interject with your page long posts every 3-4 posts. Please keep your thoughts concise and on topic.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: OneOfTheseDays
Nemesis QUIT rambling in this thread. Goddamnit I can't follow the good discussion going on here because you interject with your page long posts every 3-4 posts. Please keep your thoughts concise and on topic.

Excuse me, but I brought a pretty good link to this thread that offered more info. It is a speculation thread. If you don't like it, don't read it. That's about as rude as I will get.

Which good replies were you referring to? The ones that say Larrabee will fail? The original link is good. The author of that link seems to believe that if Larrabee is good, it will change everything. Many others in the know are saying likewise.
I think I laid it out pretty well how, if Larrabee is that good, it affects everything. I really would like to know Derek Wilson's take on the idea of Intel releasing their own game at the same time as Larrabee, if said game takes advantage of Larrabee's many different advantages, including physics and hybrid RT.

I would also like to point out one huge fact: I and a few others don't believe AMD is dead in the water. Fact is, Larrabee and ATI's relationship with Intel may very well save AMD/ATI's asses. Because of the GPU/CPU, less importance is placed upon the CPU. Now truly you can see where this will help AMD. I am assuming Larrabee will run just fine on an AMD system; if it doesn't, Intel is nothing but stinky fish. If Intel does a game from Intel's gaming company using Havok and hybrid raytracing, it would be in Intel's best interest to make sure said game runs on ATI GPUs. NV just doesn't have anything that will do RT well enough at this time, whereas ATI can. Not as fast as Larrabee, but still they can do it. That's a plus for ATI's future. So Larrabee brings two pluses to AMD/ATI, and one very long-range, iffy plus. What Larrabee represents to NV's future is exactly as NV said: Larrabee is a slide show, a dark cloud hanging over NV's future. Here's some news for NV: Larrabee cards are going out to partners very soon. That slide show is real. The cloud is the curtain closing.
THE KEEPER
Ya, EPIC is a very sore spot for Zinn2b, along with Itanic. As the boys in the band said, TIME IS ON MY SIDE. I will keep the stones a-rolling. But EPIC and the word epic are going to be very fun to play with in the future when it comes to AMD/ATI. LOL

 

tcsenter

Lifer
Sep 7, 2001
18,934
567
126
Originally posted by: Wreckem
Originally posted by: clandren
this kind of reminds me of the cell processor
Except the Cell isn't x86 and is a bitch to work on.
And those are about the only major differences. Larrabee is hardly more than an x86 adaptation of practically every significant conceptual and architectural element used in Cell BE.

I keep reading articles discussing the numerous differences and almost no similarities between Larrabee and GPUs from ATI/NVIDIA, and it amazes me how the numerous striking similarities to Cell are mentioned once in passing or just plain ignored.
 

magreen

Golden Member
Dec 27, 2006
1,309
1
81
Originally posted by: tcsenter
Originally posted by: Wreckem
Originally posted by: clandren
this kind of reminds me of the cell processor
Except the Cell isn't x86 and is a bitch to work on.
And those are about the only major differences. Larrabee is hardly more than an x86 adaptation of practically every significant conceptual and architectural element used in Cell BE.

I keep reading articles discussing the numerous differences and almost no similarities between Larrabee and GPUs from ATI/NVIDIA, and it amazes me how the numerous striking similarities to Cell are mentioned once in passing or just plain ignored.

I'd like to hear more about it, tcs.

(And please, no more epic yokes in this thread!)
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: OneOfTheseDays
Nemesis QUIT rambling in this thread. Goddamnit I can't follow the good discussion going on here because you interject with your page long posts every 3-4 posts. Please keep your thoughts concise and on topic.

Lol, could be worse. Imagine both Nemesis and apoppin in this thread! (KIDDING!)

/offtopic
 

SunnyD

Belgian Waffler
Jan 2, 2001
32,675
146
106
www.neftastic.com
Originally posted by: Modelworks
Originally posted by: SunnyD


Really? I thought that Intel was demoing 1280x720 with decent frame rates and better detail than traditional 3D renderers.

My bad, 1024x1024 on 8 cores. That doesn't really sound like a very poor attempt to me, especially when you're talking about things that most computer enthusiasts care about - games.

It looked like crap as far as quality is concerned.

Right now, a high-quality raytrace of a 720p scene can take 16 quad cores running at 3.0GHz about 1 minute per frame.

I don't think we will see that kind of performance in real time, at least 30 fps, in the next 2 years.

If they can do games in at least 720p with caustics and radiosity I'll be impressed.

Hardly looks like crap. It's about as good quality-wise as a traditional 3D renderer. Initially, that's pretty damn good for general-purpose CPUs using a software rasterizer. Now, take the same thing on a specialized piece of hardware such as Larrabee that's designed and optimized to handle a ray tracing workload, with approximately 4x the physical resources to throw at it, and for a first-generation product you're seeing Intel do for ray tracing what NVIDIA did for triangle-based 3D all those years ago.

I have absolutely no doubt that we'll see completely acceptable IQ from an Intel ray tracing engine on Larrabee next year.

(And yes I know that NVIDIA didn't invent the triangle setup accelerator, but they popularized it to where it is today)
 

quadomatic

Senior member
May 13, 2007
993
0
76
Hmmm...considering how many cores are on this thing, how much is Intel planning to charge?

If they don't have a sub-$200 version, they're going to lose out on quite a bit.
 

Wreckem

Diamond Member
Sep 23, 2006
9,547
1,127
126
Originally posted by: Nemesis 1
You just don't get it. If Larrabee can perform as spec'd, it's a new industry. Totally new. Why can't you see that? It's not Intel breaking into an old market. Intel is trying to pull off the big one here: leaving an old tech to die and moving into something new and exciting.

This changes gaming. It doesn't break into any old tech that's there because it has to be. Maybe it won't even perform as well as AMD or NV. It doesn't have to. All it has to do is bring something new and exciting. Then the games will be made for Larrabee, because it's new and exciting. All Intel has to do is bring out a game that supports everything Larrabee can do. If the game is a hit, that's it, the future is here in 2009/2010. Everyone will fall in line. That's how big this is. If it's good, everything changes. If it sucks, nothing changes.

I remember Pong. Crysis is to Pong what Larrabee is to gaming.

The Nintendo Wii is a good example of something new slapping better performance around, isn't it?


I'm going to be straight up and call you a moron.

Game developers are EXTREMELY RISK AVERSE. They don't go with what's new and exciting, EVER. Game developers go with PROVEN technology. Only a handful of studios ever go for the new and shiny. This is even more so now with where the game industry is headed (more and more towards casual gaming).

As for something new: Intel isn't really bringing much new to the table, just a different, and potentially slower, way to do things.

Nvidia and AMD are moving to fully programmable GPUs, real-time ray tracing, etc.

Larrabee is also going to be expensive as hell. It's like you're acting as if Larrabee is going to be mainstream (affordable) when it's released. They might have an affordable version, but it'll get trounced by AMD and Nvidia.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Good for you, Wreckem. I wouldn't take this lying down either. How dare Intel. Then there is me blowing their horn, the shame of it all.

I am a little concerned about this cost thing. So you think this chip is going to be expensive. Would you say more expensive than, say, a CPU? A quad core?

It's kind of like ever since C2D: before the prices are known, everyone says way too expensive, but then the prices come out and they're lower than expected. Kinda like Nehalem is going to be too expensive, yet we know that the 2.6 is only $300. Now that the AMD Spider platform is done, that kinda puts the motherboard at even steven.

By the way, I think you're a great guy and almost godlike. I can't understand why you waste your time here.

What is the cost of an NV 280? What does a quad-core Intel 45nm CPU cost with motherboard and memory? Not all that long ago I remember guys buying $800 NV GPUs. How much does it cost to make these things? How much was an R600, an NV 7800? Ya, Intel is in a terrible habit of raising prices after a die shrink, as history can show plainly.

Intel's cost to make Atom is like less than $12. The money-grabbing bastards.

I would imagine Larrabee will cost Intel in the $75 area to produce.

I say boycott Intel before they bring out an extremely overpriced GPU/CPU. That'll teach the bastards. :(

NV and AMD are moving towards fully programmable parts. Really? And what kind of apps do they run? Is it easy to write the programs, or is it hard like for Itanic? Can they run x86 instructions? Will they run a PC without a CPU?

So many questions, so few answers. NV has been hyping physics for how long now? CUDA? Tessellation is another bright spot for AMD. DX11 allows all this stuff to work right. When is DX11? I've seen the physics stuff; that's pretty low on my list.
 

RESmonkey

Diamond Member
May 6, 2007
4,818
2
0
I wish I could come up with a comment, but I don't understand Larabee as much :p

I'm excited to see Intel in the GPU biz tho.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: RESmonkey
I wish I could come up with a comment, but I don't understand Larabee as much :p

I'm excited to see Intel in the GPU biz tho.


None of us do, so don't let that stop you.

Granted, maybe I am being overly optimistic. Maybe. Here's what it comes down to, OK?

Larrabee is a 24-32 core in-order CPU built from an old Pentium core that has been optimized for GPU usage. Each core is connected to the other cores through cache. It uses a 512-bit ring bus for memory, 512 bits in each direction (1024 bits total). Ya really need to read this stuff, it's in my PDF link, because I am not wording it correctly. If Larrabee equals its specs, it has people spooked.

It will use software to run DX10, DX11, DX12, whatever. Fully programmable with only a few fixed-function logic circuits. Larrabee can even run an OS kernel.

It has a vector unit that takes up 1/3 of the die space. If it works, it's going to be good. If it keeps up with either NV or ATI on the high end in today's games it will BE GREAT; with everything else it brings to the table it will be an instant winner.
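
A minimal sketch of the "lots of simple x86 cores rendering entirely in software" idea, just to make it concrete: a toy tile loop that splits a 1024x1024 framebuffer across however many cores you have using OpenMP. Nothing here is Larrabee-specific; the tile size and the shade() stand-in are made up for illustration.

```c
/* Toy tile-parallel software rendering: carve a 1024x1024 framebuffer
 * into 64x64 tiles and let each core shade its tiles independently.
 * Illustrative only; compile with e.g. gcc -O2 -fopenmp. */
#include <stdint.h>
#include <stdio.h>
#include <omp.h>

#define W 1024
#define H 1024
#define TILE 64

static uint32_t fb[W * H];

/* Stand-in "shader": packs the pixel position into a color. */
static uint32_t shade(int x, int y) {
    return (uint32_t)(((x & 0xFF) << 16) | ((y & 0xFF) << 8) | 0x40);
}

int main(void) {
    int tiles_x = W / TILE, tiles_y = H / TILE;

    /* Each loop iteration is one independent tile; OpenMP spreads
     * the tiles across all available cores. */
    #pragma omp parallel for schedule(dynamic)
    for (int t = 0; t < tiles_x * tiles_y; t++) {
        int x0 = (t % tiles_x) * TILE;
        int y0 = (t / tiles_x) * TILE;
        for (int y = y0; y < y0 + TILE; y++)
            for (int x = x0; x < x0 + TILE; x++)
                fb[y * W + x] = shade(x, y);
    }

    printf("rendered %d tiles on up to %d threads\n",
           tiles_x * tiles_y, omp_get_max_threads());
    return 0;
}
```

The point isn't the shading, it's the shape of the work: regular, embarrassingly parallel tiles, which is exactly the kind of workload where piling on more simple cores keeps paying off.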

It also means the possible death of more than one company. That seems to be weighing heavily on many minds, which to me is a non-issue: swim, or drown, or tread water as long as you can. I actually believe that a lot of these guys would rather hold back progress than see their favorite hardware company get pwned and shown the exit.

 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: RESmonkey
I wish I could come up with a comment, but I don't understand Larabee as much :p

I'm excited to see Intel in the GPU biz tho.

Intel had a video card before and it was not something to be proud of.

If this new card is a success I have a feeling it may just be the final killing blow to ATI. They only have 18% of the market as is and they have zero cash on hand to fight both Intel and NVIDIA GPUs.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Wreckage, I think you're being a little unfair on ATI's market share; the numbers you're using are from last quarter. Let's wait for next quarter's results. I will say this also: when Nehalem is released, look for ATI cards to be all over these motherboards. The majority, by a large percentage, will sport 4870 X2's. It would seem ATI is becoming the darling. I feel the same way as many, even though I have always been ATI. If I was NV, I would switch to help AMD, as many, many are doing.

I even bought that doggy K9 CPU, and I won't say more. I did my part; that's enough.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: RESmonkey
I wish I could come up with a comment, but I don't understand Larabee as much :p

I'm excited to see Intel in the GPU biz tho.


Sorry about the double post, but there is something my brother-in-law pointed out to me. An ex-ATI engineer.

He said the future-proofing for the 32nm Larrabee is really good. Then he tells me to think on it and we'll talk later. Man, this guy can be an ass at times.

So I kept looking at Larrabee and I couldn't find one damn thing that made it future-proof, other than it runs on software. So I figured he was feeding me shit like he has in the past. So I went to bed, slept a couple of hours, woke up, and I had the answer. If you guys go back and read another speculation post done some time ago about Larrabee, I said then that Larrabee would work differently with Nehalem than with, say, C2D. Well, I was wrong.

AVX gave me the answer.

So he calls me back and says, did ya figure it out? I said, I think so. You guys know me well enough by now to know I can be a bit of a smart-ass. He said, how'd you figure it out? I said, I was at the beach. He laughed out loud and said, OK, you got it. So I had to ask, is this thing going to scale? He said, you bet your ass. He knew I was referring to dual sockets.

I asked what the simulated increase was on Sandy: with 4 cores it's 40%, with 8 it's 80%.

12 cores, 120%; 16 cores, a 160% performance increase for the GPU/CPU over Nehalem using Sandy. You heard that right here first.

So this is how it should work for you guys. You should hear more about this from sources you can trust soon enough.

OK, in late '09 you buy your 32nm Larrabee. You bought it because it worked out great and everybody loves it, OK?

You install it on your Nehalem system after removing your 4870 X2. You love it. But it's 2 years till the next shrink. That's a long time.

But wait, in 2010 out comes Sandy. Now you can increase your gaming performance by 40% by installing a brand new 4-core Sandy. If that's not enough, you could go 8 cores for an 80% gaming performance increase. Or you could install the latest version of Larrabee for, we'll say, a 90% increase, plus add Sandy for a total of 130%. This part looks good to me and I know he didn't lie, except maybe he understated the performance increase with Sandy. He has never overstated anything to me. He has fed me some crap, though. But I think this is real.
 

OneOfTheseDays

Diamond Member
Jan 15, 2000
7,052
0
0
Is there a way to ignore every post by Nemesis? We SERIOUSLY need an ignore function on this forum. I've seen it implemented on others.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
I will try to find info on Sandy and post it here if it fits in with the improved Larrabee performance statement I repeated.

Intel AVX: The next step in the Intel instruction set -- Gelsinger also discussed Intel AVX (Advanced Vector Extensions), which, when used by software programmers, will increase performance in floating point, media, and processor-intensive software. AVX can also increase energy efficiency, and is backwards compatible with existing Intel processors. Key features include wider vectors, increasing from 128 bits to 256 bits wide, resulting in up to 2x peak FLOPs output; enhanced data rearrangement, allowing data to be pulled more efficiently; and three-operand, non-destructive syntax for a range of benefits. Intel will make the detailed specification public in early April at the Intel Developer Forum in Shanghai. The instructions will be implemented in the microarchitecture codenamed "Sandy Bridge" in the 2010 timeframe.


Intel Corp. this week unveiled details about its microprocessors code-named Sandy Bridge that are due in late 2010, more than two years from now. Obviously, the first details about the new central processing units (CPUs) do not give any indication regarding performance, but give a signal to software developers to get prepared for the innovations.

Earlier this week Patrick Gelsinger, senior vice president and general manager of the digital enterprise group, discussed Intel AVX (Advanced Vector Extensions), which, when used by software programmers, will increase performance in floating point, media, and processor-intensive software, according to the Intel Corp. high-ranking executive.

Key features of Intel AVX include wider vectors, increasing from 128 bits to 256 bits wide, resulting in up to 2x peak FLOPs output; enhanced data rearrangement, allowing data to be pulled more efficiently; and three-operand, non-destructive syntax for a range of benefits. Some features that Intel describes as part of AVX resemble those available in the SSE5 set of instructions developed by Advanced Micro Devices.

Intel AVX can also increase energy efficiency, and is backwards compatible with existing Intel processors (obviously, older chips will be able to execute the specific operations, but will not do it as quickly as those featuring hardware AVX, notes X-bit labs), the world's largest manufacturer of x86 microprocessors said.

Intel will make the detailed specification public in early April '08 at the Intel Developer Forum in Shanghai. The instructions will be implemented in the micro-architecture codenamed "Sandy Bridge" in the 2010 timeframe.
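
For a concrete feel of what "wider 256-bit vectors" and "three-operand, non-destructive syntax" mean in practice, here is a minimal sketch using the public AVX C intrinsics. It illustrates the AVX instruction set itself, not anything Larrabee- or Sandy Bridge-specific; the array values are just for illustration.

```c
/* Minimal AVX sketch: one 256-bit register holds 8 floats, and the
 * three-operand form (dst = op(a, b)) leaves both inputs intact.
 * Compile with e.g. gcc -mavx on an AVX-capable CPU. */
#include <immintrin.h>
#include <stdio.h>

int main(void) {
    float a[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float b[8] = {8, 7, 6, 5, 4, 3, 2, 1};
    float out[8];

    __m256 va   = _mm256_loadu_ps(a);        /* load 8 floats in one go     */
    __m256 vb   = _mm256_loadu_ps(b);
    __m256 vsum = _mm256_add_ps(va, vb);     /* va and vb are not clobbered */
    __m256 vres = _mm256_mul_ps(vsum, va);   /* reuse va because it survived */

    _mm256_storeu_ps(out, vres);
    for (int i = 0; i < 8; i++)
        printf("%g ", out[i]);
    printf("\n");
    return 0;
}
```

Eight single-precision lanes per instruction is where the "up to 2x peak FLOPs" over 128-bit SSE comes from; the non-destructive three-operand form saves the extra register-copy instructions that SSE's two-operand form forces on the compiler.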

OK, I take the bolded part to mean that if Sandy is working with Larrabee, which has AVX, it will work faster.

Gelsinger would not comment on the relationship between the Sandy Bridge AVX instructions and Larrabee's vector instructions, but clearly it would make no sense to maintain two unrelated vector architectures.

There is one other small thing. Maybe it's nothing: the original code name, Gesher, means bridge in Hebrew. A bridge to what? The GPU? Or is it just another silly name?

Then you look at Intel's new name, Sandy Bridge. Now does that mean something? Likely not. When I think of sandy I think of grains. What is it about CPUs that has something to do with grains? God, for the life of me I can't think what that could symbolize. CAN YOU!

Then there is Sandy's memory system; it looks to be the same setup as Larrabee's.