
Intel Larrabee architecture revealed

Originally posted by: Martimus
Originally posted by: Aberforth
I hope it crushes ATI and NV.

After reading that article, I sincerely doubt it will be able to compete in the high end. Maybe after a few iterations, Intel will be able to compete though.

Intel is a pioneer in multi-threading; I'm sure they're going to leverage the software side using optimizations similar to their CPU architecture, like SSE 1-5, EM64T, VT, etc.

Edit: It might be designed for industrial use at the beginning, but Intel knows a ton of things that NV doesn't, and they've got plenty of money.
 
So this isn't going to be a workstation-only card like it was rumored to be before?

Well, the graph showed that performance in two games scaled as the number of cores went up in multiples of 8, but 5 times 1 FPS is still unplayable.
So scaling is nice, but the baseline "1" has to be reasonable. We can't know yet what it's going to be like, so everything at this point is pure speculation.

And we still don't know anything about power consumption or heat output.
Today it's not all about pure performance at any cost.
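The arithmetic behind that objection can be put in a few lines. This is a purely illustrative sketch (the numbers are made up, not taken from the article's graph): linear scaling multiplies the baseline, so a bad baseline stays bad.

```python
# Illustrative sketch: why linear scaling from a low baseline still
# yields unplayable frame rates. All numbers are hypothetical.

def projected_fps(base_fps: float, cores: int, base_cores: int = 8) -> float:
    """Project FPS assuming perfectly linear scaling with core count."""
    return base_fps * (cores / base_cores)

# If 8 cores manage only 1 FPS, even perfect scaling to 40 cores gives 5 FPS.
print(projected_fps(1.0, 40))   # -> 5.0
# The baseline matters: 6 FPS at 8 cores would reach 30 FPS at 40 cores.
print(projected_fps(6.0, 40))   # -> 30.0
```

In other words, scaling graphs only tell you the slope; playability depends on the intercept.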
 
Originally posted by: keysplayr2003
Originally posted by: Cookie Monster
I'll bet my cookie jar on it. 😀

I'm kinda with you on this one. I think, at best, the Intel solution will have performance equivalent to what Volari V8 was to the competitive offerings of NV/ATI at the time. And just as many driver glitches as Volari had.

I hope they prove me wrong however. :thumbsup:

I doubt that. They have the 3Dlabs driver team. If there's one team that can put out far better drivers than NVIDIA (which isn't saying much at this point) and ATI combined, it's the 3Dlabs team. They've been doing professional-quality drivers since before NVIDIA even existed.
 
Originally posted by: Borealis7

And we still don't know anything about power consumption or heat output.
Today it's not all about pure performance at any cost.


Exactly my thoughts... I don't remember reading anything in that article about power, heat, or anything like that. So is it going to be viable, or a monster heater just a little short of a steam engine?
 
I'll be honest and say that the technical nuances of the Larrabee approach vs. Nvidia vs. ATI, and the implications for the market, are mostly beyond me. A good third player would be handy though, for competitive reasons. It would certainly keep NV and ATI on their toes and not completely focused on each other. 🙂
 
Originally posted by: racolvin
I'll be honest and say that the technical nuances of the Larrabee approach vs. Nvidia vs. ATI, and the implications for the market, are mostly beyond me. A good third player would be handy though, for competitive reasons. It would certainly keep NV and ATI on their toes and not completely focused on each other. 🙂

Intel is building a general parallel processor using software emulation that it intends to compete against specialized hardware. I can't imagine that being able to compete, regardless of how good their hardware and software are. You have to remember that nVidia and AMD are also very good at designing parallel processors, and have much more experience with it, so any advantage Intel has in that area would be minor at best. (It would likely be a weakness compared to the competition, actually.) This really sounds like they are building Larrabee to be a GPGPU, with a side effect of being able to also play games at acceptable framerates.
 
They are trying to conquer the world. If they start brewing beer, I'm filing an anti-trust suit! No one company should get that much of my paycheck.
 
Larrabee drivers will get hacked, then we'll have a system with:

Nehalem/Deneb CPU + Larrabee + NV/ATI GPU

That sounds more interesting to me 🙂

For games, Larrabee handles physics and AI, the NV/ATI GPU does the graphics work, and the rest is left to the CPU.
 
Originally posted by: Martimus
Originally posted by: racolvin
I'll be honest and say that the technical nuances of the Larrabee approach vs. Nvidia vs. ATI, and the implications for the market, are mostly beyond me. A good third player would be handy though, for competitive reasons. It would certainly keep NV and ATI on their toes and not completely focused on each other. 🙂

Intel is building a general parallel processor using software emulation that it intends to compete against specialized hardware. I can't imagine that being able to compete, regardless of how good their hardware and software are. You have to remember that nVidia and AMD are also very good at designing parallel processors, and have much more experience with it, so any advantage Intel has in that area would be minor at best. (It would likely be a weakness compared to the competition, actually.) This really sounds like they are building Larrabee to be a GPGPU, with a side effect of being able to also play games at acceptable framerates.


The distinction is increasingly meaningless as GPUs become more programmable. Larrabee isn't going to be a very effective general-purpose processor design; it's extremely heavily slanted towards vector processing. I wouldn't really call it much less specialized than a GPU.
 
Originally posted by: woolfe9999
Is the initial Larrabee product supposed to be an add-on card or some sort of IGP?

- woolfe

Exactly...


When we get some early silicon back, and a hint of performance, then this will be relevant. Until then, they'll get the hairy eyeball. General-purpose software (no matter how parallel) vs. specialized hardware (with some programmability): hardware tends to win these battles.

But I really hope it succeeds, just so we can see Id and the rest of the "visionaries" not be confined to the available hardware specs/versions when incorporating 3D features. The idea that developers are confined to what we are given, by way of DX version, well... sucks. Id's Rage is still DX9-based, I believe.

I like the idea of the software guys thinking up cool ideas and pushing the hardware as far as it can go. It definitely reminds me of the early 3D days. I'm also excited about not being tied down to a DX version feature set (I'm still not on Vista for DX10), and letting drivers add new functionality/compatibility. The side effect would be a complete software stack for gaming, and a BIG boon for Linux gaming: no more DX-to-OGL conversion steps needed.

Time will tell...
 
Originally posted by: clandren
This kind of reminds me of the Cell processor.

Except the Cell isn't x86 and is a bitch to work on.

I'm skeptical about Larrabee, but it has the potential, and Intel has the resources.

I can see this thing powering the next Xbox for sure...
 
Originally posted by: Kuzi
For games, Larrabee handles physics and AI, the NV/ATI GPU does the graphics work, and the rest is left to the CPU.

Actually this is what I was thinking too... Larrabee would be so much better at handling physics and complicated AI.

But the thing is, it opens up programming to a lot of people... and who knows what people would come up with; there could be so much more than we can imagine now that could use this scale of parallel processing.

I want to know: is there some software/program that can use your video card to encode videos, like converting to DivX or other formats, entirely on the video card?
 
I used to own a 3DLabs graphics card. The drivers for any non-workstation task were nothing to brag about. They also had terrible IQ.
 
Originally posted by: bharatwaja
Originally posted by: Kuzi
For games, Larrabee handles physics and AI, the NV/ATI GPU does the graphics work, and the rest is left to the CPU.

Actually this is what I was thinking too... Larrabee would be so much better at handling physics and complicated AI.

But the thing is, it opens up programming to a lot of people... and who knows what people would come up with; there could be so much more than we can imagine now that could use this scale of parallel processing.

I want to know: is there some software/program that can use your video card to encode videos, like converting to DivX or other formats, entirely on the video card?

Anandtech has a preview of some software destined for Nvidia cards that does what you're talking about.
 
Originally posted by: AmdInside
I used to own a 3DLabs graphics card. The drivers for any non-workstation task were nothing to brag about. They also had terrible IQ.

Yeah, however Intel is said to have had heavy involvement in DX11.

 
Oh man... I was interested in CUDA and got this idea; I thought of doing it as my end-of-semester project. Nevertheless, I'm still going to do it, and I'll just release it as open source, I guess...
 
I was really interested until this part: "Larrabee's default OpenGL/DirectX renderer is tile based". There goes my hope of V-Ray on Larrabee.
Oh well, I'll just keep throwing more MHz at it.
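For anyone unfamiliar with the term: a tile-based renderer bins primitives into screen-space tiles and then processes each tile independently, which is what lets the work spread cleanly across many cores. A minimal illustrative sketch of the binning step (this is a toy, not Larrabee's actual renderer; the tile size and data layout are assumptions):

```python
# Toy sketch of tile-based binning: each triangle is assigned to every
# screen tile its bounding box overlaps, so each tile can later be
# rendered independently on a separate core. Purely illustrative.

TILE = 64  # tile size in pixels (an assumed value)

def bin_triangles(triangles, width, height):
    """Map (tile_x, tile_y) -> indices of triangles whose bounding box
    overlaps that tile. Each triangle is a list of (x, y) vertices."""
    bins = {}
    for i, tri in enumerate(triangles):
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        # Clamp the bounding box to the screen, then cover touched tiles.
        x0, x1 = max(0, min(xs)), min(width - 1, max(xs))
        y0, y1 = max(0, min(ys)), min(height - 1, max(ys))
        for ty in range(int(y0) // TILE, int(y1) // TILE + 1):
            for tx in range(int(x0) // TILE, int(x1) // TILE + 1):
                bins.setdefault((tx, ty), []).append(i)
    return bins

tris = [[(10, 10), (50, 10), (30, 40)],    # fits entirely in tile (0, 0)
        [(60, 60), (200, 60), (60, 200)]]  # bounding box spans many tiles
print(bin_triangles(tris, 256, 256))
```

The catch the poster alludes to is that offline renderers like V-Ray don't map naturally onto this pipeline, since ray tracing doesn't bin work by screen tile in the same way.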

 
Originally posted by: bharatwaja
Originally posted by: Kuzi
For games, Larrabee handles physics and AI, the NV/ATI GPU does the graphics work, and the rest is left to the CPU.

Actually this is what I was thinking too... Larrabee would be so much better at handling physics and complicated AI.

But the thing is, it opens up programming to a lot of people... and who knows what people would come up with; there could be so much more than we can imagine now that could use this scale of parallel processing.

I want to know: is there some software/program that can use your video card to encode videos, like converting to DivX or other formats, entirely on the video card?

You can pull apart the source code of Nvidia's Gelato.
It uses the GPU for floating-point tasks.
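The general pattern behind such tools is offloading data-parallel floating-point work to the GPU: the same small kernel runs independently over many blocks of data. A conceptual sketch of that pattern (plain Python running on the CPU; the per-block transform is a made-up stand-in, not anything from Gelato or a real encoder):

```python
# Conceptual sketch of the GPGPU pattern behind GPU video encoders:
# one small floating-point kernel applied independently to many blocks
# of sample data. On a real GPU each block would map to a thread group;
# here everything runs on the CPU. Purely illustrative.

def kernel(block):
    """Toy per-block transform: mean-center the block's samples,
    standing in for real encoder math (DCT, motion search, etc.)."""
    mean = sum(block) / len(block)
    return [s - mean for s in block]

def process_frame(samples, block_size=8):
    """Split a frame's samples into blocks and apply the kernel to each.
    The blocks are independent, which is what makes the work GPU-friendly."""
    blocks = [samples[i:i + block_size]
              for i in range(0, len(samples), block_size)]
    return [kernel(b) for b in blocks]

frame = [float(i % 16) for i in range(32)]  # fake 32-sample "frame"
out = process_frame(frame)
print(len(out), len(out[0]))  # 4 blocks of 8 samples each
```

The independence of the blocks is the whole point: no block waits on another, so thousands of GPU threads can chew through a frame at once.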
 
Originally posted by: SunnyD
Originally posted by: keysplayr2003
Originally posted by: Cookie Monster
I'll bet my cookie jar on it. 😀

I'm kinda with you on this one. I think, at best, the Intel solution will have performance equivalent to what Volari V8 was to the competitive offerings of NV/ATI at the time. And just as many driver glitches as Volari had.

I hope they prove me wrong however. :thumbsup:

I doubt that. They have the 3Dlabs driver team. If there's one team that can put out far better drivers than NVIDIA (which isn't saying much at this point) and ATI combined, it's the 3Dlabs team. They've been doing professional-quality drivers since before NVIDIA even existed.

My thoughts exactly... I think the most likely worst-case scenario is that Intel ends up with an absolute beast of a general-purpose card that is only mediocre at actual graphics. However, if it can compete on the high end in that area too, then AMD and nVidia could be in a bit of trouble.
 
Originally posted by: Wreckem
Originally posted by: clandren
This kind of reminds me of the Cell processor.

Except the Cell isn't x86 and is a bitch to work on.

I'm skeptical about Larrabee, but it has the potential, and Intel has the resources.

I can see this thing powering the next Xbox for sure...

I agree, except on the Xbox part. I have a thread on this already, but until we know more we'll leave it at that.

The gist of it is this: I believe Apple/Intel are going to build a gaming console. Intel sells every part required except for Blu-ray. But I see this console as much more. It will in fact be PC-capable, without a CPU if they so choose, or with a CPU onboard; endless possibilities. Now that's interesting. It's also interesting to me that when Larrabee launches, at least 6 game titles made for Larrabee will release at the same time, if Intel isn't trying to blow smoke up our asses. We should see some interesting effects as well as games. If just 2 of these games are good shooters, that I'd like; that would in fact justify my buying the card. I don't know, but if Intel pulls this off it WILL change everything as we know it today. This, gentlemen, is the start of a new era in computing. Don't forget Intel already bought a gaming company, plus Havok, and a new announcement is forthcoming.

You guys all know how I go on about the Elbrus compiler. Well, it looks to me like Intel has given Boris a job that, when completed, will prove whether he is the genius I believe him to be: a notch above all others, a man ahead of his time. We'll see soon enough.

 