BenSkywalker
Diamond Member
DreamWorks Animation said that Larrabee allows them to increase what they can do by 20X.
Versus the old Athlon machines they are upgrading from, I believe that easily.
If Larrabee really sucked, Sony would not want to put it into a PlayStation 4 console that they will be losing money on to begin with.
The only site that even tries to claim it will be is the Inquirer. If they reported the sun was coming up tomorrow, I'd be worried.
Intel has said that all of their driver programmers are focused on their current IGP solutions. They have pulled in the 3DLabs people to work on Larrabee.
3DLabs never wrote a high-performance driver. They wrote extremely robust ones, but their hardware never came close to its peak theoretical performance.
- Larrabee will be a monster when it comes to 3D rendering (hence the DreamWorks comments)
You are assuming that the DX11 GPUs won't be. Why?
Just think of all the time saved in eliminating DX and OpenGL.
Been there, done that. Time saved? You mean the exponential increase in development time? Try this: create a sphere and apply a cube map in x86 code. I can do it using D3D in less time than it takes to write this post; you will spend hours hand-coding it in x86 if you are very good. I was around in the days before the 3D APIs, and I never want to go back, thanks.
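For anyone who hasn't done it, here is roughly what the D3D side of that challenge looks like; a minimal fixed-function Direct3D 9 sketch, assuming an initialized device, a cube map already loaded (e.g. with D3DXCreateCubeTextureFromFile), and a sphere mesh from D3DXCreateSphere (the helper name DrawEnvMappedSphere is mine):

```cpp
#include <d3d9.h>
#include <d3dx9.h>

// Draw a cube-mapped sphere with the fixed-function pipeline.
// The sphere mesh needs normals (D3DXCreateSphere generates them),
// since the reflection vector is derived from the per-vertex normal.
void DrawEnvMappedSphere(IDirect3DDevice9* device,
                         IDirect3DCubeTexture9* envMap,
                         ID3DXMesh* sphere)
{
    // Bind the cube map and let the pipeline compute a per-vertex
    // camera-space reflection vector as a 3-component texture coordinate.
    device->SetTexture(0, envMap);
    device->SetTextureStageState(0, D3DTSS_TEXCOORDINDEX,
                                 D3DTSS_TCI_CAMERASPACEREFLECTIONVECTOR);
    device->SetTextureStageState(0, D3DTSS_TEXTURETRANSFORMFLAGS,
                                 D3DTTFF_COUNT3);
    sphere->DrawSubset(0);
}
```

Doing the same thing by hand in x86 means writing your own rasterizer, cube face selection, and texture sampling before you can draw a single frame.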
Games will be able to run just as fast (or faster) in software mode compared to running in OpenGL/DX.
No, they won't even be close. Intel has already backed away from both that claim and the claim that Larry will keep up with current GPUs, despite a process advantage and despite the comparison being against what will be outdated, last-gen GPUs by the time it ships. This is based on what Intel is claiming as of now, not my assessment.
You seem really concerned about this chip, but when credible sources like DreamWorks and Anand himself are excited
You think a P4 1.4GHz is faster than the i7 Intel chips? That is a serious question, and it relates to how much faith you should put in some of your sources.
You know, Intel isn't the first company that has tried this. Sony thought they were going to do the same thing with Cell, and then realized later in development how stupid it was to build a general-purpose architecture to compete with what is, in essence, a DSP. There just isn't any comparison.
You have my apologies for my initial comments.
Heh, anytime you have any questions about me, just shoot me a PM. As for the person you were concerned about: put enough monkeys banging away on enough keyboards...
