Intel GPU in the PS4?


taltamir

Lifer
Mar 21, 2004
13,576
6
76
What would Hydra help with if it is a software-based solution? (Well, technically all solutions use software, which they call drivers... the issue is whether it is fixed-function or general hardware, and I assume you mean general-hardware based.)
The advantage of general hardware is great scaling and lack of waste; the downside is that it simply cannot do the one or two specific tasks as well... which is why Intel now says they WILL use a combination of fixed-function and general hardware.
So as far as I understand it, it's going to be a regular GPU, only with 50% of the shaders wasted on x86, but with many cores and great multi-core scaling...

The thing is, it's all a bunch of rumors. Intel has demonstrated their capability with the Core 2, Nehalem, and the X25 SSD... so that is where the excitement comes from: despite it sounding stupid on paper (the rumors, mind you), Intel is on a roll and has garnered some faith in their capability to not suck.
Time will tell if this faith is well justified.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Hydra doesn't offer any advantage or have any direct benefit from software rendering per se; I'm just saying it offers Intel a scaling alternative for a higher-end solution where their single GPU might fall short (due to size, cores, performance, etc.). For example, they might not be able to get 120 cores onto a single die or even a single package, but with Hydra, they may be able to put 1, 2, 3, even 4 dies with 40-60 cores each on a single PCB and scale performance.
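To make the multi-die idea concrete, here is a rough back-of-the-envelope sketch. The per-core throughput and the Hydra efficiency figure are purely illustrative assumptions of mine, not anything Lucid or Intel has published:

# Rough sketch: several smaller Larrabee dies behind a Hydra-style load
# balancer vs. one big die. All figures are illustrative assumptions.
PER_CORE_GFLOPS = 40.0       # assumed peak throughput of one Larrabee core
HYDRA_EFFICIENCY = 0.95      # assumed scaling retained per additional die

def multi_die_gflops(dies, cores_per_die):
    single_die = cores_per_die * PER_CORE_GFLOPS
    return sum(single_die * (HYDRA_EFFICIENCY ** i) for i in range(dies))

print(multi_die_gflops(1, 120))   # ~4800 GFLOPS, if a 120-core die could even be built
print(multi_die_gflops(3, 40))    # ~4564 GFLOPS from three smaller, buildable dies

The point isn't the exact numbers; it's that if the glue layer really is close to lossless, splitting the design across dies costs very little peak throughput while sidestepping the giant-die problem.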
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Which is only going to be useful if EVERY rumor so far has been false and they are not going to program DX on an x86 array.
You yourself said you expect it to be "software" (aka generic x86 hardware rather than specialized hardware).
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: taltamir
Which is only going to be useful if EVERY rumor so far has been false and they are not going to program DX on an x86 array.
You yourself said you expect it to be "software" (aka generic x86 hardware rather than specialized hardware).
Not at all, quite the contrary: it'd benefit the most if all these rumors come true and a single Larrabee GPU is not competitive based on current leaked specs and expectations. I think it's generally well accepted that Larrabee's software-based renderer will be slower than the fixed-function raster hardware found on existing GPUs. The point is that Hydra would address deficiencies beyond those limitations: where a single discrete Larrabee GPU may not be able to compete, Hydra would allow for near-100% scaling with each successive core.

This scaling isn't dependent at all on software rendering, as Lucid/Hydra has made similar scaling claims with NV/ATI GPUs. I only made the point because of the general viewpoint that the software-based renderer would cripple Larrabee to begin with.

Larrabee's Rasterizer, explained




 

Spicedaddy

Platinum Member
Apr 18, 2002
2,305
77
91
Sounds very plausible to me... nVidia's greed is upsetting Sony, just like they upset MS with the first Xbox.

Sony needs a graphics chip, so they either:

-design it themselves (too expensive)
-buy the design from ATI (who already has deals with MS and Nintendo, and isn't willing to give the chips away because finances are rough these days)
-buy the design from Intel (looking to enter the discrete graphics market, wants support for its architecture, is financially stable, and is willing to sell for cheap...)

Which would you pick? :D

nVidia's bully attitude is losing them contracts with the world's biggest tech companies; that can't be good for them in the long run.
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Mmm... the Wii does show that you don't have to have the BEST graphics to sell... and if Intel makes a good deal for a company...
 
May 11, 2008
22,599
1,473
126
Originally posted by: Idontcare
Originally posted by: William Gaatjes
I found this information from a post at realworld technologies.


Cell's raw massive FLOPage was what made that possible. You can do a lot with 155.5 GFLOPS on a single chip (single precision Linpack benchmark) - oops, that's the first release. The one in RoadRunner is the new and improved PowerXCell, which peaks at 204.8 GFLOPS single precision, 102.4 GFLOPS double, and you double that for a QS22 blade, each of which has two PowerXCell chips. (Peak quoted because I couldn't find per-chip or -blade Linpack numbers. Sorry.) (Decoder ring: GFLOPS = GigaFLOPS is a mere thousand million per second.) Will IBM continue development of Cell without Playstation volumes? HPC volumes are a nice frosting on the console-volume cake for IBM, but hardly a full meal for a fab. Further development will be required to stay competitive, since a widely-leaked Intel Larrabee presentation, given in universities, indicates (with a little arithmetic) that an initial 16-processor Larrabee is supposed to reach 640 GFLOPS single precision peak, half that for double. That's supposed to see the light of day in late '09 or early 2010, and maybe 24-way, with 32-way and 48-way versions following, providing proportionally larger peak numbers (that's 1,920 GFLOPS for 48-way); and possibly even more with a process shrink. Those are really big numbers. And Cray is planning on using it, so there will be big RoadRunner-ish non-Cell future systems, maybe with a PS4-derived development system. On the other hand, the newer PowerXCell isn't used in the PS3, and development always knew it wouldn't be a high-volume product. Will IBM continue to fund it, perhaps using some of their pile of profits from Services and Software? (In IBM, Hardware has lagged those two for many years in the money department.)


http://perilsofparallel.blogsp...ts-future-of-ibms.html

Those are some very eye-opening numbers.

It begs the question though - why hasn't anyone else produced chips capable of scaling to these numbers that Intel is projecting? What is the secret sauce for Intel that makes Intel suddenly capable of doing this while the existing folks (IBM/Sony, Nvidia, ATi) have slowly struggled to increase their computational processing power (relative to where Intel is expected to start at)?

Is it the process node (45nm) enabling them to just stack enough parallel cores on the chip? Or is it specifically to do with the chip's architecture, being x86, that has kept the non-x86 licensees from doing what Intel is doing?

I'm not saying I believe the numbers that Intel is putting out, nor am I saying I disagree with them.

What I am wondering is what makes Intel's approach here so special and revolutionary that it hasn't been done before and won't readily be duplicated in the immediate future as well?



I personally think it is the complete package: development tools are available, the hardware is good enough to do 60 Hz at 1080p resolutions, and Intel is willing to sell cheap to make friends with Sony. Intel wants to get into the console market, I think, because sooner or later consoles and PCs will merge for most customers.

Intermezzo:
My own thoughts on this are:
Apple makes a lot of money on the hardware/software bundle. I think Microsoft went into the console business to do the same in the end: a Microsoft computer with Microsoft software and Microsoft hardware. Because all units have similar hardware specs and are controlled by Microsoft, there will be fewer driver issues, which Microsoft claims are the number one cause of Windows BSODs. Hacking is more difficult, as is pirating, on an online console. And how about cloud computing? Consoles seem perfect for that, although we might not actually agree with the limited freedom. At the moment, consoles are powerful enough to do what the PC does for most people: internet, email, games, some office applications. The reality is that it's sufficient for most people. We fanatics want big PCs and total freedom and want to know how everything works, but for most people out there, adjusting the BIOS is already scary enough not to touch it. I see a lot of people claiming in this discussion that Larrabee may not be faster than the Nvidia/ATI solutions. Most people do not want to run a game at 200 fps; if it is a constant 60 fps, it is good enough. Enthusiasts buy the best there is, but most people buy what is sufficient. And consoles avoid one thing PCs do suffer from: hardware aging and becoming obsolete that fast. And because the consoles all have the same specs, it is possible to do more low-level programming, pushing performance levels upwards because of fewer software layers in between. Because the TV screens in more and more homes are becoming like huge computer monitors, it all fits together.
End intermezzo.

And I am willing to guess that the first Larrabee may have some fixed-function hardware as well. They did license PowerVR, after all. PowerVR was always very promising but in the past needed a DSP-like front end to do the T&L. Now we have the many-core Larrabee to help, and reading back over all the posts, a 24- or 32-core Larrabee seems possible.
If we take Atom's power numbers for comparison: 2.5 watts at 1.6 GHz. 32 * 2.5 watts = 80 watts with all cores at work. Compare that to the power usage of ATI and Nvidia.
I don't know the clocks for Larrabee, but that seems possible. I am guessing here: maybe an improved 45nm process running at 2.5 GHz? Intel does have a lot of experience now with 45nm high-k...
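Putting the figures floating around this thread together (the leaked 640 GFLOPS for 16 cores quoted earlier, plus my Atom-based power guess), a quick sanity check would look like this; everything here is back-of-the-envelope, not a spec:

# Sanity check on the numbers in this thread. 640 GFLOPS / 16 cores comes
# from the leaked Larrabee presentation quoted above; 2.5 W per core is my
# own Atom-based guess, not anything Intel has stated.
BASE_CORES = 16
BASE_GFLOPS_SP = 640.0
WATTS_PER_CORE_GUESS = 2.5

per_core = BASE_GFLOPS_SP / BASE_CORES          # 40 GFLOPS per core

for cores in (16, 24, 32, 48):
    gflops_sp = per_core * cores
    gflops_dp = gflops_sp / 2                   # half rate for double precision
    watts = cores * WATTS_PER_CORE_GUESS
    print(cores, gflops_sp, gflops_dp, watts)
# 32 cores -> 1280 GFLOPS SP and ~80 W by this rough estimate;
# 48 cores -> 1920 GFLOPS SP, matching the figure quoted above.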



 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
You seem to have been colonized by a marketing infection. Larrabee won't usher in a raytracing revolution. To succeed, it will have to be capable of running contemporary software, and Intel knows this.

To quote Intel-

"First, graphics that we have all come to know and love today, I have news for you. It's coming to an end. Our multi-decade old 3D graphics rendering architecture that's based on a rasterization approach is no longer scalable and suitable for the demands of the future."

Of course they are backpedaling now; they realized how profoundly moronic they were.

If Cell was a better GPU than what Intel would want to put in the PS4, it would be better than RSX, and Sony would have put two Cells in the PS3.

The question is whether Cell in '11 or '12 would be better than Larrabee; by Intel's most optimistic claims it will be, and easily so. That still won't be close to good enough to get the job done, but we'll get back to that.

Where's your source for that?

At 90nm, with 234 million transistors, Cell was 221 mm² with a redundant unit built in to increase yields. They are now running it at 45nm (the second die shrink since launch) at 114.5 mm², with a redundant unit still built in to improve yields.
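(For reference, a quick check of those numbers: perfect scaling over two full nodes would have quartered the die, so the quoted figures show Cell shrank well short of the ideal. This is just my arithmetic on the numbers above:)

# Die-area check using only the figures quoted above.
AREA_90NM = 221.0    # mm^2, Cell at 90nm
AREA_45NM = 114.5    # mm^2, Cell at 45nm

ideal_45nm = AREA_90NM * (45.0 / 90.0) ** 2   # ~55 mm^2 if everything scaled perfectly
achieved = AREA_90NM / AREA_45NM              # ~1.93x reduction actually achieved
print(ideal_45nm, achieved)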

How is Sony still losing money on the PS3 with a 7-core 3.2Ghz Cell?

Couldn't tell you. Can't figure out why it would be more expensive than the 360.

You sound like Ken Kutaragi. How's life in 4-D?

I'm assuming English isn't your first language then. To quote myself-

Sony went with nV because they realized that their goals weren't going to work

KK failed to come close to what he wanted to do with Cell, by a long shot. The difference is that he at least was far more honest in his approach than Intel is being. He made no sacrifices; his design was tailor-made in every aspect to be the most powerful processor of its type per transistor/watt or whatever metric you want to use. He wasn't going to hang on to the front-end costs of converting an archaic ISA to uOps, he wasn't going to saddle his hardware with resources for dealing with OoO code, and he wasn't going to limit his capabilities by using die space, and quite a bit of it, for anything other than functional units or the cache to keep them fed - that was it. Intel is trying to do the exact same thing KK was, only they are making a ton of sacrifices to do it, and somehow they are supposed to come to a different conclusion than KK did? As egomaniacal as KK was, he had to admit he was horribly wrong and had to lose face going to an American company at the 11th hour to save his design. Intel is starting to backpedal now, realizing they have no chance at doing what they were claiming at first. They should have learned from the last pompous, unemployed windbag ;)

The entire point of Larrabee in the PS4 is that Intel would give it away for free.

You see Intel giving away hundreds of millions of dollars in hardware? Let's say I think it more likely that they would name KK CEO of Intel before that happened.

Sounds very plausible to me... nVidia's greed is upsetting Sony,

nVidia only collects a licensing fee from Sony, nothing else. I am perplexed as to what possible 'greed' issues there could be. They agreed upon an IP cost per unit and Sony fabs the chips themselves, so what issue could there be with nV at this point? This is where I see the Inq's logic going into its normal level of idiocy: there seems to be no grounding at all for that initial assertion, and they then base a ton of speculation off of that misplaced assertion.

-design it themselves (too expensive)

They would do another GS before going to Larrabee; the costs would be considerably lower, and it is hard to imagine Sony making anything slower than what Larrabee is shaping up to be.

And I am willing to guess that the first Larrabee may have some fixed-function hardware as well. They did license PowerVR, after all. PowerVR was always very promising but in the past needed a DSP-like front end to do the T&L.

If the tiling rumor is true, it would certainly be the final nail in the coffin of the Larrabee-in-a-console rumor. Geometric complexity is simply too high now; the amount of RAM that would be required to handle the tiling end wouldn't work in the console world. Look at what most developers on PS3 or 360 list as their biggest issue - it isn't computational power.

I don't know the clocks for Larrabee, but that seems possible. I am guessing here: maybe an improved 45nm process running at 2.5 GHz? Intel does have a lot of experience now with 45nm high-k...

In the timeframe we are talking about, you need to think on the order of 10 TFLOPS being the norm for GPUs. This isn't about Intel's first futile offerings; they need something much, much stronger than that.
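(As a rough sanity check on that 10 TFLOPS figure - the starting point and doubling period below are my assumptions, not a roadmap:)

# Rough extrapolation of peak single-GPU throughput.
START_TFLOPS = 1.2          # assumed ~2008/2009 high-end single-GPU peak (SP)
DOUBLING_YEARS = 1.5        # assumed historical doubling period

def projected_tflops(years_out):
    return START_TFLOPS * 2 ** (years_out / DOUBLING_YEARS)

for years in (2, 3, 4, 5):                  # roughly the PS4 timeframe in question
    print(2009 + years, round(projected_tflops(years), 1))
# Lands at roughly 8-12 TFLOPS by 2013-2014, the same ballpark as the 10 TFLOPS above.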
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Let's not turn this into some fanboi BS. Larrabee offers a new direction. If Intel is successful, it's great for everyone. The Sony/Intel thing is a done deal; it's just not official yet. This surprised me; I thought Intel would get the Xbox. But I am happy about it. I want to see Intel & Microsoft go head to head. After all, it was MS that stuck it to Itanic. Yeah, I want to see a PC game platform that doesn't need DX11 or 10/9 but can still play those games.

I would post a link I found to an article on one of the Hydra-on-motherboard chips, but many here prefer to form their own opinions, so we'll let it be as they want it. No sense in using useful information in a debate where attacking a company is more the game than assembling known facts and debating them.

Like this known fact: when a big wig at DreamWorks was talking about Larrabee, in reference to performance, he said Larrabee isn't 2x or 3x faster than current hardware but 20x faster.

Now this is a high-profile gentleman; I would think he would gauge his remarks.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
This is from project offset News.

February 4th, 2009 - Hello from Bot and the Offset team


Hello from Project Offset... 1 year at Intel and going strong!




I am Mark Subotnick, the Executive Producer of Project Offset, known as Bot to many - Botnick got shortened to Bot over the years. I am not a robot or AI, and very far from it...




I am sorry we have been out of touch and have not posted updates and content lately.




About 1 year ago, Intel acquired Project Offset. This March officially marks 1 year since the team moved up north from Southern CA to the Bay Area, and we began to work towards the move from prototype to a first playable and so on. We have been showing you a lot about our tech over the years, and it is still moving forward and advancing very nicely. We will be updating very soon with videos and some more posts about our game.




The game has gone through changes and we will be updating you on what to expect, as soon as possible, and when to expect it. I know you all want to see gameplay and we will show it when it is ready. First impressions are very important.




It is important to note that all you have seen previously was from prototypes and tech demos made internally that we wanted to share or that leaked. We know you have been reading about us for quite some time, and we want you to know that we entered full production last year and have been making great progress. Most of our time has been spent building tools, an engine and editor that will allow us to make the game we have all been excited to play.




Leaks and sneak peeks at what the game might look like have gone out, but we hope to update you soon and show you where we are now. Until then we plan on updating you with all we can. So please check back often and stay tuned.




We have gained some great people over the past year and lost some great people. We thank all who have been a part of Project Offset over the years; you helped us get to where we are today, and for that we are forever grateful.




We will be updating the team page, and the web site as a whole, to make them more current... stay tuned...




The move up to the Bay Area and the addition of new people and elements, combined with being a part of Intel, has made the past year a "forming" year for the team and project. We have eventually become a solid core team with a common objective and goal. The tech, engine, and editor are making production life easier and easier, and we continue to crank out killer content we can't wait to share. We continue to iterate on game mechanics to deliver not only a beautiful and immersive experience, but a fun one as well.




The team wishes we could show it all to you, and we will share all we can... when we can. We appreciate all of your support, passion and feedback.




Thanks again and stay tuned for more updates.




Your EP... Bot
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Nemesis 1
Let's not turn this into some fanboi BS. Larrabee offers a new direction. If Intel is successful, it's great for everyone. The Sony/Intel thing is a done deal; it's just not official yet. This surprised me; I thought Intel would get the Xbox. But I am happy about it. I want to see Intel & Microsoft go head to head. After all, it was MS that stuck it to Itanic. Yeah, I want to see a PC game platform that doesn't need DX11 or 10/9 but can still play those games.

I would post a link I found to an article on one of the Hydra-on-motherboard chips, but many here prefer to form their own opinions, so we'll let it be as they want it. No sense in using useful information in a debate where attacking a company is more the game than assembling known facts and debating them.

Like this known fact: when a big wig at DreamWorks was talking about Larrabee, in reference to performance, he said Larrabee isn't 2x or 3x faster than current hardware but 20x faster.

Now this is a high-profile gentleman; I would think he would gauge his remarks.

It nearly brings me to tears to admit this, but I must say this is (1) probably the first Nemesis post where I feel like I read and fully understood the full spirit of the message, and (2) one I have no way to argue against the validity of, so I must accept defeat and admit this seems the most probable explanation/description of our forthcoming reality.

Damn you Nemesis, what did you put in my coffee this morning? And why does it hurt so much when I sit down? But I digress...
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Nemesis 1
Here's a little more on Hydra. Does Intel need Hydra???

http://gpucafe.com/2009/02/luc...-100-closer-than-ever/


http://www.elsa-jp.co.jp/newsrelease/2009/0130_eng.html

Shouldn't Hydra really be called KillerGPU? Sure it works, but it remains to be seen whether anyone really needs it.

It is definitely an efficient means to an end: bust up your large, tough-to-yield die into smaller, more yieldable chips (Yorkfield, anyone?) and then use a near-perfectly-efficient glue layer to pseudo-MCM the GPUs together on-board or on-system.

It doesn't have to be about increasing performance; it can also be about lowering costs (increasing yields).
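The cost side is easy to see with the standard (if simplified) exponential defect-yield model; the defect density and die sizes below are made-up illustrative values:

# Simplified yield model: yield = exp(-defect_density * die_area).
# Defect density and die areas are illustrative, not real process data.
from math import exp

DEFECTS_PER_MM2 = 0.002

def yield_rate(area_mm2):
    return exp(-DEFECTS_PER_MM2 * area_mm2)

print(yield_rate(600.0))   # ~30% good dies for one big 600 mm^2 GPU
print(yield_rate(150.0))   # ~74% good dies for a quarter-sized chip

Several small dies glued together by a near-lossless interconnect can end up cheaper per good unit than one monolithic die, which is exactly the Yorkfield-style argument.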

The KillerNIC slight was a joke, Nemesis, don't pop a vein.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
I understand humor when I see it. LOL. Hydra, in the future and now, is very good for hardware makers. Many people get pissed when new, better hardware appears, because staying current = spending money. But for our economics to work, it's a needed update.

Wait until Sandy Bridge. There is going to be so much arguing and debating about this chip that we'll all get sick of it. The debate is already in the roadmaps. The guys who will take issue with what Intel is doing are correct in a sense: Intel really doesn't need to wait for Sandy Bridge on 22nm to do what they are doing. But for economics, I agree with Intel's delay and later update. It's all about $$$$. Intel is spending over 8 billion to bring us these chips, so if they want to do FMA in 2 steps it's OK with me. But it's going to bring a lot of debate.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Nemesis 1
Intel really doesn't need to wait for Sandy Bridge on 22nm.

Are you talking about Ivy Bridge (the 22nm Sandy Bridge shrink), or did you really mean Sandy Bridge and mistype 22nm when you meant 32nm?
 

Wreckage

Banned
Jul 1, 2005
5,529
0
0
Originally posted by: Nemesis 1
Let's not turn this into some fanboi BS.

Larrabee offers a new direction. If Intel is successful, it's great for everyone. The Sony/Intel thing is a done deal.

Your first quote and the next 2 contradict each other.

So far LarryB is nothing but smoke and mirrors. The Sony/Intel deal was an Inq fantasy.
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Originally posted by: Wreckage
Originally posted by: Nemesis 1
Let's not turn this into some fanboi BS.

Larrabee offers a new direction. If Intel is successful, it's great for everyone. The Sony/Intel thing is a done deal.

Your first quote and the next 2 contradict each other.

So far LarryB is nothing but smoke and mirrors. The Sony/Intel deal was an Inq fantasy.


Begone. We don't need your anti-talk. This isn't about Intel, it's about products. Who cares who makes them as long as they deliver?


You said this:
So far LarryB is nothing but smoke and mirrors.

I take it ya didn't watch the Super Bowl halftime show. Also, top management at DreamWorks is saying that Larrabee is the real deal.

If you would put your ear to the internet and listen to the whispers, they're all positive. The only negative comes from fanbois like yourself, who KNOW nothing.


 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: Idontcare
Originally posted by: Nemesis 1
Let's not turn this into some fanboi BS. Larrabee offers a new direction. If Intel is successful, it's great for everyone. The Sony/Intel thing is a done deal; it's just not official yet. This surprised me; I thought Intel would get the Xbox. But I am happy about it. I want to see Intel & Microsoft go head to head. After all, it was MS that stuck it to Itanic. Yeah, I want to see a PC game platform that doesn't need DX11 or 10/9 but can still play those games.

I would post a link I found to an article on one of the Hydra-on-motherboard chips, but many here prefer to form their own opinions, so we'll let it be as they want it. No sense in using useful information in a debate where attacking a company is more the game than assembling known facts and debating them.

Like this known fact: when a big wig at DreamWorks was talking about Larrabee, in reference to performance, he said Larrabee isn't 2x or 3x faster than current hardware but 20x faster.

Now this is a high-profile gentleman; I would think he would gauge his remarks.

It nearly brings me to tears to admit this, but I must say this is (1) probably the first Nemesis post where I feel like I read and fully understood the full spirit of the message, and (2) one I have no way to argue against the validity of, so I must accept defeat and admit this seems the most probable explanation/description of our forthcoming reality.

Damn you Nemesis, what did you put in my coffee this morning? And why does it hurt so much when I sit down? But I digress...

+1
 

Nemesis 1

Lifer
Dec 30, 2006
11,366
2
0
Now I will go off-topic a little. It's not fanboi-ism either, unless you're not American.

Intel is spending over 8 billion on 32nm. I would like to see Intel send out an even stronger message to the world: announce a new circuits factory here in the USA using a state-of-the-art process and start building AMERICAN-made motherboards and video cards. Keep it here, baby. I think it's time the USA moves the PC infrastructure into American factories. That's what the stimulus package should have been about: American-made LCDs, American-made everything. WE do not need THEM. They need us. Let's not let them change that fact.
 
May 11, 2008
22,599
1,473
126
Originally posted by: BenSkywalker

KK failed to come close to what he wanted to do with Cell, by a long shot. The difference is that he at least was far more honest in his approach than Intel is being. He made no sacrifices; his design was tailor-made in every aspect to be the most powerful processor of its type per transistor/watt or whatever metric you want to use. He wasn't going to hang on to the front-end costs of converting an archaic ISA to uOps, he wasn't going to saddle his hardware with resources for dealing with OoO code, and he wasn't going to limit his capabilities by using die space, and quite a bit of it, for anything other than functional units or the cache to keep them fed - that was it.

You forget that good development tools are sometimes more appreciated than raw horsepower, assuming Larrabee will not be all that special. When the PS3 came out, developers were complaining that the PS3 is hard to program, just as the PS2 was. I am sure Intel is betting that developers will be a lot happier to program for Larrabee, since the learning curve will be smaller with the development tools Intel will provide. I think that is what the people from Neoptica are working on. Developers want to tap into a lot of horsepower, but if it costs too much they won't really be touching it that fast. That is what Microsoft realized when they wanted a 3-core general-purpose CPU for the Xbox 360. Cell is much more powerful but harder to program. In the current timeframe these complaints will be a lot less, I assume...
 
May 11, 2008
22,599
1,473
126
Originally posted by: chizow
Originally posted by: taltamir
Which is only going to be useful if EVERY rumor so far has been false and they are not going to program DX on an x86 array.
You yourself said you expect it to be "software" (aka generic x86 hardware rather than specialized hardware).
Not at all, quite the contrary: it'd benefit the most if all these rumors come true and a single Larrabee GPU is not competitive based on current leaked specs and expectations. I think it's generally well accepted that Larrabee's software-based renderer will be slower than the fixed-function raster hardware found on existing GPUs. The point is that Hydra would address deficiencies beyond those limitations: where a single discrete Larrabee GPU may not be able to compete, Hydra would allow for near-100% scaling with each successive core.

This scaling isn't dependent at all on software rendering, as Lucid/Hydra has made similar scaling claims with NV/ATI GPUs. I only made the point because of the general viewpoint that the software-based renderer would cripple Larrabee to begin with.

Larrabee's Rasterizer, explained

According to the article, Intel has put its best software engineers and former 3DLabs software engineers on the task of making drivers for Larrabee.

I asked Intel who was working on the Larrabee drivers, thankfully the current driver team is hard at work on the current IGP platforms and not on Larrabee. Intel has a number of its own software engineers working on Larrabee's drivers, as well as a large team that came over from 3DLabs. It's too early to say whether or not this is a good thing, nor do we have any idea of what Intel's capabilities are from a regression testing standpoint, but architecture or not, drivers can easily decide the winner in the GPU race.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Like this known fact: when a big wig at DreamWorks was talking about Larrabee, in reference to performance, he said Larrabee isn't 2x or 3x faster than current hardware but 20x faster.

Now this is a high-profile gentleman; I would think he would gauge his remarks.

There is a rather lengthy explanation of why this would be true and Larrabee would still be utterly horrible for real time, but to keep it as simplistic as possible: what form of occlusion culling were they using?

If you are going to ray trace, then Larrabee would be much, much faster than any current GPU - actually, something is horribly wrong if it is only twenty times the speed at the moment. But what we are talking about is ray tracing on Larrabee versus rasterization on the alternatives.
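Very roughly, the reason the comparison flips between offline and real-time rendering is that the two approaches scale with different things. A toy cost model (the constants are mine and purely illustrative) shows the shape of it:

# Toy cost model: rasterization work grows roughly with submitted triangles,
# while a ray tracer walking an acceleration structure grows roughly with
# pixels * log(triangles). Constants are made up for illustration only.
from math import log2

PIXELS = 1920 * 1080
RAYS_PER_PIXEL = 1                    # primary rays only

def raster_cost(triangles):
    return triangles                  # one unit of work per submitted triangle

def raytrace_cost(triangles):
    return PIXELS * RAYS_PER_PIXEL * log2(triangles)   # one BVH walk per ray

for tris in (10**6, 10**8, 10**10):   # game-scale up to film-scale scenes
    print(tris, raster_cost(tris), raytrace_cost(tris))
# At game-level triangle counts the rasterizer wins; at film-scale geometric
# complexity the logarithmic ray tracer pulls far ahead - which is why a
# DreamWorks-style workload and a real-time game point in opposite directions.

And that ignores culling entirely, which is the occlusion-culling question above: a rasterizer that throws most of the scene away before it ever touches the pixels looks much better than this naive count suggests.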

You forget that good development tools are sometimes more appreciated than raw horsepower, assuming Larrabee will not be all that special.

For PC devs that does tend to be the case. A good portion of the PS3 devs, and all of the top developers on the platform, won't even use the tools that are available from Sony; they built their own from the ground up. That is the norm for top-tier console developers.

According to the article, Intel has put its best software engineers and former 3DLabs software engineers on the task of making drivers for Larrabee.

3DLabs was great at making accurate and robust drivers. They sucked at making fast ones. That isn't a slam - their market demanded accuracy and stability above all else - but they certainly aren't known for high-performance drivers.

Perhaps when I have more time I'll get a bit more into why a ray tracer will kill a rasterizer for the type of rendering that DreamWorks needs and why that doesn't translate over to real time and won't in the foreseeable future; it would be a fairly lengthy post.
 
May 11, 2008
22,599
1,473
126
Originally posted by: BenSkywalker
Like this known fact: when a big wig at DreamWorks was talking about Larrabee, in reference to performance, he said Larrabee isn't 2x or 3x faster than current hardware but 20x faster.

Now this is a high-profile gentleman; I would think he would gauge his remarks.

There is a rather lengthy explanation of why this would be true and Larrabee would still be utterly horrible for real time, but to keep it as simplistic as possible: what form of occlusion culling were they using?

If you are going to ray trace, then Larrabee would be much, much faster than any current GPU - actually, something is horribly wrong if it is only twenty times the speed at the moment. But what we are talking about is ray tracing on Larrabee versus rasterization on the alternatives.

You forget that good development tools are sometimes more appreciated than raw horsepower, assuming Larrabee will not be all that special.

For PC devs that does tend to be the case. A good portion of the PS3 devs, and all of the top developers on the platform, won't even use the tools that are available from Sony; they built their own from the ground up. That is the norm for top-tier console developers.

According to the article, Intel has put its best software engineers and former 3DLabs software engineers on the task of making drivers for Larrabee.

3DLabs was great at making accurate and robust drivers. They sucked at making fast ones. That isn't a slam - their market demanded accuracy and stability above all else - but they certainly aren't known for high-performance drivers.

Perhaps when I have more time I'll get a bit more into why a ray tracer will kill a rasterizer for the type of rendering that DreamWorks needs and why that doesn't translate over to real time and won't in the foreseeable future; it would be a fairly lengthy post.

Intel has also acquired Pixomatic, a professional software rasterizer.

Pixomatic website