Why Intel will fail with Larrabee


dflynchimp

Senior member
Apr 11, 2007
468
0
71
Originally posted by: BFG10K
Deprecated or not, the drivers for GMA have historically been poor, especially for gaming. I saw some OpenGL tests being run on one a while ago, and it was basically failing almost all of them ("unsupported").

If Intel wants to compete in the discrete market they are going to have to compete with ATi's and nVidia's driver programmers, teams that have had over a decade of experience with tuning drivers for games.

True, but Intel also has never seriously tried to compete in the discrete graphics market. It would be unfair not to give them the benefit of the doubt and to automatically assume that they will fail. After all, we all love competition between these companies.
 

Xentropy

Senior member
Sep 6, 2002
294
0
0
That looks to me (maybe it's just Wikipedia introducing some bias like they're not supposed to; I wasn't familiar with the card at the time) like Intel pushing a shill product just to accelerate adoption of AGP and thus sell more chipsets. In this case, Larrabee is actually supposed to compete with other serious gaming options, not just act as a proof of concept for an associated technology, nor (like GMA) just exist as filler for the low-end.
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: MODEL3
Originally posted by: nitromullet
I don't think Intel will fail. They might not have a top performer with Larrabee, but maybe their next iteration, or the one after that.

The thing that people have to start to understand is that Intel, AMD, and NV aren't fighting over who can build the fastest cpu, gpu, or chipset, but over who will control the platform. The components are just pieces of that overall fight, and while building the fastest/best helps build mindshare, the real key is who can put it all together in a solid, reliable package that is easy to sell to the end user. Whoever can do this controls the platform, and can make the rules. It is important for Intel that x86 remains the standard, so they need to put a stop to all this 'GPGPU nonsense' before we all realize that x86 could actually be replaced by something better if we allowed it to happen.

I like how you put it.

What helps Intel's effort to control (they don't want to put a stop to) the whole GPGPU prospect is that Larrabee is going to be at 45nm at launch.

Otherwise, if they launch at 32nm while not being competitive with ATI & NV, they would not have a chance to succeed in controlling the whole GPGPU prospect.

What perf. advantage can Intel bring with an x86-type architecture GPU within the same 2-year manufacturing cycle? 1.5X? (I doubt 2X, but let's say 2X max.)

So since ATI & NV can now do more than 1.5X per year (think what they can do if they believe the existence of their companies is at stake), they can always (in the 32nm scenario) offer something more than Intel in the competition game.

So in the 32nm launch scenario NV & ATI will have the upper hand, and Intel will be forced to exit the gaming GPU market in a year or two.

Intel only needs to stay in the game for a few years (4-5) in order to take the upper hand.

I'm pretty sure I don't understand how Intel's 45nm Larrabee release would help them maintain control, but a 32nm release would not give them a chance. What logic are you using here? What did I miss?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: Xentropy
That looks to me (maybe it's just Wikipedia introducing some bias like they're not supposed to; I wasn't familiar with the card at the time) like Intel pushing a shill product just to accelerate adoption of AGP and thus sell more chipsets. In this case, Larrabee is actually supposed to compete with other serious gaming options, not just act as a proof of concept for an associated technology, nor (like GMA) just exist as filler for the low-end.

Whatever you wish to call it, that was Intel's last attempt at discrete graphics. Whatever the reasoning for the introduction of the product, it still existed and cannot be discounted for any reason. Now, that was an extremely long time ago. There is apparently a new Intel these days, and if they wish to remain consistent with the success of their Core technology and not release an absolute DUD in Larrabee, they definitely need to put their nose to the grindstone and get serious if they intend to compete with Nvidia and AMD in the discrete graphics/GPGPU department.

I wouldn't use Intel's i740 as a reference regarding what to expect from Larrabee.

 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Xentropy

That looks to me (maybe it's just Wikipedia introducing some bias like they're not supposed to; I wasn't familiar with the card at the time) like Intel pushing a shill product just to accelerate adoption of AGP and thus sell more chipsets.
Kind of like Larrabee being pushed as a shill product just to sell more x86 chips?

In this case, Larrabee is actually supposed to compete with other serious gaming options, not just act as a proof of concept for an associated technology, nor (like GMA) just exist as filler for the low-end.
So was the i740, and look how that turned out. Additionally, the commentary from that article bears striking similarity to where we are now:

In the lead-up to the i740's introduction, the press widely commented that it would drive all of the smaller vendors from the market. As the introduction approached, rumors of poor performance started circulating. In spite of this, pundits continued to agree that its release would have enormous effects on the market. Peter Glaskowsky noted that "Very few of the manufacturers have the access to the [manufacturing plants] that Intel does, S3 could be the big loser here--it doesn't sell to the performance market. Intel has the resources to beat S3 on those terms and they have the performance".
This sounds suspiciously like the Larrabee pre-release hype we have now: that because Intel has manufacturing muscle, they will automatically become a major player in the graphics arena.

Now we move on to the actual release:

The AGP Texture concept soon proved to be a tremendous error in design, because the card had to constantly access the textures over a channel that was upwards of eight times slower than RAM placed on the graphics card itself.
Wow, more similarities, namely Larrabee pushing an unorthodox multi-core software approach to hardware rasterization and trying to downplay it by pushing a ray tracing agenda instead, all to sell more x86 chips.

Tell me, what hard proof does anyone have that Larrabee will succeed? Has anyone seen any benchmarks in actual games? What past success from Intel in the discrete arena leads anyone to believe Larrabee will succeed?

Not only has Intel competed in the discrete market before, they also failed woefully. So until I see something concrete to the contrary, Larrabee is essentially i740 revision two, and it's absolutely valid to draw comparisons between the two products.
 

faxon

Platinum Member
May 23, 2008
2,109
1
81
Originally posted by: Keysplayr
Originally posted by: MODEL3
Originally posted by: nitromullet
I don't think Intel will fail. They might not have a top performer with Larrabee, but maybe their next iteration, or the one after that.

The thing that people have to start to understand is that Intel, AMD, and NV aren't fighting over who can build the fastest cpu, gpu, or chipset, but over who will control the platform. The components are just pieces of that overall fight, and while building the fastest/best helps build mindshare, the real key is who can put it all together in a solid, reliable package that is easy to sell to the end user. Whoever can do this controls the platform, and can make the rules. It is important for Intel that x86 remains the standard, so they need to put a stop to all this 'GPGPU nonsense' before we all realize that x86 could actually be replaced by something better if we allowed it to happen.

I like how you put it.

What helps Intel's effort to control (they don't want to put a stop to) the whole GPGPU prospect is that Larrabee is going to be at 45nm at launch.

Otherwise, if they launch at 32nm while not being competitive with ATI & NV, they would not have a chance to succeed in controlling the whole GPGPU prospect.

What perf. advantage can Intel bring with an x86-type architecture GPU within the same 2-year manufacturing cycle? 1.5X? (I doubt 2X, but let's say 2X max.)

So since ATI & NV can now do more than 1.5X per year (think what they can do if they believe the existence of their companies is at stake), they can always (in the 32nm scenario) offer something more than Intel in the competition game.

So in the 32nm launch scenario NV & ATI will have the upper hand, and Intel will be forced to exit the gaming GPU market in a year or two.

Intel only needs to stay in the game for a few years (4-5) in order to take the upper hand.

I'm pretty sure I don't understand how Intel's 45nm Larrabee release would help them maintain control, but a 32nm release would not give them a chance. What logic are you using here? What did I miss?

Basically, if Intel were to release Larrabee at 32nm, it would be on a fresh process tech that had just started mass production, leaving no room for improvement other than stepping updates and SKUs that fill the cracks every $20 you increase your budget. However, if they release at 45nm, then 6-8 months down the road, once they have the kinks ironed out of their 32nm process and yields are near 80-90% with most chips meeting the highest binning standards, they could release another generation of parts to upgrade their 45nm lineup, offering increased performance at the same or lower cost, regaining mindshare, and maintaining a process tech lead until GF's 32nm/28nm processes are ready for mass GPU production.

From a business perspective it makes a lot of sense, especially if they have any prior knowledge of actual ATI or Nvidia next-gen performance. If they build their design to match what they expect ATI and Nvidia to have on the market at the time (which they probably are already doing), they should be competitive in at least the midrange to low-end gaming market, which is where most of the profits are, with more than enough funding and research staff to iron any kinks out of the arch for a revision at 32nm that improves performance and takes advantage of a die shrink at the same time. My question: how long would it take ATI or Nvidia to catch up again, if that were the case?
 

Keysplayr

Elite Member
Jan 16, 2003
21,219
54
91
Originally posted by: faxon
Originally posted by: Keysplayr
Originally posted by: MODEL3
Originally posted by: nitromullet
I don't think Intel will fail. They might not have a top performer with Larrabee, but maybe their next iteration, or the one after that.

The thing that people have to start to understand is that Intel, AMD, and NV aren't fighting over who can build the fastest cpu, gpu, or chipset, but over who will control the platform. The components are just pieces of that overall fight, and while building the fastest/best helps build mindshare, the real key is who can put it all together in a solid, reliable package that is easy to sell to the end user. Whoever can do this controls the platform, and can make the rules. It is important for Intel that x86 remains the standard, so they need to put a stop to all this 'GPGPU nonsense' before we all realize that x86 could actually be replaced by something better if we allowed it to happen.

I like how you put it.

What helps Intel's effort to control (they don't want to put a stop to) the whole GPGPU prospect is that Larrabee is going to be at 45nm at launch.

Otherwise, if they launch at 32nm while not being competitive with ATI & NV, they would not have a chance to succeed in controlling the whole GPGPU prospect.

What perf. advantage can Intel bring with an x86-type architecture GPU within the same 2-year manufacturing cycle? 1.5X? (I doubt 2X, but let's say 2X max.)

So since ATI & NV can now do more than 1.5X per year (think what they can do if they believe the existence of their companies is at stake), they can always (in the 32nm scenario) offer something more than Intel in the competition game.

So in the 32nm launch scenario NV & ATI will have the upper hand, and Intel will be forced to exit the gaming GPU market in a year or two.

Intel only needs to stay in the game for a few years (4-5) in order to take the upper hand.

I'm pretty sure I don't understand how Intel's 45nm Larrabee release would help them maintain control, but a 32nm release would not give them a chance. What logic are you using here? What did I miss?

Basically, if Intel were to release Larrabee at 32nm, it would be on a fresh process tech that had just started mass production, leaving no room for improvement other than stepping updates and SKUs that fill the cracks every $20 you increase your budget. However, if they release at 45nm, then 6-8 months down the road, once they have the kinks ironed out of their 32nm process and yields are near 80-90% with most chips meeting the highest binning standards, they could release another generation of parts to upgrade their 45nm lineup, offering increased performance at the same or lower cost, regaining mindshare, and maintaining a process tech lead until GF's 32nm/28nm processes are ready for mass GPU production.

From a business perspective it makes a lot of sense, especially if they have any prior knowledge of actual ATI or Nvidia next-gen performance. If they build their design to match what they expect ATI and Nvidia to have on the market at the time (which they probably are already doing), they should be competitive in at least the midrange to low-end gaming market, which is where most of the profits are, with more than enough funding and research staff to iron any kinks out of the arch for a revision at 32nm that improves performance and takes advantage of a die shrink at the same time. My question: how long would it take ATI or Nvidia to catch up again, if that were the case?

OK, but kinks in Intel's manufacturing process? Aren't those few and far between?
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: Keysplayr
Originally posted by: Xentropy
That looks to me (maybe it's just Wikipedia introducing some bias like they're not supposed to; I wasn't familiar with the card at the time) like Intel pushing a shill product just to accelerate adoption of AGP and thus sell more chipsets. In this case, Larrabee is actually supposed to compete with other serious gaming options, not just act as a proof of concept for an associated technology, nor (like GMA) just exist as filler for the low-end.

Whatever you wish to call it, that was Intel's last attempt at discrete graphics. Whatever the reasoning for the introduction of the product, it still existed and cannot be discounted for any reason. Now, that was an extremely long time ago. There is apparently a new Intel these days, and if they wish to remain consistent with the success of their Core technology and not release an absolute DUD in Larrabee, they definitely need to put their nose to the grindstone and get serious if they intend to compete with Nvidia and AMD in the discrete graphics/GPGPU department.

I wouldn't use Intel's i740 as a reference regarding what to expect from Larrabee.

I agree. Given where Intel's decision makers have recently been targeting their technology at point of introduction (CPUs and SSDs), I just don't see these same decision makers suddenly going schizophrenic on their shareholders, forgetting how and why they did things precisely as they've done them these past 3 years, and releasing Larrabee at all if its performance is going to be lackluster or embarrassing.

For example, look at what the current decision makers have opted to do with Itanium, taking it out of the game entirely at 45nm and skipping its tick/tock cycle to go to 32nm instead.

This isn't team Barrett, it's team Otellini.

Now, if Otellini were to suddenly vacate his role as CEO tomorrow, then I'd be far less confident that 2010 and 2011 Intel products were going to be a continuation of their 2006-2009 model. But until I see the people making the critical decisions actually screw up, I'm not going to lower my expectations just because the company logo on the letterhead is the same one used by decision makers a decade ago who made some rather poor decisions in their time.

Logos and companies don't make decisions, people do, and the people making decisions at Intel today were not the people making decisions at Intel 10 years ago.
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
Originally posted by: Xentropy
That looks to me (maybe it's just Wikipedia introducing some bias like they're not supposed to; I wasn't familiar with the card at the time) like Intel pushing a shill product just to accelerate adoption of AGP and thus sell more chipsets. In this case, Larrabee is actually supposed to compete with other serious gaming options, not just act as a proof of concept for an associated technology, nor (like GMA) just exist as filler for the low-end.

I really don't think that AGP market adoption was Intel's intention.
The Riva 128 had an AGP model way before the i740 came to the market.
Do you know how many AGP cards Nvidia sold with that model and the ZX? (Also, the ZX launched before the i740.)
It was a huge success.

Why would Intel do anything about AGP market acceleration when everybody loved AGP (I mean within the tech industry)?
Every new mid-range motherboard had an AGP slot, and everybody within the industry and the tech press said nice things about it.
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
Originally posted by: Keysplayr
I'm pretty sure I don't understand how Intel's 45nm Larrabee release would help them maintain control, but a 32nm release would not give them a chance. What logic are you using here? What did I miss?

I meant that in order for Intel to control the whole GPGPU prospect they would have to stay in the discrete GPU market for 4-5 years; they can't do it in the first 1-2 years.

So I said that Larrabee being a 45nm product is a good thing, because 45nm tech gives Intel the ability to move faster than a 2-year manufacturing-process cycle for their future GPUs.

And this is crucial because, first, TSMC since Q4 2005 (90nm) has had roughly 1.5-year cycles, and second, ATI & NV can probably improve the per-year performance of their GPUs more than an x86 architecture can.

Also, in Anand's preview of the Larrabee technology, Intel said that even the move from 4 cores to 6 brings less-than-ideal scaling in performance, so imagine what it will be with more than 6 cores.
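Just to illustrate the kind of diminishing returns Intel was describing (these are not Intel's numbers, only a hypothetical Amdahl's-law sketch that assumes ~95% of the work parallelizes across cores):

# Hypothetical illustration only: Amdahl's-law scaling with an assumed 95% parallel fraction.
def speedup(cores, parallel_fraction=0.95):
    # The serial part stays fixed; only the parallel part shrinks as cores are added.
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for n in (4, 6, 8, 16, 32):
    print(n, "cores ->", round(speedup(n), 2), "x")
# 4 -> ~3.48x, 6 -> ~4.8x, 32 -> only ~12.6x: each extra core buys less.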

So if Larrabee were to launch at 32nm, Intel would have to improve Larrabee's performance for the next 2 years within this framework.

So you can understand what happens if (like you said) Larrabee does not have top DX11 performance (this is the most probable scenario).

For example, in Q2 2009 there were $300 1GHz 4890 models & $100 4770 models. Let's say that the top Larrabee model in Q2 2010, instead of having the DX11 performance of the future Q2 2010 "1GHz 4890" equivalent, will have something like the DX11 performance level of the future Q2 2010 "4770" equivalent (around half).

So the top Larrabee model will probably have the DX11 performance of a Q2 2010 $100 ATI model.

How many years can Intel stay in the discrete gaming GPU market if ATI & NV can increase the performance of their future GPU models much more than Intel can?

In this scenario Intel will be forced, after 1-1.5 years, to sell products whose DX11 performance in a given price range is way lower than what ATI & NV will have.

Of course this scenario has 2 presumptions:

1. Larrabee launches at 32nm
2. The top Larrabee model does not have the performance of a $300 DX11 Q2 2010 ATI model

That's why I said that Larrabee launching at 45nm can mean good things for Intel's future plan.
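A rough back-of-the-envelope on those rates (purely illustrative, assuming the 1.5X-per-year figure for ATI & NV and 2X per 2-year process cycle for Intel, starting from parity):

# Purely illustrative: compare the assumed performance growth rates over four years.
intel_per_2yr_cycle = 2.0   # assumption: Intel gains 2X per 2-year process cycle
atinv_per_year = 1.5        # assumption: ATI/NV gain 1.5X per year

intel_perf = 1.0
atinv_perf = 1.0            # start both from parity at launch
for year in range(1, 5):
    atinv_perf *= atinv_per_year
    if year % 2 == 0:
        intel_perf *= intel_per_2yr_cycle
    print("year", year, "Intel", round(intel_perf, 2),
          "ATI/NV", round(atinv_perf, 2),
          "gap", round(atinv_perf / intel_perf, 2))
# By year 4: Intel ~4X vs ATI/NV ~5.06X; the gap compounds by ~1.125X every 2-year cycle.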

 

Zap

Elite Member
Oct 13, 1999
22,377
7
81
Originally posted by: taltamir
I think the OP is making the point that people will refuse to buy Larrabee because of Intel's lack of support for older chipsets... not that Intel will have subpar drivers for Larrabee on release

Just like people refusing to buy HP printers or Creative Labs sound cards because they tend to stop supporting old products with new operating systems. Oh wait, people still buy those. :roll:

Originally posted by: MODEL3
The first try was with i740

My guess is that they failed because they could not launch new GPU models every year like their competitors did

Actually they succeeded. They put the graphics core into the Northbridge and now sell more graphics chips than all other companies combined.

source
Q2'09
Intel 50.30%
Nvidia 28.74%
AMD 18.13%
everyone else negligible

Who knows? Maybe Larrabee will become the basis for integrated graphics in a few years?
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
Originally posted by: Zap
Originally posted by: MODEL3
The first try was with i740

My guess is that they failed because they could not launch new GPU models every year like their competitors did

Actually they succeeded. They put the graphics core into the Northbridge and now sell more graphics chips than all other companies combined.

source
Q2'09
Intel 50.30%
Nvidia 28.74%
AMD 18.13%
everyone else negligible

Who knows? Maybe Larrabee will become the basis for integrated graphics in a few years?

You missed the previous line:

Originally posted by: MODEL3
I guess this is the second try for Intel to enter the discrete GPU business, right?

The first try was with the i740 at the end of 1997, if I remember correctly. What I do remember was that after a few months you could buy the i740 a LOT cheaper than the Voodoo 1, NV Riva 128, PowerVR, or Matrox G100.

So we were talking about discrete GPU business.



Originally posted by: Zap
They put the graphics core into the Northbridge and now sell more graphics chips than all other companies combined.

Yes, I agree of course. I just want to add that there was an IGP solution on the market, if I remember correctly based on SiS chipsets, way before that.

 

Majic 7

Senior member
Mar 27, 2008
668
0
0
I wonder what kind of predictions people would be making if Intel were just now talking of getting into the SSD business? Never underestimate the power of lots of brains and money against some brains and some money.
 

sandorski

No Lifer
Oct 10, 1999
70,783
6,341
126
Originally posted by: Majic 7
I wonder what kind of predictions people would be making if Intel were just now talking of getting into the SSD business? Never underestimate the power of lots of brains and money against some brains and some money.

Don't overestimate it either. Intel has had many flops that many people assumed would be successes simply because Intel developed or was pushing them, i740 and RDRAM being 2 very big examples.
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
Originally posted by: sandorski
Originally posted by: Majic 7
I wonder what kind of predictions people would be making if Intel were just now talking of getting into the SSD business? Never underestimate the power of lots of brains and money against some brains and some money.

Don't overestimate it either. Intel has had many flops that many people assumed would be successes simply because Intel developed or was pushing them, i740 and RDRAM being 2 very big examples.

I will add to the failures the 1066MHz PIII fiasco & the i820 fiasco.

About the RDRAM fiasco, what on earth was Intel thinking while developing the RDRAM platform?

SDR/DDR prices were cheaper than candy; this move had such bad timing.

Unless Intel had projected the timetable for the ($$$) failure of Hynix, this was a foreseeable failure.

Of course there was some proportionality between SDR & RDR prices, but with Rambus in charge and only one real manufacturer (Samsung; Elpida was too small and Hynix's RDR modules were not that good), the RDR price level was not right.

So anything can happen, nothing is guaranteed.



 

Majic 7

Senior member
Mar 27, 2008
668
0
0
There is always the possibility of failure. I just don't think it will be on the tech side. Their record lately on the tech side is very good. Too good, maybe. They already produce CPUs that the "Fat girls need love too" company can't compete with, and those aren't exactly flying off the shelves. The market is the problem. If they don't come out with a compelling game to show what their tech can do, no one will have a reason to buy Larrabee. Cuz I don't think it will be cheap. And right now everyone expects cheap.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: Zap
Originally posted by: taltamir
I think the OP is making the point that people will refuse to buy Larrabee because of Intel's lack of support for older chipsets... not that Intel will have subpar drivers for Larrabee on release

Just like people refusing to buy HP printers or Creative Labs sound cards because they tend to stop supporting old products with new operating systems. Oh wait, people still buy those. :roll:
For gosh sakes, the 945 chipset is a current product, and Windows 7 is very nearly a current OS. Not being properly supported really puts a negative spin on all of Intel's product line, as far as drivers go.

 

sandorski

No Lifer
Oct 10, 1999
70,783
6,341
126
Originally posted by: MODEL3
Originally posted by: sandorski
Originally posted by: Majic 7
I wonder what kind of predictions people would be making if Intel were just now talking of getting into the SSD business? Never underestimate the power of lots of brains and money against some brains and some money.

Don't overestimate it either. Intel has had many flops that many people assumed would be successes simply because Intel developed or was pushing them, i740 and RDRAM being 2 very big examples.

I will add to the failures the 1066MHz PIII fiasco & the i820 fiasco.

About the RDRAM fiasco, what on earth was Intel thinking while developing the RDRAM platform?

SDR/DDR prices were cheaper than candy; this move had such bad timing.

Unless Intel had projected the timetable for the ($$$) failure of Hynix, this was a foreseeable failure.

Of course there was some proportionality between SDR & RDR prices, but with Rambus in charge and only one real manufacturer (Samsung; Elpida was too small and Hynix's RDR modules were not that good), the RDR price level was not right.

So anything can happen, nothing is guaranteed.

Intel was trying to predict the future with RDRAM. It had some distinct advantages over DDR, and Intel just thought those advantages would propel RDRAM ahead of DDR. It turned out not to work that way. They certainly didn't help their move by crippling the PIII with it.

Kinda similar to the P4: again they were trying to address some kind of future computing idea that turned out not to materialize. They certainly didn't lose $$ on the P4, but what they did lose was market share to the Athlon/Athlon XP (until near the end of the Athlon XP line).
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
Originally posted by: sandorski
Intel was trying to predict the future with RDRAM. It had some distinct advantages over DDR, and Intel just thought those advantages would propel RDRAM ahead of DDR. It turned out not to work that way. They certainly didn't help their move by crippling the PIII with it.

Yes, I know about the advantages of RDR tech compared with DDR tech.
But the actual performance difference in a system was not enough to entice customers.
With the price difference between the RDR platform and the DDR platform you could buy a faster processor, a faster GPU, or more memory.
All of those options were clearly better.

Anyway, I don't want to sound like an "after Christ prophet". (It's a Greek term for someone who predicts something that has already happened.)

 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Drivers being the force that drags Larrabee down is extremely likely, but not for the reason the OP seems to be thinking.

Larrabee, in terms of a GPU, is its drivers. While we may see a couple of percent here or there with nV/ATi, with Larrabee the difference could very easily be tens of thousands of percent up or down. Larrabee is a software solution, and Intel right now has built themselves a team with no experience making high-performance 3D drivers for real-time gaming.
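To make "the driver is the GPU" concrete: on Larrabee even triangle rasterization is just code the software team ships. A toy, purely illustrative sketch of that inner loop (the real thing would be hand-vectorized x86 fanned out across dozens of cores):

# Toy software rasterizer, illustrative only: fills one triangle into a framebuffer.
def edge(ax, ay, bx, by, px, py):
    # Signed area test: >= 0 means the point lies on the interior side of edge a->b
    # for a triangle wound the right way.
    return (bx - ax) * (py - ay) - (by - ay) * (px - ax)

def fill_triangle(buf, w, h, v0, v1, v2, color):
    (x0, y0), (x1, y1), (x2, y2) = v0, v1, v2
    # Walk only the triangle's bounding box, clipped to the framebuffer.
    xmin, xmax = max(min(x0, x1, x2), 0), min(max(x0, x1, x2), w - 1)
    ymin, ymax = max(min(y0, y1, y2), 0), min(max(y0, y1, y2), h - 1)
    for y in range(ymin, ymax + 1):
        for x in range(xmin, xmax + 1):
            if (edge(x1, y1, x2, y2, x, y) >= 0 and
                edge(x2, y2, x0, y0, x, y) >= 0 and
                edge(x0, y0, x1, y1, x, y) >= 0):
                buf[y * w + x] = color

W, H = 16, 8
fb = [0] * (W * H)
fill_triangle(fb, W, H, (1, 1), (14, 2), (4, 7), 1)
for row in range(H):
    print("".join("#" if fb[row * W + c] else "." for c in range(W)))

Every cycle spent in loops like that is "driver" code, which is why the software team's quality matters so much more here than it does for a fixed-function GPU.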

Larrabee is Intel playing defense against nV's assault on the GPGPU front, nothing more.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: MODEL3

About the RDRAM fiasco, what on earth was Intel thinking while developing the RDRAM platform?
They didn't really have a choice as the P4 paid a high price for memory accesses and missed branches because of its lengthy pipeline. Since DDR was in its infancy, they had to go with RDRAM for bandwidth reasons.

That and Willy had a very small cache relative to future derivatives.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Originally posted by: Zap

Actually they succeeded. They put the graphics core into the Northbridge and now sell more graphics chips than all other companies combined.
They only succeeded because of bundling, not because it was the better product. As it stands now, if you buy an Intel motherboard, you almost always get a GMA with it.

Who knows? Maybe Larrabee will become the basis for integrated graphics in a few years?
All integrated solutions suffer from the fundamental fact that sharing system memory is woefully slower than fast onboard memory. If Larrabee starts sharing system memory then I'm not convinced it'll be hugely faster than its predecessor for graphics rendering.
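Rough numbers for the sake of argument (my own illustrative figures: dual-channel DDR2-800 system memory versus the GDDR5 on a Radeon HD 4870):

# Illustrative only: shared system memory vs. dedicated card memory, circa-2009 parts.
ddr2_800_dual_gb_s = 2 * 6.4                   # ~12.8 GB/s, and the CPU shares it
hd4870_gddr5_gb_s = 3.6e9 * 256 / 8 / 1e9      # 3.6 Gbps x 256-bit bus = ~115.2 GB/s
print(round(hd4870_gddr5_gb_s / ddr2_800_dual_gb_s, 1), "x more bandwidth")   # ~9x

And an integrated part only gets a slice of that 12.8 GB/s after the CPU takes its share.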
 

Idontcare

Elite Member
Oct 10, 1999
21,110
64
91
Originally posted by: MODEL3
Anyway, I don't want to sound like an "after Christ prophet". (It's a Greek term for someone who predicts something that has already happened.)

:laugh: In the USA we have a similar phrase, but it stems from sports rather than religion as in your example - Monday morning quarterback
 

MODEL3

Senior member
Jul 22, 2009
528
0
0
Originally posted by: BFG10K
Originally posted by: MODEL3

About the RDRAM fiasco, what on earth was Intel thinking while developing the RDRAM platform?
They didn't really have a choice as the P4 paid a high price for memory accesses and missed branches because of its lengthy pipeline. Since DDR was in its infancy, they had to go with RDRAM for bandwidth reasons.

That and Willy had a very small cache relative to future derivatives.

If I remember correctly, there were DDR chipsets for AMD's CPUs from Q1 2000 (but a good chipset only from Q1 2001, with the arrival of the VIA KT266), so with good DDR chipsets for AMD since Q1 2001, I think that by Q3 2001, when the P4 launched, the time for a P4 DDR chipset was "just right".
I think it was a design choice, not because DDR was in its infancy.
And if you remember, later when Intel made a DDR P4 chipset, the performance difference with the RDR platform was not that great.

I don't remember the 2001 situation clearly, so I may be wrong.