Originally posted by: apoppin
you have no case
Originally posted by: apoppin
of course ... i have shown links, i have quoted wikipedia and there are literally hundreds of links to show P4 was an 'engineering failure'
you, OTOH, have ... *nothing* ... zero ... no support and no links ... just yourself to quote ... over and over
:roll:
instead of giving evidence ... you *nitpick* at the opposing mountain of evidence in a vain attempt to cloud the fact that you still have ... *nothing* ... no case whatsoever
--and Intel engineers *cannot* speak for Intel [period] ... they sign an 'unbreakable' NDA
Originally posted by: apoppin
i'd say you're irrelevant
Originally posted by: apoppin
and you didn't show *anything*
we have trolls in video who do the exact same thing ... they have nothing to offer or any contributions to add to a thread, so they nitpick other members posts and dispute "sources" though they have none of their own to offer ... we tend to ignore them
ok ... Conroe is a really fantastic chip. The funny thing is very few people in the industry have been willing to come out and say that the Pentium 4 architecture sucks. It sucked all along. Even at the height of its sucking, when it was running at 3.6GHz and not performing as well as a 2GHz AMD64... people were reluctant to say it sucked... so IT SUCKS! But Conroe really makes up for that, and I am really happy to see that Intel is back on this track of extremely high computing performance at reasonable clock rates.
Originally posted by: Scholzpdx
It was the first post on failblog.
Originally posted by: myocardia
Originally posted by: Scholzpdx
It was the first post on failblog.
You didn't notice that this thread was 2½ years old? :laugh:
Originally posted by: deimos3428
I don't see how anyone could claim the P4 was anything other than a major success. I've still got one chugging along at home, and you really can't call a chip that works for 10+ years a failure.
Success isn't relative. It doesn't matter if it did better than AMD's offerings, or if it was the best possible design ever. It was a vast improvement over the P3, it sold a ton of units and made money for Intel. Which happened to allow the company to employ a lot of people and keep making even better chips. You wouldn't have the i7 without the P4.
Originally posted by: alyarb
their position in the industry allowed them to sell just as many netburst products as they would have sold on a better or worse design. it was a success in many respects, but it was an inefficient architecture throughout its life. i doubt intel foresaw the thermal challenges ahead of them with the high-frequency 90nm parts. they went from 80 watts at 130nm to what, 115 watts with prescott? for a little single core, that is a gargantuan increase in power, and it's all attributable to the frequencies they were gunning for. all this during a time when heatsink design was still very 1990s and uninspired. usually a die shrink is intended to save power, but they had to scale to get away from K8. they really painted themselves into a corner. in the meantime it was pretty interesting to see 2GHz K8s performing like 3GHz pentiums. we'll never see it again, that's for sure.
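The power jump described above can be sketched with the standard first-order CMOS switching-power relation (a textbook back-of-envelope model, not something from the thread; here α is the activity factor, C the switched capacitance, V the supply voltage, f the clock frequency):

```latex
% Dynamic (switching) power of a CMOS chip -- standard first-order model
P_{\mathrm{dyn}} \approx \alpha\, C\, V^{2}\, f
% Reaching higher f generally requires raising V as well; with V scaling
% roughly in proportion to f, power grows about as f^3, which is why
% chasing ever-higher clocks ran Prescott into a thermal wall.
```

A die shrink lowers C and normally permits a lower V, which is why shrinks usually save power; going from roughly 80 W at 130nm to roughly 115 W at 90nm shows how hard the frequency/voltage push cut the other way.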
Originally posted by: alyarb
Larrabee isn't gunning for high freqs, and it's incredibly wide. it's like the anti-netburst when you think about it. plus they have HKMG now. if you could spruce up a prescott with 45nm HKMG, give it QPI to the MCH (or an iMCH) and 32MB of L2 (so it's about the size of an 800M-transistor chip), it would look a lot prettier than it did in 2004. netburst was not the fault of the engineers (in fact, SSE2 performance was astounding initially); AMD would not have done a better job with the same approach (at least not during the same historical era of available techniques). the guys were given a job to do and they did their best with their crude 20th-century implements. larrabee could have a lot of potential depending on how efficient the software renderer is, and it certainly will help with scaling wide x86 apps.