Originally posted by: apoppin
Do you think anyone got fired over Prescott?
- Perhaps "P4" overall wasn't a failure, considering Intel is No. 1
BUT
But - and speculation is what we do best - what do you think would have happened if they had decided to go with "M" and continued the research from the PIII that led to C2D - without taking the 8-year P4 detour with RDRAM and finally with Prescott?
This is a very reasonable post, and it's indeed interesting to consider the possibilities and intricacies involved.
There are several elements to this:
The P4 and Pentium M teams were separate. AFAIK, development on the P3 tech (which was really P2 plus SSE, on-die cache, etc.) never truly stopped; it's just that the NetBurst architecture became Intel's premier desktop product for a long time.
RDRAM/Rambus was a disaster in the marketplace, yet an engineering success in terms of delivering the desired performance. It offered exceptional bandwidth that DDR PC1600 and PC2100 could not equal (not to mention that DDR came out later). Once PC2700/DDR333 arrived, it matched the performance of PC800 RDRAM at a fraction of the cost, though it took a while to really get going (with the later 845 chipsets and, of course, the wonderful nForce2 for Socket A). The last gasp of RDRAM, PC1066, was wickedly fast but even more outlandishly priced, and DDR400/PC3200 was a much better option. I still don't really understand why RDRAM was so expensive. Perhaps it was something about the fabrication or the yields, or perhaps it was because so few places made it (Samsung made the best, IMHO); a combination of factors is most likely. I label RDRAM a disaster because it wasn't on the market for very long, it never became affordable, and *very* few chipsets used it - only i820 and i850, IIRC.
Back to the bounce-back between NetBurst and the P3 descendants: one could argue that things would have moved along more quickly if both teams had been combined to work on the projects that became Conroe and its successors. On the flip side, features developed during the P4's run, such as advanced branch prediction and Hyper-Threading, have been carried into modern Intel CPUs with spectacular results. The i7 in particular is interesting, as Hyper-Threading is back and makes a huge impact in applications that can take advantage of it.
Looking at the leap from the 1.3 GHz Willamette P4 with 256 KB of L2, a 400 MHz FSB, and no HT, to the final 3.6 GHz Cedar Mill with a 65 W, 65 nm core, 2 MB of L2, 64-bit extensions, virtualization, XD, and Hyper-Threading, you can see how much ground was covered. Given that 3.6 GHz fit in 65 W, Intel presumably could have pushed the P4 line into the 4 GHz range, particularly with 45 nm tech on the horizon, but high-IPC, high-performance-per-watt CPUs were becoming far more viable, and it was honestly time to end the P4 era. AMD64 really proved the advantages of doing more with fewer clock cycles, and Intel had been learning a lot from its mobile processor designs.
The P4 was a commercial and business success and met all realistic engineering goals along the way, aside from the growing pains when 90 nm was first attempted and gate leakage drove heat output up. I'm sure Intel would have been happy to keep the P4 going even longer, but AMD really forced their hand with the outstanding Socket 939 and 754 chips. RDRAM and Prescott should go on the record as distinct failures within the mammoth P4 era.
We have the advantage of hindsight, but not the advantage of insider info from Intel or AMD; I'm sure their perspectives are even more interesting.