Question: Intel had a 7 GHz CPU years ago


aigomorla

CPU, Cases&Cooling Mod PC Gaming Mod Elite Member
Super Moderator
Sep 28, 2005
21,065
3,570
126
Damn. I didn't profit much from my affair with Mr. Intel Itanium. It was a fun ISA, but it left me with similar feelings to most of my ex-boyfriends - vague wistfulness, combined with irritation.
Still better than Mrs. Apple, who leaves you utterly dependent and impossible to fix without her consent, not to mention she costs way more than she is worth, seeing as she used to be Mrs. Intel wearing heavy makeup.
 
Jul 27, 2020
26,827
18,471
146
Damn. I didn't profit much from my affair with Mr. Intel Itanium.
You worked for HP? What do you do, exactly? Enterprise software developer, or enterprise systems developer?

Anyway, I have to ask you because you clearly have a grasp on these technical things. Could AVX-512 ever be used to accelerate games? Does any version of Excel use AVX-512 (online or offline)? Does any open-source spreadsheet (like LibreOffice) use AVX-512?

Have you dabbled in OpenCL? I know LibreOffice can accelerate some tasks with OpenCL, but I think it's limited to AMD APUs and GPUs. There really doesn't seem to be much research, or a benchmark sheet comparing the performance of different vendors' GPUs.

What's the most fun you've had in your job? What were the hardware specs and software used in that instance?
 

SarahKerrigan

Senior member
Oct 12, 2014
735
2,036
136
You worked for HP? What do you do, exactly? Enterprise software developer, or enterprise systems developer?

Anyway, I have to ask you because you clearly have a grasp on these technical things. Could AVX-512 ever be used to accelerate games? Does any version of Excel use AVX-512 (online or offline)? Does any open-source spreadsheet (like LibreOffice) use AVX-512?

Have you dabbled in OpenCL? I know LibreOffice can accelerate some tasks with OpenCL, but I think it's limited to AMD APUs and GPUs. There really doesn't seem to be much research, or a benchmark sheet comparing the performance of different vendors' GPUs.

What's the most fun you've had in your job? What were the hardware specs and software used in that instance?

I did not work for HP. I most definitely did work with their products, a lot, and at a very low level.

At the risk of veering off-topic - sure, there are things in games that have enough DLP (data-level parallelism) to benefit from wide vectorization, though I'm skeptical of SIMD as something to throw silicon area at. From where I sit, "comfy SIMD" - i.e., rich features like good mask and permute support, integer rotate, etc. - is more important than wide SIMD, and the needs of technical compute differ enough from the needs of "normal" consumer and even enterprise compute that trying to target one SIMD ISA at both seems dubious. I'm not sure what the state of support for it is in Excel, etc.
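To make the mask point concrete, here's a minimal sketch (assuming an AVX-512F target and the standard immintrin.h intrinsics; it's an illustration, not anyone's shipping code): per-lane masking lets a loop handle the ragged tail of an array in the same vector path, with no scalar clean-up loop - one of the "comfy" features I mean, and it matters regardless of whether the vectors are 128, 256, or 512 bits wide.

```c
#include <immintrin.h>
#include <stddef.h>

/* Minimal sketch (assumes AVX-512F; compile with e.g. -mavx512f):
   clamp an array of floats to a floor value. The lane mask covers the
   ragged tail, so there is no scalar clean-up loop. */
static void clamp_floor(float *data, size_t n, float floor_val)
{
    const __m512 vfloor = _mm512_set1_ps(floor_val);
    for (size_t i = 0; i < n; i += 16) {
        size_t left = n - i;
        /* Enable only the lanes that map to valid elements. */
        __mmask16 k = (left >= 16) ? (__mmask16)0xFFFF
                                   : (__mmask16)((1u << left) - 1u);
        __m512 v = _mm512_maskz_loadu_ps(k, data + i);  /* masked load  */
        v = _mm512_max_ps(v, vfloor);                   /* clamp        */
        _mm512_mask_storeu_ps(data + i, k, v);          /* masked store */
    }
}
```

None of that settles whether games benefit from 512-bit width; it's just what rich masking buys you in practice.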

As for OpenCL, I've touched it, mostly on Xilinx FPGAs, which had some weird limitations that made it seem not terribly useful as a "generic" target for existing OpenCL kernels. It's possible their support has improved since.
 
Jul 27, 2020
26,827
18,471
146
At the risk of veering off-topic
Don't worry about that. I'm the OP. I hereby permit you to veer as far as your heart desires!

Are you doing anything with GPU compute? Do you think Intel's Alchemist iGPU could be used to offload some hefty and useful calculations from the CPU? What are the possibilities and projected performance gains?

Would you be kind enough to accept me as your Padawan, Master?
 
Jul 27, 2020
26,827
18,471
146

The 65nm P4 Extreme Edition 965 3.73 GHz @ 4.6 GHz with DDR3!

Pentium 4's last hurrah!

[Attached screenshots: 1.jpg through 7.jpg]

If it did so well at 65nm with reasonable power draw (compared to today's 250W CPUs), I see no reason why it couldn't have done well at smaller processes, with clock speeds approaching 10 GHz and at least 16 cores.

Doom 2016 results show that well-optimized game engines would've worked very well with this architecture.

Maybe it wouldn't have been a runaway success, but I don't see why they couldn't have kept this line of CPUs going as a low-cost alternative to their Core line.
 
Jul 27, 2020
26,827
18,471
146
Compared to now? lol.
It was ahead of its time.

Remember, one of the reasons Tejas was cancelled was that Intel was internally debating whether the public was ready for liquid cooling. They decided in the end that it wasn't (maybe AIO cooler technology wasn't mature enough at the time, or was just too expensive for the average Joe to afford).

Had affordable 420mm coolers like the ones from Arctic or Deepcool been widely available at the time, I'm pretty sure Tejas would've seen the light of day. One could even say that the cancellation of Tejas affected AMD's future, as the lack of affordable liquid cooling prevented a future Bulldozer CPU from running at 6 GHz or beyond.
 

Pontius Dilate

Senior member
Mar 28, 2008
222
384
136
Liquid cooling may have been part of the calculus in determining Tejas' fate - I hadn't heard that before - but I doubt it was significant. Netburst was a dead-end architecture, and many people at Intel had to know it by the time Tejas was approaching tapeout. Intel was extending the pipeline and doing less work in each clock cycle just to get the clock to go higher, which continuously increased the complexity of the chip, requiring more and more associated circuitry just to keep the thing fed and functional. And even with all that extra complexity and the high clocks pushing their process to its limit, AMD was punching them in the face with the Athlon 64, which had equivalent performance at lower clocks, heat, and cost. Intel was pushing into power and complexity walls simultaneously for increasingly diminishing or negative returns.

At the same time, Apple was dealing with similar problems with their PowerPC G5. That thing was a furnace at the clocks it needed to run at to be useful, and if you downclocked it enough to fit into a mobile power budget it was no more performant than their aging G4. Apple had to go with water cooling themselves on the dual-dual-core G5 Power Macs just to ship them at 2.5 and 2.7 GHz, never mind the 3 GHz that IBM was promising. So if Intel was thinking about water cooling, they had a product in the market to look at, and it didn't look good. Apple by this time had to have been working with Intel, and the Pentium 4 had the exact same problems they were trying to get away from with the G5, so they were waiting for the promotion of the Pentium M to the desktop to start their transition. So I suspect that however far Tejas got in development by 2004-5, Intel already knew it was cooked, literally.
 

NTMBK

Lifer
Nov 14, 2011
10,431
5,760
136
I'd love to see how this stacks up against the low power stuff from a few years later, the likes of Bay Trail and Kabini. I suspect the comparison would be pretty horrific.
 
Jul 27, 2020
26,827
18,471
146
Well, I hope Intel spends its last few billions resurrecting a modern die shrink of Tejas with AVX10.2 and APX and a minimum of 12 cores. It's better to die a fiery death than with a whimper.
 

yottabit

Golden Member
Jun 5, 2008
1,651
817
146
Well, I hope Intel spends its last few billions resurrecting a modern die shrink of Tejas with AVX10.2 and APX and a minimum of 12 cores. It's better to die a fiery death than with a whimper.
Igor, I love your obsession with resurrecting the P4 on a modern node. But I can’t tell if it’s trolling or serious. You know the performance per clock would be in the absolute gutter, right? Do I need to break my P4 system out of the basement to run some PPC uplift benchmarks vs a modern system?

Somewhere a while back you mentioned having each Pentium 4 core be its own NUMA node with 4 GB of RAM each to get around the 32-bit architecture limitation, and I’m still chuckling about that. You’ve put a lot of thought into this.
 
Jul 27, 2020
26,827
18,471
146
Igor, I love your obsession with resurrecting the P4 on a modern node. But I can’t tell if it’s trolling or serious.
I love the idea of the P4, and its life was cut short prematurely by the horrible decision influenced by the monsters at IDC. The Core architecture should've remained a mobile-first design, and maybe by now they would've reached Apple Silicon power efficiency. Instead, those idiots tried to push that efficient architecture out of its efficiency zone, and we got crappy cores like Raptor and Lion Cove. Had they kept optimizing Core for power efficiency rather than frequency, they wouldn't have needed the pathetic Atom, which I have had the misfortune of using in the form of Gracemont. Skymont is great, but at what expense? It took them THIS long to get to a good Atom architecture because they stupidly veered off the right path.

The 65nm P4 showed that it responded well, power-wise, to process shrinks. It should've been kept alive so we could've been at 10+ GHz today. They should've developed more tricks, like turning bad, branchy code on the fly into optimized code more suitable for the P4's pipeline.

Do I need to break my P4 system out of the basement to run some PPC uplift benchmarks vs a modern system?
Which P4 model you got?

I think I have at least two P4s on hand, but I haven't booted them to check which ones they are. One I would need to assemble, and the other I'm not even sure will boot. I got them both because my workplace was throwing them away, but I'm pretty sure they're the worst cheap models you'd find in a budget office PC, with horrible RAM.

In the video I posted above, the P4 did very well with Doom considering its age.
 
Jul 27, 2020
26,827
18,471
146
I mean 12900K from 2021 will crush it at the same power
Yeah, well, that's unfair. The P4 of 2021, had it been worked on for years and years, would be running at a speed so high that the 12900K would shrink smaller than a squirrel's balls out of sheer embarrassment.
 

yottabit

Golden Member
Jun 5, 2008
1,651
817
146
Certainly a very interesting thread 😆

I do have fond memories of the Pentium 4, but only during the Northwood era. That was really Intel’s time to shine before the Athlon 64 stomped their rear. And even then, with Hyperthreading, Intel was ahead on certain productivity workloads. Plus they had abandoned expensive RDRAM in favor of DDR. I feel bad for anyone that bought a Willamette though - Intel still to this day continues its proud tradition of occasionally orphaning people on a single-generation socket.

I switched to team blue for a build there after having my Athlon Thunderbird rig die on me, and being a young teen I just didn’t have the money to keep dicking around with overclocking and destroying my own hardware. I got the most boring Intel motherboard with an Intel Northwood P4 and it was boring but rock stable. Then back to AMD once the Athlon 64 came out. Then back to Intel for the Core 2 Duo.

I learned CAD on an IBM Thinkstation with a Pentium 4 with Hyperthreading. I was an intern, and I’m pretty sure it was supposed to be an office computer, but it was snappier than the behemoth Xeon workstations the rest of the engineers had. Pretty sure it must’ve had a GeForce MX graphics card or similar (low-profile).

Speaking of graphics, the i845G integrated graphics from this era has kind of a bad reputation, but I remember it as the first semi-usable integrated graphics. It could run Half-Life and Counter-Strike (barely), and I’m guessing a whole generation of gamers probably started with the i845. It seems like the rate of progress of integrated graphics accelerated rapidly from there.
 

yottabit

Golden Member
Jun 5, 2008
1,651
817
146
See? That's what the guy in the video also said. It felt surprisingly snappy.
There's nothing magical about it. They were running Netburst Xeons, I'm sure. It was a case of the IT guy mis-speccing people's systems - they were dual-CPU beasts, and the CAD software was almost entirely single-threaded. They would have been great for running engineering simulations, but nobody there did that. The clock rate on my workstation was higher.

I also think they had a lot more background software and services running.

Any computer can feel snappy. I have a Pentium MMX 233 rig running Windows 95 that feels really snappy. And if you're going to anachronistically "hot-rod" a system, like in that video you shared, that doesn't really prove anything. I have a modded AMD K6-III+ pushing 640 MHz that will run things you wouldn't believe on a Socket 7 platform. Yet I'm not asking AMD to re-spin the K6-III on 2 nanometer :D

The most damning fact for Netburst, IMO, was that it barely outperformed Coppermine PIIIs clock for clock. Willamette actually underperformed against Tualatin - in some cases not just clock for clock, but outright. And this is comparing SDRAM for the PIII vs RDRAM for Willamette.

I don't know many people who actually had a Tualatin PIII when they were contemporary, but it's not a great look. The long pipeline of the Netburst architecture proved to be a misstep compared to wider cores.

Edit: All that being said I don’t want to be the one getting in the way of Igor’s dreams. Igor, you should write a letter to Congress demanding that if Intel gets bailed out it should be conditional on them re-spinning the Pentium 4 on 18A. We need all the Gigahertz
 
Jul 27, 2020
26,827
18,471
146
Edit: All that being said I don’t want to be the one getting in the way of Igor’s dreams. Igor, you should write a letter to Congress demanding that if Intel gets bailed out it should be conditional on them re-spinning the Pentium 4 on 18A. We need all the Gigahertz
At the moment, the only two people who can get POTUS to do anything are war criminals. They don't even know what a P4 is.