Worst CPUs ever, now with poll!


What's the worst CPU ever? Please explain your choice.

  • Intel iAPX 432

  • Intel Itanium (Merced)

  • Intel 80286

  • IBM PowerPC 970

  • IBM/Motorola PowerPC 60x

  • AMD K5

  • AMD family 15h

  • AMD family 10h

  • Intel Raptor Lake



Thunder 57

Diamond Member
Aug 19, 2007
4,154
6,927
136
It's the same reason why x86 continues to live. Despite its IPC superiority, Apple M4 cannot run a Geforce 4090 or 5090 or 9070 XT. And suppose Apple came out with a micro-ATX board for it tomorrow (very unlikely), how interested would AMD/Nvidia be in rewriting their GPU drivers for it? Apple would have to PAY both for the driver development and then hope the cost pays off in the end. They could totally go down this path but they are rolling in so much cash that they go "meh" at even the thought of it.

As you said very unlikely. Should really be no chance in hell. Apple loves their walled garden and controlling everything. They would never allow anyone to possibly tarnish their brand. Also, it's not like they need the money.
 

LightningZ71

Platinum Member
Mar 10, 2017
2,619
3,306
136
From what I've read on the topic, it wasn't just timelines that influenced IBM's decision. Intel was running a program to really push the 8086/88 in the market, providing a ton of engineering support for a whole ecosystem around that processor family. They also had a clear development roadmap for multiple generations of follow-ups that were supposed to be fully backwards compatible. The only thing even close was the 6800/68000 series from Motorola, which just didn't have all the moving parts together at the time.

IBM's PC was a heavily collaborative effort between Intel and themselves, with a crazy wall of secrecy between them that they both guessed through, but publicly honored.

Choosing the 68000 would have delayed the project by at least a year, and made market adoption less effective as it just didn't have the same level of back end support at the time. There are times that you just have to do SOMETHING with whatever is available.

The biggest problem for the market is that there was never a successful pooling of resources behind any one alternative. Mac volume was never enough on its own for the same economies of scale. Commodore and Atari imploded, with the ST line hitting a wall and the Amiga barely limping along. You get MIPS, VAX, SPARC and Alpha along the way, but none could outgrow its niche. If they had formed an industry consortium, rallied around the 68000 with bespoke accelerators, or taken a compatible high/low product approach with a common programming and OS kernel platform, they would have been a force for change.
 
  • Like
Reactions: Nothingness
Jul 27, 2020
28,173
19,203
146
One could argue that the alternatives failed because AMD didn't try to get behind a non-x86 alternative. There's just something about the AMD culture where they build the best technology possible with limited resources, while Intel always had to try to clobber them by throwing ten times the money at the problem. So no surprise that Intel fell badly behind once it no longer had enough money to throw at the AMD problem.
 

Hulk

Diamond Member
Oct 9, 1999
5,205
3,838
136
As you said very unlikely. Should really be no chance in hell. Apple loves their walled garden and controlling everything. They would never allow anyone to possibly tarnish their brand. Also, it's not like they need the money
Apple struck gold, oil, and a mine full of diamonds with the iPhone. Everything else matters little.
 

LightningZ71

Platinum Member
Mar 10, 2017
2,619
3,306
136
The corporate world knew long before. Things like the Palm Pilot and the Compaq iPAQ had been around for years, and teams were actively working on converged-device concepts. Apple gets credit for being the first to pull it off.
 

lakedude

Platinum Member
Mar 14, 2009
2,778
529
126
Socket A Thunderbirds were freakin' awesome. I was never a big fan of these stacked comparisons. Six months to a year before this, Intel CPUs would do the same. Intel came up with SpeedStep, slowing down as temperatures got higher, first, and all of a sudden it was "look how bad AMD is, because their CPUs will burn up without a cooler." Instead of being a cool feature of Intel CPUs, it became proof that AMD chips ran hot and were bad.
It wasn't that AMD chips ran hot, it was that they lacked effective thermal protection.

Back in the day, Intel chips had built-in thermal protection well before SpeedStep. They would just stop working if they got hot, but they were not damaged and worked fine once they cooled down.

I specifically remember because I sent an AMD computer to a friend. Somehow it ended up with the heatsink off and the AMD CPU burned up dead.

Also one of the AMD owners (who gave me a hard time for buying Intel) burned up his AMD CPU. The AMD owners made it sound like this would never happen but it did happen, twice that I know of personally.

It was a dumb thing not to have. Thousands of transistors and you can't put one thermal diode in there?
 
  • Like
Reactions: igor_kavinski

gdansk

Diamond Member
Feb 8, 2011
4,680
7,906
136
This early K7 oversight is notable because:
1) it could kill itself more quickly, making quite the show
2) the competition had recently implemented such a safeguard (in the P6, one generation earlier)

But it was very easy to avoid. Working K7s remain plentiful to this day, though they're mostly considered e-waste now. I don't consider it anything but a caveat to an otherwise decent CPU for its time.
 
  • Like
Reactions: Thibsie

Thibsie

Golden Member
Apr 25, 2017
1,148
1,352
136
It wasn't that AMD chips ran hot, it was that they lacked effective thermal protection.

Back in the day, Intel chips had built-in thermal protection well before SpeedStep. They would just stop working if they got hot, but they were not damaged and worked fine once they cooled down.

I specifically remember because I sent an AMD computer to a friend. Somehow it ended up with the heatsink off and the AMD CPU burned up dead.

Also one of the AMD owners (who gave me a hard time for buying Intel) burned up his AMD CPU. The AMD owners made it sound like this would never happen but it did happen, twice that I know of personally.

It was a dumb thing not to have. Thousands of transistors and you can't put one thermal diode in there?
Well it isn't as if Intel integrated the thermal diode 10 years before, huh.
 

lakedude

Platinum Member
Mar 14, 2009
2,778
529
126
Well it isn't as if Intel integrated the thermal diode 10 years before, huh.
Yeah, IDK. The video at Tom's is my earliest recollection of the topic. Tom's indicated that Intel's two-year-old solution at the time was better than AMD's most recent efforts.

I did run across some evidence that Intel might have had some sort of thermal protection as early as the P1:

Pentium® OverDrive® processors with MMX™ technology
Runs Slower if the Fan is Disabled



Symptom
The system seems slower than normal with the Pentium® OverDrive® processor with MMX™ technology installed. Across all applications, the performance drop appears the same. Diagnostics report the processor working at the proper speed.

Description
The fan is not working. The thermal protection circuitry built into the microprocessor is reducing the number of instructions performed, thus slowing the system.

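The mechanism that Intel note describes, slowing the chip rather than halting or cooking it, can be sketched as a toy model. The trip point and duty cycle below are illustrative numbers I made up, not Intel's actual parameters:

```python
# Toy model of clock-modulation thermal throttling: past a trip
# temperature, the CPU injects idle cycles so fewer instructions retire
# per second, instead of shutting down or burning up.
# TRIP_C and THROTTLED_DUTY are hypothetical values for illustration only.

TRIP_C = 95.0          # hypothetical throttle trip temperature (deg C)
THROTTLED_DUTY = 0.5   # hypothetical duty cycle while over the trip point

def effective_ips(base_ips: float, die_temp_c: float) -> float:
    """Instructions per second actually retired at a given die temperature."""
    if die_temp_c >= TRIP_C:
        # Thermal monitor engaged: throughput drops, but the chip keeps
        # running and recovers to full speed once it cools below the trip.
        return base_ips * THROTTLED_DUTY
    return base_ips
```

That matches the symptom in the doc: diagnostics still report the proper clock speed, but everything runs uniformly slower while the fan is dead.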
 
  • Like
Reactions: igor_kavinski

DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
32,182
32,801
146
The corporate world knew long before. Things like the Palm Pilot and the Compaq iPAQ had been around for years, and teams were actively working on converged-device concepts. Apple gets credit for being the first to pull it off.
Crackberry was the home run.

With Raptor Lake being the most disastrous generation, and the Arrow Lake launch being a Bulldozer moment in gaming performance on a DOA socket, I think this poll will look quite different if we redo it in a few years.
 

Thibsie

Golden Member
Apr 25, 2017
1,148
1,352
136
Yeah, IDK. The video at Tom's is my earliest recollection of the topic. Tom's indicated that Intel's two-year-old solution at the time was better than AMD's most recent efforts.

I did run across some evidence that Intel might have had some sort of thermal protection as early as the P1:

Pentium® OverDrive® processors with MMX™ technology
Runs Slower if the Fan is Disabled



Symptom
The system seems slower than normal with the Pentium® OverDrive® processor with MMX™ technology installed. Across all applications, the performance drop appears the same. Diagnostics report the processor working at the proper speed.

Description
The fan is not working. The thermal protection circuitry built into the microprocessor is reducing the number of instructions performed, thus slowing the system.
Yep, from memory it was (IMO) the previous generation for Intel, meaning the socketed PIII. Not really a lot earlier than AMD.
Then again, I would not take Tom's as a source of anything but garbage; another source would be better, but it is from around that era.

AFAIR the next AMD generation had a diode, which should mean the Athlon XP Palomino and onwards.
 
  • Like
Reactions: Thunder 57

Thunder 57

Diamond Member
Aug 19, 2007
4,154
6,927
136
Yep, from memory it was (IMO) the previous generation for Intel, meaning the socketed PIII. Not really a lot earlier than AMD.
Then again, I would not take Tom's as a source of anything but garbage; another source would be better, but it is from around that era.

AFAIR the next AMD generation had a diode, which should mean the Athlon XP Palomino and onwards.

They did. I should know, I owned at least one Palomino.
 
  • Like
Reactions: Thibsie

kschendel

Senior member
Aug 1, 2018
295
235
116
Sorry for being a nag, but the thread title is "worst CPUs ever", not "worst x86-ish CPUs ever." I'll even allow the reading of "worst microprocessor CPU ever", because if we get into older designs such as Burroughs vs. IBM 360 vs. the Univac 11xx ones'-complement machines (wonderful if weird) vs. the Harris 24-bit machines vs. the PDP-8 vs. some PDP-11 haters, the "worst" metric gets just too difficult to pin down. There have been some real dogs among x86 CPUs, but even the worst of them pale in comparison to the worst worst. I stand by my choice, stated at least once (maybe more?) already up-thread.

I note that the worst microprocessor CPU was so bad, it doesn't even appear in the thread poll. (Hint to poll author: "other" is often enlightening, even when one imagines that one knows all of the candidates. Which in this case clearly wasn't true.)

I understand that for many here, x86-ish CPUs are the entirety of their experience, and I'm not trying to diss any of that. I'm just saying there is an abyss between "worst x86 CPU" and "worst [microprocessor] CPU" that perhaps isn't properly appreciated.
 
Last edited:

LightningZ71

Platinum Member
Mar 10, 2017
2,619
3,306
136
I took the thread to be focused on consumer desktop/console/laptop CPUs. I also took it to mean the full-spec version of said processor, not some cut-down value version castrated by choice to hit a price point. If we were including those, I would focus on the Cedar Trail Atoms, particularly the N2000 series that were shoveled into netbooks. Those things sucked in every way a processor could. They doubled down by integrating licensed PowerVR graphics, for which they only ever licensed one driver and never updated it. It was so bad that Linux had problems with it too, due to the same driver issue, never having a native driver at all.

If we're going farther afield, there are a couple of microcontrollers whose designers I would very much like a word with in a dark alley...
 
  • Like
Reactions: Nothingness

kschendel

Senior member
Aug 1, 2018
295
235
116
If we're going farther afield, there are a couple of microcontrollers whose designers I would very much like a word with in a dark alley...
A good lesson to us to properly define our terms! re microcontrollers, I hear you; you've awakened memories that I've tried to forget, even if I wasn't directly involved. Ugh.
 

DZero

Golden Member
Jun 20, 2024
1,792
688
96
I know that this won't be the champion, but can we consider the Tensor G5, given how the disaster unfolded, as a case study?
I mean, it uses TSMC 3nm, which is considered a superior node, and has a decent CPU configuration, but then chooses a PowerVR GPU so unoptimized that it royally screws up all the good things they did.

Gaming-wise it has a lot of glitches; processing-wise, it performs like a Snapdragon 8 Gen 2.

And yeah, it is even being compared to the Kirin 9020... a "7nm" processor that can go toe to toe with it. It left me thinking: what did Google do wrong to end up like that?
 

NTMBK

Lifer
Nov 14, 2011
10,486
5,905
136
I know that this won't be the champion, but can we consider the Tensor G5, given how the disaster unfolded, as a case study?
I mean, it uses TSMC 3nm, which is considered a superior node, and has a decent CPU configuration, but then chooses a PowerVR GPU so unoptimized that it royally screws up all the good things they did.

Gaming-wise it has a lot of glitches; processing-wise, it performs like a Snapdragon 8 Gen 2.

And yeah, it is even being compared to the Kirin 9020... a "7nm" processor that can go toe to toe with it. It left me thinking: what did Google do wrong to end up like that?
Compared to some of the train wrecks on this list it doesn't even break the top 10 😂

I'd argue it's still better than the old Mongoose CPUs from Samsung: those used a custom CPU architecture with massively more die area than the stock ARM cores, and still performed worse and used more energy. Total disaster.
 

Jan Olšan

Senior member
Jan 12, 2017
589
1,157
136
The biggest problem for the market is that there was never a successful pooling of resources behind any one alternative. Mac volume was never enough on its own for the same economies of scale. Commodore and Atari imploded, with the ST line hitting a wall and the Amiga barely limping along. You get MIPS, VAX, SPARC and Alpha along the way, but none could outgrow its niche. If they had formed an industry consortium, rallied around the 68000 with bespoke accelerators, or taken a compatible high/low product approach with a common programming and OS kernel platform, they would have been a force for change.
The accidental greatness of the PC, and a reason I think it was just that it won, was that it wasn't hardwired to particular graphics and sound chips, and that it had the base of a common platform (the BIOS) and OS (PC DOS/MS-DOS). IBM didn't intend it, but this computer ended up being not just a one-company play like all the competition (say, Amiga, Atari, Apple and various high-end stuff), but a platform. The viability of clones, and their arrival, was a MASSIVE thing.

You see, the clones not only brought volume, the lifeblood the other alternatives lacked; they brought competition, and that competition forced prices lower and created pressure that made companies push out new generations of computers faster and make them better. The individual competing platforms (those Amigas/Ataris and whatever Apple stuff) had to fear the PC, but even so they had their captive markets within which they felt safe and grew complacent. In any case, the lack of clones to pick from was detrimental for the customers and limited the evolution of those platforms.
At the same time, the PC's modular approach on the graphics/audio side (and in other peripherals too: disk controllers, network cards, etc.) allowed similar competition in those areas, so graphics and audio cards for PCs went through fast evolution and absolutely deservedly wiped the floor with things like the Amiga that stayed stuck with their fixed chipsets.

The PC absolutely WAS the best platform of the competitors I mentioned, and it is great that it eventually won. It deserved to WIN over those limited proprietary computer lines; had it ended up otherwise, it would be a worse timeline. As much as I feel that exotic nostalgia for stuff like the Amiga, it is clear it lost deservedly (sorry). If anything might have been an alternative, perhaps it was the S-100 bus standard and its ecosystem of pre-PC expansion boards, not any of the closed, clone-less platforms. Motorola's bad luck was that all the platforms it was married to were the bad closed, clone-less ones.

x86 processors were a similar matter, with their own clone market creating competition. In hindsight, the right things happened with them. But in their case, Motorola probably could have been an alternative too, if similar agreements had been made/forced by IBM. The Cyrixes and AMDs (or even Intel) would have manufactured 68K processors, the competition would perhaps have led to faster evolution, and the ISA might not have died but been evolved into the OoO and 64-bit modern era by one company or another.
 
Last edited:

johnsonwax

Senior member
Jun 27, 2024
456
670
96
enough on its own for the same economies of scale. Commodore and Atari imploded, with the ST line hitting a wall and the Amiga barely limping along. You get MIPS, VAX, SPARC and Alpha along the way, but none could outgrow its niche. If they had formed an industry consortium, rallied around the 68000 with bespoke accelerators, or taken a compatible high/low product approach with a common programming and OS kernel platform, they would have been a force for change.
Conventional wisdom in some circles is that the PC market is a good example of bundling vs. unbundling theory. In the early days of computing (the Commodore/Sun era) the markets were too small to unbundle: you had to do it all in-house because there wasn't enough volume to stand up a component/OEM model in which the component makers could stay in business.

The IBM PC was sufficiently successful in business that the market grew large enough to change that, ushering in the Wintel era. That era addressed the other problem: needs easily outstripped the capability of computers, so it was beneficial to have an unbundled market in which you could tailor hardware to your needs. I remember building local database workstations that threw most of my budget into storage I/O because I wasn't ever CPU-bound, while the guy in the next office over was doing the opposite. That's a space where bundling struggles to work, because it puts the burden on an Apple to anticipate all of those needs.

Run forward 30 years and the capability of computers has largely outstripped the need, not in servers but on the desktop, because we've shifted most of our compute to the cloud. I don't need local compute to do my taxes; my watch is now powerful enough for that, if it isn't already a cloud-based service I'm using. QuickBooks is SaaS now. So the people who need to build a bespoke PC to meet a specific need are pretty much the hyperscalers and gamers with a lot of cash to burn, and that's it. An MBA literally meets the computing needs of about 95% of buyers. And so bundling is viable again, and Apple is able to make products that work for the market again, with the benefit of economies of scale that the unbundling community struggles to match because it is fragmented.

The theory suggests a recurring cycle of bundling/unbundling, and somewhere down the line Apple's model won't be suitable any longer and we'll get Meta-branded laptops subsidized by ad revenue or some such. We experimented with that with eMachines and it didn't stick, but it might in the future. And note, Apple is doing something a little like this through its service ecosystem (iPhone/iPad/Mac/iCloud/ApplePay/iTunes/AppleTV/etc.). It's not really the subsidized go-to-market model that was tried a while back, but a value-add model. History tells us none of these are terribly durable: Apple started in the first bundling model (Apple II, early Mac), nearly went out of business in the unbundling era, and recovered in the next bundling cycle (iPod/iPhone, with the Mac/Apple Silicon riding in their wake).