
Worst CPUs ever, now with poll!


What's the worst CPU ever? Please explain your choice.

  • Intel iAPX 432

  • Intel Itanium (Merced)

  • Intel 80286

  • IBM PowerPC 970

  • IBM/Motorola PowerPC 60x

  • AMD K5

  • AMD family 15h

  • AMD family 10h

  • Intel Raptor Lake



SOFTengCOMPelec

Platinum Member
May 9, 2013
2,417
75
91
I'm honored someone (not me) necroed this thread. The 286 is on the list because it had a borked protected mode.

I actually quite liked the 80286; it was a nice upgrade from the 8086/88 in its day.

I was sad to find out this thread was a necro, because I liked it.

All the CPUs have had their good points, so it is difficult to single out a really bad one.

In some respects, the original 60 MHz Pentium, with the FDIV (divide) bug, could be argued to be the worst, because you could not trust it to do many things accurately; there was always the worry at the back of your mind, "Has the divide bug just caused this problem and/or silently corrupted my data?"

If I remember correctly, there were software workarounds, and Intel was eventually put under enough pressure to release free, divide-bug-fixed replacement Pentiums.
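For what it's worth, the widely publicised check for the bug divides 4195835 by 3145727 and looks at the residue: a correct FPU leaves 0, while an affected Pentium leaves roughly 256. A minimal C sketch of that classic test (the volatile qualifiers just stop an optimising compiler from folding the division at compile time):

```c
#include <stdio.h>

int main(void) {
    /* Classic FDIV-bug test values: on a correct FPU the residue is 0;
       on an affected first-generation Pentium it is about 256. */
    volatile double x = 4195835.0;
    volatile double y = 3145727.0;
    double residue = x - (x / y) * y;

    if (residue == 0.0)
        printf("FPU divides correctly (residue = %g)\n", residue);
    else
        printf("Possible FDIV bug (residue = %g)\n", residue);
    return 0;
}
```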
 

crashtech

Lifer
Jan 4, 2013
10,695
2,294
146
My own experience would lead me to specify anything mounted in Socket 423 as among the worst desktop CPUs.
 
Dec 30, 2004
12,553
2
76
I think Itanium could have been something fun, but I don't know that much about CPU architecture any more; maybe we already have something like it.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
Celeron D was by FAR the worst thing I did EVER see in my life.


Celeron D was not too bad; it was a good improvement over the previous Northwood Celerons, and with an overclock it could fight with the high-end CPUs.

[Attached benchmark chart: winstone-2.png]
 

SOFTengCOMPelec

Platinum Member
May 9, 2013
2,417
75
91
100% WRONG poll

The right answer is:

CYRIX 166

(If I recall correctly) I really liked my Cyrix 166. It was even faster at some stuff (integer/DOS) for architectural reasons, and it was keenly priced.
But the Intel chips did have the edge (speed-wise, in general), especially with floating point.
 

Smoblikat

Diamond Member
Nov 19, 2011
5,184
107
106
No Bulldozer option?

I vote Bulldozer. Slower than a similarly priced i7/Xeon in MT tasks, and slower than AMD's previous generation of processors in ST tasks. It's literally garbage.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
I don't know why people hit so hard on AMD family 15h when K10 was a lot worse as a CPU launch. You launch a brand-new CPU with lackluster performance and worrying thermals, and to top it off there is a bug whose patch leaves the chip with even LESS performance than it had at launch. Really?


Blame Bulldozer for being delayed so many years, and even then we had a CPU that wasn't quite there with the software at times. It is getting there, though.
 

Nothingness

Diamond Member
Jul 3, 2013
3,309
2,382
136
The worst CPU ever is the 8086, because it was chosen by IBM and that resulted in massive usage of one of the worst ISAs I ever had the displeasure to work with. It almost put me off assembly language for good.
 

TheSlamma

Diamond Member
Sep 6, 2005
7,625
5
81
AMD K6-2: the master of the blue screen, with floating point that got shredded by the original Celeron.
 

SOFTengCOMPelec

Platinum Member
May 9, 2013
2,417
75
91
The worst CPU ever is the 8086, because it was chosen by IBM and that resulted in massive usage of one of the worst ISAs I ever had the displeasure to work with. It almost put me off assembly language for good.

I wish they had chosen the (Motorola) 68000 (32-bit internally), which seemed to have a much better (modular) instruction set, without the (horrible, in my opinion) 64K segmented addressing mechanism.
 

Nothingness

Diamond Member
Jul 3, 2013
3,309
2,382
136
I wish they had chosen the (Motorola) 68000 (32-bit internally), which seemed to have a much better (modular) instruction set, without the (horrible, in my opinion) 64K segmented addressing mechanism.
Indeed. But Motorola was late.

That being said, I'm impressed by what Intel has done with that PoS; I definitely like their higher-end CPUs for the performance they provide. :)
 

SOFTengCOMPelec

Platinum Member
May 9, 2013
2,417
75
91
Indeed. But Motorola was late.

That being said, I'm impressed by what Intel has done with that PoS; I definitely like their higher-end CPUs for the performance they provide. :)

On modern x86 CPUs, individual programmers and compilers can use SSE2 and later instruction sets, with the original x86 instructions and the old 8087 (floating-point) instructions not necessarily being used that much.
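As a rough illustration (mine, not part of the original post), scalar double-precision math can be written with SSE2 intrinsics so that the compiler emits an SSE2 divide (divsd) instead of the legacy x87 fdiv; a minimal sketch:

```c
#include <stdio.h>
#include <emmintrin.h>  /* SSE2 intrinsics */

int main(void) {
    /* Scalar double-precision divide performed through SSE2 (divsd),
       bypassing the legacy x87 FPU entirely. */
    __m128d a = _mm_set_sd(4195835.0);
    __m128d b = _mm_set_sd(3145727.0);
    __m128d q = _mm_div_sd(a, b);

    double result;
    _mm_store_sd(&result, q);
    printf("%.9f\n", result);  /* expected: 1.333820449 */
    return 0;
}
```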

Although it is a lot better than it was, the early days of PCs suffered badly (in my opinion), because many things were limited to 64K (as they had to fit in a single segment, unless complicated/slow segment-switching addressing methods were used), and assembly language programming would (arguably) have been easier on the 68000 than on the 8086.
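To make the 64K limitation concrete (my illustration, not from the original post): in real mode the 8086 forms a 20-bit physical address by shifting a 16-bit segment value left four bits and adding a 16-bit offset, so a single segment register spans only 64K at a time:

```c
#include <stdio.h>
#include <stdint.h>

/* Real-mode 8086 address translation: physical = segment * 16 + offset.
   The 16-bit offset is why a single data structure was capped at 64K. */
static uint32_t phys_addr(uint16_t seg, uint16_t off) {
    return ((uint32_t)seg << 4) + off;
}

int main(void) {
    printf("B800:0000 -> %05X\n", phys_addr(0xB800, 0x0000)); /* 0xB8000, text-mode video RAM */
    printf("1234:5678 -> %05X\n", phys_addr(0x1234, 0x5678)); /* 0x179B8 */

    /* Many segment:offset pairs alias the same physical byte: */
    printf("0000:0400 -> %05X\n", phys_addr(0x0000, 0x0400)); /* 0x00400 */
    printf("0040:0000 -> %05X\n", phys_addr(0x0040, 0x0000)); /* 0x00400 */
    return 0;
}
```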

But yes, the 68000 may have been a bit too late for the PC, although I bet they could have used it if they had wanted to.

The 32-bit 386 seemed to resolve a lot of the issues which the 68000 would have solved on day one.

In practice, if they had gone with the Motorola 68000, it is quite possible that other problems would have arisen, e.g. it could have been too late to market, or too expensive.

Which could have meant that something completely different from the IBM PC was created instead, e.g. a D.E.C. CP/M Mk2 (they may have done this anyway), running on a Z80++ (++ = next generation of Z80).
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,695
136
Celeron D was not too bad; it was a good improvement over the previous Northwood Celerons, and with an overclock it could fight with the high-end CPUs.

The Northwood Celerons were absolutely horrible. Coupled with Intel's "Extreme" Graphics (2), they were even worse... D:

An upgrade from horrible to slightly less horrible isn't much. The only redeeming qualities of the Celeron D were that it used Socket 775, so you could upgrade the thing to Conroe-based CPUs, and the Cedar Mill (65nm) variety with 512KB of L2.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
I wish they had chosen the (Motorola) 68000 (32-bit internally), which seemed to have a much better (modular) instruction set, without the (horrible, in my opinion) 64K segmented addressing mechanism.

Apple did go that route. I was a huge fan of the MC68040 when it came out. It was a hell of a chip that blew the doors off x86 chips of the time.
 

SOFTengCOMPelec

Platinum Member
May 9, 2013
2,417
75
91
Apple did go that route. I was a huge fan of the MC68040 when it came out. It was a hell of a chip that blew the doors off x86 chips of the time.

As did many others (Atari and Commodore), including the (UK) Sinclair QL (68008).

Yes, a very nice chip series. Sadly I was not into Apple when they were 68xxx based; if I traveled back in a time machine, I would reconsider that decision.
Maybe they (Apple) were very expensive at the time.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
As did many others (Atari and Commodore), including the (UK) Sinclair QL (68008).

Yes, a very nice chip series. Sadly I was not into Apple when they were 68xxx based; if I traveled back in a time machine, I would reconsider that decision.
Maybe they (Apple) were very expensive at the time.

In the early days (mid 80s) they were priced very well. Prices did creep up over time, but they did have better hardware in comparison to most PCs. That was one of the reasons Apple came out with the LC (low cost) series, which typically used either a one-generation-older CPU or the MC68LC040, which lacked the FPU of the MC68040. The Quadra series continued using the full MC68040.
 

SOFTengCOMPelec

Platinum Member
May 9, 2013
2,417
75
91
In the early days (mid 80s) they were priced very well. Prices did creep up over time, but they did have better hardware in comparison to most PCs. That was one of the reasons Apple came out with the LC (low cost) series, which typically used either a one-generation-older CPU or the MC68LC040, which lacked the FPU of the MC68040. The Quadra series continued using the full MC68040.

(Arguably) What brought the prices down, and increased the availability and take-up of PCs, was the original IBM PC being cloned.
Most PCs in the early days (even now, I guess) were IBM clone machines, because IBM was forced to allow clones, having already been hit hard by anti-monopoly action (the 1956 consent decree ???).

So if Apple stopped cloning, that could be a factor in the relative lack of success of the 68xxx series (as regards PC-like desktop computers).

I.e. in the 80s, "real" IBM PCs were thousands of dollars, but clones could be a fraction of that price.
(Made-up, guesstimated prices:)
Real IBM PC = $2750
IBM clone PC (similar specification, 100% hardware/software compatible) = $750

So maybe clones-allowed vs no-clones-allowed was an important factor.

Also, the cloning and opening of the IBM PC's hardware/software specifications allowed many third-party manufacturers to produce interesting hardware and software products for it.
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
(Arguably) What brought the prices down, and increased the availability and take-up of PCs, was the original IBM PC being cloned.
Most PCs in the early days (even now, I guess) were IBM clone machines, because IBM was forced to allow clones, having already been hit hard by anti-monopoly action (the 1956 consent decree ???).

So if Apple stopped cloning, that could be a factor in the relative lack of success of the 68xxx series (as regards PC-like desktop computers).

I.e. in the 80s, "real" IBM PCs were thousands of dollars, but clones could be a fraction of that price.
(Made-up, guesstimated prices:)
Real IBM PC = $2750
IBM clone PC (similar specification, 100% hardware/software compatible) = $750

So maybe clones-allowed vs no-clones-allowed was an important factor.

Also, the cloning and opening of the IBM PC's hardware/software specifications allowed many third-party manufacturers to produce interesting hardware and software products for it.

There were Mac clones in the early days, but the clone company had to buy a real Mac in order to get the ROM to put into their own machine. Later, in 1995 (the PPC era), Apple did give out official licenses to clone companies (Motorola, UMAX, DayStar, Power Computing, Radius, a few others). So for about 3.5 years there were a lot of Macs to choose from. When Apple bought NeXT and Steve Jobs returned, he put an end to clones. UMAX stayed around for a while, and there were talks with Sony and Compaq about making clones, but those never happened.

Had Apple allowed clones in the early days, it's doubtful that Apple would still exist today, much like how IBM has not been in the PC business for ages.
 

SOFTengCOMPelec

Platinum Member
May 9, 2013
2,417
75
91
There were Mac clones in the early days, but the clone company had to buy a real Mac in order to get the ROM to put into their own machine. Later, in 1995 (the PPC era), Apple did give out official licenses to clone companies (Motorola, UMAX, DayStar, Power Computing, Radius, a few others). So for about 3.5 years there were a lot of Macs to choose from. When Apple bought NeXT and Steve Jobs returned, he put an end to clones. UMAX stayed around for a while, and there were talks with Sony and Compaq about making clones, but those never happened.

Had Apple allowed clones in the early days, it's doubtful that Apple would still exist today, much like how IBM has not been in the PC business for ages.

That's exactly the sting in the tail, or double-edged sword, I was worried about when I wrote the stuff above. Clones = Apple might disappear later.

Hindsight is a very powerful tool, but as you just said, such changes (allowing clones) could actually damage or destroy the very company you are trying to promote.

Thanks, I did not know that there used to be (official) Apple clones, which were later stopped. (I know about hacked PCs being turned into Macs and such, but I mean clones sold in the high street.)
 

jhu

Lifer
Oct 10, 1999
11,918
9
81
That's exactly the sting in the tail, or double-edged sword, I was worried about when I wrote the stuff above. Clones = Apple might disappear later.

Hindsight is a very powerful tool, but as you just said, such changes (allowing clones) could actually damage or destroy the very company you are trying to promote.

Thanks, I did not know that there used to be (official) Apple clones, which were later stopped. (I know about hacked PCs being turned into Macs and such, but I mean clones sold in the high street.)

Clones didn't make IBM disappear. Their hand was forced, though, so they had to adapt. The same could have happened with Apple, since they now make significantly more money on iPads and iPhones than on Macs.