Without AMD (essentially an Intel Monopoly) What would CPUs cost?

stevech

Senior member
Jul 18, 2010
203
0
0
Without AMD (essentially an Intel Monopoly) What would CPUs cost? $300?
Look at the obscene prices Micro$oft is forcing on us with their essential monopoly for Win 7, a face-lift for the same ole OS.

Therefore, I have bought only AMD for many years, to do my little part to avoid $300 CPUs.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
Many of us already have $300 CPUs. My current one was $230; my last one was $300. Newegg says their top-selling processors are the i7 930 ($290), i7 860 ($290), and Phenom 1090 ($296).
 

Dark4ng3l

Diamond Member
Sep 17, 2000
5,061
1
0
Most people in the market for cheap CPUs just walk into a store and buy something off the shelf.
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
I would expect prices to be at least double or triple, and for much worse-performing CPUs, as there would be no reason to push them.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
I would expect prices to be at least double or triple, and for much worse-performing CPUs, as there would be no reason to push them.

Intel would lose billions if people had no reason to buy new computers. If a computer from 3 years ago is just as fast as a new computer on the shelf today, there's no reason for me to buy a new one.

Ironically, this is already true to some degree. My 3.5-year-old E6600 overclocked to 3.1GHz will crush half of the desktop computers in Best Buy.
 

JoshGuru7

Golden Member
Aug 18, 2001
1,020
0
0
Without AMD (essentially an Intel Monopoly) What would CPUs cost? $300?
Prices would rise without competition. Under perfect competition there is both producer and consumer surplus, while under a monopoly the monopolist is able to capture the consumer surplus. http://en.wikipedia.org/wiki/Monopoly#Price_Discrimination_and_Capturing_of_Consumer_Surplus
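As a toy illustration of that surplus-capture point (with made-up numbers, not real CPU market data), a simple linear-demand model shows how a monopolist's profit-maximizing price sits above the competitive price:

```python
# Toy linear-demand model of monopoly vs. competitive pricing.
# All numbers are made up for illustration -- not real CPU market data.

def competitive_price(a, b, c):
    """Under perfect competition, price is driven down to marginal cost c."""
    return c

def monopoly_price(a, b, c):
    """Monopolist sets marginal revenue equal to marginal cost.

    With inverse demand P = a - b*Q, marginal revenue is a - 2*b*Q,
    so Q* = (a - c) / (2*b) and P* = a - b*Q* = (a + c) / 2.
    """
    q_star = (a - c) / (2 * b)
    return a - b * q_star

# Hypothetical demand intercept, slope, and constant unit cost:
a, b, c = 500, 1.0, 100

print(competitive_price(a, b, c))  # -> 100
print(monopoly_price(a, b, c))     # -> 300.0
```

With these hypothetical numbers the monopolist charges triple the competitive price, pocketing the markup as captured consumer surplus.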

stevech said:
look at the obscene prices Micro$oft is forcing on us with their essential monopoly, for Win 7, a face-lift for the same ole OS.
Don't you feel just a little silly saying stuff like this?
 

Rifter

Lifer
Oct 9, 1999
11,522
751
126
Intel would lose billions if people had no reason to buy new computers. If a computer 3 years ago is just as fast as a new computer on the shelf today, there's no reason for me to buy a new computer.

Ironically this is already true to some degree. My 3.5 year old E6600 overclocked to 3.1ghz will crush half of the desktop computers in Best Buy.

I'm not saying they would not advance, just at a lot slower rate than they do now.

For example, if a new CPU nowadays offered a 50% boost in performance over the last gen, I would expect less than a 30% increase in a monopoly environment. There's no reason to push the R&D team, the most expensive part of CPU development, for performance gains when you have nothing to compete against. You would save a few hundred million on R&D, cut corners, and go for a difference big enough to notice but no more.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
I'm not saying they would not advance, just at a lot slower rate than they do now.

For example, if a new CPU nowadays offered a 50% boost in performance over the last gen, I would expect less than a 30% increase in a monopoly environment. There's no reason to push the R&D team, the most expensive part of CPU development, for performance gains when you have nothing to compete against. You would save a few hundred million on R&D, cut corners, and go for a difference big enough to notice but no more.
But then game developers notice things are not advancing, so they stop developing games that need the latest and greatest. It might actually make the process of computer upgrading a lot cheaper if all development stopped.

Video card vs CPU is a great example of how this works. Central processors evolve at a fairly slow rate, and games change CPU requirements very slowly because of this. Anand's article about the E6600 is dated July 2006, so that E6600 is technically 4 years old and it still runs modern games. In 2006, the best video card you could buy was the GeForce 7950, which is the card I bought with that E6600. That card absolutely will not run any modern game. Video cards evolve quickly, game requirements evolve quickly, so I need to keep buying new video cards all the time because the technology changes so fast.

In the end, we pay a lot more for things that change the fastest. Power supplies don't change at all, so I've been using the exact same power supply in several computers for the past 5 years or so. I've also been using the same mouse and keyboard for about 5 years.
 

NP Complete

Member
Jul 16, 2010
57
0
0
I think that if AMD weren't around, things wouldn't be too much different these days. Intel would have had to keep prices low enough to fend off Apple/Microsoft from porting their products to ARM, or risk an x86/ARM split. We may still see the x86/ARM split coming as mobile/low-power devices converge on x86 performance/power, especially if Intel and AMD don't keep on top of prices & innovation.
 

ModestGamer

Banned
Jun 30, 2010
1,140
0
0
I think that if AMD weren't around, things wouldn't be too much different these days. Intel would have had to keep prices low enough to fend off Apple/Microsoft from porting their products to ARM, or risk an x86/ARM split. We may still see the x86/ARM split coming as mobile/low-power devices converge on x86 performance/power, especially if Intel and AMD don't keep on top of prices & innovation.


Per clock speed and cycle on a Motorola-specific instruction set, Motorola (now Freescale) embedded MCUs absolutely crush desktop processors now.

Start looking at MIPS if you want to see it.
 

pitz

Senior member
Feb 11, 2010
461
0
0
Without AMD (essentially an Intel Monopoly) What would CPUs cost? $300?
look at the obscene prices Micro$oft is forcing on us with their essential monopoly, for Win 7, a face-lift for the same ole OS.

Therefore, I have bought only AMD for many years, to do my little part to avoid $300 CPUs.

$1000-$2000 more than likely. And without free/open source software being out there, you'd pay another $1000-$2000 for your operating system.

But on the bright side, if you worked in IT or in computer design/manufacturing or software development, you'd make a lot more money than you would typically make today.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Let me point out that Intel HAD a monopoly on x86 CPUs until AMD's release of the Am386DX in 1991 (okay, there were the NEC V20/V30 clones, and some even more obscure ones, but no meaningful marketshare).
So we don't have to ask ourselves 'what would happen', we can just look at history and see what actually DID happen.

CPU prices never really changed. They are completely determined by what people are willing to pay for a CPU.
CPU development also never froze. Before AMD, Intel had already developed the 4004, 8008, 8080, 8086/8088, 80186, 80286, 80386 and 80486. Moore's law in full effect, with no competition (yes, that Moore in Moore's law is Gordon Moore, one of Intel's founders).
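That doubling compounds fast; a back-of-the-envelope sketch (assuming a fixed two-year doubling period, which is a simplification of the actual cadence):

```python
# Back-of-the-envelope Moore's law: transistor count doubling every ~2 years.
# Starting point: the Intel 4004's roughly 2,300 transistors (1971).

def moores_law_estimate(start_count, start_year, year, doubling_period=2):
    """Project a transistor count forward assuming steady doubling."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Projecting from the 4004 (1971) to the 80486 era (1989):
estimate = moores_law_estimate(2300, 1971, 1989)
print(round(estimate))  # ~1.2 million -- in the ballpark of the real 80486
```

Nine doublings over 18 years, all achieved before AMD shipped a competing x86 part.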

So can we please drop all the crackpot theories? Nothing bad happened when AMD wasn't around to compete. Know your history.
Likewise, we know that when AMD was around to compete, e.g. with the Athlon/Athlon64, the result was not Intel prices going down but AMD prices going up... with AMD introducing their Athlon FX series at $1000+ prices and such.
Know your history.

My first three PCs were bought in the pre-AMD era (8088, 80386SX and 80486DX2), I've always paid pretty much the same price for a new PC, and always had pretty much a 'mainstream' performer.
 
Last edited:

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Let me point out that Intel HAD a monopoly on x86 CPUs until AMD's release of the Am386DX in 1991 (okay, there were the NEC V20/V30 clones, and some even more obscure ones, but no meaningful marketshare).
So we don't have to ask ourselves 'what would happen', we can just look at history and see what actually DID happen.

You like to leave out important details that contradict your argument, don't you? Back when Intel had its supposed x86 monopoly, x86 itself wasn't the only option for a consumer computer CPU. If Intel CPUs were too expensive, you could buy a Mac or an Amiga. There was still competition.

These days, things are different. x86 itself has a huge monopoly as the only viable PC CPU. If AMD went away, Intel would have ZERO competition for its x86 CPUs.


I've always paid pretty much the same price for a new PC, and always had pretty much a 'mainstream' performer.

You can get a pretty decent computer today for $600; 20 years ago there wasn't anything "mainstream performance" available for less than $2000. Unless your definition of mainstream performance has changed year by year, that sounds like BS. Maybe you can get a mainstream performance PC today for what you paid for a low-end entry-level PC 20 years ago.
 
Last edited:

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
$1000-$2000 more than likely. And without free/open source software being out there, you'd pay another $1000-$2000 for your operating system.

But on the bright side, if you worked in IT or in computer design/manufacturing or software development, you'd make a lot more money than you would typically make today.

You can't simply assume that prices will go up 3-4x just because competition disappeared and that they won't suffer consequences.

Look at MS with their Vista OS. People were VERY reluctant to upgrade even though that was the only path for the majority of users. They just stuck with XP. When Windows 7 came out, people upgraded to it.

I remember Dell ads with systems costing almost $7000, and that was during the Pentium III days, when the Athlon was VERY competitive. Nowadays not even the 2P servers they advertise cost as much.

Presumably price cuts might stagnate, but not to the degree some people are suggesting.
 

ShawnD1

Lifer
May 24, 2003
15,987
2
81
You can get a pretty decent computer today for $600; 20 years ago there wasn't anything "mainstream performance" available for less than $2000. Unless your definition of mainstream performance has changed year by year, that sounds like BS. Maybe you can get a mainstream performance PC today for what you paid for a low-end entry-level PC 20 years ago.
Nah, mainstream computers were always cheap. Good computers were always expensive.

Before 1992, my computer was a Commodore 64. Wiki says these things came to market in 1982 with a price of $600 US. Commodore 128 came out a few years later with a price of only $300 in the US. The Amiga 500 was $700 US.

A bit of a "friend of a friend" story, but did you ever see that documentary Triumph of the Nerds? link. Part of it goes over the history of Apple computers and says that a regular IBM computer running DOS would typically cost about $600, whereas an Apple with a mouse and color screen was more like $1200. So basically, the typical PC at that time (mid 80s) was $600.

The only thing that really changes is how much is $600. An inflation calculator I found on google says $600 of 1985 money is roughly $1200 of 2009 money.
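The same adjustment can be sketched by compounding an assumed average inflation rate (3% is a rough ballpark for that period, not an official CPI figure):

```python
# Rough inflation adjustment by compounding an assumed average annual rate.
# The 3% default is a ballpark for US CPI over 1985-2009, not official data.

def adjust_for_inflation(amount, start_year, end_year, avg_rate=0.03):
    """Express a historical dollar amount in later-year dollars."""
    return amount * (1 + avg_rate) ** (end_year - start_year)

# $600 in 1985 dollars, expressed in 2009 dollars:
print(round(adjust_for_inflation(600, 1985, 2009)))  # roughly $1200
```

So a $600 computer in 1985 cost about what a $1200 computer costs today, consistent with the calculator above.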
 

Scali

Banned
Dec 3, 2004
2,495
0
0
You like to leave out important details that contradict your argument, don't you? Back when Intel had its supposed x86 monopoly, x86 itself wasn't the only option for a consumer computer CPU. If Intel CPUs were too expensive, you could buy a Mac or an Amiga. There was still competition.

Amigas were home computers.
Macs were the only competition in the business market (which, especially back then, was much larger than the home computer market), but Macs were (even then) suffering from a lack of good business software, so they never got much market share outside of typical Mac markets such as DTP.
Besides, Intel CPUs were used by IBM, and you know the saying: nobody ever got fired for buying IBM.

There were more options perhaps, but in sheer numbers, Intel/IBM were still HUGE compared to the rest.

You can get a pretty decent computer today for $600; 20 years ago there wasn't anything "mainstream performance" available for less than $2000. Unless your definition of mainstream performance has changed year by year, that sounds like BS. Maybe you can get a mainstream performance PC today for what you paid for a low-end entry-level PC 20 years ago.

And how much of that price is dependent on the CPU?
Memory especially was incredibly expensive back in those days. Hard disks were as well (my first PC didn't even have one; it was a high-end feature). What about video cards, decent VGA monitors, etc.? My first VGA card cost more than an entire Amiga 500 system! There were exactly 0 Intel chips on that VGA card, though.

I've never paid as much as $2000 for any PC, I know that much.
 

DrMrLordX

Lifer
Apr 27, 2000
21,634
10,849
136
Without AMD (essentially an Intel Monopoly) What would CPUs cost?

Your everlasting soul. Or perhaps a bag of tacos.

Really, it's hard to say. If AMD and their patent portfolio/technical expertise vanished without Intel losing any of the things they've gained from their relationship with AMD, the gub'ment we have today would probably try to break Intel up. Then things would get all kinds of crazy stupid: despite Intel's demise as a singular corporation, there still would be no credible competition for the CPUs Intel has been making over the past four years, because designing and fabricating CPUs is damned expensive. It would be years before the market could recover from something like that, and by that point we might all be stuck using Loongson systems. Oh the agony. Maybe VIA would save the day?

Had AMD never existed, well, that's a whole 'nother ball o wax. We'd all be running Itaniums, and wouldn't that be lovely.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Had AMD never existed, well, that's a whole 'nother ball o wax. We'd all be running Itaniums, and wouldn't that be lovely.

It could be.
The core logic of an Itanium is smaller than that of an x86.
The result is that with today's transistor budget for a 6-core x86, we may be able to implement an 8 or even 10-core Itanium.

Aside from that, had Intel concentrated all their R&D on Itanium, rather than having to split themselves between x86 and Itanium (with Itanium development pretty much at a complete standstill at times, and generally a process node behind), who knows what Itanium would have looked like today.

Same goes for compilers... if Intel/MS/gcc had invested all their time in Itanium optimizations... who knows what would have happened.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Same goes for compilers... if Intel/MS/gcc had invested all their time in Itanium optimizations... who knows what would have happened.

Seriously, I wonder how much of this speculation matters. If things were that simple, everybody would be a fortune teller. Unfortunately, not even the analysts who get paid to predict the future can do it.

I don't think that just because Intel is executing well on their x86 lines, they can make a CPU based on a new architecture and have it work flawlessly.

I still think it might even be the other way around. The Xeon MPs aren't being developed with all the RAS and I/O they need because that's for Itanium to take care of.
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Seriously, I wonder how much of this speculation matters.

Why does it have to matter?
I'm just pointing out that x86 is a dreadful architecture, but because so much was invested in it over the years, it eclipsed pretty much everything else.
So while some people may think Itanium is a dreadful architecture... it could be another x86-scenario success story if there was nothing else to invest in.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Because some people make too much fuss about it over and over, even though it's, well... mostly over.

Who knows? I think the market would still not have wanted to move to an entirely different ISA, because x86-based software is so entrenched everywhere. It's something too big even for Intel to break.

Besides, it's all in the optimizations. Whatever minute ISA advantages EPIC might or might not have had, they don't exist because of all the tools/compilers/infrastructure invested over the years in x86.

(I did support the Itanium quite strongly back in the pre-Core 2 days. Whatever Intel was doing between the Pentium II and Core 2 was simply not exciting and/or brought little advantage. In comparison, a radical uarch change that brought more substantial improvements than on the PC side was a fresh change. It's really hard to think like that now.)
 
Last edited:

Chiropteran

Diamond Member
Nov 14, 2003
9,811
110
106
Nah, mainstream computers were always cheap. Good computers were always expensive.

Before 1992, my computer was a Commodore 64. Wiki says these things came to market in 1982 with a price of $600 US. Commodore 128 came out a few years later with a price of only $300 in the US. The Amiga 500 was $700 US.

I'm pretty sure those prices don't include a monitor. I'm also pretty sure those were among the cheapest bottom of the line computers you could buy. Further, NONE of those were x86 computers, which is what Scali claimed to have bought. They certainly weren't the "mainstream performance" computers Scali was talking about.


I've never paid as much as $2000 for any PC, I know that much.

Then you were not buying "mainstream performance" PCs back in the 80s. Not unless you define "mainstream performance" as budget low-end :p
 

Scali

Banned
Dec 3, 2004
2,495
0
0
Further, NONE of those were x86 computers, which is what Scali claimed to have bought.

Exactly, PCs (as in IBM compatibles, with x86 processors) were always mega-expensive, compared to home computers or game consoles.
You could buy 4 Amiga 500s for the price of a 10 MHz XT with monochrome monitor and no HDD.

Then you were not buying "mainstream performance" PCs back in the 80s. Not unless you define "mainstream performance" as budget low-end

Hard to say exactly. I'm not American; I don't pay in dollars. And our taxes are way higher... and then there are import costs, etc. So we pay way more than Americans do anyway.
But I'd estimate it's somewhere between $1000-$1500.
In those days there wasn't that much choice anyway. These days you have dozens of variations of CPUs, with tons of minuscule steps in clock speed. Back then you'd have 2 or 3 clock speeds to choose from, and usually either cutting-edge architecture or last-gen (e.g. 10 MHz XTs became mainstream when the 286 was introduced).
 
Last edited:

ShawnD1

Lifer
May 24, 2003
15,987
2
81
I'm pretty sure those prices don't include a monitor. I'm also pretty sure those were among the cheapest bottom of the line computers you could buy.
Has any of this changed? Show me a $600 modern computer that isn't a piece of shit. I'll quickly check Best Buy since that's a well-known store. Here's a beauty, a nice dual-core i3 for only $600: link. 6GB memory, 1TB hard drive (probably a Caviar Green piece of crap), integrated graphics. At the very least, it has gigabit ethernet. Monitor not included. This computer sucks.

If we step it up a notch, we get into the price range of a nice IBM PC from 1990. $1200 US gets you an i7 930, 8GB RAM, Blu-ray (this would have been a Zip drive back in the day), 1TB hard drive, gigabit ethernet, monitor not included. So even with competition, good x86 machines are expensive as fuck.


Not unless you define "mainstream performance" as budget low-end
Mainstream is budget low-end. How many months/years did it take before Best Buy even carried i7 computers? When people on Anandtech were building i7 920s and pushing them to 4GHz, mainstream computers sold in stores were still E5200s. Average computers are horrendously bad.
http://en.wikipedia.org/wiki/Desktop_computer
For Microsoft Windows systems, the average selling price (ASP) showed a decline in 2008/2009, possibly due to low-cost Netbooks, drawing $569 at U.S. retail in August 2008. In 2009, ASP had further fallen to $533 by January.
 
Last edited: