Why does Intel gimp some of their CPU lines?

Elixer

Lifer
May 7, 2002
10,371
762
126
Intel releases tons of SKUs, and, unlike their counterpart (AMD), they disable key features on some of them.

What would be the reason for this?

For example, if you are looking to use virtualization, you notice that the K series of CPUs from Intel have that disabled.

AMD, on the other hand, keeps all its processors the same, with the speed rating being the only difference.

Does that imply that Intel's K-series CPUs have flaws in them, or are somehow of lesser quality than the non-K series?
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
It is called market segmentation. Everyone, including AMD, does it.

Specifically to your point, OC enthusiasts are generally not looking for that feature. K series = enthusiast.
 
Last edited:

Elixer

Lifer
May 7, 2002
10,371
762
126
I am talking about actually disabling features that the CPU would normally support, and, AFAICT, AMD doesn't do this.

This isn't about core count or threads; this is about fusing off features of the CPU so they are disabled.

Even if they are geared toward enthusiasts, why spend the extra $$$ to fuse it off if all is well?
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
I still think OCGuy answered it correctly. As far as I know, it's mostly for market segmentation. Whether they fuse off cores or caches or whatever, it's all mostly for the same reason. As for "why spend the extra $$$": relatively speaking, getting an engineer to figure out how to fuse things off is still monstrously cheaper than designing a new core around those product differentiators.

So yeah, say I'm a CEO and I think product differentiation and market segmentation are the way to make $$$. Since making a new CPU without those features is insanely expensive, the next best thing is to build an "it has everything" chip and then disable stuff.

I think of Samsung when I think of efficient market segmentation. They're really good at making varied products, and they're so efficient at it that they don't have to resort to the "make the everything product and turn some things off" approach. They can create a new variation cheaply and therefore reap the savings in production.
 

Roland00Address

Platinum Member
Dec 17, 2008
2,196
260
126
Because if you need more speed than an i7-4770, and you need virtualization with directed I/O (VT-d) instead of plain virtualization (which the 4770K does support), then you can spend the extra money on a socket 2011 quad core, which has VT-d and can be overclocked, or you can go with a socket 2011 six core.

You are not an enthusiast if you need directed I/O; you are a workstation user.

Furthermore, in reality a workstation user will not be overclocking; they can't risk the potential loss of time from data corruption for less than a 20% gain in CPU performance.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
I run 4770s in both my boxes, and it's disappointing that the one in my office box (where I use the iGPU) only has the HD 4600; you'd think an i7 would come with at least the HD 5000. And yes, it's all $$$.
 

chrisjames61

Senior member
Dec 31, 2013
721
446
136
It is called market segmentation. Everyone, including AMD, does it.

Specifically to your point, OC enthusiasts are generally not looking for that feature. K series = enthusiast.

No. It is a money grab, plain and simple. The chips are gimped. The features are there. They are disabled.
 

Roland00Address

Platinum Member
Dec 17, 2008
2,196
260
126
I run 4770s in both my boxes, and it's disappointing that the one in my office box (where I use the iGPU) only has the HD 4600; you'd think an i7 would come with at least the HD 5000. And yes, it's all $$$.

Socketed desktop CPUs have only ever come with the HD 4600; Intel uses different dies for the laptop/all-in-one parts that get the HD 5000/Iris/Iris Pro.

Your chip doesn't have fused-off GPU parts.
 

TuxDave

Lifer
Oct 8, 2002
10,571
3
71
No. It is a money grab, plain and simple. The chips are gimped. The features are there. They are disabled.

Market segmentation = Money grab = The goal of everyone

(non-MBA guy here so I could be talking out of my ass)

The way I see it, I have a good CPU that I want to sell for $200, a competitive price. My competitor then sells a cheaper but worse CPU at $50, and it's "good enough" for the people who want to save money. I want all that money too. But if I drop my price to $50, I lose all the money from the people who were willing to spend $200. Do I spend a ton of money on another engineering team to make a cheaper version of the chip? Would I get enough ROI (cheaper to produce, lower power, etc.) to justify the extra development time? Maybe...

When it doesn't make sense, I take my chip, disable some stuff, lock in lower frequencies, fuse off some caches, and sell that version at $50. That way the cheapskates of the world can buy my chip at $50 because it's "good enough", and there's still enough added value in the "premium" version for people to spend the extra $150.
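To put toy numbers on that (all prices and buyer counts invented, just to show the shape of the argument):

```c
/* Toy revenue comparison: one price for everyone vs. two SKUs cut
 * from the same die. All numbers are invented for illustration. */
#include <stdio.h>

int main(void) {
    int premium_buyers = 1000; /* hypothetical buyers who'd pay $200 */
    int budget_buyers  = 4000; /* hypothetical buyers who'd pay $50  */

    /* Sell only the $200 part: the budget buyers walk away. */
    int high_only = 200 * premium_buyers;

    /* Sell only at $50: everyone buys, premium buyers keep $150 each. */
    int low_only = 50 * (premium_buyers + budget_buyers);

    /* Two SKUs: fuse features off the cheap one so premium buyers
     * still have a reason to pay $200. */
    int segmented = 200 * premium_buyers + 50 * budget_buyers;

    printf("$200 only: $%d\n", high_only); /* $200000 */
    printf("$50 only:  $%d\n", low_only);  /* $250000 */
    printf("two SKUs:  $%d\n", segmented); /* $400000 */
    return 0;
}
```

The fused-off features are what stop the $200 crowd from just buying the $50 part.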

Edit: I saw this in another thread and it said it best. You can look at it two ways: either "you're robbing me of features that should be mine" or "you're allowing people who can only spend $50 to actually buy something".
 
Last edited:

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
No. It is a money grab, plain and simple. The chips are gimped. The features are there. They are disabled.

Business exists to "grab money".

How is some feature that few even notice any worse than segmenting by core count or clock speed?

I guess on my personal outrage meter, this doesn't make it beyond "pizza delivery guy forgot the cheese and red pepper packets".

FYI, AMD does the same, and I am just as apathetic. They sell MS and Sony $100 Jaguar cores that can run almost any game. They sold MS a tri-core for the Xbox 360.

On the GPU side, they release APIs that only their top-end cards can take advantage of.

It is called business.
 
Last edited:

DominionSeraph

Diamond Member
Jul 22, 2009
8,386
32
91
AMD, on the other hand, keeps all its processors the same, with the speed rating being the only difference.

So all of AMD's processors are Black Edition/K versions with unlocked multipliers? Their lower-segmented APUs don't have cut-down GPUs? Every single FX-4XXX or 6XXX has two or one damaged modules, respectively, and AMD has never cut down a working 8-core to meet market demand? (Seems pretty odd that we have neither a glut nor a scarcity of these partially disabled chips.)

Some X2 555s could be unlocked into tri- or quad-cores. Some Athlons could unlock Phenom cache. X3s into X4s. 960Ts into hex-cores.
AMD does segment.

AMD can't afford to segment to the degree Intel can because they need every selling point possible that doesn't undercut their own higher-profit models. Their chips are higher wattage, slower, and likely more expensive to produce. (The 315 mm^2 FX-8350 has to be sold for less than a 160 mm^2 i5 with disabled hyperthreading and cache, and significantly less than the fully enabled i7. And with the 4XXX and 6XXX it's putting 315 mm^2 up against ~100 mm^2 i3s and Pentiums. And that Intel die size includes an IGP that the FX lacks, which uses a different (and presumably cheaper) lithography than the CPU cores.)
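To put rough numbers on that production-cost gap, here's the standard dies-per-wafer approximation for a 300 mm wafer, using the die sizes above (gross dies only; yield, scribe lines, and edge exclusion ignored):

```c
/* Gross dies-per-wafer estimate: wafer area / die area, minus an
 * edge-loss term. A standard back-of-the-envelope approximation. */
#include <math.h>
#include <stdio.h>

#define PI 3.14159265358979323846

static double dies_per_wafer(double wafer_mm, double die_mm2) {
    double r = wafer_mm / 2.0;
    return PI * r * r / die_mm2 - PI * wafer_mm / sqrt(2.0 * die_mm2);
}

int main(void) {
    /* 300 mm wafers; die sizes as cited above. */
    printf("315 mm^2 (FX-8350):    ~%.0f dies\n", dies_per_wafer(300, 315));
    printf("160 mm^2 (quad i5/i7): ~%.0f dies\n", dies_per_wafer(300, 160));
    printf("100 mm^2 (i3/Pentium): ~%.0f dies\n", dies_per_wafer(300, 100));
    return 0;
}
```

Roughly twice as many candidate i5 dies per wafer as FX dies, before yield even enters the picture.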

AMD's CPU division has been running deep into the red. Don't assume they're giving you value-adds because they just wuv you so berry, berry much.

AMD doesn't love you
Anandtech said:
As with all FX series processors, the FX-60 debuts at $1031 in quantities of 1000, so you can expect street pricing to be at or around that number. The FX-57 will drop to $827 mark as it will co-exist with the FX-60.
 
Last edited:

Elixer

Lifer
May 7, 2002
10,371
762
126
So all of AMD's processors are Black Edition/K versions with unlocked multipliers? Their lower-segmented APUs don't have cut-down GPUs? Every single FX-4XXX or 6XXX has two or one damaged modules, respectively, and AMD has never cut down a working 8-core to meet market demand? (Seems pretty odd that we have neither a glut nor a scarcity of these partially disabled chips.)
:rolleyes:
I was NOT talking about being able to o/c one series better than another, or chopping off cores/cache, or anything like that.

I was talking about CPU features that are removed, for whatever reason, and I cited virtualization as a specific example.

You can get any AMD CPU they make and it supports virtualization; even the lowly Geneva or Suzuka supports it.

It just seems like an odd thing to gimp, so I was curious what reasons they could have behind the decision.
 

Lepton87

Platinum Member
Jul 28, 2009
2,544
9
81
And that Intel die size includes an IGP that the FX lacks, which uses a different (and presumably cheaper) lithography than the CPU cores.)

AMD's CPU division has been running deep into the red. Don't assume they're giving you value-adds because they just wuv you so berry, berry much.

AMD doesn't love you

No, the integrated GPU doesn't use a different process node; it's on the same die. The last time an integrated GPU used a different process node was Arrandale, where the GPU sat on a separate die built on a 45nm process while the CPU used 32nm. AFAIK you can't use a different process for part of a chip.
 

chrisjames61

Senior member
Dec 31, 2013
721
446
136
Business exists to "grab money".

How is some feature that few even notice any worse than segmenting by core count or clock speed?

I guess on my personal outrage meter, this doesn't make it beyond "pizza delivery guy forgot the cheese and red pepper packets".

FYI, AMD does the same, and I am just as apathetic. They sell MS and Sony $100 Jaguar cores that can run almost any game. They sold MS a tri-core for the Xbox 360.

On the GPU side, they release APIs that only their top-end cards can take advantage of.

It is called business.

No, it's called crapping on your customers. Same as Microsoft with all their different Windows SKUs. But that gravy train is ending with Google and Apple putting the screws to them. ARM will eventually destroy the Intel we know today.
 

OCGuy

Lifer
Jul 12, 2000
27,224
37
91
No, it's called crapping on your customers. Same as Microsoft with all their different Windows SKUs. But that gravy train is ending with Google and Apple putting the screws to them. ARM will eventually destroy the Intel we know today.

Yeah! Because Apple would never make you pay for more memory on your iPhone or iPad, right? Or charge you more for 4G access on your iPad. Or segment the market into 5/5c/5s iPhones.

And they would never avoid a uniform charging standard, so you have to buy a charger, car charger, etc every time you get a new phone, right?

Why does Google charge for the 1Gbit internet if the 5Mbit is almost free? I mean, the fiber is already there, as is the modem. Why are they artificially limiting you?

Why aren't all new features in the Android OS backwards compatible? It is almost like they make you want to upgrade!


Sheesh...
 
Last edited:

Ken g6

Programming Moderator, Elite Member
Moderator
Dec 11, 1999
16,591
4,497
75
What I don't get is why Intel would create new instruction sets that they presumably want to be widely adopted, and then remove those instruction sets from some, or even most, of their chips. What benefit do they get if nobody adopts the instructions because very few people have chips that can use them?
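That fragmentation shows up directly in software: any program that wants the new instructions has to probe for them at runtime and ship a fallback anyway. A minimal sketch using the GCC/Clang x86 builtins (the fast/slow paths here are placeholders):

```c
/* Runtime ISA-feature probe: when an extension isn't guaranteed to be
 * present, every binary needs a check like this plus a fallback path. */
#include <stdio.h>

int main(void) {
    __builtin_cpu_init(); /* populate the feature-flag cache (GCC/Clang) */

    printf("sse4.2: %s\n", __builtin_cpu_supports("sse4.2") ? "yes" : "no");
    printf("avx2:   %s\n", __builtin_cpu_supports("avx2")   ? "yes" : "no");
    printf("fma:    %s\n", __builtin_cpu_supports("fma")    ? "yes" : "no");

    if (__builtin_cpu_supports("avx2")) {
        /* dispatch to a fast path compiled with -mavx2 */
    } else {
        /* scalar fallback so the binary still runs on cut-down chips */
    }
    return 0;
}
```

If most shipping chips answer "no", developers skip the fast path entirely, which is exactly the adoption problem being described.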
 

BonzaiDuck

Lifer
Jun 30, 2004
16,178
1,777
126
Market segmentation = Money grab = The goal of everyone

(non-MBA guy here so I could be talking out of my ass)

The way I see it, I have a good CPU that I want to sell for $200, a competitive price. My competitor then sells a cheaper but worse CPU at $50, and it's "good enough" for the people who want to save money. I want all that money too. But if I drop my price to $50, I lose all the money from the people who were willing to spend $200. Do I spend a ton of money on another engineering team to make a cheaper version of the chip? Would I get enough ROI (cheaper to produce, lower power, etc.) to justify the extra development time? Maybe...

When it doesn't make sense, I take my chip, disable some stuff, lock in lower frequencies, fuse off some caches, and sell that version at $50. That way the cheapskates of the world can buy my chip at $50 because it's "good enough", and there's still enough added value in the "premium" version for people to spend the extra $150.

Edit: I saw this in another thread and it said it best. You can look at it two ways: either "you're robbing me of features that should be mine" or "you're allowing people who can only spend $50 to actually buy something".

Interesting that you would reference the "non-MBA" factor. Richard Wolff is a professor of Economics -- now at the New School, but he taught for some years at UMass Amherst. He has a different focus than the problems we discuss here, but he noted there are really "two Economics departments" in most universities -- the second one being the business school. In the history of economic thought there was a term, "chrematistics," meaning the use of economic knowledge purely for personal gain.

But back to Intel. There are various types of "market imperfection" in the inventory of economic models of how things work. In the extreme, there is monopoly, which often involves "barriers to entry" to prospective competitors. Then there is the "dominant firm" model. These firms expect entry here or there by their competition, but they pretty much call the shots. If Intel isn't a dominant firm, it is more like the dominant half of a duopoly (which would include AMD).

Product differentiation was an approach followed by the auto industry, and there are plenty of examples of it. But product differentiation is also much like price discrimination -- a hallmark of monopoly. The classic example you would never think of otherwise is the independent small-town MD in general practice. Like Robin Hood, he may charge his wealthier patients more and his more desperate patients less for the same service. This isn't quite what Intel is doing, although the impact on profit is the same.

Back around 1998, give or take a year, Intel produced the "Pentium II" processor line. The Federal Trade Commission had been all over Intel for a good while -- gathering "intelligence." It was later discovered that Intel was running the PII 250, 300, and 350(?) off the same assembly line and then disabling the output in segments to meet those three specs. Eventually, someone in the Philippines discovered how to re-enable those chips and sell them as the more expensive model at a higher price -- a form of counterfeiting. [And I got one of those processors, too . . .] It is doubtful to me that Intel ever faced any legal culpability for doing this.

But go figure. Imagine the downward-sloping demand curve and its intersection with upward-sloping supply. The area below the demand curve and above the price is called "consumer surplus," and it figures into monopoly profit. If I can sell a differentiated (but identically manufactured) product to one market segment at a higher price, and a lesser, identically manufactured product to another segment at a lower price, I reap more profit, taking the consumer surplus away with me.
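A toy worked example with a linear demand curve P = 200 - Q (all numbers invented):

```c
/* Toy price-discrimination example on linear demand P = 200 - Q:
 * a second, cut-down SKU converts consumer surplus into revenue. */
#include <stdio.h>

int main(void) {
    /* One price for everyone. */
    double p = 100.0, q = 200.0 - p;        /* 100 buyers */
    double rev_one = p * q;                 /* 10000 */
    double cs_one  = 0.5 * (200.0 - p) * q; /* 5000 kept by buyers */

    /* Two SKUs: premium at $150 (the 50 buyers who value it most),
     * cut-down at $100 (the next 50). The fused-off features stop
     * the premium buyers from taking the cheap one. */
    double rev_two = 150.0 * 50 + 100.0 * 50;       /* 12500 */
    double cs_two  = 0.5 * 50 * 50 + 0.5 * 50 * 50; /* 2500 */

    printf("one price: revenue %.0f, consumer surplus %.0f\n", rev_one, cs_one);
    printf("two SKUs:  revenue %.0f, consumer surplus %.0f\n", rev_two, cs_two);
    return 0;
}
```

Same chips off the same line, and $2500 of what was consumer surplus moves into the seller's pocket.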

So you can have a Cadillac or an Oldsmobile. Maybe later, you find out that they have the same engine.
 

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
NV didn't release any GK110 with all the cores enabled for a year, SKU after SKU, milking all the way. When the full GK110 was finally released, the 780 Ti became (and still is) the holy grail of gaming cards, and a 750 Ti that can't play games seems to be second place. Marketing, I guess?
 

Homeles

Platinum Member
Dec 9, 2011
2,580
0
0
What I don't get is why Intel would create new instruction sets that they presumably want to be widely adopted, and then remove those instruction sets from some, or even most, of their chips. What benefit do they get if nobody adopts the instructions because very few people have chips that can use them?
I don't get that one either.
NV didn't release any GK110 with all the cores enabled for a year, SKU after SKU, milking all the way. When the full GK110 was finally released, the 780 Ti became (and still is) the holy grail of gaming cards, and a 750 Ti that can't play games seems to be second place. Marketing, I guess?
A 750 Ti can't play games? People on the internet are so spoiled.
 

MagnusTheBrewer

IN MEMORIAM
Jun 19, 2004
24,122
1,594
126
As someone with a fair knowledge of electronics manufacturing, I suspect some of these things are disabled after the chip fails testing; the parts are then marketed to a different segment.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
No, it's called crapping on your customers. Same as Microsoft with all their different Windows SKUs. But that gravy train is ending with Google and Apple putting the screws to them. ARM will eventually destroy the Intel we know today.

So I guess Toyota crapped on me, because there's a place on my car where fog lights go, but I didn't get them.

Or it could be that I didn't pay for them.
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Intel releases tons of SKUs, and, unlike their counterpart (AMD), they disable key features on some of them.

There's a reason for Intel to launch tons of SKUs: their market share is big enough to warrant that kind of market segmentation to maximize profits. To make money, Intel must invest in premium features and get people to pay for them. The manufacturing cost is irrelevant here; how Intel plans to get the invested money back is what matters, and the path they followed is to charge more for premium parts.

AMD, OTOH, has volumes too small to bother; it doesn't make sense to spend money designing new dies, or even fusing off parts to generate new products, because there's no money in it. To make money they must sell everything they can for whatever price they can get, so rather than fusing off parts they do die salvaging. That's why the same big Opteron die suddenly becomes the humble FX-4x00 if the die is defective enough.

We are talking about two very different business models; you just can't compare the two.