Could the reason Intel introduced "F" CPUs be that they want to create a market perception of value for the iGPU before they release Xe?

VirtualLarry

No Lifer
Aug 25, 2001
56,226
9,990
126
Just thinking a little bit, after looking at the i5-9400F on sale at Newegg for $159.99 (minus $10 promo), and noticing at the bottom where they compare 4 products and their prices, the i5-9400 is $194.99. Clocks and everything else appear to be identical between the chips, except the "F" SKU has the iGPU disabled, of course.

But, if Intel can create a market perception that the iGPU has actual value, for the difference in price that the non-F SKUs are showing up as, then perhaps they will use that to drive market perception that their CPUs with 'Xe'-based iGPUs have a premium value associated with them.

Maybe the creation of the "F"-SKUs, was not simply because they were having yield issues (on 14nm++? doubtful), but because of their marketing plans for newer CPU generations and how they would position them.
 

NTMBK

Lifer
Nov 14, 2011
10,208
4,940
136
Just thinking a little bit, after looking at the i5-9400F on sale at Newegg for $159.99 (minus $10 promo), and noticing at the bottom where they compare 4 products and their prices, the i5-9400 is $194.99. Clocks and everything else appear to be identical between the chips, except the "F" SKU has the iGPU disabled, of course.

But, if Intel can create a market perception that the iGPU has actual value, for the difference in price that the non-F SKUs are showing up as, then perhaps they will use that to drive market perception that their CPUs with 'Xe'-based iGPUs have a premium value associated with them.

Maybe the creation of the "F"-SKUs, was not simply because they were having yield issues (on 14nm++? doubtful), but because of their marketing plans for newer CPU generations and how they would position them.

I think it's more likely because Intel are dealing with a capacity shortage. Even if their yields are fine, there will still be some dies with defective GPUs- and the F series lets Intel get a few more saleable chips out of their limited capacity.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
I also suspect that it allows them to produce desktop dies without iGPUs if they want to (and they may already be doing so in the background) to increase usable dies per wafer. With their current shortage, being able to make an 8-core die without an iGPU would save them about 20% of the die area, so on a wafer they could conceivably produce roughly 15% more dies, giving them a capacity increase without having to invest in additional production equipment. It also frees up die area on the existing floor plan to make a 10-core iSeries die without having to adjust their external packaging.

Given that their competition can now make 12-core (and possibly 16-core) packages in volume in the low-cost desktop space, and Intel is capacity constrained, they need some sort of solution to being more competitive. So, assuming they can do it from a power and thermals perspective, they could potentially make an i9-9950KF/KFS that has 10 cores. Anyone buying in that space won't balk at having to pay an additional $40 for a bottom-tier dGPU if they don't care about graphics, and anyone who's actually being productive will have something much better than an iGPU in mind anyway. The big question is whether an i9-9950KF would be competitive with the R9 3900. That remains to be seen.
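The dies-per-wafer arithmetic above can be sanity-checked with the standard first-order estimate (wafer area over die area, minus a correction for partial dies lost at the wafer edge). The die sizes below are purely hypothetical round numbers for illustration, not Intel's actual figures:

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order estimate: usable wafer area divided by die area,
    minus a correction term for partial dies around the wafer edge."""
    radius = wafer_diameter_mm / 2
    area_term = math.pi * radius * radius / die_area_mm2
    edge_term = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(area_term - edge_term)

# Hypothetical figures: a die with iGPU vs. the same die with
# the ~20%-of-area iGPU removed (180 mm² -> 144 mm²).
with_igpu = gross_dies_per_wafer(180.0)
without_igpu = gross_dies_per_wafer(180.0 * 0.8)
print(with_igpu, without_igpu)  # roughly 343 vs 435 gross dies on a 300 mm wafer
```

Interestingly, this simple area-only model suggests a gain of around 25-27%, somewhat more than the ~15% estimated above; fixed uncore/IO area, floor-plan constraints, and packing inefficiencies would pull the real-world gain back down.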
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
I think it is multiple reasons.

Yields, paving the way for chips without IGPs vs chips with, and laying the groundwork for Xe cards for the midrange and up graphics user.
 

PingSpike

Lifer
Feb 25, 2004
21,729
559
126
Given that the few 10nm parts that were forced out the door had their iGPUs disabled, I'm guessing iGPU defects are a bigger yield concern than one might suspect. Combine that with the 14nm shortage, and I just think it's a simple matter of making use of dies they otherwise would have tossed.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,226
9,990
126
I think it's more likely because Intel are dealing with a capacity shortage. Even if their yields are fine, there will still be some dies with defective GPUs- and the F series lets Intel get a few more saleable chips out of their limited capacity.
But if so, we know that Intel loves their ASPs, so why lower the price on the F-SKU CPUs, if not to create a certain market perception? They could conceivably sell them at even pricing between iGPU and non-iGPU models. But that would create the perception that there was no additional value to the iGPU, that it was just a "freebie" to be accepted, not paid extra for. By creating this spread of pricing between iGPU and non-iGPU, they are creating the market expectation that the iGPU itself has value. (Which could also play into AMD's hands quite well, seeing as how their APUs currently exceed the graphics ability of Intel's iGPUs. And if nothing else, HDMI 2.0 support from AMD.)
 

jpiniero

Lifer
Oct 1, 2010
14,509
5,159
136
But if so, we know that Intel loves their ASPs, so why lower the price on the F-SKU CPUs?

Who says Intel is lowering the price? (to retailers anyway) Newegg having sales doesn't mean Intel lowered prices.

Given that the few 10nm parts that were forced out the door had their iGPUs disabled, I'm guessing iGPU defects are a bigger yield concern than one might suspect. Combine that with the 14nm shortage, and I just think it's a simple matter of making use of dies they otherwise would have tossed.

Back in the day, Intel used to sell the IGP-busted mainstream desktop processors as Xeons (E3) for entry-level servers. They still do, as Xeon E, but I imagine sales have fallen off a cliff, as companies who might consider buying would just go to the cloud instead.
 

PingSpike

Lifer
Feb 25, 2004
21,729
559
126
But if so, we know that Intel loves their ASPs, so why lower the price on the F-SKU CPUs, if not to create a certain market perception? They could conceivably sell them at even pricing between iGPU and non-iGPU models. But that would create the perception that there was no additional value to the iGPU, that it was just a "freebie" to be accepted, not paid extra for. By creating this spread of pricing between iGPU and non-iGPU, they are creating the market expectation that the iGPU itself has value. (Which could also play into AMD's hands quite well, seeing as how their APUs currently exceed the graphics ability of Intel's iGPUs. And if nothing else, HDMI 2.0 support from AMD.)

They are evenly priced. The pricing on ARK is exactly the same for the 9400F and the 9400, for instance, which seems like a real insult!

The street price is an entirely different story, though. Whether it's Intel or the retailers playing games here I cannot say, but somebody is getting paid for those iGPUs, even if officially Intel deems them to have a value of $0.
 

abufrejoval

Member
Jun 24, 2017
39
5
41
One of these days I'd like to hear the full story on this from Intel or more likely from some ex-employee.

When they started fully putting the iGPU on the CPU die with Sandy Bridge, that iGPU took about the same space as the CPU cores: they could have produced an 8-core CPU as the desktop standard, just like they could have produced a 4-core CPU for laptops, right there and then.

So why didn't they?

In the Xeon E3 space, where vPro wasn't enough remote control, so vendors would always put ASpeed or similar iLO/RSA solutions on them anyway, an iGPU was sometimes even considered a negative on a server. So selling desktop-derived chips with ECC enabled but the iGPU fused off made sense there, especially if some of them could be harvested from dies with failures in the GPU part.

But they sold these at perhaps $5 discount, if that.

That made $5 the official value of half the chip's 'net' area (discounting I/O and PCIe etc. for now).
The other half they sold for $380.

Of course 8 cores on a desktop didn't provide tons of tangible benefits to most desktop users, nor did 8 threads on a laptop 15 years ago (yes, that has changed slightly).

But when they decided against continuing to double the cores in the desktop/laptop space, they also decided that the iGPU should not cost any money whatsoever (or only $5).

Again, why?

Where it becomes really odd are the Iris Plus and Pro chips in the GT3 and GT4 variants, especially because of the extra eDRAM (64 or 128MB), because again these sell at the very same official list price as their ordinary GT2 brethren.

Actually, some of them (e.g. the i7-6785R) offered higher clocks than GT2 equivalents (e.g. the i7-6700HQ) at identical prices, plus the extra advantage of being able to use the eDRAM as an L4 cache, so quite a few people would have eagerly taken them for desktop builds (btw. the extra 20 Watts of TDP are for the iGPU delta: the CPU core parts on both of these CPUs are 45 Watts).

A GT3 with 48 EUs is quite literally twice the size of a GT2 iGPU with 24 EUs. On a 2-core die with GT3, the CPU part they sell for '$370' almost disappears next to the Iris 550 GPU they sell for '$5' tops.

On a GT4 72EU CPU with four cores the iGPU still takes vastly more space than the CPU and then there is the 128MB of eDRAM--quite a bit of engineering effort--but again they don't charge extra on official list prices.

And that story has continued to Kaby Lake, even if GT4 variants seem to have dropped off the map, because Apple doesn't seem to ask for them any more.

And that's the other issue that puts this insanity into some perspective: the only way you can buy these CPUs is in an overpriced Mac.

Well, actually there is one other way: Intel NUCs.

But that's it. If you were an OEM wanting to build your own NUC, or motherboards for the enthusiast market, Intel would evidently refuse to sell you these chips, official prices or not.

Again, I ask: Why?
Why did they fix this odd price, at the cost of not being able to sell these chips en masse in a cost-effective manner?

Must be quite a story...
 
  • Like
Reactions: moinmoin

jpiniero

Lifer
Oct 1, 2010
14,509
5,159
136
Where it becomes really odd are the Iris Plus and Pro chips in the GT3 and GT4 variants, especially because of the extra eDRAM (64 or 128MB), because again these sell at the very same official list price as their ordinary GT2 brethren.

No, the comparable part was $70-$100 more for the GT3+ (at least MSRP). You can see why Intel dropped it: there was no point, since nVidia's GPUs were about the same price and much faster, plus there was the nVidia branding.

Intel is moving to chiplets and they are coming soon (Rocket Lake / Tiger Lake).
 

PingSpike

Lifer
Feb 25, 2004
21,729
559
126
Intel's GPU strategy has indeed been strange. To be honest, I think it's rooted in an assumption that they can just make this whole GPU thing go away because it's a temporary development. Certainly, specialized coprocessors did fall out of existence over the years in favor of a faster CPU doing the work, so I can see why the idea would have traction with the old guard. If you look at many of their strategies over the years, it seems like they figured software rendering was going to make a big comeback eventually; they just had to wait this whole GPU thing out, and soon their CPUs would be so much faster that no one would buy a GPU anymore.
 

beginner99

Diamond Member
Jun 2, 2009
5,208
1,580
136
I think it's more likely because Intel are dealing with a capacity shortage. Even if their yields are fine, there will still be some dies with defective GPUs- and the F series lets Intel get a few more saleable chips out of their limited capacity.

Actually, the limit is in validation. F processors simply skip the iGPU validation step, and hence the iGPU is disabled even if it is most likely functional.
 

abufrejoval

Member
Jun 24, 2017
39
5
41
No, the comparable part was $70-$100 more for the GT3+ (at least MSRP). You can see why Intel dropped it: there was no point, since nVidia's GPUs were about the same price and much faster, plus there was the nVidia branding.

Intel is moving to chiplets and they are coming soon (Rocket Lake / Tiger Lake).

The processors I quoted were $370 for the i7-6785R and $378 for the i7-6700HQ, the GT3 being $8 cheaper on list.
Current examples would be the i7-8559U at $431 vs. the i7-8565U at $409, but since they aren't the exact same generation and were published months apart, that's perhaps price erosion.

The point is that the silicon die area difference is huge, the eDRAM extra cost and effort is huge, the price difference is minor. If it wasn't they'd simply make it mainstream.

Of course they were not price effective for Intel, that's the point!

So why quote these obviously ruinous (for Intel, everything below 40% margin is ruinous) prices?

Case in point: they are now advertising the fact that 10th gen *has* 48 EUs and GT3 performance without the eDRAM, thanks to increased LPDDR4 bandwidth and a shrink.

Obviously Intel wanted to hurt ATI/AMD and Nvidia more than they cared about wasting die area on iGPUs they didn't monetize directly.

But with GT3 and GT4 the iGPU + eDRAM overhead must have become big enough to hurt even Intel's accountants.

Obviously without Apple pushing, Intel would have never made these chips.
And somewhat less obviously Apple hates Nvidia's guts.

Still, that leaves plenty of questions, and one of them would be whether Intel has to pay for GPU patents based on retail prices: that could explain why iGPUs were sold for free.

But it raises new questions for Xe.
 

abufrejoval

Member
Jun 24, 2017
39
5
41
I couldn't agree more that Intel can be both stubborn and wrong. Not least because they can afford it: my memory reaches back to the Intel iAPX 432.

But I cannot help but think that the iGPU story is more complex than that.
Of course, it could be no more than some corporate egos, bolstered by billions of revenue...
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I think it's more likely because Intel are dealing with a capacity shortage. Even if their yields are fine, there will still be some dies with defective GPUs- and the F series lets Intel get a few more saleable chips out of their limited capacity.

This sounds like the most reasonable explanation.

The point is that the silicon die area difference is huge, the eDRAM extra cost and effort is huge, the price difference is minor. If it wasn't they'd simply make it mainstream.

Of course they were not price effective for Intel, that's the point!

This makes little sense at all when you look at the actual prices and configurations of the systems. The ARK prices are only effective for enthusiast desktop users!

The higher-end models allow Intel and manufacturers to upsell them for extra revenue. The Iris and Iris Pro configurations are always the last thing you can choose when you go to a site and spec out your laptop, and therefore the most pricey option. If you believe ARK pricing, then Intel does some truly bizarre things. No, the reality is that behind the curtains they do something we don't realize.

I think with this, I can add potentially another reason for the F chips. Quite a few people in the enthusiast forums complain about the "useless iGPU" taking up space and power. Sounds like a business opportunity to me.
 
Last edited:

TheELF

Diamond Member
Dec 22, 2012
3,967
720
126
Simple: the F versions increase the value for money compared to Ryzen CPUs that also have no iGPU. Intel released a 4c, a 6c, an 8c, and an 8c/16t, so every major tier is covered and people can choose to pay a bit more for an iGPU or not. Depending on sales, I can imagine them making this permanent in the future.
 

OTG

Member
Aug 12, 2016
101
175
116
Simple: the F versions increase the value for money compared to Ryzen CPUs that also have no iGPU. Intel released a 4c, a 6c, an 8c, and an 8c/16t, so every major tier is covered and people can choose to pay a bit more for an iGPU or not. Depending on sales, I can imagine them making this permanent in the future.

Removing features while keeping the same price increases value for money? I suppose that's true from Intel's perspective.
Until they lower the MSRP for the F chips, there's no reason for a consumer to pick one over the normal lineup (other than availability).

Edit- right now on Newegg:
i5 9400f: 159.99
i5 9400: 194.99
i9 9900k: 489.99
i9 9900kf: 549.99
So Intel's pricing is some kind of messed up, will be an interesting year for them.
 
Last edited:

TheELF

Diamond Member
Dec 22, 2012
3,967
720
126
Removing features while keeping the same price increases value for money? I suppose that's true from Intel's perspective.
Until they lower the MSRP for the F chips, there's no reason for a consumer to pick one over the normal lineup (other than availability).

Edit- right now on Newegg:
i5 9400f: 159.99
i5 9400: 194.99
i9 9900k: 489.99
i9 9900kf: 549.99
So Intel's pricing is some kind of messed up, will be an interesting year for them.
Yup in my country it's 147€ for the 9400F compared to 208€ for the 9400.
I would never consider paying 208€ for a CPU alone but ~150€ are looking very tempting.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Removing features while keeping the same price increases value for money? I suppose that's true from Intel's perspective.
Until they lower the MSRP for the F chips, there's no reason for a consumer to pick one over the normal lineup (other than availability).

Edit- right now on Newegg:
i5 9400f: 159.99
i5 9400: 194.99
i9 9900k: 489.99
i9 9900kf: 549.99
So Intel's pricing is some kind of messed up, will be an interesting year for them.
Intel has those pairs of chips priced the same, though. (RCP)
 

Wuzup101

Platinum Member
Feb 20, 2002
2,334
37
91
The pricing is certainly odd. I remember reading the (I think Anand) article when the F SKUs were first released, and it noted that the pricing was the same. However, the one SKU that I have actually thought about purchasing, the 9400F, is something like $60 cheaper at my local Micro Center. This, of course, pisses me off, because I wanted it for a Plex media server / NAS build, and I actually want the iGPU lol.

That being said, my assumption has been that the "F" parts enabled them to sell chips that they would otherwise have to throw in the trash because of a defective iGPU. Still, I can certainly see how the price difference between an "F" chip and a higher-priced "non-F" chip helps to create a perception of value. To be honest, I agree that there is value here as a consumer, and I can totally see Intel in the future using a chiplet design and choosing to have SKUs that are just a processor, and those that are an APU. It's a lot of wasted die area on some chips, like the 9900K, which will probably rarely see use.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
The 9900KF is probably perceived to overclock better, so it costs more. Accuracy be damned.
 
  • Like
Reactions: Wuzup101

PingSpike

Lifer
Feb 25, 2004
21,729
559
126
I truly believe few people buying the 9900K care much about the iGPU. But like Wuzup101, I think plenty of people shopping a 9400 care about the iGPU.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
F CPUs were included simply to bring every little bit of working silicon to market, at least at first.

There is no doubt that Intel will find a way to build a premium markup onto the "normal" SKUs in the future as their iGPUs start to become actually decent. The F CPUs are the first step in that timeline.
 

SPBHM

Diamond Member
Sep 12, 2012
5,056
409
126
Was there ever an explanation for Intel selling some Sandy Bridge chips with the IGP turned off, like the 2550K?

in any case, at the moment I think that the idea that they are just trying to salvage as many dies as possible makes sense, given all the supply issues going on.

The i5 9400F has a huge portion disabled: 2 cores, the IGP, lots of L3... that's like, almost half of it?
 

OTG

Member
Aug 12, 2016
101
175
116
I truly believe few people buying the 9900K care much about the iGPU. But like Wuzup101, I think plenty of people shopping a 9400 care about the iGPU.

It makes perfect sense that the 9400f is cheaper than the 9400, but it sure doesn't explain why the 9900kf is $60 MORE than the 9900k.