The future of Mainstream Enthusiast Desktop: Removal of the iGPU?

cbn

Lifer
Mar 27, 2009
I thought the following post (bolded part) was very insightful:

http://forums.anandtech.com/showpost.php?p=36512515&postcount=50

If the 5820K is a single die like the Core i7-3820 was, then it could be smaller than the Core i7-4790K. They could price it at $399 and make more profit than on any socket 1150 SKU.
The only question is, do they want enthusiasts converting to the HEDT platform, or does Intel still want to raise its GPU market share?

And then next year you could have an 8-core 14nm die that is smaller than a Broadwell quad core + GT3 iGPU. Do you push enthusiasts to the HEDT platform and make more profit, or stay at mainstream and keep the GPU market share?


Let's wait and see, so close to the Haswell-E release now ;)

1. Eliminating the iGPU for the mainstream enthusiast desktop makes a lot of sense to me.

But let's say Intel wanted to do this today (just for the sake of argument), using Haswell-E and X99 as the starting point. How would they accomplish this?

Maybe create a feature-reduced chipset called X95 (one that could easily plug into existing X99 motherboard designs)? Then pair it with chips like a Pentium Haswell-E (4c/4t), a Core i3 Haswell-E (4c/8t), and a Core i5 Haswell-E (6c/6t)?

By getting rid of the iGPU, Intel would be able to sell a chip that is much less costly to produce than products like Devil's Canyon.

I suspect even a 4c/8t Haswell-E Core i3 would be cheaper to produce than a LGA 1150 Haswell Core i3.
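Just to put rough numbers on the cost argument, here is a back-of-the-envelope sketch using the classic dies-per-wafer approximation and a simple Poisson yield model. All figures here (die areas, wafer cost, defect density) are made up for illustration, not Intel's actual numbers:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Classic approximation: gross dies on a round wafer,
    minus the partial dies lost around the edge."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2, wafer_cost_usd, defect_density_per_mm2=0.001):
    """Poisson yield model: yield = exp(-D0 * A). Smaller dies
    both fit more per wafer and yield better."""
    good_yield = math.exp(-defect_density_per_mm2 * die_area_mm2)
    return wafer_cost_usd / (dies_per_wafer(die_area_mm2) * good_yield)

# Hypothetical die sizes: ~177 mm^2 for a quad core with iGPU,
# ~130 mm^2 for the same cores with the iGPU stripped out.
for area in (177, 130):
    print(f"{area} mm^2: ~{dies_per_wafer(area)} dies/wafer, "
          f"~${cost_per_good_die(area, 5000):.2f} per good die")
```

Even with these toy numbers the smaller die comes out meaningfully cheaper per good unit, which is the whole argument for a dedicated iGPU-less die, if the volume is there.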

2. In the future, there is also a possible transition to a more mobile-oriented process tech for the mainstream (CPU + GPU) dies to consider. That might also help hasten a move to the big socket for enthusiasts. (I don't think this will happen with Haswell-E, of course, but maybe by Skylake-E's arrival it could be reality?)
 

Phynaz

Lifer
Mar 13, 2006
IMHO, if the cost of creating a different die didn't outweigh the profit that could be made, then Intel surely would have done this already.
 

cbn

Lifer
Mar 27, 2009

Bateluer

Lifer
Jun 23, 2001
Not sure that would even be a good idea. There are legit reasons for wanting an iGPU even as an enthusiast. I use the HD4x00 iGPU in my Haswell i5 for Quick Sync video transcoding frequently.

I'm not sure how common this is in desktops, but in performance-oriented laptops the iGPU takes over when on battery or when the discrete GPU is unneeded; Nvidia Optimus and AMD's Enduro, I believe the platforms are called. No reason why the desktop couldn't do the same, or use the iGPU for physics or xFire/SLI benefits.
 

Blitzvogel

Platinum Member
Oct 17, 2010
Considering the same designs are used across desktop, mobile, and to some extent servers, Intel would be hard-pressed to remove the IGP. Even for the server market, if having the integrated GPU means cutting out the need for a dedicated GPU (from AMD or Nvidia), Intel is going to do it.

I would still like to see high-end Intel and AMD products that cut out the IGP in order to put on 8 full cores (cue MOAR CORES). I think there is a legit market for that kind of product among the enthusiast, gamer, and workstation crowds.
 

SPBHM

Diamond Member
Sep 12, 2012
Not sure that would even be a good idea. There are legit reasons for wanting an iGPU even as an enthusiast. I use the HD4x00 iGPU in my Haswell i5 for Quick Sync video transcoding frequently.

I'm not sure how common this is in desktops, but in performance-oriented laptops the iGPU takes over when on battery or when the discrete GPU is unneeded; Nvidia Optimus and AMD's Enduro, I believe the platforms are called. No reason why the desktop couldn't do the same, or use the iGPU for physics or xFire/SLI benefits.

It requires the monitor to be connected to the IGP, with the dGPU passing its output over PCIe, so I think it's always going to add more problems than a pure dGPU setup. Also, current NV/AMD GPUs can go as low as just 5W idle power usage, so it's hardly that appealing for desktops... and shutting down the PCIe card seems to require some work. Lucid Virtu attempts something like that, but it doesn't really turn power down on the card. I think Nvidia and AMD had something working back in 2008 with the 8200m/780G and their high-end cards (when it made sense, with cards using 50W and more at idle).

Quick Sync is OK I guess, but Nvidia and AMD GPUs also have built-in hardware transcoders, and a lot of people still prefer software solutions because of the flexibility they offer.
 

NTMBK

Lifer
Nov 14, 2011
Considering the same designs are used across desktop, mobile, and to some extent servers, Intel would be hard-pressed to remove the IGP. Even for the server market, if having the integrated GPU means cutting out the need for a dedicated GPU (from AMD or Nvidia), Intel is going to do it.

I would still like to see high-end Intel and AMD products that cut out the IGP in order to put on 8 full cores (cue MOAR CORES). I think there is a legit market for that kind of product among the enthusiast, gamer, and workstation crowds.

Intel already has plenty of server CPUs with no IGP. The only ones with an IGP are the Xeon E3 line- the Xeon E5 and E7 have no IGP. Not just a disabled IGP, no IGP at all on the die. It's all CPU, cache and fabric.

However, I'm not sure there's still a big enough market for a 4-core server CPU with no IGP. The smallest GPU-less Haswell die is the 6-core one.
 

ShintaiDK

Lifer
Apr 22, 2012
Mainstream will always have an IGP, active or not.

Only multisocket server chips (read: LGA 2011-style and above) will be without one.
 

DeathReborn

Platinum Member
Oct 11, 2005
Maybe it's time for a Core i4, i6 & i8 to come out, complete with no IGP but proper TIM instead of the mickey-mouse stuff Intel has been trying to force upon people.

It's not going to happen, but maybe Intel will at least go back to good TIM.
 

DrMrLordX

Lifer
Apr 27, 2000
To heck with TIM, just solder the darn IHS down or sell the enthusiast chips "naked" with the IHS loose in the package so we can dispense with the annoying delid operations.

As far as iGPUs go, there are arguments for and against right now, but I think most people will be more inclined to appreciate them once stuff like DX12 and/or HSA becomes mainstream (assuming HSA ever becomes mainstream).
 

witeken

Diamond Member
Dec 25, 2013
Intel's CPUs without IGP have the same name (i3, i5, i7), but with a P suffix (e.g. i5-3350P).
 

Kenmitch

Diamond Member
Oct 10, 1999
Looking at the heat generated by the current offerings, isn't the dead (disabled) silicon a good thing? Does it trap heat or help spread it out?
 

ShintaiDK

Lifer
Apr 22, 2012
Looking at the heat generated by the current offerings, isn't the dead (disabled) silicon a good thing?

The IGP also serves like dark silicon even when it's active, simply because it consumes much, much less power per mm² than the cores.
 

cbn

Lifer
Mar 27, 2009
Mainstream will always have an IGP, active or not.

Only multisocket server chips (read: LGA 2011-style and above) will be without one.

What I am proposing is that the level of CPU performance we call "mainstream enthusiast" become part of the big LGA-style socket, accompanied by a lower-cost, feature-reduced PCH.

In this way, Intel could produce an iGPU-less 4C/8T die for probably the same cost as a 2C/4T with iGPU.

The mainstream (with its iGPU) will continue to live on, of course, but I am thinking Intel will probably eventually equip those dies with a more mobile-tuned transistor (i.e. lower drive current, but lower leakage, etc.).
 

ShintaiDK

Lifer
Apr 22, 2012
What I am proposing is that the level of CPU performance we call "mainstream enthusiast" become part of the big LGA-style socket, accompanied by a lower-cost, feature-reduced PCH.

In this way, Intel could produce an iGPU-less 4C/8T die for probably the same cost as a 2C/4T with iGPU.

The mainstream (with its iGPU) will continue to live on, of course, but I am thinking Intel will probably eventually equip those dies with a more mobile-tuned transistor (i.e. lower drive current, but lower leakage, etc.).

The mainstream and performance segments will not come from Xeon parts, only the high-end enthusiast part where you have to buy into the multisocket Xeon socket. Just look at the E3 Xeons; they have an IGP as well.

Nobody (read: an irrelevant minority) outside the multisocket segment is interested in IGP-less chips.

The enthusiast chips as we know them may actually be in danger in the long run, as servers turn to more cores and become unable to keep up the frequencies that single-thread-heavy consumer applications require.

Intel has zero interest in producing a future 4C/8T die without an IGP, simply because it's below the multisocket Xeon demand.
 

Fjodor2001

Diamond Member
Feb 6, 2010
IMHO, if the cost of creating a different die didn't outweigh the profit that could be made, then Intel surely would have done this already.

They could sell chips where the iGPU fails factory testing as pure CPUs without an iGPU, though. I.e. no extra cost for creating a separate die.

Also note that nowadays the iGPU is not an insignificant part of the total die area. As the core count stays at 4 while the iGPU is beefed up with each new CPU generation, this aspect becomes ever more important. We are already past 50% of the total die area allocated to the iGPU for some SKUs.
 

Phynaz

Lifer
Mar 13, 2006
They could sell chips where the iGPU fails factory testing as pure CPUs without an iGPU, though. I.e. no extra cost for creating a separate die.

Disabling functionality - for example, fusing off a non-functional IGP - does not remove the IGP.
 

NTMBK

Lifer
Nov 14, 2011
They could sell chips where the iGPU fails factory testing as pure CPUs without an iGPU, though. I.e. no extra cost for creating a separate die.

They already do. It's the P series, as witeken mentioned above.
 

Atreidin

Senior member
Mar 31, 2011
I remember, years ago, reading about people who refused to ever buy a processor with integrated graphics, as if it were an affront to their honor or something. It was as if Intel including it on the processor was an insult. Disabling it wasn't good enough; it had to not be there at all. I wonder how well that position has held up.