The future of Mainstream Enthusiast Desktop: Removal of the iGPU?


Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Over half of that die is GPU.

I don't know if you misunderstood me or what, but I was talking about the parts with the lowest end IGP (ie, GT2), not all of their CPUs. For that refer to AtenRa's shot, which shows much less than half the die area occupied by GPU.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I don't know if you misunderstood me or what, but I was talking about the parts with the lowest end IGP (ie, GT2), not all of their CPUs. For that refer to AtenRa's shot, which shows much less than half the die area occupied by GPU.

Well actually those two dies are very close in size. 2+3 (Dual Core + GT3) is close to 190mm^2, the 4+2 (Quad Core + GT2) is at 177mm^2.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Well actually those two dies are very close in size. 2+3 (Dual Core + GT3) is close to 190mm^2, the 4+2 (Quad Core + GT2) is at 177mm^2.

I wasn't talking about total die size anywhere, just the portion of the die that's occupied by the GPU.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
I wasn't talking about total die size anywhere, just the portion of the die that's occupied by the GPU.

Yes, I got that; I was only trying to communicate that those two dies are very close in size.

The thing to remember here is that CPU core size has stayed roughly the same across the last 3-4 CPU families, while iGPU size keeps growing. On 14nm the CPU cores will shrink to about half their current size; the iGPU will not. That will only make the iGPU occupy a larger portion of the die.
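The shrink argument above can be sketched with toy numbers. The 50/50 area split and the clean 2x core shrink below are illustrative assumptions, not measured die figures:

```python
def gpu_fraction(core_area_mm2, gpu_area_mm2):
    """Fraction of the die occupied by the iGPU."""
    return gpu_area_mm2 / (core_area_mm2 + gpu_area_mm2)

# 22nm: suppose cores and iGPU split the die roughly 50/50
print(gpu_fraction(50.0, 50.0))  # 0.5

# 14nm: cores shrink to half their area, iGPU area held constant
print(gpu_fraction(25.0, 50.0))  # ~0.67: the iGPU now takes most of the die
```

Even with the cores shrinking perfectly, holding the iGPU area flat pushes its share of the die from half to about two thirds.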
 

Imaginer

Diamond Member
Oct 15, 1999
8,076
1
0
Removal of the iGPU?

In an instance where my dGPU card failed on me, this was a lifesaver, letting me use my desktop for a bit while the card was out for RMA. I would not discount the iGPU too soon in many cases.

But this is one instance where I actually appreciated the iGPU; another is how its integration has led to the interplay between CPU and GPU on the mobile end, in laptops and hybrids.
 

crashtech

Lifer
Jan 4, 2013
10,666
2,270
146
I have a small pile of old GPUs in a drawer, so no problems there. I would hope most enthusiasts in the market for something above mainstream would keep a few spare parts around, but perhaps I am an outlier.
 

jpiniero

Lifer
Oct 1, 2010
16,230
6,694
136
Considering how rare those P models are, I have a feeling that the GPU transistors are easier/cheaper and don't cause yield problems as much. So you can't really compare a 4+3e with a theoretical 6+1. Plus, the purpose of the mainstream line is mobile, and I am still not sure a 6-core would be doable on mobile even at 14nm.

With Intel intent on killing off the dGPU, I imagine the % of the die given to the GPU on the mainstream line is only going higher. And if you want more than 4 Intel cores, you will have to pay for the Server line.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Considering how rare those P models are, I have a feeling that the GPU transistors are easier/cheaper and don't cause yield problems as much. So you can't really compare a 4+3e with a theoretical 6+1. Plus, the purpose of the mainstream line is mobile, and I am still not sure a 6-core would be doable on mobile even at 14nm.

With Intel intent on killing off the dGPU, I imagine the % of the die given to the GPU on the mainstream line is only going higher. And if you want more than 4 Intel cores, you will have to pay for the Server line.

I have only used the 177mm^2 4+2 (GT2) not the 260mm^2 4+3e die.

Also, just food for thought: the 14nm Skylake 4+4 (GT4) die could be bigger than the 14nm 8-core server CPU die :whistle:
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Removal of the iGPU?

In an instance where my dGPU card failed on me, this was a lifesaver, letting me use my desktop for a bit while the card was out for RMA. I would not discount the iGPU too soon in many cases.

Discrete cards like the HD5450 are pretty cheap on Newegg though (sometimes as low as $9.99 after rebate, with free shipping). They make a nice back-up if you don't use them for something else.

But this is one instance where I actually appreciated the iGPU; another is how its integration has led to the interplay between CPU and GPU on the mobile end, in laptops and hybrids.

For laptops and tablets, I agree. Integration of CPU and GPU is a good thing. I can't argue against that.

But I wonder how viable "laptop processor" integrated dies (like Devil's Canyon) will be for performance desktops going forward?

I guess Intel can get away with this for the time being since they are so strong. However, this has been going on for 3.5 years now, and I think even Intel will eventually be forced to make a change.
 
Last edited:

Redstorm

Senior member
Dec 9, 2004
293
0
76
With Haswell 4+2 (Quad Core + GT2, i.e. Core i7 4770K/4790K) we have come to the point where we could have a 6-core for the same die size.

That means we could have a 4C 8T at Core i5 4670K price or 6C 12T at Core i7 4790K price.

I don't believe we need to say anything more, iGPUs are taking valuable space that could be used for more CPU performance.

At 14nm we could have a 6-core die at less than 100mm^2; even with a 50% higher cost for the new process, it would still be cheaper than the 177mm^2 die at 22nm. ;)


Well said my friend.

As an enthusiast, when I build my rigs I absolutely hate the compromise of having a CPU with an iGPU that I'll never use. Use that iGPU real estate to beef up the core count, IPC, or fabric of the CPU.

Enthusiast builds will always include a discrete GPU. The argument that an iGPU helps when troubleshooting is just lazy, and only relevant if you don't have a spare PCIe GPU around (i.e., don't be lazy and learn to troubleshoot your kit properly).

As for the current-gen Haswells with HD 4xxx graphics, I found out the hard way that these are broken when I jumped on board to update my HTPC. XBMC has had to implement software de-interlacing because Intel couldn't even get hardware de-interlacing working in the iGPU.

The best hardware is worth nothing if you can't get working drivers for it.

Intel have never been able to produce a decent GPU, past or present. Their CPUs are top notch, though.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Cogitating upon imaginary dies and chipsets, I find my mind wandering towards a CPU that would fit quite well into a niche between mainstream and HEDT, namely a hexcore with HT disabled. i5-5670K?

Yes, that would be an awesome processor for anyone that needs that kind of power.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
At 14nm we could have a 6-core die at less than 100mm^2; even with a 50% higher cost for the new process, it would still be cheaper than the 177mm^2 die at 22nm. ;)

Based on that estimate, a dedicated quad-core die (with no iGPU) at 14nm is basically going to be as expensive to produce as Cherry Trail.

The nice thing about a small die (as I understand it) is that it may let Intel get better yields if there happen to be a lot of defects on the wafer.

So maybe for advanced nodes a small (but higher-profit) die would actually be an advantage to Intel in the early part of production?
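One way to see why a small die helps early in a node's life is the classic Poisson yield model, Y = exp(-A * D0). The defect densities and the flat 50% cost-per-mm^2 premium below are made-up illustrative numbers; only the 177mm^2 and sub-100mm^2 die sizes come from the thread:

```python
import math

def poisson_yield(area_mm2, d0_per_mm2):
    """Poisson yield model: fraction of defect-free dies, Y = exp(-A * D0)."""
    return math.exp(-area_mm2 * d0_per_mm2)

def cost_per_good_die(area_mm2, cost_per_mm2, d0_per_mm2):
    """Raw silicon cost of one die, divided by the fraction that actually works."""
    return (area_mm2 * cost_per_mm2) / poisson_yield(area_mm2, d0_per_mm2)

# 22nm 4+2 die vs a hypothetical sub-100mm^2 14nm 6-core die.
# Cost units are arbitrary; 14nm gets a 50% per-mm^2 premium and a
# higher early-ramp defect density (both assumed values).
cost_22nm = cost_per_good_die(177.0, 1.0, 0.002)
cost_14nm = cost_per_good_die(95.0, 1.5, 0.004)
print(cost_14nm < cost_22nm)  # the small 14nm die still comes out cheaper
```

The small die wins twice: it uses less raw silicon per chip, and a larger fraction of its dies are defect-free while defect density is still high early in the ramp.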
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Hypothetical line-up for Skylake-E:

Skylake-E Pentium: 4C/4T

Skylake-E Core i3-7100K: 4C/8T

Skylake-E Core i5-7300K: 6C/6T

Skylake-E Core i5-7400K: 6C/12T

Skylake-E Core i7-7600K: 8C/8T

Skylake-E Core i7-7770K: 8C/16T

Chipset: X119 (High End), X115 (mid range), X111 (low end)
 
Last edited:

StinkyPinky

Diamond Member
Jul 6, 2002
6,948
1,257
126
Hypothetical line-up for Skylake-E:

Skylake-E Pentium: 4C/4T

Skylake-E Core i3-7100K: 4C/8T

Skylake-E Core i5-7300K: 6C/6T

Skylake-E Core i5-7400K: 6C/12T

Skylake-E Core i7-7600K: 8C/8T

Skylake-E Core i7-7770K: 8C/16T

Chipset: X119 (High End), X115 (mid range), X111 (low end)


A four-core i3 with HT?? We can dream, but I don't think so. Same goes for a 6C/12T i5.
 

crashtech

Lifer
Jan 4, 2013
10,666
2,270
146
Dreaming has not yet been outlawed here. I like the vision even while pessimistic about actual outcomes.
 

DrMrLordX

Lifer
Apr 27, 2000
22,560
12,426
136
Considering how rare those P models are, I have a feeling that the GPU transistors are easier/cheaper and don't cause yield problems as much. So you can't really compare a 4+3e with a theoretical 6+1. Plus, the purpose of the mainstream line is mobile, and I am still not sure a 6-core would be doable on mobile even at 14nm.

With Intel intent on killing off the dGPU, I imagine the % of the die given to the GPU on the mainstream line is only going higher. And if you want more than 4 Intel cores, you will have to pay for the Server line.

Agreed. Furthermore, I'll reiterate a point I've made in another thread: that iGPU won't stay useless for much longer. Current-gen Intel iGPUs aren't good for much outside of a cheap display option, but in the future . . .
 

DeathReborn

Platinum Member
Oct 11, 2005
2,786
789
136
Removal of the iGPU?

In an instance where my dGPU card failed on me, this was a lifesaver, letting me use my desktop for a bit while the card was out for RMA. I would not discount the iGPU too soon in many cases.

But this is one instance where I actually appreciated the iGPU; another is how its integration has led to the interplay between CPU and GPU on the mobile end, in laptops and hybrids.

If they really need an IGP, they can use LGA 2011 and dedicate some of those pins to an off-die but on-package IGP, or go back to having an IGP on the motherboard with some dedicated VRAM, like AMD used to. There are so many ways to keep an IGP without impinging on the CPU's available real estate.

As for HSA etc., Intel have the interconnects to make it viable, but it's cheaper & easier for them to ignore enthusiasts/gamers & focus on lowest-common-denominator customers.

I still have ISA, PCI, AGP, & some PCI-e graphics cards in a drawer just in case I have need for them.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Agreed. Furthermore, I'll reiterate a point I've made in another thread: that iGPU won't stay useless for much longer. Current-gen Intel iGPUs aren't good for much outside of a cheap display option, but in the future . . .

For a dual-core (Celeron, Pentium, Core i3) desktop I could see a GT3 or GT4 iGPU being useful (or attractive) for low-detail-setting gaming, provided memory bandwidth is sufficient (and affordable enough) and Intel doesn't price it like AMD does their A10 APUs.

With that mentioned, I must say I am skeptical about the price of future dual core GT3/GT4 processors and the fast memory kits that will be required to run them without bottlenecking the graphics.

Now as far as unlocked Intel quad cores go, I'll bet the vast majority of users will use a discrete card for quite some years to come. Therefore I question a larger iGPU on the quad-core dies much more than on the dual-core dies.

(Fact remains, good-performing video cards with a relatively small amount of GDDR5 (2GB, etc.) are actually quite affordable. Integration of CPU and GPU should lower costs, but it seems other factors are at work preventing what should be an increased value from happening.)
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
22,560
12,426
136
Well, the point I was making is that, in the future, OpenCL 2.0/HSA/DX12 are going to rock the boat quite a bit to the point that we will all come to appreciate iGPUs for more than just display reasons.
 

zebrax2

Senior member
Nov 18, 2007
974
66
91
I don't see Intel being in a hurry to change their current lineup. Their biggest competitor at this point is themselves, and creating a cheap quad (or a quad with HT) would only hit the sales of the higher-tier CPUs.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
I don't see Intel being in a hurry to change their current lineup. Their biggest competitor at this point is themselves, and creating a cheap quad (or a quad with HT) would only hit the sales of the higher-tier CPUs.

It would be a lot cheaper for Intel to make though.....and Intel could potentially sell more of them if the ASP could be adjusted downward.

I'm thinking (looking at the LGA 1150 die below as an example) that a quad core without an iGPU would be about as cheap (or slightly cheaper) to make as a Core i3 dual core with a GT2 iGPU, but Intel could sell it for a lot more than that Core i3. (It looks like two cores with cache take up slightly less die area than the GT2 iGPU shown in the die shot below.)

Intel-Haswell-Core.jpg
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
16,230
6,694
136
Now as far as unlocked Intel quad cores go, I'll bet the vast majority of users will use a discrete card for quite some years to come.

Not really, when you consider OEM sales. Looking at Dell's website, the i7 desktops are 50-50, but the i5s are 10-90. And on laptops, you want the IGP for Optimus even if the machine comes with a discrete chip.

It would be a lot cheaper for Intel to make though.....and Intel could potentially sell more of them if the ASP could be adjusted downward.

The die cost savings wouldn't make up for the cost of the mask/validation/etc.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Not really, when you consider OEM sales. Looking at Dell's website, the i7 Desktops are 50-50 but the i5's are 10-90. And on laptops, you want the IGP for Optimus even if it comes with a discrete chip.

Just to clear things up:

The mainstream (with iGPU) still exists.

But the enthusiast mainstream quad core (e.g., Devil's Canyon) loses the iGPU and moves to LGA 2011-style sockets.