The future of Mainstream Enthusiast Desktop: Removal of the iGPU?


mikeymikec

Lifer
May 19, 2011
18,417
11,032
136
I remember years ago reading of people who refused to ever buy a processor with integrated graphics, like it was an affront to their honor or something. It was as if Intel including it on the processor was an insult. Disabling it wasn't good enough, it had to not be there at all. I wonder how well that position has held up.

I suspect at the time I would have been worried that it would drive up CPU power requirements, cost or system temps. My reaction wouldn't have been as strong as you suggest, but I think I would have been a bit sceptical.

I didn't trust onboard graphics for a long time (bear in mind I saw it in its infancy on the PC, e.g. ATI 1MB onboard graphics with shared memory, which performed horribly at even the basics compared to almost any graphics card), until AMD started offering dedicated graphics RAM (rather than shared system memory only) for onboard graphics. Since then I've been OK with it for the average user.

These days I don't see the point in getting rid of it now that it's implemented well enough for the basics. I like being able to fall back to onboard graphics for emergencies or testing in case my graphics card fails.
 

Exophase

Diamond Member
Apr 19, 2012
4,439
9
81
Intel makes and will continue to make CPUs with only a relatively small amount of area dedicated to the IGP, for the large set of customers who don't care about gaming. Since enthusiast customers are paying a premium for a high-bin part anyway, they're really not being charged that much extra for having that small IGP on the die.

It's nice to have the IGP there even for dGPU owners, never hurts to have a backup.
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,989
440
126
Looking at how much die area is allocated to what, in just a few years the chips most people buy for their desktop/laptop PCs will be GPUs with integrated CPUs... :biggrin:
 

NTMBK

Lifer
Nov 14, 2011
10,297
5,289
136
I suspect at the time I would have been worried that it would drive up CPU power requirements, cost or system temps. My reaction wouldn't have been as strong as you suggest, but I think I would have been a bit sceptical.

But it effectively has. Instead of improving CPU performance, Intel has been using more and more of its transistor budget on the iGPU. For anyone who doesn't use the IGP, performance/$ has stagnated since Sandy Bridge.
 

NTMBK

Lifer
Nov 14, 2011
10,297
5,289
136
Intel makes and will continue to make CPUs with only a relatively small amount of area dedicated to the IGP, for the large set of customers who don't care about gaming.

[Attached image: 2.jpg]


Over half of that die is GPU.
 

crashtech

Lifer
Jan 4, 2013
10,573
2,145
146
1. Eliminating iGPU for enthusiast mainstream desktop makes a lot of sense to me.
I don't think this will make sense to Intel, because they'd have to adapt a HEDT die, which is after all basically an E5 Xeon, to the mainstream socket.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I don't think this will make sense to Intel, because they'd have to adapt a HEDT die, which is after all basically an E5 Xeon, to the mainstream socket.

Just to clear things up:

The iGPU-less quad core would be used in what is currently called the HEDT socket (i.e., an LGA 2011-style socket), not the mainstream LGA socket.

To lower the price of what is currently called HEDT, there would be a lower-cost, feature-reduced chipset.

Using current technology (simply as an analogy), think Haswell-E quad core with an X95 (just a fictional name I made up for the lower-cost, feature-reduced chipset) rather than the X99 chipset.
 

Erenhardt

Diamond Member
Dec 1, 2012
3,251
105
101
No flame intended, but AMD has been doing just that since the Llano APU. Another example of AMD leading the industry.
 

crashtech

Lifer
Jan 4, 2013
10,573
2,145
146
Just to clear things up:

The iGPU-less quad core would be used in what is currently called the HEDT socket (i.e., an LGA 2011-style socket), not the mainstream LGA socket.

To lower the price of what is currently called HEDT, there would be a lower-cost, feature-reduced chipset.

Using current technology (simply as an analogy), think Haswell-E quad core with an X95 (just a fictional name I made up for the lower-cost, feature-reduced chipset) rather than the X99 chipset.
Well, the main reason for 2011 pins is quad-channel memory, which is a bit excessive for the mission you describe. Do you see this imaginary X95 board only making use of two channels? Lots of pins would go unused that way.
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,989
440
126
If the GPU part is becoming an ever increasing part of the APUs in computers compared to the CPU part, perhaps AMD will have an advantage since it has a more efficient GPU than Intel.
 

DeathReborn

Platinum Member
Oct 11, 2005
2,767
766
136
But it effectively has. Instead of improving CPU performance, Intel has been using more and more of its transistor budget on the iGPU. For anyone who doesn't use the IGP, performance/$ has stagnated since Sandy Bridge.

This has always been my issue with on-die graphics: all that space and transistor budget not being used to push CPU performance through IPC, cache or cores.

Well, the main reason for 2011 pins is quad-channel memory, which is a bit excessive for the mission you describe. Do you see this imaginary X95 board only making use of two channels? Lots of pins would go unused that way.

If only Intel could still use those 2 spare channels for, say, a RAM drive or something, or maybe even design a chip where some pins can be used for an SSD-type storage channel. It'd be handy having an extremely fast, low-latency connection directly to the CPU for an OS etc...
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
If the GPU part is becoming an ever increasing part of the APUs in computers compared to the CPU part, perhaps AMD will have an advantage since it has a more efficient GPU than Intel.

Not as long as they have an anemic CPU. And continuing to fall further behind on process nodes.
 

mikeymikec

Lifer
May 19, 2011
18,417
11,032
136
But it effectively has. Instead of improving CPU performance, Intel has been using more and more of its transistor budget on the iGPU. For anyone who doesn't use the IGP, performance/$ has stagnated since Sandy Bridge.

<shrugs> Intel are still kicking ass, relatively speaking.

Performance / $ would have to stagnate quite a bit more before AMD are back in the game on the CPU front.
 

NTMBK

Lifer
Nov 14, 2011
10,297
5,289
136
<shrugs> Intel are still kicking ass, relatively speaking.

Performance / $ would have to stagnate quite a bit more before AMD are back in the game on the CPU front.

AMD are kind of irrelevant to this discussion. (As they are to the high performance market, badum-tish.) Who do Intel really need to convince to buy their latest high performance CPUs? The people who bought a 2500k three and a half years ago.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
Then what would you use? I have a 4770 non-K in this office box and I naturally use the HD 4600, which is more than serviceable for porn and Chrome. If I bought a cheapo $30 dGPU it would likely be slower and suck more power. Eh? Plus another part that is prone to failure. People who buy HEDT are a tiny segment.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Well, the main reason for 2011 pins is quad-channel memory, which is a bit excessive for the mission you describe. Do you see this imaginary X95 board only making use of two channels? Lots of pins would go unused that way.

It should have the quad channel (which I imagine would be good to have for a potential octo-core upgrade in the future), but maybe other features (SATA ports, USB, PCIe lanes, etc.) could be removed or cut down in the hypothetical X95 chipset.
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,989
440
126
Not as long as they have an anemic CPU. And continuing to fall further behind on process nodes.

Their CPUs keep improving and will soon be "good enough" for most average users. Then the focus shifts to the GPU, where AMD beats Intel. And Intel's node advantage is shrinking. Just look at their 14 nm woes.
 

NTMBK

Lifer
Nov 14, 2011
10,297
5,289
136
People who buy HEDT are a tiny segment.

I understand the reasoning entirely. We are a tiny niche in the grand scheme, so we get either recycled laptop parts (LGA1150) or recycled workstation parts (LGA2011). But you can easily see why people in this niche feel a little neglected.
 

crashtech

Lifer
Jan 4, 2013
10,573
2,145
146
Cogitating upon imaginary dies and chipsets, I find my mind wandering towards a CPU that would fit quite well into a niche between mainstream and HEDT, namely a hexcore with HT disabled. i5-5670K?
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
People who buy HEDT are a tiny segment.

But if Intel opens up the options to include lower-priced offerings, I'd imagine the number of owners would increase.

If this happens, maybe it won't be called HEDT anymore... maybe just LGA desktop?

(I guess it depends on what happens to the laptop/iGPU-based dies currently used for mainstream desktop. Will they become more optimized for mobile over time and eventually go strictly BGA?)
 

mikeymikec

Lifer
May 19, 2011
18,417
11,032
136
AMD are kind of irrelevant to this discussion. (As they are to the high performance market, badum-tish.) Who do Intel really need to convince to buy their latest high performance CPUs? The people who bought a 2500k three and a half years ago.

That's a segment of the market, and probably not a very big one (people who upgrade every three years). IMO most people upgrade because they need to, or they want a new toy. Both of those groups either buy what's put in front of them or they listen to what other people say.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
With Haswell 4+2 (quad core + GT2, i.e. Core i7 4770K/4790K) we have come to the point where we could have a 6-core for the same die size.

That means we could have 4C/8T at Core i5 4670K price or 6C/12T at Core i7 4790K price.

I don't believe we need to say anything more; iGPUs are taking valuable space that could be used for more CPU performance.

At 14nm we could have a 6-core die at less than 100mm^2. Even with a 50% higher cost for the new process, it would still be cheaper than a 177mm^2 die at 22nm. ;)
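
A back-of-the-envelope check of that cost claim, as a minimal Python sketch. The area figures come from this thread; the 50% cost premium per mm^2 on the newer node is an assumption, and yield, packaging and test costs are ignored:

```python
# Relative die cost comparison under simplified assumptions:
# cost scales linearly with area, and 14nm costs ~50% more per mm^2 than 22nm.
area_22nm_4c_gt2_mm2 = 177.0   # Haswell 4+2 (4C + GT2) die size, from the post
area_14nm_6c_mm2 = 100.0       # hypothetical iGPU-less 6-core at 14nm, upper bound from the post
cost_ratio_14nm_vs_22nm = 1.5  # assumed +50% cost per mm^2 on the newer node

cost_22nm = area_22nm_4c_gt2_mm2 * 1.0
cost_14nm = area_14nm_6c_mm2 * cost_ratio_14nm_vs_22nm

print(f"22nm 4C+GT2 die:       {cost_22nm:.0f} relative units")
print(f"14nm 6C (no iGPU) die: {cost_14nm:.0f} relative units")
# 177 vs 150: the smaller 14nm die still comes out cheaper under these assumptions.
```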

[Attached image: 2.jpg]
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
Their CPUs keep improving and will soon be "good enough" for most average users. Then the focus shifts to the GPU, where AMD beats Intel. And Intel's node advantage is shrinking. Just look at their 14 nm woes.

Let's hope they still make CPUs when they reach that performance level.

Intel's node lead is expanding, no matter how you wish to twist it. Especially against companies that can't afford 20nm and below.

The next AMD APU will be on 28nm vs 14nm Intel Broadwell and Skylake CPUs.
 

Fjodor2001

Diamond Member
Feb 6, 2010
3,989
440
126
[Attached image: IntelDieSize.png]


And if we had the same die size as SB at ~215 mm^2 but on 14 nm, we could have 12C/24T including an iGPU.

Or if we skipped the iGPU and had the ~300 mm^2 die size of Lynnfield but on 14 nm, we could have 18C/36T. :)

And remember, both SB and Lynnfield were mainstream CPU dies (the latter a "performance CPU", but still).
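
A rough sketch of how those core counts fall out, reusing the "6 cores in ~100 mm^2 at 14nm" estimate from earlier in the thread and assuming purely linear area scaling (a simplification; uncore, I/O and any iGPU don't shrink with core count):

```python
# Core-count extrapolation from die area, under purely linear scaling.
mm2_per_core_14nm = 100.0 / 6   # ~16.7 mm^2 per core, from the "6C in ~100 mm^2" estimate

dies = {
    "Sandy Bridge 4C+GT2 (~215 mm^2)": 215.0,
    "Lynnfield 4C, no iGPU (~300 mm^2)": 300.0,
}

for name, area_mm2 in dies.items():
    cores = area_mm2 / mm2_per_core_14nm
    print(f"{name} of 14nm silicon -> ~{cores:.0f} cores")
# ~13 cores in a SB-sized die (or ~12 plus a small iGPU), ~18 in a Lynnfield-sized die.
```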
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,361
136
The next AMD APU will be on 28nm vs 14nm Intel Broadwell and Skylake CPUs.

Well, according to the latest news it seems that AMD will be able to release a new 28nm HDL APU in early 2015. Intel, on the other hand, will not be able to release a 2+3 14nm part until the end of H1 2015 at the earliest. :rolleyes:

It will be ownage to have a faster iGPU at 28nm vs 14nm :biggrin: