Any Intel CPU without their iGPU?

Feb 19, 2009
10,457
10
76
Can you imagine a Skylake CPU that's the same size as the current i5/i7 (~120mm2), but instead of having 4c/8t it's actually 8c/16t, and it costs Intel the SAME to manufacture, meaning they could sell it at the same price?

Wouldn't that be better for PC users/gamers? Heck yes.

But it's bad for Intel, because they wouldn't be able to get insane profits like selling the 5960X 8c/16t for $999. ;)
 

ehume

Golden Member
Nov 6, 2009
1,511
73
91
I run a 4770k and a 4790k, both without dGPUs, because I don't game. Both show movies just fine. Both show my work just fine. Both show imaging apps just fine. I could afford dGPUs, but I don't need them, so I don't buy them. If I gamed or did CAD-CAM, I might feel differently.
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Can you imagine a Skylake CPU that's the same size as the current i5/i7 (~120mm2), but instead of having 4c/8t it's actually 8c/16t, and it costs Intel the SAME to manufacture, meaning they could sell it at the same price?

Wouldn't that be better for PC users/gamers? Heck yes.

But it's bad for Intel, because they wouldn't be able to get insane profits like selling the 5960X 8c/16t for $999. ;)

Just so you know, Intel's 4-core CPUs with iGPUs are still much smaller than their 6- and 8-core CPUs without:

http://www.anandtech.com/show/8426/...review-core-i7-5960x-i7-5930k-i7-5820k-tested

Ivy Bridge 4C + iGPU: 160mm^2
Ivy Bridge 6C (no iGPU): 257mm^2
Haswell 4C + iGPU: 177mm^2
Haswell 8C (no iGPU): 356mm^2

Despite the iGPU taking up half of the quad-core's total die area, leaving it out and doubling the core count still more than doubles the die size, suggesting the 8-core part uses more than 4x the area of the 4 cores alone.
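A quick back-of-the-envelope check on those Haswell figures (a rough sketch in Python; the "iGPU is half the die" share is this thread's estimate, not a measured number):

```python
# Back-of-the-envelope die-area math using the AnandTech figures above.
# Assumption (from this thread, not measured): the iGPU is ~half the 4C die.

haswell_4c_with_igpu = 177.0  # mm^2, quad-core die including iGPU
haswell_8c_no_igpu = 356.0    # mm^2, 8-core HEDT die, no iGPU

cores_4c = haswell_4c_with_igpu / 2  # ~88.5 mm^2 for 4 cores + uncore

ratio = haswell_8c_no_igpu / cores_4c
print(f"8C die / 4C core area: {ratio:.2f}x")  # ~4.02x, i.e. roughly 4x
```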
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Can you imagine a Skylake CPU that's the same size as the current i5/i7 (~120mm2), but instead of having 4c/8t it's actually 8c/16t, and it costs Intel the SAME to manufacture, meaning they could sell it at the same price?

The thing that most people don't get is how they make money off these CPUs. And making money is the ultimate goal of these companies.

The only way they can justify making an 8C/16T CPU is if they can make enough derivatives of it, and in the current market, that only works if the 8C/16T config is the one with the iGPU. In an alternate universe where integrated graphics didn't exist and everyone wanted fast discrete GPUs, an 8C/16T config without a GPU would make sense.

So if they decided to make an 8C/16T CPU WITHOUT an iGPU, its derivatives would serve an ever-decreasing market, one that's already very small.

That's why HEDT chips are derivatives of the server/workstation ones: they don't see the financial justification for creating new cores just for HEDT. If the rumor is true that Intel switched from solder to thermal paste partly (among a few other reasons) to lower cost, then you can imagine you'll NEVER see a dedicated iGPU-less die for the PC, where it would take additional time and money to change millions of connections in the circuitry to create a new one.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
102,414
8,356
126
Can you imagine a Skylake CPU that's the same size as the current i5/i7 (~120mm2), but instead of having 4c/8t it's actually 8c/16t, and it costs Intel the SAME to manufacture, meaning they could sell it at the same price?

why would intel sell it at the same price? people who want 8 cores are willing to pay more for those 8 cores than for 4, and so intel prices accordingly.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Yep. And if that were the case, Intel should be able to sell their 6700K for $50. Let's say it would be $25 with A9X-class clock speeds/performance, and you increase the price to make up for the higher clocks. Voila! $50!

Actually, it may be even less than that, because the A9X is larger than the 6700K.
 
Feb 19, 2009
10,457
10
76
why would intel sell it at the same price? people who want 8 cores are willing to pay more for those 8 cores than for 4, and so intel prices accordingly.

Heh, that's why I said what Intel does is great for Intel. Not so great for users. :)

Stagnation, itty-bitty next-gen gains, huge $$ segmentation for enthusiast products.
 

nenforcer

Golden Member
Aug 26, 2008
1,767
1
76
Intel has something like 60% of the graphics marketplace with their built-in IGP. It's "good enough" for the vast majority of people out there.
 

majord

Senior member
Jul 26, 2015
433
523
136
Just so you know, Intel's 4-core CPUs with iGPUs are still much smaller than their 6- and 8-core CPUs without:

http://www.anandtech.com/show/8426/...review-core-i7-5960x-i7-5930k-i7-5820k-tested

Ivy Bridge 4C + iGPU: 160mm^2
Ivy Bridge 6C (no iGPU): 257mm^2
Haswell 4C + iGPU: 177mm^2
Haswell 8C (no iGPU): 356mm^2

Despite the iGPU taking up half of the quad-core's total die area, leaving it out and doubling the core count still more than doubles the die size, suggesting the 8-core part uses more than 4x the area of the 4 cores alone.

that's because:

- The GPU was not the same area as the CPU cores on Haswell (it is on Skylake)
- Stupendous amounts of L3 cache
- A quad-channel memory controller
 

nerp

Diamond Member
Dec 31, 2005
9,866
105
106
Can you imagine a Skylake CPU that's the same size as the current i5/i7 (~120mm2), but instead of having 4c/8t it's actually 8c/16t, and it costs Intel the SAME to manufacture, meaning they could sell it at the same price?

Wouldn't that be better for PC users/gamers? Heck yes.

But it's bad for Intel, because they wouldn't be able to get insane profits like selling the 5960X 8c/16t for $999. ;)

That's not how the industry works. You don't know if it costs the same to manufacture. Plus, there's a lot more to designing, producing, packaging, marketing, and selling a chip than just fabbing it.
 

fleshconsumed

Diamond Member
Feb 21, 2002
6,483
2,352
136
Meh, the iGPU works well for everybody who doesn't game. The only reason you need discrete graphics is gaming; everybody else is just fine with the iGPU. It's only logical for Intel to include an iGPU in all of its chips because a) most consumers need it and b) it would take more resources to design and manufacture a separate SKU without one.

Between my girlfriend and me (mainly me), we have a lot of computers and laptops in our household. Only two have discrete GPUs: my main desktop, because I occasionally game, and my GF's desktop, because she likes to play The Sims occasionally, wanted a smoother experience, and we had an extra video card lying around. Every other computer, my file server, my HTPC, my tablet, her MacBook Air, they all use the iGPU, and I have zero complaints about it; in fact, I rather like not having discrete video when I don't need it.
 

sm625

Diamond Member
May 6, 2011
8,172
137
106
Before IGPs, CPUs cost about the same per unit. The IGP is awesome because there's no need for bad motherboard IGPs or discrete cards where they aren't needed. The IGP comes at no cost initially, and in use it adds probably just a few watts of draw to the rest of the CPU. While my main rig has no IGP, I do use it in all my other rigs; it's the best invention ever, and if you're not a hardcore gamer, it's a no-brainer to have one. I'm glad that every CPU has one by now.

I agree, IGPs are great. But Intel's IGP would be just as good if it didn't try to be more useful than it is. It could and should be only 2-4 EUs instead of ballooning to 12, 16, 24, etc. It only needs to be strong enough to run 2D games.
 

jhu

Lifer
Oct 10, 1999
11,918
9
81
I agree, IGPs are great. But Intel's IGP would be just as good if it didn't try to be more useful than it is. It could and should be only 2-4 EUs instead of ballooning to 12, 16, 24, etc. It only needs to be strong enough to run 2D games.

That's all well and good for desktops. But for laptops, they're decent for older games. I play Guild Wars 2 on my Core i5 6200U without any problems. With a weak iGPU or none at all, I'd have to shell out more money for a laptop with a discrete GPU, which I don't want or need.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
I love the IGP as a spare or second video card. It's good for testing purposes, or to get you by if your dGPU dies on you.

Now with DX12, we may get some more use out of it.

Not likely to get any more use out of it. DX12, when used properly, makes CPU usage more efficient, and the IGP is nowhere near powerful enough to make your CPU the bottleneck when gaming.

I do agree with your first point, though; I rather like having it. I have my secondary monitor running off the IGP, which frees up a bit of VRAM on my card, and it also gives me QuickSync, which I use often.
 
Feb 25, 2011
16,777
1,466
126
I agree, IGPs are great. But Intel's IGP would be just as good if it didn't try to be more useful than it is. It could and should be only 2-4 EUs instead of ballooning to 12, 16, 24, etc. It only needs to be strong enough to run 2D games.

Except that even 2D games - hell, even UI effects in your OS - need more GPU horsepower than that, and those requirements are always going up.
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
Except that even 2D games - hell, even UI effects in your OS - need more GPU horsepower than that, and those requirements are always going up.

I haven't encountered a need for anything more powerful than the Intel IGPs since Ivy Bridge, even when running dual 1080p displays for basic desktop needs. I'm not only speaking for myself but for the hundreds of users I've supported over the years, a handful of whom run 3 displays.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Not likely to get any more use out of it. DX12, when used properly, makes CPU usage more efficient, and the IGP is nowhere near powerful enough to make your CPU the bottleneck when gaming.

I do agree with your first point, though; I rather like having it. I have my secondary monitor running off the IGP, which frees up a bit of VRAM on my card, and it also gives me QuickSync, which I use often.

It was already demonstrated to help out in multi-adapter mode. And that was presumably an earlier, less capable Intel IGP than what we have now.

McMullen showcased the benefits for a hybrid configuration using the Unreal Engine 4 Elemental demo. Splitting the workload between unnamed Nvidia discrete and Intel integrated GPUs raised the frame rate from 35.9 FPS to 39.7 FPS versus only targeting the Nvidia chip. In that example, the integrated GPU was relegated to handling some of the post-processing effects.

http://techreport.com/news/28196/di...-shares-work-between-discrete-integrated-gpus
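For scale, the gain in that quoted demo works out to about 10%; a quick sketch of the arithmetic, using only the figures quoted above:

```python
# Frame-rate uplift in the quoted UE4 Elemental multi-adapter demo.
nvidia_only = 35.9      # FPS, discrete GPU alone
with_intel_igp = 39.7   # FPS, discrete GPU + Intel iGPU on post-processing

uplift = (with_intel_igp - nvidia_only) / nvidia_only
print(f"Multi-adapter uplift: {uplift:.1%}")  # ~10.6%
```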
 

2is

Diamond Member
Apr 8, 2012
4,281
131
106
It was already demonstrated to help out in multi-adapter mode. And that was presumably an earlier, less capable Intel IGP than what we have now.



http://techreport.com/news/28196/di...-shares-work-between-discrete-integrated-gpus

Again, not likely to happen, certainly not to an extent that will matter. A lot of things were demonstrated with DX12, and we are starting to see just how difficult it is to get that extra performance out of it, and that's before we get into the complication that is multi-adapter mode. It's the difference between what CAN be done and what WILL be done.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Again, not likely to happen, certainly not to an extent that will matter. A lot of things were demonstrated with DX12, and we are starting to see just how difficult it is to get that extra performance out of it, and that's before we get into the complication that is multi-adapter mode. It's the difference between what CAN be done and what WILL be done.

Well, you could be right. This is the last article I could find on it, and it doesn't make the future seem bright. :)

http://arstechnica.com/gaming/2016/...lly-work-together-but-amd-still-has-the-lead/
 

seitur

Senior member
Jul 12, 2013
383
1
81
A mainstream Intel CPU without an iGPU* won't happen, because economics.

What economic sense would it make to design a mainstream-priced product without an iGPU in today's computing world?


* - A disabled iGPU doesn't count, since you still have to pay for that unused iGPU silicon anyway.
 

Zodiark1593

Platinum Member
Oct 21, 2012
2,230
4
81
I agree, IGPs are great. But Intel's IGP would be just as good if it didn't try to be more useful than it is. It could and should be only 2-4 EUs instead of ballooning to 12, 16, 24, etc. It only needs to be strong enough to run 2D games.

In a desktop setting, what you describe would probably be an acceptable use case. In places where upgrades are virtually non-existent (laptops), a beefy IGP is a godsend, and the difference between good framerates at low-medium settings and running poorly no matter the settings.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
I agree, IGPs are great. But Intel's IGP would be just as good if it didn't try to be more useful than it is. It could and should be only 2-4 EUs instead of ballooning to 12, 16, 24, etc. It only needs to be strong enough to run 2D games.

That's true, but the CPU part won't change. So it'll be a 4-core/8-thread 6700K, only with 4 EUs rather than 24 EUs.

I think the reason they go big on the GPU is that's the direction computing is going. Also, with AMD going big on the GPU, they can't avoid it.

It also seems to be a strategy for them to increase ASPs. Look at where GT3e and GT4e (eventually) will be selling. ARK tells you the pricing difference is only $30-50, but you don't see such systems. You don't see a Core i5 6200U system costing $600 and a Core i5 6260U at $650; instead, you see a Core i7 6560U in $1500+ parts. Perhaps that's the direction the entire PC industry is aiming.

So the manufacturer gets to sell a $1500+ system when they wouldn't otherwise, and Intel gets to sell a $400+ CPU.

Of course, we're not saying it's a sane strategy. It's crazy! Why not create a huge ~300mm2 GT4e with 6 slices and 144 EUs at the current pricing, and give us GT3e with 48 EUs everywhere? And that's for now, not in the distant future when GT3e will be the bare minimum like GT2 is currently.
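For reference, Gen9 graphics scales in slices of 24 EUs (3 subslices of 8 EUs each), so the 144-EU what-if above is simply a 6-slice part. A quick sketch of the slice math; the ~300mm2 area is the post's own guess, not a published figure:

```python
# Gen9 EU counts scale by slice: 3 subslices x 8 EUs = 24 EUs per slice.
EUS_PER_SLICE = 3 * 8

configs = {
    "GT2 (1 slice)": 1,          # today's bare-minimum mainstream config
    "GT3e (2 slices)": 2,        # Iris, 48 EUs
    "GT4e (3 slices)": 3,        # Iris Pro, 72 EUs
    "6-slice what-if above": 6,  # hypothetical 144-EU part from this post
}

for name, slices in configs.items():
    print(f"{name}: {slices * EUS_PER_SLICE} EUs")
```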
 

ksec

Senior member
Mar 5, 2010
420
117
116
Except that even 2D games - hell, even UI effects in your OS - need more GPU horsepower than that, and those requirements are always going up.

Exactly. I don't game on PC, and haven't for a VERY long time. And I'm actually surprised more people are doing PC gaming now than before.

But yes, basic GPU acceleration. I'm surprised by the comment on Ivy Bridge graphics, because even Firefox blacklists a few old Intel drivers so they don't use the GPU. Intel graphics drivers were really bad in the old days, and are probably still not up to AMD/Nvidia standards.

And maybe Ivy Bridge really was the turning point? The Sandy Bridge GPU definitely doesn't do dual displays well.

But my point is that the Skylake GPU still isn't up to scratch if you have a 2K/3K monitor, or at least you'll need the Iris GPU to play it safe. And Intel's configuration and market segmentation mean they either don't offer it to you or you pay YET another heavy premium for it. I really wish they would make Iris standard.

Maybe, maybe 10nm Cannonlake. But Kaby Lake does not seem to offer any GPU upgrade (we shall wait and see). And 10nm will be so expensive that Intel will likely skimp on transistors again, i.e. same prices for a much smaller die.
 

escrow4

Diamond Member
Feb 4, 2013
3,339
122
106
I haven't encountered a need for anything more powerful than the Intel IGPs since Ivy Bridge, even when running dual 1080p displays for basic desktop needs. I'm not only speaking for myself but for the hundreds of users I've supported over the years, a handful of whom run 3 displays.

HEVC will be a massive headache eventually. I'm looking for a new media streamer, and I can either build an expensive HTPC ($500), import an expensive Shield ($350), or roll the dice on a drop-shipped knock-off Android box (<$100) with no support. It's ridiculous there isn't a decent fixed-function, all-round HEVC decoder in a cheap Celeron-level chip. Skylake doesn't do 10-bit, and I don't want a compromise.