Do you think Intel should offer dual core chips with GT4e iGPU?


Do you think Intel should offer dual core chips with GT4e iGPU?

  • Yes

  • Yes, but only in 2C/4T configuration

  • No



IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
The eDRAM costs them $3 for a 128MB chip. I would expect the L4 would be restricted to the fastest, most expensive 2C/4T desktop chip sporting it (assuming they made one).

And the 260mm2 GT3e die probably costs Intel $15 or less to make, but it goes for $600 on a mobile chip. Of course, there's far more to it than just material costs. Watch any business show: the more attractive the business, the higher the price-to-cost ratio it sells at.

Even in terms of pure costs, there's additional risk and expense in putting it on a separate die within the same package. That makes the real cost likely far more than $3.

That aside, it would be sweet to have a $120 CPU + $50 iGPU for a $170 part.

The cards seem to have pretty stable prices. It's just that the market for anything not 2-4x as fast as an IGP is almost gone, and soon will be, except as a niche for added display connectors. Right now, a GTX 750 is about the minimum worth buying.
For me, I could go from the HD 3000 iGPU on a 2600K to a $60 discounted R7 240 DDR3 that offers near Iris Pro 6200 performance, which is a 4-5x improvement. Low-end graphics still matters.

The only way that this thread makes sense is if everyone with an Intel system could just replace the chip to get the latest iGPU. Unfortunately, getting an iGPU for me means CPU+memory+mobo, which might get pretty expensive if the CPU is one with performance close to a 2600K.

Again, so what? If that's what's in people's budget, then so be it. Intel has, up 'til now, been notorious for pairing their best iGPUs with "more CPU than a gamer would need", such as the i7-5775C. GT4e on an i3 would be a step in the right direction with respect to CPU/GPU balance.
It's a very clever way to make you spend $100 extra on a CPU. I opted for a 2600K just for that, and I realize now that it probably wasn't the smartest thing to do. And based on their earnings and "APU" releases since the Gen 1 i3/i5/i7 (Nehalem), their strategy has worked.

Before, you could opt for the most brand-spanking-new integrated graphics motherboard/chipset with the lowliest of CPUs.
 
Last edited:

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
And the 260mm2 GT3e die probably costs Intel $15 or less to make, but it goes for $600 on a mobile chip. Of course, there's far more to it than just material costs. Watch any business show: the more attractive the business, the higher the price-to-cost ratio it sells at.
Whether it's $0.50 or $50, what really matters is that they can sell it to customers for much more than that cost. It's a premium feature, so having a lower-end part with said feature would devalue it, while also costing them more overall by having more distinct parts to produce.

Until it is a mainstream feature, either through reduced cost or widespread demand (which itself would come down to cost, since it's mainly there to make the IGP faster with dual-channel RAM, rather than to provide wider memory), there's little point in making cheaper, weaker parts with it.

For me, I could go from the HD 3000 iGPU on a 2600K to a $60 discounted R7 240 DDR3 that offers near Iris Pro 6200 performance, which is a 4-5x improvement. Low-end graphics still matters.
Or, you could just have a 530 or thereabouts, which was the implied fair comparison point. Small video card upgrades on old PCs are not common these days.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
Whether it's $0.50 or $50, what really matters is that they can sell it to customers for much more than that cost. It's a premium feature, so having a lower-end part with said feature would devalue it, while also costing them more overall by having more distinct parts to produce.

For the people that would buy 4C GT4e (mobile workstation users), a desktop-TDP 2C/2T GT4e with high clocks would not be a replacement.
 
Last edited:

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
For the people that would buy 4C GT4e (mobile workstation users), a desktop-TDP 2C/2T GT4e with high clocks would not be a replacement.
So, just one more nail in the coffin of that idea :). I'd like to see it on lesser chips, but like $150 chips. Doing that will require either some serious competition from AMD (unlikely), or for the implementation to be cheap enough that Intel and PC OEMs feel it will be worth it on midrange systems. For now, more options will suffice.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
So, just one more nail in the coffin of that idea :). I'd like to see it on lesser chips, but like $150 chips.

Yeah, I think the large iGPU makes the most logical sense on desktop with the lesser chips. (re: a large iGPU is not a premium feature on desktop like it is on mobile)

With that said, the main competition will be dGPUs.

How cheaply can Intel sell a GT4e desktop vs. what a lesser CPU + dGPU would cost?

Assuming Intel has the fab capacity, I would expect even a discounted GT4e to be very achievable.

I mean, just look at how cheaply ($20) Intel lists some Cherry Trail chips (87mm2 on 14nm) on Ark. So, of course, they can make a 200mm2+ 14nm chip with eDRAM for $150 (or less).

The question is: how much impact can they make against dGPUs with these chips? Quad-core desktop GT4e won't do much, because it is not very relevant to that market segment, but a 2C GT4e desktop definitely is.
 
Last edited:

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
Yeah, I think the large iGPU makes the most logical sense on desktop with the lesser chips. (re: a large iGPU is not a premium feature on desktop like it is on mobile)
That's how the gamer thinks (and I totally agree), but for Intel the iGPU is a coprocessor that provides additional features to the CPU, like OpenCL for office/Excel work, render speed-ups, and of course multimedia. So of course they tailor it to be somewhat equal to the CPU: big CPU, big iGPU; small CPU, small iGPU.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
That's how the gamer thinks (and I totally agree), but for Intel the iGPU is a coprocessor that provides additional features to the CPU, like OpenCL for office/Excel work, render speed-ups, and of course multimedia. So of course they tailor it to be somewhat equal to the CPU: big CPU, big iGPU; small CPU, small iGPU.

For the OpenCL office/Excel speed-ups, how much iGPU is typically required? Also for photo editing?

My thinking is that not much is required on desktop, but in a mobile environment GT4e is a benefit because a larger amount of GPU silicon running at lower voltage and lower clocks helps performance per watt.

P.S. I've read that for video editing/rendering, the amount of 3D effects on the timeline determines how much GPU is needed relative to CPU.
 
Last edited:

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
For the OpenCL office/Excel speed-ups, how much iGPU is typically required? Also for photo editing?
Zero is required; it can all be done on the CPU. But the more EUs (shaders/stream processors, whatever they are called) you have, the faster it will be, so again they pair higher-performance iGPUs with higher-performance CPUs.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Zero is required; it can all be done on the CPU. But the more EUs (shaders/stream processors, whatever they are called) you have, the faster it will be,

I was thinking mainly about the point of diminishing returns for GPU size (the point where extra GPU beyond a certain size yields only minor reductions in processing time).

For basic OpenCL apps like office and photo editing, I believe the point of diminishing returns happens fairly quickly (at a low level of GPU on desktop).

For 3D effects in video editing/rendering, the point of diminishing returns can occur at a higher level of GPU. The heavier the effects, the greater the benefit a larger GPU brings.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
so again they pair higher-performance iGPUs with higher-performance CPUs.

For mobile, it also makes sense to pair a larger iGPU with a quad core because, when the graphics load increases, the reduction in power available to the CPU can be spread over four cores rather than concentrated in two (which would cause a greater clockspeed drop).

Of course, in a desktop, the TDP can be set high enough that downclocking a 2C with GT4e active shouldn't happen.
 

dark zero

Platinum Member
Jun 2, 2015
2,655
140
106
Intel is practically coasting while there is no competition... they will make eDRAM mainstream when AMD and NVIDIA/ARM start to put HBM on their chips.
 

dark zero

Platinum Member
Jun 2, 2015
2,655
140
106
You do realize that HBM is expensive, right?
Actually, yes, for now... but what happens if you produce it in volume?
It will dramatically reduce prices; just look at SSDs.

That's why Pascal and Arctic Islands are the beginning of the change. Zen is on its way too, and NVIDIA's Tegra X1, based on their new architecture, is on its way as well. Also, we can't forget VIA's x86/ARM hybrid project. That could be really dangerous if AMD and NVIDIA team up with VIA to create a juggernaut.
 

Cerb

Elite Member
Aug 26, 2000
17,484
33
86
How cheaply can Intel sell a GT4e desktop vs. what a lesser CPU + dGPU would cost?
Cheap enough, I'm sure, but how would it affect their profit margins and ASPs? If they can do it, and there's demand, they'll do it... but they aren't going to undersell themselves, nor should they.

The question is: how much impact can they make against dGPUs with these chips? Quad-core desktop GT4e won't do much, because it is not very relevant to that market segment, but a 2C GT4e desktop definitely is.
Why, though? On the desktop, a quad with a Xeon label is all that could matter against dGPUs, and then only against the lowest rung of them, which aren't generally purchased for performance reasons.

Every higher-cost chip has to be compared to every lower-cost chip. Put too much good stuff into a cheaper one, and it devalues the costlier ones. While I think it's moronic to do (because it hampers widespread adoption), that's part of why Pentiums don't get AVX, for instance. It has nothing to do with what they can fab, but with making sure the i3 and i5 CPUs don't suffer lower sales due to their higher prices.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
Every higher-cost chip has to be compared to every lower-cost chip. Put too much good stuff into a cheaper one, and it devalues the costlier ones.

I don't think there is any appreciable overlap though.

A 65W (or greater) BGA 2C/2T GT4e for desktop vs. a 45W BGA 4C/4T GT4e or 45W BGA 4C/8T GT4e for mobile?

I don't think the 45W 4C GT4e sales would be affected.
 

DrMrLordX

Lifer
Apr 27, 2000
22,530
12,402
136
The idea is solid, but sadly it's like putting an F1 engine in a basic Ford model...

You mean like . . . the Ford Escort RS Cosworth?

So, just one more nail in the coffin of that idea :). I'd like to see it on lesser chips, but like $150 chips.

That's why I think, in today's environment, it would still be worth Intel's while and still be a desirable chip for the end-user at a price point of around $200 instead of $150. Not as good as $150, but it's sufficiently differentiated from other i3s that the cost model will help Intel maintain their cash flow.

Every higher-cost chip has to be compared to every lower-cost chip. Put too much good stuff into a cheaper one, and it devalues the costlier ones.

Intel is losing potential sales right now whenever someone buys an i3 or a Pentium and pairs it with a dGPU. With GT4e, they can chip away at AMD's and Nvidia's market share in a big way by giving people an alternative to that system setup. To an extent, they are already doing it, but GT4e would up the ante. They already have the product in the pipe, and much of the cost of GT4e is related to R&D. If implementing an entirely new die were too expensive to justify, then they could use a cut-down 4C/8T chip or what have you.

For people that are buying 4C/4T desktop i5s and pairing them with dGPUs, a high-clockspeed Core i3 is already a potential alternative (stuff like the i3-6320). A 2C/4T @ ~3.6-3.8 GHz + GT4e isn't going to wipe out i5 sales any more than the i3-6320 already does. It's about hurting sales of dGPUs.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,786
136
If this were true, this would imply 97.5% gross profit margins for such chips. Does this seem reasonable to you?

The $3 mentioned by Ashraff in his article is likely the pure cost of materials. In that sense, it would look similarly small for a chip like GT3e.

A rough wafer calculator shows that an 80mm2 die like the 128MB eDRAM would yield nearly 750 dies on a 300mm wafer. At $3 per die, the whole wafer would cost less than $3k. Does it make sense to you that $3 is the entire cost of making such a chip? That would mean Core M and GT2 15W Core chips cost about $3 too.

If you assume die cost increases with the square of the size, the 260mm2 chip would cost about $30. Purely by that measure, selling it for $600 means a 95% profit margin. That does not make sense either, so their real profit margin is far lower, because the die is not the only cost. That's why saying the eDRAM costs "only $3" is misleading.
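
For anyone who wants to sanity-check that arithmetic, here's a rough back-of-envelope sketch (the 80mm2/260mm2 die sizes and the $3 per-die figure are the ones quoted above; the dies-per-wafer formula is just the usual rough estimate, and it lands a bit above the "nearly 750" figure, which is close enough for this purpose):

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    # Usual rough estimate: wafer area divided by die area, minus a term
    # for the partial dies lost around the wafer edge.
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

edram_dies = dies_per_wafer(300, 80)   # ~800 candidate dies per 300mm wafer
wafer_cost = edram_dies * 3            # if each die really were only $3
print(edram_dies, wafer_cost)          # implies a processed wafer under $3k

# If die cost scaled with the square of area, a 260mm2 GT3e-class die would be:
gt3e_cost = 3 * (260 / 80) ** 2        # ~$32
margin = 1 - gt3e_cost / 600           # ~95% "margin" against a $600 mobile part
print(round(gt3e_cost), round(margin * 100))
```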

Intel would have gone bankrupt many years ago if material costs were in any way a significant contributor to what they sell chips for.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
If implementing an entirely new die were too expensive to justify, then they could use a cut-down 4C/8T chip or what have you.

I think that route (using 4C GT4e dies also for the 2C GT4e) probably makes the most sense.

4C GT4e dies that fail functional or parametric yield for mobile could be used as 2C GT4e (or 2C GT3.75e) console/desktop chips.

Or, another way of looking at it: if Intel increases production of 4C GT4e to a level beyond mobile demand, they might end up with more "cherry pick"-level samples for an even higher-end mobile bin than they would have had if they kept 4C GT4e production low.
 
Last edited:

cbn

Lifer
Mar 27, 2009
12,968
221
106
If you assume die cost increases with the square of the size

That must be the cost for a good bin (i.e., a larger die with no defects that meets all frequency/voltage targets).

For lower-end or harvested bins, I'd imagine the cost is closer to linear with an increase in die size. (Although the larger dies do have increased "edge of wafer" loss that needs to be factored in beyond functional and parametric yield.)
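
As a toy illustration of the two scaling assumptions being compared here (the $3 / 80mm2 anchor is taken from the earlier post; everything else is made up purely for illustration):

```python
# Toy comparison of linear vs. square-of-area die-cost scaling, anchored to
# the $3 / 80mm2 eDRAM figure quoted earlier. Purely illustrative numbers.
BASE_COST, BASE_AREA = 3.0, 80.0

def cost_linear(area_mm2):
    # Harvested / lower-end bins: cost roughly proportional to silicon area.
    return BASE_COST * (area_mm2 / BASE_AREA)

def cost_square(area_mm2):
    # Top bins: treat cost as growing with the square of the area, since a
    # fully working large die is disproportionately harder to get.
    return BASE_COST * (area_mm2 / BASE_AREA) ** 2

for area in (80, 160, 260):
    print(area, round(cost_linear(area), 2), round(cost_square(area), 2))
# 80  -> 3.0 either way
# 160 -> 6.0 linear vs. 12.0 squared
# 260 -> 9.75 linear vs. ~31.69 squared
```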
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I think that route (using 4C GT4e dies also for the 2C GT4e) probably makes the most sense.

4C GT4e dies that fail functional or parametric yield for mobile could be used as 2C GT4e (or 2C GT3.75e) console/desktop chips.

Or, another way of looking at it: if Intel increases production of 4C GT4e to a level beyond mobile demand, they might end up with more "cherry pick"-level samples for an even higher-end mobile bin than they would have had if they kept 4C GT4e production low.

Maybe there could even be 28W 4C GT4e (or 28W 4C GT3.75e) chips if 4C GT4e production were high enough.
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
Looking at the following chart of the 15W Skylake mobile chips:

[Chart: 15W Skylake-U i5/i7 SKUs]


I wonder if Intel could even take volume production of 4C GT4e down to 15W as well? (So 45W, 28W, and 15W 4C GT4e/GT3.75e for mobile... and anything that doesn't meet functional and parametric yield becomes our new 65W (or greater) 2C GT4e or 2C GT3.75e desktop/console chip.)

P.S. Iris 540 graphics is GT3e, but I am thinking GT4e or GT3.75e at 15W could be tuned to have even greater performance per watt and thus allow an even greater CPU clockspeed under iGPU load compared to GT3e @ 15W. Either that, or the GT4e or GT3.75e could be tuned for greater absolute performance <--- this should still work, as the power drop to the CPU cores would be spread across four CPU cores rather than the two found on lesser mobile chips.