Intel Iris & Iris Pro Graphics: Haswell GT3/GT3e Gets a Brand

Page 10 - AnandTech forum comments

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
It's a gradual shift. Take a look at what happened in laptops. First in Sandy Bridge Intel made integrated graphics good enough to kill the market for lower powered laptop graphics parts. Then they improved again with Ivy Bridge, and again with Haswell, and now even pretty strong midrange graphics parts like the GT650m are becoming irrelevant. Broadwell is coming with another process shrink next year, and you can bet that they'll use a lot of those new transistors to bulk out the graphics parts again. What's next to go? The 660m? And all of a sudden, NVidia is left with a market for only the Alienware-level parts- a tiny fraction of the market.

How many more chunks of the market can Nvidia stand to lose to Intel, before their market share is so small that they can't fund their R&D any more?

Haswell GT3e will compete with the GT640M, not the GT650M. Secondly, the GT650M was released more than a year ago, 14+ months before the Haswell release.

Also, the GT650M is a small 118mm2 die at 28nm, it has Optimus, and it's only 45W TDP. It is one thing to get a high score in 3DMark 11 and another to have high performance in games.

Thirdly, it is cheaper to get a GT2 Haswell + 650M than GT3e, and don't forget that both AMD and NVIDIA can lower their mobile/desktop discrete GPU prices, since 28nm wafer prices are way lower than a year and a half ago.
 

NTMBK

Lifer
Nov 14, 2011
10,458
5,844
136
Haswell GT3e will compete with the GT640M, not the GT650M. Secondly, the GT650M was released more than a year ago, 14+ months before the Haswell release.

Also, the GT650M is a small 118mm2 die at 28nm, it has Optimus, and it's only 45W TDP. It is one thing to get a high score in 3DMark 11 and another to have high performance in games.

Thirdly, it is cheaper to get a GT2 Haswell + 650M than GT3e, and don't forget that both AMD and NVIDIA can lower their mobile/desktop discrete GPU prices, since 28nm wafer prices are way lower than a year and a half ago.

Aye, this is definitely a possibility if Intel really screw up the pricing. Have we seen prices for GT3e yet?
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
They can buy an Atom instead and maintain full backwards compatibility with all their software.

Assuming that Intel will catch up with the ARM world.
Bay Trail only increases graphics performance 3x; NVIDIA's Tegra 4 is 6-8x over Tegra 3. ImgTec announced that the first Rogue devices will have more than 160 GFLOPS, and I expect Logan at 300 GFLOPS next year.

The performance increase in the ARM world is huge with every new generation. In only two years you will be able to buy an ARM SoC with nearly the same performance as the GTX 650M.
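One way to sanity-check that two-year claim is to compound the per-generation gains. Only the 300 GFLOPS Logan figure comes from the post; the Tegra 3-class baseline (~12 GFLOPS) and the GTX 650M-class target (~600 GFLOPS) are rough illustrative assumptions:

```python
# Sketch: compound per-generation GPU gains until a target is reached.
# All figures are rough assumptions for illustration, not benchmarks.

def generations_to_target(start_gflops, target_gflops, growth_per_gen):
    """Return (generations needed, final GFLOPS) once the target is met."""
    n, perf = 0, start_gflops
    while perf < target_gflops:
        perf *= growth_per_gen
        n += 1
    return n, perf

# From a Logan-class 300 GFLOPS part, one more 2x generation would land
# in the assumed GTX 650M territory (~600 GFLOPS):
print(generations_to_target(300, 600, 2.0))   # (1, 600.0)

# From a Tegra 3-class ~12 GFLOPS baseline at 2x per year it takes years:
print(generations_to_target(12, 600, 2.0))    # (6, 768.0)
```

Of course raw GFLOPS aren't game performance (memory bandwidth and drivers matter), which is the caveat raised elsewhere in this thread.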
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Aye, this is definitely a possibility if Intel really screw up the pricing. Have we seen prices for GT3e yet?

Since the quad-core GT2 will be priced at $300+, GT3 and GT3e will be priced even higher.


Core i7 3940XM HD4000 55W TDP = $1096 :eek:
Core i7 3840QM HD4000 45W TDP = $568
Core i7 3740QM HD4000 45W TDP = $378

Core i7 3520M HD4000 35W TDP = $346

I don't expect Haswell GT2 (a bigger die than IB GT2) to be cheaper. GT3 will most probably be $400-500+, with the 55W TDP GT3e at $1000??
 

mikk

Diamond Member
May 15, 2012
4,308
2,395
136
Core i7 4770K (GT2, die size close to 170-180mm2) will be at $350. The Core i7 4770R GT3 (die size 200-220mm2) will be priced higher.

Simple as that ;)


It is not that simple. Any Haswell quad-core has a die size between 180 and 190 mm², while the price varies between $150 and $350. You cannot calculate a price just from Intel's die size. That doesn't work.
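As a sketch of why that doesn't work, take the thread's own rough numbers (a ~185 mm² quad-core die, prices from $150 to $350 - not official Intel figures) and compute dollars per square millimetre:

```python
# Same silicon area, very different prices: $/mm^2 is not a constant
# you can extrapolate a price from. Figures are the thread's rough
# numbers, not official Intel pricing.

die_mm2 = 185  # midpoint of the 180-190 mm^2 range quoted above
prices_usd = {"cheapest quad-core": 150, "Core i7 4770K (rumoured)": 350}

for name, usd in prices_usd.items():
    print(f"{name}: ${usd / die_mm2:.2f}/mm^2")

spread = 350 / 150  # >2x the price at identical die area
print(f"price spread at identical area: {spread:.2f}x")
```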
 

SiliconWars

Platinum Member
Dec 29, 2012
2,346
0
0
It is not that simple. Any Haswell quad-core has a die size between 180 and 190 mm², while the price varies between $150 and $350. You cannot calculate a price just from Intel's die size. That doesn't work.

The cheapest i7 is how much?

The 4770R will not be cheaper than the 4770K.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
It is not that simple. Any Haswell quad-core has a die size between 180 and 190 mm², while the price varies between $150 and $350. You cannot calculate a price just from Intel's die size. That doesn't work.

Any Haswell quad-core, GT2 or GT3, will be closer to 200-220mm2. The Core i7 4770R will be a GT3 die. Being bigger and having a lower TDP (65W) than the Core i7 4770K (84W) will make it a higher-priced part.

The cheapest quad-core GT2 will be close to $180, not $150.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
The cheapest i7 is how much?

The 4770R will not be cheaper than the 4770K.

Any Haswell quad-core, GT2 or GT3, will be closer to 200-220mm2. The Core i7 4770R will be a GT3 die. Being bigger and having a lower TDP (65W) than the Core i7 4770K (84W) will make it a higher-priced part.

The cheapest quad-core GT2 will be close to $180, not $150.

Well, let's see. The S and T series are cheaper than the K.

All these estimates of price based on die size are just terrible.
 

mikk

Diamond Member
May 15, 2012
4,308
2,395
136
The 4770R will not be cheaper than the 4770K.

Once again, your source?

Any Haswell quad-core, GT2 or GT3, will be closer to 200-220mm2. The Core i7 4770R will be a GT3 die. Being bigger and having a lower TDP (65W) than the Core i7 4770K (84W) will make it a higher-priced part.

The cheapest quad-core GT2 will be close to $180, not $150.


First of all, the GT3 quad-core is ~235 mm² by my calculation. The point is, you make wild assumptions based only on the die size. If Intel decides to price it affordably because it is a better match for soldered BGA CPUs aimed at all-in-one systems from OEMs, they can easily do it.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I don't get why people keep forecasting the death of discrete.. the bar gets shifted, not chopped.

Because nobody buys desktops. Sales are dipping nearly 20% per quarter, for some vendors more than that.

Like it or not, the first gateway computing device for the average consumer is a tablet or MacBook/portable. For me, it was a desktop from Best Buy, and I eventually became a DIY enthusiast. Do you see the problem in this example? If nobody buys desktops, there are fewer customers for discrete. Period. It won't be immediate, but unless you're oblivious to trends, it will happen. Maybe in 5 years, maybe 10; we'll see.

The other point to consider is that iGPUs are gaining performance at such an exponential rate that eventually it will be good enough to shut traditional discrete manufacturers out of the low-to-mid range market completely. Shut out of that market, they will find it that much harder to fund R&D, leaving them relegated to the high end. As someone else said, if Intel matched the GT650M this generation, what will it be next gen? The GT770M? Eventually iGPU performance will be so high that it shuts NVIDIA out of the majority of the market, and I'm afraid that will make R&D costs prohibitively expensive and eventually necessitate an exit from the market. Nobody knows when this will happen; it could be in 5 years, 7 years, who knows. But at some point in time it will happen, and it will be a very slow and gradual shift. Look, NVIDIA knows this will eventually happen - it's the entire reason they spread their wings and entered the ARM SoC market. They know discrete won't last forever; it's good money for them for the time being, and I personally think it will last them at least 5 years. Maybe more, but it depends entirely on Intel.

The tides are turning. The market is sustainable now for hard-core enthusiasts, but this WILL NOT last forever. This is not something I necessarily like, because I enjoy desktops! We speak the same language on this matter, and this is not something I'm looking forward to. I like monster-size dies with outrageous performance. But I'm afraid we're becoming the niche, not the norm. Eventually we will either be phased out or face outrageously high costs from being such a small niche.
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
Well, let's see. The S and T series are cheaper than the K.

All these estimates of price based on die size are just terrible.

The S and T series don't have a bigger die than the K series; they are exactly the same. GT3 will have a much bigger die and double the iGPU performance of GT2.

If Intel prices them lower, say goodbye to high margins and profit.
 

Feb 19, 2009
10,457
10
76
Because nobody buys desktops. Sales are dipping nearly 20% per quarter, for some vendors more than that.

Like it or not, the first gateway computing device for the average consumer is a tablet or MacBook/portable. For me, it was a desktop from Best Buy, and I eventually became a DIY enthusiast. Do you see the problem in this example? If nobody buys desktops, there are fewer customers for discrete. Period. It won't be immediate, but unless you're oblivious to trends, it will happen. Maybe in 5 years, maybe 10; we'll see.

The other point to consider is that iGPUs are gaining performance at such an exponential rate that eventually it will be good enough to shut traditional discrete manufacturers out of the low-to-mid range market completely. Shut out of that market, they will find it that much harder to fund R&D, leaving them relegated to the high end. As someone else said, if Intel matched the GT650M this generation, what will it be next gen? The GT770M? Eventually iGPU performance will be so high that it shuts NVIDIA out of the majority of the market, and I'm afraid that will make R&D costs prohibitively expensive and eventually necessitate an exit from the market. Nobody knows when this will happen; it could be in 5 years, 7 years, who knows. But at some point in time it will happen, and it will be a very slow and gradual shift. Look, NVIDIA knows this will eventually happen - it's the entire reason they spread their wings and entered the ARM SoC market. They know discrete won't last forever; it's good money for them for the time being, and I personally think it will last them at least 5 years. Maybe more, but it depends entirely on Intel.

The tides are turning. The market is sustainable now for hard-core enthusiasts, but this WILL NOT last forever. This is not something I necessarily like, because I enjoy desktops! We speak the same language on this matter, and this is not something I'm looking forward to. I like monster-size dies with outrageous performance. But I'm afraid we're becoming the niche, not the norm. Eventually we will either be phased out or face outrageously high costs from being such a small niche.

PC sales have declined because rubbish Dell/HP boxes can't do anything special that ultrabooks now match or beat, and tablets/hybrids are replacing the typical usage scenario. There is less and less need for a crap PC. No doubt about this one.

But your point is off the mark, because how does this impact PC gaming? It has NO impact on PC gamers who still buy graphics cards from the GTX 650 Ti or 7790 upwards - PC gamers who demand high details in their games, running full HD. Last I checked, people kept saying PC gaming is dying as well... but it keeps proving the naysayers wrong every year. These PC gamers will still drive the demand for discrete GPUs, because iGPUs are utterly worthless in comparison to a GPU with a 150W TDP. Just because Haswell is able to come close to a 650M doesn't mean diddly squat to these PC gamers; as such, discrete GPU sales will be fine going forward. Unless you dream that somehow PC gamers will stop caring about graphics pushing the boundaries with every new generation of games. Unlikely, and we can agree on that.

Sure, iGPUs are progressing very rapidly, but let's assume these new CPUs are, what, 250mm2, with half or more of that devoted to the iGPU? Let's be generous and give the best Haswell iGPU 130mm2, with say ~50% of that being the embedded VRAM. That's about 65mm2 for the GPU itself.

It's going to have to compete with a mainstream part that's 125-150mm2, with GDDR5 not included in that die space. Let's give Intel the magic sauce and assume they can match NV/AMD in perf/mm2 (a pretty big assumption!), and let's even be generous and give them a constant one-full-node advantage. At best, Intel's 65mm2 is going to be competitive with a 130mm2 discrete IF it's running at settings that do not saturate its embedded VRAM (3DMark is light on VRAM); if gamers crank up the settings, Iris is going to system RAM and performance is going to tank, hard.

See scenarios where games fill up 2GB of VRAM these days? It happens often. So even in their best scenario, for a long time forward they will not be competitive with mainstream discrete parts. These and mid-range discrete make up the bulk of GPU sales.

Let's not pretend to predict more than a few years into the future. It becomes about as reliable as anthropogenic global warming propaganda based on computer modelling (crystal-ball gazing)... aka, worthless.
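The area arithmetic above can be written out explicitly; every input here is the post's assumption, not a measured die:

```python
# Back-of-envelope version of the area argument. All inputs are the
# post's assumptions, not die-shot measurements.

igpu_block_mm2 = 130    # generous guess for Haswell's biggest graphics block
edram_fraction = 0.5    # assumed share taken up by the embedded VRAM
discrete_mm2 = 130      # mainstream discrete die; its GDDR5 sits off-die

gpu_logic = igpu_block_mm2 * (1 - edram_fraction)
print(gpu_logic)                 # 65.0 mm^2 of actual GPU logic
print(discrete_mm2 / gpu_logic)  # the discrete part gets 2x the GPU area

# A one-node process lead roughly doubles transistor density, which at
# best cancels that 2x area deficit - hence "competitive IF the embedded
# VRAM isn't saturated", and no better.
```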
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
Just to put things in perspective, especially for the "moving goalposts" crowd:


http://jonpeddie.com/press-releases/details/add-in-board-report-Q4-2012-crash-down/


(...)

JPR found that AIB shipments during Q4 2012 behaved according to past years with regard to seasonality, but the drop was considerably more dramatic. AIB shipments decreased 17.3% from the last quarter (the 10 year average is just -0.68%). On a year-to-year comparison, shipments were down 10%.

The quarter in general


• Total AIB shipments decreased this quarter, from the previous quarter by 17.3% to 14.5 million units (see Table 1).
• Nvidia continues to hold a dominant market share position at 64%; in Q4’12, AMD shipped 4.98 million units, and Nvidia shipped 9.5 million (see Figure 2).
• Year to year this quarter AIB shipments were down 10% from 16.1 million to 14.5 million units.
• Almost 37.3 million desktop PCs shipped worldwide in the quarter, an increase of 2.7% compared to the previous quarter (based on an average of reports from Dataquest, IDC, and HSI).

(...)
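For reference, the quoted percentages check out against the unit numbers; a quick script using only the figures from the excerpt:

```python
# Cross-checking the JPR numbers quoted above; all inputs come straight
# from the press-release excerpt.

q4_2012 = 14.5e6   # Q4'12 AIB shipments
q4_2011 = 16.1e6   # year-ago quarter
qoq_drop = 0.173   # -17.3% quarter over quarter

q3_2012 = q4_2012 / (1 - qoq_drop)
print(round(q3_2012 / 1e6, 1))   # implied Q3'12 shipments: 17.5 (million)

yoy = (q4_2012 - q4_2011) / q4_2011 * 100
print(round(yoy, 1))             # -9.9, i.e. the quoted "down 10%"

# Vendor sanity check: 9.5M (Nvidia) + 4.98M (AMD) = 14.48M,
# which matches the 14.5M total to rounding.
```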

The dGPU market is already crashing, and this is even before Haswell. Once Haswell arrives, we're headed for another crash, and next year the trend continues with Broadwell.

What we're seeing now is an early symptom. Both GPU players canned their architectural refreshes this year because there is no ROI there; they will go for refreshes next year. The only reason you slow down your R&D is that you don't think the investment is worth the effort, and right now both players think it isn't.

But the deepest impact won't be felt in the next two years, as a large part of the R&D budget is already spent. Both Nvidia and AMD are already funding products that will be marketed 3-4 years down the road, and the budget for those products is bigger than for current products. Given that the dGPU market is melting even before Haswell, what do you think happens once you factor in Haswell, Broadwell and Skylake down the road? You simply don't develop bigger, more complex and more expensive products for a shrinking market.

AMD can dilute their GPU R&D among their other APU lines, one of which is decently profitable, but they are losing market share everywhere, so they are going to face a GPU R&D squeeze sooner rather than later. It's the same trend we can see in their CPU business, where they have been a day late and a dollar short for a long time now.

As for Nvidia, they are in an even worse position, because their most profitable market (Quadro) can be destroyed by AVX2 in the not-so-distant future, and they have no profitable CPU business to speak of. I do believe that their management will somehow manage this transition, but it will be a rocky ride until then.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Gaming is a small portion of the overall computing market, and sales are driven by the overall market, not a small niche. Don't be so offended by what I'm saying. I think like you: I'm a gamer and enthusiast. But I'm not oblivious to the fact that people like us are a tiny portion of the overall scheme of things, and the relatively few folks like us (compared to the vast number of people buying iPads and MacBooks) aren't enough to sustain the market forever. It will last 5-10 years; after that, R&D costs will be so high from Intel removing the low-to-mid range discrete market that traditional discrete manufacturers will need to exit the market.

This will cause one of two things to happen:

1) R&D costs will be so prohibitively high for discrete GPUs that NVIDIA/AMD will only provide extremely expensive GPUs. Low-to-mid range GPUs will not exist. Because of inflated R&D costs, prices will be much higher than they are now.
or
2) They will be completely shut out of the market by the advances Intel is making. Again, we're only a small portion of the market. You can cite gaming figures forever, but the overall computing market is driven by the mass market, not a niche, and the mass market is buying tablets and MacBooks in droves while relatively few are buying desktop parts.

The evidence is loud and clear. Discrete and desktop sales have sharply declined due to new trends in computing, and while gamers do tend to buy the high end, R&D costs are funded by discrete makers selling products into the low-to-mid range of the market, because most users aren't willing to pay $500-$1000 for a discrete GPU; their sweet spot is $200-$300. When Intel makes that discrete market obsolete, it will have a negative impact across the board for NVIDIA and AMD, and force them out of the market.

Again, is this something that I like? Absolutely not. I like huge monolithic dies with outrageous performance. But as I said, we're in a small minority now. Maybe there's a silver lining somewhere down the road that will revive desktop PC enthusiast sales, but I'm not overly optimistic about what the market will look like 7 or 8 years from now.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
The S and T series don't have a bigger die than the K series; they are exactly the same. GT3 will have a much bigger die and double the iGPU performance of GT2.

If Intel prices them lower, say goodbye to high margins and profit.

What's the die size of a 3820 vs. a 3770 or 2700?

If die size were so important, why doesn't an FX chip cost $2000? Or why aren't GPUs priced higher?

The wafer cost in itself is peanuts.

It's the same nonsense we heard for years in the GPU area, until the tables turned there.
 
Feb 19, 2009
10,457
10
76
Prices may go up since there's no longer a low-low end to provide high-volume, low-profit sales. Sure, that makes sense.

But it doesn't do anything to prevent gamers from buying better discrete cards to keep on gaming with ever-increasing graphics workloads. Note that mid-range has shifted up this generation compared to the previous one in terms of pricing, and the high end has shifted up too.

The evidence is far from clear. AIB shipments have declined because a lot of those "shipments" are utter crap going into Dell/HP crap-boxes. There is NO evidence of a decline for mainstream, mid-range or high-end. Recall the GTX 680 situation: sold out everywhere for a long time. Demand is always there as long as there are enthusiastic PC gamers.

Edit: Let's play with some silly numbers.

Assume globally there are 100 million enthusiastic PC gamers who typically buy discrete GPUs above the low-end crap. That's not many, considering WoW had 13M subscribers for many years.

Say 50% buy mainstream GPUs (even a game like WoW needs some GPU grunt in busy areas), 30% buy mid-range, and the rest buy a mixture of high-end or multi-GPU configs. That is still a LOT of potential buyers, who will be upgrading every few years to keep up with the technological improvements in gaming engines.
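Those silly numbers, made explicit (the 100 million figure and the split are of course pure assumptions):

```python
# Illustration only: both the gamer count and the tier split are the
# post's assumptions, not market data.

gamers = 100_000_000
split = {"mainstream": 0.50, "mid-range": 0.30, "high-end / multi-GPU": 0.20}

for tier, share in split.items():
    print(f"{tier}: {share * gamers / 1e6:.0f}M potential buyers")

# Even the smallest tier is ~20M people; on a 2-3 year upgrade cycle
# that is still millions of cards sold per year.
```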
 

mrmt

Diamond Member
Aug 18, 2012
3,974
0
76
The evidence is far from clear. AIB shipments have declined because a lot of those "shipments" are utter crap going into Dell/HP crap-boxes. There is NO evidence of a decline for mainstream, mid-range or high-end. Recall the GTX 680 situation: sold out everywhere for a long time. Demand is always there as long as there are enthusiastic PC gamers.

That utter crap is what let AMD and Nvidia keep the prices of their mainstream GPUs down. Once it was written off, mainstream prices went up. How do you think both AMD and Nvidia will counter their rising R&D budgets? Does a $1,000 Titan tell you something?
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Prices may go up since there's no longer a low-low end to provide high-volume, low-profit sales. Sure, that makes sense.

But it doesn't do anything to prevent gamers from buying better discrete cards to keep on gaming with ever-increasing graphics workloads. Note that mid-range has shifted up this generation compared to the previous one in terms of pricing, and the high end has shifted up too.

The evidence is far from clear. AIB shipments have declined because a lot of those "shipments" are utter crap going into Dell/HP crap-boxes. There is NO evidence of a decline for mainstream, mid-range or high-end. Recall the GTX 680 situation: sold out everywhere for a long time. Demand is always there as long as there are enthusiastic PC gamers.

At this point you're just putting your fingers in your ears and ignoring all of the trends and sales figures in computing. Don't be oblivious to the facts. The overall PC market isn't dictated by Dell/HP "crap boxes"; this applies to the PC market across the board. Desktop PCs aren't selling. Discrete sales are dipping by double-digit figures. The mass market is being driven by ultra-portable devices and tablets; it isn't being driven by gamers with $1000 GPUs.

Additionally, you're ignoring the fact that the discrete GPU market has a smaller user base to draw sales from now than it did 10 years ago. As mentioned, people aren't buying desktops in droves; they're buying tablets and ultra-portables. 10 years ago, everyone had a desktop as their first computing device, and a small percentage of THAT installed base would go to Best Buy and purchase a discrete video card. The installed desktop user base is slowly diminishing over time because very few buy a desktop as their gateway computing device.

You're also ignoring the fact that both NVIDIA and AMD subsidize their R&D costs by selling discrete cards into the highest-selling portion of that market: the $150-$250 low end. That market is fast disappearing because Intel has overtaken that territory. This isn't directed at you, but in general: ignoring trends and sales figures in this manner is just willful ignorance. Open your eyes to reality. This obviously won't happen overnight, but discrete and desktop in the long term are very questionable. I can't see the discrete GPU market sustaining itself 6 or 7 years from now, and if it does, costs will be outrageously high and exclusive to the professional crowd.
 

NTMBK

Lifer
Nov 14, 2011
10,458
5,844
136
Prices may go up since there's no longer a low-low end to provide high-volume, low-profit sales. Sure, that makes sense.

But it doesn't do anything to prevent gamers from buying better discrete cards to keep on gaming with ever-increasing graphics workloads. Note that mid-range has shifted up this generation compared to the previous one in terms of pricing, and the high end has shifted up too.

The evidence is far from clear. AIB shipments have declined because a lot of those "shipments" are utter crap going into Dell/HP crap-boxes. There is NO evidence of a decline for mainstream, mid-range or high-end. Recall the GTX 680 situation: sold out everywhere for a long time. Demand is always there as long as there are enthusiastic PC gamers.

The 680 sold out because production was terrible...

As for pricing, look in VC&G. Lots and lots of users are complaining that Titan is just too expensive for them, and they're not going near it. Now extrapolate that to a future where low and mid-range cards are dead and every dGPU costs the same as a Titan. How many of us tech enthusiasts are going to go for that?
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
The dGPU market is already crashing, and this is even before Haswell. Once Haswell arrives, we're headed for another crash, and next year the trend continues with Broadwell.

The market crashed because x86 crashed. dGPUs are tied to the x86 market: when OEMs sell fewer x86 PCs, they sell fewer dGPUs too.
On the other hand, nVidia's revenue increased in Q3 and Q4 last year despite a declining x86 market.

It has nothing to do with iGPUs, because nearly every new CPU now has one.

What we're seeing now is an early symptom. Both GPU players canned their architectural refreshes this year because there is no ROI there; they will go for refreshes next year. The only reason you slow down your R&D is that you don't think the investment is worth the effort, and right now both players think it isn't.

They are not slowing down their R&D. Both are waiting for 20nm; there is no reason to do the same thing again, it would only cost money. Right now they are doing other things - for example, nVidia designed a new low-power dGPU (GK208).
BTW: nVidia's R&D spending on discrete GPUs is at an all-time high of $900 million this year.

But the deepest impact won't be felt in the next two years, as a large part of the R&D budget is already spent. Both Nvidia and AMD are already funding products that will be marketed 3-4 years down the road, and the budget for those products is bigger than for current products. Given that the dGPU market is melting even before Haswell, what do you think happens once you factor in Haswell, Broadwell and Skylake down the road? You simply don't develop bigger, more complex and more expensive products for a shrinking market.
We've been hearing this since Sandy Bridge, and nothing has happened. Right now the biggest threat to dGPUs is ARM and the whole smartphone and tablet market.

As for Nvidia, they are in an even worse position, because their most profitable market (Quadro) can be destroyed by AVX2 in the not-so-distant future, and they have no profitable CPU business to speak of. I do believe that their management will somehow manage this transition, but it will be a rocky ride until then.
Wow, we are going back to the good old days: software graphics rendered by the CPU. :awe:
 
Feb 19, 2009
10,457
10
76
Intel has overtaken the $150-$250 GPU territory? Intel doesn't even scratch the paint on cards like the GTX 660/Ti or 78xx.

Again, you and other market surveyors have not broken down the decline in PC sales into categories. The mass market was never driven by gamers with $1000 GPUs; I never said it was.

The mass market is already driven by crap PC boxes without any discrete at all, given that regular GPU surveys put Intel at 2/3 of GPU brands. All Intel did was kill off $50-75 discrete, which was rubbish to begin with and was only included so the pamphlet could tick "2GB video memory!!"

So back to the point: unless you think the number of gamers devoted to PC gaming is going to drop significantly going forward, there's no threat at all to mainstream, mid-range or high-end discrete for a long time.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
You can always rely on sontin to have the utmost objective opinion on the matter:

https://www.google.com/#output=sear...87,d.eWU&fp=61967fd3e70e14f3&biw=1657&bih=903

With 47,000 Google search results for NVIDIA-related queries under that user name, who would have guessed? If he isn't a shareholder in NVIDIA, he has a lot of his own free time to waste. I'm not saying that in a derogatory manner, and this isn't meant to be mean... it's just funny. Let's not pretend you have any sort of objective opinion on this matter; just put it out there. Tegra 4 will obliterate Haswell. Right? Right?
 