Intel Iris Pro 6200 is something else

Feb 19, 2009
10,457
10
76
While a lot of people around here won't be affected, this is big news: a LOT of people game on comparable systems now, just with higher-powered dGPUs. This will make a lot of laptop gamers happy - the typical ones without high-end mobile GPUs.

The question is price. Will $279 raise the cost of entry-level gaming notebooks? Yes, it will.

Will a comparably priced notebook with a Maxwell dGPU offer better performance? Yes, it will.

What's the advantage? Lower overall TDP. Does it matter? Yes if they game on battery, no if they game plugged into the wall.

There's one niche where I see it as an advantage: Ultrabooks. With their slim profile, that performance at low TDP is a huge bonus, and Ultrabooks can be priced higher due to form-factor appeal.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
The question is price. Will $279 raise the cost of entry-level gaming notebooks? Yes, it will.

Will a comparably priced notebook with a Maxwell dGPU offer better performance? Yes, it will.

What's the advantage? Lower overall TDP. Does it matter? Yes if they game on battery, no if they game plugged into the wall.

There's one niche where I see it as an advantage: Ultrabooks. With their slim profile, that performance at low TDP is a huge bonus, and Ultrabooks can be priced higher due to form-factor appeal.

I'm not sure about that first one. A similar CPU + dGPU combo is not going to be much cheaper, and it's routinely purchased for gaming.
 

shady28

Platinum Member
Apr 11, 2004
2,520
397
126
The car analogy doesn't really work. Sure, it's probably true for the Big 3 and their Japanese and Korean counterparts, but companies like Ferrari and Lamborghini don't sell any mainstream cars, only ultra-high-end performance models.

Ferrari is owned by Fiat, which also owns Chrysler/Dodge.

Lamborghini is owned by Volkswagen via its subsidiary Audi.

Could go on with that: Jaguar went to Ford (and later Tata), Bentley to Volkswagen, Rolls-Royce to BMW, etc.

Most of these companies were absorbed by the larger mass market auto groups for a reason, and it wasn't because they were viable and profitable in and of themselves.

Anyway, it's worth pointing out that (as you note in the first paragraph) the GTX 970 is basically a mainstream GPU; sales are massive by all accounts, far exceeding even Nvidia's expectations. And assuming the Broadwell ~= GTX 750 comparison is roughly accurate, you need to at least triple Broadwell's performance to get close to GTX 970 levels.

And while the GTX 980 Ti is no doubt a much lower-selling card, sales aren't negligible; it's reported as being sold out almost instantly everywhere.


I agree the 970 is a midrange price bracket card - barely.

People lose their perspective because of all the hype on these high end cards.

In point of fact, when you look at the Steam hardware surveys, the GTX 980 gets a whopping 0.69%. Yes, two-thirds of one percent, on a survey that is almost certainly weighted toward the high end.

There are more Radeon 5450s in use on Steam than 980s, more GT 430s, more GT 640s, etc.

This super high priced, high end mess is relatively new. Go back to the days of the GeForce 256 - a $299 card.

This is what AT said about it in Dec 1999 :

"The DDR GeForce is everything we expected from the original GeForce, unfortunately its high price tag [edit: $299] will keep it out of the hands of many. "

So in real dollar terms, $299 then = about $350 now.

These $500+ cards are interesting, but it's not where the money making machine is.

If Intel starts really eating up the bottom, like obsoleting the R7 250 or the GT 740 / GTX 750, AMD and Nvidia will literally be getting eaten feet first. Intel will not need to get anywhere near GTX 970 levels to kill those companies.

If they get to 750 Ti levels, it will kill off the entire AMD R7 line and blow the bottom end out of Nvidia's lineup (look how many x30, x40, and x50 cards are sold).

It would probably kill off the R9 270/270X and GTX 960 as well - after all, they're faster, but not so much to justify the expenditure for most people.

And once you get past that point, it's just the high end peanuts left. AMD might actually outlive Nvidia in that scenario, because AMD has its APUs.
 
Feb 19, 2009
10,457
10
76
I'm not sure about that first one. A similar CPU + dGPU combo is not going to be much cheaper, and it's routinely purchased for gaming.

Not cheaper, but faster for sure.

Outside of ultra-thin profiles and low-TDP constraints, it makes little sense being so expensive. You or I won't go for entry-level gaming notebooks, but they sell very well, and perf/$ matters there.

As I said, if Intel made a cheap i3 + Iris Pro (~$175), it would instantly kill all of AMD's APUs - bang, dead.
 

MrTeal

Diamond Member
Dec 7, 2003
3,919
2,708
136
The question is price. Will $279 raise the cost of entry-level gaming notebooks? Yes, it will.

Will a comparably priced notebook with a Maxwell dGPU offer better performance? Yes, it will.

What's the advantage? Lower overall TDP. Does it matter? Yes if they game on battery, no if they game plugged into the wall.

There's one niche where I see it as an advantage: Ultrabooks. With their slim profile, that performance at low TDP is a huge bonus, and Ultrabooks can be priced higher due to form-factor appeal.
This is exactly the reason I'm interested in Skylake's IGP. I have an Asus N56VJ-DH71 that has an i7-3610QM and a GT 635M. I could see replacing it this fall with something like the UX501, when they do a Skylake refresh. For the foreseeable future discrete GPUs won't be on 14/16nm, so I don't see the 960M in that (aka 750Ti) getting a refresh until Pascal hits in 2016.

A top-end Skylake Iris Pro could give 960M levels of performance or better, along with lower power consumption. The GT4e version might cost a little more than the GT2 version of an i7, but likely not any more than the extra cost of the 960M. In either case, it doesn't matter much in a $1500 laptop if it gives an extra 1/2 hour or hour of battery life.

The big challenge is going to be convincing their partners to forgo the dGPU checkbox if GT4e really does end up being the superior solution.
 
Feb 19, 2009
10,457
10
76
The big challenge is going to be convincing their partners to forgo the dGPU checkbox if GT4e really does end up being the superior solution.

Come on, it's Intel we're talking about. $$$$

For a few years now, ASUS's Ultrabook range has had no dGPU options. That's the niche for Intel's iGPUs, thanks to their low TDP. They own that.
 

Shivansps

Diamond Member
Sep 11, 2013
3,918
1,570
136
No dGPU also makes for a cheaper motherboard and cheaper cooling, so I'm not so sure about the final price compared to a dGPU option.
 

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
In point of fact, when you look at the Steam hardware surveys, the GTX 980 gets a whopping 0.69%. Yes, two-thirds of one percent, on a survey that is almost certainly weighted toward the high end.

There are more Radeon 5450s in use on Steam than 980s, more GT 430s, more GT 640s, etc.

I think you're giving the Steam survey too much weight. Participation is optional, so it's likely that the more experienced users are the ones who opt out (for privacy reasons). And not all gamers use Steam, though a substantial portion do.

Microsoft made the same mistake with Windows 8, assuming they could get rid of the Start Menu because Customer Experience Improvement Program telemetry indicated that few people used it. Well, it turned out that a lot of experienced users turned that crap off for privacy reasons, and many companies blocked it via Group Policy, so the telemetry didn't reflect real-world usage.

Anyway, if you look at the Steam survey, the most popular discrete cards are the GTX 760 (2.61%) and, in second place, the GTX 970 (2.41%). If Intel can't beat the 970, they're not going to convince anyone who's halfway serious about gaming. That doesn't mean the chips can't be useful for other purposes, or for casual gamers - they're fine for that - but they're not going to make discrete cards go away without competitive performance.

This super high priced, high end mess is relatively new. Go back to the days of the GeForce 256 - a $299 card.

This is what AT said about it in Dec 1999 :

"The DDR GeForce is everything we expected from the original GeForce, unfortunately its high price tag [edit: $299] will keep it out of the hands of many. "

So in real dollar terms, $299 then = about $350 now.

No, according to the official US government CPI website, $299 in 1999 is equivalent to $424.63 today.
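Quick sanity check on that in Python. The ~1.42 multiplier below is just the ratio of the two dollar figures quoted here, not a value pulled from the official CPI tables:

```python
# Cumulative CPI factor implied by the figures above ($299 in 1999 -> $424.63 now).
CPI_FACTOR_1999_TO_NOW = 424.63 / 299  # ~1.42

def adjust_1999_dollars(price_1999: float) -> float:
    """Convert a 1999 dollar amount to present-day dollars using that implied factor."""
    return round(price_1999 * CPI_FACTOR_1999_TO_NOW, 2)

print(adjust_1999_dollars(299))  # -> 424.63
```

So a "$350 now" figure would imply only ~17% cumulative inflation since 1999, which is far too low.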


These $500+ cards are interesting, but it's not where the money making machine is.

If Intel starts really eating up the bottom, like obsoleting the R7 250 or the GT 740 / GTX 750, AMD and Nvidia will literally be getting eaten feet first. Intel will not need to get anywhere near GTX 970 levels to kill those companies.

If they get to 750 Ti levels, it will kill off the entire AMD R7 line and blow the bottom end out of Nvidia's lineup (look how many x30, x40, and x50 cards are sold).

Hold on a minute. You're confusing sales volume with profit. It's entirely possible that the $500+ cards represent only a small percentage of sales, but have far higher profit margins than the cheap, high-volume cards. If it only costs Nvidia $300 to manufacture a Titan X, then that's $700 profit on each card rolling off the assembly line. How many crappy, dirt-cheap Fermi rebrands would they have to sell to make that much? 20? 100?
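To put rough numbers on that question: the $999 price is real, the $300 build cost is the assumption from the paragraph above, and the per-unit margins on cheap cards are purely hypothetical placeholders:

```python
# Margin math from the paragraph above: $999 Titan X, assumed ~$300 to build.
TITAN_X_PRICE = 999
TITAN_X_BUILD_COST = 300  # assumed, per the post
titan_margin = TITAN_X_PRICE - TITAN_X_BUILD_COST  # ~$700 per card

# Purely hypothetical per-unit margins on cheap, low-end rebrands:
for low_end_margin in (10, 20, 35):
    units = titan_margin / low_end_margin
    print(f"${low_end_margin}/unit margin -> ~{units:.0f} cheap cards per Titan X")
```

Under those assumptions the answer lands right in the "20? 100?" range: somewhere between ~20 and ~70 low-end cards per Titan X sold.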

Even if they sell briskly to OEMs and inexperienced users, these cards probably play a negligible role in profitability. The fact that they almost never get refreshed and often use architectures generations out of date is evidence of that. If these cards were big money-makers, then they wouldn't be so neglected by the companies. They're neglected because they don't matter, and both AMD and Nvidia are already well aware that these bottom-of-the-barrel offerings will eventually be supplanted by iGPUs.

You're also ignoring the professional market. How are iGPUs supposed to replace a professional render farm or an HPC installation? Even high-end desktop users with AutoCAD or SolidWorks often need more than an iGPU can offer. And the professional GPUs are far more profitable per-unit than the consumer cards. We know that a $999 Titan X must be making big profits for Nvidia; imagine what kind of profits per unit the $4,999 Quadro M6000 (same GPU) must be raking in.

Tahiti and GK110 were both designed as compute-first chips, but worked fine for gaming. We may see more designs like this if the low end is eaten away by iGPUs. The actual need for more GPU power among professional users as well as gamers isn't going to go away.
 

coercitiv

Diamond Member
Jan 24, 2014
7,359
17,443
136
The question is price. Will $279 raise the cost of entry-level gaming notebooks? Yes, it will.
Where did you come up with that figure? The price difference for Iris Pro in Intel's mobile SKU lineup is around $50. A smaller PCB and lower overall TDP also lower costs, which cuts into that $50 premium even before adding the cost of the dGPU.
 

Maxima1

Diamond Member
Jan 15, 2013
3,549
761
146
The only question is whether OEMs are going to use it in their models. A Skylake quad i5 with GT4e would be pretty sweet in a cheaper model.
 

MisterLilBig

Senior member
Apr 15, 2014
291
0
76
I just don't see Intel making discrete GPUs obsolete any time in the foreseeable future.

Probably when we reach 16K resolutions, for those of us with the monitor close to our faces. Once iGPUs can push 1080p-4K at high FPS for a longer-distance couch setup, it's debatable whether anything else would make a difference.
 

coercitiv

Diamond Member
Jan 24, 2014
7,359
17,443
136
What should I be comparing it to? Because those are 15-47W prices, and the benchmark from AT is at an even higher TDP.
This is the list of Haswell mobile i5 SKUs. The Haswell 37W/47W models are priced either at $225 or $266, depending on top speed. The Broadwell i5 5350H is $289.

So, depending on the Haswell SKU of your choice, the price difference can be between $23 and $64, with the Broadwell CPU being faster anyway.
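For the record, those deltas fall straight out of the list prices quoted above (all figures from this post, none of them official beyond Intel's published list prices):

```python
# Intel list prices cited above.
BROADWELL_I5_5350H = 289        # Broadwell i5-5350H
HASWELL_I5_PRICES = (225, 266)  # Haswell 37W/47W i5 SKUs, by top speed

deltas = [BROADWELL_I5_5350H - p for p in HASWELL_I5_PRICES]
print(deltas)  # -> [64, 23]
```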
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Haserath, I don't think iGPUs will really be good enough until they adopt HBM and up the TDP.

DX12 should help AMD along this route, since their GCN iGPU cores can share work with GCN dGPUs easily. So an AMD Zen APU at 200W TDP with HBM would be great by itself for gaming, but plug in an AMD dGPU and you get the benefits of the dGPU along with the iGPU.

This.

HBM is the biggest thing to hit CPUs (IMHO) since the memory controller and GPU were added to the CPU package. For now, continuing to tweak the arch and keep power consumption in check will enable a lot more once HBM/HMC can start feeding it enough bandwidth.

A couple years down the road will get pretty exciting. :)
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Where did you come up with that figure? The price difference for Iris Pro in Intel's mobile SKU lineup is around $50. A smaller PCB and lower overall TDP also lower costs, which cuts into that $50 premium even before adding the cost of the dGPU.

Have you seen that logic applied to the current Haswell Iris Pro? It's worth noting from the AnandTech review of the Iris Pro 5200 that manufacturers think the added cost isn't worth the performance it offers.

That means it does not have a pricing advantage.

http://www.anandtech.com/show/7834/nvidia-geforce-800m-lineup-battery-boost/4

The last image at the link above shows the 840M offering better performance than the Iris Pro 5200 iGPU at lower power.

So it doesn't even have a perf/watt advantage. What's the advantage of Iris Pro setups? Nothing. A bad architecture explains the lack of perf/watt, but what about perf/$? Intel knows very well it's not very competitive, yet they price it that way, because... business reasons dictate their decisions.
 

coercitiv

Diamond Member
Jan 24, 2014
7,359
17,443
136
Have you seen that logic applied to the current Haswell Iris Pro? It's worth noting from the AnandTech review of the Iris Pro 5200 that manufacturers think the added cost isn't worth the performance it offers.

That means it does not have a pricing advantage.
Haswell Iris Pro was a $90 cost add, Broadwell is $56. Logic dictates it's significantly cheaper now, right?

The last image at the link above shows the 840M offering better performance than the Iris Pro 5200 iGPU at lower power.
Are we discussing the 6200 or the 5200, because as far as I saw in the reviews, there's quite a difference between the two of them.

So it doesn't even have a perf/watt advantage. What's the advantage of Iris Pro setups? Nothing. A bad architecture explains the lack of perf/watt, but what about perf/$? Intel knows very well it's not very competitive, yet they price it that way, because... business reasons dictate their decisions.
I'll take that as your opinion on the matter.

I guess we'll know more on performance and efficiency when reviews come in for the new Broadwell SKUs.
 

Shehriazad

Senior member
Nov 3, 2014
555
2
46
Intel is obviously doing the right thing, attacking the GPU market from the bottom.


Going for the high end right now is pointless anyway. Give it 5 to 10 years, and even high-end systems will slowly start phasing out dGPUs.

Why? Because APUs and SoCs are the future. Especially once we approach the sub-10nm era, it would just be the economically smart thing to do.


Sure, for super-high-end enthusiasts and supercomputers, dGPUs will still be a thing.

But AMD and Intel are both doing the right thing. I actually feel like it's going to be Nvidia that will be at a disadvantage in a few years.

Sure, they dominate dGPU share right now and have dipped their toes into cars... but what else is there? They half-heartedly try some stuff with their Tegra chips, and while the GPU is actually neat for smartphone-sized gaming, in this area Nvidia will be behind the competition for years to come.

IF AMD manages to get through its current struggles... we might end up seeing Nvidia slowly vanish into different markets and super-enthusiast-only territory.

Once Intel has a few more revisions of its iGPU and AMD's APUs start using at least 14nm and HBM2... even the desktop market will shift this way. I would bet my dog on it.



That said... iGPUs/APUs are not quite where I want them yet, but I expect them to "master" 1080p gaming (thanks to DX12 as well) by the end of 2016. (Maybe not with SSAO + 4xMSAA... but close xD)



P.S. Of course, imagine if Nvidia and Intel actually teamed up to make desktop SoCs... that stuff would just be straight-up insane. A man can dream.
 

Innokentij

Senior member
Jan 14, 2014
237
7
81
Really like this new Intel iGPU. I've never had a laptop that didn't throttle because of the GPU, and I think the performance will finally be enough for me. Will give it a shot next time I buy a laptop.
 

coercitiv

Diamond Member
Jan 24, 2014
7,359
17,443
136
I actually feel like it's going to be Nvidia that will be at a disadvantage in a few years.
Don't make your bets just yet. Once iGPUs from Intel/AMD start being viable for gaming laptops, you'll see a very different side of Nvidia: competition may not be the most efficient catalyst for innovation (cost-wise), but it sure as hell is spectacular. And make no mistake, they saw the warning signs a long time ago.

Some on this forum dismiss Nvidia's focus on power consumption as just another marketing trick in the short war versus AMD's flagships. Others might think differently. Fun times ahead.