Intel Iris Pro 6200 is something else


Blue_Max

Diamond Member
Jul 7, 2011
4,223
153
106
It makes sense for notebooks, for sure, but not for desktops, as the price and performance profile isn't good. For an HTPC or NUC you don't need such CPU performance, and for a gaming desktop build it lacks the gaming performance that cheap dGPUs provide.

No man's land due to the price. At least for AMD, the el-cheapo APU has a small niche.

Cost will be the deciding factor.

Still, there's something to be said for being able to get "pretty good" gaming performance out of a tiny NUC box in your bag vs. even the smallest of ITX rigs.
 

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
That is a nice PR review.

Bioshock 1080p Low, and the HD 4600 is on par with a Kaveri A10-7700. :whistle:

GTA V 720p Minimum, and the R7 240 is faster than a Kaveri A10-7850K with DDR3-2400. :p

This is an Intel thread, not an AMD thread. Do your ADF duties elsewhere please.

Member callouts are not allowed
Moderator Subyman
 
Last edited by a moderator:

JDG1980

Golden Member
Jul 18, 2013
1,663
570
136
The $150-$350 volume sellers. This would encompass the 750 Ti to the GTX 970, and the R7 260X to the R9 290. If Intel reaches that performance and proliferates it into other chips, it would begin a really fast evisceration of Nvidia's and AMD's dGPU market.

Things like the Titan X, 980 Ti, Fiji, 295X2, and other >$350 cards are tiny-market-share bragging-rights cards, sort of like the Dodge Hellcat vs. a normal Challenger. The only reason the exotic one exists is to sell more of the normal one. If normal Challengers stop selling, you won't see any more Hellcats (or Challengers). Same for these video cards.

This is something I really don't want to see happen. So, I hope Intel's iGPU does in fact suck when put up in real benchmarks.

The car analogy doesn't really work. Sure, it's probably true for the Big 3 and their Japanese and Korean counterparts, but companies like Ferrari and Lamborghini don't sell any mainstream cars, only ultra-high-end performance models.

Anyway, it's worth pointing out that (as you note in the first paragraph) the GTX 970 is basically a mainstream GPU; sales are massive by all accounts, far exceeding even Nvidia's expectations. And assuming the Broadwell ~= GTX 750 comparison is roughly accurate, you'd need to at least triple Broadwell's performance to get close to GTX 970 levels.

And while the GTX 980 Ti is no doubt a much lower-selling card, sales aren't negligible; it was reported as sold out almost instantly everywhere. And once 16nm FinFET+ comes along in late 2016 / early 2017, we should expect to see GTX 980 Ti levels of performance come down to GTX 970 price levels (with even lower power consumption). That's how it has usually worked in the past: new GPU generations mean everything gets bumped down a notch, with the new midrange equaling or exceeding the old high-end.

According to Intel's promo materials, Skylake is supposed to bring about a 60% iGPU performance improvement over Broadwell. That might put it roughly on par with Pitcairn, a midrange discrete GPU from 2012. A $200 discrete card today (GTX 960 or R9 285) can easily beat that. Even giving Intel the benefit of the doubt, it will probably be at least Cannonlake (if not later) before they manage to get iGPU performance up to GM206/Tonga levels. And by that time, thanks to FinFET+, $200 discrete cards will be providing GM204-level performance at ~125W.
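The scaling claims above reduce to simple ratio arithmetic. A quick sketch using the post's own rough figures (treating Broadwell Iris Pro as a 1.0x baseline is an assumption for illustration; none of these are measured benchmarks):

```python
# Napkin math for the iGPU scaling claims discussed above.
# All figures are the post's rough ratios, not measured data.
broadwell = 1.0              # baseline: Broadwell Iris Pro ~ GTX 750 class
skylake = broadwell * 1.6    # Intel's claimed ~60% iGPU uplift
gtx970 = broadwell * 3.0     # "at least triple" Broadwell to reach GTX 970

print(f"Skylake iGPU vs Broadwell: {skylake:.2f}x")
print(f"Remaining gap to GTX 970 class: {gtx970 / skylake:.2f}x")
```

Even taking Intel's 60% claim at face value, Skylake would still be roughly a factor of two short of the mainstream GTX 970 tier the post describes.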

On top of all this, keep in mind that Intel reserves their eDRAM-reinforced iGPUs for the most expensive SKUs. For most users, an i3 with a cheap discrete GPU is going to offer better perf/$. If Intel wants to take over the market, they need to put the best iGPU on everything but low-power laptop and HEDT/server SKUs.

I just don't see Intel making discrete GPUs obsolete any time in the foreseeable future.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Skylake will not be anywhere close to Pitcairn, just like Broadwell is nowhere close to the 750. The Broadwell 6200 is at best around an R7 250 (non-X) with GDDR5. Broadwell is around 30-40% better than Kaveri (or will be once its drivers mature) and nowhere close to the 750.
 

jkauff

Senior member
Oct 4, 2012
583
13
81
Broadwell would be an excellent CPU for 4K video. Decoding HEVC takes a lot of computing power, and the Iris Pro can do hardware-assisted decoding. At wholesale prices, it could make sense as the guts of a standalone UHD Blu-ray player when those hit the market later this year.

Why a desktop, overclockable version? The HTPC market, small as it is, will also be interested in 4K video boxes, and those of us with non-gaming desktops might be able to ditch our dGPUs. The price is high for a Haswell swap-out, but being able to reuse our Z97 motherboards saves a couple hundred dollars on an upgrade.
 
Last edited:

Phynaz

Lifer
Mar 13, 2006
10,140
819
126
Not sure I can follow you here; should we stop comparing Iris Pro with Nvidia's products as well?

Some of us are sick of his constant thread-crapping. Notice the numbers he posted weren't from the 6200 but from the 4600. That's meant to deceive.

He has always argued that an Intel CPU mated with a cheap dGPU wasn't a fair comparison with an AMD APU. Now, all of a sudden, it's okay to add a dGPU to an AMD system because he needs to support his pro-AMD position.

Goalposts moved again. Expect it to happen a few more times in this thread.

Cool it with the callouts
Moderator Subyman
 
Last edited by a moderator:

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
Not happening anytime soon. Skylake won't have it. Cannonlake, very unlikely. So the first real hope would be Icelake, but I honestly don't even believe in that.

What makes you think that? Intel supports every other VESA spec out there; they are on the board of directors, and it's part of DP. So far, there's been no indication that Adaptive Sync needs anything beyond drivers and firmware to function. NVIDIA could do it now, and so could Intel.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
What makes you think that? Intel supports every other VESA spec out there; they are on the board of directors, and it's part of DP. So far, there's been no indication that Adaptive Sync needs anything beyond drivers and firmware to function. NVIDIA could do it now, and so could Intel.

You forget it's an optional part of DP.

Nvidia hasn't done it and doesn't have any plans for it. Instead, they launched mobile G-Sync for laptops.

Even if Intel implements it, they also need driver support. And for what, 30-40 FPS minimums? Anyway, it's not going to happen anytime soon, even if they commit. Skylake + Skylake Refresh is what you get on desktops, so we are way deep into 2017 before there is even a possibility with Icelake.

If you want adaptive sync you buy AMD.
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
Some of us are sick of his constant thread-crapping. Notice the numbers he posted weren't from the 6200 but from the 4600. That's meant to deceive.

He has always argued that an Intel CPU mated with a cheap dGPU wasn't a fair comparison with an AMD APU. Now, all of a sudden, it's okay to add a dGPU to an AMD system because he needs to support his pro-AMD position.

Goalposts moved again. Expect it to happen a few more times in this thread.

I noticed this too...
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
You forget it's an optional part of DP.

Nvidia hasn't done it and doesn't have any plans for it. Instead, they launched mobile G-Sync for laptops.

Even if Intel implements it, they also need driver support. And for what, 30-40 FPS minimums? Anyway, it's not going to happen anytime soon, even if they commit. Skylake + Skylake Refresh is what you get on desktops, so we are way deep into 2017 before there is even a possibility with Icelake.

If you want adaptive sync you buy AMD.

This would be a great opportunity for AMD to implement some nice Optimus-like functionality with adaptive sync for laptops. Intel CPU and AMD GPU + adaptive sync.

Intel may just be waiting for this to all get settled and then adopt it later, leveraging the discrete GPU companies in the meantime (NV G-Sync and AMD Adaptive Sync).

I haven't heard Intel announce whether they will or won't support it. I agree, that likely means we won't see anything really soon...
 

exar333

Diamond Member
Feb 7, 2004
8,518
8
91
As long as he's citing specs for similarly priced solutions.

True.

Also, if Intel decided to release an i3 + Iris Pro for the desktop, that could be a game-changer. I'd love to see one of those at ~$150-170...
 

AtenRa

Lifer
Feb 2, 2009
14,003
3,362
136
He has always argued that a Intel CPU mated with a cheap dGPU wasn't a fair comparison with an AMD APU.

Nope, I have argued that a DUAL-core Pentium + cheap dGPU is not the same as a quad-core AMD APU, because for the same price you get lower CPU performance with the Intel CPU.
 
Feb 19, 2009
10,457
10
76
Dual cores are dead for gaming anyway, with many recent games now tanking badly on anything without 4 cores. It's quite bad in GTA V, with an i3 struggling at 28-40 fps trying to drive a Titan X, heh.

But IF Intel releases a cheap i3 + Iris Pro combo and prices it at ~$150, it would make for a great HTPC APU and really destroy the only niche for AMD's APUs: el-cheapo gaming and entry gaming notebooks.

But priced toward the $300 mark of an i5 quad core, where builds typically use dGPUs, it's stuck in no man's land.
 

Haserath

Senior member
Sep 12, 2010
793
1
81
Not sure why anyone would be excited about this. I've been waiting for a decent IGP, but this has 2.4x the EUs + architectural improvements + Crystal Well + 14nm, yet it's still less than 2x better than Haswell GT2, even if it's 65W vs. 84/88W.

It's around an R7 250 (it also loses to the 240 depending on the game), which is about half a 750 Ti. Yet the Ti is just about the same size (relative to node).

It's progress, but it's still not a viable solution except for Valve-level titles. I expected more from Broadwell.
 
Feb 19, 2009
10,457
10
76
Haserath, I don't think iGPUs will really be good enough until they adopt HBM and up the TDP.

DX12 should help AMD down this route, due to their GCN cores being able to share work with GCN dGPUs easily. So an AMD Zen APU at 200W TDP with HBM would be great by itself for gaming, but plug in an AMD dGPU and you get the benefits of the dGPU along with the iGPU.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Not sure why anyone would be excited about this. I've been waiting for a decent IGP, but this has 2.4x the EUs + architectural improvements + Crystal Well + 14nm, yet it's still less than 2x better than Haswell GT2, even if it's 65W vs. 84/88W.

It's around an R7 250 (it also loses to the 240 depending on the game), which is about half a 750 Ti. Yet the Ti is just about the same size (relative to node).

It's progress, but it's still not a viable solution except for Valve-level titles. I expected more from Broadwell.

While a lot of people around here won't be affected, the reality is this is big news, as a LOT of people game on comparable systems now, just with higher-powered dGPUs. This will make a lot of laptop gamers happy: the typical ones without high-end mobile GPUs.
 

tential

Diamond Member
May 13, 2008
7,348
642
121
Nice find, thanks! A dual core Skylake plus GT4e and DDR4, that's a serious console killer...

I've been VERY VERY excited about the GPU increases Intel has made. I'm curious, though, why they keep pushing forward. I thought they needed a small bump, maybe, but why are they so GPU-focused even now? Who is demanding that Intel continue to bump its IGP performance past where it is right now, and not by a small amount but by DRASTIC amounts?

Otherwise, I'm happy about it, because it means that soon maybe even IGPs will be able to game quite decently, or even at console levels, if Intel continues this drive to increase its graphics performance at crazy rates. It'd be pretty cool to be able to sell your video card, still be able to game on your IGP, then get a new one a month later or whenever the new one came out.
IGPs suck right now, but maybe they can be good enough to be that stopgap measure in between cards.
 
Last edited:

Haserath

Senior member
Sep 12, 2010
793
1
81
Haserath, I don't think iGPUs will really be good enough until they adopt HBM and up the TDP.

DX12 should help AMD down this route, due to their GCN cores being able to share work with GCN dGPUs easily. So an AMD Zen APU at 200W TDP with HBM would be great by itself for gaming, but plug in an AMD dGPU and you get the benefits of the dGPU along with the iGPU.

The 750 Ti is on a far inferior process and is twice as fast at 60W. The Ti is good enough for current 1080p, unlike Iris Pro. Crystal Well + DDR3 has enough bandwidth to roughly equal GM107. An ~80mm², 30W, 14nm GPU should at least be able to come close.
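That efficiency gap can be put in rough numbers. A sketch using only the estimates above ("twice as fast at 60W" for the 750 Ti vs. the 65W Iris Pro part); note the 65W figure also covers the CPU cores, so this overstates the iGPU-only gap:

```python
# Rough perf-per-watt comparison from the estimates in this thread.
iris_pro_perf, iris_pro_watts = 1.0, 65   # Broadwell Iris Pro 6200 (65W includes CPU cores)
ti_perf, ti_watts = 2.0, 60               # GTX 750 Ti: "twice as fast at 60W"

ratio = (ti_perf / ti_watts) / (iris_pro_perf / iris_pro_watts)
print(f"750 Ti perf/W advantage: {ratio:.2f}x")
```

So even on a generous reading, Maxwell on 28nm roughly doubles Broadwell's perf/W despite Intel's node advantage.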

While a lot of people around here won't be affected, the reality is, this is big news, as a LOT of people game on comparable systems now, but using higher powered dGPU's. This will make a lot of laptop gamers happy. The typical ones without high end mobile GPU's.

Yes, this may be a solution for laptops, but Intel still charges a lot for Iris Pro. Does Intel even update drivers regularly like Nvidia does?

I'm still not sure which is more efficient. I suppose Intel needs a few more generations to catch Nvidia.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Not sure why anyone would be excited about this. I've been waiting for a decent IGP, but this has 2.4x the EUs + architectural improvements + Crystal Well + 14nm, yet it's still less than 2x better than Haswell GT2, even if it's 65W vs. 84/88W.

It's around an R7 250 (it also loses to the 240 depending on the game), which is about half a 750 Ti. Yet the Ti is just about the same size (relative to node).

It's progress, but it's still not a viable solution except for Valve-level titles. I expected more from Broadwell.

Skylake will bring more than Broadwell.

Most likely much more.