Intel Iris Pro 6200 is something else


ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
I think if we move forward in time to 2025, you'll see NVIDIA much more occupied with non-gaming GPU applications: neural networks, cloud services, etc.

It's quite clear that one day the dGPU will disappear.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
You forget it's an optional part of DP.

NVIDIA hasn't done it and doesn't have any plans for it. Instead they launched mobile G-Sync for laptops.

Even if Intel implements it, they also need driver support. And for what, 30-40 FPS minimums? Anyway, it's not going to happen anytime soon, even if they commit. Skylake + Skylake Refresh is what you get on desktops. So we are deep into 2017 before there is even a possibility with Icelake.

If you want adaptive sync, you buy AMD.

No, I didn't forget at all. Intel and NVIDIA both have the option - RIGHT NOW - to support it via driver updates. NVIDIA may not be officially stating anything, but I guarantee* that they have an internal test bed that is operational. They basically proved it with the leaked notebook driver a few months back. G-Sync without a module? Sounds familiar.

Intel is... being Intel. Intel is actually the perfect customer for ASync given the wide gamut of frame rates its IGP gets.

AdaptiveSync will keep gaining momentum, then Intel will support it out of the blue with little to no fanfare, then when NVIDIA realizes that it can make more money supporting both, they will announce a driver update that will enable AdaptiveSync.

*not guaranteed
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
No, I didn't forget at all. Intel and NVIDIA both have the option - RIGHT NOW - to support it via driver updates.

It requires hardware support. Not just drivers. AMD itself is an example of this.

NVIDIA may not be officially stating anything, but I guarantee* that they have an internal test bed that is operational. They basically proved it with the leaked notebook driver a few months back. G-Sync without a module? Sounds familiar.

I never saw any test of this, just someone selecting an option in a driver, working or not. Could you link me to the tests?

Intel is... being Intel. Intel is actually the perfect customer for ASync given the wide gamut of frame rates its IGP gets.

AdaptiveSync will keep gaining momentum, then Intel will support it out of the blue with little to no fanfare, then when NVIDIA realizes that it can make more money supporting both, they will announce a driver update that will enable AdaptiveSync.

*not guaranteed

Adaptive sync needs to support well below 30 or 40 FPS minimums before it's worthwhile for IGPs.

And let's see when or if Intel will support it. We are talking 2017 at the soonest with Icelake.
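
For context on why those minimums matter: an Adaptive-Sync panel only varies its refresh inside a fixed hardware window, so frame rates below the window's floor fall back to ordinary fixed-refresh behaviour unless the driver does extra work. A minimal sketch, using hypothetical panel limits (the 40-75 Hz range below is illustrative, not tied to any real monitor):

```python
def in_vrr_window(fps: float, panel_min_hz: float, panel_max_hz: float) -> bool:
    """True if the frame rate can be matched 1:1 inside the panel's variable refresh range."""
    return panel_min_hz <= fps <= panel_max_hz

# Hypothetical 40-75 Hz Adaptive-Sync panel (illustrative numbers, not a real product).
panel_min, panel_max = 40, 75
for fps in (25, 35, 45, 60):
    status = "inside VRR window" if in_vrr_window(fps, panel_min, panel_max) else "below the window floor"
    print(f"{fps} fps ({1000 / fps:.1f} ms/frame): {status}")
```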
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It requires hardware support. Not just drivers. AMD itself is an example of this.



I never saw any test of this, just someone selecting an option in a driver, working or not. Could you link me to the tests?

It wasn't so much a test, but someone hacked the drivers and managed to get A-Sync to work, which goes to show that the hardware to support it exists now. He also made the drivers available for others to try.

I don't know if it truly worked personally, but others claim it did.
 
Aug 11, 2008
10,451
642
126
Intel is obviously doing the right thing, attacking the GPU market from the bottom.

Going for the high end right now is pointless anyway. Give or take 5 to 10 years, and even systems in the high-end sector will slowly start phasing out dGPUs.

Why? Because APUs and SoCs are the future. Especially once we approach the sub-10nm area, it would just be the economically smart thing to do.

Sure, for super-high-end enthusiasts and supercomputers dGPUs will still be a thing.

But AMD and Intel are both doing the right thing. I actually feel like it's going to be Nvidia that will be at a disadvantage in a few years.

Sure, they dominate dGPU share right now and have dipped their feet into cars...but what else is there? They half-heartedly try some stuff with their Tegra chips...and while the GPU for smartphone-sized gaming is actually neat...in this area Nvidia will be behind the competition for years to come.

IF AMD manages to get through its current struggles...we might end up seeing Nvidia slowly vanish into different markets and the super-enthusiast niche only.

Once Intel has a few more revisions of its iGPU and AMD's APUs start using at least 14nm and HBM2...even the desktop markets will change toward this...I would bet my dog on it.

That said...the iGPUs/APUs are not quite where I want them yet...but I expect them to "master" 1080p gaming (thanks to DX12 as well) by the end of 2016. (Maybe not with SSAO + 4xMSAA...but close xD)

P.S. Of course, God forbid Nvidia and Intel actually teamed up to make desktop SoCs...that stuff would just be straight-up insane...a man can dream.

I have been following the APU tests on game.gpu. Most of the games they have tested are playable at 1080p with the 7850K, but Witcher 3 is only marginally playable (high-20s fps) at 800p. And that is with a 95-watt TDP. So yes, there is still a long way to go. In addition, 14nm dGPUs have to come eventually and will raise the bar a lot, hopefully.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
It wasn't so much a test, but someone hacked the drivers and managed to get A-Sync to work, which goes to show that the hardware to support it exists now. He also made the drivers available for others to try.

I don't know if it truly worked personally, but others claim it did.

You mean G-Sync.

And by hacked, was it actually working? Or was it just someone who hacked the driver to change a setting without any effect?
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
You mean G-Sync.

And by hacked, was it actually working? Or was it just someone who hacked the driver to change a setting without any effect?

I meant he used A-Sync (Adaptive-Sync) to achieve the desired G-Sync/FreeSync effect. He showed what he considered proof. The data appeared to be accurate, but I couldn't visually see the results myself, so it's hard to know for certain how well it worked.
 

mutantmagnet

Member
Apr 6, 2009
41
1
71
The problem is they are charging $120 to $200+ for half a die's worth of GPU transistors. It is getting to the point where so much money is wasted on useless GPU transistors that at some point people will have to seek an alternative. Every single notebook that uses one of these chips plus an NVIDIA discrete card is getting outright ripped off by Intel for at least $120.

You're overlooking how laptops work. Even the ones with discrete cards will switch to the iGPU because it is much better at extending battery life.
 

mutantmagnet

Member
Apr 6, 2009
41
1
71
Intel is obviously doing the right thing, attacking the GPU market from the bottom.

Going for the high end right now is pointless anyway. Give or take 5 to 10 years, and even systems in the high-end sector will slowly start phasing out dGPUs.

Why? Because APUs and SoCs are the future. Especially once we approach the sub-10nm area, it would just be the economically smart thing to do.

Sure, for super-high-end enthusiasts and supercomputers dGPUs will still be a thing.

But AMD and Intel are both doing the right thing. I actually feel like it's going to be Nvidia that will be at a disadvantage in a few years.

Sure, they dominate dGPU share right now and have dipped their feet into cars...but what else is there? They half-heartedly try some stuff with their Tegra chips...and while the GPU for smartphone-sized gaming is actually neat...in this area Nvidia will be behind the competition for years to come.

IF AMD manages to get through its current struggles...we might end up seeing Nvidia slowly vanish into different markets and the super-enthusiast niche only.

Once Intel has a few more revisions of its iGPU and AMD's APUs start using at least 14nm and HBM2...even the desktop markets will change toward this...I would bet my dog on it.

That said...the iGPUs/APUs are not quite where I want them yet...but I expect them to "master" 1080p gaming (thanks to DX12 as well) by the end of 2016. (Maybe not with SSAO + 4xMSAA...but close xD)

P.S. Of course, God forbid Nvidia and Intel actually teamed up to make desktop SoCs...that stuff would just be straight-up insane...a man can dream.


Nvidia is aware of this problem. What holds them back is that they can't get an x86 license. Their solution is to make an ARM CPU, but for many obvious reasons they're still looking for other ways to broaden their GPU markets, because ARM (as it exists now) won't cut it for their core discrete-GPU audience.
 

Insomniator

Diamond Member
Oct 23, 2002
6,294
171
106
While I'm happy for more laptops to have better, if still not good, GPU performance, I do agree that paying for half a chip's worth of GPU on a desktop is getting annoying. Then again, my i5 2500K has lasted 4 years even with its useless GPU, so what do I care.

I guess it makes sense for kids or something, but since I was... 18?... I've been able to buy or save up for a $200+ GPU that makes integrated graphics completely pointless to me. The rest are still better served by a $100 750 Ti, or just get a console at that point. Low-end PC gaming just... doesn't feel right (and it's not like a $300+ Intel APU is a 'low-end' part anyway).

So, good for small laptops, still not seeing the point for desktops.
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
146
106
How long until we see one?

Not that long:
[Attached image: intel-skylake-schedule.jpg]
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Haswell Iris Pro was a $90 cost add, Broadwell is $56. Logic dictates it's significantly cheaper now, right?

Intel LOWERED the price of Haswell Iris Pro to be roughly the same ~$55 adder.

http://ark.intel.com/products/family/75023/4th-Generation-Intel-Core-i7-Processors#@All

Core i7-4950HQ: $623
Core i7-4910MQ: $568

$623 - $568 = $55
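
Spelling out the arithmetic behind that adder claim, a trivial sketch using the ARK list prices quoted above (the GT3e SKU versus its closest GT2 sibling):

```python
# Back-of-envelope using the ARK list prices quoted above: the Iris Pro "adder"
# is simply the price delta between the GT3e SKU and its closest GT2 sibling.
prices = {
    "Core i7-4950HQ (Iris Pro 5200, GT3e)": 623,
    "Core i7-4910MQ (HD 4600, GT2)": 568,
}
adder = prices["Core i7-4950HQ (Iris Pro 5200, GT3e)"] - prices["Core i7-4910MQ (HD 4600, GT2)"]
print(f"Haswell Iris Pro adder: ${adder}")  # -> $55, in line with the ~$56 cited for Broadwell
```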


Are we discussing the 6200 or the 5200, because as far as I saw in the reviews, there's quite a difference between the two of them.


I'll take that as your opinion on the matter.

I guess we'll know more on performance and efficiency when reviews come in for the new Broadwell SKUs.
No there isn't: http://www.overclockers.ua/news/cpu/115939-intel-broadwell-m-2.png

Even Intel themselves say it. Iris Pro 6200 looks better because they refuse to compare it with Iris Pro 5200 parts, which are the direct replacements. 20%. Whoopee-doo. Remember how CPU-World claimed that, according to Intel presentations, Broadwell GT3e was 80% faster than Haswell, and people thought that meant over Haswell GT3e? CPU-World later changed the article to say it's 80% over Haswell HD 4600.

http://newsroom.intel.com/community...computing-experiences-intelligence-everywhere

Intel introduced the Intel® Xeon® processor E3-1200 v4 product family, the first time the Xeon processor line has included integrated Intel® Iris™ Pro graphics P6300. Built on 14nm process technology and designed for visually intensive, cloud-based workloads such as high-definition (HD) video transcoding and remote workstation delivery, the product family delivers up to 1.4x the performance for video transcoding3 and up to 1.8x the 3-D graphics performance4 compared to the prior generation.

You'll notice in the small print that the 80% gain is Broadwell GT3e over Haswell GT2. They gained 20% for an iGPU, and it took them nearly 2 years. WOW!!
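
A quick sanity check of how the two figures reconcile, assuming the numbers quoted in this thread (1.8x over Haswell GT2 from Intel's footnote, and the ~20% gain over Iris Pro 5200 argued above):

```python
# Figures as quoted in this thread (not independently verified):
broadwell_gt3e_vs_haswell_gt2 = 1.8    # Intel's "up to 1.8x 3-D graphics" footnote (vs. Haswell GT2)
broadwell_gt3e_vs_haswell_gt3e = 1.2   # the ~20% gen-over-gen gain argued above (vs. Iris Pro 5200)

# Implied Haswell GT3e (Iris Pro 5200) advantage over Haswell GT2 (HD 4600):
implied = broadwell_gt3e_vs_haswell_gt2 / broadwell_gt3e_vs_haswell_gt3e
print(f"Implied Iris Pro 5200 vs HD 4600: {implied:.2f}x")  # ~1.50x
```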

ShintaiDK said:
Not that long:

The fact that they are releasing Broadwell GT3e now is troubling. Intel isn't traditionally known to cannibalize their products. That roadmap is either wrong or those parts are GT2 SKL parts.
 

therealnickdanger

Senior member
Oct 26, 2005
987
2
0
It requires hardware support. Not just drivers. AMD itself is an example of this.



I never saw any test of this, just someone selecting an option in a driver, working or not. Could you link me to the tests?



Adaptive sync needs to support well below 30 or 40 FPS minimums before it's worthwhile for IGPs.

And let's see when or if Intel will support it. We are talking 2017 at the soonest with Icelake.

https://youtu.be/K7SYvgB6SZ4
 
Aug 11, 2008
10,451
642
126
@IntelUser2000

"Intel introduced the Intel® Xeon® processor E3-1200 v4 product family, the first time the Xeon processor line has included integrated Intel® Iris™ Pro graphics P6300. Built on 14nm process technology and designed for visually intensive, cloud-based workloads such as high-definition (HD) video transcoding and remote workstation delivery, the product family delivers up to 1.4x the performance for video transcoding3 and up to 1.8x the 3-D graphics performance4 compared to the prior generation.

You'll notice in the small print that the 80% gain is Broadwell GT3e over Haswell GT2. They gained 20% for an iGPU, and it took them nearly 2 years. WOW!!"

Not that it matters, because the performance "is what it is," but the article makes it very clear what the comparison is. They are talking about Xeon. There was no GT3e Xeon for Haswell, so the Broadwell GT3e Xeon can only be compared to Haswell GT2. You can't say they are cherry-picking the numbers when they are making the only comparison possible.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Even Intel themselves say it. Iris Pro 6200 looks better because they refuse to compare it with Iris Pro 5200 parts, which are the direct replacements. 20%. Whoopee-doo.
This is actually a really good point. To further add to this, Intel only managed 20% despite having a 22nm to 14nm die shrink.

Meanwhile AMD/NVIDIA have managed much bigger gains, twice, while staying on the same 28nm. So much for "superior process advantage will kill dGPUs".

Also the eDRAM is a one-trick pony, meaning once you have it, you won't get that kind of boost again. In other words unless Skylake moves to HBM, I expect the same 20%, especially since they're staying with 14nm.
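
For a rough sense of scale, a naive back-of-envelope that assumes transistor density tracks the node name exactly (real 22nm-to-14nm scaling doesn't work that cleanly):

```python
# Idealized density gain if feature size really scaled with the node name
# from 22nm to 14nm. Real processes do not scale this cleanly; this is only
# to frame the "~20% iGPU gain" figure discussed above.
ideal_density_gain = (22 / 14) ** 2
print(f"Idealized 22nm -> 14nm density gain: {ideal_density_gain:.2f}x")  # ~2.47x
print("iGPU gain argued in this thread: ~1.20x")
```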
 

TheELF

Diamond Member
Dec 22, 2012
4,027
753
126
Also the eDRAM is a one-trick pony, meaning once you have it, you won't get that kind of boost again. In other words unless Skylake moves to HBM, I expect the same 20%, especially since they're staying with 14nm.

HBM is also a one-trick pony, unless... wait, right, they are already talking about improving HBM, so why would it be impossible to improve eDRAM? Make it faster, bigger, or even just cheaper.

And HBM only works on video cards, where the manufacturer can configure the RAM however they like. I very much doubt you can turn your standard dual-channel DDR3 DIMMs into stacked memory, no matter what you do software- or controller-wise. And putting even just 1GB of (HBM) memory into a CPU or into the chipset? I don't really see that happening either. I mean, Intel charges ~$50 for 128MB; imagine the price for 1GB.
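
Taking that ~$50-for-128MB figure at face value (it is itself a rough estimate), a crude linear extrapolation illustrates the cost objection; real pricing would not scale linearly:

```python
# Crude linear extrapolation from the figure cited above (~$50 adder for 128 MB eDRAM).
# Actual pricing would not scale linearly; this only frames the cost objection.
cost_per_mb = 50 / 128
for capacity_mb in (128, 512, 1024):
    print(f"{capacity_mb:>5} MB -> ~${capacity_mb * cost_per_mb:.0f} at linear scaling")
```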
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,576
126
Intel is working with HBM / HMC, so I wouldn't be surprised to see it turn up somewhere in Intel products.
 

Qwertilot

Golden Member
Nov 28, 2013
1,604
257
126
Worth remembering that they're still not really trying :) Far from going flat out on the GPU side versus the CPU, anyway.

Tom's reckons the IP6200's power draw to be 10-12W. They could easily double that (or more) and still have a viable product.

Skylake isn't due to go that far, but the big increase in Skylake is due to come from a model which has rather more iGPU on it than these Broadwell things do, so >20% should be relatively easy.
 

PhIlLy ChEeSe

Senior member
Apr 1, 2013
962
0
0
THG just posted a review of the new Broadwell desktop CPUs. Nothing too interesting on the CPU side, but the integrated Iris Pro 6200 puts up some pretty startling numbers for a low-power integrated GPU. Certainly not going to replace a 980 Ti, but it destroys anything AMD has in their integrated portfolio.

http://www.tomshardware.com/reviews/intel-core-i7-5775c-i5-5675c-broadwell,4169-6.html

Looks like one helluva HTPC chip that can legitimately game at 1080p.


Who do you think is number 2 in GPUs? The landscape has changed...