
Intel Iris & Iris Pro Graphics: Haswell GT3/GT3e Gets a Brand


paperwastage

Golden Member
May 25, 2010
1,848
2
76
What's with the "3 screen collage mode"?

I thought HD 4000 already supported 3 monitors (though it requires some specific combination of DisplayPort and other ports, and/or the refresh rates or resolutions have to match).
 

Galatian

Senior member
Dec 7, 2012
372
0
71
Wouldn't this make more sense in the 13" rMBP? They could run it with cTDP down to come close to the current 35W envelope, though I'm sure they could also run it at the nominal 45W TDP if they wanted to.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Wouldn't this make more sense in the 13" rMBP? They could run it with cTDP down to come close to the current 35W envelope, though I'm sure they could also run it at the nominal 45W TDP if they wanted to.

The better question is why in the world Apple would be stupid enough to use discrete when GT3E performs the same as the GT 650M. The rMBP gets hot under heavy graphics load. Not having an oven on your lap while playing Torchlight 2 would certainly be a nice thing.

Not using discrete allows lower heat output under graphics load, lower cost, longer battery life in normal use, and perhaps a slimmer form factor. If GT3E is as good as the 650M, as Intel claims, there is absolutely no reason for Apple to use Nvidia. Besides which, the rMBP is not a gaming machine. The 650M was used in 2012 because the HD 4000 wasn't very suitable for the Retina resolution in normal tasks. That is no longer the case with GT3E; it is a pretty beastly graphics chip. So, yeah. No incentive for discrete if GT3E is as good as Intel claims.
 

erunion

Senior member
Jan 20, 2013
765
0
0
What's with the "3 screen collage mode"?

I thought HD 4000 already supported 3 monitors (though it requires some specific combination of DisplayPort and other ports, and/or the refresh rates or resolutions have to match).

Sounds like 3 monitors stitched into 1 virtual monitor, e.g. Eyefinity.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Sounds like 3 monitors stitched into 1 virtual monitor, e.g. Eyefinity.

Surprising; I had assumed that Intel didn't support any type of surround. Just FYI, you can use Eyefinity in non-extended mode. It doesn't have to be stretched or spanned across all screens. Personally I like that mode better for surround (non-spanned).
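
To make the "stitched into one virtual monitor" idea concrete, here's a minimal sketch of how a 3x1 collage/Eyefinity-style group presents three panels to the OS as a single display. This is illustrative only, not Intel's or AMD's actual API; the names are made up:

Code:
from dataclasses import dataclass

@dataclass
class Monitor:
    width: int   # native horizontal resolution in pixels
    height: int  # native vertical resolution in pixels

def collage_3x1(monitors):
    """Stitch three monitors side by side into one virtual display.

    Surround/collage modes typically require matching vertical
    resolutions (and refresh rates), which is why mixed monitor
    setups often fail to qualify.
    """
    if len(monitors) != 3:
        raise ValueError("3-screen collage mode expects exactly 3 monitors")
    if len({m.height for m in monitors}) != 1:
        raise ValueError("all monitors must share the same vertical resolution")
    return (sum(m.width for m in monitors), monitors[0].height)

# Three 1920x1080 panels appear to games as one 5760x1080 surface.
print(collage_3x1([Monitor(1920, 1080)] * 3))  # -> (5760, 1080)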
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,977
1,276
126
Such a shame Nvidia does not have an x86 license. I think that's going to hurt them big time in the long run. AMD at least has more of a means to fight Intel, but Nvidia is fighting a difficult battle.

What is the reason for them not having this license?
 

Skurge

Diamond Member
Aug 17, 2009
5,195
1
71
Such a shame Nvidia does not have an x86 license. I think that's going to hurt them big time in the long run. AMD at least has more of a means to fight Intel, but Nvidia is fighting a difficult battle.

What is the reason for them not having this license?

Intel would just embarrass them anyway.
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0


Isn't that kinda disappointing?

i7-4930MX (57W TDP) (GT2) (new Haswell) = 1418 points in 3DMark11
A10-4600M (35W TDP) (old product) = ~1200 points in 3DMark11

18% faster than an A10-4600M, and its TDP is ~63% higher.

I think AMD's A10-5750M will be really close to the i7-4930MX (in 3DMark11),
and only use a 35W TDP.
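
Both percentages fall straight out of the quoted figures; a quick arithmetic check (using only the numbers above, not new measurements):

Code:
gt2_score, gt2_tdp = 1418, 57   # i7-4930MX (GT2): 3DMark11 score, TDP in watts
a10_score, a10_tdp = 1200, 35   # A10-4600M

perf_gain = gt2_score / a10_score - 1   # ~0.18 -> ~18% faster
tdp_increase = gt2_tdp / a10_tdp - 1    # ~0.63 -> ~63% higher TDP

print(f"performance: +{perf_gain:.0%}, TDP: +{tdp_increase:.0%}")
# performance: +18%, TDP: +63%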


http://www.fudzilla.com/home/item/30226-core-i7-3940mx-is-intel’s-new-mobile-king

It costs an arm, a leg and a few other organs of your choice, as its official price sits at $1096.
Fudzilla calls the i7-4930MX the "king" of the mobile CPUs Intel is making.



That makes me think AMD is better off than I thought before seeing this news.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Isn't that kinda disappointing?

i7-4930MX (57W TDP) (GT2) = 1418 points in 3DMark11
A10-4600M (35W TDP) = ~1200 points in 3DMark11

18% faster than an A10-4600M, and its TDP is ~63% higher.

Fudzilla calls the i7-4930MX the "king" of the mobile CPUs Intel is making.

That depends on how it performs in real games. If it's 60% faster in games on average, like it is in 3DMark11, even the GT2-based 4930MX might beat Richland. I doubt it'll be 60% faster; more like 20-30%. It may approach 60% in some circumstances, but that may be at settings that won't be playable regardless. FWIW, the A10-5750M "Richland" gets 1400.
 

mikk

Diamond Member
May 15, 2012
4,308
2,395
136
Isn't that kinda disappointing?

i7-4930MX (57W TDP) (GT2) = 1418 points in 3DMark11
A10-4600M (35W TDP) = ~1200 points in 3DMark11

18% faster than an A10-4600M, and its TDP is ~63% higher.


No, it's pretty good, way better than expected. This is only a GT2. I never expected the GT2, with just 25% more EUs, to be 60% better in some benchmarks. Slower models are lower clocked, but it doesn't make a big difference, just like Ivy Bridge, where most GT2 models clocked between 1100-1300 MHz. It looks to me like even a GT2 at a moderate frequency is decent enough to reach the fastest mobile Trinity.


That depends on how it performs in real games. If it's 60% faster in games on average, like it is in 3DMark11, even the GT2-based 4930MX might beat Richland. I doubt it'll be 60% faster; more like 20-30%. It may approach 60% in some circumstances, but that may be at settings that won't be playable regardless. FWIW, the A10-5750M "Richland" gets 1400.


20-30% on average in games would be enough to match the A10-4600M. I don't expect a 60% average either.
 

daveybrat

Elite Member
Super Moderator
Jan 31, 2000
5,817
1,029
126
I love that Intel is finally improving their GPU performance, but...

Better hardware isn't always the most important part. They really need to focus on better drivers as well. Games just don't look the same on an Intel GPU compared to an AMD APU.

AMD's drivers are superior to Intel's for gaming. Hopefully Intel starts churning out some better drivers for the Haswell processors.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
3DMark11 from a mobile high-end GT2.

GT2 isn't the high end; it isn't even Iris. The high end is the HD 5100/5200, Iris and Iris Pro. Those are BGA-only, won't be available on desktop, and you won't see benchmarks anytime soon.
 

SPBHM

Diamond Member
Sep 12, 2012
5,066
418
126
Isn't that kinda disappointing?

i7-4930MX (57W TDP) (GT2) (new Haswell) = 1418 points in 3DMark11
A10-4600M (35W TDP) (old product) = ~1200 points in 3DMark11

18% faster than an A10-4600M, and its TDP is ~63% higher.

I think AMD's A10-5750M will be really close to the i7-4930MX (in 3DMark11),
and only use a 35W TDP.

Well, it's only GT2...
As for the TDP, you need to consider the CPU performance difference: the A10 is a dual-module quad-core with no L3 cache and low clocks, so its CPU performance is comparable to a mobile Ivy Bridge Core i3. The 4930MX is a quad-core (8T) 3.0-3.9GHz Haswell, faster than the FX-8350 (125W+, with no IGP).
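
To put that in numbers: a naive graphics-score-per-watt comparison (a rough sketch using only the 3DMark11 figures quoted above) favors the A10 precisely because it ignores that the 4930MX spends most of its 57W budget on a much faster CPU:

Code:
# Naive 3DMark11-points-per-TDP-watt from the scores quoted above.
# Caveat (the point being made here): TDP covers the whole chip, and
# the 4930MX devotes far more of its 57W to a much faster CPU, so
# this metric understates the Haswell IGP.
chips = {
    "i7-4930MX (GT2)": (1418, 57),
    "A10-4600M": (1200, 35),
}

for name, (score, tdp) in chips.items():
    print(f"{name}: {score / tdp:.1f} points per TDP watt")
# i7-4930MX (GT2): 24.9 points per TDP watt
# A10-4600M: 34.3 points per TDP watt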


I'm really impressed by the info and results from Haswell's IGP so far.
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,143
136

[image attachment]
 

LogOver

Member
May 29, 2011
198
0
0

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
The better question is why in the world Apple would be stupid enough to use discrete when GT3E performs the same as the GT 650M. The rMBP gets hot under heavy graphics load. Not having an oven on your lap while playing Torchlight 2 would certainly be a nice thing.

Not using discrete allows lower heat output under graphics load, lower cost, longer battery life in normal use, and perhaps a slimmer form factor. If GT3E is as good as the 650M, as Intel claims, there is absolutely no reason for Apple to use Nvidia. Besides which, the rMBP is not a gaming machine. The 650M was used in 2012 because the HD 4000 wasn't very suitable for the Retina resolution in normal tasks. That is no longer the case with GT3E; it is a pretty beastly graphics chip. So, yeah. No incentive for discrete if GT3E is as good as Intel claims.


I'm gonna disagree with this.

I just don't see Apple's web pages marketing the 2013 15" rMBP with:

"We've increased battery life and built a cooler-running MacBook, but our graphics performance is down by 20%... AND our top-of-the-line machine has lost CUDA in pro applications..."

Not happening; that's not appealing to consumers. They will probably use the GTX 750M alongside Iris Pro and just say it has better graphics performance in both integrated and discrete, while maintaining the same battery life and heat output. That sounds a lot better to someone laying down over $2K for a laptop.

For the 13" model... hell yeah, they will not only increase battery life but also double graphics performance. It's a win-win.
 

piasabird

Lifer
Feb 6, 2002
17,168
60
91
If it is only available on a BGA-package i7, it will be priced kind of high for most people. Most people who purchase an i7 are looking at gaming, multitasking, and workstation use.
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
I'm gonna disagree with this.

I just don't see Apple's web pages marketing the 2013 15" rMBP with:

"We've increased battery life and built a cooler-running MacBook, but our graphics performance is down by 20%... AND our top-of-the-line machine has lost CUDA in pro applications..."

Not happening; that's not appealing to consumers. They will probably use the GTX 750M alongside Iris Pro and just say it has better graphics performance in both integrated and discrete, while maintaining the same battery life and heat output. That sounds a lot better to someone laying down over $2K for a laptop.

For the 13" model... hell yeah, they will not only increase battery life but also double graphics performance. It's a win-win.

Last I checked, there are zero OS X CUDA applications, and on top of this, MOST MacBooks do not have Nvidia discrete chips at all. You act like it's some sort of prerequisite. Give me a break. Many prior-year models used Radeon graphics, and nearly all MacBook Airs did not use Nvidia chips. Also, the OS X CUDA development tools are there, but actual development is nonexistent. I'm fairly confident in stating that most MBP buyers do not care. My speculation is this: if GT3E matches the performance of the GT 640/650, you can be fairly certain Nvidia won't be inside the machine. That's what the rumors are already suggesting (no discrete), and it seems very likely given the performance of Haswell's GPU.

Apple used discrete in prior years due to the inadequacy of integrated graphics for basic use. With Haswell's GPU that is no longer the case. It is actually a damn good GPU despite being integrated.
 