
Intel Iris & Iris Pro Graphics: Haswell GT3/GT3e Gets a Brand


Hulk

Diamond Member
Oct 9, 1999
5,153
3,760
136
Why did the DailyTech article say the Iris Pro 3DMark11 score should be around 2900 (based on Intel's 3x performance claim) while posters here are saying 2200?
http://www.dailytech.com/Intel+Anno...Up+to+a+3x+Performance+Boost/article31464.htm

Seems like we might want to stop digging Intel's grave. While there will be a high-end discrete graphics segment for the foreseeable future, the mainstream, which is where the bulk of the sales are, is going to have a large chunk taken out of it by Iris. Broadwell may well finish the job.

If Intel can successfully move into the ARM space, that would be nuts. Good for us in the short term, as they have been driven to catch up in these markets, but in the long run the lack of sales ($) for the competition will only slow growth. I hope the race stays close...
 

386user

Member
Mar 11, 2013
66
0
16
Curious to see what the cost for GT3 will be.

Hopefully less than a lesser CPU + discrete 640/650, although unfortunately I don't think this will be the case.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Never understood that argument of yours. The 3DMark score of Trinity is somewhat higher than it should be, but one can still extrapolate expected gaming performance for non-mainstream games from that score. Doing the same with the HD 3000 will leave you with quite a few surprises.

First, HD 3000 doesn't support DX11, so there are no 3DMark11 scores for it. HD 4000 is what supports DX11.

http://anandtech.com/show/5831/amd-trinity-review-a10-4600m-a-new-hope/5

The A10-4600M gets a 41% better score in 3DMark11 than HD 4000, but on average only gets about half that advantage in games.

Another point of comparison is against the HD 6630M setup. The 7660G scores better in 3DMark11, but loses in almost every single game.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
3DMark11 is very CPU dependent (score-wise).
The A10-5800K @ 4.4 GHz CPU scores 1713, while at stock it scores only 1493.

A10-6800K "Richland" gets 1667 points in 3DMark11.

It's also worth noting the Intel slides are showing "Graphics Suite" scores while we're talking about Overall scores.

The HD 4000 in the 3770K gets a Graphics Suite score of 652 points.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Well, the fact of the matter is Apple isn't using discrete graphics in the 2013 rMBP. Furthermore, you cannot add discrete chips without a corresponding loss of battery life. Now, Kepler may be efficient, but it's not without 1) obvious additional costs and 2) losses in battery life during certain usage scenarios.

Intel claims that the GT3e is faster than the GT 650M. Based on the fact that Apple is no longer using discrete, I'm inclined to believe that.

The 2013 rMBP isn't using discrete. Take that for what it's worth. You can argue until you're blue in the face about how mobile discrete isn't dead, and I would agree with that for full-size gaming laptops. However, we know this: most ultrabooks (95%) already do not use discrete. The biggest discrete customer was Apple, and now THEY aren't using discrete.

The facts here are just very telling. You can either ignore them, or argue until you're blue in the face about hypothetical matters.

Sure I can add a discrete GPU without affecting battery life. I simply don't use it when I don't need it (go to the Nvidia control panel and select "use integrated graphics processor"). There, now the dGPU is using no power because it's not activated. No battery life hit (or a very, very minimal one). Considering that when on battery I'm usually not doing anything that requires heavy GPU activity, it has no effect on me at all.

The eDRAM costs $50 alone; given the additional cost of a GT3 chip vs a GT2 chip, it would not be surprising if a 650M costs LESS than a GT3e chip.

I don't know if the 2013 rMBP will use a dGPU or not. All we have is a few rumours (there were rumours that the MBA was going to use AMD's APU last year too). It's too early to say with certainty.

Second, which 650M? One with DDR3 or GDDR5? One at the reference speeds or at the boost speeds? Apple also uses an overclocked 650M in their machines.

I'll believe Intel's claim when I see it confirmed. However, I will say that I doubt the GT3e will be able to compete in the majority of cases with a 650M at boost speeds (basically every single chip runs at boost speeds) with GDDR5 (which is how you usually find the 650M). Intel drastically needs to improve their drivers (the HD 4000 performs very similarly to the HD 3000 in certain situations).

[Intel slide screenshot]


2.5x increase.

The 3840QM gets around 730 points. 730 x 2.5 = 1825 points. The 650M gets around 2200 points (GPU) at 835 MHz and 2000 points at 745 MHz. Either way, I see the GT3e more as solid competition for the 640M. Can it compete with the 650M? Yes. Will it lose most of the time? Probably. (And the 650M has been replaced with the 750M, with clocks of 967 MHz + boost.)
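
For anyone who wants to play with the numbers, here's a minimal sketch of that estimate; the 730-point baseline, the 2.5x factor and the 650M figures are just the rough numbers quoted above, not official benchmark results.

```python
# Rough back-of-the-envelope estimate using the figures quoted in this post
# (approximate community numbers, not official benchmark results).
hd4000_gpu_score = 730       # ~3DMark11 GPU score for the HD 4000 in an i7-3840QM
claimed_scaling = 2.5        # factor from Intel's Iris slide

gt3e_estimate = hd4000_gpu_score * claimed_scaling    # ~1825
gt650m_scores = {"835 MHz": 2200, "745 MHz": 2000}    # rough GT 650M GPU scores

print(f"Estimated GT3e GPU score: {gt3e_estimate:.0f}")
for clock, score in gt650m_scores.items():
    print(f"GT 650M @ {clock}: {score} (gap: {score - gt3e_estimate:.0f})")
```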

I agree that for the most part the iGPU will make discrete obsolete in the majority of cases, but I wasn't talking about ultrabooks. (Plus anything that can fit a 47 watt TDP CPU in it is probably not an ultrabook.)

I guess I bolded some extra text.

My comment was primarily in relation to:

Hell, many gaming tablets/laptops actually use the GT 650M, so many gaming devices will probably use GT3e as well this time around.

Which will definitely be true when space or thermals are an issue. But price? Not necessarily. Performance? Not necessarily.
 
Last edited:

mikk

Diamond Member
May 15, 2012
4,308
2,395
136
It's also worth noting the Intel slides are showing "Graphics Suite" scores while we're talking about Overall scores.


This is important; it looks like most people overlook this. It means that with a 2.9x factor the 65W GT3e scores around 1900. For comparison, AMD's A10-5800K scores around 1400 points.
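
Just to make the arithmetic explicit, here's a quick sketch using the 652-point HD 4000 Graphics Suite score mentioned earlier and the 2.9x factor; both are figures from this thread, not official numbers.

```python
# Sketch of the Graphics Suite estimate above, using figures quoted in this thread.
hd4000_graphics_score = 652      # 3DMark11 Graphics score for the HD 4000 (i7-3770K)
claimed_factor = 2.9             # scaling factor discussed for the 65W GT3e part
a10_5800k_graphics = 1400        # rough figure for AMD's A10-5800K, for comparison

gt3e_graphics_estimate = hd4000_graphics_score * claimed_factor   # ~1890
print(f"Estimated 65W GT3e Graphics score: {gt3e_graphics_estimate:.0f}")
print(f"A10-5800K Graphics score: ~{a10_5800k_graphics}")
```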
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
Sure I can add a discrete gpu without affecting battery life. I simply don't use it when I don't need it (go to nvidia control panel select "use integrated graphics processor"). There now the dgpu is using no power because its not activated. No battery life hit (or very very very minimal). Considering when on battery I'm usually not doing anything that requires heavy gpu activity it has no effect on me at all.

Oh, so a discrete chip doesn't use battery life when it's disabled. Really? Did you actually state this as a counterpoint? So how exactly do you do this in OS X? Anyway, I don't think it's far-fetched to state that most real-world usage scenarios will include a decent graphics workload now and then, and in those cases integrated will always, without exception, lead to better battery life. Thus a real-world usage scenario will always have better battery life characteristics on an ultrabook or ultraportable with integrated rather than discrete graphics.

Anyway, I'm not sure how this will affect full-size laptops, but in my opinion I don't see many ultrabook makers (if any) using discrete this time around. I could be wrong, but the biggest discrete customer suddenly switching has implications that we may not be aware of yet. This is just my opinion; perhaps I'm wrong. It has happened once before.
 
Last edited:
Aug 11, 2008
10,451
642
126
Curious to see what the cost for GT3 will be.

Hopefully less than a lesser CPU + discrete 640/650, although unfortunately I don't think this will be the case.

This is the problem I see. It is a nice advance and could be great for ultrabooks, but for regular laptops, ehh, so what? What can I really do that I could not do with Ivy Bridge?

Normal surfing, social apps, a few videos: no problem with HD 4000. If you want to game or run some other heavy graphics workload, I think you can still get better performance with Ivy and a GT 650M, quite likely for a lot less money as well.

Edit: even less interesting on the desktop of course; just add a discrete GPU.
 

Piroko

Senior member
Jan 10, 2013
905
79
91
First, HD 3000 doesn't support DX11, so there are no 3DMark11 scores for it. HD 4000 is what supports DX11.

http://anandtech.com/show/5831/amd-trinity-review-a10-4600m-a-new-hope/5

The A10-4600M gets a 41% better score in 3DMark11 than HD 4000, but on average only gets about half that advantage in games.

Another point of comparison is against the HD 6630M setup. The 7660G scores better in 3DMark11, but loses in almost every single game.
My bad, I was talking about the HD 4000. But these reviews are what irk me. I just can't tell which games run well and which won't. Some games run better than I expected, like SOASE: Rebellion; others perform somewhat erratically, like Civ5, Tera and Minecraft. Although Minecraft performance got better in one of the recent updates.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Oh, so a discrete chip doesn't use battery life when it's disabled. Really? Did you actually state this as a counterpoint? So how exactly do you do this in OS X? Anyway, I don't think it's far-fetched to state that most real-world usage scenarios will include a decent graphics workload now and then, and in those cases integrated will always, without exception, lead to better battery life. Thus a real-world usage scenario will always have better battery life characteristics on an ultrabook or ultraportable with integrated rather than discrete graphics.

Anyway, I'm not sure how this will affect full-size laptops, but in my opinion I don't see many ultrabook makers (if any) using discrete this time around. I could be wrong, but the biggest discrete customer suddenly switching has implications that we may not be aware of yet. This is just my opinion; perhaps I'm wrong. It has happened once before.

I agree with you for the most part about ultraportables, but GT3e is not really relevant for ultraportables with a 47 watt TDP. GT3 is, though. Considering there are very few ultraportables using discrete graphics at all right now, Intel can hardly kill the market (because there is almost no market).

OS X automatically switches graphics for you. (And yes, when it's disabled it doesn't use any power, aside from maybe the insignificant bit it takes to run the switching software.)

A decent graphics workload only affects battery life if you are using the battery while you are running it. Usually, whenever you are running that kind of workload you are not on battery. Browsing, word processing, music and movies are the vast majority of what the average consumer is going to be doing while on battery, and none of that requires a dGPU. If you start gaming, power consumption is going to be so high that an extra 10 watts isn't really going to matter (a load of 50 watts vs a load of 65 watts on a 60 watt-hour battery will last 72 minutes vs 55 minutes, i.e. 17 minutes longer).
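
If anyone wants to check that runtime math, here's a trivial sketch; the 60 Wh battery and 50 W/65 W loads are just the example numbers above.

```python
# Quick check of the battery-life arithmetic in the post above.
def runtime_minutes(battery_wh: float, load_w: float) -> float:
    """Runtime in minutes for a given average power draw."""
    return battery_wh / load_w * 60

BATTERY_WH = 60
igpu_only = runtime_minutes(BATTERY_WH, 50)    # ~72 minutes
with_dgpu = runtime_minutes(BATTERY_WH, 65)    # ~55 minutes

print(f"50 W load: {igpu_only:.0f} min, 65 W load: {with_dgpu:.0f} min, "
      f"difference: {igpu_only - with_dgpu:.0f} min")
```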
 

Sweepr

Diamond Member
May 12, 2006
5,148
1,143
136
What is the GT 640M/GT 650M's power consumption under load? Unless I'm mistaken, it should draw considerably more power (coupled with a 35/45W quad-core IB) than a 47W quad-core Haswell with GT3e.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
What is the GT 640M/GT 650M's power consumption under load? Unless I'm mistaken, it should draw considerably more power (coupled with a 35/45W quad-core IB) than a 47W quad-core Haswell with GT3e.

It's hard to find concrete power consumption figures for mobile chips, but probably around a 35 watt TDP.

rmbp 15" has a 85 watt charger, an overclocked 650m and a 45 watt cpu (not to mention a fairly power hungry screen) and it can game with minor sucking of battery power.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,787
136
Browsing, word processing, music and movies are the vast majority of what the average consumer is going to be doing while on battery, and none of that requires a dGPU. If you start gaming, power consumption is going to be so high that an extra 10 watts isn't really going to matter (a load of 50 watts vs a load of 65 watts on a 60 watt-hour battery will last 72 minutes vs 55 minutes, i.e. 17 minutes longer).

I think the whole point here is thermals. It determines what kind of systems can be designed. Since GT3e is still not at a bleeding-edge level of performance, it's for those in-between systems that want to be thin and light but game decently.

Also, with integrated you can get benefits beyond what TDP differences indicate. The reason is that integrated systems can offer greater power management functions than totally separate systems can.

I also wonder if we'll see a new era where iGPUs aren't always at a total performance disadvantage compared to discrete. See technologies like hUMA and Instant Access. Even John Carmack himself once mentioned that integrated may someday offer advantages that discrete can't. He may have been referring to these technologies.
 

Ventanni

Golden Member
Jul 25, 2011
1,432
142
106
The GT3e doesn't have to be faster than the 650M. It just has to be fast enough to drive the rMBP display at a level of performance suitable to Apple's needs, which is something the HD 4000 cannot do. Even if the GT3e isn't cheaper than a discrete solution, it's certainly less complex, and that's a win for Apple.
 

StinkyPinky

Diamond Member
Jul 6, 2002
6,977
1,276
126
Nvidia must be worried. I really don't see any reason to use anything other than Intel for low-end and mid-range laptops, assuming these claims are true, and companies are prone to exaggerating after all.

Man, Intel are really starting to flex their muscle. I'd love for them to move into the dedicated graphics market for desktop.
 

Blitzvogel

Platinum Member
Oct 17, 2010
2,012
23
81
The names all make sense, I think, since they all revolve around visual terms in the graphics name or, in Nvidia's case, the company name. Iris is a pretty decent name. It's only four letters and two syllables. I'll admit, though, it's a bit worrisome to see Intel advance so quickly against AMD in this regard, but AMD does have price on their side. Getting GT3e would mean spending a pretty penny that arguably would be better put towards a good i5 and much better dedicated graphics.
 
Last edited: