Intel to Supply Apple with Special High-End Haswell Processors for MacBook Pro [MacRu

Page 2 - AnandTech Forums

Sweepr

Diamond Member
May 12, 2006
Please explain how the CPU and GPU are using anything close to 90 watts when the entire notebook is running on an 85 watt adapter.

First of all, that number was quoted by AnandTech, not me.

AnandTech said:
The GT 650M is a 45W TDP part, pair that with a 35 - 47W CPU and an OEM either has to accept throttling or design a cooling system that can deal with both. Iris Pro on the other hand has its TDP shared by the rest of the 47W Haswell part. From speaking with OEMs, Iris Pro seems to offer substantial power savings in light usage (read: non-gaming) scenarios. In our 15-inch MacBook Pro with Retina Display review we found that simply having the discrete GPU enabled could reduce web browsing battery life by ~25%. Presumably that delta would disappear with the use of Iris Pro instead.



AnandTech said:
With a discrete GPU, like the 650M, you end up with an extra 45W on top of the CPU’s TDP. In reality the host CPU won’t be running at anywhere near its 45W max in that case, so the power savings are likely not as great as you’d expect but they’ll still be present.

A regular GT 650M has a 45W TDP, and Apple's GT 650M (clocked higher than a 50W-TDP GTX 660M) likely draws even more power, contributing to the unimpressive 2.33-hour battery life under heavy workloads on the current 15'' rMBP. Anand is very clear here: the current CPU+dGPU solution won't be running at rated TDP simultaneously (hence your 85W power adapter), but it's still more power hungry than a single 47W chip would be. It doesn't take a genius to figure out that Iris Pro could deliver better battery life; the question is how much.
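As a rough sanity check on that 2.33-hour figure, the implied average system draw follows straight from the battery spec. A minimal sketch, assuming the 95Wh battery Apple lists for the 2012 rMBP 15'':

```python
# Average system power draw implied by battery capacity and runtime.
# Assumption: the 2012 15" rMBP ships with a 95 Wh battery.
battery_wh = 95.0   # battery capacity, watt-hours
runtime_h = 2.33    # reported heavy-workload runtime, hours

avg_draw_w = battery_wh / runtime_h
print(f"Implied average draw: {avg_draw_w:.1f} W")  # ~40.8 W
```

An average of ~41W over the full run is of course compatible with much higher peak draw, since even "heavy workload" runs include lighter phases.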
 

Enigmoid

Platinum Member
Sep 27, 2012
A regular GT 650M has a 45W TDP, and Apple's GT 650M (clocked higher than a 50W-TDP GTX 660M) likely draws even more power, contributing to the unimpressive 2.33-hour battery life under heavy workloads on the current 15'' rMBP. Anand is very clear here: the current CPU+dGPU solution won't be running at rated TDP simultaneously (hence your 85W power adapter), but it's still more power hungry than a single 47W chip would be. It doesn't take a genius to figure out that Iris Pro could deliver better battery life; the question is how much.

The problem is that AT has made several mistakes with regards to mobile GPUs.

1. They have not researched the 660M, which has a boost clock and runs at 950/1250 MHz in almost every notebook it's in. (I have a 660M and can confirm this.)

2. Inability to actually analyze TDP ratings.

GTX 650: 64W TDP
HD 7750: 55W TDP

The problem is that the 650 generally consumes the same as, or slightly less than, the 7750.



Go to a higher-binned mobile card, drop the clocks to 900 MHz, and Apple's 650M isn't touching 50 or even 45 watts under gaming loads.

Also, in that test Anand is running the 650M in the MacBook Pro, so the CPU + GPU + rest of the system all fit within the 85-watt power envelope.

The thing is, TDP isn't power consumption. At a given TDP, actual power draw can fluctuate quite a bit.
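The downclocking point can be made concrete with the usual first-order dynamic-power model, where power scales roughly with frequency times voltage squared. The clocks and voltages below are illustrative assumptions, not measured values for Apple's part:

```python
# First-order dynamic power model: P scales with frequency * voltage^2.
# All figures below are illustrative assumptions, not measurements.
def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Estimate dynamic power after a frequency/voltage change."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Hypothetical higher-binned mobile part: 64 W at 1100 MHz / 1.00 V,
# downclocked to 900 MHz with a modest undervolt to 0.90 V.
estimate = scaled_power(64.0, 1100.0, 1.00, 900.0, 0.90)
print(f"Estimated draw at 900 MHz: {estimate:.1f} W")  # ~42.4 W
```

Under these assumed numbers the estimate lands below both the 45W and 50W marks, which is the shape of the argument being made above.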
 

Rakehellion

Lifer
Jan 15, 2013
Special version? Interesting. Even 5-10% better performance than i7-4950HQ would put it dangerously close to Apple's special GT650M (clocked higher than a regular GTX660M) in gaming performance.

"Dangerously close" still offers worse performance. Than the last generation. For a newer machine, you'll want improved performance.

I bet few people buy a >$2000 rMBP to play games when you have much better options (for that) in the Windows space.

Most people buy a Retina MacBook Pro because it's 7/10 of an inch thick with a 5 million pixel display. A decent GPU is a given.
 

Sweepr

Diamond Member
May 12, 2006
"Dangerously close" still offers worse performance. Than the last generation. For a newer machine, you'll want improved performance.

They should stick with a GT 750M-like dGPU this year and jump on the iGPU bandwagon with Broadwell GT4 next year; then you'd probably have the best of both worlds (improved GPU performance over last year plus integrated-graphics efficiency). But according to the latest rumours it's almost a given that they're going Iris Pro this year. The cost of Iris Pro versus regular Haswell + dGPU is probably similar (if Iris Pro isn't more expensive), so it's a performance versus power-consumption/area choice for Apple. They wouldn't dump dGPUs if it weren't a worthy trade-off in some way.
 

blackened23

Diamond Member
Jul 26, 2011
Let's not discount Apple's intelligence in this matter: they would not go discrete-less unless it were somehow worth the trade-off. Additionally, Apple has not and never will tailor their machines towards gaming performance. In terms of general use, I think the GT3e does indeed come very close to the 650M. There are situations during gaming where the 650M will be better, but I don't think that's entirely relevant for an Apple machine; gaming benchmarks don't align with what Apple is going for in terms of graphics performance on the rMBP. I have to think that Apple has the business smarts to use GT3e (as mentioned above) only if it is worthwhile.

The only remaining question is whether both the 13- and 15-inch rMBPs are using GT3e, and I'd assume so, since there has been no mention of an NVIDIA design win for the 15-inch model.
 

Idontcare

Elite Member
Oct 10, 1999
Let's not discount Apple's intelligence in this matter - they would not go discrete-less unless it were somehow worth the trade-off.

Not only that, but they also don't have the typical "margins scraping" business mentality, meaning they aren't about to cut out the discrete GPU simply to save themselves a few bucks on the BoM while drastically undercutting the user's experience because of it.

They didn't push retina display because they wanted to save a few bucks, they sure as hell aren't going to make you wish you hadn't bought a high-rez screen product from them because they skimped on the GPU that powers it either.

It really isn't Apple's style. That would be DELL's or HP's style, but not Apple.
 

Enigmoid

Platinum Member
Sep 27, 2012
Well, they're doing it, so deal with it or take a look at the competition. There are plenty of cheaper notebooks running fast dGPUs. Good luck with Optimus and battery life once a graphics intensive workload running @ retina resolutions kicks in.

What workloads? Besides gaming, there are very few graphics-intensive workloads heavy enough to even require an HD 4600 that the average consumer is ever going to run. And anyone working seriously knows that you don't run intensive workloads on battery, and would very likely be willing to trade a little battery life for 40%+ more performance.
 

JDG1980

Golden Member
Jul 18, 2013
Most of the people commenting here seem to think that Iris Pro (GT3e) is slightly inferior to the discrete NVIDIA GT650M. From what I can tell, these conclusions seem to be mostly based on Anandtech's review of Iris Pro from last month. But I think that the significance of these benchmarks - especially the gaming benchmarks - is being overblown. The tests were all run on Windows, and most involved DirectX games. The gaming benchmarks were run at low resolutions (1366x768), presumably because higher resolutions would have resulted in a completely unplayable slideshow.

But we cannot necessarily assume that performance with heavy loads (AAA games) at a low resolution equates to performance with light loads (UI rendering) at a high resolution - and the latter is what Apple is obviously targeting. Since the GT3e was basically designed because Apple wanted it, isn't it reasonable to assume that it was tuned to meet Apple's workload?

In contrast, both the hardware and (perhaps more importantly) the drivers on NVIDIA and AMD consumer-grade solutions are designed to produce high framerates in games. In fact, there are often game-specific hacks included, both on the driver side and by the game coders. Intel doesn't hack their drivers nearly as much, and game developers don't care if Intel integrated solutions have low framerates.

Remember that Apple's window manager uses OpenGL for compositing, not DirectX. It seems likely that optimization efforts in the driver were therefore targeted towards OpenGL. The compute benchmarks in LuxMark and Fluid Simulation, both of which use OpenCL and beat GT650M by a considerable margin, provide some evidence for this guess. But I suppose we'll have to wait and see.
 

sushiwarrior

Senior member
Mar 17, 2010
Since the GT3e was basically designed because Apple wanted it, isn't it reasonable to assume that it was tuned to meet Apple's workload?

No, because all Intel did was take a poorly performing base architecture, double it a few times, add an eDRAM band-aid, and call it amazing. Iris Pro is just trying to build a bigger building out of flawed building blocks. It is certainly not "tuned".
 

Enigmoid

Platinum Member
Sep 27, 2012
Remember that Apple's window manager uses OpenGL for compositing, not DirectX. It seems likely that optimization efforts in the driver were therefore targeted towards OpenGL. The compute benchmarks in LuxMark and Fluid Simulation, both of which use OpenCL and beat GT650M by a considerable margin, provide some evidence for this guess. But I suppose we'll have to wait and see.

Fluid simulation is heavily cache dependent.

Intel's IGP architecture:

Pros

- Relatively efficient
- Good tessellation/geometry
- Good compute
- Decent DP
- Quick Sync


Cons

- Large die area
- Poor texel performance
- Poor AA performance (4x AA halves framerates in synthetics)
- Poor scaling between HD 4600 and HD 5200: never better than 83% in the AT review, averaging ~70-75% (especially considering Crystalwell, and that the HD 5200 runs 50 MHz faster)
- Intel tends to have driver problems, and fixes can take a while
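For the scaling point, the efficiency figure is just observed speedup divided by ideal speedup. The 20 vs. 40 EU counts are Intel's Haswell GT2/GT3 configurations; the frame rates below are hypothetical placeholders, not numbers from the AT review:

```python
# Scaling efficiency of HD 5200 (GT3e, 40 EUs) over HD 4600 (GT2, 20 EUs):
# observed speedup divided by the ideal EU-count speedup.
def scaling_efficiency(fps_gt2, fps_gt3e, eu_gt2=20, eu_gt3e=40):
    ideal = eu_gt3e / eu_gt2        # 2.0x if performance tracked EU count
    observed = fps_gt3e / fps_gt2
    return observed / ideal

# Hypothetical example: 30 fps on GT2, 45 fps on GT3e.
print(f"{scaling_efficiency(30.0, 45.0):.0%}")  # 1.5x observed / 2.0x ideal = 75%
```

Anything well below 100% here means the doubled EUs (plus the eDRAM) are not being fully fed, which is the criticism being made above.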
 

Cerb

Elite Member
Aug 26, 2000
Lol @ other vendors. As if there is that much of a market for $1000+ laptops outside Apple. Intel has become Apple's bitch.
Hardly. OEMs have gotten custom SKUs from Intel, AMD, NVIDIA, and ATI in the past. In some cases they got special higher-performing parts; in others, lower-TDP parts (still common); in others, lower-cost parts (like the <2GHz Phenom and Phenom II CPUs); and in yet others, better perf/W (ULV mobile may be new for consumers, but it's been a thing for big and/or vertical-market business users for a good 10 years now). There's a healthy market on the non-Apple side of things, so long as value is offered (that was the problem with the initial Ultrabooks: $1000+ for a $600 computer made <1/4" thinner).

The big OEMs' current unwillingness to put in the effort to divorce themselves from reliance on Windows is giving Apple management a high no drug can compare to. Take the MBA, for instance: soon, Windows will have baked-in support for Haswell's C7 state, better than the OEM utilities' support, but it'll be too little too late, and many users would prefer OS X over Windows 8.x anyway. But you're very much misunderstanding what's going on. If HP had wanted specially binned SKUs, they'd have gotten them, too. But Apple wanted them, and was willing to make whatever deal with Intel it needed to get them exclusively. Apple also has a solid idea of what many of their users need and want, which now includes a balance of scalar CPU performance, GPU performance, GPGPU/SIMD performance, and battery life, across a mostly known subset of applications. That makes it easier for them to spec hardware they know will satisfy most users.

That much energy stored in one place can cause fires.
Yes, but that's related not to Wh but to energy density, storage, and the use of unsafe chemistries that require active protection, not to total battery capacity. Cell phones with <5Wh batteries, for instance, can start a fire just fine. As technical justification, it's stupid. Don't let politicians fool you into thinking any good is being done that way. All lithium-ion batteries not known to use one of a handful of safe chemistries should be treated with equal care.
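To put the capacity argument in perspective, converting watt-hours to joules shows that even a small phone battery stores substantial energy; the capacities below are typical figures assumed for illustration:

```python
# Stored energy: 1 Wh = 3600 J. Even a small phone battery stores a lot,
# which is the point: capacity alone doesn't determine fire risk.
def stored_joules(capacity_wh):
    return capacity_wh * 3600.0

phone_wh = 5.0     # typical early-2010s phone battery (assumed)
laptop_wh = 95.0   # 15" rMBP battery
print(f"Phone:  {stored_joules(phone_wh) / 1000:.0f} kJ")   # 18 kJ
print(f"Laptop: {stored_joules(laptop_wh) / 1000:.0f} kJ")  # 342 kJ
```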

But, a quick search yielded this, so yes, it's real:
http://www.iata.org/whatwedo/cargo/dgr/Documents/Lithium-Battery-Guidance-2013-V1.1.pdf
 

ShintaiDK

Lifer
Apr 22, 2012
Removing the dGPU means you get more space, less weight, and less battery drain. So you can use that to make it slimmer, to make it slimmer with a longer battery, or just for a plain longer battery. The middle option is usually what Apple likes. Cost benefit is secondary for Apple.
 

SunRe

Member
Dec 16, 2012
I just don't get why the other computer manufacturers can't get their act together and release a proper mobile workstation.

I don't want it cheaper than the MacBook Pro; I just want an alternative, and the broader Thunderbolt support that this would bring.

But why this is not happening is beyond me. It's simply unbelievable that a machine copying the MacBook Pro (I'm not even saying innovating over it!) doesn't exist yet...

This collaboration between Intel and Apple on Thunderbolt and the i7 HQ Haswell shows how determined they are to get things right. Why the others simply don't follow them baffles me.
 

blackened23

Diamond Member
Jul 26, 2011
Pretty much. The likes of Dell, HP, and Lenovo are completely focused on bottom-dollar junk machines. I would love for a proper competitor to the rMBP to exist, but sadly, it doesn't. It seems like all of the Wintel ultrabook manufacturers are merely obsessed with bottom-dollar costs instead of actual quality devices; and as the market has shown, people will pay more for quality.

I think there has been a shift in the market in the past 5 years or so. A decade ago, bottom dollar mattered. Now? Consider how many units high-end phones, tablets, and computers sell: the Samsung Galaxy S4 (nearly $650 unlocked), the iPhone ($650), the iPad ($500), MacBook Pros (north of $1500). People are buying quality devices in droves; the market WILL pay for quality. The only question is when ultrabook manufacturers will step up and create a quality device.
 

Cerb

Elite Member
Aug 26, 2000
The only question is, when will ultrabook manufacturers step up and create a quality device.
Hopefully soon. They relatively quickly got the message that people wouldn't pay only for a smaller machine. But, higher cost for a better machine means risk (it can also mean higher margins), so I wouldn't hold my breath.

IMV, competition for the MBA is just a matter of time, but for the MBP, I don't know. Fujitsu and Toshiba often try, but they also often don't sell their cool machines outside of Japan.
 

insertcarehere

Senior member
Jan 17, 2013
IMV, competition for the MBA is just a matter of time, but for the MBP, I don't know. Fujitsu and Toshiba often try, but they also often don't sell their cool machines outside of Japan.

You might want to look at the new Vaio Pros.
 

Andrmgic

Member
Jul 6, 2007
I can tell you from experience that under extremely high load, the 2012 rMBP does in fact consume more power than its power adapter provides.

The battery drains slowly when connected to a power source, but it does drain.
 

Rakehellion

Lifer
Jan 15, 2013
Well, they're doing it, so deal with it or take a look at the competition. There are plenty of cheaper notebooks running fast dGPUs. Good luck with Optimus and battery life once a GPU intensive workload running @ retina resolutions kicks in.

The 13" has integrated graphics. The 15" has integrated and discrete with Optimus.

A MacBook pro is not in competition with "cheaper notebooks."
 

Enigmoid

Platinum Member
Sep 27, 2012
You might want to look at the new Vaio Pros.

Sony consistently screws up their products. They are very nice, but here's what I've noticed about them:

1. Battery is generally poor for mainstream products (rarely do you see batteries over 50 watt-hours in their cheaper lines; seriously, they quote their S series at "3 hours 45 minutes", though a sheet battery is available for $150).

2. Design: bright, tacky plastic that looks like it was designed for a stormtrooper with a paint job (not their higher-end models). They seem to have largely fixed this.

3. Upgrades are pricey.

4. Obsession with fixed RAM (Vaio S), single-channel RAM (Vaio Pro), or slow RAM (the Vaio S has 1333 MHz RAM on a $1000 notebook).

Nice things about Sony:

1. Nice displays (especially for the more budget options).

2. Option for a dGPU.

3. Form factor is nice.

I can tell you from experience that under extremely high load, the 2012 rMBP does in fact consume more power than its power adapter provides.

The battery drains slowly when connected to a power source, but it does drain.

Possibly drawing 87-88ish watts.
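If the machine really sustains ~88W on an 85W adapter, the drain rate follows directly. A minimal sketch, again assuming the 95Wh battery and treating the 88W figure as a hypothetical:

```python
# Battery drain when sustained load exceeds the adapter's output.
# Assumptions: 85 W adapter, ~88 W sustained draw, 95 Wh battery.
adapter_w = 85.0
system_draw_w = 88.0
battery_wh = 95.0

deficit_w = system_draw_w - adapter_w      # 3 W supplied by the battery
hours_to_empty = battery_wh / deficit_w
print(f"Deficit: {deficit_w:.0f} W, ~{hours_to_empty:.0f} h to full drain")
```

A drain measured in tens of hours is consistent with the "drains slowly, but it does drain" observation above.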
 

Rakehellion

Lifer
Jan 15, 2013
I can tell you from experience that under extremely high load, the 2012 rMBP does in fact consume more power than its power adapter provides.

The battery drains slowly when connected to a power source, but it does drain.

What if you remove the battery and run it on the power adapter alone? Will it suddenly shut off?