Apple dumps NVIDIA from the MacBook Pro

sm625

Diamond Member
May 6, 2011
8,172
137
106
I can't believe they are charging $500 to upgrade from Intel Iris to a lowly M370X. They are basically shaving $100 off the CPU cost and adding a $200 GPU, for a total net cost of $100, and gouging the heck out of the consumer for $400.
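The arithmetic behind that complaint, spelled out. Note the $100 CPU saving and $200 GPU cost are the poster's own guesses, not known Apple component costs:

```python
# Sketch of the poster's pricing argument; the component figures
# ($100 CPU saving, $200 GPU cost) are the poster's estimates,
# not confirmed Apple numbers.
upgrade_price = 500      # what Apple charges for the step-up configuration
cpu_cost_delta = -100    # claimed saving on the CPU in the dGPU model
gpu_cost_delta = 200     # claimed cost of adding the M370X

net_component_cost = cpu_cost_delta + gpu_cost_delta   # 100
claimed_markup = upgrade_price - net_component_cost    # 400

print(net_component_cost, claimed_markup)  # 100 400
```

As the next reply points out, this ignores the faster CPU and larger SSD bundled into the same price step, so the real markup would be smaller.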
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
sm625 said:
I can't believe they are charging $500 to upgrade from Intel Iris to a lowly M370X. They are basically shaving $100 off the CPU cost and adding a $200 GPU, for a total net cost of $100, and gouging the heck out of the consumer for $400.

You do realize that the more expensive one has a faster CPU and double the storage space, right? Plus the better GPU.

The cost is not out of line.
 

AdamK47

Lifer
Oct 9, 1999
15,207
2,838
126
If people are willing to pay, and Apple can get away with it, they'll gouge the heck out of it.
 

Glo.

Diamond Member
Apr 25, 2015
5,705
4,549
136
Does anyone have any information on the new GPU? Core clocks, core count, anything?
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
There was a leak in December; if that's to be trusted, the M370X would deliver around what a cut-down Tonga would.

[attached image: leaked spec chart]
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
Apple pushes OpenCL, so it was to be expected.

Don't know what Apple will do when NV shows them shiny new slides of Pascal claiming X FLOPS in FP16. They'll probably laugh their *sses off and dump them from their product stack, again.
 
Aug 11, 2008
10,451
642
126
With the great efficiency of Maxwell, I really don't understand why Apple went back to AMD, unless, like others said, it has to do with OpenCL vs. CUDA. Either that, or there is something political going on behind the scenes.
 

smackababy

Lifer
Oct 30, 2008
27,024
79
86
sm625 said:
I can't believe they are charging $500 to upgrade from Intel Iris to a lowly M370X. They are basically shaving $100 off the CPU cost and adding a $200 GPU, for a total net cost of $100, and gouging the heck out of the consumer for $400.

It is a $300 upgrade from 256GB to 512GB SSD...
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
Maxwell's efficiency is in gaming and not in GPGPU (where it drastically bombs in performance to stay within its alleged gaming TDP), which probably has something to do with it. In FP32, Maxwell is roughly 0.75x the performance for 0.5x the power consumption versus Kepler.
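Taken at face value, those two ratios still imply a perf/watt advantage for Maxwell; a quick sketch using the poster's rough numbers (claimed ratios, not measured data):

```python
# Perf-per-watt implied by the poster's rough figures:
# 0.75x Kepler's FP32 throughput at 0.5x Kepler's power draw.
perf_ratio = 0.75    # Maxwell FP32 perf relative to Kepler (claimed)
power_ratio = 0.5    # Maxwell power draw relative to Kepler (claimed)

perf_per_watt_gain = perf_ratio / power_ratio
print(perf_per_watt_gain)  # 1.5 -> still ~1.5x Kepler's FP32 perf/watt
```

So even by the poster's own numbers, Maxwell comes out ahead on efficiency; the argument is really about absolute throughput and consistency, not perf/watt.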
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
PPB said:
Maxwell's efficiency is in gaming and not in GPGPU (where it drastically bombs in performance to stay within its alleged gaming TDP), which probably has something to do with it. In FP32, Maxwell is roughly 0.75x the performance for 0.5x the power consumption versus Kepler.

nVidia doesn't support OpenCL 2.0, whereas AMD and Intel do. This was most likely a big reason as well.
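Since the thread hinges on which OpenCL version each vendor exposes, here is a minimal sketch of how an application might gate features on the reported version. The version strings below are illustrative only; a real application would read the string via clGetDeviceInfo with CL_DEVICE_VERSION, which the OpenCL spec defines as "OpenCL <major>.<minor> <vendor-specific information>".

```python
import re

def opencl_version(device_version_string):
    """Parse the 'OpenCL <major>.<minor> ...' string that
    clGetDeviceInfo(..., CL_DEVICE_VERSION, ...) returns."""
    m = re.match(r"OpenCL (\d+)\.(\d+)", device_version_string)
    if not m:
        raise ValueError("not an OpenCL device version string")
    return (int(m.group(1)), int(m.group(2)))

# Illustrative strings; actual contents are driver-dependent.
print(opencl_version("OpenCL 2.0 AMD-APP (1800.5)"))  # (2, 0)
print(opencl_version("OpenCL 1.1 CUDA 7.0.28"))       # (1, 1)

# Gate an OpenCL 2.0 feature (e.g. shared virtual memory):
print(opencl_version("OpenCL 1.1 CUDA 7.0.28") >= (2, 0))  # False
```

Tuple comparison gives the right ordering for (major, minor) pairs, which is all the gating logic needs.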
 

Glo.

Diamond Member
Apr 25, 2015
5,705
4,549
136
What the hell, exactly, is the R9 M290 (yes, there is no X in it), and the M370X?
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
PPB said:
Maxwell's efficiency is in gaming and not in GPGPU (where it drastically bombs in performance to stay within its alleged gaming TDP), which probably has something to do with it. In FP32, Maxwell is roughly 0.75x the performance for 0.5x the power consumption versus Kepler.

People are still recycling this old argument, which doesn't pan out.

http://www.tomshardware.com/reviews/geforce-gtx-750-ti-review,3750-16.html

http://www.tomshardware.com/reviews/geforce-gtx-750-ti-review,3750-17.html

The 750 Ti is quite competitive vs. Bonaire and Pitcairn at SP OpenCL compute.

All while using solidly less power.

[attached image: peak power consumption chart]

There is no bomb in performance or clocks. (Tom's 970/980 review is plagued by problems, and they later redacted part of their data, which was brought up many times on these forums.) Also, Ryan Smith (post history) stated that AT's reference 980 throttled back but never went over TDP for AT's compute benchmarks.
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
With the great efficiency of Maxwell, I really don't understand why Apple went back to AMD, unless, like others said, it has to do with OpenCL vs. CUDA. Either that, or there is something political going on behind the scenes.

Nothing political, just money.

The performance/watt of Maxwell isn't as great in compute as it is in graphics. Apple doesn't care about graphics at all, since you can't really game on Macs anyway, so it's not worth the premium.
 

PPB

Golden Member
Jul 5, 2013
1,118
168
106
Sorry, not talking about benchmarks here but real performance. CUDA devs had to rewrite quite big portions of their software (without tech papers for Maxwell v2, to their dismay) to accommodate the new memory and cache system and to make Maxwell at least competitive with Kepler products. For each piece of software you show a benchmark of doing "OK" performance, I can show real-world software that either has abysmal Maxwell performance or has to be reworked to make Maxwell competitive. Right now lots of prosumers are still on Kepler because Maxwell performance is inconsistent, and they are still waiting for their entire production software stack to accommodate the changes in Maxwell. Seeing as Apple is all about a tight circle between software and hardware, inconsistency, or rewriting your software around one generation of products, is undesirable. Why bother, if the competition has performance consistency in the API you really use (OpenCL)?

Power efficiency is a gaming fad right now. Translating it into the prosumer world and thinking the same argument will hold is just downright silly.
 

Enigmoid

Platinum Member
Sep 27, 2012
2,907
31
91
Not sure why that matters to Apple. Moving to AMD cuts out CUDA completely.

As an aside, can you PM me some links for those reworks? I'm curious.
 

Glo.

Diamond Member
Apr 25, 2015
5,705
4,549
136
It must have DP 1.3. I don't see a possibility to push 5K at 60Hz over a single signal otherwise.
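That claim can be sanity-checked with back-of-the-envelope bandwidth numbers (nominal 4-lane payload rates after 8b/10b coding; blanking overhead is ignored, which only flatters DP 1.2; and note that contemporary 5K displays worked around the limit by using two DP 1.2 streams):

```python
# Rough check of why 5K at 60Hz strains a single DisplayPort link.
width, height, refresh, bpp = 5120, 2880, 60, 24  # 5K, 60Hz, 24-bit color

pixel_gbps = width * height * refresh * bpp / 1e9  # active pixel data only
dp12_gbps = 17.28   # DisplayPort 1.2: HBR2, 4 lanes, post-8b/10b payload
dp13_gbps = 25.92   # DisplayPort 1.3: HBR3, 4 lanes, post-8b/10b payload

print(round(pixel_gbps, 2))      # 21.23
print(pixel_gbps <= dp12_gbps)   # False: exceeds a single DP 1.2 link
print(pixel_gbps <= dp13_gbps)   # True:  fits in a single DP 1.3 link
```

Even before blanking overhead, ~21.2 Gbit/s of pixel data overflows one DP 1.2 link, so a single-cable 5K/60 signal does need DP 1.3 (or the dual-stream DP 1.2 trick).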
 

Stuka87

Diamond Member
Dec 10, 2010
6,240
2,559
136
Enigmoid said:
People are still recycling this old argument, which doesn't pan out.

http://www.tomshardware.com/reviews/geforce-gtx-750-ti-review,3750-16.html

http://www.tomshardware.com/reviews/geforce-gtx-750-ti-review,3750-17.html

The 750 Ti is quite competitive vs. Bonaire and Pitcairn at SP OpenCL compute.

All while using solidly less power.

[attached image: peak power consumption chart]

There is no bomb in performance or clocks. (Tom's 970/980 review is plagued by problems, and they later redacted part of their data, which was brought up many times on these forums.) Also, Ryan Smith (post history) stated that AT's reference 980 throttled back but never went over TDP for AT's compute benchmarks.

You are comparing desktop chips to mobile ones? The 750 Ti is not a mobile chip; its performance does not represent a 750M.

Also, nVidia's OpenCL performance does not compare to AMD's, and the fact that they JUST NOW started to support OpenCL 1.2 (which AMD supported YEARS AGO) is another issue. Apple has been pushing for 2.0 support, which Intel and AMD have.
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
Apple really pushes OpenCL? Final Cut Pro is about the only major product from Apple that uses OpenCL, from what I have seen. I'd guess the real reason is AMD gave Apple a deal they couldn't resist. You know, the reason AMD's margins will remain 50% lower than their competition's.
 

zlatan

Senior member
Mar 15, 2011
580
291
136
With the great efficiency of Maxwell, I really don't understand why Apple went back to AMD, unless, like others said, it has to do with OpenCL vs. CUDA.

It's simple. The 15" MacBook Pro is a "professional" product, just like the iMac and Mac Pro. The software partners are now optimizing their products for GCN, and Apple just uses the ideal GPU architecture in the new MacBook Pro for their needs. In this case they don't need to optimize for Maxwell and Kepler, just for Intel Gen7.5/Gen8 and AMD GCN. Focusing on just two base GPU architectures is much simpler for them.
 

Glo.

Diamond Member
Apr 25, 2015
5,705
4,549
136
You know why Apple could choose AMD over Nvidia? First is the cash. Second is the... FreeSync technology. Retina 4K and 5K Thunderbolt Displays...