(MacRumors) Apple May Launch 27-Inch Retina iMac With AMD Graphics Next Month

Bateluer

Lifer
Jun 23, 2001
27,730
8
0
I also wonder how much GPU grunt it'll come with. New high powered card from AMD maybe?

Odds are, it'll top out at AMD's current mid-range SKU, possibly with a CrossFire option. Apple almost never goes with high-end GPUs, and I don't think they've ever had a high-end GPU in any iMac SKU.

R7 270, or maybe R9 285 if really lucky. Most likely something in the R7 26x lineup.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Odds are, it'll top out at AMD's current mid-range SKU, possibly with a CrossFire option. Apple almost never goes with high-end GPUs, and I don't think they've ever had a high-end GPU in any iMac SKU.

R7 270, or maybe R9 285 if really lucky. Most likely something in the R7 26x lineup.

AFAIK, it needs DP1.3 though to drive that many pixels. Nothing right now supports that.
 

Gloomy

Golden Member
Oct 12, 2010
1,469
21
81
Full Tonga at most I think, binned and clocked a little low. I don't think Apple would put anything over 150W in an iMac.

Either that, or a totally new chip.
 
Feb 19, 2009
10,457
10
76
Apple did put the Hawaii FirePro reference blower in their Mac Pro, which is ridiculous since it's 250W+ and hot/noisy.

I still can't believe people in charge at AMD gave the OK for that terrible blower to be used in such an expensive FirePro product.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
AFAIK, it needs DP1.3 though to drive that many pixels. Nothing right now supports that.

and nothing right now supports the 5K Dell Ultrasharp, so they get around that limitation by tiling with 2x DP1.2 connections

and considering Apple's complete control over their hardware and software, it really wouldn't be hard for them to get 5K working flawlessly without DP1.3
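
For what it's worth, a back-of-the-envelope sketch of why one DP 1.2 link isn't enough for 5K@60 while two tiled links are. Figures are the nominal published link rates; blanking overhead is ignored, so real timings need a bit more headroom than shown:

```python
# Rough check of the 5K-over-DisplayPort bandwidth argument.
# Nominal figures only; blanking intervals are ignored.

def data_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed video payload rate in Gbit/s (no blanking overhead)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

five_k = data_rate_gbps(5120, 2880, 60)   # ~21.2 Gbit/s payload

# DP 1.2 HBR2: 4 lanes x 5.4 Gbit/s raw; 8b/10b coding leaves 80% for data
dp12_effective = 4 * 5.4 * 0.8            # 17.28 Gbit/s per link

print(f"5K@60 payload:          {five_k:.1f} Gbit/s")
print(f"One DP 1.2 link:        {dp12_effective:.2f} Gbit/s -> not enough")
print(f"Two tiled DP 1.2 links: {2 * dp12_effective:.2f} Gbit/s -> enough")
```

Which is exactly the trick the Dell 5K uses: the panel is driven as two 2560x2880 tiles, one per DP 1.2 link.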
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Apple did put the Hawaii FirePro reference blower in their Mac Pro, which is ridiculous since it's 250W+ and hot/noisy.

I still can't believe people in charge at AMD gave the OK for that terrible blower to be used in such an expensive FirePro product.

I didn't realise they used the ref blower? I thought they used their own cooling with the "wind tunnel" design.
 

Subyman

Moderator, VC&G Forum
Mar 18, 2005
7,876
32
86
I'd have to think Apple would switch over to Nvidia for the Mac Pro refresh. They'd instantly lose 200W of thermals if they went with 2 cards again. I can see Apple doing 4K, but why 5k+? Maybe so you can edit 4K in native res while still having room for the UI?
 

dacostafilipe

Senior member
Oct 10, 2013
772
244
116
If this is true, then given the release timing we'll need to see DP1.3 on some card from AMD.

http://www.macrumors.com/2014/09/29/retina-imac-27-amd/

I also wonder how much GPU grunt it'll come with. New high powered card from AMD maybe?

There are no drivers for the 290(X) under 10.9. Under the Yosemite 10.10 beta the Hawaii driver seems more or less stable, but it is not finished yet (framebuffers missing, no multi-monitor support with DVI, ...). My 290X is still waiting on the shelf :p

Same goes for Maxwell, where there are web beta drivers for 10.10 from nVidia, but they are not stable at the moment (screen corruption, freezes).

A lot can happen in 2-3 months, but I would not bet on a Hawaii/Tonga or even Maxwell GPU.
 

NTMBK

Lifer
Nov 14, 2011
10,245
5,035
136
Apple did put in the Hawaii Firepro reference blower in their Mac Pro, which is ridiculous since its 250W+ and hot/noisy.

Um, what? No, they really didn't. The Mac Pro has a single massive heatsink.


https://www.ifixit.com/Teardown/Mac+Pro+Late+2013+Teardown/20778
 

Galatian

Senior member
Dec 7, 2012
372
0
71
I'd have to think Apple would switch over to Nvidia for the Mac Pro refresh. They'd instantly lose 200W of thermals if they went with 2 cards again. I can see Apple doing 4K, but why 5k+? Maybe so you can edit 4K in native res while still having room for the UI?


Uh, no? You should read the tomshardware review of the Maxwell cards, especially the part about the power needed to drive those things. What NVidia did is not something magical; they just made sure Maxwell is extremely fast to switch between voltage states. That way they can claim less power used while gaming, but when you look at GPGPU tasks it is not more efficient than Kepler.

Apple wants to use AMD GPUs because of their OpenCL support.
 

ams23

Senior member
Feb 18, 2013
907
0
0
Uh, no? You should read the tomshardware review of the Maxwell cards, especially the part about the power needed to drive those things. What NVidia did is not something magical; they just made sure Maxwell is extremely fast to switch between voltage states. That way they can claim less power used while gaming, but when you look at GPGPU tasks it is not more efficient than Kepler

This statement is false. If you re-read Tom's updated article, you will see that the Gigabyte Windforce cards have a much higher power target of ~ 250w compared to other Maxwell cards that are using the reference spec power target of ~ 165w. So Maxwell's exceptional efficiency is for real.

http://forum.beyond3d.com/showpost.php?p=1876415&postcount=2363
 
Last edited:

Galatian

Senior member
Dec 7, 2012
372
0
71
This statement is false. If you re-read Tom's updated article, you will see that the Gigabyte Windforce cards have a much higher power limit of ~ 250w compared to other Maxwell cards that are using the reference spec power limit of ~ 165w. So Maxwell's exceptional efficiency is for real.


You are wrong...I'm not talking about the problem they had with their Gigabyte card...I'm referring to this:

"If the load is held constant, then the lower power consumption measurements vanish immediately. There’s nothing for GPU Boost to adjust, since the highest possible voltage is needed continuously. Nvidia's stated TDP becomes a distant dream. In fact, if you compare the GeForce GTX 980’s power consumption to an overclocked GeForce GTX Titan Black, there really aren’t any differences between them. This is further evidence supporting our assertion that the new graphics card’s increased efficiency is largely attributable to better load adjustment and matching."

My point stands...I think this actually should
be required reading for all AMD doomsayers that have been out in force ever since Maxwell came out. There is nothing magical about the card.
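
To put toy numbers on the load-adjustment argument (all figures here are invented for illustration, not measurements of any real card): a boost scheme that drops voltage in the gaps between bursts lowers the time-weighted average power, but a sustained compute load leaves it no gaps to exploit.

```python
# Toy model of why average power differs between bursty gaming loads
# and sustained GPGPU loads on a fast-switching boost scheme.
# All wattages and duty cycles below are invented for illustration.

def average_power(peak_w, idle_w, duty_cycle):
    """Time-weighted average power for a load active duty_cycle of the time."""
    return duty_cycle * peak_w + (1 - duty_cycle) * idle_w

# Gaming: frame pacing leaves gaps the boost logic can exploit (say 70% duty)
gaming = average_power(peak_w=250, idle_w=60, duty_cycle=0.7)   # 193 W average

# Sustained compute: no gaps, the card sits at peak voltage continuously
compute = average_power(peak_w=250, idle_w=60, duty_cycle=1.0)  # 250 W average
```

Same peak draw in both cases; only the measured average differs, which is the distinction the Tom's quote is drawing.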
 

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
You are wrong...I'm not talking about the problem they had with their Gigabyte card...I'm referring to this:

"If the load is held constant, then the lower power consumption measurements vanish immediately. There's nothing for GPU Boost to adjust, since the highest possible voltage is needed continuously. Nvidia's stated TDP becomes a distant dream. In fact, if you compare the GeForce GTX 980's power consumption to an overclocked GeForce GTX Titan Black, there really aren't any differences between them. This is further evidence supporting our assertion that the new graphics card's increased efficiency is largely attributable to better load adjustment and matching."

My point stands...I think this actually should
be required reading for all AMD doomsayers that have been out in force ever since Maxwell came out. There is nothing magical about the card.

I got a GTX 980, and it uses less power than my GTX 680.

Just stop the FUD already and accept reality. It's getting bothersome that we have to hear all this nonsense over and over again because of 1 single site.
 

ams23

Senior member
Feb 18, 2013
907
0
0
You are wrong...I'm not talking about the problem they had with their Gigabyte card...I'm referring to this:

"If the load is held constant, then the lower power consumption measurements vanish immediately. There's nothing for GPU Boost to adjust, since the highest possible voltage is needed continuously. Nvidia's stated TDP becomes a distant dream. In fact, if you compare the GeForce GTX 980's power consumption to an overclocked GeForce GTX Titan Black, there really aren't any differences between them. This is further evidence supporting our assertion that the new graphics card's increased efficiency is largely attributable to better load adjustment and matching."

My point stands...I think this actually should
be required reading for all AMD doomsayers that have been out in force ever since Maxwell came out. There is nothing magical about the card.

Again, you are incorrect. Please re-read the article. The GTX 980 reference spec card has much lower power consumption than the Gigabyte Windforce GTX 970 and 980 models. This is simply due to differences in power target set in the BIOS!

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-12.html

In fact, if you look at the average power consumed in the GPGPU "Torture Test", you will see that a GTX 980 reference spec model has MUCH lower power consumption than any Gigabyte Windforce GTX 970 or 980 model or Titan Black (look at the red vs. orange bar graphs below).

[Image: 103-Overview-Power-Consumption-Torture.png]
 
Last edited:

ShintaiDK

Lifer
Apr 22, 2012
20,378
145
106
DP 1.3 is certainly out of the question. We'll be long into 2015 before we even see the first device with it. It will simply be driven by DP 1.2 the same way as Dell's 5K, assuming it will be 5K.
 

ams23

Senior member
Feb 18, 2013
907
0
0
All current iMacs ship with Keplers; strange they didn't simply replace them with Maxwell.

Technically the two most affordable 21.5" iMac variants ship with Intel HD graphics. The 27" iMac variants ship with Kepler graphics, and same with the most expensive 21.5" iMac variant too. So the bulk of iMac shipments are likely still Intel HD graphics when all is said and done.
 

Galatian

Senior member
Dec 7, 2012
372
0
71
Again, you are incorrect. Please re-read the article. The GTX 980 reference spec card has much lower power consumption than the Gigabyte Windforce GTX 970 and 980 models. This is simply due to differences in power target set in the BIOS!

http://www.tomshardware.com/reviews/nvidia-geforce-gtx-980-970-maxwell,3941-12.html

In fact, if you look at the average power consumed in the GPGPU "Torture Test", you will see that a GTX 980 reference spec model has MUCH lower power consumption than any Gigabyte Windforce GTX 970 or 980 model or Titan Black (look at the red vs. orange bar graphs below).

[Image: 103-Overview-Power-Consumption-Torture.png]


Are we reading the same article?

"Note that the GeForce GTX 980's stress test power consumption is actually a few watts lower than the gaming result. This is likely due to throttling that kicks in when we hit the thermal ceiling."
 

Abwx

Lifer
Apr 2, 2011
11,030
3,665
136
I got a GTX980, and it uses less pwoer than my GTX680.

Just stop the FUD already and accept reality. Its getting bothersome that we have to hear all this nonsense over and over again because of 1 single site.

That your 980 uses less power than the 680 doesn't make him wrong; read the quoted sentence carefully.
 

jpiniero

Lifer
Oct 1, 2010
14,656
5,280
136
btw, the current 27" iMac comes with the mobile versions, i.e. the 770M and 780M. So the 970M and 980M would be the obvious choices; Apple probably went with AMD because they had a better supply or got a great deal. I would not expect a new product.
 

Wild Thing

Member
Apr 9, 2014
155
0
0
With all the forum commentary around the Net calling the true power consumption figures of the Maxwell chips into question, it's probably time for a proper investigation by some reputable sites.
Sounds like there are some shenanigans going on here; a good investigation, like the one done into the frame latency questions surrounding AMD cards last year, would seem appropriate. :thumbsup: