(MacRumors) Apple May Launch 27-Inch Retina iMac With AMD Graphics Next Month

Page 2 of a discussion thread on the AnandTech community forums.

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
I got a GTX 980, and it uses less power than my GTX 680.

Just stop the FUD already and accept reality. It's getting bothersome that we have to hear all this nonsense over and over again because of one single site.

We know Maxwell is very efficient for gaming. nVidia is not the best choice for high res, though. Performance drops pretty badly at 4K, and adding 5K to the mix would likely create even more performance issues.
 

Galatian

Senior member
Dec 7, 2012
372
0
71
I got a GTX 980, and it uses less power than my GTX 680.

Just stop the FUD already and accept reality. It's getting bothersome that we have to hear all this nonsense over and over again because of one single site.

Again...please read the article(s) and then you can stop with all the nonsense...

With all the forum commentary around the Net questioning the true power consumption figures of the Maxwell chips, it's probably time for a proper investigation by some reputable sites.
Sounds like there are some shenanigans going on here; a good investigation like the one done into the frame-latency questions surrounding AMD cards last year would seem appropriate. :thumbsup:

THW did a very thorough test and I believe them. I mean, there is no magic going on: Maxwell is a simple evolution of an already existing technique, power gating, done as fast as possible. Intel and ARM have been doing this for years. Of course the average power consumption will go down, and that is all neat. By all means: under normal gaming conditions those Maxwell chips use a lot less power, hence nVidia can achieve higher clocks when needed, so they are even faster than the Kepler series chips under gaming loads.

And this is also the answer to the question in that other thread: "all" AMD needs to do is implement better power gating, and since this is also of interest for their CPU line, I think it is a pretty safe bet that they are already working on it. No doom and gloom here.
 

bunnyfubbles

Lifer
Sep 3, 2001
12,248
3
0
We know Maxwell is very efficient for gaming. nVidia is not the best choice for high res, though. Performance drops pretty badly at 4K, and adding 5K to the mix would likely create even more performance issues.

1. The 980 is still the fastest at 4K, despite any "pretty bad" performance drop

2. Apple doesn't make gaming computers and pretty much never chooses the fastest GPU to pair with its systems, especially the AIO iMac models. The fastest available in the current 27" iMac is a 780M (basically an under-clocked 680/770)

If Apple is going AMD for the next iMac, it's likely purely for economic reasons, with very little regard for performance characteristics (if anything, Apple would care most about the efficiency of the chips it uses, to better suit its stereotypically form-over-function designs)
 

RussianSensation

Elite Member
Sep 5, 2003
19,458
765
126
1. Still a rumour. I think Apple could still go with a Maxwell mobile 970M or similar due to efficiency.

2. Even if they go with the 980M, 5K is going to make that GPU look like Intel integrated graphics in modern games, because it's 78% more pixels than 4K. The 980M is too slow even for 4K gaming: you'd want a minimum of 970 SLI / 290 CF, and the 980M is nowhere near that fast. So ya, you'd want a minimum of 980M SLI to even start discussing 4K gaming, never mind 5K.

3. Who the hell buys an iMac to play games? OS X is not a gaming platform. All it needs is a GPU cheap enough not to inflate the price to $4000 (i.e., so that Apple can instead spend the money on the more expensive 5K panel) and fast enough for 3D video acceleration and 2D desktop work.
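For what it's worth, the "78% more pixels" figure in point 2 checks out with quick arithmetic (assuming "4K" means 3840x2160 UHD and "5K" means the rumored 5120x2880 panel):

```python
# Pixel counts for the two resolutions under discussion
res_4k = 3840 * 2160   # UHD "4K"
res_5k = 5120 * 2880   # rumored 5K Retina iMac panel

print(f"4K: {res_4k:,} pixels")                # 4K: 8,294,400 pixels
print(f"5K: {res_5k:,} pixels")                # 5K: 14,745,600 pixels
print(f"increase: {res_5k / res_4k - 1:.1%}")  # increase: 77.8%
```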
 

jpiniero

Lifer
Oct 1, 2010
16,818
7,258
136
3. Who the hell buys an iMac to play games? OS X is not a gaming platform. All it needs is a GPU cheap enough not to inflate the price to $4000 (i.e., so that Apple can instead spend the money on the more expensive 5K panel) and fast enough for 3D video acceleration and 2D desktop work.

These are super-high-end machines; the 27-inch starts at $1,799. Shipping with discrete graphics makes sense since it is a premium machine, even if it won't be used for gaming much. You obviously won't be playing games at 5K, though. The cheaper 21-inch models mostly use the IGP.
 

geoxile

Senior member
Sep 23, 2014
327
25
91
Again...please read the article(s) and then you can stop with all the nonsense...



THW did a very thorough test and I believe them. I mean, there is no magic going on: Maxwell is a simple evolution of an already existing technique, power gating, done as fast as possible. Intel and ARM have been doing this for years. Of course the average power consumption will go down, and that is all neat. By all means: under normal gaming conditions those Maxwell chips use a lot less power, hence nVidia can achieve higher clocks when needed, so they are even faster than the Kepler series chips under gaming loads.

And this is also the answer to the question in that other thread: "all" AMD needs to do is implement better power gating, and since this is also of interest for their CPU line, I think it is a pretty safe bet that they are already working on it. No doom and gloom here.

[Attached screenshot: Screen-Shot-2014-04-29-at-1.08.08-AM.jpg]
 

geoxile

Senior member
Sep 23, 2014
327
25
91
Man, Nvidia is almost a year ahead of AMD on power management tech, if I'm reading this correctly.

Well, at least AMD is working on it. :awe:
Well, they are working on 3 CPU architectures at the same time, and working on GCN for 3 different nodes: 20nm for SkyBridge APUs, 28nm HPM for discrete GPUs, and 28nm at GloFo for Carrizo (I think, anyway)
 

positivedoppler

Golden Member
Apr 30, 2012
1,148
256
136
Yup, Apple is footing the bill for AMD to move up to 20nm at TSMC. All the initial exclusive 20nm capacity will be Apple first. Apple has been really pleased with AMD recently. The push for Samsung to license 14nm to GloFo has been for AMD too, per Apple's demand.
 

Eug

Lifer
Mar 11, 2000
24,131
1,780
126
Just a note:

Although I agree that Retina iMacs will come sooner rather than later, the original "article" Macrumors quoted was from a teenage blogger with no track record for reliability.

What this guy seems to be doing is scouring the net for clues and then posting his guesses. Now, some of those guesses make sense, but I think the real reason he gets so much airplay is that he doesn't present them as guesses. Instead, he words his articles as if they are fact. Then it just snowballs from there. The tech media eats it all up.

To put it another way, I'd rate the original article that Macrumors referenced as about 3 tiers below DigiTimes, and we all know DigiTimes has a fairly poor track record when it comes to predicting Apple product releases.

But again, that said, I believe a Retina iMac is coming soon. I just hope it isn't a 27" behemoth with a great big chin. I find my current 27" iMac non-ergonomic because that chin makes it about 2" taller than it should be, and also because I'm not a big fan of its relatively high pixel density (for a desktop), since the default text sizes end up smaller than I prefer.

My ideal iMac would be a 24" Retina model with no chin and a somewhat lower pixel density (than a hypothetical 27" Retina model would likely have), paired with a matching secondary 24" screen. IOW, back on topic sort of: I'd hope the Retina iMac would be 3840x2400 and 24", with a GPU fast enough to power two of these screens simultaneously with no noticeable lag in all common non-gaming activities.
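As a rough sanity check on the density claim, here is a quick PPI comparison between the hypothetical 24" 3840x2400 panel suggested above and a 27" 5120x2880 one (both panel sizes and resolutions are assumptions from this thread, not announced specs):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution in pixels divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(3840, 2400, 24), 1))  # hypothetical 24" Retina: ~188.7 PPI
print(round(ppi(5120, 2880, 27), 1))  # rumored 27" 5K panel:   ~217.6 PPI
```

So the 24" model would indeed come in noticeably less dense than a 27" 5K panel, consistent with the preference stated above.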
 