AMD Raven Ridge 'Zen APU' Thread

Page 66 - AnandTech community

Yotsugi

Senior member
Oct 16, 2017
794
196
96
I wouldn't be surprised to find the first batch of releases being a thermal throttling mess and/or a GPU that is starved because of a lack of dual-channel memory, and that will basically kill AMD's momentum.
Two out of three laptops presented were dual-channel, and the Acer one came in a 25W chassis.
Pretty decent for the first batch.
 

majord

Senior member
Jul 26, 2015
308
1
86
Well, neither Intel's nor AMD's current platforms support LPDDR4, so none.

Only question mark is the price. Not sure how it compares with standard DDR4 modules, GB for GB.

EDIT: Apologies, Goldmont Atom does support it.
 

coercitiv

Diamond Member
Jan 24, 2014
3,244
603
136
I wouldn't be surprised to find the first batch of releases being a thermal throttling mess
Well, it seems AMD took the right step to address this and will only enable XFR on validated models. They'd better do this right; it's important.

 

SPBHM

Diamond Member
Sep 12, 2012
4,821
55
126
It may also be possible that there's a GDDR5 controller in there somewhere. The only thing holding back Raven Ridge from >= PS4 gaming performance is bandwidth.
At the end of the day it's 640 SPs and 128-bit memory; it would need very high and probably unrealistic clocks to match the PS4 (1152 SPs and 256-bit, albeit an older architecture at conservative clocks).

With a 65W+ TDP it will be interesting to see how the desktop version performs, and whether it can beat the GT 1030. The RX 550 is at the 1030's level with only 512 SPs, but it also uses 128-bit GDDR5; with DDR4 and the penalty of sharing it with the CPU, let's see how it goes.
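As a rough sanity check on the shader-count comparison above, theoretical FP32 throughput is just shaders x 2 FLOPs x clock. The clocks below are illustrative assumptions (boost figures for the 2700U and RX 550, the PS4's documented 800 MHz), not measured values.

```python
def tflops(shaders, clock_mhz):
    # GCN-style ALUs do 2 FLOPs (one FMA) per shader per clock
    return shaders * 2 * clock_mhz * 1e6 / 1e12

# Assumed clocks, for illustration only
for name, sp, mhz in [("Raven Ridge 2700U (640 SP)", 640, 1300),
                      ("PS4 (1152 SP)", 1152, 800),
                      ("RX 550 (512 SP)", 512, 1183)]:
    print(f"{name}: {tflops(sp, mhz):.2f} TFLOPS")
```

On paper the gap is smaller than the shader counts suggest; the bandwidth difference is the larger factor.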
 

sirmo

Golden Member
Oct 10, 2011
1,010
7
136
People need to stop fantasizing about HBM on an APU anytime soon. A mainstream part like this doesn't justify HBM from a cost or performance perspective.
HBM keeps finding more and more uses, and as adoption grows it's inevitable that it will start trickling down to the mainstream. Apple's recent decision to build iMacs using Vega HBM GPUs is another notch towards HBM's wider adoption. Personally I can't wait. An HBM APU would be killer.
 

HurleyBird

Golden Member
Apr 22, 2003
1,805
117
126
at the end of the day it's 640SPs and 128bit memory, it would need very high and probably unrealistic clocks to match the PS4 (1152 and 256bit but older architecture and conservative clocks)
704SPs. There are 11 CUs total in Raven Ridge, 10 of which are enabled on the 2700U. To match the PS4's 1.84 teraflop throughput, an uncut version of Raven Ridge would require 1309 MHz. The GPU in the 15W 2700U already runs at up to 1300 MHz, although it probably spends most of its time lower than that.

Given how horrendously the 2700U runs overwatch compared to a PS4, or even the XBone, there's a huge amount of performance that is unable to manifest itself because of the memory bottleneck. Raven Ridge is crying for a pool of supplementary high speed memory - HBM, EDRAM, GDDR5, whatever - to feed the GPU, significantly more so than any Intel IGP.
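The clock figure above falls out of simple arithmetic, taking the PS4's throughput as 1152 SPs x 2 FLOPs x 800 MHz:

```python
PS4_TFLOPS = 1152 * 2 * 800e6 / 1e12  # ~1.843 TFLOPS

def required_clock_mhz(target_tflops, shaders):
    # clock needed for `shaders` ALUs at 2 FLOPs/clock to hit the target
    return target_tflops * 1e12 / (shaders * 2) / 1e6

print(required_clock_mhz(PS4_TFLOPS, 704))  # all 11 CUs: ~1309 MHz
print(required_clock_mhz(PS4_TFLOPS, 640))  # the 2700U's 10 CUs: ~1440 MHz
```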
 

sirmo

Golden Member
Oct 10, 2011
1,010
7
136
704SPs. There are 11 CUs total in Raven Ridge, 10 of which are enabled on the 2700U. To match the PS4's 1.84 teraflop throughput, an uncut version of Raven Ridge would require 1309 MHz. The GPU in the 15W 2700U already runs at up to 1300 MHz, although it probably spends most of its time lower than that.

Given how horrendously the 2700U runs overwatch compared to a PS4, or even the XBone, there's a huge amount of performance that is unable to manifest itself because of the memory bottleneck. Raven Ridge is crying for a pool of supplementary high speed memory - HBM, EDRAM, GDDR5, whatever - to feed the GPU, significantly more so than any Intel IGP.
Keep in mind Overwatch tends to underperform on AMD GPUs.
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
0
106
Even if Raven Ridge is not quite as fast as a PS4, it would be 90% as fast if it just had access to something like 8GB of GDDR5 or 8GB of HBM2. I mean, 90% as fast for much lower power consumption is really not bad.
 

SPBHM

Diamond Member
Sep 12, 2012
4,821
55
126
704SPs. There are 11 CUs total in Raven Ridge, 10 of which are enabled on the 2700U. To match the PS4's 1.84 teraflop throughput, an uncut version of Raven Ridge would require 1309 MHz. The GPU in the 15W 2700U already runs at up to 1300 MHz, although it probably spends most of its time lower than that.

Given how horrendously the 2700U runs overwatch compared to a PS4, or even the XBone, there's a huge amount of performance that is unable to manifest itself because of the memory bottleneck. Raven Ridge is crying for a pool of supplementary high speed memory - HBM, EDRAM, GDDR5, whatever - to feed the GPU, significantly more so than any Intel IGP.
Right, I see now that there is an extra CU in there not being used, but the memory problem remains: 128-bit GDDR5 at high clocks is likely to bump power usage a bit, and it's still not quite PS4 level.

From this video, other GPUs with slightly lower bandwidth (but no CPU sharing it) don't run much differently than the 2700U:
https://www.youtube.com/watch?v=56jNOJdKZfo
That's 29 GB/s memory and a weak 384-SP Kepler GPU running at a higher resolution (900p at 75%, rather than 720p at 79%). It's impossible to know how close this is to the test scene AMD used, but it looks like a similar level of performance at a higher resolution, possibly with slightly lower average fps. So yes, the 2700U is likely heavily bottlenecked by memory bandwidth in this game, though we don't know the actual clocks while the game is running.

Still, I think a 45W laptop with GDDR5 could be very interesting; performance might land somewhere between the RX 550 and RX 460, which would be massively better.
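To put the bandwidth numbers in this discussion side by side: bandwidth is just transfer rate times bus width in bytes. The speeds below are commonly quoted figures, and the DDR4 pool is of course shared with the CPU.

```python
def bandwidth_gbs(mt_per_s, bus_bits):
    # bandwidth = transfers/s * bus width in bytes
    return mt_per_s * 1e6 * (bus_bits // 8) / 1e9

print(bandwidth_gbs(2400, 128))  # dual-channel DDR4-2400: 38.4 GB/s
print(bandwidth_gbs(5500, 256))  # PS4's 256-bit GDDR5: 176.0 GB/s
print(bandwidth_gbs(1800, 128))  # 28.8 GB/s, the ~29 GB/s class card in the video
```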
 

NTMBK

Diamond Member
Nov 14, 2011
8,303
280
126
Even if Raven Ridge is not quite as fast as a PS4, it would be 90% as fast if it just had access to something like 8GB of GDDR5 or 8GB of HBM2. I mean, 90% as fast for much lower power consumption is really not bad.
That was the original plan for Kaveri, for it to support GDDR5: https://www.anandtech.com/show/7702...nce-quadchannel-memory-interface-gddr5-option Sadly it never came to anything, probably due to the GDDR5M standard going tits-up.

I also don't know how well power-hungry GDDR5 would go down in the laptop market. For example, take a look at the MacBook Pro. Apple use LPDDR3, which means they can't have more than 16GB of RAM and are limited to slower speeds than the DDR4 that is also supported, but they're happy to make the trade-off to reduce power consumption. I guess that a GDDR5 APU could work out more power efficient than a GDDR5 GPU plus an LPDDR3/4 CPU in high-performance situations, but how would it fare at idle? There's a reason the MacBook Pro switches over to the Intel GPU when the dGPU is not necessary.
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
0
106
I don't expect them to use GDDR5 in the laptop market - my point was more hypothetical, that if AMD were to make an APU today with similar performance to the PS4, they could probably have the APU itself consume something like 25W. Obviously the memory would be extra, but it would be very impressive to squeeze PS4 level performance into 25W.

Everyone is hoping for some HBM2 coupled with an APU - however, rumours abound that Apple has an exclusivity agreement with Intel, so this is not likely to happen.
 

Insert_Nickname

Diamond Member
May 6, 2012
3,559
113
126
They could easily just clock an R7 1700 at 2 GHz and do the same with pretty much any discrete GPU.
That could work. At idle, my 1700 doesn't pull more than ~15W @ 1550MHz. With binning I'm sure that could be reduced. 25W should give 2GHz on all cores, and 3-ish GHz dual-core turbo.

That plus a Polaris 12 or GP108 would make one evil laptop... ;)
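The 25W-at-2GHz guess is at least consistent with crude dynamic-power scaling: P ~ C·f·V², and since voltage roughly tracks frequency in the upper range, power scales roughly with the cube of clock. A sketch with assumed baseline figures (65W at the R7 1700's 3.0 GHz), ignoring static leakage:

```python
def scaled_power(p0_watts, f0_ghz, f_ghz):
    # P ~ C * f * V^2, with V roughly tracking f, so P scales ~ f^3
    return p0_watts * (f_ghz / f0_ghz) ** 3

print(scaled_power(65, 3.0, 2.0))  # ~19 W at 2 GHz, before static power
```

Real chips don't follow the cube exactly (voltage floors flatten the curve at low clocks), but it shows why modest downclocks buy large power savings.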
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
0
106
Apple is not afraid to ditch vendors; they killed off PPC just like that.
They won't do it to Intel unless they can make a complete transition to AMD across their entire product line.

Which, admittedly, they could do, since AMD has well performing products in every market that Apple operates in. So, it is possible.

Actually what might bite AMD here is the supply issue - can AMD guarantee Apple a supply of the chips it needs? HBM2 in particular is difficult to get hold of.
 

Yotsugi

Senior member
Oct 16, 2017
794
196
96
Actually what might bite AMD here is the supply issue - can AMD guarantee Apple a supply of the chips it needs
Yes. The company operates a semi-custom unit, and I don't remember Sony or MS complaining about chip supply.
HBM2 in particular is difficult to get hold of.
But it's not?
 
Apr 27, 2000
11,849
1,046
126
I don't think Apple will switch to AMD CPUs/APUs except maybe as a stopgap to something else.
 

NTMBK

Diamond Member
Nov 14, 2011
8,303
280
126
They won't do it to Intel unless they can make a complete transition to AMD across their entire product line.
If anything, this would simplify their lineup. At the moment they have a mix of Intel IGPs and AMD dGPUs. If they could move to one graphics architecture across the whole lineup then it would make life simpler for Apple.
 

Ancalagon44

Diamond Member
Feb 17, 2010
3,274
0
106
If anything, this would simplify their lineup. At the moment they have a mix of Intel IGPs and AMD dGPUs. If they could move to one graphics architecture across the whole lineup then it would make life simpler for Apple.
It would at that. I don't think Intel is really capable of making an IGP powerful enough for Apple's purposes. I mean, they have Crystal Well, which is good, but it is still not powerful enough for all of Apple's needs.

In this sense, AMD has a lot more options on the table. Firstly they obviously have APUs with more grunt than Intel can provide, especially if they were able to integrate them with HBM2. Their HBC technology may mean that even 2GB of HBM2 will make a lot of difference.

Second, AMD could provide a solution with the CPU and GPU integrated via Infinity Fabric, coupled to either GDDR5 or HBM2.
 
Jun 8, 2003
14,129
163
126
They literally spent like three slides explaining new and shiny power management to you. It's actually really awesome.

Of course it does, down to cTDP levels.

A little bit. Vega is much more effective at using bandwidth, courtesy of DSBR.
Still needs NGG Fast Path to eke more performance out of every ALU.
With NVIDIA Optimus, the Intel integrated GPU is just a display controller, idling at ~0.1W, while the dedicated GPU renders the game.

AMD's Raven Ridge power limit supposedly works like Intel's: a short (temporary, good for the headline Cinebench score) and a long (sustained) power limit.
That's not the problem, iGPU power is near zero while the dGPU is active. The difference is the sum of chip TDP: 15W for i7 + 25W for MX 150 results in a total platform power consumption of about 60W. By contrast a platform with 15W TDP CPU and no dGPU will use around 25-30W, around half the power.

The only fair comparison would be i7 + MX150 versus Raven Ridge with 35-45W cTDP setting and equivalent cooling.
thanks guys, good information.
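The platform totals quoted above are simple sums; a sketch, with an assumed ~20W of fixed overhead for display, VRM losses, storage and the rest of the system:

```python
def platform_power(cpu_tdp_w, dgpu_tdp_w=0, overhead_w=20):
    # total platform draw = chip TDPs plus a fixed system overhead (assumed)
    return cpu_tdp_w + dgpu_tdp_w + overhead_w

print(platform_power(15, 25))  # 15W i7 + 25W MX150 class: ~60 W
print(platform_power(15))      # iGPU-only 15W part: ~35 W
```

The overhead figure is a guess; the point is that the dGPU's TDP adds on top of everything else, which is why the no-dGPU platform lands at roughly half the total.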
 

