Speculation: Ryzen 4000 series/Zen 3


Panino Manino

Senior member
Jan 28, 2017
813
1,010
136
Vega with 60% more performance is intriguing.
What's this? Maybe that fabled Arcturus GPU? Something in between Vega and Navi?
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,683
1,218
136
  • Like
Reactions: amd6502

eek2121

Platinum Member
Aug 2, 2005
2,904
3,906
136
Total costs mister. Chiplets are a lot cheaper overall for this scenario. By AMD's own estimates for Zen1, they were 59% the cost of a monolithic equivalent.


That was only the case when yields were lower. A monolithic die with 8 cores and the I/O on board is by nature cheaper than chiplets: the packaging alone costs less, and there are potential silicon savings to be had. This is evidenced by the fact that they moved to a monolithic die for mobile. Also, there are now further indications from CES that the die will indeed be monolithic for consoles (or at least the Xbox).
 

Gideon

Golden Member
Nov 27, 2007
1,608
3,573
136
Some info is starting to trickle out about Renoir power efficiency.

Load power draw seems extremely good (especially multithreaded). The idle uplift doesn't seem nearly as big, but it's still a healthy improvement:

From Anandtech's Renoir article

These CPUs all support LPDDR4X memory, up to 64 GB, and AMD says that the infinity fabric is not tied to this memory clock. This helps the chip reach even lower power in its idle states, and the company said that they have rearchitected a good portion of the power delivery in the APU in order to be able to power down and power gate more elements of the SoC than was previously possible. AMD said that this decoupling of the infinity fabric and memory support, especially with both CPU and GPU accessing it, was made substantially easier due to the APU being a monolithic solution (with that in mind, it’s likely that AMD might not be going down the chiplet APU route any time soon). Also worthy to note is that AMD is saying that they have reduced the latency for parts of the chip to enter/exit idle states by 80%, and it’s this that helps enable the power gating in such a way to remain responsive. In previous products, certain elements of the design had to remain powered in order to be as responsive as the user required.
and
In terms of power, AMD is touting a full 2x performance per watt on the new 15 W CPUs, made possible by doubling the cores in the same power envelope and keeping the frequency high. AMD stated that this was possible due to +30% efficiency from the core and the SoC design, and +70% efficiency in the process compared to the previous products. Overall SoC power for the same frequency of the APU is down 20% as well, allowing AMD to push more out of the hardware.

From Anandtech's Lenovo Yoga Slim article

Prices will start at $699, although that doesn’t state which processor/memory/storage configuration that would be. The battery comes in at 60.7 Wh, which Lenovo is stating should be good for 14 hours, which would be a sizeable uplift in mobile battery efficiency from AMD.

And here is the relevant slide:

[Slide: AMD CES 2020 Update (Client), page 14]
 
Last edited:

exquisitechar

Senior member
Apr 18, 2017
655
862
136
For an 8c/16t part, that's ultra low power. Very nice. Any laptop with higher GPU power will simply come with a dGPU and keep the iGPU just for long battery life.

8 CUs at the peak perf/watt frequency will provide nice >720p, high-settings capability.

For desktop SKUs they can run hot near peak performance frequency, and the 8 CUs will match the 12nm 11-CU Vega (and bandwidth would limit performance anyway unless you're using very pricey high-frequency DDR4).
Agreed. Very nice performance uplift from clocks + faster memory anyway. If you want more GPU performance, AMD will toss in a 5500M or whatever.

Van Gogh is the part that will bring a huge iGPU performance uplift; Renoir focuses on other things.
 
  • Like
Reactions: amd6502

uzzi38

Platinum Member
Oct 16, 2019
2,565
5,575
146

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
Well, now can I say this is an intentional GPU downgrade? And I'm sorry, but the "memory constrained" argument is just wrong: with my old 3200G I did some tests with a guy who had the 3400G, and with both iGPUs at 1700 MHz and DDR4-3200, the 3400G was always better. They are doing as little as possible so as not to kill their own GPUs, or even worse, they are getting paid by Sony. I expect the press to bash AMD hard for this. With Renoir hitting previously impossible DDR4 speeds, this was a perfect opportunity for Vega 11 to stretch its legs.

And if Vega 11 is not available in mobile, I would expect it to be unavailable on desktop as well... although it may happen in a top-of-the-line product at a price that would defeat the purpose.
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
4,723
4,628
136
That was only the case when yields were lower. A monolithic die with 8 cores and the I/O on board is by nature cheaper than chiplets: the packaging alone costs less, and there are potential silicon savings to be had. This is evidenced by the fact that they moved to a monolithic die for mobile. Also, there are now further indications from CES that the die will indeed be monolithic for consoles (or at least the Xbox).
Can't agree with that statement. Either do the math or use the yield calculators and see the % difference in usable silicon with aggregated small dies vs. a monolithic one.

Fairly certain that the mobile issue has more to do with IF power savings, and thus battery life. One present problem with chiplets appears to be idle power.
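For anyone who wants to run the numbers, here's roughly what those yield calculators do: a minimal Poisson yield sketch in Python. The die areas and defect density are purely illustrative assumptions, not AMD's actual figures.

```python
import math

def die_yield(area_mm2, defect_density_per_cm2):
    """Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-defect_density_per_cm2 * area_mm2 / 100.0)  # mm^2 -> cm^2

D0 = 0.2  # assumed defects per cm^2 -- purely illustrative

chiplet_mm2    = 75    # assumed ~8-core CCD size
monolithic_mm2 = 200   # assumed monolithic 8-core + IO die size

print(f"Chiplet yield:    {die_yield(chiplet_mm2, D0):.1%}")
print(f"Monolithic yield: {die_yield(monolithic_mm2, D0):.1%}")
# The small die wastes far less silicon per defect, which is the heart of the
# "aggregated small dies vs. one monolithic die" comparison.
```

With these made-up inputs the small die yields in the mid-80% range versus roughly two-thirds for the big one; plug in your own numbers to see where the crossover sits.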
 

insertcarehere

Senior member
Jan 17, 2013
639
607
136
To play devil's advocate, it doesn't matter how strong the APU IGPU is if OEMs are just going to gimp it with single channel memory or AMD doesn't follow up with good graphics drivers. Both of which wouldn't be unexpected given previous track records.
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,683
1,218
136
Well now i can say this is a intentional GPU downgrade?
Depends on whether the double-precision is garbage like on the Ryzen 2700U/3700U, which runs at basically a 1:16 rate.

2700U (10 CU / 1.3 GHz): 1306 SP / 82.34 DP GFLOPS (peak SP FMA: 1664 GFLOPS)
3700U (10 CU / 1.4 GHz): 1371 SP / 85.59 DP GFLOPS (peak SP FMA: 1792 GFLOPS)
A10-9700 (desktop, 6 CU / 1028 MHz): 771 SP / 392.5 DP GFLOPS (peak SP FMA: 790 GFLOPS)

If Renoir drops and it has Bristol's double-precision rate:
4800U (8 CU / 1.75 GHz): peak SP/DP FMA: 1792 / 896 GFLOPS

If it has Radeon VII's double-precision rate: 448 DP GFLOPS.
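Those peak figures fall straight out of CUs × 64 shaders × 2 FLOPs per FMA × clock. A quick sketch reproducing them, with the 1.75 GHz clock taken from the post above and the DP:SP ratios treated as the open question being discussed:

```python
def peak_gflops(cus, clock_ghz):
    """Peak FMA throughput: CUs x 64 shaders x 2 FLOPs per FMA x clock (GHz)."""
    return cus * 64 * 2 * clock_ghz

sp = peak_gflops(8, 1.75)  # 4800U iGPU: 8 CUs at an assumed 1.75 GHz peak clock
print(f"4800U peak SP: {sp:.0f} GFLOPS")

# Double precision depends entirely on which rate the GPU block inherits:
for label, ratio in [("Bristol-like 1:2", 2), ("Radeon VII-like 1:4", 4),
                     ("Raven-like 1:16", 16)]:
    print(f"  DP at {label}: {sp / ratio:.0f} GFLOPS")
```

That prints 1792 GFLOPS SP, then 896 / 448 / 112 GFLOPS DP for the three candidate ratios, matching the numbers above.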
 
  • Like
Reactions: lightmanek

amd6502

Senior member
Apr 21, 2017
971
360
136
512 SPs × 1.75 GHz = 896 G-ops // 128-bit LPDDR4X-4266 => 68.3 GB/s
vs
640 SPs × 1.4 GHz = 896 G-ops // 128-bit DDR4-2400 => 38.4 GB/s
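Put another way, roughly the same shader throughput now gets close to 80% more bandwidth to feed it. A small sketch of that arithmetic, reading the two configurations as Renoir-class (8 CU) and Picasso-class (10 CU), which is an assumption from context:

```python
def shader_ops(shaders, clock_ghz):
    """Aggregate shader rate in G-ops/s (shader count x clock)."""
    return shaders * clock_ghz

def bandwidth_gbs(bus_bits, mt_per_s):
    """Peak DRAM bandwidth in GB/s: bus width in bytes x transfer rate."""
    return (bus_bits / 8) * mt_per_s / 1000.0

configs = {
    "Renoir (8 CU)":   (shader_ops(512, 1.75), bandwidth_gbs(128, 4266)),
    "Picasso (10 CU)": (shader_ops(640, 1.40), bandwidth_gbs(128, 2400)),
}

for name, (ops, bw) in configs.items():
    print(f"{name}: {ops:.0f} G-ops/s, {bw:.1f} GB/s, {ops / bw:.1f} ops per byte")
```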

Dali appears to just be a refresh of Raven2: https://images.anandtech.com/doci/15324/AMD CES 2020 Update_Client_Embargoed Until Jan. 6 at 6pm ET-page-028.jpg
No A-series in sight. It seems to finally be over.


Expensive exotic memory isn't mainstream. The desktop version should have a higher iGPU frequency that matches well with affordable mainstream memory, especially if you consider that APUs were originally oriented towards mostly budget-conscious builders.

I've kind of suspected this Raven2 (and its refresh) would be the Bristol Ridge (and high-bin Stoney) successor. So far the 2c/2t 15W part is its low-bin, low-end SKU, but a die-salvaged 1c/2t at 6W would be a Stoney 6W done right.

Totally changing topic now to the flagship 4000 mobile part: this (second picture) looks like a ballpark 200 mm² die, maybe just under, with cores, uncore, and GPU each adding 60-something mm² (well, maybe 70 or 80-something for the GPU).

Well, now can I say this is an intentional GPU downgrade? And I'm sorry, but the "memory constrained" argument is just wrong: with my old 3200G I did some tests with a guy who had the 3400G, and with both iGPUs at 1700 MHz and DDR4-3200, the 3400G was always better.

Of course the 3400G will be ahead, but there are diminishing returns. This means the scaling of performance with CU count (or iGPU frequency) gets really, really bad. It's hard to be 100% memory constrained, but at something like 95% constrained you could increase your CUs (or frequency) by 50% and see only a 5-10% performance increase (rather than the 50% increase you would get if you were 0% bandwidth constrained).
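As a toy illustration of why the scaling collapses like that, here's a simple Amdahl-style model where some fraction of frame time is purely bandwidth-bound. The fractions are illustrative, not measurements, and the exact percentages depend on the workload and how you model the bound portion:

```python
def speedup(compute_gain, bound_fraction):
    """Amdahl-style toy model: bound_fraction of frame time is memory-limited
    and doesn't scale; the rest scales linearly with extra compute."""
    return 1.0 / (bound_fraction + (1.0 - bound_fraction) / compute_gain)

for bound in (0.0, 0.5, 0.95):
    gain = speedup(1.5, bound)  # add 50% more CUs or iGPU clock
    print(f"{bound:.0%} bandwidth-bound: +50% compute -> {gain - 1:.1%} faster")
```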

It's pretty much what jpiniero says (die-size considerations & dGPU self-competition), along with the fact that this is a high-end, mobile-oriented product.

A serious gaming laptop is going to come with a dGPU anyway, so the iGPU is there for general use and unplugged gaming only. You don't need 12-16 CUs for decent enough graphics at peak perf/watt frequencies; a much larger iGPU might add wattage rather than reduce it. Also, for their smallest mobile socket, a 16-CU APU might exceed the socket size.

To target the smaller market of desktop users who want big-iGPU APUs, they're better off following up with a Zen3 big-iGPU MCM. They can integrate a big GPU into the IO hub for MCM APUs while reusing the current GPU-less IO hub for MCM Zen3 CPUs. I don't think the Zen3 chiplet is all that far away.
 
Last edited:
  • Like
Reactions: Zepp

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
Of course the 3400G will be ahead, but there are diminishing returns. This means the scaling of performance with CU count (or iGPU frequency) gets really, really bad. It's hard to be 100% memory constrained, but at something like 95% constrained you could increase your CUs (or frequency) by 50% and see only a 5-10% performance increase (rather than the 50% increase you would get if you were 0% bandwidth constrained).

It's pretty much what jpiniero says (die-size considerations & dGPU self-competition), along with the fact that this is a high-end, mobile-oriented product.

A serious gaming laptop is going to come with a dGPU anyway, so the iGPU is there for general use and unplugged gaming only. You don't need 12-16 CUs for decent enough graphics at peak perf/watt frequencies; a much larger iGPU might add wattage rather than reduce it. Also, for their smallest mobile socket, a 16-CU APU might exceed the socket size.

To target the smaller market of desktop users who want big-iGPU APUs, they're better off following up with a Zen3 big-iGPU MCM. They can integrate a big GPU into the IO hub for MCM APUs while reusing the current GPU-less IO hub for MCM Zen3 CPUs. I don't think the Zen3 chiplet is all that far away.

The point is that Vega 11 is already ahead of Vega 8 at high clocks even with DDR4-3200. Renoir will be capable of DDR4-4000+, and in fact it supports LPDDR4X-4266... that's a lot, and coupled with faster CPU cores there is no way that 11 CUs wouldn't provide a significant gain. 8 CUs may make sense for 15W U products, but for the 45W H parts it's a joke. And it's definitely not acceptable for 65W+ desktop APUs.
 

amd6502

Senior member
Apr 21, 2017
971
360
136
The point is that Vega 11 is already ahead of Vega 8 at high clocks even with DDR4-3200. Renoir will be capable of DDR4-4000+, and in fact it supports LPDDR4X-4266... that's a lot, and coupled with faster CPU cores there is no way that 11 CUs wouldn't provide a significant gain. 8 CUs may make sense for 15W U products, but for the 45W H parts it's a joke. And it's definitely not acceptable for 65W+ desktop APUs.

True on the 45W parts; 11 CUs would have fit well there.

Here they are probably highly OEM-oriented, and many will go into BGA desktops.

The bulk volume for Renoir on AM4 desktop (35W-65W) might be dies that didn't make the efficiency cut for high-end mobile, as well as dies that can manage high iGPU frequency. My guess is it should match the 11-CU 12nm APUs at stock iGPU frequency. People using the affordable end of the DDR4 spectrum for their builds won't see much of an iGPU disadvantage versus the flagship 2000/3000-series iGPUs.

They should make the 65W APUs upward cTDP-capable for those with their own cooling. That way there's never any throttling (as was seen with many FM2 APUs).

They made it a goal to focus on 15W mobile here. My thought was that they would also focus on sub-15W and hence go quad-core, so it's a day for the moar-cores enthusiasts to celebrate. (Think about how many angry people there would have been on this board today if I had been Lisa Su instead.)

This is a very good 15W lineup for the higher end (including the gaming segment, which comes with dGPUs that are way more powerful than an 11-16 CU iGPU). Quoting the chart from the wccf article I linked above:

Model               Cores/Threads   Base      Boost     iGPU (CUs / SPs)   TDP
AMD Ryzen 7 4800U   8 / 16          1.8 GHz   4.2 GHz   8 / 512            15W
AMD Ryzen 7 4700U   8 / 8           2.0 GHz   4.1 GHz   7 / 448            15W
AMD Ryzen 5 4600U   6 / 12          2.1 GHz   4.0 GHz   6 / 384            15W

I think the 4700U is going to be a very good match for gamers who are somewhat budget conscious. The 4500U and 4300U also complement the existing Picasso 3000 APUs very well. This is an awesome addition IMHO, and I guess it's hard or impossible to have everything for everyone in one generation of products.


If you're really hoping for a big-iGPU APU, then just wait a little for Navi (as an AM4 MCM); I think it will arrive shortly after the Zen3 chiplet later this year. (Then you have the option of maxing out expensive high-frequency RAM.)
 
Last edited:

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Of course the 3400G will be ahead, but there are diminishing returns. This means the scaling of performance with CU count (or iGPU frequency) gets really, really bad. It's hard to be 100% memory constrained, but at something like 95% constrained you could increase your CUs (or frequency) by 50% and see only a 5-10% performance increase (rather than the 50% increase you would get if you were 0% bandwidth constrained).

In mobile it's still going to be an upgrade, because the Vega 10 GPU on Picasso doesn't run anywhere near 1.3 GHz; it's closer to 900 MHz.

If they can bring the gaming frequencies to, say, 1.4-1.5 GHz, it'll end up performing much better, especially with the premium LPDDR4X setup.
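For a rough sense of the headroom, here's the peak-throughput arithmetic, assuming the ~900 MHz sustained Picasso figure above and a hypothetical 1.4-1.5 GHz sustained clock for Renoir's 8 CUs (bandwidth effects excluded):

```python
def peak_gflops(cus, clock_ghz):
    """Peak FMA throughput: CUs x 64 shaders x 2 FLOPs per FMA x clock (GHz)."""
    return cus * 64 * 2 * clock_ghz

picasso = peak_gflops(10, 0.9)    # Vega 10 at the ~900 MHz it tends to sustain
for clk in (1.4, 1.5):
    renoir = peak_gflops(8, clk)  # hypothetical sustained Renoir iGPU clock
    print(f"8 CU @ {clk} GHz: {renoir:.0f} GFLOPS "
          f"({renoir / picasso - 1:+.0%} vs 10 CU @ 0.9 GHz)")
```

Even with two fewer CUs, the clock bump alone works out to roughly 25-35% more peak throughput before the faster memory is counted.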
 

Veradun

Senior member
Jul 29, 2016
564
780
136
That was only the case when yields were lower. A monolithic die with 8 cores and the I/O on board is by nature cheaper than chiplets: the packaging alone costs less, and there are potential silicon savings to be had. This is evidenced by the fact that they moved to a monolithic die for mobile. Also, there are now further indications from CES that the die will indeed be monolithic for consoles (or at least the Xbox).

If they had no 16c desktop part I would agree, but since they have one, and it obviously makes no sense to cut a 16c monolithic die down to 6c SKUs, I'm pretty sure not having to tape out two designs for 8c and 16c is more than enough incentive to stay with chiplets. Also, they will have new nodes to move to, and continuous optimization of this layout will pay off there as well.
 

Gideon

Golden Member
Nov 27, 2007
1,608
3,573
136
If they had no 16c desktop part I would agree, but since they have one, and it obviously makes no sense to cut a 16c monolithic die down to 6c SKUs, I'm pretty sure not having to tape out two designs for 8c and 16c is more than enough incentive to stay with chiplets. Also, they will have new nodes to move to, and continuous optimization of this layout will pay off there as well.

Not only that. If they had two monolithic designs for desktop, they would still need separate HEDT and server dies in addition. Now they essentially get the CPU chiplet for free, because it's needed for servers anyway.