Info Ryzen 4000 Mobile Chips Unveiled at CES


Topweasel

Diamond Member
Oct 19, 2000
5,326
1,524
136
I wonder if AMD is trying to use some kind of chiplet style approach to their desktop APUs. They likely don't want to cannibalize sales of their other, higher margin parts if at all possible, and they're strained enough on wafers as is without producing more monolithic chips. I don't think this generation sees that approach, but I don't think it will be too much longer before they go that route.

The piece that's missing is a graphics chiplet. I don't expect them to go that way for their mainstream consumer GPUs, considering they said that such an approach resulted in performance scaling issues (no doubt similar to what we saw in the past with SLI/Crossfire), but they did indicate that a chiplet-based approach wasn't an issue for professional use cases, which makes me suspect that they may segment the professional and gaming cards this way in the future.

An APU doesn't need a lot of graphical power and can't make full use of all of its resources in a lot of cases, so that's where it would make sense to pair a CPU chiplet with a GPU chiplet. There's obviously more engineering work to it than simply wishing it so, but the chiplet-based approach already makes good economic sense and it just gives AMD another way to recycle chiplets that might otherwise be defective.
Desktop APUs have always been about unloading unsold laptop chips. AMD talks a big game about their intentions with their desktop APU product, but even going back as far as Llano, it's been about making sure they had a second market for their laptop dies. They aren't going to reinvent the wheel for a niche of a niche of a niche.
 

IntelUser2000

Elite Member
Oct 14, 2003
6,878
1,424
136
Is the battery not in the product name? I'm going on a long shot here.
SF313-52/G = intel system with graphics and ~52wh battery? (56wh rounded down)
SF314-42 = ryzen system with no graphics and "maybe ~42wh" battery? (the old 48wh battery rounded down)
Rounding doesn't work that way.

Product model numbers are arbitrary and have no real meaning.
 

Shivansps

Platinum Member
Sep 11, 2013
2,915
634
136
It doesn't seem like AMD is worried about the desktop, unfortunately. It looks like it's targeting 15W laptops, and it sort of works in 45W units too. Hopefully the GPU will overclock to 2-2.2GHz on the desktop and get something better out of it.

Also, you asked for an example of dropping compute units in the past: the GTX 780/Ti/Titans had more compute and more memory bandwidth than the following GTX 980, but the clocks increased quite a bit and the memory deficit was offset by the delta compression improvements. Although the 980 was certainly a smaller die and saved money there.
Yeah, a lot of weird stuff happened in the dGPU market in the past, so there are probably some examples there, but remember there was also a big jump from the Kepler to Maxwell architectures. Renoir is still Vega with tweaks.

The closest comparison to what is happening with Renoir would be Intel removing HT on the i7s, or the small shader regression from the A8-3870K to the A10-5800K.

The best I can hope for at this point is for the desktop Renoirs to have more than 8 CUs on the top model, so the segmentation doesn't hit so hard at the entry level. The $100 to $150 range is the most important area for desktop APUs.
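The clocks-versus-CUs trade-off being debated here can be sanity-checked with GCN's theoretical throughput (64 shaders per CU, 2 FLOPs per clock from FMA). A minimal sketch; the clock figures below are illustrative assumptions, not confirmed product specs:

```python
# Theoretical single-precision throughput for a GCN/Vega iGPU:
# shaders = CUs * 64, and each shader does 2 FLOPs per clock (FMA).
def gflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz

# A Picasso-style Vega 11 at an assumed ~1.4 GHz vs. a Renoir-style
# Vega 8 pushed to an assumed ~1.75 GHz:
print(gflops(11, 1.4))   # ~1971 GFLOPS
print(gflops(8, 1.75))   # 1792.0 GFLOPS
```

So even with 3 fewer CUs, a ~25% clock bump nearly closes the raw-throughput gap, which is why the clock increase and tweaks matter as much as the CU count alone.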
 

Nereus77

Member
Dec 30, 2016
63
63
61
I think Intel are in big trouble in 2020 if AMD mobile chips are starting to beat Intel Desktop chips. Imagine what the AMD 4000 series Desktop CPUs will do?

Intel needs to respond to remain competitive.
 

Gideon

Senior member
Nov 27, 2007
835
1,264
136
I think Intel are in big trouble in 2020 if AMD mobile chips are starting to beat Intel Desktop chips. Imagine what the AMD 4000 series Desktop CPUs will do?

Intel needs to respond to remain competitive.
Well, to be fair, AMD is beating Intel chips partly because Intel disables Hyper-Threading in most SKUs. At least with the upcoming Comet Lake, all chips will have HT enabled.

But yeah, as it stands now, they can even outperform a 9700K in some multi-threaded apps.
 

Shivansps

Platinum Member
Sep 11, 2013
2,915
634
136
Desktop APUs have always been about unloading unsold laptop chips. AMD talks a big game about their intentions with their desktop APU product, but even going back as far as Llano, it's been about making sure they had a second market for their laptop dies. They aren't going to reinvent the wheel for a niche of a niche of a niche.
Depends on the market; in my country, AMD desktop APUs are about 80% of AMD desktop chips sold.
In the company I'm working at, it is 92% exactly. That's not counting Carrizo FM2+ APUs, which still sell a hell of a lot more than the 200GE/3000G, probably due to Win7 support.

I'm not sure how well desktop APUs are doing globally, but I would not call them niche.
 

Topweasel

Diamond Member
Oct 19, 2000
5,326
1,524
136
Depends on the market; in my country, AMD desktop APUs are about 80% of AMD desktop chips sold.
In the company I'm working at, it is 92% exactly. That's not counting Carrizo FM2+ APUs, which still sell a hell of a lot more than the 200GE/3000G, probably due to Win7 support.

I'm not sure how well desktop APUs are doing globally, but I would not call them niche.
Ah. That's not the niche I was referring to, but nice try. Also, the APUs sell well because they are the cheapest; AMD's best right now is, what, like $150? That's the price of the cheapest Matisse Ryzen. Which circles back to the issue. These are decent-volume products for AMD, but they are A.) still laptop dies, and B.) selling them like you are describing requires them to be low-margin, low-ASP chips. Yeah, at this price they will sell better than a 1500X/1600AF/3500X in those markets where they are available, which is great: if AMD continues to lack mobile penetration, they can still sell these in a way that they will actually move and remain, even if it's on a knife edge, profitable. But you can see that AMD's intention with Renoir is to come in big in mobile and start hitting Intel in its biggest profit center.

As for niche: niche is someone buying a desktop APU for anything other than an economy desktop; more niche still is someone purchasing an APU to game on as the sole graphics; and more niche still is someone doing that who would spend enough on memory performance to get close to a situation where a 3400G might outperform a 4800G due to the 3 CU difference.

I say this as someone likely to look at building either a 4800G or 4600G slim system for my GF as soon as they are available.
 

lixlax

Member
Nov 6, 2014
138
78
101
I was a bit surprised about the regression in the CU count, but on reflection it makes sense, since Vega 10/11 were already quite memory-bandwidth starved. Coupled with the frequency increase and other possible tweaks, these will probably still end up faster than the last gen.
The ~150mm2 die size is also a surprise, because if I remember correctly all the other APUs have been around 250mm2. When the desktop versions launch, it will be really interesting to see the relative iGPU performance and the impact of the "tiny" L3 cache.
 

Shivansps

Platinum Member
Sep 11, 2013
2,915
634
136
Ah. That's not the niche I was referring to, but nice try. Also, the APUs sell well because they are the cheapest; AMD's best right now is, what, like $150? That's the price of the cheapest Matisse Ryzen. Which circles back to the issue. These are decent-volume products for AMD, but they are A.) still laptop dies, and B.) selling them like you are describing requires them to be low-margin, low-ASP chips. Yeah, at this price they will sell better than a 1500X/1600AF/3500X in those markets where they are available, which is great: if AMD continues to lack mobile penetration, they can still sell these in a way that they will actually move and remain, even if it's on a knife edge, profitable. But you can see that AMD's intention with Renoir is to come in big in mobile and start hitting Intel in its biggest profit center.

As for niche: niche is someone buying a desktop APU for anything other than an economy desktop; more niche still is someone purchasing an APU to game on as the sole graphics; and more niche still is someone doing that who would spend enough on memory performance to get close to a situation where a 3400G might outperform a 4800G due to the 3 CU difference.

I say this as someone likely to look at building either a 4800G or 4600G slim system for my GF as soon as they are available.
It's not 100% like that; the APU market has been growing greatly since APUs stopped being BD trash with an iGPU. It's not only about price.
Raven was a huge turnaround; a 2400G/3400G can do pretty much everything that is not super-high-end related. Before Raven, the top APU sales were from sub-$100 APUs, like the A8-9600, and before that the A8-7600. After Raven? The 2400G, and now the 3400G, while sub-$100 APUs like the 200GE/3000G are selling very, very little.

And gaming? With GPU prices going nuts, the APU market for gaming increased a lot. It's not the "cheap choice" that it was before Raven.

And about margins, I'm pretty sure AMD gets bigger margins from desktop APUs than from the APUs they sell for consoles, but less profit overall due to volume.
 

Topweasel

Diamond Member
Oct 19, 2000
5,326
1,524
136
I was a bit surprised about the regression in the CU count, but on reflection it makes sense, since Vega 10/11 were already quite memory-bandwidth starved. Coupled with the frequency increase and other possible tweaks, these will probably still end up faster than the last gen.
The ~150mm2 die size is also a surprise, because if I remember correctly all the other APUs have been around 250mm2. When the desktop versions launch, it will be really interesting to see the relative iGPU performance and the impact of the "tiny" L3 cache.
Well, RR was almost the exact same size as Zeppelin: 192mm2 vs. I think 205mm2. This is roughly the size of two Zen 2 CCDs. What Shivan is missing is that AMD had to design these well before they knew how good yields would be on 7nm, and obviously they set this die's size, and the CCDs', based on what they would need to keep the cost per chip manageable. It makes sense for them, instead of breaking up the CCX design or limiting to 4 cores Intel-style, to prune the graphics a little, especially considering the dramatic increase in clock speed.
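The cost-per-chip point can be made concrete with the classic dies-per-wafer approximation. A rough sketch only: it ignores defect density and scribe lines, and the 150mm2/250mm2 figures are just the approximate die sizes discussed in this thread:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    """Gross candidate dies per wafer: usable area divided by die area,
    minus a standard edge-loss correction term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

print(dies_per_wafer(150))  # ~416 candidates for a Renoir-sized die
print(dies_per_wafer(250))  # ~240 for a ~250mm2 older-style APU die
```

That's roughly 70% more candidate dies per 300mm wafer before yield even enters the picture, which is the kind of headroom that makes a pruned iGPU an easy trade.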
 
  • Like
Reactions: beginner99

joesiv

Member
Mar 21, 2019
74
24
41
Yeah, I think there will be monolithic APUs for the desktop, similar to the 2000 and 3000 series, hitting the low end of the desktop market. They can be cheap because they are chips that don't meet the ultra-low-power mobile specs.

But it would be very interesting if they stretched the APU market upwards and did a chiplet design too, like a 3700X with a GPU die in there as well. The problem is, it would have to be priced at more than a 3700X but less than the 3700X plus the low-end GPU it's equivalent to. So, say it's an RX 5500 equivalent, or just under it... what would that be, like $450-500? That's quite pricey. And then you have the downside of no VRAM (no room for HBM on the CPU package).

I couldn't see them being able to charge more than $400 for it, and I'm not sure the market would want such a thing. It seems that once you get into the 3700X market, people would be willing to spend a couple hundred bucks for an actual GPU. But I could be wrong.
 

Shivansps

Platinum Member
Sep 11, 2013
2,915
634
136
Well, RR was almost the exact same size as Zeppelin: 192mm2 vs. I think 205mm2. This is roughly the size of two Zen 2 CCDs. What Shivan is missing is that AMD had to design these well before they knew how good yields would be on 7nm, and obviously they set this die's size, and the CCDs', based on what they would need to keep the cost per chip manageable. It makes sense for them, instead of breaking up the CCX design or limiting to 4 cores Intel-style, to prune the graphics a little, especially considering the dramatic increase in clock speed.
I'm not missing that, or that these CUs are a little faster as well. My point is a very simple one: it's not acceptable to reduce a product's size within the same arch. If they provided 8 Vega CUs at $100 before, that should be replaced by a product with 8 Vega CUs or more at $100, or an equivalent core count of a newer arch.
This is how it always was; since when is it OK to take a slower product, overclock it, and sell it at a higher price?

The only acceptable argument is that in mobile this is a good idea to reduce idle power while gaining performance in use, but on the desktop that is not going to count anymore.
 

Topweasel

Diamond Member
Oct 19, 2000
5,326
1,524
136
Its not 100% like that, the APU market has been increasing greatly since they stopped to be BD trash with an igp. Is not only about price.
Raven was a huge turn around, a 2400G/3400G can do petty much everything that is not super high end related. Before Raven the TOP APU sales were from sub $100 apus, like the A8-9600, before that the A8-7600... After Raven? 2400G and now the 3400G, and sub $100 APU like the 200GE/3000G are selling very, very little.

And gaming? with GPU prices going nuts the APU market for gaming increased A LOT. Is not the "cheap choice" that it was before Raven.

And about margins, im petty sure AMD get bigger margins from desktop APUs than of those APU they sell for consoles. But less profit overall due to volume.
Still niche; yeah, more capability means more sales. But really, at this point you are just displacing Intel iGPU sales, where chips are sold for their iGPU on top of CPU performance. Which, again, is really little.

You overvalue iGPU performance. It's never, even in the best situation with esports games, a really usable option. That's probably reason number 5 AMD cut the CUs: if it was impossible on current platforms (dual-channel 64-bit DDR vs. GDDR at 256-bit or higher, or HBM) to really take advantage of the iGPU's power and make it somewhat usable outside outliers, then why continue to dedicate silicon space to it at the cost of cores?

The mining market died two years ago. GPUs are back to being available, and the performance at economy prices isn't bad, especially when paired with economy displays (1080p 60FPS). There is little reason outside space constraints to go iGPU-only.

But none of this is why $100-and-under APUs sold well. They sold well because they were cheap CPUs, especially back when an Intel at that price was a single-core Pentium.
 
  • Like
Reactions: rUmX

Topweasel

Diamond Member
Oct 19, 2000
5,326
1,524
136
I'm not missing that, or that these CUs are a little faster as well. My point is a very simple one: it's not acceptable to reduce a product's size within the same arch. If they provided 8 Vega CUs at $100 before, that should be replaced by a product with 8 Vega CUs or more at $100, or an equivalent core count of a newer arch.
This is how it always was; since when is it OK to take a slower product, overclock it, and sell it at a higher price?

The only acceptable argument is that in mobile this is a good idea to reduce idle power while gaining performance in use, but on the desktop that is not going to count anymore.
Once again, the now very shrunken niche within a niche within a niche within a niche (basically so insignificant as to be non-existent) is the person who purchases a $100 APU solely for its graphics prowess. That market is basically Shivan and the people he sells to (with bottom-dollar A320 boards). You thinking it's enough doesn't make it enough, and for the other 3.5 billion economy-desktop purchasers looking at systems using sub-$100 CPUs with no graphics card, it could be a 1 CU or an 8 CU unit and they wouldn't care. Especially since, if AMD had their way, they would sell so many of these in laptops that they wouldn't even offer the APUs outside of maybe OEMs, so those OEMs can have the full platform selection.
 

Shivansps

Platinum Member
Sep 11, 2013
2,915
634
136
Still niche; yeah, more capability means more sales. But really, at this point you are just displacing Intel iGPU sales, where chips are sold for their iGPU on top of CPU performance. Which, again, is really little.

You overvalue iGPU performance. It's never, even in the best situation with esports games, a really usable option. That's probably reason number 5 AMD cut the CUs: if it was impossible on current platforms (dual-channel 64-bit DDR vs. GDDR at 256-bit or higher, or HBM) to really take advantage of the iGPU's power and make it somewhat usable outside outliers, then why continue to dedicate silicon space to it at the cost of cores?

The mining market died two years ago. GPUs are back to being available, and the performance at economy prices isn't bad, especially when paired with economy displays (1080p 60FPS). There is little reason outside space constraints to go iGPU-only.

But none of this is why $100-and-under APUs sold well. They sold well because they were cheap CPUs, especially back when an Intel at that price was a single-core Pentium.
AMD had the $100 Ryzen 1200, which sold very poorly, to the point that they did not even bother anymore. iGPU-less CPUs below $150 ARE the niche products.

And I'm sorry, but I can't call my country's top-selling CPU, the 3400G, a niche.
 

Topweasel

Diamond Member
Oct 19, 2000
5,326
1,524
136
AMD had the $100 Ryzen 1200, which sold very poorly, to the point that they did not even bother anymore. iGPU-less CPUs below $150 ARE the niche products.

And I'm sorry, but I can't call my country's top-selling CPU, the 3400G, a niche.
Okay, you're missing the point.

A 2200G outselling a Ryzen 1200 makes sense. It's why that line died with the 1k series and was completely replaced by the 2200G: the 1200 requires you to purchase another part, increasing the overall cost of the system.

You think I am saying the APU is niche. It's not... well, it is a little, and no amount of talking about your 3rd-world country only being able to purchase sub-$100 APUs changes the fact that, generally, retail APU purchases are a relatively niche thing (GPU-less OEM systems are another matter).

But when I am talking niche, I am talking about the Shivan use case that is so indebted to the iGPU for desktop gaming that it is panicking over a drop in the CU count. It's basically a use case of one (joke), and realistically such a small use case in sales that it shouldn't have any bearing on design choices.
 

Shivansps

Platinum Member
Sep 11, 2013
2,915
634
136
Okay, you're missing the point.

A 2200G outselling a Ryzen 1200 makes sense. It's why that line died with the 1k series and was completely replaced by the 2200G: the 1200 requires you to purchase another part, increasing the overall cost of the system.

You think I am saying the APU is niche. It's not... well, it is a little, and no amount of talking about your 3rd-world country only being able to purchase sub-$100 APUs changes the fact that, generally, retail APU purchases are a relatively niche thing (GPU-less OEM systems are another matter).

But when I am talking niche, I am talking about the Shivan use case that is so indebted to the iGPU for desktop gaming that it is panicking over a drop in the CU count. It's basically a use case of one (joke), and realistically such a small use case in sales that it shouldn't have any bearing on design choices.
I understand what you are saying, but you still don't see that the 2400G was, and now the 3400G is, the top-selling product. There is a reason for that, and I don't think it is just the extra threads. Most people don't need anything more than a 2200G, yet the APU market crown went from the A8-9600 to the 2400G and now the 3400G.

There are only two reasons for going with a 3400G instead of a 3200G:
1) Gaming
2) Video/image editing (which uses the iGPU as well)

I'm especially worried about Vega segmentation on the desktop: if the full Renoir die ends up being an 8C/16T Vega 8 die, that means a 4200G could be as low as a Vega 5 product.

BTW, if GPU prices were back to normal, the RX 5500 XT 4GB would be a $99 product.
 

Kenmitch

Diamond Member
Oct 10, 1999
7,742
861
126
And about margins, I'm pretty sure AMD gets bigger margins from desktop APUs than from the APUs they sell for consoles, but less profit overall due to volume.
I don't ever remember reading anything AMD has said about the profitability, or lack thereof, of the console market. Early on it was stated that Nvidia passed on it because it wasn't profitable enough. I'm thinking it was more that the console makers passed on Nvidia because, let's face it, an Nvidia SoC couldn't cut it for the Xbox and PS lines of consoles.

I'd imagine there's more profit than we're led to believe when it comes to the console design wins. I would imagine there are built-in R&D dollars, provided by both Sony and Microsoft, that are used for the console but also further develop AMD's other products.
 

jpiniero

Diamond Member
Oct 1, 2010
7,929
1,209
126
I don't ever remember reading anything AMD has said about the profitability, or lack thereof, of the console market. Early on it was stated that Nvidia passed on it because it wasn't profitable enough. I'm thinking it was more that the console makers passed on Nvidia because, let's face it, an Nvidia SoC couldn't cut it for the Xbox and PS lines of consoles.

I'd imagine there's more profit than we're led to believe when it comes to the console design wins. I would imagine there are built-in R&D dollars, provided by both Sony and Microsoft, that are used for the console but also further develop AMD's other products.
The margins do suck, but there's not much risk either. Both paid for the R&D up front and presumably agreed to pay for the wafers, so it's not like AMD could get stuck with inventory.
 

IntelUser2000

Elite Member
Oct 14, 2003
6,878
1,424
136
About the battery life figures: I think they are the same for all CPUs in the chassis (that includes i5 and i7, not only dual-cores).
The Swift 3 Intel is 56WHr. The AMD version is 48WHr.

But it's still strange when bearing in mind that Lenovo, in comparison, promises similar battery life between Intel and AMD models (and has usually gotten much closer to its Intel counterparts, even with Picasso).
The focus is different for the Swift 3. The Swift 3 is at the edge of weight before battery has to be sacrificed. Usually such super-lightweight laptops have a 2.2 lb option with a 3x WHr battery and a 2.5-2.6 lb option with a 5x WHr battery. So there are definitely platform differences between the two beyond what the specs show.

The Lenovo is using more common components between the two models, and the weight is an easier-to-engineer 3 lbs.
 
  • Like
Reactions: Gideon
