Info Ryzen 4000 Mobile Chips Unveiled at CES


guachi

Senior member
Nov 16, 2010
Acer Swift 3 with Ryzen 7 4700U
AMD Ryzen 4000 Mobile APUs

AMD finally debuted their Ryzen 4000 mobile chips at CES; confusingly, they are Zen 2 CPUs, analogous to the Ryzen 3000 desktop parts.

I like what I see, especially on the power front. I know many people said that 7nm Ryzen chips had the potential to be very power efficient, and if AMD's slide deck is to be believed, they have succeeded. Most of the power efficiency comes from the 7nm process, which also looks to be letting AMD cram up to eight cores onto a laptop chip.

Do you guys think AMD has a product that will be as competitive in laptops as the 3000 series is on the desktop? I'm thinking the 4600 will be the best buy, like the 3600 is in the desktop space. The problem in the laptop space is that AMD needs design wins. On the desktop, at least, I don't need some company to choose for me; I can just buy the chip myself.
 

Shivansps

Diamond Member
Sep 11, 2013
Yes, it is a huge problem for people who need high fps and want to spend money on premium memory, but don't want to buy a discrete GPU to get far higher performance.

Who said anything about buying premium memory? I've said several times that at the same frequency with DDR4-3200, Vega 11 is always faster than Vega 8. More bandwidth would have widened the gap, but it isn't needed. Higher frequencies only seemed to increase the 3200G-3400G gap when I tested it. I really had no chance of getting near the 3400G's Vega 11 with a 3200G once the Vega 11 reached 1700 MHz.

Vega 8 needs about 150-200 MHz more to match a Vega 11, still talking DDR4-3200.

Very little backlash, since gaming-oriented laptops have a dGPU and office laptops have no need for a fast IGP.
And there is no indication of reduced performance; the IGP is probably even a bit faster than the 3000 series anyway.

AMD should not waste die area (money) on all their APUs just because of a niche market. This niche market was important when their CPUs were inferior, but now they need a die that can compete with Intel in high volumes across all midrange and high-end laptops.

Sure, AMD invented APUs so OEMs and regular people would pair them with expensive dGPUs. AMD said so themselves: only the H CPUs are really meant to be paired with a dGPU. The U parts are not meant for that, and neither are the desktop APUs, as there are plain CPUs for that.
 

Shivansps

Diamond Member
Sep 11, 2013
The thing you're right about is that it's a very price-sensitive segment. 3600 MHz RAM is still significantly more expensive than the common-as-muck 3200 MHz, and I do think the CPU prices will drop in response to Intel's 10th gen. They'll need to hit 6 CUs at £/$100 or 8 CUs at £/$150 to get equal or better GPU performance. I'm putting those numbers down because the price of APUs tends to be fully dictated by where the CPU cores would place them in performance, and I'm not taking overclocking into consideration.
It's such a small market that AMD just doesn't give a shit.

The only situation I can see where cost doesn't come into it is ultra-small desktop PCs. I personally have one built around the 2400G, about the size of a Mac mini, and two years down the line it doesn't look like it'll be worth upgrading. NUCs are a similar case, but only that Intel dGPU one has been performance-oriented, and it cost a bomb.

Forget about faster RAM; it makes no sense for a $100-$150 APU. It's something only an overclocker would take advantage of, as getting an OC past 3200 on Picasso is really hard, but it's not something that's worth discussing.

I used a 3200G for six months, overclocked to 1600 MHz, and I could barely match a stock 3400G, if that. Whoever says you can't game on these has no idea what they are talking about. In fact, I finished my last Witcher 3 playthrough at 900p with custom low/medium/high settings on that thing. I had an RX 480 before it and never noticed the difference.

Vega 6 is not going to be the same... it probably needs around 1.8-1.9 GHz to match the stock 3400G. Matching a stock 3200G is not going to be that difficult; 1500 MHz will probably be enough. But it is a clear step back in my book.
 

moinmoin

Diamond Member
Jun 1, 2017
A larger die also means more wafers needed. Apple moving to 5 nm should free up some 7 nm space, but it's hard to know TSMC's capacity.
A couple of days ago there was a Taiwanese report (via) saying that, with not only Apple but also Qualcomm, HiSilicon etc. moving to 5nm, and with AMD doubling its wafer order for H2, AMD is becoming TSMC's biggest 7nm customer this year.
 

maddie

Diamond Member
Jul 18, 2010
Forget about faster RAM; it makes no sense for a $100-$150 APU. It's something only an overclocker would take advantage of, as getting an OC past 3200 on Picasso is really hard, but it's not something that's worth discussing.

I used a 3200G for six months, overclocked to 1600 MHz, and I could barely match a stock 3400G, if that. Whoever says you can't game on these has no idea what they are talking about. In fact, I finished my last Witcher 3 playthrough at 900p with custom low/medium/high settings on that thing. I had an RX 480 before it and never noticed the difference.

Vega 6 is not going to be the same... it probably needs around 1.8-1.9 GHz to match the stock 3400G. Matching a stock 3200G is not going to be that difficult; 1500 MHz will probably be enough. But it is a clear step back in my book.
Personally, I'm waiting on benches before offering opinions.
 

Shivansps

Diamond Member
Sep 11, 2013
Personally, I'm waiting on benches before offering opinions.

Regardless of bench results, an 8CU Renoir should be faster than a 6CU Renoir, and they should be replacing Picasso parts with Renoirs of the same Vega CU count because the IGP architecture is the same. In fact, the normal thing to do in these cases is to provide a CU increase, not a regression. I can't remember anything like this happening in the past.

If AMD wanted to pull this move they should have used Navi cores; there at least we could have some discussion, because it's a newer arch. With Vega it's clear they did this to have bigger margins, and that is not going to look better to me regardless of the bench results.

I do want to wait and see whether this ends up being true for desktop APUs as well, but if it does, no benchmark is going to change my opinion.
 

majord

Senior member
Jul 26, 2015
Regardless of bench results, an 8CU Renoir should be faster than a 6CU Renoir, and they should be replacing Picasso parts with Renoirs of the same Vega CU count because the IGP architecture is the same. In fact, the normal thing to do in these cases is to provide a CU increase, not a regression. I can't remember anything like this happening in the past.

If AMD wanted to pull this move they should have used Navi cores; there at least we could have some discussion, because it's a newer arch. With Vega it's clear they did this to have bigger margins, and that is not going to look better to me regardless of the bench results.

I do want to wait and see whether this ends up being true for desktop APUs as well, but if it does, no benchmark is going to change my opinion.

Consumers have turned their noses up at AMD's offerings of superior-performance IGPs for lower $$ for generations. Is it that surprising they decided to offer class-leading instead of class-obliterating IGP perf now? Incidentally, no one cares about CU counts. If it's faster, it's faster.

Radeon VII had fewer CUs than Vega 64, btw.
 

maddie

Diamond Member
Jul 18, 2010
Regardless of bench results, an 8CU Renoir should be faster than a 6CU Renoir, and they should be replacing Picasso parts with Renoirs of the same Vega CU count because the IGP architecture is the same. In fact, the normal thing to do in these cases is to provide a CU increase, not a regression. I can't remember anything like this happening in the past.

If AMD wanted to pull this move they should have used Navi cores; there at least we could have some discussion, because it's a newer arch. With Vega it's clear they did this to have bigger margins, and that is not going to look better to me regardless of the bench results.

I do want to wait and see whether this ends up being true for desktop APUs as well, but if it does, no benchmark is going to change my opinion.
If no benchmark will change your opinion, then so be it.
 

IntelUser2000

Elite Member
Oct 14, 2003
For those of you who complain about the 8CU Vega: AMD has said the new Vega 8 has 59% faster performance than the previous gen.

That's per CU, and it's an "up to" figure. Their gaming performance claims support it. Besides, in the mobile U lineup the top SKU only has 10 CUs, not 11.

Shivan may be right about the H part not being significantly faster. 10-20% maybe, with DDR4-3200?

I used a 3200G for six months, overclocked to 1600 MHz, and I could barely match a stock 3400G, if that.

1600 MHz is only a 14% increase over the 3400G's clock. The CU count difference is almost 38%. Granted, CUs scale worse than clocks, but that won't make up for such a big difference.
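
For anyone who wants to check that back-of-the-envelope math, here's a minimal sketch. The ~1400 MHz stock clock is inferred from the 14% figure above, and the 0.7 CU-scaling factor is just an illustrative guess, not a measured number:

Code:
# Rough sanity check of the clock-vs-CU argument above.
# Assumptions (not measured): stock Vega 11 clock ~1400 MHz (implied by the
# "14%" figure), and CUs scaling at roughly 70% the efficiency of clocks.
oc_clock_mhz = 1600       # 3200G's Vega 8, overclocked as described
stock_clock_mhz = 1400    # 3400G's Vega 11, stock
cu_small, cu_big = 8, 11  # Vega 8 vs Vega 11 CU counts

clock_gain = oc_clock_mhz / stock_clock_mhz - 1   # ~0.14 -> "only 14%"
cu_gap = cu_big / cu_small - 1                    # ~0.375 -> "almost 38%"
discounted_cu_gap = cu_gap * 0.7                  # still ~0.26, above the clock gain

print(f"clock gain {clock_gain:.0%}, CU gap {cu_gap:.0%}, "
      f"discounted CU gap {discounted_cu_gap:.0%}")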
 

AtenRa

Lifer
Feb 2, 2009
That's per CU. Their gaming performance claims support it. Upping clocks scales better than increasing CU count.



1600 MHz is only a 14% increase over the 3400G's clock. The CU count difference is almost 38%. Granted, CUs scale worse than clocks, but that won't make up for such a big difference.

Yes, and now we have 8 CUs, each with 59% more performance, vs the 11 CUs of the previous gen.

So performance will be close to 20% higher on average than last gen.

So we may have the following,

iGPU perf

4800U 8CU = 20% faster than 3700U 10CU
4500U 6CU = 20% faster than 3500U 8CU
4300U 5CU = ~equal to 3500U 8CU

ps.
The 4500U should be the one to replace the old 3500U; it will actually have 50% more CPU cores and also ~20% higher iGPU perf vs the last-gen 3500U.
That's not bad for $499 laptops.
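
If it helps, here's a minimal sketch of how those estimates fall out, taking AMD's "up to 59% per CU" claim at face value and assuming performance scales linearly with CU count (both optimistic simplifications):

Code:
# Rough reproduction of the iGPU estimates above. Assumes the "up to 59%
# per-CU" uplift holds on average and that performance scales linearly with
# CU count -- simplifications, not measured results.
PER_CU_UPLIFT = 1.59

def renoir_vs_picasso(new_cus, old_cus):
    """Estimated Renoir iGPU perf relative to the Picasso part it replaces."""
    return new_cus * PER_CU_UPLIFT / old_cus

print(f"4800U (8 CU) vs 3700U (10 CU): {renoir_vs_picasso(8, 10):.2f}x")  # ~1.27
print(f"4500U (6 CU) vs 3500U (8 CU):  {renoir_vs_picasso(6, 8):.2f}x")   # ~1.19
print(f"4300U (5 CU) vs 3500U (8 CU):  {renoir_vs_picasso(5, 8):.2f}x")   # ~0.99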

edit: fixed the old gen 3 skus
 

IntelUser2000

Elite Member
Oct 14, 2003
@AtenRa Uh, I guess you mean 3200G and 3400G?

(I should have said 14% increase over 3750H, not 3400G)

Desktops have a significant thermal/power headroom advantage, so the gains will be less.

Based on specs it might be 20-30% better than 3700U.
 

Gideon

Golden Member
Nov 27, 2007
The PCWorld article about the Acer Swift 3 is interesting
(full specs are listed at the end of the article).

EDIT: I accidentally linked to the video. The article is here.

Performance:
AMD version starts at $600 with a Ryzen 7 4700U!
In comparison, the Intel version starts at $700 with a Core i3-1005G1 (that chip would look weak even against an R5)

Now to keep expectations in check:

Battery:
Intel: 56Wh (16 hours battery life)
AMD: Undisclosed (10 hours battery life)

Now, I'm quite sure the AMD version does not have the same battery (otherwise, why is it the only thing that's undisclosed?). The difference is absurdly stark otherwise, bigger than the one between Ice Lake and Picasso on the MS Surface. It just doesn't make sense, especially considering that the AMD version has a considerably lower-resolution screen.

Still, the writing on the wall says pretty clearly that AMD's battery life isn't at Ice Lake's level. The old AMD Swift had a 48 Wh battery, so I doubt this one is much smaller, and that wouldn't explain all of the difference anyway.

But it's still strange bearing in mind that Lenovo, in comparison, promises similar battery life between Intel and AMD models (and has usually gotten much closer to its Intel counterparts, even with Picasso).
 

insertcarehere

Senior member
Jan 17, 2013
The PCWorld article about the Acer Swift 3 is interesting
(full specs are listed at the end of the article).

Performance:
AMD version starts at $600 with a Ryzen 7 4700U!
In comparison, the Intel version starts at $700 with a Core i3-1005G1 (that chip would look weak even against an R5)
Looks like AMD is being very aggressive with pricing to nab design wins if true. Good for us consumers!
 

beginner99

Diamond Member
Jun 2, 2009
Performance:
AMD version starts at $600 with a Ryzen 7 4700U!
In comparison, the Intel version starts at $700 with a Core i3-1005G1 (that chip would look weak even against an R5)

Now to keep expectations in check:

Battery:
Intel: 56Wh (16 hours battery life)
AMD: Undisclosed (10 hours battery life)

Battery life for what? Your article actually links to a YouTube video, which I can't watch right now, so does it state what was tested for battery life? I mean, is it really surprising that an 8-core has less battery life than a dual-core?

Still, I agree that this is probably the big gotcha of these APUs: still not matching Intel's battery life. And that might not even be due to the SoC itself but also the screen, WiFi, optimizations etc. Exactly where AMD usually shoots itself in the foot. Maybe the AMD one uses generally cheaper components, especially a cheaper (read: more power-hungry) screen. If the battery test means "idling" with the screen on, then that could explain it.
 

Jimzz

Diamond Member
Oct 23, 2012
Something else to think about is how this will put pressure on an already fab-strapped Intel to either cut prices on their new chips or add cores (more silicon) to current ones.
 

Shivansps

Diamond Member
Sep 11, 2013
For those of you who complain about the 8CU Vega: AMD has said the new Vega 8 has 59% faster performance than the previous gen.

As I said, they are making an 8CU as fast as or faster than an 11CU, and as a result they are improving margins and protecting their dGPU market by making the IGP smaller and scaling it backwards instead of doing a 1-to-1 replacement.

I expect equal or better perf, but that is not going to change what they did here.

Anyway, there is no point in arguing this further. It may be a good thing to do on mobile anyway; I'm worried about the desktop APUs.
 

Gideon

Golden Member
Nov 27, 2007
Battery life for what? Your article actually links to a YouTube video, which I can't watch right now, so does it state what was tested for battery life? I mean, is it really surprising that an 8-core has less battery life than a dual-core?

Still, I agree that this is probably the big gotcha of these APUs: still not matching Intel's battery life. And that might not even be due to the SoC itself but also the screen, WiFi, optimizations etc. Exactly where AMD usually shoots itself in the foot. Maybe the AMD one uses generally cheaper components, especially a cheaper (read: more power-hungry) screen. If the battery test means "idling" with the screen on, then that could explain it.

This is the article. I fixed the link in edit as well, my bad.

I agree with your points overall. Also bear in mind that the Intel version is a Project Athena laptop, which is closely co-developed with Intel and needs to comply with a number of extra-strict requirements. Acer probably didn't optimize the AMD version to nearly the same degree. And as the processor is new, there might also be firmware/driver problems etc. that worsen the score.

About the battery life figures: I think they are the same for all CPUs in the chassis (that includes the i5 and i7, not only dual-cores).

But whether it's the OEMs' or AMD's job, it's clear that there is still a lot to be done on that front.

The full specs are here:

Acer Swift 3 (SF313-52/G, Intel Core) basic specs
  • Display: 13.5-inch IPS (2256x1504)
  • Processor: Core i3-1005G1, Core i5-1035G1, Core i5-1035G4 or Core i7-1065G7 (all 10th-gen, 10nm Intel "Ice Lake")
  • Graphics: Integrated UHD or Iris Plus graphics
  • Memory: Up to 16GB dual-channel LPDDR4X SDRAM
  • Storage: PCIe Gen3 8Gb/s NVMe (128GB, 256GB, 512GB, 1TB)
  • Ports: USB-C (Thunderbolt 3), 1 USB-A 3.1 Gen1 with power-off charging, 1 USB-A 2.0, HDMI, 3.5mm jack, DC jack
  • Camera: 1280x720, 720p video
  • Battery: 56Wh (16 hours battery life)
  • Wireless: Wi-Fi 6 (802.11ax), 2x2 MIMO; Bluetooth 5.0
  • Operating system: Windows 10 Home
  • Dimensions: 11.91 x 9.21 x 0.63 inches (15.9mm)
  • Weight: 2.62 pounds
  • Color/chassis: Aluminum, Magnesium Aluminum
  • Price and availability: $699 and up, available in March
Acer Swift 3 (SF314-42, AMD Ryzen) basic specs
  • Display: 14-inch (1920x1080)
  • Processor: Ryzen 7 4700U (7nm)
  • Graphics: Radeon Vega 7
  • Memory: up to 16GB dual-channel LPDDR4X SDRAM
  • Storage: PCIe SSD (128GB, 256GB, 512GB)
  • Ports: USB-C Gen 2, USB 3.0, USB 2.0, HDMI, 3.5mm audio jack
  • Camera: 1280x720, 720p video
  • Battery: Undisclosed (10 hours battery life)
  • Wireless: Wi-Fi 6 (802.11ax)
  • Operating system: Windows 10 Home
  • Dimensions: 12.73 x 8.62 x 0.65 inches (16.6mm)
  • Weight: 2.65 pounds
  • Color/chassis: Aluminum, Magnesium Aluminum
  • Price and availability: $599 and up, available in March
 

zinfamous

No Lifer
Jul 12, 2006
Goddang it, all I wanted was a Zen2 Surface, and they release those stupid things three months before the Zen2 mobile CPUs are out! ...Here's hoping for a mid-year refresh on the AMD-based Surface line! :\
 

Kenmitch

Diamond Member
Oct 10, 1999
Something else to think about is how this will put pressure on an already fab-strapped Intel to either cut prices on their new chips or add cores (more silicon) to current ones.

I'm sure they're already working on a not-so-real-world benchmark suite that'll be better suited to the things people do on their phones. It will be interesting to see how it all plays out in the end.
 

tomatosummit

Member
Mar 21, 2019
About the battery life figures: I think they are the same for all CPUs in the chassis (that includes the i5 and i7, not only dual-cores).

But whether it's the OEMs' or AMD's job, it's clear that there is still a lot to be done on that front.
Isn't the battery in the product name? I'm taking a long shot here.
SF313-52/G = Intel system with graphics and a ~52 Wh battery? (56 Wh rounded down)
SF314-42 = Ryzen system with no graphics and maybe a ~42 Wh battery? (the old 48 Wh battery rounded down)

The weight is fairly similar too, which seems odd given the extra graphics cooling the Intel system would require and its potentially larger battery.

Anyway, there is no point in arguing this further. It may be a good thing to do on mobile anyway; I'm worried about the desktop APUs.
It doesn't seem like AMD is worried about the desktop, unfortunately. It looks like it's targeting 15W laptops, and it sort of works in 45W units too. Hopefully the GPU will overclock to 2-2.2 GHz on the desktop and get something better out of it.

Also, you asked for an example of dropping compute units in the past: the GTX 780/Ti/Titan had more compute and more memory bandwidth than the following GTX 980, but the clocks increased quite a bit and the memory deficit was offset by the delta-compression improvements. Although the 980 was certainly a smaller die and saved money there.
 

Mopetar

Diamond Member
Jan 31, 2011
I wonder if AMD is trying to use some kind of chiplet-style approach for their desktop APUs. They likely don't want to cannibalize sales of their other, higher-margin parts if at all possible, and they're strained enough on wafers as it is without producing more monolithic chips. I don't think this generation takes that approach, but I don't think it will be too much longer before they go that route.

The piece that's missing is a graphics chiplet. I don't expect them to go that way for their mainstream consumer GPUs, considering they said that such an approach resulted in performance-scaling issues (no doubt similar to what we saw in the past with SLI/CrossFire), but they did indicate that a chiplet-based approach wasn't an issue for professional use cases, which makes me suspect they may segment the professional and gaming cards this way in the future.

An APU doesn't need a lot of graphical power and can't make full use of all of its resources in a lot of cases, so that's where it would make sense to pair a CPU chiplet with a GPU chiplet. There's obviously more engineering work to it than simply wishing it so, but the chiplet-based approach already makes good economic sense and it just gives AMD another way to recycle chiplets that might otherwise be defective.