
AMD Ryzen 5 2400G and Ryzen 3 2200G APUs performance unveiled


whm1974

Diamond Member
Jul 24, 2016
9,460
1,566
96
Looks nice, other than the single DIMM. But if you're not going to use the GPU for anything 3D, I guess it doesn't matter.
Well, the board only has two memory slots; that is why I put in a single 8GB stick of RAM, so it will be cheaper to upgrade to 16GB if needed.
 
  • Like
Reactions: Gideon

moinmoin

Platinum Member
Jun 1, 2017
2,771
3,669
136
I wonder how much the 400 chipset borrows from Raven Ridge. After all, the lower power consumption of the 400 series hints at AMD making Zen+ better suited for laptops.
Aside from being an APU built for mobile, everything about the (still 14nm) Raven Ridge cores so far points to it being an intermediate step to Pinnacle Ridge (12nm); features like Precision Boost 2 will be part of the latter as well. My guess is that aside from PCIe 3.0, the 400-series chipsets will also enable finer gating of the uncore to reduce idle power use further.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,227
787
126
My guess is that aside from PCIe 3.0, the 400-series chipsets will also enable finer gating of the uncore to reduce idle power use further.
Ryzen is a full-on SoC; it doesn't "need" a chipset for anything. The Ryzen "chipsets" are just I/O breakout boxes, and they already use PCIe 3.0 for communication with the host SoC.
 

Shivansps

Diamond Member
Sep 11, 2013
3,365
978
136
Ok, I just did my own tests just to be sure about the potential of the 2200G.

On Witcher 3, in the starting area (all using the same savegame and the exact same area), all at the same 720p resolution:

A8-9600 with 2x4GB DDR4-2400 = 25-30 FPS on Low.

i5-7400/8GB DC + GT 1030 2GB GDDR5 = 30-40 FPS on Medium
(totally the best-case scenario for a 2200G, and it seems impossible to me that the 2200G will get anywhere near this with low-speed DDR4)

G4600/4GB + GT 1050 2GB = 40-50 FPS on Ultra.
 
  • Like
Reactions: lightmanek

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
Ok, I just did my own tests just to be sure about the potential of the 2200G.

On Witcher 3, in the starting area (all using the same savegame and the exact same area), all at the same 720p resolution:

A8-9600 with 2x4GB DDR4-2400 = 25-30 FPS on Low.

i5-7400/8GB DC + GT 1030 2GB GDDR5 = 30-40 FPS on Medium
(totally the best-case scenario for a 2200G, and it seems impossible to me that the 2200G will get anywhere near this with low-speed DDR4)

G4600/4GB + GT 1050 2GB = 40-50 FPS on Ultra.
Witcher 3 is THE game I really want to play that my old system completely can't handle. I was thinking that if Desktop RR could handle it, I might upgrade to it and avoid the crazy GPU pricing for a good long while, like until the Mining Bubble bursts.

But it looks like it probably won't be good enough.
 

Shivansps

Diamond Member
Sep 11, 2013
3,365
978
136
Witcher 3 is THE game I really want to play that my old system completely can't handle. I was thinking that if Desktop RR could handle it, I might upgrade to it and avoid the crazy GPU pricing for a good long while, like until the Mining Bubble bursts.

But it looks like it probably won't be good enough.
Actually it's not that bad; the first time I finished W3 was with a 750 Ti on Medium/1080p with my old 2500K and 4GB of RAM (because I had a dead stick at the time), and the 1030 almost matches the performance of a 750 Ti.

The problem is, I see it as very difficult for the 2200G to get anywhere near a 1030/GDDR5, even with super-fast RAM. And even then it will probably have much worse minimum FPS.
 

xblax

Member
Feb 20, 2017
54
70
61
Ok, I just did my own tests just to be sure about the potential of the 2200G.

On Witcher 3, in the starting area (all using the same savegame and the exact same area), all at the same 720p resolution:

A8-9600 with 2x4GB DDR4-2400 = 25-30 FPS on Low.

i5-7400/8GB DC + GT 1030 2GB GDDR5 = 30-40 FPS on Medium
(totally the best-case scenario for a 2200G, and it seems impossible to me that the 2200G will get anywhere near this with low-speed DDR4)

G4600/4GB + GT 1050 2GB = 40-50 FPS on Ultra.
I still don't really get you. Sure, the 2200G has a $30 higher price point, but in combinations both with and without a dGPU it offers so much more power that cheaping out on the CPU to save $30 would be dumb. With cheap and powerful quad-core CPUs like the 2200G or the i3-8100 available, recommending an A8-9600 or G4600 seems like bad advice to your customers.

The main selling point for you should be that the 2200G can later (in 1-4 years) be upgraded with a midrange GPU and will not hold that GPU back, but the A8-9600 or G4600 would. Also, 4 cores will become the minimum requirement for CPU-hungry games in the future. Sure, you can offer them an A8-9600 or G4600 build if they really can't afford that $30, but saving for two more months would be the best advice you could give them. If you have trouble selling AMD CPUs because customers prefer Intel, just tell them about Meltdown.
 

Shivansps

Diamond Member
Sep 11, 2013
3,365
978
136
I still don't really get you. Sure, the 2200G has a $30 higher price point, but in combinations both with and without a dGPU it offers so much more power that cheaping out on the CPU to save $30 would be dumb. With cheap and powerful quad-core CPUs like the 2200G or the i3-8100 available, recommending an A8-9600 or G4600 seems like bad advice to your customers.

The main selling point for you should be that the 2200G can later (in 1-4 years) be upgraded with a midrange GPU and will not hold that GPU back, but the A8-9600 or G4600 would. Also, 4 cores will become the minimum requirement for CPU-hungry games in the future. Sure, you can offer them an A8-9600 or G4600 build if they really can't afford that $30, but saving for two more months would be the best advice you could give them. If you have trouble selling AMD CPUs because customers prefer Intel, just tell them about Meltdown.
I don't do sales; I just watch the competition and build PC configurations, along with other internal stuff.

I need to think of it like this: the 2200G build is probably going to cost the exact same money as an A12-9800 build, right? The 9700/9800 sales are very, very low, and the old A10-78xx weren't very high either.

I'm giving the example of the G4600/4GB/1050 because it's the cheapest one, but there are also the R3 1200, R5 1400, and i3-7100 with the 1050 and with 8 or 4GB of RAM, and except for the R5 1400, any of those combinations sells more than the 9700/9800. And I'm just waiting for the H310 to replace the 7100s.

I just don't see how the 2200G can change that in any way; a 1030-level iGPU (if that; it's going to be very difficult) is just not enough. And that's assuming people will know about the 2200G's iGPU power (they will not).

It could be good to replace a GT 1030 gaming configuration, but I don't have one; I use the 1050 as the most basic entry-level GPU for games, and I don't think anyone will argue with me on that.

The 7300/9500 is the choice for a basic entry-level, general-use PC, with the Pentium following close behind, and the A8-9600 is the choice for a PC to run basic entry-level games, and it sells quite well. The 9700/9800 are niches. It's really strange: people seem to think that if they have to spend more money they'd rather jump to a 1050, and I don't blame them.
I could get rid of the 9600 and put a 2200G in its place. The thing is, I'm not going to do this, because people are going to buy a 9600 somewhere else. It's the reason I was forced into building 4GB versions of EVERYTHING: I didn't want to, but the competition did, and they sold well, so I had to do it, and it sells well. So there's nothing I can do.

BUT: the 2200G and 2400G could be VERY GOOD as R3 1200 / R5 1400 non-gaming build replacements. Those "office" configurations currently have a GT 710 in them; I could easily replace the 1200 and 1400 with the APU versions and get rid of the GT 710. That would greatly help keep the Ryzen configurations competitive when the H310 with the 8100/8400 arrives.
 

neblogai

Member
Oct 29, 2017
144
49
71
Ok, I just did my own tests just to be sure about the potential of the 2200G.

On Witcher 3, in the starting area (all using the same savegame and the exact same area), all at the same 720p resolution:

A8-9600 with 2x4GB DDR4-2400 = 25-30 FPS on Low.
Not sure if comparing the A8-9600 to the 2200G is correct. Here is an A12-9800 (65W TDP) comparison to the 2500U, which is quite similar to the 2200G, only with extra CPU threads and also limited to ~25W. Both use the same speed RAM (DDR4-2400), and Raven Ridge here, on pre-release, 3-month-old drivers, is already about 10% faster: https://www.youtube.com/watch?v=5-a7bbjwS5I . Surely an APU like that can run much faster when given 65W to run on instead of just 25W. And even better when you give it better RAM and overclock the iGPU some 30%+.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
Actually it's not that bad; the first time I finished W3 was with a 750 Ti on Medium/1080p with my old 2500K and 4GB of RAM (because I had a dead stick at the time), and the 1030 almost matches the performance of a 750 Ti.

The problem is, I see it as very difficult for the 2200G to get anywhere near a 1030/GDDR5, even with super-fast RAM. And even then it will probably have much worse minimum FPS.
Well, that is the thing. From what I have seen, 750 Ti > GT 1030, and I expect GT 1030 > even the 2400G.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,566
96
Not sure if comparing the A8-9600 to the 2200G is correct. Here is an A12-9800 (65W TDP) comparison to the 2500U, which is quite similar to the 2200G, only with extra CPU threads and also limited to ~25W. Both use the same speed RAM (DDR4-2400), and Raven Ridge here, on pre-release, 3-month-old drivers, is already about 10% faster: https://www.youtube.com/watch?v=5-a7bbjwS5I . Surely an APU like that can run much faster when given 65W to run on instead of just 25W. And even better when you give it better RAM and overclock the iGPU some 30%+.
I would think the 2200G will be somewhat faster due to having four more powerful cores and using faster memory. Best to wait and see.
 

rainy

Senior member
Jul 17, 2013
463
308
136
I need to think of it like this: the 2200G build is probably going to cost the exact same money as an A12-9800 build, right? The 9700/9800 sales are very, very low, and the old A10-78xx weren't very high either.
Quite obviously you don't realize that the Ryzen 3 2200G has Zen cores while the A12-9800 uses Excavator (the previous AMD architecture); Zen's IPC is 52 percent higher, which translates into much better CPU performance.

Here is a review of the A12-9800; for comparison, look at the results of the Ryzen 3 1200, because the 2200G should be even faster.
https://www.techspot.com/review/1486-amd-a12-9800/

Btw, offering a gaming configuration with 4GB of RAM is rather ridiculous IMO.
 

tamz_msc

Platinum Member
Jan 5, 2017
2,900
2,629
136
Yeah, all this talk about how the 2200G's iGPU would fare in Witcher 3 using performance results from the A8-9600 is pretty pointless if people keep forgetting that the 2200G's cores are Zen. Here is the 2500U in the HP Envy running Witcher 3 at 720p with a 30fps cap using a mix of low-medium-high settings.

https://youtu.be/OVXn9iAEwSI?t=2m34s
 
  • Like
Reactions: whm1974

rainy

Senior member
Jul 17, 2013
463
308
136
Here is the 2500U in the HP Envy running Witcher 3 at 720p with a 30fps cap using a mix of low-medium-high settings.
For sure the 2200G should perform better, not just because of the much higher power budget (65W vs 15-25W) but also because of support for higher-clocked RAM; the mobile version of Raven Ridge is limited to DDR4-2400.
 

SPBHM

Diamond Member
Sep 12, 2012
4,998
356
126
Yeah, all this talk about how the 2200G's iGPU would fare in Witcher 3 using performance results from the A8-9600 is pretty pointless if people keep forgetting that the 2200G's cores are Zen. Here is the 2500U in the HP Envy running Witcher 3 at 720p with a 30fps cap using a mix of low-medium-high settings.

https://youtu.be/OVXn9iAEwSI?t=2m34s
The same video shows that 1080p Low barely holds 20 FPS in a light-load area; I guess using 720p is a must because of the low memory bandwidth.
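For context, a rough sketch of the bandwidth math behind that guess, using nominal spec-sheet figures (dual-channel DDR4-2400 with a 64-bit bus per channel, and the GT 1030's quoted GDDR5 number), not measured results:

```python
# Peak bandwidth the APU's iGPU must share with the CPU, vs. a low-end
# card's dedicated VRAM. Nominal spec numbers only.

def ddr_bandwidth_gbs(mt_per_s, channels, bus_bytes=8):
    """Peak bandwidth in GB/s: transfers/s x bytes per transfer x channels."""
    return mt_per_s * bus_bytes * channels / 1000

apu_shared = ddr_bandwidth_gbs(2400, channels=2)  # dual-channel DDR4-2400
gt1030_vram = 48.0                                # GDDR5, dedicated

print(f"APU system RAM (shared):  {apu_shared:.1f} GB/s")   # 38.4 GB/s
print(f"GT 1030 VRAM (dedicated): {gt1030_vram:.1f} GB/s")
```

So even before the CPU takes its share, the APU starts from ~38 GB/s against the 1030's 48 GB/s of dedicated memory, which is why dropping to 720p helps so much.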
 

The Stilt

Golden Member
Dec 5, 2015
1,709
3,057
106
OK, I didn't know that. However, are you expecting to see laptops with them? Because I'm quite a bit sceptical.
At some point, definitely.
It is highly unlikely that the higher-TDP infrastructures for Raven-based mobile parts exist for no reason.
 

The Stilt

Golden Member
Dec 5, 2015
1,709
3,057
106
It's highly unlikely they exist for memory overclocking.
?

The maximum allowed MEMCLK depends on the TDP on mobile parts.
That's the way it's been since Carrizo and the way it is with Raven.

Raven doesn't officially support anything higher than 2933MHz (1 DPC SR).
 

tamz_msc

Platinum Member
Jan 5, 2017
2,900
2,629
136
The same video shows that 1080p Low barely holds 20 FPS in a light-load area; I guess using 720p is a must because of the low memory bandwidth.
I don't see how that is a problem when post #230 specifically mentions 720p performance. APUs like these are still a long way from providing reliable 1080p/30fps console-level performance. If you intend to game with an APU, then you should be prepared for sub-1080p resolutions when playing AAA games.

I'm interested to see Wolfenstein 2 on this APU. Since it is compute-heavy and uses Vulkan, cards with more CUDA cores, even old ones like the 750 Ti, utterly destroy the GT 1030.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,573
126
I don't see how that is a problem when post #230 specifically mentions 720p performance. APUs like these are still a long way from providing reliable 1080p/30fps console-level performance. If you intend to game with an APU, then you should be prepared for sub-1080p resolutions when playing AAA games.

I'm interested to see Wolfenstein 2 on this APU. Since it is compute-heavy and uses Vulkan, cards with more CUDA cores, even old ones like the 750 Ti, utterly destroy the GT 1030.
NV needs a GT1040 with 512 cores...
 

SPBHM

Diamond Member
Sep 12, 2012
4,998
356
126
I don't see how that is a problem when post #230 specifically mentions 720p performance. APUs like these are still a long way from providing reliable 1080p/30fps console-level performance. If you intend to game with an APU, then you should be prepared for sub-1080p resolutions when playing AAA games.

I'm interested to see Wolfenstein 2 on this APU. Since it is compute-heavy and uses Vulkan, cards with more CUDA cores, even old ones like the 750 Ti, utterly destroy the GT 1030.
750 Ti = 86GB/s memory, 1030 = 48GB/s memory.

It's 384 vs 640 SPs, but the 1030 runs games at around 1600MHz by default, the reference 750 Ti at around 1150 (according to reviews; the boost clock in the specs is lower for both).

Are you sure this destruction is due to compute and not memory again?

The difference in bandwidth between the 750 Ti and the 1030 is bigger than the difference in "compute", and the APU is even more bandwidth-bottlenecked (CPU sharing, less efficient use of memory than Pascal).
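To put rough numbers on that comparison, here is a back-of-the-envelope sketch using the SP counts, observed game clocks, and bandwidth figures quoted above (spec-sheet estimates; real in-game clocks vary):

```python
# Peak FP32 throughput vs. memory bandwidth for the two cards.
# Peak FP32 GFLOPS = 2 ops per FMA x shader count x clock (GHz).

def gflops(shaders, clock_ghz):
    return 2 * shaders * clock_ghz

gtx750ti = {"gflops": gflops(640, 1.15), "bw_gbs": 86}  # 1472 GFLOPS
gt1030   = {"gflops": gflops(384, 1.60), "bw_gbs": 48}  # ~1229 GFLOPS

print(f"750 Ti:  {gtx750ti['gflops']:.0f} GFLOPS, {gtx750ti['bw_gbs']} GB/s")
print(f"GT 1030: {gt1030['gflops']:.0f} GFLOPS, {gt1030['bw_gbs']} GB/s")
```

By these estimates the 750 Ti's compute advantage is only ~20%, while its bandwidth advantage is ~80%, which supports the point that bandwidth, not "compute", is the bigger difference between the two cards.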
 
