AMD Ryzen 5 2400G and Ryzen 3 2200G APUs performance unveiled

Page 77

french toast

Senior member
Feb 22, 2017
988
825
136
I would have to say a healthy scepticism is warranted, especially with small amateur YouTube videos, but I wouldn't discredit them without knowing more details. I would like them to show more information about the test procedure, software and settings used.
With various settings it is quite easy to skew the results one way or another (even innocently). It's more difficult with CPU tests, but it can be done, for instance by covering up differences with artificial GPU bottlenecks.

As regards the 1080 Ti Ryzen APU vs Intel benchmarks, I don't know how you would fake it, or even how you could get all the results so wrong.
I'm sceptical about the mainstream tech site reviewers just as much; they have an incentive to tip the scales towards whichever company offers the greatest benefits, whether that is kickbacks, free products or even just site traffic.
We should give more credence to the larger, more established sites for obvious reasons, but I also wouldn't throw out these independent results either; it seems unlikely to me that they are intentionally bogus.
 
  • Like
Reactions: 3DVagabond

tamz_msc

Diamond Member
Jan 5, 2017
3,821
3,641
136
  • Like
Reactions: nathanddrews

Timorous

Golden Member
Oct 27, 2008
1,615
2,772
136
I have been playing with my system a bit and I am still getting the odd blue screen. Not sure if it is OC related or if it is drivers and platform immaturity, as it seems okay with Prime95, memtest and Stellaris for hours at a time, yet when doing something mundane I get a blue screen. Had one yesterday, with a page fault, when I closed a game.

I am still running at 3.9GHz CPU + 1.5GHz GPU. I have backed the memory down again to 3200 14-14-14-32, but that does not seem to have made it any more stable, so I'm pretty sure it is not an issue with running @ 3400. It does not boot at all at 3600-16-16-16-36; I might be able to get it to boot at looser timings, but I will wait until everything is stable before doing more tweaking.

Thinking the best bet is a clean install of Windows and just getting the latest chipset and APU drivers from AMD and sticking with that. If that works then I will push for 4GHz CPU and 1.6GHz GPU with 3400 14-14-14-32 RAM and call it a day.
 

Magic Hate Ball

Senior member
Feb 2, 2017
290
250
96
I have been playing with my system a bit and I am still getting the odd blue screen. Not sure if it is OC related or if it is drivers and platform immaturity, as it seems okay with Prime95, memtest and Stellaris for hours at a time, yet when doing something mundane I get a blue screen. Had one yesterday, with a page fault, when I closed a game.

I am still running at 3.9GHz CPU + 1.5GHz GPU. I have backed the memory down again to 3200 14-14-14-32, but that does not seem to have made it any more stable, so I'm pretty sure it is not an issue with running @ 3400. It does not boot at all at 3600-16-16-16-36; I might be able to get it to boot at looser timings, but I will wait until everything is stable before doing more tweaking.

Thinking the best bet is a clean install of Windows and just getting the latest chipset and APU drivers from AMD and sticking with that. If that works then I will push for 4GHz CPU and 1.6GHz GPU with 3400 14-14-14-32 RAM and call it a day.

I'm having trouble running my R5 2400G's GPU past 1.3GHz unfortunately, even if I toss more voltage at it. Not sure what's causing it, but even setting the frequency in Ryzen Master to a known tested frequency/voltage that's within my tested "safe" range occasionally just locks the system with a THREAD_STUCK_IN_DEVICE_DRIVER (or something like that) BSOD.

I happen to have the same Gigabyte AB350N-Gaming WiFi that a lot of reviewers are using, on the same BIOS revision, and some of them are hitting 1.6GHz on the GPU.

I must have terrible luck with the silicon lottery... I've got a Wraith Max slapped on there currently, and the temps don't get much into the 50s when gaming, usually staying in the mid 40s.
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
I'm having trouble running my R5 2400G's GPU past 1.3GHz unfortunately, even if I toss more voltage at it. Not sure what's causing it, but even setting the frequency in Ryzen Master to a known tested frequency/voltage that's within my tested "safe" range occasionally just locks the system with a THREAD_STUCK_IN_DEVICE_DRIVER (or something like that) BSOD.

I happen to have the same Gigabyte AB350N-Gaming WiFi that a lot of reviewers are using, on the same BIOS revision, and some of them are hitting 1.6GHz on the GPU.

I must have terrible luck with the silicon lottery... I've got a Wraith Max slapped on there currently, and the temps don't get much into the 50s when gaming, usually staying in the mid 40s.

That was a driver problem; I had that BSOD on my mining rig with a custom driver and modded BIOS.

You just need to reinstall the driver.
 

Magic Hate Ball

Senior member
Feb 2, 2017
290
250
96
That was a driver problem; I had that BSOD on my mining rig with a custom driver and modded BIOS.

You just need to reinstall the driver.

Hmm, I've tried with DDU a few times. Even uninstalling all chipset drivers and the Ryzen Master software, then reinstalling.

I may just run at stock until the next round of driver updates and a BIOS update on my board... then do a fresh Windows 10 install (I already did, but I noticed my UEFI boot snagged onto a different SSD than my NVMe drive, so it'll clean things up).
 

Timorous

Golden Member
Oct 27, 2008
1,615
2,772
136
I'm having trouble running my R5 2400G's GPU past 1.3GHz unfortunately, even if I toss more voltage at it. Not sure what's causing it, but even setting the frequency in Ryzen Master to a known tested frequency/voltage that's within my tested "safe" range occasionally just locks the system with a THREAD_STUCK_IN_DEVICE_DRIVER (or something like that) BSOD.

I happen to have the same Gigabyte AB350N-Gaming WiFi that a lot of reviewers are using, on the same BIOS revision, and some of them are hitting 1.6GHz on the GPU.

I must have terrible luck with the silicon lottery... I've got a Wraith Max slapped on there currently, and the temps don't get much into the 50s when gaming, usually staying in the mid 40s.

That sucks; hopefully it is just a teething issue and you can squeeze more out of it as the platform matures. I did find Ryzen Master OCing to be a bit more unstable than BIOS OCing, but that may be something else.

No idea what my gaming temps are, the fans don't spin up past their idling speed so they must be staying below the first step in my fan profile settings as I don't hear them. It is quite nice having a really quiet system.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
I have seen a lot of variance in the Vega 8/11 overclocks, anywhere between 1.3 - 1.6GHz; unfortunately it appears you got unlucky with one that has limited headroom :/

In a practical sense though, I don't think a lower GPU overclock will make a big difference, because these APUs are mostly bandwidth limited, especially the 2400G. You'll probably see bigger gains overclocking your RAM rather than the GPU.
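The bandwidth gap being discussed is easy to estimate on paper. A back-of-the-envelope sketch (assuming dual-channel DDR4 with a 64-bit bus per channel; the function name is illustrative):

```python
# Peak dual-channel DDR4 bandwidth: transfers/s * 8 bytes per transfer * channels
def ddr4_bandwidth_gbs(mt_per_s, channels=2, bytes_per_transfer=8):
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9  # GB/s

bw_2400 = ddr4_bandwidth_gbs(2400)       # ~38.4 GB/s
bw_3200 = ddr4_bandwidth_gbs(3200)       # ~51.2 GB/s
uplift = (bw_3200 / bw_2400 - 1) * 100   # ~33% more peak bandwidth
```

Going from DDR4-2400 to DDR4-3200 adds roughly a third more peak bandwidth, which is why RAM speed matters so much on these iGPUs.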
 

alexruiz

Platinum Member
Sep 21, 2001
2,836
556
126
I see the opposite. Actual detailed reviews, in depth analysis from the established reviewers...

So you are implying that even if the methodology is WRONG, you would prefer it as long as the review is detailed...
"Test setup:
AM4 parts we will be using the Kabini AM1 25W cooler. Windows 10 image created as clean install on Z170 chipset with all security updates including meltdown to level the field..."
I surely would not like a setup like that.

I know the YouTubers also want hits, but they have the advantage that while the bigger sites have to post a nice review, YouTubers don't, so the time that would be spent on polishing graphs can be spent on tweaking.

Do I like a nice detailed in depth review? Of course I do.
What I DO NOT like is when the time spent on tweaking and optimizing the setup is minimal, and that is what a lot of sites do. They can go 4 pages explaining Infinity Fabric, but under test setup all they write is "Asus Crosshair VI Hero motherboard".
BIOS version? RAM timings? Chipset driver version? Windows deployment method? All of that is missing.

AnandTech and Tech Report are 2 sites that do it right: they go into nice detail, but also spend a lot of time tweaking, optimizing and configuring.

No amount of tuning is going to allow a 2200G @ 4GHz to be up to 50% faster than an i3 8100 at 1080P medium/high settings on a 1050 Ti, because that is a GPU bound resolution and there should be virtually no difference between two (on paper) comparable CPUs. This is just common sense. If you have a result that is far, far from the expected norm, then those results are going to be scrutinised.

Actually I wouldn't mind if you 'named and shamed' such sites or channels, what numbers of yours do you find to be consistently a lot higher than is reported by review sites and tech channels?

I agree, results that seem too good to be true have to be backed up nicely.

Regarding sites that get very low numbers, I don't remember the case-by-case specifics, and checking sites for very specific inconsistencies is a pain, but these are examples that I clearly recall:
  • PClab.pl is one of them, probably the worst offender. I swear those guys test AMD CPUs with the parody setup I described a few lines above, and they probably test AMD GPUs in a 1x PCI-e slot...
  • Digital Trends and Eurogamer are 2 others that get several AMD numbers lower, and I could bet that those guys deploy Windows from a single image. Oh, and forget about getting a BIOS version.
  • TechSpot is inconsistent; sometimes it gets good numbers, sometimes low. Another one that likely deploys from a single image.

Among the tests where I see lower numbers online, disk transfer and IOMeter are prominent, so I bet almost everyone tests AMD using the generic MS AHCI driver. Part of the problem is that when installing the Promontory chipset drivers, SATA is not changed, so unless you manually force it, it stays on the generic MS AHCI driver. Many reviewers haven't even noticed it! PCMark numbers would go up if they forced the AMD SATA driver, as it has deeper queue levels. As a counterpoint, the AMD SATA driver has higher CPU usage, but if you can extract more performance, I would surely give up 2% of CPU cycles.

Battlefield minimum framerates are another area where I noticed higher numbers on my setups.
3DMark is another where my numbers would also usually be higher.
In many cases the difference is not that big, but it has been consistent.

I will probably be editing and adding more as I remember them.
So there you have it.
 

nathanddrews

Graphics Cards, CPU Moderator
Aug 9, 2016
965
534
136
If you've got a 144Hz display and a 1080 Ti, a 10-20% system performance improvement to get 238fps doesn't mean very much. In the context of the 2200G, we're talking budget/entry-level gaming here, probably connected to a 60Hz TV or budget TN monitor. 10-20% gains can make the difference between locked low settings at 1080p30 vs regular stuttering down to 28fps. Whether that's just faster RAM, an OC CPU, an OC GPU, or some combination, it's going to be significant to the playability of titles. I'm sure this is an area in which @VirtualLarry can testify.

When it comes to benchmarks on budget systems, using the straight presets (low, high, etc.) usually shows poor results. That's why I like having a wide variety of gameplay videos and reviews that tweak settings to put forth the best possible experience.
 
  • Like
Reactions: Feimitsu

neblogai

Member
Oct 29, 2017
144
49
101
I have seen a lot of variance in the Vega 8/11 overclocks, anywhere between 1.3 - 1.6GHz; unfortunately it appears you got unlucky with one that has limited headroom :/

In a practical sense though, I don't think a lower GPU overclock will make a big difference, because these APUs are mostly bandwidth limited, especially the 2400G. You'll probably see bigger gains overclocking your RAM rather than the GPU.

Here is a 2200G review by Digital Foundry, in which they show many titles on the 2200G gaining more from a 36% iGPU overclock (1100->1500MHz) than from 33% higher bandwidth (2400->3200). I'm not saying TFLOPS > GB/s or anything like that, just that overclocking an iGPU is very much worth it. The 2400G will not gain as much, but I'm sure overclocking will help as well, at least in some games/settings.
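For reference, the two percentages quoted are straightforward to verify; a trivial sketch (the function name is illustrative):

```python
# Relative gain of an overclock versus the baseline, in percent
def pct_gain(base, overclocked):
    return (overclocked / base - 1) * 100

igpu_gain = pct_gain(1100, 1500)   # ~36.4% higher iGPU clock
mem_gain  = pct_gain(2400, 3200)   # ~33.3% higher memory transfer rate
```

So the two changes compared in that review are close to equal in relative terms, which makes the per-title differences in gains the interesting part.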
 

Magic Hate Ball

Senior member
Feb 2, 2017
290
250
96
Here is a 2200G review by Digital Foundry, in which they show many titles on the 2200G gaining more from a 36% iGPU overclock (1100->1500MHz) than from 33% higher bandwidth (2400->3200). I'm not saying TFLOPS > GB/s or anything like that, just that overclocking an iGPU is very much worth it. The 2400G will not gain as much, but I'm sure overclocking will help as well, at least in some games/settings.

Yeah, during the time I was able to get it to run (not stably) at 1.5GHz, it was definitely a significant frame rate increase. Running this on 3200 CL16 RAM.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
So you are implying that even if the methodology is WRONG, you would prefer it as long as the review is detailed...
"Test setup:
AM4 parts we will be using the Kabini AM1 25W cooler. Windows 10 image created as clean install on Z170 chipset with all security updates including meltdown to level the field..."
I surely would not like a setup like that.

I know the YouTubers also want hits, but they have the advantage that while the bigger sites have to post a nice review, YouTubers don't, so the time that would be spent on polishing graphs can be spent on tweaking.

Do I like a nice detailed in depth review? Of course I do.
What I DO NOT like is when the time spent on tweaking and optimizing the setup is minimal, and that is what a lot of sites do. They can go 4 pages explaining Infinity Fabric, but under test setup all they write is "Asus Crosshair VI Hero motherboard".
BIOS version? RAM timings? Chipset driver version? Windows deployment method? All of that is missing.

AnandTech and Tech Report are 2 sites that do it right: they go into nice detail, but also spend a lot of time tweaking, optimizing and configuring.



I agree, results that seem too good to be true have to be backed up nicely.

Regarding sites that get very low numbers, I don't remember the case-by-case specifics, and checking sites for very specific inconsistencies is a pain, but these are examples that I clearly recall:
  • PClab.pl is one of them, probably the worst offender. I swear those guys test AMD CPUs with the parody setup I described a few lines above, and they probably test AMD GPUs in a 1x PCI-e slot...
  • Digital Trends and Eurogamer are 2 others that get several AMD numbers lower, and I could bet that those guys deploy Windows from a single image. Oh, and forget about getting a BIOS version.
  • TechSpot is inconsistent; sometimes it gets good numbers, sometimes low. Another one that likely deploys from a single image.

Among the tests where I see lower numbers online, disk transfer and IOMeter are prominent, so I bet almost everyone tests AMD using the generic MS AHCI driver. Part of the problem is that when installing the Promontory chipset drivers, SATA is not changed, so unless you manually force it, it stays on the generic MS AHCI driver. Many reviewers haven't even noticed it! PCMark numbers would go up if they forced the AMD SATA driver, as it has deeper queue levels. As a counterpoint, the AMD SATA driver has higher CPU usage, but if you can extract more performance, I would surely give up 2% of CPU cycles.

Battlefield minimum framerates are another area where I noticed higher numbers on my setups.
3DMark is another where my numbers would also usually be higher.
In many cases the difference is not that big, but it has been consistent.

I will probably be editing and adding more as I remember them.
So there you have it.

This just seems like you are cherry-picking some benchmarks from some random nobody on YouTube (conveniently lacking any details), because they show the kinds of numbers you want to see.
 

PG

Diamond Member
Oct 25, 1999
3,426
44
91
I'm having trouble running my R5 2400G's GPU past 1.3GHz unfortunately, even if I toss more voltage at it. Not sure what's causing it, but even setting the frequency in Ryzen Master to a known tested frequency/voltage that's within my tested "safe" range occasionally just locks the system with a THREAD_STUCK_IN_DEVICE_DRIVER (or something like that) BSOD.

I happen to have the same Gigabyte AB350N-Gaming WiFi that a lot of reviewers are using, on the same BIOS revision, and some of them are hitting 1.6GHz on the GPU.

I must have terrible luck with the silicon lottery... I've got a Wraith Max slapped on there currently, and the temps don't get much into the 50s when gaming, usually staying in the mid 40s.

There are different voltages you can change in the BIOS. Maybe you just tried the wrong one. Some BIOSes have confusing terminology. I know Gigabyte has some quirks too, like having certain settings in more than one area.
Anyway, it seems the NB/SOC voltage is the one you need to change to help with IGP overclocking. There is a pic of an MSI BIOS and what to change here: https://wccftech.com/amd-ryzen-3-2200g-vega-8-overclocked-1600mhz-performance/
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
Hmm, I've tried with DDU a few times. Even uninstalling all chipset drivers and the Ryzen Master software, then reinstalling.

I may just run at stock until the next round of driver updates and a BIOS update on my board... then do a fresh Windows 10 install (I already did, but I noticed my UEFI boot snagged onto a different SSD than my NVMe drive, so it'll clean things up).


Have you tried using an older version of Windows 10?
 

Magic Hate Ball

Senior member
Feb 2, 2017
290
250
96
There are different voltages you can change in the BIOS. Maybe you just tried the wrong one. Some BIOSes have confusing terminology. I know Gigabyte has some quirks too, like having certain settings in more than one area.
Anyway, it seems the NB/SOC voltage is the one you need to change to help with IGP overclocking. There is a pic of an MSI BIOS and what to change here: https://wccftech.com/amd-ryzen-3-2200g-vega-8-overclocked-1600mhz-performance/
I've been using Ryzen Master, which has a dedicated "APU GFX Voltage" option. I did try earlier with a BIOS setting, but I didn't go as high as I did in Ryzen Master; I'll try that next to see if I can make it stable.


Have you tried using an older version of Windows 10?

Everything I've heard on YouTube and Reddit points towards using the latest 1709 Windows 10 installer.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
Here is a 2200G review by Digital Foundry, in which they show many titles on the 2200G gaining more from a 36% iGPU overclock (1100->1500MHz) than from 33% higher bandwidth (2400->3200). I'm not saying TFLOPS > GB/s or anything like that, just that overclocking an iGPU is very much worth it. The 2400G will not gain as much, but I'm sure overclocking will help as well, at least in some games/settings.

I did say especially the 2400G. From what I've seen, the 2200G does gain more from core overclocking; in fact at max overclocks (assuming the same clocks) it appears the 2200G / Vega 8 isn't that far off the 2400G / Vega 11, because they are both limited by the same amount of bandwidth and this bottlenecks the benefit from the additional 3 CUs.
 

neblogai

Member
Oct 29, 2017
144
49
101
I did say especially the 2400G. From what I've seen, the 2200G does gain more from core overclocking; in fact at max overclocks (assuming the same clocks) it appears the 2200G / Vega 8 isn't that far off the 2400G / Vega 11, because they are both limited by the same amount of bandwidth and this bottlenecks the benefit from the additional 3 CUs.

Well, the iGPU is not made of CUs alone. The front end and ROPs in the 2400G should benefit from 30% higher iGPU clocks just as well, especially because Raven Ridge seems to have only 8 ROPs?
And considering there is a lot of interest and excitement about these chips, I find it strange that important comparisons, like iGPU benchmarks at 1600MHz (2200G vs 2400G) over a wide range of games, have not been done yet. Nor is there a definite answer whether 8GB of total RAM is sufficient to play every heavy AAA game without stuttering. If not, then H310+G5400+GT1030 might be a better choice soon.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
Well, the iGPU is not made of CUs alone. The front end and ROPs in the 2400G should benefit from 30% higher iGPU clocks just as well, especially because Raven Ridge seems to have only 8 ROPs?
And considering there is a lot of interest and excitement about these chips, I find it strange that important comparisons, like iGPU benchmarks at 1600MHz (2200G vs 2400G) over a wide range of games, have not been done yet. Nor is there a definite answer whether 8GB of total RAM is sufficient to play every heavy AAA game without stuttering. If not, then H310+G5400+GT1030 might be a better choice soon.

Yes, it's not just the CUs, since the number of ROPs and shading units is also 30% higher.

Anyhow, 2200G vs 2400G benchmarks, both overclocked with the iGPUs set at 1.35GHz: https://www.youtube.com/watch?v=a-WMpcTVykc

As I said previously, there isn't a lot of difference between the two, which is likely due to memory bandwidth bottlenecks.

WRT 8 vs 16GB, it appears 8GB is enough: https://youtu.be/Y2KPzMeQnWE

Of course, you would have to be diligent about closing background apps that take up too much RAM on an 8GB system; I'm sure the reviewer did the benchmarks with minimal background tasks open.
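The CU gap at matched clocks is easy to quantify on paper. A rough sketch of peak FP32 throughput, assuming GCN's 64 shaders per CU and 2 FLOPs per shader per clock (the function name is illustrative):

```python
# Peak FP32 throughput for a GCN iGPU: CUs * 64 shaders * 2 FLOPs/clock * clock
def fp32_tflops(cus, clock_ghz, shaders_per_cu=64):
    return cus * shaders_per_cu * 2 * clock_ghz / 1000.0  # TFLOPS

vega8  = fp32_tflops(8, 1.35)    # ~1.38 TFLOPS (2200G / Vega 8 at 1.35GHz)
vega11 = fp32_tflops(11, 1.35)   # ~1.90 TFLOPS (2400G / Vega 11 at 1.35GHz)
```

At equal clocks the Vega 11 has 11/8 = 37.5% more raw shader throughput on paper, so similar game results between the two would support the idea that both are starved by the same memory bus.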
 

neblogai

Member
Oct 29, 2017
144
49
101
Yes, it's not just the CUs, since the number of ROPs and shading units is also 30% higher.

No, the number of ROPs stays the same between the 2200G and 2400G; however, overclocking the 2400G from 1250MHz to 1600MHz overclocks not just the CUs, but the front end and ROPs by ~28%, which could be what the 2400G needs.

Anyhow, 2200G vs 2400G benchmarks, both overclocked with the iGPUs set at 1.35GHz: https://www.youtube.com/watch?v=a-WMpcTVykc

As I said previously, there isn't a lot of difference between the two, which is likely due to memory bandwidth bottlenecks.

WRT 8 vs 16GB, it appears 8GB is enough: https://youtu.be/Y2KPzMeQnWE

Of course, you would have to be diligent about closing background apps that take up too much RAM on an 8GB system; I'm sure the reviewer did the benchmarks with minimal background tasks open.

I would like a proper review with 20+ AAA games tested, because we need to know if all games can be made to run well, not just whether some games do. And especially hand-picking the titles known for high RAM and VRAM use, like ROTR, Doom and Wolfenstein 2, for the 8GB RAM testing.

I'd better not see any stuttering in any heavy AAA title with a GT1030 + 8GB of RAM then.

Having 2GB of extra RAM on the discrete card should help in at least some titles. Of course, a 2c/4t CPU may introduce a different bottleneck, but first I'd like to know if/how often 8GB of RAM with the 2200G is not enough.
 

coercitiv

Diamond Member
Jan 24, 2014
6,201
11,903
136
Having 2GB of extra RAM on the discrete card should help in at least some titles. Of course, a 2c/4t CPU may introduce a different bottleneck, but first I'd like to know if/how often 8GB of RAM with the 2200G is not enough.
To me this type of comparison is getting very tiresome: these extreme budget builds are by definition the pinnacle of compromise, yet we tend to have very high expectations of the APUs while ignoring the obvious weakness in alternative budget dGPU builds.

Choosing to build with 2c/4t and 8GB @ 2400MHz is going to incur an extra cost sooner or later, either by forcing lower details in future games or by demanding a hardware upgrade (i3 upgrade, extra 8GB). I can understand this kind of sacrifice if one targets a 1050/1050 Ti right now and gets considerably better performance to start with, but not on a GT 1030, where the performance uplift is low at best and the entire system will need upgrades just as soon as Vega 11 runs out of steam in AAA titles.
 

neblogai

Member
Oct 29, 2017
144
49
101
To me this type of comparison is getting very tiresome: these extreme budget builds are by definition the pinnacle of compromise, yet we tend to have very high expectations of the APUs while ignoring the obvious weakness in alternative budget dGPU builds.

Choosing to build with 2c/4t and 8GB @ 2400MHz is going to incur an extra cost sooner or later, either by forcing lower details in future games or by demanding a hardware upgrade (i3 upgrade, extra 8GB). I can understand this kind of sacrifice if one targets a 1050/1050 Ti right now and gets considerably better performance to start with, but not on a GT 1030, where the performance uplift is low at best and the entire system will need upgrades just as soon as Vega 11 runs out of steam in AAA titles.

Well, some games being unplayable at some point later is better than making the same sacrifice while the system is new. That is why I want it tested and compared. It's not just about the desktop market; at some point I would want to buy a very portable laptop (12-13") with Raven Ridge. Such laptops are usually soldered/glued together, making a RAM upgrade impossible. That is why I'm interested to see if 8GB of total RAM is enough to run every game.
 

coercitiv

Diamond Member
Jan 24, 2014
6,201
11,903
136
At some point, I would want to buy a very portable laptop (12-13") with Raven Ridge. Such laptops are usually soldered/glued together, making a RAM upgrade impossible. That is why I'm interested to see if 8GB of total RAM is enough to run every game.
So not only does it have to run with 8GB, but do so within 15W TDP. Don't you think at some point you might have to turn details down anyway?
 

neblogai

Member
Oct 29, 2017
144
49
101
So not only does it have to run with 8GB, but do so within 15W TDP. Don't you think at some point you might have to turn details down anyway?

I never said anything about 15W; I'd actually prefer a ~35W TDP APU model in it. And it is possible, seeing how Lenovo can fit 2c/4t + an MX150/940MX (15W+25W) into the 13" Lenovo 710S Plus. I don't even mind if the laptop is thicker than that; I just want it to be small and weigh less than 1.5kg.