AMD Ryzen 5 2400G and Ryzen 3 2200G APUs performance unveiled


raghu78

Diamond Member
Aug 23, 2012
I am impressed with the performance of the Ryzen 5 2400G in games like Hitman, Wolfenstein II, and Battlefield 1 at low settings at 1080p/900p. It's impressive to see the Ryzen APU do so well in these demanding games; hitting 40-50 fps in them is no simple feat for a bandwidth-limited IGP.

https://techreport.com/review/33235/amd-ryzen-3-2200g-and-ryzen-5-2400g-processors-reviewed/7
https://www.techspot.com/review/1574-amd-ryzen-5-2400g-and-ryzen-3-2200g/page4.html
http://www.tomshardware.co.uk/amd-ryzen-5-2400g-zen-vega-cpu-gpu,review-34205-6.html

BF1 multiplayer at 1080p low settings with high texture filtering on a 2400G overclocked to 3.9 GHz CPU and 1.5 GHz GPU: 50-100 fps. Absolutely mind-blowing.

https://www.youtube.com/watch?v=K0cxIRp8Q0g
 

Shivansps

Diamond Member
Sep 11, 2013
Have you actually been reading this thread? The speculation all along has been that the performance would be in the range of the 1030. That is pretty much how it turned out. Now whether that is a reasonable performance for a gaming desktop is another issue altogether.

What's your opinion on the 2400G now that you have seen the reviews? The 2200G IGP performs close to the 2400G, but I certainly expected it to perform even closer; it's a strange thing I can't find a reason for. I certainly expected a lot more from the 2200G IGP relative to the 2400G's.

Maybe it's the drivers that aren't ready yet.
 

R0H1T

Platinum Member
Jan 12, 2013
Well that was a mighty quick edit, perhaps backtracking on your claims that it won't match a 1030? It's a good thing we have Google & your earlier posts #644
Come on, wake up, the old A10-7870K managed to beat an R7 250 ($90) DDR3 when paired with fast DDR3 RAM, as well as a GT740 ($100) DDR3. The A8-7600 managed to beat an R7 240 ($70) with fast RAM.

That's not one, but TWO tiers of entry-level dGPUs.

Now comes a $170 2400G that is very likely to be slower than the GT1030 ($75).
I know this is because of GDDR5.

The good part here is that, now that the APUs have decent CPU cores, you can actually plan to build a 2400G gaming PC and add a decent dGPU later.

Everything else is the same, and even a bit worse if these new APUs can't beat any dGPU (the GT710 doesn't count).
 

DiogoDX

Senior member
Oct 11, 2012
I am impressed with the performance of the Ryzen 5 2400G in games like Hitman, Wolfenstein II, and Battlefield 1 at low settings at 1080p/900p. It's impressive to see the Ryzen APU do so well in these demanding games; hitting 40-50 fps in them is no simple feat for a bandwidth-limited IGP.

https://techreport.com/review/33235/amd-ryzen-3-2200g-and-ryzen-5-2400g-processors-reviewed/7
https://www.techspot.com/review/1574-amd-ryzen-5-2400g-and-ryzen-3-2200g/page4.html
http://www.tomshardware.co.uk/amd-ryzen-5-2400g-zen-vega-cpu-gpu,review-34205-6.html

BF1 multiplayer at 1080p low settings with high texture filtering on a 2400G overclocked to 3.9 GHz CPU and 1.5 GHz GPU: 50-100 fps. Absolutely mind-blowing.

https://www.youtube.com/watch?v=K0cxIRp8Q0g
50% resolution scale = half resolution

This is why the footage is a blurry mess even on 1080p 60fps YouTube.

EDIT: At 7:30 he tests with 100% (true 1080p) and the performance is about 45-50 fps.
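For context on how much that setting cuts the render resolution, here is a quick back-of-the-envelope sketch; it assumes BF1's resolution scale is applied per axis, which is how Frostbite's slider generally works:

```python
# Internal render resolution for a given resolution-scale setting,
# assuming the scale multiplies each axis (width and height) independently.
def internal_resolution(width, height, scale):
    return round(width * scale), round(height * scale)

w, h = internal_resolution(1920, 1080, 0.5)
print(w, h)                          # 960 540
print((w * h) / (1920 * 1080))       # 0.25 -> only a quarter of the pixels of true 1080p
```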
 

tamz_msc

Diamond Member
Jan 5, 2017
As far as overclocking the GT 1030 is concerned, since most people would be targeting 30fps in AAA games, it really doesn't matter if it's 30 or 33fps. So the GT 1030 is a completely irrelevant product except for those who need the extra display ports, or if by any chance AMD doesn't get to enable 4K Netflix decode due to software issues.

Those who are getting the 2200G should overclock the iGPU though, as doing so gets it closer to the GT 1030. However, one has to choose a motherboard with decent VRMs for V_SOC.
 

raghu78

Diamond Member
Aug 23, 2012
Well that was a mighty quick edit, perhaps backtracking on your claims that it won't match a 1030? It's a good thing we have Google & your earlier posts #644

Thanks for posting that. At this point I am just LMAO. Anyway, for people who want the answer to the legendary question - can it run Crysis? Yes, the 2400G can match the legendary 8800 GTX, running Crysis at 16x10 resolution at Very High settings at 30 fps. (Go to 4:45 in the video.)

https://www.youtube.com/watch?v=sCWOfwcYmHI
 

Abwx

Lifer
Apr 2, 2011
50%? Maybe if you buy a CPU to run one specific synthetic benchmark over and over. :rolleyes:.

Because Cinebench is synthetic...?

The Cinema 4D renderer is synthetic software, a la Sandra, you think??

You have other ray tracers and renderers below:

[Corona renderer benchmark chart]

[Blender benchmark chart]


But if you are doing something that people actually do with their multi-core home computers, like rendering video with Handbrake, you might get something like 18% better performance for a ~70% price increase, so I think the 2200G is a much better value.

Of course, one app decides for the rest, preferably one that scales only averagely for the time being, never mind the mid term, but even then:

[Handbrake benchmark chart]
 

Thunder 57

Platinum Member
Aug 19, 2007
And watch the price of high-end dGPUs when they don't have the volume low- and mid-level cards to support them. You'll be looking at Quadro and FirePro pricing for gaming cards.

Developers would have to adjust. They already largely target the console market anyway so it shouldn't be a problem.
 

Shivansps

Diamond Member
Sep 11, 2013
Well that was a mighty quick edit, perhaps backtracking on your claims that it won't match a 1030? It's a good thing we have Google & your earlier posts #644

Yes, I edited that. What it said was "I expected more of the 2200G IGP compared to the GT1030 and the 2400G IGP"; I removed the GT1030 from that so I could avoid another brain-dead discussion like this.

It is very interesting how some people want to twist reality now; the GT1030 talk was brought up by the people defending the APUs. I did not even know what the GT1030 could do until I started reading this thread.

This is my first post:
https://forums.anandtech.com/thread...ormance-unveiled.2533111/page-6#post-39252893
I pointed out that it was strange for AMD not to compare to the RX550 or the GT1030 (as it was mentioned on the slides), and only to Intel IGPs.

And this is the first post where I mentioned a G4560+GTX1050:
https://forums.anandtech.com/thread...ormance-unveiled.2533111/page-6#post-39252956
I can now change that to 2200G+GTX1050.

The rest was people saying it would perform better than the GT1030, and me saying that, based on the AMD slides, I didn't believe it would, and that it would be bad because older APUs beat the crap out of entry-level dGPUs of the tier the GT1030 is in now. And BTW, I was right: it is not better, it's trading blows with the GT1030, slower in some cases and faster in others.

And the CPU is not helping the GT1030 at all.

You also missed all the posts where I said I would like it to perform better than the GT1030; this is a nice summary from around page 30, I'm not going to look for specific posts.
https://forums.anandtech.com/thread...rmance-unveiled.2533111/page-34#post-39282119
 

R0H1T

Platinum Member
Jan 12, 2013
Yes, I edited that. What it said was "I expected more of the 2200G IGP compared to the GT1030 and the 2400G IGP"; I removed the GT1030 from that so I could avoid another brain-dead discussion like this.

It is very interesting how some people want to twist reality now; the GT1030 talk was brought up by the people defending the APUs. I did not even know what the GT1030 could do until I started reading this thread.

This is my first post:
https://forums.anandtech.com/thread...ormance-unveiled.2533111/page-6#post-39252893
I pointed out that it was strange for AMD not to compare to the RX550 or the GT1030 (as it was mentioned on the slides), and only to Intel IGPs.

And this is the first post where I mentioned a G4560+GTX1050:
https://forums.anandtech.com/thread...ormance-unveiled.2533111/page-6#post-39252956
I can now change that to 2200G+GTX1050.

The rest was people saying it would perform better than the GT1030, and me saying that, based on the AMD slides, I didn't believe it would, and that it would be bad because older APUs beat the crap out of entry-level dGPUs of the tier the GT1030 is in now. And BTW, I was right: it is not better, it's trading blows with the GT1030, slower in some cases and faster in others.

And the CPU is not helping the GT1030 at all.

You also missed all the posts where I said I would like it to perform better than the GT1030; this is a nice summary from around page 30, I'm not going to look for specific posts.
https://forums.anandtech.com/thread...rmance-unveiled.2533111/page-34#post-39282119
Yeah, except your reason for supporting the GT 1030 was odd ~ 64-bit GDDR5 clocked at 6 Gbps will not have a distinct (bandwidth) advantage just because it's GDDR5. If anything the 1030, aimed at laptops as the MX150, is punching above its weight with a mere 30W TDP.

BTW, you also made up your mind, like others, that the 2400G IGP would be bad/worse than some of the more optimistic projections in the thread, so there's that.
 

Shivansps

Diamond Member
Sep 11, 2013
Yeah, except your reason for supporting the GT 1030 was odd ~ 64-bit GDDR5 clocked at 6 Gbps will not have a distinct (bandwidth) advantage just because it's GDDR5. If anything the 1030, aimed at laptops as the MX150, is punching above its weight with a mere 30W TDP.

BTW, you also made up your mind, like others, that the 2400G IGP would be bad/worse than some of the more optimistic projections in the thread, so there's that.

Of course, the GT1030 was the only card the APU had a chance to beat; it is 64-bit GDDR5, and there is no way to beat an RX550 with its 112 GB/s. What else was I going to say? Expecting these APUs to beat a 128-bit GDDR5 GPU was 100% unrealistic, and BTW, I never defended the GT1030.
Example A: https://forums.anandtech.com/thread...rmance-unveiled.2533111/page-24#post-39273176
Example B: https://forums.anandtech.com/thread...rmance-unveiled.2533111/page-27#post-39277342

Yes, I said I expected the 2400G to be slower than the GT1030, several times, based off AMD's own slides; they avoided the gaming comparison like they always did (I've never hidden or attempted to hide this like you are suggesting), and the end result was not pretty. I can see now why AMD decided not to compare the 2400G to a GT1030.
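For reference, here is a rough sketch of the bandwidth numbers being argued about in this exchange; the DDR4-3200 figure for the APU is just an assumed example, since reviewers tested with different memory kits:

```python
# Peak theoretical memory bandwidth: data rate (GT/s) x bus width (bits) / 8
def bandwidth_gbs(data_rate_gtps, bus_width_bits):
    return data_rate_gtps * bus_width_bits / 8

print(bandwidth_gbs(6.0, 64))    # GT 1030: 6 Gbps GDDR5, 64-bit bus  -> 48.0 GB/s
print(bandwidth_gbs(7.0, 128))   # RX 550:  7 Gbps GDDR5, 128-bit bus -> 112.0 GB/s
print(bandwidth_gbs(3.2, 128))   # 2400G:   dual-channel DDR4-3200    -> 51.2 GB/s,
                                 # shared with the CPU rather than dedicated to the iGPU
```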
 

PeterScott

Platinum Member
Jul 7, 2017
Because Cinebench is synthetic...?

The Cinema 4D renderer is synthetic software, a la Sandra, you think??

You have other ray tracers and renderers below:

[Corona renderer benchmark chart]

[Blender benchmark chart]




Of course, one app decides for the rest, preferably one that scales only averagely for the time being, never mind the mid term, but even then:

[Handbrake benchmark chart]

Cinebench is a benchmark for ray tracing, which almost no one does, which makes it essentially a synthetic benchmark for most people.

I chose video encoding in Handbrake as more representative of what people actually do with CPUs, because video encoding is a MUCH more common home activity, and Handbrake is actually one of the more common programs to do such encoding with.

But by all means, if ray tracing is a big part of your computing activities then you may find the 2400G worth the big jump in price over the 2200G; for most use cases and most people, it likely isn't.

The 2200G is a great value part; the 2400G is not quite as good a value.
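To put a number on that value comparison, here is a minimal sketch using the $99/$169 launch prices and the ~18% Handbrake delta mentioned above; the exact gap varies from review to review:

```python
# Rough price/performance comparison of the two APUs in a Handbrake-style workload.
PRICE_2200G, PRICE_2400G = 99, 169      # USD launch prices
PERF_2200G, PERF_2400G = 1.00, 1.18     # normalized throughput (~18% advantage cited above)

price_increase = PRICE_2400G / PRICE_2200G - 1                       # ~0.71 -> ~71%
perf_increase = PERF_2400G / PERF_2200G - 1                          # 0.18 -> 18%
value_ratio = (PERF_2400G / PRICE_2400G) / (PERF_2200G / PRICE_2200G)

print(f"{price_increase:.0%} more money for {perf_increase:.0%} more throughput")
print(f"2400G offers {value_ratio:.2f}x the perf-per-dollar of the 2200G")  # ~0.69x
```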
 

raghu78

Diamond Member
Aug 23, 2012
Of course, the GT1030 was the only card the APU had a chance to beat; it is 64-bit GDDR5, and there is no way to beat an RX550 with its 112 GB/s. What else was I going to say? Expecting these APUs to beat a 128-bit GDDR5 GPU was 100% unrealistic, and BTW, I never defended the GT1030.
Example A: https://forums.anandtech.com/thread...rmance-unveiled.2533111/page-24#post-39273176
Example B: https://forums.anandtech.com/thread...rmance-unveiled.2533111/page-27#post-39277342

Yes, I said I expected the 2400G to be slower than the GT1030, several times, based off AMD's own slides; they avoided the gaming comparison like they always did (I've never hidden or attempted to hide this like you are suggesting), and the end result was not pretty. I can see now why AMD decided not to compare the 2400G to a GT1030.

Dude, the 2400G is pretty much on par with the GT1030. What's admirable is that the 2400G fares remarkably well when VRAM requirements go beyond 2GB in titles like Hitman and Wolfenstein II, where the GT1030 craters completely.

https://www.techspot.com/review/1574-amd-ryzen-5-2400g-and-ryzen-3-2200g/page4.html
https://techreport.com/review/33235/amd-ryzen-3-2200g-and-ryzen-5-2400g-processors-reviewed/7
 

R0H1T

Platinum Member
Jan 12, 2013
Yes, I said I expected the 2400G to be slower than the GT1030, several times, based off AMD's own slides; they avoided the gaming comparison like they always did (I've never hidden or attempted to hide this like you are suggesting), and the end result was not pretty. I can see now why AMD decided not to compare the 2400G to a GT1030.
What do you mean? AFAIK nearly every other review site, YouTuber, and blogger has done a comparison with the GT 1030. The vast majority of them got official review kits from AMD.

Basically, with the advent of the 2400G ~ the bargain-basement dGPU is a dead-end market. In the short term the MX150 will still be hugely popular in the notebook segment, but even there its days are numbered. AMD has staked a claim in the HTPC, mini-ITX & office desktop space with these. The only reason Intel could be preferred is if you need the extra ST performance; there are no direct competitors, nor a combination of parts that matches, at this price point atm.
 

Shivansps

Diamond Member
Sep 11, 2013
Dude, the 2400G is pretty much on par with the GT1030. What's admirable is that the 2400G fares remarkably well when VRAM requirements go beyond 2GB in titles like Hitman and Wolfenstein II, where the GT1030 craters completely.

https://www.techspot.com/review/1574-amd-ryzen-5-2400g-and-ryzen-3-2200g/page4.html
https://techreport.com/review/33235/amd-ryzen-3-2200g-and-ryzen-5-2400g-processors-reviewed/7

Hitman is way too pro-AMD, that's your explanation right there, same for Wolfenstein. Do you believe it's OK for the A12-9800 to perform better than a GT1030? That has nothing to do with VRAM.

Actually, are you sure RR can use more than 2GB of VRAM? I have no info on that; link that to me please.

Look at the rest of the games on that same site and you will see why AMD avoided a 1080p gaming comparison with the GT1030 and a more powerful CPU. Same perf in CS:GO, worse in PUBG, worse in Fortnite, same perf in Overwatch, worse in Rocket League, worse in Dota 2, all top-of-the-line esports games. Witcher 3 is not there, but it's slightly worse there as well; I'm yet to see AC: Origins, and not sure about LoL if Dota is giving that result...
 

LTC8K6

Lifer
Mar 10, 2004
Dude, the 2400G is pretty much on par with the GT1030. What's admirable is that the 2400G fares remarkably well when VRAM requirements go beyond 2GB in titles like Hitman and Wolfenstein II, where the GT1030 craters completely.

https://www.techspot.com/review/1574-amd-ryzen-5-2400g-and-ryzen-3-2200g/page4.html
https://techreport.com/review/33235/amd-ryzen-3-2200g-and-ryzen-5-2400g-processors-reviewed/7
Did those test systems have 16GB of system RAM? I can see that one did. Curious as to the possible effect "when VRAM requirements go beyond 2GB in titles like Hitman and Wolfenstein II" when the system has only 8GB of RAM. Several times it's been posted that 2GB of system RAM would not be needed with the APUs, and that you need to reserve only 1GB or less of system RAM.
 

Shivansps

Diamond Member
Sep 11, 2013
What do you mean? AFAIK nearly every other review site, YouTuber, and blogger has done a comparison with the GT 1030. The vast majority of them got official review kits from AMD.

Basically, with the advent of the 2400G ~ the bargain-basement dGPU is a dead-end market. In the short term the MX150 will still be hugely popular in the notebook segment, but even there its days are numbered. AMD has staked a claim in the HTPC, mini-ITX & office desktop space with these. The only reason Intel could be preferred is if you need the extra ST performance; there are no direct competitors, nor a combination of parts that matches, at this price point atm.

On the slides AMD released (first page here), they compared the 2400G and 2200G to Intel IGPs, and only to the GT1030 in Time Spy; they always compared old APUs to dGPUs before this, 7870 vs GT740, 6850 vs GT630, etc. That's what made me believe it was slower. That's why I quoted the Unboxed review; the reason is simple, the 2400G loses, or at best matches, in most of the esports games out there, and in some of the top AAA titles like Witcher 3. There are a few more games I still need to check before I fully commit to this, but that is what I'm seeing, and it could very well explain it.
 

Abwx

Lifer
Apr 2, 2011
Cinebench is a benchmark for ray tracing, which almost no one does, which makes it essentially a synthetic benchmark for most people.

What is used for the bench is the Cinema 4D renderer, and that's not synthetic; is that difficult to understand, or are you stuck in bad faith using poor argumentation just because the term "bench" is used? At this rate Handbrake is also only a bench, since the encoded files are not the same from one site to another.


In this respect x264 encoding will benefit vastly from MT if the files are of higher definition; with 1080p files the encoder won't load a CPU much beyond 4-6 threads, which is visible in some reviews, so it's not clear-cut. There are instances where the gain is more than the 20% displayed at TR.

30%/27% gains in x264/x265 going from a Ryzen 4C/4T to 4C/8T:

https://www.hardware.fr/articles/965-2/performances-applicatives.html
 

PeterScott

Platinum Member
Jul 7, 2017
What is used for the bench is the Cinema 4D renderer, and that's not synthetic; is that difficult to understand, or are you stuck in bad faith using poor argumentation just because the term "bench" is used? At this rate Handbrake is also only a bench, since the encoded files are not the same from one site to another.


In this respect x264 encoding will benefit vastly from MT if the files are of higher definition; with 1080p files the encoder won't load a CPU much beyond 4-6 threads, which is visible in some reviews, so it's not clear-cut. There are instances where the gain is more than the 20% displayed at TR.

30%/27% gains in x264/x265 going from a Ryzen 4C/4T to 4C/8T:

https://www.hardware.fr/articles/965-2/performances-applicatives.html

I am not sure why you aren't getting the main point: that ray tracing is NOT a common activity. I did say that if ray tracing is something you actually do outside of benchmarks, then it might be worth a 70% increase in price. But I would bet less than 1% of low-end CPU buyers do ray tracing.
 

3DVagabond

Lifer
Aug 10, 2009
Developers would have to adjust. They already largely target the console market anyway so it shouldn't be a problem.
I'm not talking about the devs being able to develop games. I agree that's not an issue. I'm more addressing the high-end GPU market. We already have Nvidia wanting to charge $3K for their top consumer GPU. What's going to happen when they need those to be their volume parts as well?
 

escrow4

Diamond Member
Feb 4, 2013
Over here in Australia the difference between a 2400G and an i5-8400 is $19, and the mobo cost difference between a decent AB350 and a Z370 is around $70. Given all that I'd rather cop it and go Intel. An extra $100 over the course of a build is neither here nor there, given an all-rounder build where gaming isn't a priority.
 

Thunder 57

Platinum Member
Aug 19, 2007
On the slides AMD released (first page here), they compared the 2400G and 2200G to Intel IGPs, and only to the GT1030 in Time Spy; they always compared old APUs to dGPUs before this, 7870 vs GT740, 6850 vs GT630, etc. That's what made me believe it was slower. That's why I quoted the Unboxed review; the reason is simple, the 2400G loses, or at best matches, in most of the esports games out there, and in some of the top AAA titles like Witcher 3. There are a few more games I still need to check before I fully commit to this, but that is what I'm seeing, and it could very well explain it.

You are basically trying to say that everyone else is wrong and it's getting old. For starters:

For the first part of that argument, about having more performance at the same price, AMD suggests the following competition for the Ryzen 5 2400G:
  • $169 Ryzen 5 2400G (4C/8T, 3.6 GHz, 704 SPs)
  • $182 Core i5-7400 (4C/4T, xxx, 24 EUs)
  • $187 Core i5-8400 (6C/6T, xxx, 24 EUs)
For the Ryzen 3 2200G, the competing products are less well defined:
  • $99 Ryzen 3 2200G (4C/4T, 3.5 GHz, 512 SPs)
  • $117 Core i3-8100 (4C/4T, xxx, 24 EUs)
  • $84 Pentium G4620 (2C/4T, xxx, 12 EUs)
So maybe they didn't make a slide? Who cares? Also, do you have that much hate (clearly not love) that you remember what they compared EVERY previous APU to at launch? Obsessed much?

Also you are off on the 2400G losing or at best matching most games. It wins some, it loses some. The two are very comparable overall. Except with the 2400G, you use less power, and have a significantly better CPU. Unless you spend more money on an i3. You are basically just sticking your fingers in your ears at this point. Or you just think more choices/competition is a bad thing.
 

Thunder 57

Platinum Member
Aug 19, 2007
I'm not talking about the devs being able to develop games. I agree that's not an issue. I'm more addressing the high-end GPU market. We already have Nvidia wanting to charge $3K for their top consumer GPU. What's going to happen when they need those to be their volume parts as well?

Well, they make GPUs that bring Intel CPUs' performance on par with AMD's (assuming Intel doesn't step up their game), or they're SOL. That's probably why they've entered different markets in years past. They know they can't depend on consumer graphics forever.