AMD Ryzen 5 2400G and Ryzen 3 2200G APUs performance unveiled

Page 43 - AnandTech community forum thread.

USER8000

Golden Member
Jun 23, 2012
1,542
780
136
I did not see this before, but again, a great find from TUM_APISAK:
1. 3DMark Fire Strike 2400G test with 3200MHz RAM: link.
2. Even better: Fire Strike results at different RAM speeds: link.
3. Fire Strike Extreme and Time Spy scores.

I'd like to add that the hardware in the first link is an ASRock ITX board with 16GB of 3200MHz RAM, so it might be the hardware we will see in 2400G reviews, because I think AnandTech mentioned this is what AMD is sending out to reviewers.



That is only a 10% increase in the score going from 2400MHz to 3200MHz DDR4.
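For context, the numbers behind that observation can be sketched quickly (a rough calculation using textbook peak-bandwidth figures; the 10% is the score delta noted above, not a new measurement):

```python
# Rough sketch: theoretical dual-channel DDR4 peak bandwidth at 2400 vs 3200 MT/s,
# compared against the ~10% Fire Strike score gain mentioned in the post above.

def ddr4_bandwidth_gbs(mt_per_s, channels=2, bus_bytes=8):
    """Peak bandwidth in GB/s: transfers/s x 8-byte bus width x channel count."""
    return mt_per_s * bus_bytes * channels / 1000

bw_2400 = ddr4_bandwidth_gbs(2400)  # 38.4 GB/s
bw_3200 = ddr4_bandwidth_gbs(3200)  # 51.2 GB/s

bandwidth_gain = bw_3200 / bw_2400 - 1  # ~33% more raw bandwidth
score_gain = 0.10                       # the ~10% score increase noted above

print(f"{bw_2400:.1f} GB/s -> {bw_3200:.1f} GB/s ({bandwidth_gain:.0%} more bandwidth)")
print(f"score gained only {score_gain:.0%}")
```

A 33% bump in peak bandwidth buying only ~10% in score suggests the iGPU is partly limited by something other than memory bandwidth in this particular test.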
 

neblogai

Member
Oct 29, 2017
144
49
101
What I find really curious is this comparison: the GT 1030 and the 8400 are faster in their respective Graphics and Physics scores, but when it comes to the Combined score, where both tests run at the same time, the 2400G suddenly wins. How is that possible? Is it from lower latencies due to having the CPU and iGPU together?
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
Because 3DMark is not very bandwidth-demanding... old APUs got around a 20-25% increase going from single to dual channel. This is why 3DMark is the absolute best-case scenario for an APU.

What I find really curious is this comparison: the GT 1030 and the 8400 are faster in their respective Graphics and Physics scores, but when it comes to the Combined score, where both tests run at the same time, the 2400G suddenly wins. How is that possible? Is it from lower latencies due to having the CPU and iGPU together?

Nah, the Combined score uses the GPU's compute capabilities; the 11 CUs of the 2400G should be way faster than the GT 1030 for compute.
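One way to see why that is plausible is a back-of-the-envelope FP32 throughput comparison (a sketch using the advertised shader counts and clocks; real Combined scores depend on far more than peak FLOPS):

```python
# Back-of-the-envelope FP32 throughput: shaders x 2 ops per clock (FMA) x clock.

def peak_tflops(shaders, clock_mhz):
    """Theoretical peak FP32 TFLOPS for a GPU."""
    return shaders * 2 * clock_mhz / 1e6

vega11 = peak_tflops(11 * 64, 1250)  # 2400G iGPU: 11 CUs x 64 shaders, ~1250MHz
gt1030 = peak_tflops(384, 1468)      # GT 1030: 384 CUDA cores at boost clock

print(f"2400G  ~{vega11:.2f} TFLOPS")
print(f"GT1030 ~{gt1030:.2f} TFLOPS")
```

On paper the Vega 11 iGPU has roughly 50% more peak compute than the GT 1030, which lines up with the 2400G pulling ahead in the compute-heavy Combined test.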
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
Well, the driver was old; maybe with newer drivers the score goes up a bit, because if it can't match a GT 1030 in 3DMark there is just no chance in games, and if the 2400G is slower, the 2200G may be too far behind.
 

IRobot23

Senior member
Jul 3, 2017
601
183
76
Well, the driver was old; maybe with newer drivers the score goes up a bit, because if it can't match a GT 1030 in 3DMark there is just no chance in games, and if the 2400G is slower, the 2200G may be too far behind.

Maybe Fire Strike is not so bandwidth-demanding...
 

Glo.

Diamond Member
Apr 25, 2015
5,705
4,548
136
Yeah, there are leaks of a Feng Huang APU with RX 570D shaders.
RX 570D shader count - yes. But it was not an RX 570D, because that would be reported by the benchmark. Quite the contrary: it was referred to as AMD 15FF graphics. The Raven Ridge GPU was reported as 15DD graphics.
It may be a console APU.
None of the console APUs were ever spotted in desktop benchmarks.
 

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
Argghhh, such a difficult decision. The kid in me wants to get the 2400G but the adult in me wants to get the 2200G. :p

Then get the 2400G for the kid and the 2200G for yourself. :p I have an RX 470, but without doing any hardcore gaming I think the 2400G will do everything I need. Well, maybe I'll just get him the 2400G. Bah, yep, difficult decision is right!
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
Well, the driver was old; maybe with newer drivers the score goes up a bit, because if it can't match a GT 1030 in 3DMark there is just no chance in games, and if the 2400G is slower, the 2200G may be too far behind.

Well, with the old APUs, the APU needed to achieve a considerably higher score than the dGPU in order to achieve the same or better performance; take this for example:

A10-7850K alone https://www.3dmark.com/fs/9693550

A10-7850K with GT740 DDR3 https://www.3dmark.com/fs/9815646
 

french toast

Senior member
Feb 22, 2017
988
825
136
A 2400G + 3200MHz RAM is not going to get near an overclocked GT 1030 in gaming.

The 2700U + 2400MHz DDR4 gets destroyed by the MX150... even with Adrenalin drivers.
https://youtu.be/mDVlQiBbOPg

A higher resolution would make it worse.
It's possible an overclocked 2400G would get close to a non-overclocked GT 1030 imo.
Still makes Raven Ridge a brilliant product.
 

IRobot23

Senior member
Jul 3, 2017
601
183
76
A 2400G + 3200MHz RAM is not going to get near an overclocked GT 1030 in gaming.

The 2700U + 2400MHz DDR4 gets destroyed by the MX150... even with Adrenalin drivers.
https://youtu.be/mDVlQiBbOPg

A higher resolution would make it worse.
It's possible an overclocked 2400G would get close to a non-overclocked GT 1030 imo.
Still makes Raven Ridge a brilliant product.

Burned!
Core clock < 950MHz ~ 25W TDP limit (+CPU) vs 25W TDP for the MX150.
Memory clock < 2133/2400MHz

As you can see, it rarely uses 1200MHz on the IMC; it is usually at ~1866MHz DDR4.

It is really interesting that both the IMC and core clocks are jumping up and down. Anyway, I hardly ever see 1067MHz or 1200MHz on the IMC. Looks like I am the only one here who noticed it.
 

IRobot23

Senior member
Jul 3, 2017
601
183
76
That's hardly a convincing argument: you have a mobile APU with a maximum TDP of 25W vs a desktop part at 65W, plus 1/3 more bandwidth (with DDR4-3200).
Btw, you could OC the iGPU of the 2400G too.

Also, check the video and the IMC speed!
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
That's hardly a convincing argument: you have a mobile APU with a maximum TDP of 25W vs a desktop part at 65W, plus 1/3 more bandwidth (with DDR4-3200).
Btw, you could OC the iGPU of the 2400G too.

The only convincing argument is the 3DMark numbers, and those aren't good enough.

The rest is just pointless discussion, because the GT 1030 is not a GPU I would recommend anyone get with a new PC if they intend to do any gaming on it. Many people here are trying to pass the GT 1030 off as a gaming GPU, and it's just a weak entry-level GPU intended mainly for old PCs and multimedia. It also found a place alongside the iGPU-less Ryzen for non-gaming purposes, just like the GT 710.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
It is really interesting that both the IMC and core clocks are jumping up and down.

Perfectly normal behaviour for an AMD APU. My Athlon 845 has a northbridge base clock of 700MHz, jumping to 1100/1300MHz when doing something memory-intensive.

It's probably to save a bit of power.
 

french toast

Senior member
Feb 22, 2017
988
825
136
That's hardly a convincing argument: you have a mobile APU with a maximum TDP of 25W vs a desktop part at 65W, plus 1/3 more bandwidth (with DDR4-3200).
Btw, you could OC the iGPU of the 2400G too.
The 2400G stands to make up more ground on desktop vs the MX150/GT 1030... BUT the 2700U is behind massively. I would suggest an overclocked 2400G with 3200MHz DDR4 would perform slightly slower than a stock GT 1030 in real gaming; an overclocked 1030 would be unreachable imo.
Burned!
Core clock < 950MHz ~ 25W TDP limit (+CPU) vs 25W TDP for the MX150.
Memory clock < 2133/2400MHz

As you can see, it rarely uses 1200MHz on the IMC; it is usually at ~1866MHz DDR4.

It is really interesting that both the IMC and core clocks are jumping up and down. Anyway, I hardly ever see 1067MHz or 1200MHz on the IMC. Looks like I am the only one here who noticed it.
No, I noticed your earlier comments and looked myself.

This image makes me so happy:

[attached image]
:)
 

IRobot23

Senior member
Jul 3, 2017
601
183
76
Perfectly normal behaviour for an AMD APU. My Athlon 845 has a northbridge base clock of 700MHz, jumping to 1100/1300MHz when doing something memory-intensive.

It's probably to save a bit of power.

Well, the NB is not hard-linked to the IMC, but here the IF is hard-linked to the IMC, so a lower IMC clock means the DDR4 speed is not at 2400MT/s.
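To make that relation concrete (a minimal sketch; the clock values are the ones discussed in this thread, used as examples): DDR4 transfers data on both clock edges, so the effective transfer rate is twice the memory/IMC clock.

```python
# Sketch of the MEMCLK <-> DDR4 relation: DDR4 is double data rate, so the
# effective transfer rate (MT/s) is twice the memory clock (MHz). On Raven
# Ridge the Infinity Fabric is clocked with the IMC, so a lower reported IMC
# clock directly implies a lower effective DDR4 speed.

def memclk_to_ddr4(memclk_mhz):
    """Effective DDR4 transfer rate (MT/s) for a given memory/IMC clock (MHz)."""
    return memclk_mhz * 2

def ddr4_to_memclk(mt_per_s):
    """Memory/IMC clock (MHz) implied by a DDR4-xxxx rating."""
    return mt_per_s // 2

print(memclk_to_ddr4(1200))  # 2400 -> the IMC clock DDR4-2400 would need
print(memclk_to_ddr4(933))   # 1866 -> roughly the DDR4-1866 seen in the video
print(ddr4_to_memclk(3200))  # 1600 -> MEMCLK implied by DDR4-3200
```

So an IMC hovering around ~933MHz instead of 1200MHz is exactly the ~1866MHz effective DDR4 speed mentioned above.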
 

neblogai

Member
Oct 29, 2017
144
49
101
2400g stands to make up more ground on pc Vs mx150/gtx 1030...BUT the 2700U is behind massively,
:)

The 2700U is not a competitor to an Intel CPU + MX150. It is a premium-class 15W APU, same as the 8550U. The competitor to an Intel CPU + dGPU is AMD's own RR + dGPU, while the lowest-end budget gaming (MX110-130 and to some extent the MX150) will be countered by a higher-TDP mobile RR alone.
 

wahdangun

Golden Member
Feb 3, 2011
1,007
148
106
RX 570D shader count - yes. But it was not an RX 570D, because that would be reported by the benchmark. Quite the contrary: it was referred to as AMD 15FF graphics. The Raven Ridge GPU was reported as 15DD graphics.

None of the console APUs were ever spotted in desktop benchmarks.


Yes, I know. I'm just saying it has the RX 570D shader configuration, but why is AMD using just 2GB of HBM2? Are they banking on HBCC?
 

neblogai

Member
Oct 29, 2017
144
49
101
Yes, I know. I'm just saying it has the RX 570D shader configuration, but why is AMD using just 2GB of HBM2? Are they banking on HBCC?

AFAIK, there were leaks of 2GB and 4GB versions as engineering samples, but only 4GB as a pre-release qualification sample.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
A 2400G + 3200MHz RAM is not going to get near an overclocked GT 1030 in gaming.

The 2700U + 2400MHz DDR4 gets destroyed by the MX150... even with Adrenalin drivers.
https://youtu.be/mDVlQiBbOPg

A higher resolution would make it worse.
It's possible an overclocked 2400G would get close to a non-overclocked GT 1030 imo.
Still makes Raven Ridge a brilliant product.

The MX150 has a TDP of 25W, while the 2700U has a combined CPU + GPU cTDP of 25W. It's obvious that if we give the 2700U more TDP and bump it to 35W or 45W, it will quickly close the gap. I am betting a 2400G with DDR4-3200 will match a stock GT 1030 in games. If you throw in DDR4-3600, which btw is $10 costlier than DDR4-3200, plus a bit of core overclocking, you are going to be able to keep up with an overclocked GT 1030.
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
I don't trust AMD TDP numbers anyway; there is a chance that an AMD "25W" needs the same cooling as a 25W MX150 + 15W Intel CPU. Everyone who used a "65W" AMD APU in a badly cooled PC case knows what I mean. That pretty much reminds me of Prescott cores.
But I can't be sure unless I get my hands on one.

About the 2200G and 2400G, Feb 12 is the date, right? We are close to seeing the truth.
 

IRobot23

Senior member
Jul 3, 2017
601
183
76
I don't trust AMD TDP numbers anyway; there is a chance that an AMD "25W" needs the same cooling as a 25W MX150 + 15W Intel CPU. Everyone who used a "65W" AMD APU in a badly cooled PC case knows what I mean. That pretty much reminds me of Prescott cores.
But I can't be sure unless I get my hands on one.

About the 2200G and 2400G, Feb 12 is the date, right? We are close to seeing the truth.

WHAT?