AMD Ryzen 5 2400G and Ryzen 3 2200G APUs performance unveiled

Page 62 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

neblogai

Member
Oct 29, 2017
Have you actually been reading this thread? The speculation all along has been that the performance would be in the range of the 1030. That is pretty much how it turned out. Now whether that is a reasonable performance for a gaming desktop is another issue altogether.
That is not true. People were looking at 2500U vs MX150 results, where the MX150 was ~60% faster, and saying there was no way the 2400G would catch the GT1030 due to iGPUs using shared DDR4 bandwidth, no matter the iGPU clocks. Only after it was shown that desktop Raven Ridge parts would have ~50% higher CPU clocks, iGPU clocks and memory bandwidth did general opinion start moving towards 'close to GT1030', with the naysayers then looking for other excuses against these new parts.
It seems I was right when I projected an overclocked 2200G to catch up to MX150 speed, and an overclocked 2400G to possibly be faster. Now the next question is: would it be possible to overclock the iGPU on a $36 A320 motherboard? If that is possible, then $99 APU + $36 mobo + $94 for 2x4GB of DDR4-3000 RAM + ~$100 for the rest of the system = ~$330, which is crazy value. If not, then a $55 B350 mobo and a total price of ~$350 is still very good.
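For what it's worth, those totals add up; a back-of-the-envelope sketch (the prices are the poster's estimates, not quotes):

```python
# Rough build-cost tally using the prices quoted in the post
# (poster's estimates, not current retail prices).
apu = 99      # Ryzen 3 2200G
ram = 94      # 2x4GB DDR4-3000
rest = 100    # ~case, PSU, storage, etc.

a320_total = apu + 36 + ram + rest   # $36 A320 board
b350_total = apu + 55 + ram + rest   # $55 B350 board

print(a320_total)   # 329, i.e. the ~$330 figure
print(b350_total)   # 348, i.e. the ~$350 figure
```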
 

TempAcc99

Member
Aug 30, 2017
Given the absurd dGPU prices, these APUs for the first time actually seem a reasonable choice, as long as dGPU prices remain high. But that is the only reason: at reasonable dGPU prices, a 2200G paired with a $150 dGPU would cost $70 more than a 2400G but offer double the performance.
 

R0H1T

Platinum Member
Jan 12, 2013
Given the absurd dGPU prices, these APUs for the first time actually seem a reasonable choice, as long as dGPU prices remain high. But that is the only reason: at reasonable dGPU prices, a 2200G paired with a $150 dGPU would cost $70 more than a 2400G but offer double the performance.
That's debatable: the 2400G also has 4 extra threads, and unlike on Intel, the SMT gains on Ryzen can be huge. Though the 2400G will likely run hotter and need an aftermarket cooler if one is pushing it to the limits.
 
  • Like
Reactions: coercitiv

AtenRa

Lifer
Feb 2, 2009
Have you actually been reading this thread? The speculation all along has been that the performance would be in the range of the 1030. That is pretty much how it turned out. Now whether that is a reasonable performance for a gaming desktop is another issue altogether.
I remember people here telling us that the 2400G would be slower than the GT1030 and RX550 because of the memory bandwidth. Especially the RX550, because it has double the memory bandwidth of a 2400G with DDR4-3200 RAM.

Also, nobody has really looked at how much of an upgrade those Ryzen APUs are over last year's Bristol Ridge. The 2200G's iGPU is 70-80%, and in some cases more than 2x, faster than the A12-9800 at the same price. People with Trinity/Kaveri could upgrade to a 2200G/2400G and see a healthy uplift in both CPU and iGPU performance. Not to mention all those millions of Intel iGPU users with a Pentium/Core i3 (3rd/4th/5th/6th Gen).
 

SirDinadan

Member
Jul 11, 2016
Also, nobody has really looked at how much of an upgrade those Ryzen APUs are over last year's Bristol Ridge.
Bristol Ridge is one of the most hilarious releases from AMD. More than a year late, absolutely no support, etc. I reported to AMD that there are huge problems with the A12-9800 frame-time-wise compared to the older A10-7860K, but they didn't really care. It's barely faster than Kaveri/Godavari and stutters a lot more. I will make some comparison videos, just for fun. The real deal is the performance against the GT1030.
 

AtenRa

Lifer
Feb 2, 2009
Bristol Ridge is one of the most hilarious releases from AMD. More than a year late, absolutely no support, etc. I reported to AMD that there are huge problems with the A12-9800 frame-time-wise compared to the older A10-7860K, but they didn't really care. It's barely faster than Kaveri/Godavari and stutters a lot more. I will make some comparison videos, just for fun. The real deal is the performance against the GT1030.
Well, I haven't played with Bristol Ridge, so I cannot comment on that.
 

Peter Watts

Member
Jan 11, 2018
Well, it wasn't really my argument; I just commented that CPU performance is not really a problem for this kind of gaming PC and that a 1050 Ti is in another league.

But looking at it: it's a 7-year-old platform, not 7-year-old components. The CPUs are not going to fail; the question is mainly the motherboard and RAM.
I personally have lots of 10+ year old PCs running daily (LGA 775, all cheap boards with 945GC, G31 and so on); apart from PSUs I haven't had any recent failures. My main PC runs on a 2012 board, and in times of expensive DDR4 (one of the main reasons I'm stuck with this), the old PCs with a new graphics card make even more sense.

BUT I can see it's not the solution for most; it's just the superior solution in terms of perf/$.

the 2200G and to a lesser extent the 2400G are great options (but the DDR4 prices are spoiling the party quite a bit)
To be honest, the DDR4 prices are pushing me towards a used-market combo of (old/new) parts as well... Throw in a 1050 Ti and boom, it destroys these in price/performance. (Sadly not in power consumption.)

What combo would you recommend?

And I do think it's relevant to this discussion, because these are aimed at the budget market, after all.
 

ksec

Senior member
Mar 5, 2010
Given the absurd dGPU prices, these APUs for the first time actually seem a reasonable choice, as long as dGPU prices remain high. But that is the only reason: at reasonable dGPU prices, a 2200G paired with a $150 dGPU would cost $70 more than a 2400G but offer double the performance.
Unless we can solve the GPU crypto problem in the near future (which I doubt), doesn't making an APU with higher GPU performance make much more sense? Someone called it something like a ThreadVega.

Or they should go and make GPUs the old way: ones that aren't great at crypto and are only good at gaming.
 

Feimitsu

Junior Member
Feb 8, 2015
No.
The chips will drop their clocks when both of the domains are pushed hard simultaneously (especially the 2400G), however that's due to the TDP limit.
Activate "OC-Mode" and the limiters are gone.
Thanks.
Just to understand: GeAPM was an artificial limit with no correlation to power consumption, and does the APM from Kaveri have the same functionality on Raven Ridge?
 

CatMerc

Golden Member
Jul 16, 2016
Vega uArch could be much better in highly bandwidth-limited scenarios, which would be where APUs stand to gain the most. Halving the memory clock on both and then comparing would have been interesting.
 
The Stilt

Golden Member
Dec 5, 2015
Thanks.
Just to understand: GeAPM was an artificial limit with no correlation to power consumption, and does the APM from Kaveri have the same functionality on Raven Ridge?
GeAPM was one of the many features which made no sense on those designs.
Tons of stuff was misplaced, incomplete and broken.

Unlike the 15h APUs, Raven won't drop the clocks unless it hits one of the physical limits (power, current or thermal).
The 15h APUs throttled down the CPU once iGPU utilization hit 95%, regardless of whether there was any need to do so due to an actual limit.
 

The Stilt

Golden Member
Dec 5, 2015
Vega uArch could be much better in highly bandwidth-limited scenarios, which would be where APUs stand to gain the most. Halving the memory clock on both and then comparing would have been interesting.
Based on my tests with Polaris 11 in various configurations, there isn't much difference in the bandwidth requirement compared to Raven.

I estimate that the bandwidth available at DDR4-3200 is sufficient to keep the < 5.4 CUs of the Vega 11 GPU (and the CPU) fully fed at all times.
The performance scaling when going from 3CUs to 6CUs is already 71.4% at 3200MHz memory frequency (58.9% at 2400MHz), and when going from 6CU to 9CU the scaling is only 38.7% at 3200MHz (28.4% at 2400MHz).

Based on that, ideally we would need around 102GB/s of bandwidth.
Relative to the characteristics of the GPU (CU / RB / Clocks) that's almost exactly the same as on older GCN generations.
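To put rough numbers on the bandwidth side, a quick sketch (the ~102GB/s figure and the scaling percentages come from the post above; the bandwidth formula is just the standard peak-DDR calculation):

```python
# Peak dual-channel DDR4 bandwidth: transfers/s x channels x 8 bytes per
# 64-bit transfer. DDR4-3200 in dual channel gives 51.2 GB/s, roughly half
# of the ~102 GB/s estimated above as the ideal for this GPU configuration.
def ddr4_bandwidth_gbs(mt_per_s: int, channels: int = 2) -> float:
    return mt_per_s * 1e6 * channels * 8 / 1e9

# Scaling efficiency: observed gain divided by the ideal (linear) gain from
# adding CUs. The drop between the 3->6 and 6->9 steps is the bandwidth
# wall showing up.
def scaling_efficiency(observed_gain: float, cu_from: int, cu_to: int) -> float:
    return observed_gain / (cu_to / cu_from - 1)

print(ddr4_bandwidth_gbs(3200))          # 51.2 GB/s peak
print(scaling_efficiency(0.714, 3, 6))   # 3->6 CUs at DDR4-3200: ~71% of linear
print(scaling_efficiency(0.387, 6, 9))   # 6->9 CUs at DDR4-3200: ~77% of linear
```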
 
  • Like
Reactions: moinmoin and Gideon

wahdangun

Golden Member
Feb 3, 2011
Based on my tests with Polaris 11 in various configurations, there isn't much difference in the bandwidth requirement compared to Raven.

I estimate that the bandwidth available at DDR4-3200 is sufficient to keep the < 5.4 CUs of the Vega 11 GPU (and the CPU) fully fed at all times.
The performance scaling when going from 3CUs to 6CUs is already 71.4% at 3200MHz memory frequency (58.9% at 2400MHz), and when going from 6CU to 9CU the scaling is only 38.7% at 3200MHz (28.4% at 2400MHz).

Based on that ideally we would need around 102GB/s of bandwidth.
Relative to the characteristics of the GPU (CU / RB / Clocks) that's almost exactly the same as on older GCN generations.

So what is the limit of the MC? Can it do DDR4-4000?
 

The Stilt

Golden Member
Dec 5, 2015
So what is the limit of the MC? Can it do DDR4-4000?
Based on my experience the practical limit is almost exactly the same as on Zeppelin.
The highest fully stable frequency I've achieved on my samples is 3400MHz (on high quality B-die sticks and at tight timings).
Unlike Zeppelin, the memory controller can train the memory even at 3666MHz, but anything higher than 3400MHz still isn't error free.
On my R7 1700 I can boot to Windows at 3600MHz, but 3533MHz is the last stable frequency.
 

CatMerc

Golden Member
Jul 16, 2016
Based on my tests with Polaris 11 in various configurations, there isn't much difference in the bandwidth requirement compared to Raven.

I estimate that the bandwidth available at DDR4-3200 is sufficient to keep the < 5.4 CUs of the Vega 11 GPU (and the CPU) fully fed at all times.
The performance scaling when going from 3CUs to 6CUs is already 71.4% at 3200MHz memory frequency (58.9% at 2400MHz), and when going from 6CU to 9CU the scaling is only 38.7% at 3200MHz (28.4% at 2400MHz).

Based on that ideally we would need around 102GB/s of bandwidth.
Relative to the characteristics of the GPU (CU / RB / Clocks) that's almost exactly the same as on older GCN generations.
Do you have Wolfenstein 2? There's something I'd be happy if you could test when you can!

https://forum.beyond3d.com/posts/2021626/

If the theory is correct, then the 'GPU culling off' setting should abnormally affect Vega 11 compared to any other GPU of its class.
 

LTC8K6

Lifer
Mar 10, 2004
While the forum creates imaginary "optimized" builds, the people are buying, and it looks like wallets love both the 2200G and the 2400G. Who would have known?!

I must commend @Shivansps for being 100% spot on about these chips: they are selling like hotcakes.
The 2400G disappeared from stock almost immediately at Amazon and Newegg. It must have sold a lot to still be a best seller.
 

psolord

Golden Member
Sep 16, 2009
Hello everybody. I have an honest question, so please don't bite my head off.

Since these are the first truly gaming-worthy APUs, even at entry level, what does this mean for RAM wear?

I mean, is system RAM suitable for doing graphics rendering? Does this reduce the life span of the memory chips?

Thanks.
 
  • Like
Reactions: Peter Watts

Peter Watts

Member
Jan 11, 2018
I have seen how 8GB of dual-channel 3200 RAM performs (in games) with the 2400G, and it completely sucks donkey balls!

These are not worth it for gaming at the current prices of RAM!!! Fast 16GB is needed to really make these shine in gaming, and even then the gaming isn't that great.

Instead of buying the 2400G for 160 euros/169 dollars, I think I will save up and get either a 2200G or a near-future alternative 100 €/$ CPU paired with a 1050 Ti.

The cheapest 4GB 1050 Ti only costs 100/109 €/$ more when combined with the 60/69 you save from disregarding the 2400G. Not only that, but 16GB of RAM costs roughly 150/159 €/$, and 8GB 109/118 €/$!!! <--- Even cheaper if you go with lower-frequency RAM.

So you save 101/110 €/$ by avoiding 16GB of RAM and the 2400G! That can be invested in, for instance, a 1050 Ti!!!

So saving up 60/69 €/$ for a huge performance boost sounds better to me!

That's worth it!

Regular HTPC/desktop usage is another story, and does not need 16GB of RAM.
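The arithmetic behind that savings claim, sketched for the euro column (the prices are the poster's own estimates; the dollar column works the same way, one unit higher per item):

```python
# Price deltas from the post's euro figures.
cpu_2400g, cpu_budget = 160, 100   # 2400G vs. a 2200G-class CPU
ram_16gb, ram_8gb = 150, 109       # fast 16GB kit vs. 8GB kit

saved = (cpu_2400g - cpu_budget) + (ram_16gb - ram_8gb)
print(saved)  # 101 euros freed up, roughly the extra a 4GB 1050 Ti costs
```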
 

Glo.

Diamond Member
Apr 25, 2015
I have seen how 8GB of dual-channel 3200 RAM performs (in games) with the 2400G, and it completely sucks donkey balls!

These are not worth it for gaming at the current prices of RAM!!! Fast 16GB is needed to really make these shine in gaming, and even then the gaming isn't that great.

Instead of buying the 2400G for 160 euros/169 dollars, I think I will save up and get either a 2200G or a near-future alternative 100 €/$ CPU paired with a 1050 Ti.

The cheapest 4GB 1050 Ti only costs 100/109 €/$ more when combined with the 60/69 you save from disregarding the 2400G. Not only that, but 16GB of RAM costs roughly 150/159 €/$, and 8GB 109/118 €/$!!! <--- Even cheaper if you go with lower-frequency RAM.

So you save 101/110 €/$ by avoiding 16GB of RAM and the 2400G! That can be invested in, for instance, a 1050 Ti!!!

So saving up 60/69 €/$ for a huge performance boost sounds better to me!

That's worth it!

Regular HTPC/desktop usage is another story, and does not need 16GB of RAM.
Find a GTX 1050 Ti for a reasonable price. The cheapest one in my EU country costs 899 PLN - $190. It is a cost that you ADD on top of the cost of the CPU, RAM, mobo and SSD.
 

kallisX

Member
Sep 29, 2016
Why did they choose the Stealth cooler on a $99 CPU, where you annoyingly have to unplug everything in your computer and screw in the backplate? The X4 950 consumes twice as much power and it gets the simplest one.
 
