AMD Ryzen 5 2400G and Ryzen 3 2200G APUs performance unveiled


DAPUNISHER

Super Moderator CPU Forum Mod and Elite Member
Super Moderator
Aug 22, 2001
28,286
19,943
146
The 1050 Ti is massively faster than the RX 550, which is a little faster than the 2400G with fast RAM, so no contest really. An old i5 is enough for most games anyway.
I am astonished there are people suggesting a 7-year-old platform for gaming to people on a tight budget right now. Gaming is a hard workload; how long until one of those old components fails? What warranty recourse can you offer those who can ill afford to replace the components any time soon? It makes no sense to me to even bring such a dubious solution into a thread like this. I will give you all the benefit of the doubt that it is sincere but (to me anyway) misguided advice, and not something agenda-driven.
 

IRobot23

Senior member
Jul 3, 2017
601
183
76
It's really funny how AMD actually delivers above expectations, yet people...

That confirms how good the IMC on Ryzen is; the only weakness is the DRAM latency added by the Infinity Fabric (northbridge).

The game changer here is this:
AMD created only two dies, an APU and a CPU!
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
What's with the performance regression with a dGPU (i.e., the 2400G @ 3.6GHz is 10-15% slower than the R5 1500X @ 3.5GHz, both with the same dGPU)? Steve commented on it in the Hardware Unboxed video, and TechPowerUp's charts show the same thing in several games:

BF1 - R5 2400G + GTX 1080 = 126.7fps
BF1 - R5 1500X + GTX 1080 = 138.4fps

DOOM - R5 2400G + GTX 1080 = 135.7fps
DOOM - R5 1500X + GTX 1080 = 146.3fps

WatchDogs2 - R5 2400G + GTX 1080 = 73.3fps
WatchDogs2 - R5 1500X + GTX 1080 = 81.8fps

Witcher 3 - R5 2400G + GTX 1080 = 91.4fps
Witcher 3 - R5 1500X + GTX 1080 = 100.4fps

https://www.techpowerup.com/reviews/AMD/Ryzen_5_2400G_Vega_11/14.html

Is that the effect of 8x vs 16x PCIe lanes? I thought that was only supposed to affect high-end cards, but there's a 10-15% drop in fps (both avg & min) even on the RX 550.
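For reference, the four TechPowerUp averages quoted above work out to deficits of roughly 7-10% (a quick sketch using only those four numbers; the min-fps gaps may be larger):

```python
# Per-game average-fps deficit of the 2400G vs the 1500X,
# using only the TechPowerUp figures quoted above.
results = {
    "BF1":        (126.7, 138.4),
    "DOOM":       (135.7, 146.3),
    "WatchDogs2": (73.3, 81.8),
    "Witcher 3":  (91.4, 100.4),
}

for game, (fps_2400g, fps_1500x) in results.items():
    deficit = (fps_1500x - fps_2400g) / fps_1500x * 100
    print(f"{game}: {deficit:.1f}% slower")
```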
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
No!
You do not need that much.

Why would you need that much memory for medium & low settings? Especially with Vega.

Wasn't that a cool feature? I mean...
You need dual channel with the APU, so you need two sticks of RAM, 2x4GB or 2x8GB.
The thinking is that if you give 2GB to the GPU, 8GB leaves you just 6GB of RAM for the system.
Single-channel RAM kills the GPU in the APU.

With a dGPU you don't need the dual-channel RAM.
 

IRobot23

Senior member
Jul 3, 2017
601
183
76
You need dual channel with the APU, so you need two sticks of RAM, 2x4GB or 2x8GB.
The thinking is that if you give 2GB to the GPU, 8GB leaves you just 6GB of RAM for the system.
Single-channel RAM kills the GPU in the APU.

With a dGPU you don't need the dual-channel RAM.

Of course you do. With lower latency and higher bandwidth you gain both single- and multi-threaded performance.

The CPU needs bandwidth too: more bandwidth means lower frame times and a smoother experience.
Why don't you guys compare AIDA64 copy bandwidth, and GPU performance (iGPU vs RX 550/GT 1030)?
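To put rough numbers on that, a back-of-the-envelope sketch of peak theoretical bandwidth (DDR4 moves 8 bytes per channel per transfer; the RX 550 and GT 1030 figures assume their common GDDR5 configurations, which is my assumption, not something from the thread):

```python
# Peak theoretical memory bandwidth, in GB/s.
def ddr4_bw(mts, channels):
    # DDR4: 8 bytes per channel per transfer.
    return mts * 8 * channels / 1000

def gddr5_bw(gbps_per_pin, bus_bits):
    # GDDR5: per-pin data rate times bus width.
    return gbps_per_pin * bus_bits / 8

print(f"DDR4-2400 single channel: {ddr4_bw(2400, 1):.1f} GB/s")  # 19.2
print(f"DDR4-3200 dual channel:   {ddr4_bw(3200, 2):.1f} GB/s")  # 51.2
print(f"RX 550 (128-bit @ 7 Gbps): {gddr5_bw(7, 128):.0f} GB/s")  # 112
print(f"GT 1030 (64-bit @ 6 Gbps): {gddr5_bw(6, 64):.0f} GB/s")   # 48
```

Note that the iGPU shares that dual-channel bandwidth with the CPU, which is why single channel hurts the APU so badly while the dGPUs have their memory to themselves.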
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
That shouldn't be that big of a problem. TPU didn't test correctly.
Why isn't the test "correct"? Hardware Unboxed got exactly the same result, so it's not just one site showing it "wrong". Most 2D benchmarks don't show that much of a difference given the cache-size disparity (e.g., x264 is almost the same, LAME only a 4% difference, etc.), so the only other gaming-specific difference is the obvious 8x vs 16x lane split for dGPUs.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Of course you do. With lower latency and higher bandwidth you gain both single- and multi-threaded performance.

The CPU needs bandwidth too: more bandwidth means lower frame times and a smoother experience.
Why don't you guys compare AIDA64 copy bandwidth, and GPU performance (iGPU vs RX 550/GT 1030)?
Not really interested in synthetics anymore, actually.
Single-channel RAM is far less of a real-world performance problem for a CPU/dGPU combo than it is for the APU's graphics.
 

IRobot23

Senior member
Jul 3, 2017
601
183
76
Why isn't the test "correct"? Hardware Unboxed got exactly the same result, so it's not just one site showing it "wrong". Most 2D benchmarks don't show that much of a difference given the cache-size disparity (e.g., x264 is almost the same, LAME only a 4% difference, etc.), so the only other gaming-specific difference is the obvious 8x vs 16x lane split for dGPUs.

Not in games. You still need to keep in mind that it still has 6MB of L2/L3.

Compare the older Athlons, which had only L2 (4MB); with Carrizo, which had only 2MB of L2/L3, there was a bigger difference:
http://www.guru3d.com/articles_pages/amd_athlon_x4_845_fm2_review,16.html
 

IRobot23

Senior member
Jul 3, 2017
601
183
76
Not really interested in synthetics anymore, actually.
Single-channel RAM is far less of a real-world performance problem for a CPU/dGPU combo than it is for the APU's graphics.

How can I convince you?

Let's say a friend buys an i5-8400 with a single 8GB stick (2400MHz) and you buy an i3-8100 with dual-channel DDR4-2666+ (2x4GB).

1. The 8400's ST performance will be held back by the slower memory.
2. If the game is well coded and spawns many threads, those threads will love bandwidth.

If you knew your i5-8400 could be more than 50% faster in some games with faster dual-channel RAM, maybe you would consider buying a dual-channel 2666MHz kit for $20 more.

This guy tested 3000MT/s 2x4GB vs 1x8GB:
https://www.youtube.com/watch?v=qBmElSVy4U8

Some of the tests are still useless, though, since with 2x4GB you get 98% GPU usage.
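The bandwidth gap in that hypothetical is easy to quantify (a rough sketch of theoretical peaks; real-world AIDA64 copy rates will come in lower):

```python
# Peak theoretical DDR4 bandwidth: transfer rate (MT/s) x 8 bytes x channels.
single_2400 = 2400 * 8 * 1 / 1000   # GB/s, the 1x8GB stick
dual_2666   = 2666 * 8 * 2 / 1000   # GB/s, the 2x4GB kit

print(f"1x DDR4-2400: {single_2400:.1f} GB/s")   # 19.2
print(f"2x DDR4-2666: {dual_2666:.1f} GB/s")     # 42.7
print(f"Ratio: {dual_2666 / single_2400:.2f}x")  # ~2.22x
```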
 

cbn

Lifer
Mar 27, 2009
12,968
221
106
I think it is rather interesting that R3 2200G + GT1030 (at MSRP) is the same price as the R5 2400G.

What happens when the Athlon x4 version of Raven Ridge gets released?

You're basically reading my mind. Well done.

Well yeah, when the Athlon X4 comes, the R5 2400G will be an even harder sell.

So what does AMD do at that point? (for the Zen APU dies with 11 Vega CUs)

Make more 15W Ryzen 7 2700U APUs for mobile (instead of using the dies for the R5 2400G)? Or does going 35W (mobile) make even more sense for dies with 11 Vega CUs? (I think 35W with the Vega 11 iGPU would be so much more interesting and useful... and this is what I hope they do.)

P.S. If going 35W, I hope AMD labels the on-paper specs in a way that matches the reality of the chip actually being much better. Example: with Bristol Ridge, I felt the 35W spec on paper made the chip look like too weak an improvement over the 15W version, because it only listed CPU clock speed, not the combined CPU and GPU clocks when both were loaded at the same time.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
Note that I am looking at this strictly from a gaming POV...

The 2200G is the star of the show here at $99; when overclocked it comes within 10% of the 2400G. Of course, you can overclock the 2400G too, but it seems to gain a lot less than the 2200G does, possibly due to memory bandwidth limitations, even with DDR4-3200.

The only downside to these chips is the current price of DDR4 memory, especially the higher-speed modules. But then again, Ryzen has always needed fast DDR4 to extract maximum gaming performance, so this isn't anything new.
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
How can I convince you?

Let's say a friend buys an i5-8400 with a single 8GB stick (2400MHz) and you buy an i3-8100 with dual-channel DDR4-2666+ (2x4GB).

1. The 8400's ST performance will be held back by the slower memory.
2. If the game is well coded and spawns many threads, those threads will love bandwidth.

If you knew your i5-8400 could be more than 50% faster in some games with faster dual-channel RAM, maybe you would consider buying a dual-channel 2666MHz kit for $20 more.

This guy tested 3000MT/s 2x4GB vs 1x8GB:
https://www.youtube.com/watch?v=qBmElSVy4U8

Some of the tests are still useless, though, since with 2x4GB you get 98% GPU usage.
Why would you want to convince me? And my "system" doesn't have an Intel chip in it.
You didn't address the graphics performance of the APUs with single-channel RAM, anyway.
:D
 

LTC8K6

Lifer
Mar 10, 2004
28,520
1,575
126
Note that I am looking at this strictly from a gaming POV...

The 2200G is the star of the show here at $99; when overclocked it comes within 10% of the 2400G. Of course, you can overclock the 2400G too, but it seems to gain a lot less than the 2200G does, possibly due to memory bandwidth limitations, even with DDR4-3200.

The only downside to these chips is the current price of DDR4 memory, especially the higher-speed modules. But then again, Ryzen has always needed fast DDR4 to extract maximum gaming performance, so this isn't anything new.
I agree with that choice. Of the two, the 2200G makes more sense.
 

tamz_msc

Diamond Member
Jan 5, 2017
3,727
3,560
136
The cost.
Indium itself is rather expensive, but that's not the whole story.
Using sTIM instead of conventional TIM requires a significant number of additional manufacturing steps.
Both the die itself and the heatspreader must be plated (various metal layers for the silicon, and gold for the heatspreader).
Thanks for the reply. It seems Robert Hallock himself is confirming this:
AMD_Robert said:
Before this turns into panic: the decisions you make for mainstream products are not always the same decisions for enthusiast products. :) Have faith.
 

epsilon84

Golden Member
Aug 29, 2010
1,142
927
136
I am astonished there are people suggesting a 7-year-old platform for gaming to people on a tight budget right now. Gaming is a hard workload; how long until one of those old components fails? What warranty recourse can you offer those who can ill afford to replace the components any time soon? It makes no sense to me to even bring such a dubious solution into a thread like this. I will give you all the benefit of the doubt that it is sincere but (to me anyway) misguided advice, and not something agenda-driven.

Nothing wrong with the used market if price/performance is what you are after, especially with current DDR4 prices.

The cost of entry for a 2200G system, including 8GB DDR4-3200 and entry level B350 motherboard, is approximately $270. Add a decent 500W PSU + case and 120GB SSD and you're looking at an approximate $400 total system cost.

For about the same money you can generally get an i7-2600 or 3770-based Dell/HP SFF PC (the market seems to be flooded with these, at least in Australia) plus a current-gen GTX 1050 or RX 560, for slightly better CPU performance and far superior graphics performance.

Of course, new vs used is an apples-to-oranges comparison, but I personally am not against buying used hardware, as I've never had a CPU or motherboard die on me, though I am not dismissing the possibility of that happening.
 

raghu78

Diamond Member
Aug 23, 2012
4,093
1,475
136
So, much like I was expecting, 2400G + 3200 Memory = ~GT1030.

2200G OC + 3200/3400 Memory = ~GT1030

For $99 the 2200G is in a league of its own.

I will definitely build a USFF with the 2400G in June.

AnandTech summarizes it aptly:

"With the Ryzen 5 2400G, AMD has completely shut down the sub-$100 graphics card market. As a choice for gamers on a budget, those building systems in the region of $500, it becomes the processor to pick."

The people who argued for days on end that the R5 2400G could not match the GT1030 have been proven wrong. Moreover, I think the 2200G is hurt quite a bit in MT workloads due to its lack of SMT. For stock performance and a good combination of CPU and GPU horsepower, I have to give the nod to the 2400G as a no-compromises all-in-one SoC for a $500 PC.
 

Glo.

Diamond Member
Apr 25, 2015
5,662
4,421
136
Thanks for the reply. It seems Robert Hallock himself is confirming this:
In essence: think about what soldering would do to the exposed interposer and HBM2 stack in an APU, and how much better it is to use TIM instead of solder in that scenario. It's not done for today, but for the future.
 

richaron

Golden Member
Mar 27, 2012
1,357
329
136
How can I convince you?...
What makes you think you can change people's opinions? There are members here with what appears to be an obvious bias against AMD who post constantly.

Even if this parent site conducts a review showing the 2400G is in the same league as both more expensive Intel CPUs and dedicated entry-level NVIDIA GPUs (which combined cost more), the members who appear to have a bias against AMD will most likely keep posting arguments that paint AMD products in the worst possible light.

Normal people will read reviews from (hopefully) independent experts and draw their own conclusions. But forum discussions (and searches leading to forum discussions) will always have some impact, so I hope the members who appear to have an agenda against AMD will always have others posting a more balanced counter-argument.