AMD Ryzen 5 2400G and Ryzen 3 2200G APUs performance unveiled

Page 30 - AnandTech community discussion

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136
The technical reason is that you need DDR4-3200, which is already out of spec, to have more bandwidth than the GT1030.
And to that I might add that the new cores probably need more bandwidth.

The other problem is TDP: Vega 8 on the 2200G is pretty much an integrated RX 550 (which a GT1030 trades blows with at half the bandwidth) with "Vega improvements", and the RX 550 is already a 50W TDP part at 14nm.

Aside from that (TDP limits and bandwidth), everything else looks good on the technical side. And that's including both the 2200G and 2400G.

Actually it's the non-technical reasons that have me worried, and those come courtesy of AMD's slides; I'm not going to repeat the same points. I see no reason for AMD to hide the APU's performance vs a GT1030/RX 550 if the APU is competitive.

As I have explained before, memory bandwidth with DDR4-3200 will not be that much of a problem for the 2200G/2400G.
The performance increase from Bristol Ridge (A12-9800) to Raven Ridge (2200G/2400G) will be in the same area as from the GT730 64-bit GDDR5 to the GT1030 64-bit GDDR5.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Memory bandwidth on the 2400G with DDR4-3200 will not be that much of a problem.
For comparison, the GT1030 has 48.06 GB/s of memory bandwidth, and the 2400G with DDR4-3200 will have 51.2 GB/s.

Nvidia is a lot more efficient with bandwidth than AMD. The AMD RX 550 is the competitor at this level, and it has 112 GB/s of bandwidth.
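The bandwidth figures traded back and forth above are easy to reproduce. A back-of-the-envelope sketch in Python; the APU number assumes dual-channel 64-bit DDR4, and the 6008 MT/s effective GDDR5 rate for the GT1030 is an assumption chosen to match the 48.06 GB/s figure quoted:

```python
# Peak theoretical memory bandwidth: transfers/s * bytes per transfer * channels.
def ddr_bandwidth_gbs(mt_per_s: int, bus_bits: int, channels: int = 1) -> float:
    """Peak bandwidth in GB/s for a DDR-style interface."""
    return mt_per_s * 1e6 * (bus_bits / 8) * channels / 1e9

# 2400G with DDR4-3200: two 64-bit channels.
apu = ddr_bandwidth_gbs(3200, 64, channels=2)   # 51.2 GB/s

# GT1030: 64-bit GDDR5 at an assumed 6008 MT/s effective.
gt1030 = ddr_bandwidth_gbs(6008, 64)            # ~48.06 GB/s

print(f"2400G + DDR4-3200: {apu:.1f} GB/s")
print(f"GT1030 GDDR5:      {gt1030:.2f} GB/s")
```

Of course, as noted above, peak numbers say nothing about how efficiently each GPU actually uses that bandwidth.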
 

tential

Diamond Member
May 13, 2008
7,355
642
121
Especially if the team in question has a long history of doing just that. All else being equal, precedent is a legitimate factor.

It must not be comforting to have to move the goalposts all over the field to make your point.

And you inadvertently made my point in your last statement. If the vast majority of buyers don't care much about how strong an iGPU is, then it stands to reason that they care more about compute performance, and that is what was holding back sales of Bristol, Carrizo, Kaveri and Trinity. Ryzen APUs eliminate that deficit and give buyers discrete-class graphics paired with strong CPU cores, without needing a discrete card or the complexity of multiple drivers/software packages from different vendors. Buyers were attaching discrete GPUs to Intel APUs because they had no choice if they wanted anything but a failed gaming experience.

You talk about moving goalposts, then say buyers will not care about iGPU performance, but about compute performance?

Seriously? We expect the average buyer to reason that way?

Also, AMD APUs do need drivers and have driver complexity; what are you talking about? It's complete misinformation to say otherwise. It's Intel iGPUs that are almost always supported/plug-and-play, since they're the most widely used. If you're talking compatibility, AMD loses there by a long shot...

The dream of people gaming on AMD's extremely weak iGPUs, in hopes they'll upgrade to a discrete AMD card later, is just that: a sad dream that won't ever happen.
Volta will seal the deal, as the comparison to a GT 1030 will not even make sense when a GT 2030 blows it away. It's never been a competitive strategy to target the lowest of low-end GPU performance and expect it to stay relevant.

AMD APU performance won't even be relevant once Volta hits.
 
  • Like
Reactions: frozentundra123456

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
136
Just a reminder, RX550 is not Vega ;)

Vega never really moved the needle on anything over Polaris.

This focus on the market for the gamer who can't afford a dGPU is a waste of time, and this isn't where AMD will move the needle. That buyer was already Team AMD.

The main market for the RR breakthrough is non-gamers, and it is the CPU that will make the difference.

I know that for years I recommended people away from AMD systems when they asked me for advice, and I bet I wasn't the only one doing this. One of my friends just bought his second Intel system last year on my advice (the previous one was 8 years before that).

But I don't have to do that anymore. AMD is finally competitive again. There will now be AMD options to recommend.
 
  • Like
Reactions: whm1974 and tential

tential

Diamond Member
May 13, 2008
7,355
642
121
Especially if the team in question has a long history of doing just that. All else being equal, precedent is a legitimate factor.

It must not be comforting to have to move the goalposts all over the field to make your point.

And you inadvertently made my point in your last statement. If the vast majority of buyers don't care much about how strong an iGPU is, then it stands to reason that they care more about compute performance, and that is what was holding back sales of Bristol, Carrizo, Kaveri and Trinity. Ryzen APUs eliminate that deficit and give buyers discrete-class graphics paired with strong CPU cores, without needing a discrete card or the complexity of multiple drivers/software packages from different vendors. Buyers were attaching discrete GPUs to Intel APUs because they had no choice if they wanted anything but a failed gaming experience.

You talk about moving goalposts, then say buyers will not care about iGPU performance, but about compute performance?

Seriously? We expect the average buyer to reason that way?

Also, AMD APUs do need drivers and have driver complexity; what are you talking about? It's complete misinformation to say otherwise. It's Intel iGPUs that are almost always supported/plug-and-play, since they're the most widely used. If you're talking compatibility, AMD loses there by a long shot...

The dream of people gaming on AMD's extremely weak iGPUs, in hopes they'll upgrade to a discrete AMD card later, is just that: a sad dream that won't ever happen.
Volta will seal the deal, as the comparison to a GT 1030 will not even make sense when a GT 2030 blows it away. It's never been a competitive strategy to target the lowest of low-end GPU performance and expect it to stay relevant.

AMD APU performance won't even be relevant once Volta hits.

Also, you had mentioned Meltdown blowback. How deep are we digging here to find things you don't like about the other side? I mean, do you seriously expect people spending at the lowest end of the spectrum to care about a cloud-computing bug that doesn't affect them in any way?
I don't see how these extremely high expectations of AMD help anyone, other than to leave many users disgruntled when they don't come to fruition...
Vega never really moved the needle on anything over Polaris.

This focus on the market for the gamer who can't afford a dGPU is a waste of time, and this isn't where AMD will move the needle. That buyer was already Team AMD.

The main market for the RR breakthrough is non-gamers, and it is the CPU that will make the difference.

I know that for years I recommended people away from AMD systems when they asked me for advice, and I bet I wasn't the only one doing this. One of my friends just bought his second Intel system last year on my advice (the previous one was 8 years before that).

But I don't have to do that anymore. AMD is finally competitive again. There will now be AMD options to recommend.
If Vega was so good, we would have seen midrange Vega on desktop. When I brought up just how weak Vega's performance was, to the point that Polaris would be similar anyway, I was yelled at as a troll; yet when you look at Vega 56, it's not far enough ahead of Polaris 10 to leave a large gap for midrange Vega. Vega just isn't powerful, period, and it's going up against Volta this year....
It's a far larger uphill climb than anyone in this thread who is hoping for the best for AMD realizes.
 

french toast

Senior member
Feb 22, 2017
988
825
136
You talk about moving goalposts, then say buyers will not care about iGPU performance, but about compute performance?

Seriously? We expect the average buyer to reason that way?

Also, AMD APUs do need drivers and have driver complexity; what are you talking about? It's complete misinformation to say otherwise. It's Intel iGPUs that are almost always supported/plug-and-play, since they're the most widely used. If you're talking compatibility, AMD loses there by a long shot...

The dream of people gaming on AMD's extremely weak iGPUs, in hopes they'll upgrade to a discrete AMD card later, is just that: a sad dream that won't ever happen.
Volta will seal the deal, as the comparison to a GT 1030 will not even make sense when a GT 2030 blows it away. It's never been a competitive strategy to target the lowest of low-end GPU performance and expect it to stay relevant.

AMD APU performance won't even be relevant once Volta hits.
Oh right, nobody will do that, will they? Because you looked into your crystal ball and it says so?
As long as that Volta 2030 GPU is not bolted onto a low-end Intel CPU, it does not matter, does it? Different products and different markets, you see; only AMD has a product that combines a high-end CPU with a decent GPU on a single affordable die.
Again, what stops someone buying a 2200G and putting a 2030 on it? Nothing. How is Raven Ridge a flawed product because of this revelation?
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
Memory bandwidth on the 2400G with DDR4-3200 will not be that much of a problem.
For comparison, the GT1030 has 48.06 GB/s of memory bandwidth, and the 2400G with DDR4-3200 will have 51.2 GB/s.

There's one more factor: the GT1030 system has 65W for the CPU and 30W for the GPU.

Again, what stops someone buying a 2200G and putting a 2030 on it? Nothing. How is Raven Ridge a flawed product because of this revelation?

It's not, but the point tential is making is that solely relying on a more-powerful-than-average iGPU is flawed logic. I think it will have its own niche uses, but beyond that I am very doubtful.
 

french toast

Senior member
Feb 22, 2017
988
825
136
There's one more factor: the GT1030 system has 65W for the CPU and 30W for the GPU.



It's not, but the point tential is making is that solely relying on a more-powerful-than-average iGPU is flawed logic. I think it will have its own niche uses, but beyond that I am very doubtful.
It's not flawed logic at all. Memory prices are hideously expensive top to bottom, and 3200MHz DDR4 is not that much more expensive than 2667MHz (2x4GB). A 2400G with 3200MHz RAM will likely give ballpark GT1030 performance, giving the AMD system the option of budget gaming from the get-go with little hassle, plus the option to add a dGPU in future or upgrade to a 7nm CPU without building a new system.
Or buy a dirt-cheap 2200G and do similar at lower settings; both are overclockable with a decent cooler out of the box, on the dirt-cheap motherboards already on the market.
Sure, you can pick a CPU+GPU combo that would best it in some scenario, but you would be compromising something or paying more.
Picking a budget-gaming niche scenario to denigrate Raven Ridge is silly; it is a well-balanced all-in-one product that can play AAA games on low/medium settings and offer solid productivity.
Why are the same people not shouting down the i3 8100 and 8400 for being worse in this scenario? It is stupid.
Gaming a little better or worse than an Intel CPU + Nvidia GPU is not going to make or break Raven Ridge; it will be close enough not to matter to the average Joe, most of whom just like the convenience of dropping in one processor with less fuss and no obvious downsides.
The GT 2030 comes when, next week? More like 6 months.

Besides, most people in this market will be buying OEM systems, and OEMs will get a good deal buying cheap-as-chips Raven Ridge in bulk to knock out budget PCs that can play games and be upgraded later... it will sell like hot cakes.
OEMs can have one motherboard with 2-4 configurations: 2200G/2400G plus a choice of 2 dGPUs. No hassle.
 
Last edited:

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
You talk about moving goalposts, then say buyers will not care about iGPU performance, but about compute performance?

Seriously? We expect the average buyer to reason that way?

Also, AMD APUs do need drivers and have driver complexity; what are you talking about? It's complete misinformation to say otherwise. It's Intel iGPUs that are almost always supported/plug-and-play, since they're the most widely used. If you're talking compatibility, AMD loses there by a long shot...

The dream of people gaming on AMD's extremely weak iGPUs, in hopes they'll upgrade to a discrete AMD card later, is just that: a sad dream that won't ever happen.
Volta will seal the deal, as the comparison to a GT 1030 will not even make sense when a GT 2030 blows it away. It's never been a competitive strategy to target the lowest of low-end GPU performance and expect it to stay relevant.

AMD APU performance won't even be relevant once Volta hits.
There won't be a GT 2030 this year.
GT 630 - 2012
GT 730 - 2014
GT 1030 - 2017
So based on this, the earliest launch would be next year.
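Purely to make the extrapolation in that list explicit (a minimal sketch; launch cadence is of course no guarantee of anything):

```python
# Launch years of the GT x30 line, as listed above.
launches = {"GT 630": 2012, "GT 730": 2014, "GT 1030": 2017}

years = sorted(launches.values())
gaps = [b - a for a, b in zip(years, years[1:])]  # year gaps: [2, 3]

# Even the shortest historical gap pushes a "GT 2030" past 2018.
earliest_next = years[-1] + min(gaps)
print(earliest_next)  # 2019
```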
 

AtenRa

Lifer
Feb 2, 2009
14,000
3,357
136
It's Intel iGPUs that are almost always supported/plug-and-play, since they're the most widely used. If you're talking compatibility, AMD loses there by a long shot...

Really, I have no idea what you are talking about here. There is no compatibility problem with any AMD APU that I'm aware of, from Llano to BR.

The dream of people gaming on AMD's extremely weak iGPUs, in hopes they'll upgrade to a discrete AMD card later, is just that: a sad dream that won't ever happen.

This seems like trolling; there are hundreds of thousands of users with AMD APUs gaming on the iGPU, and others who upgraded to a dGPU down the road.

Volta will seal the deal, as the comparison to a GT 1030 will not even make sense when a GT 2030 blows it away. It's never been a competitive strategy to target the lowest of low-end GPU performance and expect it to stay relevant.

When is the GT 2030 launching? Because the Ryzen 2200G/2400G are launching in 2 weeks.

If Vega was so good, we would have seen midrange Vega on desktop.

There is no need for AMD to design and produce low/midrange Vega since Polaris is still competitive against NV's low/midrange offerings.
 
Last edited:

jpiniero

Lifer
Oct 1, 2010
14,510
5,159
136
If RR does get close to 1030 performance, maybe it'd inspire Nvidia to do something, but I think it would more likely be something based on GP107 than Ampere.
 

rainy

Senior member
Jul 17, 2013
505
424
136
It's not, but the point tential is making is that solely relying on a more-powerful-than-average iGPU is flawed logic.

Are you trying to say that Intel IGPs are average?
I know many people using them for light gaming, and suddenly much better performance on the AMD side is a problem, which is hilarious, especially now that CPU performance has been fixed by the Zen cores.

Btw, judging by the level of bashing, you would think that RR is the worst AMD APU since the Llano release in June 2011, when in fact it is exactly the opposite.
 
Last edited:

CatMerc

Golden Member
Jul 16, 2016
1,114
1,149
136
When is the GT 2030 launching? Because the Ryzen 2200G/2400G are launching in 2 weeks.
Considering the classic NVIDIA cadence that they have followed like clockwork? Volta is somewhere in the next couple of months, probably in Q2.

There is no need for AMD to design and produce low/midrange Vega since Polaris is still competitive against NV's low/midrange offerings.
It won't be in a few months.
 

TheELF

Diamond Member
Dec 22, 2012
3,967
720
126
Really, I have no idea what you are talking about here. There is no compatibility problem with any AMD APU that I'm aware of, from Llano to BR.
Linux/Hackintosh, browser acceleration, x264/x265 both playback and encoding.
Intel iGPU is almost always supported, while for AMD you have to search far and long to find anything.
The only thing where AMD APUs are really well supported is gaming.
 

neblogai

Member
Oct 29, 2017
144
49
101
If RR does get close to 1030 performance, maybe it'd inspire Nvidia to do something, but I think it would more likely be something based on GP107 than Ampere.

We talk about Raven Ridge and GP108/GT1030/MX150 as if they were meant to compete in desktop gaming. They are not: GP108 is made for laptops and is successful there, and RR is AMD's way of entering that same lowest-end gaming market with an iGPU that is almost free to produce/implement. On desktop we only get reused products of that battle; AMD will also reuse RR in the business segment as a competitive CPU that finally has a free GPU.
 

Abwx

Lifer
Apr 2, 2011
10,854
3,298
136
Linux/Hackintosh, browser acceleration, x264/x265 both playback and encoding.

Lol, if that's not trolling then I wonder what it is..

The video engine is now extremely capable, supporting hardware-accelerated decoding of CODECs such as VP9 10-bpc and HEVC 10-bpc at frame-rates of up to 240 for 1080p, and 60 for 4K UHD. It can also encode H.265 8-bpc at frame-rates of up to 120 at 1080p, and 30 at 4K UHD. You finally get to use the display connectors on your socket AM4 motherboards, as the iGPU supports DisplayPort 1.4 and HDMI 2.0b, with resolutions of up to 3840 x 2160 @ 60 Hz with HDR, 1440p @ 144 Hz, and 1080p @ 240 Hz.

What is missing from the list?


If you are ever interested in actual info rather than made-up hollow statements:

https://www.techpowerup.com/238175/amd-raven-ridge-silicon-detailed
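For quick reference, the limits quoted above can be collected into one small table. This only transcribes the TechPowerUp excerpt; the key names and layout are my own, not an official API:

```python
# Raven Ridge video-engine capabilities, transcribed from the quote above.
# Field names are illustrative; fps values are the stated maximums.
raven_ridge_video = {
    "decode": {
        "VP9 10-bpc":  {"1080p_max_fps": 240, "2160p_max_fps": 60},
        "HEVC 10-bpc": {"1080p_max_fps": 240, "2160p_max_fps": 60},
    },
    "encode": {
        "H.265 8-bpc": {"1080p_max_fps": 120, "2160p_max_fps": 30},
    },
    "display": ["DisplayPort 1.4", "HDMI 2.0b",
                "3840x2160@60Hz HDR", "1440p@144Hz", "1080p@240Hz"],
}

# Note: the excerpt lists H.265 encode explicitly; it does not say
# anything either way about H.264 (x264) encode.
print(sorted(raven_ridge_video["encode"]))
```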
 

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
There won't be a GT 2030 this year.
GT 630 - 2012
GT 730 - 2014
GT 1030 - 2017
So based on this, the earliest launch would be next year.
Considering the classic NVIDIA cadence that they have followed like clockwork? Volta is somewhere in the next couple of months, probably in Q2.


It won't be in a few months.
The GT 2030 is not going to launch this year. See my post above on this page.
 

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
You talk about moving goalposts, then say buyers will not care about iGPU performance, but about compute performance?

Seriously? We expect the average buyer to reason that way?

Also, AMD APUs do need drivers and have driver complexity; what are you talking about? It's complete misinformation to say otherwise. It's Intel iGPUs that are almost always supported/plug-and-play, since they're the most widely used. If you're talking compatibility, AMD loses there by a long shot...

The dream of people gaming on AMD's extremely weak iGPUs, in hopes they'll upgrade to a discrete AMD card later, is just that: a sad dream that won't ever happen.
Volta will seal the deal, as the comparison to a GT 1030 will not even make sense when a GT 2030 blows it away. It's never been a competitive strategy to target the lowest of low-end GPU performance and expect it to stay relevant.

AMD APU performance won't even be relevant once Volta hits.

Also, you had mentioned Meltdown blowback. How deep are we digging here to find things you don't like about the other side? I mean, do you seriously expect people spending at the lowest end of the spectrum to care about a cloud-computing bug that doesn't affect them in any way?
I don't see how these extremely high expectations of AMD help anyone, other than to leave many users disgruntled when they don't come to fruition...

This is almost the exact opposite of what I said; unless you meant to quote the post I responded to (I suppose it's a bit confusing, since the poster said the exact opposite in 2 successive posts)?

So let me clarify. Buyers who bought a low-end entry-level GPU like a 1030 and paired it with a low-end Intel CPU clearly do want better graphics performance than the weak GPU in Intel's APUs can offer. The next step up is to attach an entry-level card, and they were attaching those to Intel's APUs because those had the best-performing CPU.

When Ryzen APUs are available to purchase in the near future, those same buyers will no longer be forced to buy 2 separate pieces of hardware to accomplish the same goals, with a higher combined TDP and most likely a higher cost. They also won't have to download and install 2 separate drivers and 2 separate software packages, compared to 1 unified driver and software package with Radeon Settings.

So to clarify your misrepresentation of what I said about people gaming on AMD's Ryzen APUs and attaching an AMD card later: that isn't remotely close to what I said. And it's especially telling that you had to invoke some dream of a future 2030 card, which has no relevance to the timeline involving Ryzen APUs. Out of one side of your mouth you are basically saying AMD's APUs are going to be relevant, since out of the other side you are saying they won't be relevant once Volta is released (do we have any roadmap at all that provides a semblance of proof for your claim?), thus implying they will be until then. You've got quite the game of Twister going on there! :cool:

As for Intel's Meltdown problem, it has everything to do with the stigma and perception of the issue. To what degree it affects end users is really not the important question. As we've heard over and over nearly everywhere on forums, mindshare and perception are the important things. People often use "Intel Inside", or the jingle associated with it, or marketing in general as examples of this. I've already had someone ask me for advice on a new laptop, asking "will it be safe if I get an Intel?". Of course it would be remiss if I never told them the truth, so I simply said (paraphrasing) that there would be little impact once the software fixes being worked on to correct the problem land, BUT the most secure processors on the market are the Ryzen chips from AMD.

There are no high expectations, just stating the obvious: with a Ryzen APU, buyers will be able to have an experience similar to a discrete-class card, in this case a 1030, in a single package without the relative complexity.
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
As I have explained before, memory bandwidth with DDR4-3200 will not be that much of a problem for the 2200G/2400G.
The performance increase from Bristol Ridge (A12-9800) to Raven Ridge (2200G/2400G) will be in the same area as from the GT730 64-bit GDDR5 to the GT1030 64-bit GDDR5.

We'll see. In my experience an A10-7850K needed DDR3-2133 to be faster (and not in every game) than an R7 250 DDR3 (equipped with the equivalent of DDR3-1600).
They need to gain over 80%, almost 100%, over an A12-9800 to get anywhere near a GT1030; I'll say that should have been slide-worthy.
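That "over 80%, almost 100%" framing is just a relative-performance ratio. A minimal sketch; the 1.8x and 2.0x scores below are hypothetical placeholders standing in for the claim, not benchmark data:

```python
# Fractional uplift a baseline GPU needs to match a target, from
# relative performance scores (any consistent benchmark units).
def required_gain(baseline_score: float, target_score: float) -> float:
    """Return the fractional gain needed for baseline to reach target."""
    return target_score / baseline_score - 1.0

# Hypothetical normalized scores: GT1030 at ~1.8-2.0x an A12-9800.
print(f"{required_gain(1.0, 1.8):.0%}")  # 80%
print(f"{required_gain(1.0, 2.0):.0%}")  # 100%
```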

There won't be a GT 2030 this year.
GT 630 - 2012
GT 730 - 2014
GT 1030 - 2017
So based on this, the earliest launch would be next year.

If I had to bet, I'll say they'll probably rebrand the GT1030 to GT2010 and the GTX1050 to GT2030.
 

tential

Diamond Member
May 13, 2008
7,355
642
121
There's one more factor: the GT1030 system has 65W for the CPU and 30W for the GPU.
It's not, but the point tential is making is that solely relying on a more-powerful-than-average iGPU is flawed logic. I think it will have its own niche uses, but beyond that I am very doubtful.
It's not just that.
It's that we run through the same tired arguments from those who are very bullish on AMD products, time and time again.
It's always been that AMD will leverage the graphics IP to make people want the CPU....

What actually made people buy AMD CPUs? AMD improving their CPUs!!!
Instead, you see a large contingent of people still focused heavily on the "APU".

It may just be a country/currency gap, though. I notice that a lot of those who are very interested in the APUs are also those on a heavy budget, not from the US; at least anecdotally. If your budget forces you into an APU-only solution and you live in the USA, you have bigger problems in your life to address. My personal opinion.
 
  • Like
Reactions: frozentundra123456