AMD Ryzen 5 2400G and Ryzen 3 2200G APUs performance unveiled

Page 32

whm1974

Diamond Member
Jul 24, 2016
9,460
1,566
96
That isn't bad for an APU at all. Both RR APUs will be flying off the shelves if that turns out to be the case.
 

Shivansps

Diamond Member
Sep 11, 2013
3,641
1,329
136
I'm not sure why AMD APUs have not been that popular. They have been superior to Intel's IGP for a long time, and with Kaveri in January of 2014 they were well out in front.
I built a Kaveri system for a friend that was a great performer for the money.

My guess is that overall, if people want good graphics, they prefer to buy a DGPU that can be upgraded.

If they don't need good graphics, Intel's IGP has been good enough, while also having a better CPU.

Ryzen APUs would theoretically bring along what was missing, a good CPU.

I still suspect that overall, people will not be enthusiastic about integrated graphics for decent levels of gaming, and will prefer a DGPU that can be upgraded.
Finally someone gets it.
 

Shivansps

Diamond Member
Sep 11, 2013
3,641
1,329
136
It still remains to be seen how well they perform; if A12-9800 vs A12-9800E is any indication, I would expect about 30% over their mobile counterparts. But I'm sure they can run everything on that list, considering the A12-9800 can. PUBG is the most demanding game on that list, and even a GT1030 can handle that easily.

The fact is... I never even thought about using a GT1030 for gaming before entering this thread. And I live in a country where a GTX1050 2GB right now costs 50% of the minimum monthly salary. Maybe it was not clear, but I pointed out (and other people did as well) that AMD APUs have been destroying Nvidia's (and AMD's) low-end cards, as well as Intel's IGP, ever since Llano launched. And some of them, like Richland and Kaveri, outperformed the entry-level discrete GPUs of their time (and that's taking into consideration that a 2400G can match a GT1030, as AMD's slide suggests). The really big news is the CPU power, but 4/4 and 4/8 was not really big news according to AMD fans last year (now it is awesome, it seems). You have no idea how crazy this sounds: for the last few months I kept hearing that 4/4 and 4/8 were dead for any future gaming, but these APUs are suddenly a good option to have something while you save for, or wait out, dGPU prices?

So really, I don't know what to tell you: the kind of people who were buying APUs will keep buying APUs because nothing has changed, and the people buying dGPU+CPU will keep doing so because nothing has changed. At least as far as gaming goes this is a huge meh. NON-gaming use is where those new APUs will shine: you no longer need to buy a GT710 along with an R3 1200 for a PC that may never run any game outside Facebook, or any game at all. That is where those APUs are great, especially the 2200G. I still think the 2400G is overpriced, but this is AMD; they will probably realise a month later that no one wants it, and you will see a 2450G replacing it at $140-150.
 

french toast

Senior member
Feb 22, 2017
988
824
136
I'm not sure why AMD APUs have not been that popular. They have been superior to Intel's IGP for a long time, and with Kaveri in January of 2014 they were well out in front.
I built a Kaveri system for a friend that was a great performer for the money.

My guess is that overall, if people want good graphics, they prefer to buy a DGPU that can be upgraded.

If they don't need good graphics, Intel's IGP has been good enough, while also having a better CPU.

Ryzen APUs would theoretically bring along what was missing, a good CPU.

I still suspect that overall, people will not be enthusiastic about integrated graphics for decent levels of gaming, and will prefer a DGPU that can be upgraded.
Finally someone gets it.
It still remains to be seen how well they perform; if A12-9800 vs A12-9800E is any indication, I would expect about 30% over their mobile counterparts. But I'm sure they can run everything on that list, considering the A12-9800 can. PUBG is the most demanding game on that list, and even a GT1030 can handle that easily.

The fact is... I never even thought about using a GT1030 for gaming before entering this thread. And I live in a country where a GTX1050 2GB right now costs 50% of the minimum monthly salary. Maybe it was not clear, but I pointed out (and other people did as well) that AMD APUs have been destroying Nvidia's (and AMD's) low-end cards, as well as Intel's IGP, ever since Llano launched. And some of them, like Richland and Kaveri, outperformed the entry-level discrete GPUs of their time (and that's taking into consideration that a 2400G can match a GT1030, as AMD's slide suggests). The really big news is the CPU power, but 4/4 and 4/8 was not really big news according to AMD fans last year (now it is awesome, it seems). You have no idea how crazy this sounds: for the last few months I kept hearing that 4/4 and 4/8 were dead for any future gaming, but these APUs are suddenly a good option to have something while you save for, or wait out, dGPU prices?

So really, I don't know what to tell you: the kind of people who were buying APUs will keep buying APUs because nothing has changed, and the people buying dGPU+CPU will keep doing so because nothing has changed. At least as far as gaming goes this is a huge meh. NON-gaming use is where those new APUs will shine: you no longer need to buy a GT710 along with an R3 1200 for a PC that may never run any game outside Facebook, or any game at all. That is where those APUs are great, especially the 2200G. I still think the 2400G is overpriced, but this is AMD; they will probably realise a month later that no one wants it, and you will see a 2450G replacing it at $140-150.
It is clear to all that you don't get it. Period.
Past APUs were not BALANCED... they had compromises on power consumption, CPU performance, upgrade value... (ignoring AMD's brand value, which didn't help, especially around that time).
Raven Ridge is the first balanced APU ever made... no obvious downsides or compromises for its price and market segment... and it outperforms its Intel rival processors.

If you want better than 1030 performance, why are you even considering either? What's the point of the discussion? Buy an i5 8400 and an inflated $350 GTX 1060 and be happy.
A big chunk of the market DOES buy their PC with this budget price range in mind... a cheap jack of all trades that does all the basics well, is upgradeable/future-proof, and keeps the missus happy that not too much money is being spent on toys... for a family with small children this is a common scenario.

If you can't see that an integrated APU that can perform similarly to the best discrete rival products, for around the same price, is a great product... if you think that unless it smashes a discrete GPU with dedicated memory it has little value... you're beyond help and I have to wonder about trolling.
 

rainy

Senior member
Jul 17, 2013
486
374
136
I find this a little interesting: here is a review from mid-2011 of the first "APU"


the APU was $150 and the GT 430 was $79; both had similar memory bandwidth (128-bit DDR3-1800 for the GT 430)

It looks like Llano had a more competitive IGP, relative to a low-end Nvidia card at this price point, than this new APU does.
The main difference is that the CPU is a lot more competitive now.
It's not surprising at all: with the A8-3850 it was possible to use DDR3-1866, and even with DDR4-3600 you don't get twice the bandwidth in the case of the 2200G/2400G.
The GT 1030 is a modern equivalent of the GT 430 - both have a 64-bit bus, but the former uses GDDR5 at 6000 MHz while the latter used DDR3 at 1600 MHz.
That is close to 4x more bandwidth, plus a really efficient memory compression technology.

That's the main problem: in terms of bandwidth, the gap between APUs and low-end graphics cards has increased very significantly because of the use of GDDR5 memory.

https://en.wikipedia.org/wiki/List_of_AMD_accelerated_processing_unit_microprocessors#Lynx:_"Llano"_(2011)
https://www.techpowerup.com/gpudb/2954/geforce-gt-1030
https://www.techpowerup.com/gpudb/603/geforce-gt-430
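For concreteness, the peak numbers behind that comparison can be worked out as bus width times transfer rate. These are nominal peak figures only; in practice the APU's bandwidth is also shared with the CPU, as noted above:

```python
def bandwidth_gbs(bus_bits: int, mt_per_s: int) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) * (transfers per second)."""
    return bus_bits / 8 * mt_per_s / 1000

# Raven Ridge APU: dual-channel DDR4-3200 (128-bit), shared with the CPU
print(bandwidth_gbs(128, 3200))   # 51.2 GB/s
# GT 1030: 64-bit GDDR5 at 6000 MT/s effective
print(bandwidth_gbs(64, 6000))    # 48.0 GB/s
# 64-bit GT 430 variant: DDR3 at 1600 MT/s
print(bandwidth_gbs(64, 1600))    # 12.8 GB/s
# GT 1030 vs GT 430 on the same 64-bit bus: 48 / 12.8 = 3.75x, i.e. "close to 4x"
```

And that ratio still ignores Pascal's memory compression, which widens the effective gap further.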
 
Last edited:

Shivansps

Diamond Member
Sep 11, 2013
3,641
1,329
136
It is clear to all that you don't get it. Period.
Past APUs were not BALANCED... they had compromises on power consumption, CPU performance, upgrade value... (ignoring AMD's brand value, which didn't help, especially around that time).
Raven Ridge is the first balanced APU ever made... no obvious downsides or compromises for its price and market segment... and it outperforms its Intel rival processors.

If you want better than 1030 performance, why are you even considering either? What's the point of the discussion? Buy an i5 8400 and an inflated $350 GTX 1060 and be happy.
A big chunk of the market DOES buy their PC with this budget price range in mind... a cheap jack of all trades that does all the basics well, is upgradeable/future-proof, and keeps the missus happy that not too much money is being spent on toys... for a family with small children this is a common scenario.

If you can't see that an integrated APU that can perform similarly to the best discrete rival products, for around the same price, is a great product... if you think that unless it smashes a discrete GPU with dedicated memory it has little value... you're beyond help and I have to wonder about trolling.
The APU-to-dGPU gap is wider than ever before, and I'm taking into consideration that 2400G = GT1030 here, just like the slide suggests. This is easy to see and understand, so why does no one want to see it?

Not that AMD could do much about this, because the problem is GDDR5 vs DDR4 bandwidth. Still, the GT1030 gives the APU a fighting chance with that 64-bit bus. So yes, I'M EXPECTING the APU to beat a similar-bandwidth dGPU like they did in the past; if they don't, I'm not going to make up excuses for AMD.

And those APUs have everything they need to do so: the 2200G has pretty much an integrated Vega version of the RX550, and higher bandwidth than a 1030 when paired with DDR4-3200. If a 2200G can't match a GT1030, it could only be because of 3 problems:
1) TDP
2) Nvidia's memory compression is just better.
3) the new cores are using too much memory bandwidth.

So no excuses; they should be better than that thing.
 
Last edited:

maddie

Diamond Member
Jul 18, 2010
4,331
3,849
136
Have any naysayers considered the idea that the lowest-end cards are a moving target? Why would any graphics company develop a low-powered card that's surpassed by an APU? The lowest cards, almost by definition, will have to try to stay above the APUs performance-wise. What else justifies their existence?

When AMD CPUs were completely surpassed by Intel, the majority of APUs had very low graphics power, so a lower-performance card could have a role. In those days, the minority player, with superior graphics, might have overcome the lowest discrete card. Now that buyers can get a CPU equivalent to Intel's and a much better graphics unit included, very low-performance graphics cards are losing their role.

Also, this is not a perfect mathematical ratio of discrete being X% faster than APU at all times. It will vary per generation.
 
Aug 11, 2008
10,451
642
126
It is clear to all that you don't get it. Period.
Past APUs were not BALANCED... they had compromises on power consumption, CPU performance, upgrade value... (ignoring AMD's brand value, which didn't help, especially around that time).
Raven Ridge is the first balanced APU ever made... no obvious downsides or compromises for its price and market segment... and it outperforms its Intel rival processors.

If you want better than 1030 performance, why are you even considering either? What's the point of the discussion? Buy an i5 8400 and an inflated $350 GTX 1060 and be happy.
A big chunk of the market DOES buy their PC with this budget price range in mind... a cheap jack of all trades that does all the basics well, is upgradeable/future-proof, and keeps the missus happy that not too much money is being spent on toys... for a family with small children this is a common scenario.

If you can't see that an integrated APU that can perform similarly to the best discrete rival products, for around the same price, is a great product... if you think that unless it smashes a discrete GPU with dedicated memory it has little value... you're beyond help and I have to wonder about trolling.
No, he does get it, and I agree with him. RR is more balanced, but if you plan to game on the iGPU, nothing has changed. APUs weren't CPU-limited in most gaming scenarios before; they were bandwidth- and TDP-limited, and still are. If someone had suggested before building a "gaming" PC with a quad-core CPU and a 1030, he would have been laughed out of the forums, especially by the "moar cores" lobby that was claiming quad cores were dead. Now it is supposed to be this great "future proof" solution that a huge chunk of the market will buy?? If all one wants to play is esports and older games, just look at Newegg. For 800 dollars, you can buy a laptop with a GTX 1050 and a hyperthreaded quad-core i7 that will almost certainly outperform either of the RR desktop APUs. For slightly over 700 dollars, you can get a laptop with the same GPU and a quad-core mobile i5. You lose the ability to upgrade, but you get better performance out of the door, and the advantage of mobility. And as for "keeping the missus happy", I hardly think she will be happy with the upgrade path either: "Wait, honey, didn't we just buy a computer? Why are we having to spend more money now?"
 
Aug 11, 2008
10,451
642
126
Have any naysayers considered the idea that the lowest-end cards are a moving target? Why would any graphics company develop a low-powered card that's surpassed by an APU? The lowest cards, almost by definition, will have to try to stay above the APUs performance-wise. What else justifies their existence?

When AMD CPUs were completely surpassed by Intel, the majority of APUs had very low graphics power, so a lower-performance card could have a role. In those days, the minority player, with superior graphics, might have overcome the lowest discrete card. Now that buyers can get a CPU equivalent to Intel's and a much better graphics unit included, very low-performance graphics cards are losing their role.

Also, this is not a perfect mathematical ratio of discrete being X% faster than APU at all times. It will vary per generation.
Actually, you just made a very strong case *against* an APU. Yes, low-end cards are a moving target, but that is one big disadvantage of an APU: once APUs start to catch up, discrete cards take another step up in performance.
 

rainy

Senior member
Jul 17, 2013
486
374
136
Still, the GT1030 gives the APU a fighting chance with that 64-bit bus. So yes, I'M EXPECTING the APU to beat a similar-bandwidth dGPU like they did in the past; if they don't, I'm not going to make up excuses for AMD.
Your trolling in this thread has become really annoying: since when would the 2200G's IGP have more bandwidth?
On paper, a 128-bit bus with DDR4-3200 is slightly more than a 64-bit bus with GDDR5 at 6000 MHz, but that bandwidth must be shared between the CPU and the IGP.
 

BSim500

Golden Member
Jun 5, 2013
1,480
216
106
This is why I hate pre-release speculation threads. They all end the same way, with dozens of pages of some people prematurely "over-dumping" on the product whilst others constantly over-sell it for heavyweight AAAs at 720p/Very Low settings they'd never use themselves once the immediate post-purchase novelty has worn off. The Ryzen APUs look very good vs Intel's same-priced new stuff (i3-8100 or G5400-G5600 Pentiums, etc) IF your needs are genuinely light and you do not need a dGPU. Personally, I'm going to keep a close eye on the 2200G with the intention of throwing one into a hybrid HTPC / light gaming rig for use on 1990-2013-ish era games (basically Day of the Tentacle to Dishonored 1, ScummVM to Skyrim, etc) as well as a number of newer lighter-weight indies (Don't Starve, QUBE2, This War of Mine, Thimbleweed Park, maybe Talos Principle 2 at a push).

Having said that, I've hardly seen anyone here actually talk about playing older games or actually name new indies they want to play. It's been "Witcher 3, Witcher 3, Witcher 3", even talk of 2018-2019 AAA titles like CyberPunk 2077. APUs are at their worst on new AAA heavyweights and at their best on older / lighter titles, yet after 32 pages am I really the only one who's actually named predominantly non-AAAs to actually play myself?

Throwing theoretical bandwidth figures around for heavyweight games is meaningless for APU vs dGPU comparisons due to a variety of constraints (1. bandwidth being shared with the CPU, 2. dynamic TDP sharing (ie, throttling to stay within 65W, especially in "thin" SFF builds that will also constrain OCing), 3. 2GB less usable RAM, etc), and since different games behave in different ways, there really is no "one size fits all" prediction formula to declare as a single "absolute fact". Even the slides in the first post show this (45 → 49 (+9%) for Rocket League, 87 → 96 (+10%) for Skyrim, and BF1 flatlined at 52fps on both) against a +37.5% difference in APU shaders (ie, despite 704 vs 512 shaders, the 2400G's performance seems "capped" somewhere around the 576th shader, assuming settings are the same). Whether that performance wall is down to DDR4 bandwidth, dynamic TDP sharing / throttling or something else remains unknown until people actually get their hands on one and start testing all variables. The difference in how Skyrim scales vs BF1, though, shows why OCing something by X% in one game doesn't necessarily lead to the same X% gain in all games, and why the only sane thing to do when giving purchase 'advice' is simply "wait for the benchmarks".
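To put numbers on that "capped" observation, here is the scaling arithmetic from the slide figures quoted in this thread (slide numbers as reported here, not independently verified):

```python
# Shader counts: 2200G (Vega 8) vs 2400G (Vega 11)
shaders_2200g, shaders_2400g = 512, 704
shader_gain = shaders_2400g / shaders_2200g - 1   # +37.5% more shaders

# (2200G fps, 2400G fps) pairs from the launch slides quoted above
slides = {"Rocket League": (45, 49), "Skyrim": (87, 96), "Battlefield 1": (52, 52)}

for game, (fps_low, fps_high) in slides.items():
    observed = fps_high / fps_low - 1       # actual fps gain
    efficiency = observed / shader_gain     # fraction of the theoretical gain realised
    print(f"{game}: +{observed:.0%} fps for +{shader_gain:.0%} shaders "
          f"({efficiency:.0%} scaling)")
```

Roughly a quarter of the theoretical shader gain shows up in the first two titles, and none at all in BF1, which is what suggests some other limit (bandwidth, TDP, or both) is in play.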

Some games may get very close to a GT 1030; others won't no matter what tweaking you do. Likewise, newer games with heavier engines tend to be more VRAM / RAM thirsty at lower settings relative to the same visuals on older engines, eg, turning a UE4 game down to 1080p/low doesn't necessarily keep VRAM usage under 2GB the way it used to with UE1-3 games (even on High/Ultra). One of the biggest problems for 1-2GB "VRAM" APUs / GPUs over the past 3-4 years has been chronic VRAM bloat in post-2014 era games, and many of last year's titles remained above 2GB even on "very low" settings (Wolfenstein 2, Dishonored 2, etc). So again, those wanting 2GB "VRAM" APUs for 2018-2020 AAA games who are relying on "low" presets solving engine bloat are being "overly optimistic", to put it politely.

Whilst new engines do look better at the high end (4K textures, etc), they've also conversely become a lot less efficient at providing sub-2GB-VRAM 1-2k textures on "Low/Med" presets vs how well 2007-2014 titles written on older engines run on Med-Ultra. Likewise, for other games, dropping resolution from 4K → 1440p → 1080p → 720p doesn't actually lower (system) RAM usage much any more, which is going to be problematic for heavier titles on an 8GB - 2GB iGPU VRAM = 6GB rig, especially if you've got a web browser with a walkthrough / wiki guide left open in the background. Example: ME: Andromeda (10.1GB 4K / 8.2GB 1440p / 7.3GB 1080p = 6.0-6.5GB for 720p?) vs how older games typically remained under 2GB process / 4-5GB system usage due to being predominantly 32-bit.

I've seen UE4 titles like Obduction or Everybody's Gone To The Rapture crash "out of RAM" on an 8GB RAM + dGPU rig with a 1GB browser in the background, even with settings lowered, yet remain stable after closing the browser (ie, 7 vs 8GB = playable vs not playable). So unless you actively avoid these titles, having only 4-5GB (vs 6-7GB) to play with (8GB - 2GB "APU shared VRAM" - 1-2GB OS & background apps) is going to hit those limits even earlier, or grind everything to a halt via constant pagefile swapping. Some modern game engines (UE4 & Unity in particular) are just plain RAM hogs, and there's little a 6GB RAM user can do as it's down to the "weight" of the engine. Personally, I have 16GB in both rigs (bought at less than half current prices), but if you're buying new today and can only afford 8GB, the more expensive 2400G certainly becomes a much harder sell vs a $70-$100 CPU + GT1030 which, even given the same speed, effectively comes with a "free" $25 extra 2GB stick of RAM (8 vs 6 of 8GB usable) and potentially saves another $100 on 16GB (14GB) vs 8GB; it's mostly the cheaper 2200G plus lighter-weight games that makes the most sense for a genuinely "budget" build.
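As a sketch, that RAM budget works out like this (the OS/background and browser figures are the rough estimates from the paragraph above, not measurements):

```python
def free_for_game(total_gb, igpu_carveout_gb, os_background_gb=1.5, browser_gb=1.0):
    """RAM left for the game after fixed overheads (all figures rough estimates)."""
    return total_gb - igpu_carveout_gb - os_background_gb - browser_gb

# 8 GB APU build: 2 GB of system RAM reserved as shared "VRAM"
apu_build = free_for_game(8, 2)    # 3.5 GB left for the game
# 8 GB CPU + GT1030 build: no carve-out, since VRAM lives on the card
dgpu_build = free_for_game(8, 0)   # 5.5 GB left for the game
print(apu_build, dgpu_build)
```

The ~2 GB delta is exactly the "free extra stick of RAM" argument: on a tight 8 GB build the carve-out bites hardest.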

IF the cheaper 2200G can do 1080p/med/60 (or near enough with a little tweaking) on the titles mentioned in the first paragraph, then it'll be a solid buy from me vs an i3-8100 + H310 / B360 board. But I think those wanting APUs for bleeding-edge heavyweight AAA titles need to "keep it real", and be honest with themselves right from the start that what they need is 16GB RAM and / or a 4GB dGPU (1050Ti minimum) for 1080p low/med, even if they don't want to budget for it given the current unfortunately skewed pricing climate.

Ultimately, there's only 15 days left until launch, and I'm pretty sure no-one's had a coroner write "a geek who couldn't wait 2 weeks for benchmarks" as cause of death... :p
 

KompuKare

Senior member
Jul 28, 2009
853
553
136
Funny how when AMD utterly destroys Intel in integrated performance it's not important, and suddenly there is going to be a GT2030 released very soon (just to try and scare people off getting a Ryzen 3 2200, etc., if they can), but if Intel is even 10% faster with a £100 CPU in a game at 720p with a Titan Xp it's huge, or if it's 5% faster in SuperPi it's massive!!
Don't really know where this GT2030 (or GT1140) is meant to come from anyhow.

If it's Volta (Ampere is pretty much an unknown for now), where is the extra performance meant to come from? Because while GV100 has some nice AI-specific features, it really doesn't seem to bring any extra gaming performance, especially for its die size. And TSMC's 12nm doesn't really bring much compared with their 16nm process.
Architecturally, Nvidia have already used up most of the low-hanging fruit, like tile-based rendering and stripping compute out of their gaming cards, so it's hard to see what they can do for Volta.
So about the only way I see a Volta design outperforming Pascal significantly is if Nvidia are willing to sacrifice die space (and margins) to enable it to do so. With their record margins it is always possible that they will do just that, but I think anyone expecting miracles from Volta/Ampere will be disappointed. More likely, Pascal to Volta/Ampere will be more like Kepler to Maxwell v1 (like GP107 to GM107). Although for specific SKUs they may be more willing to use a bigger die (so I guess it's possible that, if they think the GT1030 is under a lot of competitive pressure, they may be willing to raise GP107's 70mm² to 100mm² for GV107/GA107).

As for the Ryzen 5 2400G and Ryzen 3 2200G, I hope AMD eventually releases some in-between models, as while the 2200G is very well priced, the 2400G is less so. (I wonder if someone will be able to unlock the shaders on the 2200G, though, as has often been the case with previous AMD products?)
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
This is why I hate pre-release speculation threads. They all end the same way, with dozens of pages of some people prematurely "over-dumping" on the product whilst others constantly over-sell it for heavyweight AAAs at 720p/Very Low settings they'd never use themselves once the immediate post-purchase novelty has worn off. The Ryzen APUs look very good vs Intel's same-priced new stuff (i3-8100 or G5400-G5600 Pentiums, etc) IF your needs are genuinely light and you do not need a dGPU. Personally, I'm going to keep a close eye on the 2200G with the intention of throwing one into a hybrid HTPC / light gaming rig for use on 1990-2013-ish era games (basically Day of the Tentacle to Dishonored 1, ScummVM to Skyrim, etc) as well as a number of newer lighter-weight indies (Don't Starve, QUBE2, This War of Mine, Thimbleweed Park, maybe Talos Principle 2 at a push).

Having said that, I've hardly seen anyone here actually talk about playing older games or actually name new indies they want to play. It's been "Witcher 3, Witcher 3, Witcher 3", even talk of 2018-2019 AAA titles like CyberPunk 2077. APUs are at their worst on new AAA heavyweights and at their best on older / lighter titles, yet after 32 pages am I really the only one who's actually named predominantly non-AAAs to actually play myself?

So you decided to weigh in with your epic-length, over-selling post. :D

What makes your choice of games more valid than mine?

Witcher 3 is a game I mentioned because it is the ONLY game I can't play on my old computer that I actually want to play, and it is thus a driver for my new computer upgrade decisions.

Most of us probably already have computers that can handle older, and lighter weight games, and if we are buying something new in 2018+ it will be to play newer, more intensive titles.
 

Insert_Nickname

Diamond Member
May 6, 2012
4,819
1,491
136
Any naysayers considering the idea that the lowest end cards are a moving target? Why would any graphics company develop a low powered card thats surpassed by an APU. The lowest cards, almost by definition, will have to try and be above the APUs performance wise. What else justifies there existence?
Display connectivity/multi-monitor, and adding modern video decoding to older systems with an outdated IGP.

Not everything is gaming.
 

neblogai

Member
Oct 29, 2017
144
49
101
Having said that, I've hardly seen anyone here actually talk about playing older games or actually name new indies they want to play. It's been "Witcher 3, Witcher 3, Witcher 3", even talk of 2018-2019 AAA titles like CyberPunk 2077. APUs are at their worst on new AAA heavyweights and at their best on older / lighter titles, yet after 32 pages am I really the only one who's actually named predominantly non-AAAs to actually play myself?
I play a lot of really old titles, and in general my preferred genre (tactical/strategy) does not require a high frame rate. So I'm actually considering exchanging my R5 1600 for a 2400G to play around with it. My current dGPU is very old, and not much faster than the expected Vega 11 anyway.

Whilst new engines do look better high-end (4K textures, etc), they've also conversely become a lot less efficient in providing sub 2GB VRAM 1-2k textures on "Low/Med" presets..
While there is a movement toward 4K, more RAM and VRAM, and more cores and GPU power on PC, there are positive developments for the ultra-budget segment as well. It probably comes from console hardware being limited (a weak CPU + 8GB total (V)RAM for most of them), which helps ports run on relatively weak PCs. The most important trend, I think, comes from consoles constantly scaling (lowering) resolution to achieve an acceptable frame rate. On PC this turns into things like a 50% resolution scale at 720p or 1080p, making it possible for well-programmed games like Destiny 2, Overwatch or Wolfenstein 2 to run on almost anything. So I'd say there is also a trend of the game software market getting friendlier to ultra-low-end PC hardware.
 
Last edited:

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
No, he does get it, and I agree with him. RR is more balanced, but if you plan to game on the iGPU, nothing has changed. APUs weren't CPU-limited in most gaming scenarios before; they were bandwidth- and TDP-limited, and still are. If someone had suggested before building a "gaming" PC with a quad-core CPU and a 1030, he would have been laughed out of the forums, especially by the "moar cores" lobby that was claiming quad cores were dead. Now it is supposed to be this great "future-proof" solution that a huge chunk of the market will buy? If all one wants to play is esports and older games, just look at Newegg. For 800 dollars, you can buy a laptop with a GTX 1050 and a hyperthreaded quad-core i7 that will almost certainly outperform either of the RR desktop APUs. For slightly over 700 dollars, you can get a laptop with the same GPU and a quad-core mobile i5. You lose the ability to upgrade, but you get better performance out of the door, plus the advantage of mobility. And as for "keeping the missus happy", I hardly think she will be happy with the upgrade path either: "Wait, honey, didn't we just buy a computer? Why are we spending more money now?"
It sounds like you believe only a tiny number of low-end, entry-class discrete GPUs like the 1030 are being sold. I would guess that this class of card is the highest-volume GPU on the market, so obviously somebody must be buying them (whether in an OEM system or DIY), mainly because Intel's weak GPUs aren't capable of entry-level graphics performance for an acceptable experience. It has nothing to do with what hardcore gamers would consider buying for an ultimate gaming machine running the latest AAA games at max quality. There are far more casual gamers than hardcore gamers in the world, and Ryzen G fills that demand because of its strength in both graphics and general compute. Previous AMD APUs were capable of playing games at the entry level, but buyers had to compromise on lower general compute performance. Casual gamers aren't going to buy a computer strictly for gaming; it is not the primary purpose. There is no compromise with the Ryzen G series if it has comparable CPU performance and class-leading GPU performance comparable to an attached discrete GPU.
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
It sounds like you believe there are a tiny amount of low end entry class discrete GPUs like the 1030 being sold. I would guess that this class of card is the highest volume GPU on the market, so obviously somebody must be buying them (whether in an OEM solution or DiY).
Someone must be buying them because you guess they are the highest volume? What kind of logic is that?

If someone is buying a GPU for gaming it is extremely unlikely it will be the 30 series.

In the top 10 GPUs on Steam, 100% of them are 50 series or higher.
http://store.steampowered.com/hwsurvey/videocard/

50 and 60 series are the overwhelming choice for gamers.

Most 30 series chips probably end up in laptops, not as desktop GPUs.

As a desktop GPU it's almost pointless: if you game, you want more than that; if you don't game, you don't need a dGPU.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,566
96
Someone must be buying them because you guess they are the highest volume? What kind of logic is that?

If someone is buying a GPU for gaming it is extremely unlikely it will be the 30 series.

In the top 10 GPUs on Steam, 100% of them are 50 series or higher.
http://store.steampowered.com/hwsurvey/videocard/

50 and 60 series are the overwhelming choice for gamers.

Most 30 series chips probably end up in laptops, not as desktop GPUs.

As a desktop GPU it's almost pointless: if you game, you want more than that; if you don't game, you don't need a dGPU.
That depends, dGPUs like the 1030 are common upgrades to HTPCs and older systems to provide updated ports like HDMI 2.0 as well as modern video decoding.
 

piesquared

Golden Member
Oct 16, 2006
1,651
473
136
Someone must be buying them because you guess they are the highest volume? What kind of logic is that?

If someone is buying a GPU for gaming it is extremely unlikely it will be the 30 series.

In the top 10 GPUs on Steam, 100% of them are 50 series or higher.
http://store.steampowered.com/hwsurvey/videocard/

50 and 60 series are the overwhelming choice for gamers.

Most 30 series chips probably end up in laptops, not as desktop GPUs.

As a desktop GPU it's almost pointless: if you game, you want more than that; if you don't game, you don't need a dGPU.

You realize the 1030 is not listed anywhere on that Steam list? What do you suppose the chances are that not a single 1030 has been sold? Or is it listed in one of the other categories, perhaps in the 11.8% "Other" category, or not counted at all when paired with an Intel iGPU?

http://store.steampowered.com/hwsurvey/videocard/?sort=name
 

PeterScott

Platinum Member
Jul 7, 2017
2,605
1,540
106
That depends, dGPUs like the 1030 are common upgrades to HTPCs and older systems to provide updated ports like HDMI 2.0 as well as modern video decoding.
I somewhat agree. But HTPCs are a small niche to start with, and they come in a huge variety of configs, from IGPs to fairly powerful GPUs for living-room gaming as well.

Also this isn't really a market that an APU would cut into (dropping in a dGPU to upgrade an old system).

So, I really don't think the GT 1030 presents any kind of significant target for making APU gains, outside of laptops.
 

scannall

Golden Member
Jan 1, 2012
1,902
1,529
136
It seems the mining bubble has even affected the GT1030. The cheapest of those on Newegg is $75.
 

maddie

Diamond Member
Jul 18, 2010
4,331
3,849
136
Actually, you just made a very strong case *against* an APU. Yes, low end cards are a moving target, but that is one big disadvantage to an APU. Once they start to catch up, discrete cards take another step up in performance.
Sounds like an argument to never buy anything. Tomorrow is always better. I'm sure you didn't mean that, but your zealousness to marginalize RR leads to some really strange arguments.
 

USER8000

Golden Member
Jun 23, 2012
1,527
761
136
It seems the mining bubble has even affected the GT1030. The cheapest of those on Newegg is $75.
I checked the price of the cheapest GT1030 plus a G4560 on Scan and it came to £125. Going by the RRP, the 2200G should be under £90 in the UK, so the GT1030 combo is about 40% more.
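A quick sanity check of that "40% more" figure, using the quoted prices (£125 for the GT1030 + G4560 combo versus an assumed £90 for the 2200G):

```python
# Price premium of the GT1030 + G4560 combo over the Ryzen 3 2200G,
# using the prices quoted above (GBP).
gt1030_combo = 125.0  # cheapest GT1030 + G4560 on Scan
apu_2200g = 90.0      # expected UK street price of the 2200G

premium = (gt1030_combo - apu_2200g) / apu_2200g
print(f"GT1030 combo costs {premium:.0%} more")  # -> 39%, i.e. roughly 40%
```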
 
