Zen 2 APUs/"Renoir" discussion thread

Page 14 - AnandTech Forums

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
Again for the n-th time, it is not replacing but complementing the 3200g & 3400g

It ends up offering more diverse options to enthusiasts and allows the old flagship to be significantly discounted. (That is a complete win-win for consumers.) So one can either get a much better deal on top-of-the-line iGPU gaming with a 3400G, or, at the same price, get stronger CPU cores paired with an iGPU that exceeds the 3200G's, along with almost twice the 3200G's CPU capability.

The fact that 7nm Vega 7 outperformed the old Vega 11 flagship in the Rainbow Six iGPU test (and probably traded blows in other iGPU tests) is impressive to me.

If you are stressing the (pretty insignificant) ~15% GPU disadvantage of 7nm Vega 6 versus the flagship Picasso iGPU so much, and talking about the great importance of the iGPU, then kindly also point out that the i3-10100 you are promoting is an absolute joke that shouldn't even be an option in your context.

You must misunderstand my point, which is that any smart enthusiast knows he can (usually) also go the dGPU route. Hence, that person might get himself a 3600 + RX 570 at a not dramatically higher build cost while getting well over 2x any top-end iGPU's performance, or a 2600 + 570 for a good bit below the cost of a 4750G.

I'm sorry, but I'll believe this "complement" theory when I see it. Nothing AMD has done so far leads me to believe such a thing; everything AMD has done goes against it: cutting motherboard compatibility, launching the Athlon 3150G (which is likely to be priced way too near the 3200G), and killing 3400G supply for months earlier this year; it only recovered a month or so ago.
I also find it hard to believe that AMD wants a cheaper Picasso stealing sales from the more expensive Renoir and B550 motherboards on the strength of its iGPU performance value. I'm not going to say it is impossible, but it is wishful thinking at this point.

BTW, I'm not surprised by Vega 7 @ 1.9 GHz performance... I already knew that Vega 8 on the 3200G needs about 1.6 GHz to more or less match Vega 11 at stock, so that Vega 7 performance is what I was actually expecting. Don't forget that I used a 2200G and then a 3200G for a year of gaming without a dGPU; I know very well what Vega 8 can do when overclocked, and yes, it is a lot. This is part of the reason why I said the 4600G should be the one closer to 3400G pricing.
 
Last edited:

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
If those leaked numbers are real then that is a bit disappointing, to be honest. I am skeptical because the 4750G basically matched the 3700X in Cinebench and x265, and the same goes for the 4650G/3500X. Maybe in gaming it is just a complete turd. I also don't remember the 3200G/3400G being so much worse than Pinnacle Ridge in gaming. I am having a tough time finding a valid comparison, though, as nobody seems to have the 3400G and 2500X in the same benchmarks.

Based on that leak you may have a point. I still want to see confirmed gaming results first, though. I also still don't see how AMD can sell these cheaper than the Matisse variants with the same core count, seeing as you get an iGPU.

So I dug a bit more and this is the best I found so far: the 1300X vs the 2200G. Very similar clocks; the 2200G has half the L3. In all but one game (and one outlier result in World of Tanks) they basically match each other. I'm not saying the leak is wrong; it just seems odd that Raven Ridge matched Summit Ridge in gaming, yet Renoir would suddenly be noticeably worse than Matisse.


I agree, we need to wait for full reviews; it is just that those numbers are not looking good, but they may be wrong. And I'm very surprised by the bad CPU performance in gaming; I thought Renoir would do very well there.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,647
3,706
136
BTW, I'm not surprised by Vega 7 @ 1.9 GHz performance... I already knew that Vega 8 on the 3200G needs about 1.6 GHz to more or less match Vega 11 at stock, so that Vega 7 performance is what I was actually expecting. Don't forget that I used a 2200G and then a 3200G for a year of gaming without a dGPU; I know very well what Vega 8 can do when overclocked, and yes, it is a lot. This is part of the reason why I said the 4600G should be the one closer to 3400G pricing.

Out of curiosity, could you give us a few examples of prices of video cards in your country? I think at one point you said they are stupidly expensive compared to the US? Say an RX 580 or 1660. How about the used market? Something like an RX 480 or 1060.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
I would be a little surprised if AMD kept supplying significant volumes of Picasso except to fulfill long term contracts (where applicable).
 

amd6502

Senior member
Apr 21, 2017
971
360
136
I would be a little surprised if AMD kept supplying significant volumes of Picasso except to fulfill long term contracts (where applicable).

Renoir laptops are upper-mid mainstream and high-end machines. Picasso laptops still make up the bulk (maybe the majority) of AMD mainstream laptops, although the Raven2 budget Athlon laptops will at some point outpace Picasso. I would think these are still very much in production (or a large stockpile has accumulated). There doesn't seem to be much reason why Picasso or Pinnacle would be out of production; they offer good perf/$ and are probably relatively cheap to make versus 7nm Renoir.

What Renoir will do is greatly drop demand for the high-bin mobile Picasso dies (namely the 3700U). The 3500U will still be a very mainstream product and compete well with Renoir because it sits at lower-mid mainstream pricing. I think this will free up a lot of 8-11 CU Picasso dies for AM4. Imho, a new ~3300G (3380G?) bin with 10 CUs and 8 threads at 3.5-4 GHz for $120 would instantly become a darling of budget iGPU gamers.

The 3300U (6 CU) will also stay relevant, although it's getting some competition from the Raven2 dual cores (3 CU Athlons). Bottom line, I really don't think Picasso is going to vanish. No way, not until a Zen 2 or Zen 3 quad-core APU substitute has already launched.
 
Last edited:

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
Picasso very nicely does two things: it fills volume at the bottom end of the market and it continues to fill WSA terms with GloFo. As the market moves on from old Bristol Ridge and prior based APUs at the very bottom, it still gives AMD something to produce cheaply at volume, especially in the mobile market. Remember, the rest of the world doesn't revolve around the US market, and a lot of it needs to shop in the absolute bottom end of the market. AMD needs product in that space that is competitive.
 

Asterox

Golden Member
May 15, 2012
1,026
1,775
136
Picasso very nicely does two things: it fills volume at the bottom end of the market and it continues to fill WSA terms with GloFo. As the market moves on from old Bristol Ridge and prior based APUs at the very bottom, it still gives AMD something to produce cheaply at volume, especially in the mobile market. Remember, the rest of the world doesn't revolve around the US market, and a lot of it needs to shop in the absolute bottom end of the market. AMD needs product in that space that is competitive.

Price/performance ratio is the most important thing. It is completely irrelevant where someone lives, the US or Klingon space or, for example, EU countries like Austria or Albania.

Can we complain, or is the Athlon Gold 3150GE (Zen+) maybe too expensive at $60? All the new Athlon models use 3 Vega CUs at 1100 MHz.


A lot of people buy a CPU (especially for gaming) or APU they don't really need.

I could buy a 6/12 Renoir APU, but for my HTPC needs that would be a big waste of money. A 4/8 Renoir APU is more than enough for my needs.

My Athlon 3000G (Zen) is doing its job with no problems so far. I paid $55 for that 2/4 APU, so let's get back to the green 4/4 Athlon at roughly $60.
 
  • Like
Reactions: Tlh97 and amd6502

mohit9206

Golden Member
Jul 2, 2013
1,381
511
136
One thing I am very disappointed in: back around 2013-2015, when APUs were very weak, AMD laid out a long roadmap promising that great things were to come as the iGPUs in APUs became powerful. Looking at 2020, it's very disappointing that even the latest APUs like the 3400G cannot play AAA titles at anything other than low settings/720p. I don't feel much progress has been made since the 2400G in 2018. They skimped on the GPU in the 4000 series when it should have had the same number of CUs or more. APUs have made sub-$80 GPUs redundant, but the 4000 series still can't beat $100 cards like the 1050, which is where real gaming performance begins.
 

chrisjames61

Senior member
Dec 31, 2013
721
446
136
One thing I am very disappointed in: back around 2013-2015, when APUs were very weak, AMD laid out a long roadmap promising that great things were to come as the iGPUs in APUs became powerful. Looking at 2020, it's very disappointing that even the latest APUs like the 3400G cannot play AAA titles at anything other than low settings/720p. I don't feel much progress has been made since the 2400G in 2018. They skimped on the GPU in the 4000 series when it should have had the same number of CUs or more. APUs have made sub-$80 GPUs redundant, but the 4000 series still can't beat $100 cards like the 1050, which is where real gaming performance begins.


Doesn't it all stem from using slow system RAM for the iGPU? So more CUs would not make a difference?
 

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Picasso very nicely does two things: it fills volume at the bottom end of the market and it continues to fill WSA terms with GloFo. As the market moves on from old Bristol Ridge and prior based APUs at the very bottom, it still gives AMD something to produce cheaply at volume, especially in the mobile market. Remember, the rest of the world doesn't revolve around the US market, and a lot of it needs to shop in the absolute bottom end of the market. AMD needs product in that space that is competitive.

This^^

Can we complain, or is the Athlon Gold 3150GE (Zen+) maybe too expensive at $60? All the new Athlon models use 3 Vega CUs at 1100 MHz.

(Edited)

My Athlon 3000G (Zen) is doing its job with no problems so far. I paid $55 for that 2/4 APU, so let's get back to the green 4/4 Athlon at roughly $60.

I really like my own 200GE. Cheap, quiet, low power, does what's asked of it with no fuss. Can even run older games acceptably.

It's great that even basic Athlons are moving to quad core. All that's perhaps missing is a 2C/4T option with Vega 6 (or even 8) graphics for emerging markets. The original Athlons were cut-down Raven, so it should be possible. But such a chip might be a little too popular.

Doesn't it all stem from using slow system ram for the igpu? That more CUs would not make a difference?

Faster RAM helps. But if you're on a budget, the value proposition might be a little iffy.
 

amd6502

Senior member
Apr 21, 2017
971
360
136
It's great even basic Athlons are moving to quad core. All that's perhaps missing is a 2C/4T option with a Vega6 (even 8) IGP for emerging markets. Original Athlons where cut down Raven, so it should be possible. But such a chip might be a little too popular.

Now that the 4c/4t (Raven salvage) Athlons are out, the only consistent distinguishing features of Athlons (vs Ryzen) are the Vega 3 graphics and 4 threads.

But now, with Athlon being split into Gold and Silver tiers, maybe 6 CUs is an interesting possibility. How about 6 CUs + 3c/6t at ~3.3-3.6 GHz?
 

VirtualLarry

No Lifer
Aug 25, 2001
56,226
9,990
126
Looks like AMD really dropped the ball quite hard when it comes to memory latency on Renoir. I did not expect a regression compared to Matisse.
It's not entirely surprising. Matisse has a 1:1 FCLK:MEMCLK ratio, with a 1:2 divider above certain memory speeds. Renoir supposedly has some sort of variable ratio, which, to implement, in simple terms requires some async clocking/buffering, which necessarily means worse latency. At least, that seemed obvious to me.
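For anyone unclear on what the divisors mean for the fabric clock, here is a rough sketch. The numbers are illustrative (the actual Renoir ratio behavior isn't public); `fclk_mhz` is just a hypothetical helper for the arithmetic:

```python
# DDR4 is double data rate: a DDR4-3200 kit runs a 1600 MHz memory clock (MEMCLK).
def fclk_mhz(ddr_rating, divisor):
    """Fabric clock in MHz for a DDR rating (MT/s) and an FCLK:MEMCLK divisor."""
    memclk = ddr_rating / 2  # MEMCLK in MHz
    return memclk / divisor

print(fclk_mhz(3200, 1))  # 1600.0 MHz: the Matisse 1:1 case, lowest latency
print(fclk_mhz(4400, 2))  # 1100.0 MHz: 1:2 divider kicks in, latency penalty
```

The slower fabric clock in the 1:2 case is exactly why pushing memory speed past the 1:1 ceiling often hurts latency more than the extra bandwidth helps.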
 

Gideon

Golden Member
Nov 27, 2007
1,608
3,572
136
It's not entirely surprising. Matisse has a 1:1 FCLK:MEMCLK ratio, with a 1:2 divider above certain memory speeds. Renoir supposedly has some sort of variable ratio, which, to implement, in simple terms requires some async clocking/buffering, which necessarily means worse latency. At least, that seemed obvious to me.
Not everybody reports the regression. Hardwareluxx got their 4650G's FCLK running at 2200MHz and the latency down to 50.7 ns (custom timings):


Even their 3600 MHz bandwidth/latency results are very good. Certainly not a regression at least.
 

Attachments

  • Screenshot_20200727-004827.jpg

moinmoin

Diamond Member
Jun 1, 2017
4,933
7,619
136
Doesn't it all stem from using slow system ram for the igpu? That more CUs would not make a difference?
Indeed, at the top end all AMD APUs are seriously bottlenecked by RAM bandwidth, so there is just no room to grow significantly until the memory spec is upgraded, which will happen with DDR5 in 2022 or later.
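The bandwidth ceiling is easy to see with rough numbers. A sketch (theoretical peak only; sustained bandwidth is lower, and `ddr4_peak_gb_s` is a made-up helper for the arithmetic):

```python
def ddr4_peak_gb_s(ddr_rating, channels=2, bus_width_bits=64):
    """Theoretical peak bandwidth in GB/s for a DDR4 setup.

    ddr_rating is in MT/s; each 64-bit channel moves 8 bytes per transfer.
    """
    bytes_per_transfer = bus_width_bits // 8
    return ddr_rating * 1e6 * bytes_per_transfer * channels / 1e9

print(ddr4_peak_gb_s(3200))  # 51.2 GB/s, shared between CPU cores and iGPU
print(ddr4_peak_gb_s(4400))  # 70.4 GB/s even with expensive overclocked RAM
# For comparison, a GTX 1050's dedicated GDDR5 alone is in the ~112 GB/s range.
```

Even the fastest practical DDR4 kit leaves the iGPU with well under what a $100 discrete card gets to itself, which is the wall being described above.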

Looks like AMD really dropped the ball quite hard when it comes to memory latency on Renoir. I did not expect a regression compared to Matisse.
Latency on Renoir is a dynamic variable that depends on the type of load, i.e. whether it asks for more bandwidth or lower latency.
 
  • Like
Reactions: Tlh97

tamz_msc

Diamond Member
Jan 5, 2017
3,710
3,554
136
Latency on Renoir is a dynamic variable that depends on the type of load, i.e. whether it asks for more bandwidth or lower latency.
That makes some sense in the case of the -U parts, but there's no excuse for DRAM:FCLK not to be 1:1 by default on the desktop parts, if it turns out that it isn't set that way by default.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
Picasso very nicely does two things: it fills volume at the bottom end of the market and it continues to fill WSA terms with GloFo. As the market moves on from old Bristol Ridge and prior based APUs at the very bottom, it still gives AMD something to produce cheaply at volume, especially in the mobile market. Remember, the rest of the world doesn't revolve around the US market, and a lot of it needs to shop in the absolute bottom end of the market. AMD needs product in that space that is competitive.

Thing is, most of us in the "first world" have no idea what is being sold in down markets or at what prices. If Picasso is what AMD wants to use in those markets then so be it, but if I recall correctly, the latest WSA is negotiated in such a way that AMD can avoid taking wafers if their product lineup requires something more sophisticated than what GF can provide. In other words, if they shift all markets to Renoir, their obligations for GF wafers will drop accordingly. I'm still not convinced Picasso is the best/cheapest way for them to go when 7nm products open up lots of possibilities for down markets, such as longer battery life and cheaper cooling, which OEMs will like (they can select poorer cooling and smaller batteries).

All that aside, we are discussing desktop Renoir here, and availability of 3400G chips has been spotty. I would not be surprised if they just vanished sometime soon. Most of the world simply won't care. Anyone choosing between a 3400G and 4300G (or whatever) may care a great deal about that.

That makes some sense in the case of the -U parts, but there's no excuse for DRAM:FCLK not to be 1:1 by default on the desktop parts, if it turns out that it isn't set that way by default.

Definitely agree with you there. Unless AMD is hoping the 4750G won't undercut sales of the 3700X while they're still peddling it.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
Eventually, AMD will have to walk away from the old GloFo nodes for their bottom-tier products. While GF 14/12nm are far from perfect, they are very mature nodes; their costs continue to diminish, and the products they produce continue to move down market. Intel will be dumping 14nm Atoms on the market for years to come, and AMD will need something better than Bristol Ridge to compete.

Do I expect Picasso to exist forever? Of course not, but it's going to be here a while. AMD may eventually release a small-die N7 APU, which we all think is Dali. Even heavily cut down, it's still going to be more expensive per die than Raven2 and Picasso for a long time to come. It doesn't make sense to abandon Picasso too quickly.
 
  • Like
Reactions: Tlh97 and amd6502

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136

Here's Death Stranding on the 4700G. Seems to run okay with 1080p CAS.

I was coming to comment exactly that. TechEpiphany got his hands on a 4750G and is running gaming benchmarks. He's not making direct comparisons, but, for example, GTA V at 1080p with DDR4-4133 is around 10-15 fps faster on average than the 3400G with DDR4-3466. It is an SKU that costs twice as much, running expensive RAM on a more expensive motherboard... at least it does perform better. But Vega 7 and 6 aren't going to fare that well.

I'm unsure about something: Vega on Renoir no longer has the 2GB VRAM limit that Picasso has had since day 1. This VRAM limit change is definitely going to help Renoir in certain games, but last time I checked the limit was still present on Picasso. So Vega on Renoir has a 16GB memory limit and Vega on Picasso a 2GB limit?

EDIT: Death Stranding is definitely a significant win for the 4750G. I'm unsure by how much, but a 3400G with 3200 MHz RAM can't do that at a constant 30+ fps.
 
Last edited:
  • Like
Reactions: Elfear

VirtualLarry

No Lifer
Aug 25, 2001
56,226
9,990
126
I'm unsure about something: Vega on Renoir no longer has the 2GB VRAM limit that Picasso has had since day 1. This VRAM limit change is definitely going to help Renoir in certain games, but last time I checked the limit was still present on Picasso. So Vega on Renoir has a 16GB memory limit and Vega on Picasso a 2GB limit?
Does this effectively mean that, due to modern game VRAM requirements, a "budget APU gamer" needs 32GB of (fast) DDR4 now, whereas a "normal" dGPU build could get away with 16GB of 3200? That sure makes some of the budget/pricing considerations between the two kind of topsy-turvy.

Anyways, Gigabyte GTX 1650 SUPER for $154.99 @ Newegg right now, consider that in the pricing stack too.
 
  • Like
Reactions: Tlh97 and moinmoin

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
Does this effectively mean that, due to modern game VRAM requirements, a "budget APU gamer" needs 32GB of (fast) DDR4 now, whereas a "normal" dGPU build could get away with 16GB of 3200? That sure makes some of the budget/pricing considerations between the two kind of topsy-turvy.

Anyways, Gigabyte GTX 1650 SUPER for $154.99 @ Newegg right now, consider that in the pricing stack too.

I don't think OEMs will ever enable the 16GB option in the "auto" setting; I'm pretty sure they are going to keep using 2GB for auto selections. At any rate, with this kind of performance there is zero point in enabling more than 3GB of VRAM, maybe 4GB depending on the game... I'm worried about what the driver will do regardless; remember that on Picasso, regardless of the BIOS setting, the driver was always able to use up to 2GB.
Anyway, I think a minimum of 16GB will still be enough.

At any rate it's not that important, because with these results and prices NO ONE should be using Renoir on the desktop for any kind of gaming. The 4750G makes no sense for iGPU gaming, EVEN with the idea of adding a dGPU later, given the low CPU+GPU results. And it looks like the 3400G is still the price/perf king for that use case at <$200. Unless you want to OC.
 
Last edited:

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
Desktop APUs have only ever made sense in a precious few use cases: an extremely restricted budget (though that rarely beats just buying used) or an extremely restricted form factor. If you absolutely have to live in an ITX case with no space for a GPU, then you are restricted to APU gaming. If that's the case, and you want the absolute best you can fit in there, then the 4700G/4750G is currently your only choice. With good RAM and a modest overclock, it blows past the RX 550 in most cases and sometimes dabbles in 1050 territory.

If you can't spend quite that much, and you are still that restricted for space, the 4600G/4650G isn't a bad choice. Again, good RAM and a modest overclock will put it past the 3400G almost every time, and in CPU-bound games that need more threads, it's a significant win.

I'll give you this much: if all you want to do is game, and nothing else is important to you, the 3400G looks to be a better purchase than the 4300G/4350G. There seem to be precious few instances where the latter will outperform the 3400G, especially if that 3400G has a solid overclock applied. However, if AVX2 performance matters at all, or if the games you play are heavily CPU-bound, the 4300G/4350G makes more sense.

In the mobile space, it's no contest; Renoir has it all over Picasso. My older son's 4700U system demolishes my younger son's 3500U system across the board, and the 3500U has a far better cooling system. The systems were within $100 of each other new, granted the 3500U is a year old now.
 
  • Like
Reactions: Tlh97

Insert_Nickname

Diamond Member
May 6, 2012
4,971
1,691
136
Now that the 4c/4t (Raven salvage) Athlons are out, the only consistent distinguishing features of Athlons (vs Ryzen) are the Vega 3 graphics and 4 threads.

But now, with Athlon being split into Gold and Silver tiers, maybe 6 CUs is an interesting possibility. How about 6 CUs + 3c/6t at ~3.3-3.6 GHz?

3C/6T with 6 CUs? I'm in. I don't think AMD is, however. Could be a fun little chip to play with, if you'll pardon the pun...

Does this effectively mean, that due to modern game VRAM requirements, that a "budget APU gamer" needs 32GB of (fast) DDR4 now? Whereas, a "normal" dGPU build could get away with 16GB of 3200? Sure makes some of the budget/pricing consideration between the two kind of topsy-turvy.

You should always overspec the RAM when using an IGP. Otherwise you may end up with only 6GB of usable system memory on an 8GB system; I don't need to mention what would happen on a 4GB one.

Memory allocation is dynamic, however. The IGP gets access to what it needs regardless of the UEFI setting; that just sets the minimum reserved, not what's actually used.
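The arithmetic behind that warning is simple. A sketch with example sizes (`usable_system_ram_gb` is a made-up helper; real systems also lose a bit to the OS and firmware):

```python
def usable_system_ram_gb(installed_gb, uefi_reserved_gb):
    """System RAM left for the OS after the fixed UEFI carve-out for the iGPU.

    Dynamic iGPU allocation on top of this comes and goes with demand;
    only the reserved portion is permanently gone.
    """
    return installed_gb - uefi_reserved_gb

print(usable_system_ram_gb(8, 2))   # 6 GB left on an 8 GB system
print(usable_system_ram_gb(4, 2))   # 2 GB: effectively unusable for gaming
print(usable_system_ram_gb(16, 2))  # 14 GB: comfortable headroom
```

Which is why 16GB is a sensible floor for an iGPU build even when an equivalent dGPU system would be fine with less.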