Zen 2 APUs/"Renoir" discussion thread

Page 20 - Seeking answers? Join the AnandTech community: where nearly half-a-million members share solutions and discuss the latest tech.

Shivansps

Diamond Member
Sep 11, 2013
Maybe from a design perspective, but not from an availability perspective. It was much easier to get Kaveri on desktop than mobile.

I think that has more to do with demand than anything else. Pre-Ryzen, the CPU performance was just not there, and Raven Ridge and Picasso would have done a lot better if they had allowed for DDR4-3200 like Renoir.
 

Shivansps

Diamond Member
Sep 11, 2013
OK, let's just forget the fact that you get four more CPU cores that are better than the ones in the 3400G. In MT, Renoir was a full 225% faster than the 3400G. I know you are focused on iGPU performance, but there are other things to consider. If they had more fab capacity, maybe we'd see a cheaper 4C, 8 CU chip with high iGPU clocks. I have a feeling that with the next-gen consoles around the corner they don't have much room for a SKU like that.
I think no one is arguing that the 4750G is better than the 3400G in all aspects. As there was no Ryzen 7 on Picasso, there is really nothing to compare it with. That does not mean it makes sense to use it at the $309 price point, outside some niche use cases.

Lower performance than Zen 2, despite being Zen 2. I get it, you meant gaming performance. I don't expect a 4300G to be similar iGPU-wise to a 2400G. Zen 3 CPUs are near, but Zen 3 APUs? If anything, Cezanne looks to be a loser with 8 Vega (still!?) CUs. Otherwise faster 4/8 CPUs, but with a crap iGPU.
The cache has to affect more than gaming performance; they don't put cache in there just for games. Benchmarks are a bad way to test this. Calling the Zen 2 cache "gaming cache" was a bad marketing move.

The 4300G has SMT whereas the 3200G doesn't. It's also Zen 2 vs Zen+. That is a generational upgrade. Minimal upgrade to the iGPU vs the 3200G? Well, I haven't seen any trustworthy reviews, but I'll concede that.
Maybe 2-3 years ago the 4300G could have been considered something else, but in late 2020, with faster 4C/8T CPUs at the $100-130 mark, the 3200G has to be replaced with a 4C/8T APU with faster iGPU performance, and the 4350G is just that. Unless you want to keep having 4C/4T parts at $100 in 2021, there is no way to argue this.

That "new product" rule that you say isn't exactly true either. Sticking with AMD, they raised prices considerably with K7, K8, and dual core K8. They also raised the price a bit with the FX-8150 vs Thuban. That didn't go over well. And more recently they raised prices by far with Zen vs FX. This is AMD's best laptop chip ever and has gained them considerable market share. They are going to charge a premium for it.
Yeah, the market always tends to fix that. Remember the $500+ FX-9370? How long did that last? A week? As long as there are other options, that is a non-issue; the problem with APUs is that there are no other options.
Maybe it would be better not to launch Renoir on desktop at all and use all the supply for mobile. If they are doing it, it is because they are producing more than they can sell in mobile; they are not doing it to lose money, you can be sure of that. And to be fair, as Renoir is OEM only, this is, at least for now, true.

And BTW, they raised the prices, but finding instances where the new product was slower in some way than the older one at the same price is far more difficult. The only instance that I remember is the first FX generation, which was in some cases slower than Thuban. And AMD paid dearly for that.
 

Thunder 57

Platinum Member
Aug 19, 2007
I think no one is arguing that the 4750G is better than the 3400G in all aspects. As there was no Ryzen 7 on Picasso, there is really nothing to compare it with. That does not mean it makes sense to use it at the $309 price point, outside some niche use cases.

I'd have to disagree. It makes for a great business part. Eight strong cores without the need for a dGPU. Perhaps that is why they are OEM only for now.

The cache has to affect more than gaming performance; they don't put cache in there just for games. Benchmarks are a bad way to test this. Calling the Zen 2 cache "gaming cache" was a bad marketing move.

Outside of gaming and things like 7-Zip, I haven't seen much difference. I was against the "GameCache" marketing from the beginning. However, Zen's L3 is still a victim cache, so it doesn't benefit some tasks.

Maybe 2-3 years ago the 4300G could have been considered something else, but in late 2020, with faster 4C/8T CPUs at the $100-130 mark, the 3200G has to be replaced with a 4C/8T APU with faster iGPU performance, and the 4350G is just that. Unless you want to keep having 4C/4T parts at $100 in 2021, there is no way to argue this.

It doesn't "have" to be replaced by anything at the same price. Whether it should or not is what we are talking about. I think AMD is doing well in this segment considering the competition.

Yeah, the market always tends to fix that. Remember the $500+ FX-9370? How long did that last? A week? As long as there are other options, that is a non-issue; the problem with APUs is that there are no other options.
Maybe it would be better not to launch Renoir on desktop at all and use all the supply for mobile. If they are doing it, it is because they are producing more than they can sell in mobile; they are not doing it to lose money, you can be sure of that. And to be fair, as Renoir is OEM only, this is, at least for now, true.

And BTW, they raised the prices, but finding instances where the new product was slower in some way than the older one at the same price is far more difficult. The only instance that I remember is the first FX generation, which was in some cases slower than Thuban. And AMD paid dearly for that.

Oh yeah, the FX-9000 series was flaming garbage. It was still pricey for at least a couple of months, though.

As for the APUs and there being no other options, well, that is why AMD can price them and see what the market will accept. If there were competition, you can bet prices would be lower. Just look at Intel and their quad cores from 2011-2017.
 

software_engineer

Junior Member
Jul 26, 2020
Outside of gaming and things like 7-Zip, I haven't seen much difference. I was against the "GameCache" marketing from the beginning. However, Zen's L3 is still a victim cache, so it doesn't benefit some tasks.

Compilation workloads benefit greatly from the larger L3 cache on Matisse vs Renoir. Tom's Hardware tested the time taken to compile LLVM, a large C++ codebase, in their review of the 4750G. The larger 32MB L3 cache of the 3700X allowed it to complete the compilation in about 77% of the time taken by the 4750G with its 8MB cache (477s vs 623s).

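A quick sanity check on those two times, using only the numbers cited above:

```python
# Compile times from the Tom's Hardware LLVM benchmark cited above
t_3700x = 477  # seconds (Matisse, 32 MB L3)
t_4750g = 623  # seconds (Renoir, 8 MB L3)

ratio = t_3700x / t_4750g
print(f"3700X needs {ratio:.0%} of the 4750G's time")      # ~77%
print(f"4750G takes {t_4750g / t_3700x - 1:.0%} longer")   # ~31%
```

So the 3700X's extra L3 is worth roughly a 31% speedup on this one workload.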
 

AtenRa

Lifer
Feb 2, 2009
This is AMD's best laptop chip ever and has gained them considerable market share. They are going to charge a premium for it.

Renoir replaced Picasso in laptops without a price increase; compare prices of laptops with the 4700U at release against laptops with the 3700U at release and you will see they are almost the same.
That's because there is competition in laptops from Intel's 10th Gen Core family at 10nm.

The 4750G at $300 is a niche product only suited to OEMs, not DIY users.
As I have said before, AMD had better not release Renoir to the desktop DIY market and keep it for OEMs/ODMs.
 

cortexa99

Senior member
Jul 2, 2018

'A520 platform being segmented from the B550 with lack of PCI-Express gen 4.0 support'
That said, is the A520 able to overclock both CPU and GPU? Is that possible?
 

Gideon

Golden Member
Nov 27, 2007
This 1Usmus review of the Renoir 4650G on guru3d is quite interesting, particularly the dGPU gaming results.

Essentially, a highly tuned Renoir (DDR4 @ 4533CL16 1:1) can almost match a highly tuned 3600X (DDR4 @ 3800CL15 1:1) and often beat it significantly in 1% lows and minimums (see Witcher 3, which loves memory bandwidth, but also Battlefield).

Average FPS: (charts not preserved)

1% lows/minimums: (charts not preserved)

Latency: (chart not preserved)

While this might be "meh" as far as Renoir itself goes (nobody is going to use such memory kits on it), this IMO bodes very well for Vermeer (Ryzen 4xxx).

Due to L3 unification, Vermeer essentially doubles the L3 size for games again, and if it can run similar FCLKs of 2166+ MHz (DDR4-4333) with latencies of ~55ns, the gaming performance should make a significant jump.

One has to bear in mind, though, that the chiplet design will add a bit more latency (hopefully <=5ns) and possibly limit FCLK as well. Hopefully not too much.
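For reference, the FCLK-to-DDR4 numbers above are simple arithmetic: in 1:1 (coupled) mode FCLK equals the memory clock, and DDR4 transfers twice per clock. A minimal sketch:

```python
# 1:1 ("coupled") mode: FCLK == MEMCLK, and DDR4 transfers twice per clock
def ddr4_rate_from_fclk(fclk_mhz: float) -> float:
    """DDR4 transfer rate (MT/s) for a given FCLK in 1:1 mode."""
    return 2 * fclk_mhz

print(ddr4_rate_from_fclk(2166))  # 4332 -> the "DDR4-4333" quoted above
print(ddr4_rate_from_fclk(1900))  # 3800 -> the tuned 3600X config
```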

 

geokilla

Platinum Member
Oct 14, 2006
When will Zen 2 APUs be released to retail? I'm building an APU-based rig right now, and I'm forced to buy a B450 motherboard to go with the Ryzen 3 3200G, with plans to upgrade to the Ryzen 5 4600G in the future.
 

jpiniero

Lifer
Oct 1, 2010
When will Zen 2 APUs be released to retail? I'm building an APU-based rig right now, and I'm forced to buy a B450 motherboard to go with the Ryzen 3 3200G, with plans to upgrade to the Ryzen 5 4600G in the future.

Doesn't seem like they will any time soon, or at least not in the US.
 

Shivansps

Diamond Member
Sep 11, 2013
What shocks me is the small difference in iGPU performance when you compare DDR4-3200 to DDR4-4533 in that guru3d review, with the only exception being Battlefield V. There is really not much point in going over 3200 MHz.
 

thigobr

Senior member
Sep 4, 2016
Yes, there are probably other bottlenecks in the GPU itself, because that's a ~40% increase in bandwidth... Only Battlefield comes close to that number; the others are not bandwidth starved.
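The ~40% figure checks out from the raw numbers (theoretical dual-channel DDR4 bandwidth; real sustained bandwidth is lower, but the ratio holds):

```python
def ddr4_bandwidth_gbs(rate_mts: int, channels: int = 2, bus_bytes: int = 8) -> float:
    """Theoretical DDR4 bandwidth in GB/s: transfer rate x 8-byte bus x channels."""
    return rate_mts * bus_bytes * channels / 1000

bw_3200 = ddr4_bandwidth_gbs(3200)   # 51.2 GB/s
bw_4533 = ddr4_bandwidth_gbs(4533)   # ~72.5 GB/s
print(f"{bw_4533 / bw_3200 - 1:.0%} more bandwidth")  # 42%
```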
 

LightningZ71

Golden Member
Mar 10, 2017
I'm not sure what you're seeing, but the graphs tell me that, for a ~40% increase in RAM bandwidth, most of the games show around a 20%+ increase in fps. The review also leaves the iGPU at stock speeds. With overclocking, the performance increases would have been even more pronounced.
 

Shivansps

Diamond Member
Sep 11, 2013
I'm not sure what you're seeing, but the graphs tell me that, for a ~40% increase in RAM bandwidth, most of the games show around a 20%+ increase in fps. The review also leaves the iGPU at stock speeds. With overclocking, the performance increases would have been even more pronounced.

I'm going to repeat myself here... anything over DDR4-3200 becomes very expensive and harder to get. A DDR4-4000 kit can cost as much as 2x the price of a 3200 kit; do you really think it is worth spending that much? I don't even want to know how much a 4533/4600 kit costs... Do you think that is worth it to gain 10 fps on average? You can get an RX 570 for as low as $130.

The iGPU is too small to take full advantage of the extra bandwidth; you can see that going from 2133 to 3200 is a bigger jump in performance for a smaller jump in available bandwidth. And to be honest, even if it were an RDNA2 iGPU with 1050 Ti performance on a 4533 MHz memory kit, it would still be too expensive to consider as long as we can still get 570s.
 

Kenmitch

Diamond Member
Oct 10, 1999
I'm going to repeat myself here... anything over DDR4-3200 becomes very expensive and harder to get. A DDR4-4000 kit can cost as much as 2x the price of a 3200 kit; do you really think it is worth spending that much? I don't even want to know how much a 4533/4600 kit costs... Do you think that is worth it to gain 10 fps on average?

There's always the b-die lotto. My kit is the G.Skill Flare-X that Newegg is currently selling for $100. I'd never pony up for faster RAM, but sometimes the lotto is good to you. I run my kit at 3600 CL14, but it's capable of much higher clocks. The 3700X's memory divider kicking into 1:2 mode kills the fun!

(screenshot: the kit at 4266 MHz)

I can see your argument about buying high-priced faster RAM.
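The divider behavior being described can be sketched roughly like this (a simplified model, not AMD's exact firmware logic; the ~DDR4-3600 1:1 ceiling is an assumption here and varies per chip):

```python
def zen2_uclk(ddr_rate_mts: float, one_to_one_limit: float = 3600) -> float:
    """Memory-controller clock (UCLK) for a given DDR4 rate on Zen 2.

    Up to roughly DDR4-3600 the controller runs 1:1 with MEMCLK;
    above that it drops to 1:2, adding latency that usually erases
    the extra bandwidth.
    """
    memclk = ddr_rate_mts / 2
    return memclk if ddr_rate_mts <= one_to_one_limit else memclk / 2

print(zen2_uclk(3600))  # 1800.0 (1:1 - why 3600 CL14 is the sweet spot)
print(zen2_uclk(4266))  # 1066.5 (1:2 - the "fun killer")
```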
 

LightningZ71

Golden Member
Mar 10, 2017
If you're spending over $100 extra on high speed ram when you have the OPTION of getting a PCIe video card, you're doing it wrong. Full stop.

If you are buying an APU because you don't need graphics performance, then it doesn't matter.

If you're buying an APU because you're building in a mini case that has no PCIe slot, then the option exists to buy better RAM to achieve the best available performance in that form factor. AMD simply offers the best combination of gaming performance in that form factor. You may not like their decisions, but that's what they have decided to offer.

Personally, I DO wish they had offered a six-core 4600/4650 version with an 8 CU, full-frequency iGPU, as I can see a market for that product. But they didn't. They know their yield splits and they know what they want/need to charge for it.
 

TheGiant

Senior member
Jun 12, 2017
This 1Usmus review of the Renoir 4650G on guru3d is quite interesting, particularly the dGPU gaming results.

Essentially, a highly tuned Renoir (DDR4 @ 4533CL16 1:1) can almost match a highly tuned 3600X (DDR4 @ 3800CL15 1:1) and often beat it significantly in 1% lows and minimums (see Witcher 3, which loves memory bandwidth, but also Battlefield).

(charts snipped)
While this might be "meh" as far as Renoir itself goes (nobody is going to use such memory kits on it), this IMO bodes very well for Vermeer (Ryzen 4xxx).

Due to L3 unification, Vermeer essentially doubles the L3 size for games again, and if it can run similar FCLKs of 2166+ MHz (DDR4-4333) with latencies of ~55ns, the gaming performance should make a significant jump.

One has to bear in mind, though, that the chiplet design will add a bit more latency (hopefully <=5ns) and possibly limit FCLK as well. Hopefully not too much.
This is good news: you can tune Renoir much better than other Ryzens.
Gaming minimums in critical situations are all about latency.
While you can't match Intel's offerings here, this is a big win.
The 4000 lineup (Ryzen 3) looks like a good improvement.
 

NTMBK

Lifer
Nov 14, 2011
From HotChips- I think this right here is why they stuck with Vega instead of RDNA, because they wanted a tiny die area:

(Hot Chips slide not preserved)


And from the Q&A at the end:

03:29PM EDT - Q: 8 cores in this size, trade offs? A: We wanted to provide the 8 core perf, and as we analyzed with 7nm that if we took care we could enable 8 cores. Lots of content creation takes advantage of 8 cores. These come in handy with high perf applications. On the GPU, we were also careful in perf/watt and perf/mm2 that we can drive. We figured out in 7nm we could get higher frequency, so we balanced resources and that's why we went for 8 CUs. Vega with mem bandwidth gives better UX in 15W
 

LightningZ71

Golden Member
Mar 10, 2017
It makes sense. They had to keep Renoir as compact as possible given the cost and yield nature of N7 during development. Going to 8 cores took up a LOT of transistors and die area. As we have seen in mobile, Vega 8 is a nice improvement with respect to performance at each price tier, and Renoir is mobile first. I don't think that RDNA brings enough to the table at N7 to make it worth evicting Vega. At N5, things are different. Two main issues arise there: they would have to port Vega to N5, which will not be a trivial or cheap task when they are already doing development for RDNA2 on that node; and the larger transistor cost of RDNA2 CU equivalents will not be as big an issue at N5, as there will be a bigger transistor budget per square mm.

Remember, AMD doesn't have unlimited development funds, they still have to choose carefully and make trade-offs for each node/platform/generation.
 

leoneazzurro

Senior member
Jul 26, 2016
Probably there is also another reason: Vega was already a proven architecture for mobile use (they had already managed to cut power substantially in Picasso), while RDNA1 needed more optimization at the time Renoir/Cezanne were on the drawing board. It makes a lot of sense to cut costs when the extra performance would have been held back anyway by the limited bandwidth of DDR4 systems. Meanwhile, this has allowed AMD to optimize the RDNA2 architecture better so it can be used in future APU iterations.
 

eek2121

Platinum Member
Aug 2, 2005
When will Zen 2 APUs be released to retail? I'm building an APU-based rig right now, and I'm forced to buy a B450 motherboard to go with the Ryzen 3 3200G, with plans to upgrade to the Ryzen 5 4600G in the future.

I am a firm believer that we will never see a 4000G DIY release. Instead AMD will release 5000G APUs with Zen 3.

Probably there is also another reason: Vega was already a proven architecture for mobile use (they had already managed to cut power substantially in Picasso), while RDNA1 needed more optimization at the time Renoir/Cezanne were on the drawing board. It makes a lot of sense to cut costs when the extra performance would have been held back anyway by the limited bandwidth of DDR4 systems. Meanwhile, this has allowed AMD to optimize the RDNA2 architecture better so it can be used in future APU iterations.

Currently, Vega is significantly the more power-efficient of the two. RDNA2 should improve the situation, but some additional optimization needs to be done. That being said, I personally believe RDNA in an APU is a mistake. The APU should be compute oriented.
 

jpiniero

Lifer
Oct 1, 2010
I am a firm believer that we will never see a 4000G DIY release. Instead AMD will release 5000G APUs with Zen 3.

Laptop demand is nutty right now. It's comparable to the bitcoin mining era with GPUs. They could still release Renoir to DIY but it won't be until things calm back down.
 

leoneazzurro

Senior member
Jul 26, 2016
Exactly, RDNA2 will also need further optimization for use in APUs, of course the console experience will be quite precious in this regard. About APU needing to be more compute oriented, I don't know. Especially with the mobile market being the driving force behind them: video decoding/encoding and casual gaming are still the primary scope of these parts. Maybe ML applications - for these I see more and more space in the future.