Zen 2 APUs/"Renoir" discussion thread


jpiniero

Lifer
Oct 1, 2010
14,584
5,206
136
Quite frankly, the difference in performance between a PCIe 3.0 x8 setup and an x16 setup was tested on TechPowerUp some time ago and came out to a 2-3% difference with a 2080 Ti. And people buying a 2080 Ti will quite probably not settle for an AMD APU for their gaming rig.
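For rough numbers, here is the raw link bandwidth at stake — a minimal sketch using the standard PCIe 3.0 effective rate (~0.985 GB/s per lane after 128b/130b encoding), ignoring protocol overhead beyond the encoding:

```python
# Effective PCIe 3.0 bandwidth per lane: 8 GT/s with 128b/130b encoding,
# i.e. 8 * (128 / 130) / 8 bits-per-byte ~= 0.985 GB/s per lane.
PCIE3_PER_LANE_GBPS = 8 * (128 / 130) / 8

for lanes in (8, 16):
    print(f"PCIe 3.0 x{lanes}: {lanes * PCIE3_PER_LANE_GBPS:.1f} GB/s")

# Output:
# PCIe 3.0 x8: 7.9 GB/s
# PCIe 3.0 x16: 15.8 GB/s
```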

Games really aren't that sensitive to cache size. I suspect that, despite what it reports, it's not running at the frequency they say it is.
 

leoneazzurro

Senior member
Jul 26, 2016
920
1,450
136
I am confused. How does cache size relate to PCI-E speed and number of lanes?
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
The only case where I've seen it demonstrated that the difference between PCIe 3.0 and 4.0 had any significant impact was in the 5500 parts, particularly the 4GB parts at higher texture qualities and resolutions. The 5500 is capable of operating at PCIe 4.0, but only has an x8 bus, so it's a fairly large difference in bandwidth for a card that's already "tight" on VRAM.
 

jpiniero

Lifer
Oct 1, 2010
14,584
5,206
136
The only case where I've seen it demonstrated that the difference between PCIe 3.0 and 4.0 had any significant impact was in the 5500 parts, particularly the 4GB parts at higher texture qualities and resolutions. The 5500 is capable of operating at PCIe 4.0, but only has an x8 bus, so it's a fairly large difference in bandwidth for a card that's already "tight" on VRAM.

We're talking about 3.0 x8 versus 3.0 x16. Turing only supports PCIe 3.0.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
Fair enough, but it's still not that big of a deal in most situations. For GPU commands, x8 PCIe 2.0 was more than enough for most cards out there; it's the texture transfers across the PCIe bus that make up the bulk of the traffic. So, if you're trying to run the absolute highest texture quality (which means larger sizes for each texture) and you're VRAM-limited at whatever resolution you're running, then that effective bus speed/capacity is going to have a big impact.

So a 2080 Ti, with its massive VRAM, isn't going to have much of an issue with the smaller bus. It's the smaller-VRAM cards that seem most sensitive. It's also going to have a bigger impact if the game you're playing has to swap textures in and out a whole lot. To give you an example of the issue, large open-world games with a very long draw distance need to keep massive amounts of textures handy. The Xbox Series X and the PS5 are going to stream massive amounts of data from their SSDs because, even with a combined pool of 16GB of RAM, that's still not enough to keep all the textures they need ready to go. If developers port those kinds of games to the PC and try to keep those same styles of scenes, they're going to be saturating the PCIe bus as well.

So, PCIe 3.0 or 4.0, x8 or x16, what matters most is how much texture data the game needs moved from storage or RAM to the video card's local VRAM, and how well that bus can keep up. It's going to be situational between different games and applications.
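To make that concrete, a small hypothetical sketch; the 2 GB/s of texture re-streaming is an invented illustrative figure, not a measurement from any real game:

```python
# Hypothetical workload: assume the game has to re-stream 2 GB/s of
# textures over the PCIe link because they don't all fit in VRAM.
SPILL_GBPS = 2.0  # assumed figure for illustration only

LINKS_GBPS = {
    "PCIe 3.0 x8": 7.9,
    "PCIe 3.0 x16": 15.8,
}

for name, capacity in LINKS_GBPS.items():
    share = SPILL_GBPS / capacity
    print(f"{name}: texture streaming uses {share:.0%} of the link")

# PCIe 3.0 x8: texture streaming uses 25% of the link
# PCIe 3.0 x16: texture streaming uses 13% of the link
```

The same spill that barely registers on a wide link eats a meaningful slice of a narrow one, which is why the small-VRAM cards show it first.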
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
We'll see.

If you go back to the results, there are iGPU comparisons.

Except it could also be a good CPU comparison. In Overwatch it's getting over 60 fps, and in League of Legends, over 100. Maybe the faster CPU is also part of why the Vega 6 in Ryzen 4000 is competitive with the Vega 11 in Ryzen 3000, even while using slower memory.

An alternate explanation is that Cerny's claim about clock speeds playing a bigger part than CU count is right. It does make sense; clock speed benefits everything, while extra CUs only benefit a subset of workloads.

CU count affects compute performance, and some game shaders lean on that. What you are going to see, and it is actually very clear in these results, is that simpler games like Overwatch and LoL take advantage of the extra clock speed, while more shader-heavy games are going to need more than Vega 6 at 1.7 GHz to match Vega 11 at 1.4 GHz. These results seem to reflect that, and the results we already know of an overclocked 3200G vs. a stock 3400G support it as well. CS:GO is another game that I expect will run better on the 4300G.
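As a rough sanity check, here is the usual Vega peak-FP32 arithmetic (64 shaders per CU, 2 FLOPs per clock); the 4300G clock comes from the leaks being discussed, so treat it as an assumption:

```python
def vega_peak_gflops(cus: int, clock_ghz: float) -> float:
    """Peak FP32: CUs x 64 shaders/CU x 2 FLOPs/clock x clock (GHz)."""
    return cus * 64 * 2 * clock_ghz

configs = [
    ("Vega 11 @ 1.4 GHz (3400G)", 11, 1.4),
    ("Vega 6 @ 1.7 GHz (4300G, leaked clock)", 6, 1.7),
]

for name, cus, ghz in configs:
    print(f"{name}: ~{vega_peak_gflops(cus, ghz):.0f} GFLOPS")

# Vega 11 @ 1.4 GHz (3400G): ~1971 GFLOPS
# Vega 6 @ 1.7 GHz (4300G, leaked clock): ~1306 GFLOPS
```

On paper the clock bump alone doesn't close the throughput gap, which lines up with the shader-heavy games needing more than Vega 6 at 1.7 GHz.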

At the end of the day, with these results, the 4300G is a good 3200G replacement. But at $150? It is actually replacing the 3400G, and that is no good at all. I hope the press reflects that, because it is unacceptable.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
You say that as if the 4300g doesn't bring anything to the table that is better than the 3200g, which, when you consider the considerably faster CPU, is clearly wrong. The market has been resegmented. If you just need the CPU performance of the 3200g but don't need the GPU performance, they have the Athlon Gold 3150G. If you need the GPU performance of the 3200g, you have to step up a bit to the 4300g and pay more. There is benefit to the 4300g over the 3200g and the 3400g, just not in the GPU in SOME cases. And in other cases, the 4300g outperforms the 3400g.
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
You say that as if the 4300g doesn't bring anything to the table that is better than the 3200g, which, when you consider the considerably faster CPU, is clearly wrong. The market has been resegmented. If you just need the CPU performance of the 3200g but don't need the GPU performance, they have the Athlon Gold 3150G. If you need the GPU performance of the 3200g, you have to step up a bit to the 4300g and pay more. There is benefit to the 4300g over the 3200g and the 3400g, just not in the GPU in SOME cases. And in other cases, the 4300g outperforms the 3400g.

At $150 it has to be better than the 3400G, and with the results that we have, it is not. At this point I'm not even sure it has a worthwhile CPU uplift over the 3400G. And seriously, don't even suggest replacing the 3200G with the Athlon 3150G at $100.

"resegmented" are you an Argentinian politician? because you certanly talk like one.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
I must doubt your sanity when you are attempting to tell me that AMD should be charging less than they want for the 3300X ($120) for the 4300g, which is offering very similar overall system performance AND a competent iGPU with the same or better performance as the 3200g. Are you really saying that you want AMD to give you what is essentially the capabilities of a GT 1030 for $20? That's horse excrement on a grand scale.

AMD has NOT discontinued the 3200g/3400g products. They still exist at retail, and at their normal pricing. If YOU value the blistering GPU performance of Vega 11, it's still there for you to purchase, and at $149. The 4300g isn't a direct replacement for it, nor has it been marketed as such. It fills a spot in their lineup, just like the Athlon Gold and Silver series, just like the XT products. They make a processor, they set a price for it. Personally, I think that improved CPU performance for a mild regression in GPU performance in certain situations is a fair trade at the same price. I certainly think that their prices for the 4600 and 4700 are just fine where they are.

What you wanted was for AMD to essentially give you more CPU and more GPU for free in a bottom-of-the-stack product as compared to what they used to sell in a top-of-the-stack product in consecutive years. That just doesn't happen in the industry, and AMD has very nearly done just that here. If you don't like it, you are free to go over to their competition and buy an equivalent i3 from Intel... Oh wait, you can't. The Vega 6 has it all over the UHD 630 in the i3 10th-gen lineup, and, at the same price, Intel doesn't have any sort of significant lead in CPU performance either. AMD is STILL offering you greater value for the dollar in this segment of the market. Intel offers you DDR4-2666, locked; AMD offers you specs for DDR4-3200, unlocked to go as high as you can manage if you want to. AMD offers you CPU-direct NVMe performance; Intel doesn't. Both offer PCIe 3.0 here, and Intel has the same lead that they had against the 3400g, with an actual x16 link instead of AMD's x8 for the dGPU slot.
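For what the memory-spec gap is worth, a quick sketch of peak dual-channel bandwidth at the two JEDEC speeds (theoretical peaks; real throughput is lower):

```python
# Peak dual-channel DDR4 bandwidth: MT/s x 8 bytes/transfer x 2 channels.
def ddr4_dual_channel_gb_s(mt_per_s: int) -> float:
    return mt_per_s * 8 * 2 / 1000

for speed in (2666, 3200):
    print(f"DDR4-{speed}: {ddr4_dual_channel_gb_s(speed):.1f} GB/s peak")

# DDR4-2666: 42.7 GB/s peak
# DDR4-3200: 51.2 GB/s peak (~20% more, which mostly benefits the iGPU)
```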

All in all, the value at that price is there. It may not be there for YOU, but it's there for the broader market.
 

Geranium

Member
Apr 22, 2020
83
101
61
A fair point (sort of), but if you compare AMD's net margin today to, I don't know, four years ago . . . look, they lost a lot of money back then. They are much-improved and are finally healthy, and they did it with prices that are a lot higher than what they used to charge in 2015-2016 on their new uarch products (Kaveri, Godavari, Carrizo). The zombie pricing on their 2014-and-earlier designs was all over the map. A new 8370 or 8370e was actually kind of expensive (okay only $200 but still), and 9590s still commanded prices over $300 until Ryzen came out so let's be clear, AMD wasn't selling $200-and-below across their entire lineup.

All their new stuff into which they plowed at least some development money (admittedly, not a lot, since they were risking so much on Zen) was in that basic price range. The A10-7890K launched at $170, and it was the fastest FM2+ APU they ever released. Today you can pay $749 for an AM4 CPU, and in October that price may be moving up.

AMD has significantly moved their price stack upwards. It's perfectly okay for consumers to say "enough". It's not entitlement or anything of the sort, and why people continue to be offended that anyone would suggest that AMD just stick to the pricing from Zen1 or Zen2 is beyond me.
Kaveri and Godavari were not AMD's flagship chips; Vishera was, and the top Vishera chip, the FX-9590, cost $1000 (later reduced to $230). The new Ryzen 9 3950X costs less than that, and it is much faster.

Yeah lemme just go buy a TX2-based PC. That'll work real well oh wai-
Don't you know ARM is the future!!

That's what happens when you make generalizations. You get things very wrong. I don't know whom you think makes such statements, but I certainly haven't.
I am not accusing you, but reading the comments on every AMD product review gives me that vibe.

We have at least one user here who uses and (apparently) sells/services APUs to a lot of people in a "down" market where APUs are popular. The way AMD is changing their product stack is apparently unfavorable to him and his customers, especially when you consider the influence of importers that control pricing and availability to a greater extent than would be seen in North America or the EU. His perspectives are his own, and while I often disagree with him, at least he's consistent.
Maybe he should start to offer Intel/Nvidia combinations. Also, why is he selling AMD systems in the first place if AMD systems don't sell like Intel/Nvidia systems?
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
I must doubt your sanity when you are attempting to tell me that AMD should be charging less than they want for the 3300X ($120) for the 4300g, which is offering very similar overall system performance AND a competent iGPU with the same or better performance as the 3200g. Are you really saying that you want AMD to give you what is essentially the capabilities of a GT 1030 for $20? That's horse excrement on a grand scale.

AMD has NOT discontinued the 3200g/3400g products. They still exist at retail, and at their normal pricing. If YOU value the blistering GPU performance of Vega 11, it's still there for you to purchase, and at $149. The 4300g isn't a direct replacement for it, nor has it been marketed as such. It fills a spot in their lineup, just like the Athlon Gold and Silver series, just like the XT products. They make a processor, they set a price for it. Personally, I think that improved CPU performance for a mild regression in GPU performance in certain situations is a fair trade at the same price. I certainly think that their prices for the 4600 and 4700 are just fine where they are.

What you wanted was for AMD to essentially give you more CPU and more GPU for free in a bottom-of-the-stack product as compared to what they used to sell in a top-of-the-stack product in consecutive years. That just doesn't happen in the industry, and AMD has very nearly done just that here. If you don't like it, you are free to go over to their competition and buy an equivalent i3 from Intel... Oh wait, you can't. The Vega 6 has it all over the UHD 630 in the i3 10th-gen lineup, and, at the same price, Intel doesn't have any sort of significant lead in CPU performance either. AMD is STILL offering you greater value for the dollar in this segment of the market. Intel offers you DDR4-2666, locked; AMD offers you specs for DDR4-3200, unlocked to go as high as you can manage if you want to. AMD offers you CPU-direct NVMe performance; Intel doesn't. Both offer PCIe 3.0 here, and Intel has the same lead that they had against the 3400g, with an actual x16 link instead of AMD's x8 for the dGPU slot.

All in all, the value at that price is there. It may not be there for YOU, but it's there for the broader market.

I'm sorry, what? Wait, what? Hell no.

So wanting more performance at the same price in a new generation of chips is crazy now? Wow, just wow. You would make a great politician.

Let's say tomorrow Intel comes forward and says, "You know what? Pentiums are now $100, the starting i3 $150, and the starting i5 $200," and they are inferior in some way to the SKUs at $100, $150, and $200 they are replacing. That sounds great to you?

BTW, the "AMD has not discontinued the 3200/3400g products" is a huge lie when they cant be used on the newer motherboards. They will die off after Renoir launch sooner than later.
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
I also question your sanity, Shivansps. To imply that Renoir is an inferior product to Picasso is preposterous.

If AMD positions the 4300G to replace the 3400G then, according to the results we saw, it is. Renoir is not inferior to Picasso; it's the pricing that makes it look that way.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,674
3,796
136
I'm sorry, what? Wait, what? Hell no.

So tomorrow Intel comes forward and says, "You know what? Pentiums are now $100, the starting i3 $150, and the starting i5 $200," and they are inferior in some way to the SKUs at $100, $150, and $200 they are replacing. That sounds great to you?

Can someone please tell me when it became OK to have a replacement SKU at the same price with lower performance in ANY way? Can someone please tell me why the Picasso refresh launch looks great compared to this?

And can someone please explain to me how wanting a new-gen APU launch to give more performance for the same money is WRONG in any way?!

I'll recommend you go into politics; you certainly have the traits.


BTW, the "AMD has not discontinued the 3200/3400g products" is a huge lie when they cant be used on the newer motherboards. They will die off after Renoir launch sooner than later.

So you are suggesting Renoir APUs have lower performance than Picasso? :rolleyes:

Maybe, by a few percent, in a few edge cases involving the GPU only, but I doubt that. You are getting more and faster CPU cores. I guess that also lowers performance. Are you really so pissed that AMD cut a few CUs that you have to keep posting about how "unacceptable" that is? Have you seen the iGPU clocks? Up to 2.1 GHz? That should more than make up for the CUs. I don't understand why you are so upset over the loss of 2-3 CUs.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,674
3,796
136
If AMD positions the 4300G to replace the 3400G then, according to the results we saw, it is. Renoir is not inferior to Picasso; it's the pricing that makes it look that way.

There is no pricing. They are currently OEM-only. Otherwise it's rumor/speculation. Sure, I wish they were at retail, and it sounds like they will be at a later date. You are making up any reason you can to complain. Don't like it? Buy something from the competition.
 

IntelUser2000

Elite Member
Oct 14, 2003
8,686
3,785
136
You are making up any reason you can to complain. Don't like it? Buy something from the competition.

That's a lazy way of shutting down a discussion.

@Shivansps may be judging the product too early, but considering the majority of leaks surrounding products turn out to be true, he has a legitimate concern.
 

Shivansps

Diamond Member
Sep 11, 2013
3,851
1,518
136
So you are suggesting Renoir APUs have lower performance than Picasso? :rolleyes:

Maybe, by a few percent, in a few edge cases involving the GPU only, but I doubt that. You are getting more and faster CPU cores. I guess that also lowers performance. Are you really so pissed that AMD cut a few CUs that you have to keep posting about how "unacceptable" that is? Have you seen the iGPU clocks? Up to 2.1 GHz? That should more than make up for the CUs. I don't understand why you are so upset over the loss of 2-3 CUs.

The results THAT WE KNOW OF say the 4300G is inferior to the 3400G most of the time. For a new product that's not acceptable; it never was, in any case, EVER. Even the CPU results are worse than expected, so the CPU gains may not be that great either.

So unless new results come forward that contradict the older ones, I don't see how any of this would change. So far it looks like a bad replacement for the 3400G, but not because the product is bad; the pricing is. At $110 it would be a great 3200G replacement, and that's already a 10% price increase, and the $120 3300X will most likely be faster in CPU.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,674
3,796
136
That's a lazy way of shutting down a discussion.

@Shivansps may be judging the product too early, but considering the majority of leaks surrounding products turn out to be true, he has a legitimate concern.

I didn't shut anything down. There is still a discussion. I just don't get the argument. If AMD had limited Picasso to 8 CUs, would everything be OK right now? We wouldn't even be having this discussion. Maybe AMD will learn from their "mistake".

The results THAT WE KNOW OF say the 4300G is inferior to the 3400G most of the time. For a new product that's not acceptable; it never was, in any case, EVER. Even the CPU results are worse than expected, so the CPU gains may not be that great either.

So unless new results come forward that contradict the older ones, I don't see how any of this would change. So far it looks like a bad replacement for the 3400G, but not because the product is bad; the pricing is. At $110 it would be a great 3200G replacement, and that's already a 10% price increase, and the $120 3300X will most likely be faster in CPU.

For starters, you're getting 4C/8T of a higher-performing CPU. Yeah, it was crappy that AMD disabled SMT for segmentation, but Intel has been doing that forever. CPU results worse than expected? Let's see the real benchmarks before we get too upset. Shouldn't they be in line with the Zen+ to Zen 2 gains? Clearly those were acceptable.

Now you have an iGPU that should be on par with or faster than the previous model's. Overall that makes the product better. They officially increased the supported memory speeds as well.

At $110 it would kill the 3300X. When $10 less buys you a barely slower CPU plus a very usable iGPU, guess what people are going to choose? Intel sold us 4C CPUs year after year with minor improvements, almost always far smaller than Picasso to Renoir. Was that also unacceptable? Sure. Like I said earlier, you don't have to like it. But that's the way things are. In the end, the market will determine what is acceptable or not. Vote with your wallet.
 

amd6502

Senior member
Apr 21, 2017
971
360
136
So unless new results come forward that contradict the older ones, I don't see how any of this would change. So far it looks like a bad replacement for the 3400G, but not because the product is bad; the pricing is.

I would calm down, because I'm almost sure this will work out well for you. The 3400g will NOT be discontinued for a while (it should be available through the end of 2021, if not longer) and it will be discounted significantly. This means you can get your overclockable 11-CU part at a lower price if what you really want is max iGPU.

First, 4300g is ABSOLUTELY NOT a replacement for 3400g.

The 3400g is a higher main bin of a ~5-billion-transistor 12nm die.

The 4300g is a lower salvage bin of a ~10-billion-transistor 7nm die that is a little over twice as big as a Matisse chiplet (156mm²). Quad-core salvage from dual-CCX or 8c dies is a somewhat less common item, as we've consistently seen all the way back to Summit Ridge (or even earlier, with Vishera and Orochi). 6c and 8c Renoir desktop parts are going to be produced at ~10x the quantity of the 4c parts, so the pricing cannot be as aggressive on the 4c parts. (The sweet spot for value will therefore be the 6c parts.)

The 3400g, however, can be cheaply and abundantly produced, because it is not some rare salvage die and because it has only half the transistors of Renoir. I would go there if you want quad-core value, and be happy with its extra 3 CUs.

Second point: the -300g numbering already implies it is not necessarily superior to a -400g product; it just happens to be superior in the CPU department because of doubled-up FPU units, greater cache, and a third AGU.

I would think a 3300g Picasso bin would also do quite well for budget APU gamers. Perhaps something like 3.5-4.0 GHz, 4c/8t, with 8 CUs.
 

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
Whoa, who said that ASMedia is discontinuing production of B450/X470 boards anytime soon?! You can go out, right now, and purchase any number of boards that support the 3200g/3400g. Heck, the X570 was released with 3200g/3400g support and is still in volume production. Don't hand me that steaming-pile-of-solid-animal-waste argument that just because you can't OFFICIALLY drop a 3400g on a B550 board it's dead. Baloney.

The 3400g is STILL a product, available for sale, at the previous price point. AMD has taken NOTHING away from you.

The 4300g is a new product. It has an all-around better CPU section than the 3400g. Its Vega 6 iGPU lags the Vega 11 iGPU in most GPU-limited games, especially when both are kept at STOCK clocks and in games that are compute-heavy, but not in all games. That's called a wash.

The benchmarks that you've seen are interesting, and concerning, but don't really match up to what we're seeing with the same cores in mobile platforms for some items. It is known that some of the tech in the PRO SKUs increases memory latency, as does how the chip senses its load and manages its multiple internal clock domains. AMD's own rep has been on Reddit discussing how the IF and memory-controller clocks and multiplier ratios adapt dynamically to sensed load and how that can affect benchmarks.

As I've said before, we're going to have to wait and see what they look like when they finally get into the hands of enthusiasts that can thoroughly put the consumer versions of these chips through their paces.

And, to the comment that the 4300g should embarrass the 3300X in consumer systems, I highly doubt it. I expect them to be roughly similar in single-thread benchmarks and the 4300g to fall a bit behind in multi-thread. There is no reason to expect otherwise.
 

NostaSeronx

Diamond Member
Sep 18, 2011
3,686
1,221
136

LightningZ71

Golden Member
Mar 10, 2017
1,627
1,898
136
IMHO, this is the filler till AMD releases a small Zen 2 6nm APU

For this:

Now, I just need to find out if they will turn off SMT for the 6nm version or have it on.
Isn't that just a more deeply disabled 3200 die?
 