Speculation: Ryzen 3000 series


What will Ryzen 3000 for AM4 look like?


  • Total voters
    230

Zapetu

Member
Nov 6, 2018
94
165
66
A local AMD overclocker, insider, etc. said this "leak" was utter nonsense and re-confirmed that via his sources at AMD... This is the same guy who had reported a 7nm 8c ES boosting to 4.5GHz and benchmarks run on an OCed ES @ 5GHz with an AIO.

I would also like to know what parts of the leak he thought were utter nonsense. The chiplet approach seems very plausible to me and, as Vattila pointed out, even Kyle Bennett from HardOCP thinks that the AdoredTV video is mostly true. I don't really care what people think about cutting IO dies into smaller pieces or other wild speculation like that, as long as the basic concept is true. So far I give 99.9% more credibility to AdoredTV and Kyle Bennett than to some totally anonymous source. At least AdoredTV had the guts to come out with that story. Sure, their (Jim/Kyle) sources might also not be credible, but it still all makes so much sense. This leak could still turn out not to be true and AMD could have gone with a more traditional monolithic design. For now, I find the chiplet route much more attractive.
 
Last edited:
  • Like
Reactions: spursindonesia

exquisitechar

Senior member
Apr 18, 2017
655
862
136
Wasn't it @Mockingbird who said that there was a 4.5GHz ES, anyway? This person said the same thing? 5GHz with an AIO sounds quite interesting, but I'd like to see the source. If it's out there, I'm surprised this isn't widespread already with all the enthusiasts searching around for any new piece of information...
 
  • Like
Reactions: happy medium

DownTheSky

Senior member
Apr 7, 2013
787
156
106
6c/12t @ $99, I'm not buying it. It would cannibalize all the upper segments, since the average user doesn't need much more than that. Also, a 16c Ryzen 9 doesn't leave much room for upgrades over the next 2 generations.

But this is AMD we're talking about. Recent CPU history/roadmaps have taught us they like going all in. What would make this leak more plausible is if the PS5 is a 16c monster.
 

exquisitechar

Senior member
Apr 18, 2017
655
862
136
A "premium" mainstream desktop 16c CPU at around $500 is something that would have hardly surprised me even before the leak (especially with the leaked 10c Comet Lake), but that $99 6c/12t would indeed really bring down ASPs. Even that's overkill for most people...
 
Mar 11, 2004
23,031
5,495
146
I see that exactly the other way around. The 6-core parts with 2 defective cores would go into consoles. I'm still of the opinion that a 4c/8t Zen 2 @ 3GHz is more than enough for consoles. But I see your point. A top 12c Ryzen could use 2 chiplets that have 1 or 2 defective cores but are high leakage, i.e. they clock high, so not the best option for servers or consoles. But if that much binning is needed, 7nm at TSMC must yield very poorly.

EDIT: In fact, the next consoles will most likely still be 14nm. I could see the GPU being 7nm, but even that would be an issue price-wise.

Unless they're a mix of chips, the new consoles won't be 14nm. They for sure won't be on the GPU side, as Navi was engineered for 7nm (and I believe Lisa or someone has outright said Navi will be 7nm only). I suppose they might use an older Ryzen for the CPU, but I'm somewhat doubtful, as I think they'll be rocking a lower-core-count, lower-clocked version of Ryzen 2. A big reason is the power constraints of a console; plus it should help them with cooling, power supply, and overall size.

I think Sony is going to be fine losing money on the PS5 initially (the reports about them saying to expect lower income from gaming through 2021 I think is because they'll take low margins or even a loss on the hardware early on to speed up adoption rate). I think they want more power from the outset, and possibly might ditch the Pro model for a unified development.

My guess is 8c/16t monolithic CPU on a revamped 1x 8c CCX layout on 7nm and a direct shrink to 12nm for the APU.

The chiplet approach, while incredibly smart and full of upsides, has a few downsides that can be very relevant in a desktop scenario.

Threadripper will just be derived from EPYC, binning the chiplets for speed and pairing four of them with a salvaged motherchip (you only need 1/4 of a full motherchip's capabilities for TR) to help improve yields on a big part done on a stable process tech.

The APU will be on 12nm while they work on the next design on 7nm, also waiting for more capacity on the foundry side.

I don't see them going with a different core design at this point. The benefits of 7nm and other aspects (memory controller) will more than make up for the CCX deficiencies. In the future, I could see things split (with AMD keeping the consumer chips on 7nm while EPYC and Threadripper go to 5nm or 7nm EUV).

If it sounds too good to be true, it probably isn't.

If we got a ~10% IPC improvement and a ~10% clock rate improvement, I'd think that'd be the height of reasonable expectations.

So ~4.5 GHz boost clock on something that is very similar (IPC-wise) to the Skylake architecture in most workloads, with various outliers favouring one or the other.

I'd say 15-20% clock speed improvement would be reasonable (so 4.8-5.0GHz I think is possible - it very well might not be optimal from an efficiency standpoint, but I think it'll be possible). I think that for a couple of reasons. Not just the process, even though I think that'd probably be worth 10-15% on its own. But I think Ryzen 1 was limited in clock due to other issues, ones that they could resolve in Ryzen 2, bringing more clock speed improvement.

Now if this were say Intel where they already had clocks pushed, yeah, I think 10% would be more reasonable.

A Ryzen 3000 that pushes all 16 cores at > 5 GHz won't exist.

The socket and motherboard simply won't be designed with that kind of power consumption in mind.

A Ryzen 3000 that can turbo a core or two to that kind of speed may* exist - but its power requirements would be within socket limits (say 125W "official").

*Although I'm skeptical.



edit: For reference, a 2950X *will just about* get to ~4.0 GHz on all 16 cores at 180W. So, going to 7nm... that might become, say, 125W (a guess at around 30% power improvement at iso-clock). But you'd at least double that power going from 4.0GHz to 5.0GHz (based on power increasing approximately with the cube of clock), so north of 250W I'd guess.

Much guessing, little data - take with a truck load of salt.
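Those numbers can be reproduced with a quick back-of-the-envelope script. This is purely a sketch of the post's own assumptions (180W baseline at ~4.0GHz all-core, ~30% power saving at iso-clock on 7nm, power scaling roughly with the cube of clock); none of these figures are measured data:

```python
# Back-of-the-envelope power extrapolation from the post's assumptions.
# Baseline: ~4.0 GHz all-core at 180 W on 12nm (Threadripper 2950X).

base_power_w = 180.0       # assumed 16-core all-core power at ~4.0 GHz
iso_clock_saving = 0.30    # assumed 7nm power saving at the same clock

power_7nm_4ghz = base_power_w * (1 - iso_clock_saving)

# Assume power scales roughly with clock cubed (f * V^2, with V tracking f).
clock_ratio = 5.0 / 4.0
power_7nm_5ghz = power_7nm_4ghz * clock_ratio ** 3

print(f"~4.0 GHz on 7nm: {power_7nm_4ghz:.0f} W")   # ~126 W
print(f"~5.0 GHz on 7nm: {power_7nm_5ghz:.0f} W")   # ~246 W
```

That lands at roughly 126W at iso-clock and ~246W at 5.0GHz, consistent with the "north of 250W" guess above.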

That's not a fair comparison as the Threadripper is using multiple dice, and a lot of the power consumption is coming not from the core.

https://twitter.com/AdoredTV/status/1070345089300475904

Adored on the leak...




Well, he acknowledges that the clocks could be made up.

I'm not sure if that's what he's talking about or if it's the odd "multiplier" bits.
 
  • Like
Reactions: exquisitechar
Mar 11, 2004
23,031
5,495
146
6c/12t @ $99, I'm not buying it. It would cannibalize all the upper segments, since the average user doesn't need much more than that. Also, a 16c Ryzen 9 doesn't leave much room for upgrades over the next 2 generations.

But this is AMD we're talking about. Recent CPU history/roadmaps have taught us they like going all in. What would make this leak more plausible is if the PS5 is a 16c monster.

Except AMD might be going for market share, and big increases there (and big sales figures) will look good to investors, which they'll need to help them fund future development. I think AMD sees Intel's issues and realizes it'd be good to be aggressive. I'm not sure that will be a problem, as I think we'll be on 7nm for consumer stuff for a while. And Zen 3 will bring other things (DDR5, PCIe 5) that will make it a worthwhile upgrade. They can also improve other parts if they're using a controller chip, where they can drop it to 12/10/7nm later. And there are other things they can do (integrate co-processors and the like).

I think a big reason why AMD is going to chiplets is that they can space out development. One year it's new cores, then another year it's the I/O stuff, another they do platform and process on the I/O, then process on the cores, then core tweaks, etc. This way they keep bringing iterative improvements, and with the slowed development of process tech, it helps them keep fresh products even when they can't easily get improvements via process updates. If necessary they can do bigger ones every so often.
 
Last edited:

Zapetu

Member
Nov 6, 2018
94
165
66
6c/12t @ $99, I'm not buying it. It would cannibalize all the upper segments, since the average user doesn't need much more than that. Also, a 16c Ryzen 9 doesn't leave much room for upgrades over the next 2 generations.

I have to admit that it does sound too good to be true, I'll give you that. The 6C/12T would have one salvaged 6C/8C chiplet (72mm² 7nm), one dummy chiplet, and one IO die (maybe about 100mm² 12/14nm). Do we even know what the manufacturing costs for that would be? I think we tend to overestimate many (negative) things here, myself included.

An even bigger mystery is the Navi chiplet models, which would have a lot of working 7nm silicon: one 8C chiplet, one Navi 20CU chiplet and a fully working IO die. Their pricing seems pretty low relative to what their manufacturing costs may be.

But this is AMD we're talking about. Recent CPU history/roadmaps have taught us they like going all in. What would make this leak more plausible is if the PS5 is a 16c monster.

Maybe it's all about market share for AMD now. Sure, they need more R&D money, but their priorities are nowhere near the same as Intel's.

But we'll find out soon enough (in about a month) whether there's any truth to this.

Addition:
Except AMD might be going for market share, and big increases there (and big sales figures) will look good for investors, which they'll need to help them fund future development. I think AMD sees Intel's issues and realizes it'd be good to be aggressive. I'm not sure that will be a problem, as I think we'll be on 7nm for consumer stuff for a while.

Sure, 7nm should be a long-lived node. Still, this is the right time for AMD to strike while Intel has trouble with its 10nm node. After that the competition will be much harsher, and any market share gains for AMD now would likely be a huge benefit in the future.

Edit: Navi 12 changed to Navi 20CU chiplet.
 
Last edited:

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
I have to admit that it does sound too good to be true, I'll give you that. The 6C/12T would have one salvaged 6C chiplet (72mm² 7nm), one dummy chiplet, and one IO die (maybe about 100mm² 12/14nm). Do we even know what the manufacturing costs for that would be? I think we tend to overestimate many (negative) things here, myself included.



Maybe it's all about market share for AMD now. Sure they need more R&D money but their priorities are nowhere near the same as Intel's.
That HAS to be wrong. I'll bet on a 4C/8T + dummy and THEN a 6C/12T + Navi at $140. But 6/12 at $99? No way. Anyway, it's not a good idea to lose the $99 APU... there is a big market for office PCs right there... so I'm hoping for a 4/4 rebranded RR Athlon w/Vega at $80.
 
  • Like
Reactions: guachi

itsmydamnation

Platinum Member
Feb 6, 2011
2,743
3,072
136
If AMD drives 8 cores into the consoles, then for any gamer a 6-core $99 chip isn't going to be enough.

So you need to stop thinking 4 or 6 or 8 cores will be enough for the PC master race, because AMD will work as hard as it can to drive it that way to maximize its value proposition against Intel.

Outside of gaming, the mainstream market will largely take what they are given when they are given it.

Edit: Do people realize that if AMD gets 700 usable dies per wafer, even taking a really high cost like $12k a wafer, it's $17 per 7nm chiplet? 14nm was estimated to be somewhere around $7-8k near release. A 100mm² IO die with the WSA on GF 14nm will be dirt cheap now, so a $99 6-core chip doesn't seem far-fetched at all.

edit2: The other point is AMD will need to drive core counts on the console side to stop Sony/MS from even considering a high-core-count A76 or similar setup.
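The per-chiplet cost math in that first edit is easy to sanity-check. A quick sketch, using the post's assumed numbers ($12k per 7nm wafer, 700 usable dies), not any confirmed pricing:

```python
# Implied cost per 7nm chiplet under the post's assumed figures.
wafer_cost_usd = 12_000   # assumed worst-case 7nm wafer cost
usable_dies = 700         # assumed good ~72mm² dies per wafer

cost_per_chiplet = wafer_cost_usd / usable_dies
print(f"${cost_per_chiplet:.2f} per chiplet")   # ~ $17
```

Even at that pessimistic wafer price, the silicon cost of one chiplet in a $99 part would be around $17.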
 
Last edited:

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
If AMD drives 8 cores into the consoles, then for any gamer a 6-core $99 chip isn't going to be enough.

So you need to stop thinking 4 or 6 or 8 cores will be enough for the PC master race, because AMD will work as hard as it can to drive it that way to maximize its value proposition against Intel.

Outside of gaming, the mainstream market will largely take what they are given when they are given it.

IF Sony/MS are willing to pay for it. I'll bet that even the Athlon 200GE can kick the 8C Jaguar's ass. And that of every modern ARM CPU as well.

BTW, do you realize that cost doesn't set the price, right? Going from 4/4 to 6/12 at $99 is just crazy; 95%+ of people will be just fine with that, and office PC builds will go to 4C Athlons for sure since that $99 chip doesn't have an IGP. THAT is how you lose money.
 
Last edited:

Maxima1

Diamond Member
Jan 15, 2013
3,515
756
146
6c/12t @ $99, I'm not buying it. It would cannibalize all the upper segments, since the average user doesn't need much more than that. Also, a 16c Ryzen 9 doesn't leave much room for upgrades over the next 2 generations.

But this is AMD we're talking about. Recent CPU history/roadmaps have taught us they like going all in.

Would it? I can think of a number of goods people buy that have little utility over much cheaper alternatives. It's also important to realize the large disparity in income/wealth. Those with deep pockets will generally have a preference for more expensive things.
 

itsmydamnation

Platinum Member
Feb 6, 2011
2,743
3,072
136
IF Sony/MS are willing to pay for it. I'll bet that even the Athlon 200GE can kick the 8C Jaguar's ass.

BTW, do you realize that cost doesn't set the price, right? Going from 4/4 to 6/12 at $99 is just crazy; 95%+ of people will be just fine with that, and office PC builds will go to 4C Athlons for sure since that $99 chip doesn't have an IGP. THAT is how you lose money.
A 4-core Zen 1 @ 3GHz is probably about equal to an 8-core Jaguar @ 2.4GHz. But what does that have to do with anything, as we are talking about a new console?

You realise phones have more cores, right? Why are you getting so hung up on it? The 8-core chiplet is smaller than the smallest Intel chips sold at the lowest prices........
 
  • Like
Reactions: DarthKyrie

Zapetu

Member
Nov 6, 2018
94
165
66
Edit: Do people realize that if AMD gets 700 usable dies per wafer, even taking a really high cost like $12k a wafer, it's $17 per 7nm chiplet? 14nm was estimated to be somewhere around $7-8k near release. A 100mm² IO die with the WSA on GF 14nm will be dirt cheap now, so a $99 6-core chip doesn't seem far-fetched at all.

So even if 7nm is about 50-70% (Edit: fixed from 60% to 70%) more expensive to manufacture (per absolute die size), the cost per transistor, or whatever the right measurement is called, would still be much lower. As far as I know, we have talked about roughly 2x density improvements. So likely the PS5 would mostly be 7nm and there might be some 14/12nm IO die in the mix.
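A quick sketch of that cost-per-transistor argument, taking the post's assumptions at face value (50-70% higher wafer cost per unit area, ~2x transistor density on 7nm):

```python
# Relative 7nm cost per transistor vs 14nm, under the post's assumptions.
density_gain = 2.0  # assumed ~2x transistor density on 7nm

for area_cost_premium in (0.50, 0.70):
    # cost per transistor scales with (cost per area) / (transistors per area)
    relative_cost = (1 + area_cost_premium) / density_gain
    print(f"+{area_cost_premium:.0%} wafer cost -> "
          f"{relative_cost:.2f}x cost per transistor")
```

Under those assumptions, 7nm transistors would cost only 0.75-0.85x what 14nm ones do, so the per-transistor economics still favour 7nm.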

And those console chiplets can have relatively low base clocks and all-core boost clocks, but game developers could use higher boost clocks for 1-2 cores if they choose to. And Sony/MS will likely reserve 1-2 of those cores for their own purposes as they have done previously, if I recall correctly. Edit: I did recall correctly; the PS4 at first had 6 cores enabled for game developers and later they enabled 7 cores. And I do know that 8 Jaguar cores consume very little power and take up very little die space.
 
Last edited:

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
A 4-core Zen 1 @ 3GHz is probably about equal to an 8-core Jaguar @ 2.4GHz. But what does that have to do with anything, as we are talking about a new console?

You realise phones have more cores, right? Why are you getting so hung up on it? The 8-core chiplet is smaller than the smallest Intel chips sold at the lowest prices........

I'm saying that cost doesn't set the price, and regardless of what AMD can do, it depends on whether Sony and MS are willing to pay for it, that is all. Just because a 6C/12T CPU is cheap to produce doesn't mean that you should launch it at $99; the result of that is very simple = you lose money. Check what happened with the G4560. It is the same reason why AMD waited this long to release the 200GE.

And last time I checked, an Athlon 5350 (4C @ 2GHz) scored around 160 in Cinebench R15 and the 200GE around 360. I'll say the 200GE wins in raw power. The console APUs are just worthless pieces of silicon that prove Sony can sell consoles and games regardless of hardware.
 
Last edited:
  • Like
Reactions: CHADBOGA

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
At 7nm? AMD could do very well with an APU with 6c/12t and Navi with 16CUs, along with 8MB of L3 cache... priced at ~$200. It would be great for SFF builds.

For the low-cost market, the 2200G shrunk to 7nm. Maybe updated to Navi?
 
  • Like
Reactions: happy medium

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
At 7nm? AMD could do very well with an APU with 6c/12t and Navi with 16CUs, along with 8MB of L3 cache... priced at ~$200. It would be great for SFF builds.

For the low-cost market, the 2200G shrunk to 7nm. Maybe updated to Navi?

Probably by Q1 2020. If the AdoredTV leak is true, the 3000 APUs are already coming by the end of 2019; a 2200G "shrunk" has to come later than that. But I'm fully expecting they keep selling Raven Ridge as sub-$100 Athlons; this is 50% of why a $99 CPU with no IGP is a really BAD idea for AMD. The other 50% is that it will kill a lot of over-$100 CPU sales.
 

Zapetu

Member
Nov 6, 2018
94
165
66
That HAS to be wrong. I'll bet on a 4C/8T + dummy and THEN a 6C/12T + Navi at $140. But 6/12 at $99? No way. Anyway, it's not a good idea to lose the $99 APU... there is a big market for office PCs right there... so I'm hoping for a 4/4 rebranded RR Athlon w/Vega at $80.

The Navi 20CU chiplet seems like overkill for office PCs for sure. It will likely be memory starved anyway, but still, two ~72mm² 7nm chiplets seems like a lot. But as itsmydamnation pointed out, manufacturing costs per chiplet aren't really that high (about $17 per chiplet). It's like they are giving the Navi 20CU chiplet away almost for no profit at all just to manufacture more of them. If AMD is really going to do this, I would really admire their market share gain tactics. It seems almost like all they care about is hurting Intel. But sure, they would still make a profit. That is, if this rumour turns out to be somewhat true.

Edit: Navi 12 changed to Navi 20CU chiplet.
 
Last edited:

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Probably by Q1 2020. If the AdoredTV leak is true, the 3000 APUs are already coming by the end of 2019; a 2200G "shrunk" has to come later than that. But I'm fully expecting they keep selling Raven Ridge as sub-$100 Athlons; this is 50% of why a $99 CPU with no IGP is a really BAD idea for AMD. The other 50% is that it will kill a lot of over-$100 CPU sales.
I'd wait until DDR5 is out before introducing higher core counts for desktops.
 

Shivansps

Diamond Member
Sep 11, 2013
3,835
1,514
136
DDR5 means a new socket anyway.

The Navi 12 chiplet seems like overkill for office PCs for sure. It will likely be memory starved anyway, but still, two ~72mm² 7nm chiplets seems like a lot. But as itsmydamnation pointed out, manufacturing costs per chiplet aren't really that high (about $17 per chiplet). It's like they are giving the Navi 12 chiplet away almost for no profit at all just to manufacture more of them. If AMD is really going to do this, I would really admire their market share gain tactics. It seems almost like all they care about is hurting Intel. But sure, they would still make a profit. That is, if this rumour turns out to be somewhat true.

It needs an IGP no matter what. This is the reason why the Ryzen 1200 did not work out and they are still trying to get rid of them for next to nothing, while the 2200G worked out OK. I hoped AMD would have learned this lesson by now. 2200G -> 2400G -> 2600 works. 1200 -> 1300X -> 1400 -> 1500X -> 1600 did not, as most sales fell on the 1400 and 1600. And the A8 APUs.

There should be NO CPU without an IGP below $150.
 
  • Like
Reactions: Thunder 57

Zapetu

Member
Nov 6, 2018
94
165
66
It needs an IGP no matter what. This is the reason why the Ryzen 1200 did not work out and they are still trying to get rid of them for next to nothing, while the 2200G worked out OK. I hoped AMD would have learned this lesson by now.

I hope they have. I agree that any kind of iGPU would be a benefit for Ryzen Pro or any kind of non-gaming-focused desktop platform. For many non-tech-savvy users it might be too much of a hassle to buy a discrete GPU. Well, most of them would buy so-called market PCs anyway.

I'd wait until DDR5 is out before introducing higher core counts for desktops.

But Rome has 8 cores per DDR4 memory channel, as many have pointed out...
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
I hope they have. I agree that any kind of iGPU would be a benefit for Ryzen Pro or any kind of non-gaming-focused desktop platform. For many non-tech-savvy users it might be too much of a hassle to buy a discrete GPU. Well, most of them would buy so-called market PCs anyway.



But Rome has 8 cores per DDR4 memory channel, as many have pointed out...
For mainstream desktops? A higher-clocked 8c/16t CPU at 65W TDP will meet all my needs perfectly for a good while.
 
  • Like
Reactions: Zapetu

Zapetu

Member
Nov 6, 2018
94
165
66
For mainstream desktops? A higher-clocked 8c/16t CPU at 65W TDP will meet all my needs perfectly for a good while.

Sorry that I misunderstood you. Sure, 8C or even 6C would be enough for mainstream desktops for a while. Still, 12C for AM4 would be quite attractive if you do any kind of rendering/video editing or similar work and don't want to spend all that money that HEDT platforms (like Threadripper) require. So yes, 8C/16T (or 6C/12T) seems good for most of us.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Sorry that I misunderstood you. Sure, 8C or even 6C would be enough for mainstream desktops for a while. Still, 12C for AM4 would be quite attractive if you do any kind of rendering/video editing or similar work and don't want to spend all that money that HEDT platforms (like Threadripper) require. So yes, 8C/16T (or 6C/12T) seems good for most of us.
I have a strong bias towards CPUs with high TDPs. So if AMD sold an R7 3700 @ 4000MHz with the TDP at 65W and reasonably priced, I will pick that for my next build.
 
  • Like
Reactions: Justinbaileyman

Zapetu

Member
Nov 6, 2018
94
165
66
I have a strong bias towards CPUs with high TDPs. So if AMD sold an R7 3700 @ 4000MHz with the TDP at 65W and reasonably priced, I will pick that for my next build.

Don't you mean with low TDPs? You can always undervolt your CPU. I am a bit of an eco-friendly person myself (maybe that's a little bit of an overstatement) and I do care about the environment, climate change and such things, and I think that Lisa Su mentioned somewhere that one of the goals of AMD's 7nm roadmap is to reduce general power consumption for the server (and desktop) markets. So even if there's not a SKU that directly pleases you, please do customize the current ones for your needs. I think we should strictly focus on technical stuff in here, though.

Edit: Maybe I misunderstood something here, but it doesn't matter anyway. It's all good and well. :grinning:
 
Last edited:

Justinbaileyman

Golden Member
Aug 17, 2013
1,980
249
106
I have a strong bias towards CPUs with high TDPs. So if AMD sold an R7 3700 @ 4000MHz with the TDP at 65W and reasonably priced, I will pick that for my next build.
Me too, please!! I will probably get the 65W 8c/16t variant, and the 16c/32t if it's really real and I can afford it..
 
  • Like
Reactions: whm1974