Speculation: Ryzen 3000 series


What will Ryzen 3000 for AM4 look like?



coercitiv

Diamond Member
Jan 24, 2014
6,151
11,681
136
A 4+2 phase? Remember, that CPU has up to 50% higher current demand than the 1700 within the same power envelope, not counting AVX2. It also has a higher TDP and likely a lower "reference" voltage (not that modern AMD CPUs really have a reference voltage). If you are lucky, it'll be stuck at 3.8 GHz and won't try to boost anywhere.

I would worry about stock operation.
How about a 3+2 phase with doublers? :)

Hardware Unboxed ran the 3900X and 3700X on an ASRock AB350M Pro4. Stock works.
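
To put rough numbers on the current argument, here's a quick back-of-the-envelope sketch in Python. The package power, vcore and VRM efficiency figures are assumptions for illustration only, not measurements:

```python
# Rough per-phase current estimate for the vcore side of a VRM.
# All input figures below are assumptions for illustration, not measured values.

def per_phase_current(package_power_w, vcore_v, vrm_efficiency, phases):
    """Approximate current each vcore phase has to carry."""
    input_power_w = package_power_w / vrm_efficiency  # account for VRM conversion losses
    total_current_a = input_power_w / vcore_v         # I = P / V
    return total_current_a / phases

# Assumed loads: ~110 W package power for a 1700-class chip,
# ~165 W for a 105 W TDP Matisse part with PBO (the "up to 50% higher" claim).
for label, power_w in [("1700-ish", 110), ("3900X-ish", 165)]:
    for phases in (3, 4, 6):
        amps = per_phase_current(power_w, vcore_v=1.2, vrm_efficiency=0.9, phases=phases)
        print(f"{label:10s} {phases}-phase vcore: ~{amps:.0f} A per phase")
```

Doublers don't change the total current, but spreading it across six physical power stages instead of three or four is what keeps the individual MOSFETs and inductors out of trouble.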

 
  • Like
Reactions: DarthKyrie

amd6502

Senior member
Apr 21, 2017
971
360
136
We will see AMD start to use die stacking on the I/O die fairly soon. They would be able to include both HBM and DDR on the APU, but the ones that use both are at least 5 years out.

Edit: As to your question, I am not sure; wouldn't that info be in Ryzen Master? Or does Ryzen Master work differently than Watt-Man?

With HBM they could go for larger iGPUs, but I don't think it's worth it. GPU-intensive use like 1080p gaming is best and most economically addressed by discrete graphics. (Unless you're a multibillion-dollar multinational like MS or Sony, in which case you might have the economies of scale to make a 7nm semi-custom APU with HBM worth it versus the dGPU route.)

For PCs, the APUs are great value in the mainstream performance range. They now handle 720p (and somewhat above) really well. For anything at or over the performance of an RX 560, though, one cannot beat discrete.

Now I hope the 7nm APUs come in two forms: MCM and a mobile-oriented monolithic die, with the MCM version being as capable as (or slightly more capable than) the 3400G in the graphics department, while doubling the ability in the CPU department.

I could see the HBM stacking for the consoles. They are most likely going with chiplets too, to improve yield and keep costs economical. It's possible the CPU chiplet would be identical/reused between Sony and MS. The memory controller may sit on the CPU chiplet (maybe?), and the GPU might be on one or two chiplets (2 more likely?).
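
To put numbers on why HBM changes the iGPU equation, here's a quick peak-bandwidth comparison (standard bus widths; the DDR4 speed is just a typical example):

```python
# Peak theoretical memory bandwidth: dual-channel DDR4 vs. a single HBM2 stack.

def bandwidth_gb_s(bus_width_bits, transfer_rate_mt_s):
    return bus_width_bits / 8 * transfer_rate_mt_s / 1000  # bytes per transfer * MT/s -> GB/s

ddr4_3200_dual_channel = bandwidth_gb_s(128, 3200)   # 2 x 64-bit channels
hbm2_single_stack      = bandwidth_gb_s(1024, 2000)  # 1024-bit stack at 2.0 Gbps/pin

print(f"Dual-channel DDR4-3200: {ddr4_3200_dual_channel:.1f} GB/s")  # ~51.2 GB/s
print(f"Single HBM2 stack:      {hbm2_single_stack:.1f} GB/s")       # ~256 GB/s
```

That roughly 5x jump in bandwidth is what would let an iGPU scale well past 3400G-class performance; without it, adding more CUs mostly just starves them.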
 
Last edited:

DarthKyrie

Golden Member
Jul 11, 2016
1,531
1,279
146
With HBM they could go for larger iGPUs, but I don't think it's worth it. GPU-intensive use like 1080p gaming is best and most economically addressed by discrete graphics. (Unless you're a multibillion-dollar multinational like MS or Sony, in which case you might have the economies of scale to make a 7nm semi-custom APU with HBM worth it versus the dGPU route.)

For PCs, the APUs are great value in the mainstream performance range. They now handle 720p (and somewhat above) really well. For anything at or over the performance of an RX 560, though, one cannot beat discrete.

Now I hope the 7nm APUs come in two forms: MCM and a mobile-oriented monolithic die, with the MCM version being as capable as (or slightly more capable than) the 3400G in the graphics department, while doubling the ability in the CPU department.

I could see the HBM stacking for the consoles. They are most likely going with chiplets too, to improve yield and keep costs economical. It's possible the CPU chiplet would be identical/reused between Sony and MS. The memory controller may sit on the CPU chiplet (maybe?), and the GPU might be on one or two chiplets (2 more likely?).
AFAIK the CPU cores and GPU cores are custom variants due to having some secret sauce built in for backward compatibility, so there's not much interchangeability between the two; they should be able to use the same I/O die, though.
 

DrMrLordX

Lifer
Apr 27, 2000
21,582
10,785
136
How about a 3+2 phase with doublers? :)

I would take that over a 4+2 board. You are (mostly) getting the current capacity of a 6-phase primary. Pity that it wouldn't run Blender.

Eventually, once we see PB + PBO get sorted out properly on all compatible boards, we'll probably see more than a 3% performance difference between B350 and X570 while running a 3900X. Frankly, I'm a bit surprised we saw any performance difference with the 3700X. But then Matisse has brought many surprises.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,670
1,250
136
I ran the CCD through Photoshop to try to make the features more visible.

Edit: New picture, composited from both source images that are out in the wild, since each one captures certain details better than the other. Looks like the north area between the two CCXes is quite a bit different from the equivalent south area.

Also, that bright area just above the centre of the die only exists on the one side. It's not an artifact of the light hitting the die a certain way. You can tell because in the picture of the package, that same area lacks bumps.

Zen 2 CCD Composite.jpg
 

Attachments: ZEN 2 CCD.png
Last edited:

H T C

Senior member
Nov 7, 2018
549
395
136
die shot of both I/O and chiplet!

It's interesting that on the I/O die I can see cache layout patterns very similar to Zeppelin's. Also interesting that one core takes up about the same die area as the L2+L3 slice.

Any chance of getting this, but for the 3700X and 3800X?

I'm trying to find out if the 3800X has only one 8-core chiplet, or two chiplets with 2 cores disabled and half the cache disabled on each.

AWESOME die shot, btw.
 

DiogoDX

Senior member
Oct 11, 2012
746
277
136
Any chance of getting this, but for the 3700X and 3800X?

I'm trying to find out if the 3800X has only one 8-core chiplet, or two chiplets with 2 cores disabled and half the cache disabled on each.

AWESOME die shot, btw.
I think this could be the 3700X or 3800X, since the 3900X has 2 chiplets plus I/O (3 dies total).
 

H T C

Senior member
Nov 7, 2018
549
395
136
I think this could be the 3700X or 3800X, since the 3900X has 2 chiplets plus I/O (3 dies total).

Since it's not labeled, it could be the 3600, for all I know. - IT IS labeled: it's an image of a 3600 CPU, as per the description.

Obviously, more than 8 cores will force the CPU to have two chiplets + I/O die, but, and that is precisely the question: do any of the 8c/16t CPUs have two chiplets + I/O die, or do they all have only one chiplet + I/O die?
 
Last edited:

extide

Senior member
Nov 18, 2009
261
64
101
www.teraknor.net
Since it's not labeled, it could be the 3600, for all I know. - IT IS labeled: it's an image of a 3600 CPU, as per the description.

Obviously, more than 8 cores will force the CPU to have two chiplets + I/O die, but, and that is precisely the question: do any of the 8c/16t CPUs have two chiplets + I/O die, or do they all have only one chiplet + I/O die?

8C and below are all a single CCD.
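
For reference, the announced Matisse lineup works out like this (core/CCD/L3 figures per AMD's published specs; just a sanity-check sketch):

```python
# Announced Matisse (Ryzen 3000) desktop SKUs: chiplet count and active cores.
# Figures are from AMD's published specifications at announcement.

matisse = {
    # model: (total cores, CCDs, active cores per CCD, total L3 in MB)
    "3600":  (6,  1, 6, 32),
    "3600X": (6,  1, 6, 32),
    "3700X": (8,  1, 8, 32),
    "3800X": (8,  1, 8, 32),
    "3900X": (12, 2, 6, 64),
    "3950X": (16, 2, 8, 64),
}

for model, (cores, ccds, per_ccd, l3_mb) in matisse.items():
    assert cores == ccds * per_ccd
    print(f"{model}: {ccds} CCD(s) x {per_ccd} active cores, {l3_mb} MB L3")
```

So the 3800X is a single fully enabled CCD with the full 32 MB of L3, not two cut-down chiplets.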


EDIT: Here is my take on an annotation:
9N3AFL3.png
 
Last edited:

NostaSeronx

Diamond Member
Sep 18, 2011
3,683
1,218
136
This is for a potential Sempron 300. There has been an update to 22FDX's standard Vts and BEOL metal layers: 22FDX now supports 17 metal layers and has six logic Vts, three in the LVT w/ FBB segment (eLVT, sLVT, LVT) and three in the HVT w/ RBB option segment (RVT, HVT, uHVT). That is compared to the original 22FDX, which, if I remember correctly, maxed out at 12-13 metal layers and only had 4 logic Vts.

Stoney Ridge is considered a low/mid-end APU by GlobalFoundries' definition, which covers any mobility product on any of the 40nm, 28nm, or 22FDX nodes.
Raven2/Dali is a high-end SoC, with Picasso/Renoir being premium SoCs.

High/premium SoCs aren't protected by GlobalFoundries; however, low/mid SoCs are.
 

Snarf Snarf

Senior member
Feb 19, 2015
399
327
136
Any chance of getting this, but for the 3700X and 3800X?

I'm trying to find out if the 3800X has only one 8-core chiplet, or two chiplets with 2 cores disabled and half the cache disabled on each.

AWESOME die shot, btw.

Robert Hallock has confirmed over on Reddit that only the 3900X and 3950X will be dual-chiplet designs.
 
  • Like
Reactions: lightmanek

H T C

Senior member
Nov 7, 2018
549
395
136
Robert Hallock has confirmed over on Reddit that only the 3900X and 3950X will be dual-chiplet designs.

Thanks for the info.

Do you happen to have the link for that Reddit post? I was arguing about this over at another forum and I want to admit defeat with the proper evidence.
 
  • Like
Reactions: extide
Mar 11, 2004
23,031
5,495
146
With HBM they could go for larger iGPUs, but I don't think it's worth it. GPU-intensive use like 1080p gaming is best and most economically addressed by discrete graphics. (Unless you're a multibillion-dollar multinational like MS or Sony, in which case you might have the economies of scale to make a 7nm semi-custom APU with HBM worth it versus the dGPU route.)

For PCs, the APUs are great value in the mainstream performance range. They now handle 720p (and somewhat above) really well. For anything at or over the performance of an RX 560, though, one cannot beat discrete.

Now I hope the 7nm APUs come in two forms: MCM and a mobile-oriented monolithic die, with the MCM version being as capable as (or slightly more capable than) the 3400G in the graphics department, while doubling the ability in the CPU department.

I could see the HBM stacking for the consoles. They are most likely going with chiplets too, to improve yield and keep costs economical. It's possible the CPU chiplet would be identical/reused between Sony and MS. The memory controller may sit on the CPU chiplet (maybe?), and the GPU might be on one or two chiplets (2 more likely?).

But on AM4 they'd have problems packaging that. They might be able to fit a small GPU with a single stack of HBM in the spot of one of the CPU chiplets (but that GPU would be so small that I don't think it'd be worth it, and the cost of doing it wouldn't be either), and I'm not sure they can move the I/O die's placement to put HBM next to it. There's just not much room without changing the whole package layout, and I'm not sure they're ready for that yet. So it might not be feasible until they can stack onto the HBM.
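
For rough scale on the packaging problem: the AM4 substrate is 40 x 40 mm, and the die footprints below are the commonly cited approximate figures (the GPU chiplet size is purely an assumption):

```python
# Very rough AM4 floorplan arithmetic. All die/stack footprints are approximate,
# and the GPU chiplet size is an assumed placeholder.

am4_substrate_mm2 = 40 * 40    # 1600 mm^2 total, but much of it is keep-out zones,
                               # passives and routing rather than usable die area
ccd_mm2         = 74           # Zen 2 CCD, roughly 74 mm^2
io_die_mm2      = 125          # Matisse client I/O die, roughly 125 mm^2
hbm2_stack_mm2  = 8 * 12       # ~96 mm^2 footprint for one HBM2 stack (approx.)
gpu_chiplet_mm2 = 100          # assumed size for a modest 7nm GPU chiplet

dual_ccd_layout = 2 * ccd_mm2 + io_die_mm2
apu_with_hbm    = ccd_mm2 + io_die_mm2 + gpu_chiplet_mm2 + hbm2_stack_mm2

print(f"3900X-style layout:              ~{dual_ccd_layout} mm^2 of silicon")
print(f"CCD + I/O + GPU + one HBM stack: ~{apu_with_hbm} mm^2 of silicon")
```

The raw silicon might fit on paper; the problem is where the HBM has to sit relative to whatever die hosts its PHY, which is exactly the placement issue above.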

I think they both already said they're using GDDR6 on the next consoles, so that's not gonna happen. The memory controller would either be in the I/O die or on the GPU chiplets; otherwise they'd have to have a special version of the CPU chiplet made, since it has no memory controller at all (and definitely not a GDDR6 one).

Sounds like the rumors are suggesting the MCM APUs might have been cancelled this round (I think AnandTech mentioned that in the interview with Lisa Su, and she said they hadn't said what that product would be but that it's not cancelled - which I have a hunch just means there'll be monolithic Zen 2 APUs), and I'd guess that's due to the packaging issues of AM4.

AFAIK the CPU cores and GPU cores are custom variants due to having some secret sauce built in for backward compatibility, so there's not much interchangeability between the two; they should be able to use the same I/O die, though.

They say that it's custom, but I personally am skeptical that it will be drastically so. I'm very doubtful the CPUs will be, but the GPUs might.

Oh, and I really don't think it would be for backwards compatibility, since that shouldn't be that difficult: it's an x86 CPU and the GPU still adheres to the general APIs. It'll need tweaking to run on the new hardware for sure, though (Microsoft even said they have frozen adding games to backwards compatibility so they can focus on getting what they've currently got working on the next system).

Also, with regard to that general talk, the rumors I saw said pretty much the opposite: that the dGPU Navi we're getting this year is a stopgap between GCN and the full architectural changes that are the basis for Navi, and that the next-gen consoles will have the "real Navi" (and I'm guessing Arcteryx will bring most of that to the PC GPUs).
 
  • Like
Reactions: DarthKyrie

H T C

Senior member
Nov 7, 2018
549
395
136
common sense is sufficient evidence.

Not ... common enough ...

A reply from someone high enough up at AMD puts to rest any and all doubt. I don't use Reddit and I'm having trouble locating the post in question, so a little assistance in that regard would be much appreciated.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,670
1,250
136
I had a bit more fun playing around with the CCD in PS. Aspect corrected, with a bit more color.

Zen 2 CCD composite.png
 
Last edited:
  • Like
Reactions: amd6502

moinmoin

Diamond Member
Jun 1, 2017
4,933
7,619
136
Here is my take on an annotation:
9N3AFL3.png
Looks good. Though this makes me even more curious what all that space on the IOC is about. It's supposedly 12nm compared to the 14nm Epyc IOC but still more than a quarter of that one's size. The CCD looks boring in comparison.
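
The "more than a quarter" observation roughly checks out against the commonly quoted die sizes (both figures approximate):

```python
# Approximate die areas: Matisse client I/O die vs. the Rome (Epyc) I/O die.
matisse_iod_mm2 = 125   # ~125 mm^2 on 12nm (approximate)
rome_iod_mm2    = 416   # ~416 mm^2 on 14nm (approximate)
print(f"Client IOD is ~{matisse_iod_mm2 / rome_iod_mm2:.0%} of the Epyc IOD")  # ~30%
```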
 
  • Like
Reactions: amd6502

arandomguy

Senior member
Sep 3, 2013
556
183
116
What are your criteria for a good CPU or GPU (relative to the competition in what way)?

Intel released Core 2 in 2006. AMD acquired ATI in 2006. If your line of thinking is that AMD never had a "good" CPU from when Intel released Core 2 until Zen 2, then I guess? Although I'd argue that lines such as the Phenom X6 were compelling choices for certain use cases due to pricing. It's really post Bulldozer vs. Sandy Bridge where the huge gap emerged.

My opinion is that HD 4xxx through HD 7xxx and the Rx 2xx series were all better situations for AMD GPU-wise. I'm not seeing how AMD's current stack with RX 5xxx is better than it previously was with Rx 3xx through RX 5xx.
 
Last edited:

arandomguy

Senior member
Sep 3, 2013
556
183
116
ComputerBase did an update of their article about PCIe 4.0 compatibility on Asus 450/470 motherboards. In short, it is not guaranteed and could be removed (!) by AMD themselves in a new AGESA; for the time being, only X570 officially has such support...

https://www.computerbase.de/2019-07/asus-mainboard-x470-b450-pcie-4.0/

So, interestingly, the more expensive ROG-line boards have worse support, and B450 boards have better support than X470.

The PCIe 4.0 situation is something I don't like about this launch and something holding me back from buying. I want PCIe 4.0 on the x16 slot and one M.2, but I don't want to pay the X570 price premium, the power premium, or deal with the fan. Unfortunately this means waiting for B550, which, depending on other developments, may mean not buying in at all.
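
For context on what PCIe 4.0 actually buys on those two slots, here's the per-direction math (standard signalling rates, 128b/130b encoding for both generations):

```python
# Theoretical PCIe throughput per direction (gen 3 and gen 4 both use 128b/130b encoding).

def pcie_gb_s(gigatransfers_per_s, lanes):
    return gigatransfers_per_s * (128 / 130) / 8 * lanes  # GT/s per lane -> GB/s per direction

for gen, rate in [("PCIe 3.0", 8.0), ("PCIe 4.0", 16.0)]:
    print(f"{gen}: x16 slot ~{pcie_gb_s(rate, 16):.1f} GB/s, "
          f"x4 M.2 ~{pcie_gb_s(rate, 4):.1f} GB/s")
```

So the M.2 slot roughly doubles from ~3.9 GB/s to ~7.9 GB/s, which is where the current PCIe 4.0 SSDs get their headroom, while the x16 slot's jump matters far less for today's GPUs.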