Speculation: Ryzen 4000 series/Zen 3


itsmydamnation

Platinum Member
Feb 6, 2011
2,744
3,080
136
One clear answer is ARM. The X1 totally smashes even Zen 3 in IPC, and that's for the <10 W X1. Imagine what a 100 W part would be able to produce. The new ARM Firestorm cores developed by Apple will probably be able to emulate x86 applications faster than x86 processors can run them natively. The first actor to release an ARM CPU for desktop with support for SATA, PCIe, DDR5 memory and so forth will be able to grab a huge market share overnight. OK, might be a bit dramatic, but with Firestorm we have what, 60% higher absolute performance than Intel's best desktop chips, but at 5 watts? Now we're talkin' again! The only thing left is for Windows to do the same as Apple: get great AArch64 support and get it done fast. I want my next gaming rig to be 300% faster than the former, not 26%.
When can the general market buy any of that? When Zen 5 is out?

I'm starting to notice a pattern here: we seem to be allowing a ~3-year gap when comparing x86 to ARM. We compare yet-to-exist cores in yet-to-exist products against products that go through massive validation cycles to hit full general market availability, and then go DERP DERP ARM.
 

moinmoin

Diamond Member
Jun 1, 2017
4,934
7,620
136
Are you guaranteed to have a single chiplet with a 5800X though?
Both 5600X and 5800X have only the cache of one chiplet, not two.
 
Feb 4, 2009
34,506
15,737
136
When can the general market buy any of that? When Zen 5 is out?

I'm starting to notice a pattern here: we seem to be allowing a ~3-year gap when comparing x86 to ARM. We compare yet-to-exist cores in yet-to-exist products against products that go through massive validation cycles to hit full general market availability, and then go DERP DERP ARM.

Let’s not forget enterprise customers.
While I do agree ARM has some interesting things going for it, I don't see a big vendor mixing ARM-based PCs into their regular lineup.
 

A///

Diamond Member
Feb 24, 2017
4,352
3,154
136
The 5600X should be able to trash a 6950X, a 10-core from Intel, but the 10900K? In single-thread, yes; in multi-threaded workloads the 5600X should be a few percent (single digits) lower.
In theory, at least. I've never seen or used a 6950X. Wasn't exactly in my budget all that time ago...
 

kurosaki

Senior member
Feb 7, 2019
258
250
86
Let’s not forget enterprise customers.
While I do agree ARM has some interesting things going for it, I don't see a big vendor mixing ARM-based PCs into their regular lineup.
No, but mind you, they are starting to take their place in several of the world's fastest supercomputers. Apple is already there. All I'm saying is that it's more doable than ever. If Microsoft works out their software, that is....
 
Feb 4, 2009
34,506
15,737
136
No, but mind you, they are starting to take their place in several of the world's fastest supercomputers. Apple is already there. All I'm saying is that it's more doable than ever. If Microsoft works out their software, that is....

I agree with this.
 

soresu

Platinum Member
Dec 19, 2014
2,617
1,812
136
The new ARM Firestorm cores developed by Apple will probably be able to emulate x86 applications faster than x86 processors can run them natively.
Perhaps, but certainly no avid Windows user is going to switch to Apple just for that. And at the rate that ARM Ltd cores are improving year over year, by the time any significant third-party apps are fully native ARM they will likely be available on Windows on ARM too, to take advantage of those newer ARM Ltd core platforms.
developed with support for SATA
SATA is a dead (i.e. no longer improving) standard kept alive because no one has yet made a more compact variant of the U.2/U.3 connector standard to popularize NVMe drives off the motherboard in the consumer space.

Having said that, PCIe 5.0 will have such high bandwidth that they could design a cable/connector standard with only 2 lanes and still have as much bandwidth as a PCIe 4.0 x4 M.2 drive.

The odd thing is that SAS, the server equivalent of SATA, is still improving, with 22.5 Gbit/s ratified in 2017 and 45 Gbit/s in development, yet SATA is still stuck on a measly 6 Gbit/s. Go figure the priorities there; they seem to prefer us cooking our M.2 drives in the airflow from the nearby CPU/GPU so they can profit from us buying new ones more often.

Edit: perhaps internal USB4 drives could be a thing?
Those connectors are plenty compact, go up to 40 Gbit/s and deliver power too into the bargain.
 
  • Like
Reactions: Tlh97

kurosaki

Senior member
Feb 7, 2019
258
250
86
Perhaps, but certainly no avid Windows user is going to switch to Apple just for that. And at the rate that ARM Ltd cores are improving year over year, by the time any significant third-party apps are fully native ARM they will likely be available on Windows on ARM too, to take advantage of those newer ARM Ltd core platforms.
I was never talking about us surrendering to the dark side. Rather, I'm hoping someone else will see the huge underlying potential in changing architecture.

perhaps internal USB4 drives could be a thing?
Those connectors are plenty compact, go up to 40 Gbit/s and deliver power too into the bargain.
Why not! But every step outside the beaten path takes a lot more effort. Getting 2 billion people to change CPUs overnight won't be easy; getting them to discard all their old hard drives as well? ;)
 

soresu

Platinum Member
Dec 19, 2014
2,617
1,812
136
Getting 2 billion people to change CPUs overnight won't be easy; getting them to discard all their old hard drives as well?
USB-to-SATA converters may not be a dime a dozen, but they are available and moderately cheap. I even used one to back up my data to a new HDD before a system reformat last year.

I would be surprised if Apple keeps SATA, given their predilection for abandoning aging tech.

By their past standards SATA should already be gone from their lineup at this point.

Even so, I fear that the consumer sector is doomed to stay M.2 and PCIe-slot only for NVMe drives, which wouldn't be so bad if the GPU makers actually started making energy-efficient SKUs that don't roast the surrounding area.
 

Vattila

Senior member
Oct 22, 2004
799
1,351
136
And don't say it's GPUs since they still can't even run standard C++ ...

Have a look at SYCL, the Khronos standard for single-source C++ programs that execute across heterogeneous platforms. This seems to be the direction the industry is taking, as an open alternative to CUDA.

Also, have a look at the C++ Standard Committee's work on parallelism and support for heterogeneous systems (study groups SG1 and SG14, in particular).
 
Last edited:
  • Like
Reactions: MrJim

DrMrLordX

Lifer
Apr 27, 2000
21,583
10,785
136
No need to jump the gun IMO.
If Warhol comes, let's see when, how it performs and at what price. Then we may or not rage against it.

Fair enough. I just think Warhol is likely to push out Zen4 desktop (Raphael?) if AMD is determined to release it.

At another point rebranded products see price-hikes and everyone is throwing in their wallets and their first-born to get them.

I think we'll be fortunate if enough people passed on the XT chips (outside of the 3600XT) to convince AMD not to make another release like that.
 

kurosaki

Senior member
Feb 7, 2019
258
250
86
I come to the thread about Zen 3 and I have to read how ARM is the best. I thought it had its own thread, but it looks like it infected this one too. :mad:
Well, it's kind of the big pink elephant in the room. A small hope of cutting through the stagnation in perf-per-dollar gains.
But with that said, I'm still thinking of buying an 8-core Zen 3, of course, when the price gets right..

That ARM stuff is at least 5 years away at this pace anyway. Apple is always so fast-footed; the Windows community, myself included, is a bit more conservative.
 
Last edited:

A///

Diamond Member
Feb 24, 2017
4,352
3,154
136
Can someone please link me to the post where the confidential slides were posted?
 

A///

Diamond Member
Feb 24, 2017
4,352
3,154
136
Robert Hallock, Director of Technical Marketing at AMD. The messages are from Discord, a friend (Casmoden) asked him on the AMD Vanguard Discord.

He's also on the r/AMD Discord as well, so questions like this he'll answer there too.

You've seen him already, he was on stage. He's this guy:
Him! Yes, I know hairless Gepetto. I've only seen him once before the event, in a video podcast AMD put out.
 
  • Like
Reactions: Tlh97 and uzzi38

A///

Diamond Member
Feb 24, 2017
4,352
3,154
136
Uh, wasn't he also on stage for a portion of the Vermeer launch?
The presentation got ruined by constant interruptions and I only saw the first four minutes and the end credits, so I went with the slides. I caught less than 20 minutes of the Apple event last month and maybe 10 minutes of the Nvidia event. I have bad luck!
 

jamescox

Senior member
Nov 11, 2009
637
1,103
136
Come on, do you really think AMD was losing money on Zen, Zen+ and Zen2 chips - including R&D? Absolutely not. Yes it's normal for new chips to cost more than the ASP of 1.5 year old chips. That's standard. What's not standard is the new chips commanding a significantly higher price than those old chips made their debut at. For the lower end chip, it's nearly a 20% increase in original MSRP - even more so when you factor in that none of them come with HSF anymore except the 6c chip.
That is almost a straw man. I very clearly stated that AMD didn't make much profit last year, which means that they did not sell at a loss overall. You can't just go by the production cost since that has no relation to the price. The development cost has to be taken into account. Earlier Zen iterations were way underpriced by the market standards of the time. They had to do that to gain market share though. How much R&D budget do you think Intel and Nvidia have compared to AMD? I am not going to look it up because I know the answer is a tiny fraction. AMD is competing with both now, and with ARM in the future.

It is amazing that they are competing at all and here they are taking the lead. A large part of that is Intel’s complete process tech fumble, but Intel obviously wasn’t prepared for this level of competition regardless of the process tech issues. Even if 10 nm wasn’t near complete failure, they would have needed to accelerate their roadmaps significantly and large companies usually don’t do well when that happens. I don’t really expect Intel to be that competitive again until 7 nm stacked chips, and even then, it is in doubt. We don’t know what AMD and TSMC have in the pipeline for Zen 4.

AMD has been able to get by with lower R&D cost due to their modular architecture. Zen 1 was literally one chip for all. They taped out a single chip that had to cover their entire market. It was quite good, but it had a lot of wasted silicon. The single-chip desktop parts had a large Infinity Fabric switch that was not needed and increased latency. Zen 2 fixed that. It is still one CPU die, which maximizes wafer allocation, plus a cheap IO die at GF. No real wasted silicon except Threadripper, which only uses half the IO die; it is a lower-volume part though. To continue to compete, AMD needs to make a bunch of different monolithic parts for mobile, which costs R&D. Just taping out a chip costs millions for the mask set. It looks like we are going to get perhaps an ultra-low-power mobile part (perhaps based on Zen 2-ish console CPUs) and a few Zen 3 variants. They probably aren't going to make much profit again with how much they must be funneling into R&D. How many GPUs and APUs did they tape out, or are coming soon? They are going to need a lot more R&D money going forward with more specialized products and market segments.

Anyway, AMD is not screwing over enthusiasts at all. It is the opposite actually. The $50 increase is across the board. Enthusiasts are more likely to buy the expensive parts, where the $50 is a small percentage of the cost. The mainstream will mostly buy an OEM system or the sub-$300 parts. People make a big deal out of the high-end stuff when it actually isn't that high volume. Most of my friends are gaming on sub-$200 processors.
 

DrMrLordX

Lifer
Apr 27, 2000
21,583
10,785
136
@uzzi38

Not using N7+, N7, or N7P? That's . . . odd. Might explain why AMD didn't get any all-core clock boosts with Vermeer over Matisse (in fact, there's a regression). Totally unexpected. Hopefully the wider core design will help mitigate that. It's also an interesting opportunity for overclockers, especially if AMD has done anything to fight hotspotting.
 

moinmoin

Diamond Member
Jun 1, 2017
4,934
7,620
136
@uzzi38

Not using N7+, N7, or N7P? That's . . . odd. Might explain why AMD didn't get any all-core clock boosts with Vermeer over Matisse (in fact, there's a regression). Totally unexpected. Hopefully the wider core design will help mitigate that. It's also an interesting opportunity for overclockers, especially if AMD has done anything to fight hotspotting.
Yeah, that to me is probably the most surprising info from the event: after all these years of teasing 7nm+ for Zen 3, it turned out to be node optimizations by AMD itself, not a separate node from TSMC, and the Zen 2 XT chips are actually already using those as well.