Speculation: Ryzen 3000 series


What will Ryzen 3000 for AM4 look like?


  • Total voters
    230

Abwx

Lifer
Apr 2, 2011
10,948
3,458
136
Can someone explain in a simple way why so many frames are lost with Intel, or why so many frames are preserved with AMD?
I totally don't understand why someone streaming video would lose so many frames while broadcasting, other than from a poor internet connection.

The gamers who hang around there perhaps have an explanation; what is sure is that there's a core-count advantage for AMD as well as a huge L3, and those should be the culprits.

Besides, it's low-quality 1080p, so this will inflate the FPS and limit the CPU's headroom in what is actually a multitasking test, even if the two apps are related.
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Simple: they were using a software encoder, on a really high-detail preset at a high bit rate. That means it needs multithreaded CPU power. With 50% more cores at nearly the same performance per core between the two systems, the AMD one was able to do the encoding/streaming on the same CPU. When done like this on the 9900K, the system came to a crawl; the game ate so much of the CPU that the encoder/streamer didn't have enough oomph to do anything.

It's a stacked example. Lowering the bit rate or using a faster preset might not have affected PQ much. But it shows what Ryzen 1000 did when compared to a 7700K, and what I have been doing since I got my 4400+ in 2005: using your system for more than just straight-up gameplay, where more cores help. So if you are a streamer who wants a high-PQ, high-rate stream, you either have to go HEDT or two systems with Intel, or do it on one system with a high-end consumer Ryzen. What AMD demo'd just before that, though, was that the 3900X was neck and neck with the 9900K (which wasn't always the case for the 1800X). So it can game at pretty much the same level and do more. The 3900X and 3950X are really appearing to be no-compromise CPUs: negligible performance difference with tons more compute capability to spare. To top it off, knowing Intel, I wouldn't be surprised if the 3950X and the 9900KS end up at nearly the same price.

Exactly right. Even at lower bit rates the 9400F, for instance, fails to encode well, if at all. Simply put, if you want Intel for streaming, you need 12+ threads, minimum. Even the 4C/8T 7700K didn't fare well in encoding+gaming (link); as you mention, when put up against the Ryzen 1800X it was a bloodbath on combined performance. The 8700K made some headway, though.
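For anyone following along, the "software vs. GPU encoding" distinction being discussed maps to a choice of encoder in OBS or ffmpeg. A rough sketch of the two approaches with ffmpeg (the input file, bitrate, and RTMP URL here are hypothetical placeholders; the flags themselves are standard ffmpeg options):

```shell
# CPU (software) encode with x264 on the "slow" preset -- a heavy, sustained
# multithreaded load that competes with the game for CPU cores:
ffmpeg -i gameplay.mkv -c:v libx264 -preset slow -b:v 10M -maxrate 10M \
       -bufsize 20M -c:a aac -f flv rtmp://live.example.com/app/streamkey

# GPU (NVENC) encode -- offloads the work to the video card's dedicated
# encoder block, freeing CPU cores, at some cost in quality per bit:
ffmpeg -i gameplay.mkv -c:v h264_nvenc -b:v 10M -maxrate 10M \
       -bufsize 20M -c:a aac -f flv rtmp://live.example.com/app/streamkey
```

Dropping from `-preset slow` to something like `veryfast`, or lowering `-b:v`, cuts the CPU load dramatically, which is why the choice of a slow preset at a high bit rate reads as a deliberately core-hungry demo configuration.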
 

DrMrLordX

Lifer
Apr 27, 2000
21,632
10,845
136
sorry mods, this is too much

You can probably just pull the fan and put on a passive heatsink, unless you are going to be running NVMe RAID.

There will be aftermarket solutions for this problem, which honestly isn't all that big of a problem. Plus, MSI is advertising their "FROZR" fan tech, which will basically spin the fan down when it isn't needed. That should be on nearly every one of their X570 boards. Why would you focus on this one little thing?

But, Intel still has lower memory latency.

Intel has excellent memory controllers and a monolithic core design. Their mesh also works pretty well for keeping down memory latency. If they had 10nm working, they'd be pretty scary right now.

Is the fan even going to run for most people? It's not like the average user has a bunch of stuff hooked up via pcie. It seems more like a way to deal with edge cases where someone loads up all the slots with drives or GPUs.

Word on the street is that most motherboards will have at least minimal fan operation regardless of NVMe activity. As I stated above, MSI has a solution for this problem . . . allegedly. We'll have to see how well it works. I am confident that aftermarket heatsinks for the Southbridge will be a popular item.

I’ll see what I can do. But it’ll probably be a few days. My kitchen is getting remodeled, between that and work not a lot of free time.

No problem. Take care of the real-world stuff first, naturally.
 

MBrown

Diamond Member
Jul 5, 2001
5,724
35
91
I am hoping that with either the 12 core or 16 core, that I will be able to game and stream on one PC without any lag on the gaming side. When gaming and streaming on my current set up, I would get this input lag while gaming. I am hoping either of these high core CPUs would fix that.
 
  • Like
Reactions: lightmanek

ozzy702

Golden Member
Nov 1, 2011
1,151
530
136
I am hoping that with either the 12 core or 16 core, that I will be able to game and stream on one PC without any lag on the gaming side. When gaming and streaming on my current set up, I would get this input lag while gaming. I am hoping either of these high core CPUs would fix that.

Is this on the 4790k in your profile? If so, yes, it should make a huge difference since even without streaming a 4790k is a sizable bottleneck in many games. Even an 8 core would make a massive difference. It will be interesting to see how the 12 and 16 core game vs the 8 core. I'm extremely interested to see if AMD fixed their latency issue, and it appears that memory support is fantastic. If I didn't already have a 9900k I'd be picking up a new system in July.
 

MBrown

Diamond Member
Jul 5, 2001
5,724
35
91
Is this on the 4790k in your profile? If so, yes, it should make a huge difference since even without streaming a 4790k is a sizable bottleneck in many games. Even an 8 core would make a massive difference. It will be interesting to see how the 12 and 16 core game vs the 8 core. I'm extremely interested to see if AMD fixed their latency issue, and it appears that memory support is fantastic. If I didn't already have a 9900k I'd be picking up a new system in July.

Yes, the 4790K in my sig. It does alright in most games, but I can get hitches sometimes, especially during games like Battlefield. I got my 2080 expecting to upgrade the rest of my system later on when these Ryzens come out.
 
  • Like
Reactions: ozzy702

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
You can use your GPU to stream instead which doesn't hurt quality too much.

I don't know why they chose the slow preset. It creates a far bigger load with a barely noticeable improvement.

It's pretty easy to understand why they picked it when you look at the results it gives. The Intel product ends up looking like hot garbage and the AMD product ends up looking almost impeccable.

Sure it's pointless for the vast majority of people, but the whole point of these presentations is marketing your product.
 

Mopetar

Diamond Member
Jan 31, 2011
7,837
5,992
136
Why would AMD bother to make the desktop IOC both at 14nm and 12nm?

Could be a case of wafer supply agreement (WSA) terms and AMD needing to buy a certain number of GlobalFoundries wafers. That seems the most likely to me, but there could be other reasons as well.
 

HurleyBird

Platinum Member
Apr 22, 2003
2,684
1,268
136
Could be millions of reasons. Perhaps the IOC was supposed to be 14nm originally, but ended up being a limiting factor, or had some other issue, causing AMD to re-spin it on 12nm. But the old 14nm version was deemed good enough for X570.

Given that Epyc IOC is still 14nm, this explanation makes a certain amount of sense.

The fun thing here will be whether we eventually see a 12nm "X575" or some such. Maybe TR will make use of the 12nm version?
 

TheGiant

Senior member
Jun 12, 2017
748
353
106
I’m thinking the chipset fan thing will likely get worked out later on too.
that is what I am counting on....

it gets solved somehow; B550 info is nowhere to be found
Looking back, I can't count the number of chipset fans I had on motherboards, and none made any noise ! (I could hear back then) One of the greatest motherboards of all time, the SR-2 had a fan. AMD is not the first
great that you have this experience, mine is different... every motherboard fan I had failed or became loud after some time

we are in 2019, and while R3000 looks excellent so far, that chipset is a rushed product that should be available later with better tech
maybe that is the reason PCIe 4.0 arrived so long after 3.0: the tech needed to run it
 

PotatoWithEarsOnSide

Senior member
Feb 23, 2017
664
701
106
I wonder if any popular tech reviewers will do a deep dive on the effects of the latest windows scheduling improvements.
Second gen Ryzen is looking like a great value buy.
 
  • Like
Reactions: lightmanek

coercitiv

Diamond Member
Jan 24, 2014
6,201
11,903
136
we are in 2019, and while R3000 looks excellent so far, that chipset is a rushed product that should be available later with better tech
maybe that is the reason PCIe 4.0 arrived so long after 3.0: the tech needed to run it
That chipset is actually a repurposed Matisse IO die: so when it's inside the package it looks excellent to you, but when put outside to service external IO duties it suddenly becomes rushed out and in need of better tech.

I simply cannot understand why it's so hard to wait for actual review information before going all out and criticizing an entire platform just because of a bloody fan. How many NVMe drives running in RAID do you have on your Z170 board anyway?!
 

TheGiant

Senior member
Jun 12, 2017
748
353
106
That chipset is actually a repurposed Matisse IO die: so when it's inside the package it looks excellent to you, but when put outside to service external IO duties it suddenly becomes rushed out and in need of better tech.

I simply cannot understand why it's so hard to wait for actual review information before going all out and criticizing an entire platform just because of a bloody fan. How many NVMe drives running in RAID do you have on your Z170 board anyway?!
yes exactly, it looks excellent because it has cooling designed specifically for it; you won't get a passively cooled CPU as powerful as R3000, and adding PCIe 4.0 to the CPU at a minor power cost makes sense
imagine your new iPad getting a fan for features you don't much need... you wouldn't like it
about that NVMe RAID, sorry, I don't need one; current NVMe drives are IMO performing well on the desktop without RAID
I don't know about your opinion, but those X570 boards look "over"-cooled, over-engineered, and expensive
but yes, I am waiting for the reviews; actually I am very interested in what is needed to get R3000 above 4.5 GHz: boards, cooling, power, etc...
I am pretty sure it will have massive efficiency at about 4-4.2 GHz, let's see in the reviews
 

A///

Diamond Member
Feb 24, 2017
4,352
3,154
136
It's pretty easy to understand why they picked it when you look at the results it gives. The Intel product ends up looking like hot garbage and the AMD product ends up looking almost impeccable.

Sure it's pointless for the vast majority of people, but the whole point of these presentations is marketing your product.

It does, however, show consumers that AMD may be capable of allowing end-users to do more than the 1st or 2nd generation of Ryzen could. I emphasize the words "may be" because I'd rather wait until benchmarks come out before attempting to make a statement that may seem factual.
 

A///

Diamond Member
Feb 24, 2017
4,352
3,154
136
that is what I am counting on....

it gets solved somehow; B550 info is nowhere to be found

great that you have this experience, mine is different... every motherboard fan I had failed or became loud after some time

we are in 2019, and while R3000 looks excellent so far, that chipset is a rushed product that should be available later with better tech
maybe that is the reason PCIe 4.0 arrived so long after 3.0: the tech needed to run it

It was supposed to be available years ago, but the governing body (PCI-SIG) kept altering and extending the spec, which caused delays. By all accounts it should have been available in 2016-2017, if not even earlier.
 

Thunder 57

Platinum Member
Aug 19, 2007
2,675
3,801
136
https://images.anandtech.com/doci/1...Gaming-CPU_Architecture_06092019-page-008.jpg

3rd AGU, unified AGU scheduler, etc. Definitely more SMT-friendly than Zen/Zen+.

A 3rd AGU? Nice, I thought they were saving that for Zen 3. I was surprised how well Zen did with the 4 ALU/2 AGU setup. Another AGU will certainly help.

...I cannot believe what I am reading on an enthusiast forum, and with lots of applause... :( Sometimes I understand why people like theStilt left...

I always wondered why theStilt left. I figured it had to do with backlash with his reported Zen+ power numbers, but I guess it was something else.