AMD Ryzen 5000 Builders Thread


B-Riz

Golden Member
Feb 15, 2011
1,482
612
136

DisEnchantment

Golden Member
Mar 3, 2017
1,601
5,780
136
Thanks — went with 32 GB (2x16 GB) in the end. For the work I do, I easily fill 20 GB instantly with most of my routines, so 32 GB already gets filled up, and 40 GB is not uncommon either, which is over the limit. I will likely end up getting a second pair of 16s to get to 64 GB (the same exact memory only, though).

Avoiding overclocking in general on this workstation; stability is important, or I can lose hours of work. My processing runs in blocks of minutes, so losing a single block, resetting, and starting again is just bad. I just need it to work and be stable.

Got some 3200 MHz CL16 sticks, 32 GB total (2x16 GB).

I've also dropped the 5600X. I'm just going to wait for the 5900X; I need the cores/threads for my work. It's a 50% increase in cost, but it's a 100% increase in cores/threads. I'll live on a lowly Athlon 3000G ($50, lol) until I can get a 5900X.

I picked up an Asus TUF Gaming X570 WiFi board, 32 GB of DDR4-3200 CL16 memory, a Crucial NVMe M.2 SSD, and the Athlon 3000G for $350 shipped (some of it used/refurb). I have the rest already and can reuse it. I'll just survive on the 3000G until I can buy a 5900X at retail (though I'd buy used or something if it's available by then).

Very best,
5900X will be a great dev/workstation CPU.
Stability is foremost for me too.
I had a 3900X, which I migrated to from a 1700X. After realizing how much productivity I get from a 12-core CPU, I bought the 5950X because I know it will make my life easier.
I run lots of VMs on Linux, compile code and such, and keep multiple Eclipse instances open. On Windows I run OrCAD PSpice, MATLAB Simulink, IBM Rhapsody, Visual Studio, VMs, Android Studio, Eclipse, and so on, and doing ALL of it at once is such a breeze. I use around 28-30 GB of RAM on a normal day on Windows; on Linux I use less.

Just a note on RAM: it may not be stable at the rated speed later, when you add another 32 GB. I ran into the same problem.
I struggled for quite a while thinking my PC was simply unstable; then one day I pulled two of the four sticks out of the board to test another build. It turned out my 3900X was very stable with two sticks and only became unstable after I added the other two. For the 5950X I started from day one with a four-stick kit rated as a set of four. Super stable.
Just take note of that.
 

MalVeauX

Senior member
Dec 19, 2008
653
176
116
5900X will be a great dev/workstation CPU.
Stability is foremost for me too.
I had a 3900X, which I migrated to from a 1700X. After realizing how much productivity I get from a 12-core CPU, I bought the 5950X because I know it will make my life easier.
I run lots of VMs on Linux, compile code and such, and keep multiple Eclipse instances open. On Windows I run OrCAD PSpice, MATLAB Simulink, IBM Rhapsody, Visual Studio, VMs, Android Studio, Eclipse, and so on, and doing ALL of it at once is such a breeze. I use around 28-30 GB of RAM on a normal day on Windows; on Linux I use less.

Just a note on RAM: it may not be stable at the rated speed later, when you add another 32 GB. I ran into the same problem.
I struggled for quite a while thinking my PC was simply unstable; then one day I pulled two of the four sticks out of the board to test another build. It turned out my 3900X was very stable with two sticks and only became unstable after I added the other two. For the 5950X I started from day one with a four-stick kit rated as a set of four. Super stable.
Just take note of that.

Interesting, thanks. By "4x kits rated for 4 kits," what does that mean exactly? I would think the handling of the memory would be motherboard-dependent more so than CPU-dependent, since some board topologies are T-topology and some are daisy-chain.

Very best,
 

DisEnchantment

Golden Member
Mar 3, 2017
1,601
5,780
136
Interesting, thanks. By "4x kits rated for 4 kits," what does that mean exactly? I would think the handling of the memory would be motherboard-dependent more so than CPU-dependent, since some board topologies are T-topology and some are daisy-chain.

Very best,
What it means is that if you buy a single 4x16 GB kit rated at 3600 CL16 for 64 GB, it will work fine.
If, however, you buy a 2x16 GB kit rated at 3600 CL16 for 32 GB, and later add another 2x16 GB kit with the same rating, running all four sticks at 3600 CL16 for 64 GB will most likely be unstable. If you are lucky it might work, but it can be flaky, and you may need to lower the clocks if you discover unstable behavior.
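If you do end up mixing kits, it's worth stress-testing before trusting the machine with real work. A proper test is memtest86 or similar from boot; as a rough userspace smoke test, here is a minimal Python sketch (the buffer size and pass count are arbitrary choices, and this catches only gross instability, not the marginal errors a dedicated tester finds):

```python
import numpy as np

def soak_test(size_mb: int = 64, passes: int = 4) -> bool:
    """Write pseudo-random patterns to a large buffer and verify they
    read back intact. Only a smoke test: a real memory tester (e.g.
    memtest86) stresses the controller far harder than this does."""
    n = size_mb * 1024 * 1024 // 8  # number of 64-bit words
    for seed in range(passes):
        rng = np.random.default_rng(seed)
        pattern = rng.integers(0, 2**62, size=n, dtype=np.int64)
        buf = pattern.copy()              # full write pass through DRAM
        if not np.array_equal(buf, pattern):
            return False                  # bit error detected
    return True

print(soak_test())  # expect True on a stable system
```

A system that passes hours of this can still fail under real mixed-read/write load, so treat a pass as necessary rather than sufficient.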
 

MalVeauX

Senior member
Dec 19, 2008
653
176
116
What it means is that if you buy a single 4x16 GB kit rated at 3600 CL16 for 64 GB, it will work fine.
If, however, you buy a 2x16 GB kit rated at 3600 CL16 for 32 GB, and later add another 2x16 GB kit with the same rating, running all four sticks at 3600 CL16 for 64 GB will most likely be unstable. If you are lucky it might work, but it can be flaky, and you may need to lower the clocks if you discover unstable behavior.

Ok, makes sense, thanks. I'm still curious, though; I would think this is more about the memory handling on the motherboard than about the memory or CPU. I definitely get the idea of a pair of sticks being rated on their own under daisy-chain topology, but a T-topology board should be rather happy with multiple sticks as long as they have the same rated timings, the same dies, and so on. But I will definitely keep an eye on this. Stability is important to me, as instability is lost time, and we don't have a renewable source of time yet.

Very best,
 

Dave3000

Golden Member
Jan 10, 2011
1,351
91
91
On a dual-CCD CPU such as the 5900X, does the Windows 10 CPU scheduler use both CCDs simultaneously, or does it try to do as much processing as possible on the first CCD before touching the second? For example, if a game uses 6 cores, does the scheduler divide the workload evenly between the two CCDs of the 5900X, or does it use all 6 cores of the first CCD and leave the other idle?
 

moinmoin

Diamond Member
Jun 1, 2017
4,944
7,656
136
On a dual-CCD CPU such as the 5900X, does the Windows 10 CPU scheduler use both CCDs simultaneously, or does it try to do as much processing as possible on the first CCD before touching the second? For example, if a game uses 6 cores, does the scheduler divide the workload evenly between the two CCDs of the 5900X, or does it use all 6 cores of the first CCD and leave the other idle?
Starting with Windows 10 v1903 the Windows Scheduler tries to keep all threads of a given process within a chosen CCX which reduces latency penalties of inter-CCX communication. Multiple processes are still spread among all available cores as usual.
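If you want to see or override where a process is allowed to run, the OS affinity APIs expose this directly. A Linux-only Python sketch (the comment's CCD-to-logical-CPU mapping is an assumption; check `lscpu --extended` or hwloc for your board's real layout — on Windows you'd use `SetProcessAffinityMask` or psutil instead):

```python
import os

def pin_to_cores(cores: set[int]) -> set[int]:
    """Restrict the current process to the given logical CPUs and
    return the affinity mask actually in effect (Linux only)."""
    os.sched_setaffinity(0, cores)
    return os.sched_getaffinity(0)

# On a 5900X, logical CPUs 0-11 often correspond to CCD0 (6 cores plus
# their SMT siblings) -- an assumption; verify with `lscpu --extended`.
default = sorted(os.sched_getaffinity(0))
print(f"default affinity: {default}")
print(f"pinned to: {sorted(pin_to_cores(set(default[:2])))}")
```

Manually pinning is rarely needed on v1903+, since the scheduler already prefers a single CCX per process, but it's handy for verifying that behavior or for benchmarking one CCD in isolation.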
 

Dave3000

Golden Member
Jan 10, 2011
1,351
91
91
Starting with Windows 10 v1903 the Windows Scheduler tries to keep all threads of a given process within a chosen CCX which reduces latency penalties of inter-CCX communication. Multiple processes are still spread among all available cores as usual.

Does this mean that if a game uses 6 or fewer cores on the 5900X, it will only use half of the chip's memory write bandwidth? And that to really take advantage of the full write bandwidth, a game would need to utilize more than 6 cores on the 5900X (or more than 8 on the 5950X), given how the Windows 10 scheduler works?

One of the reasons I'm considering a 5900X over the 5800X is the full-bandwidth memory controller, but I mainly game and don't really need more than 8 cores; then again, the 5900X is only $100 more than the 5800X for 4 more cores. From what I understand, full memory write bandwidth requires a dual-CCD configuration, which the 5900X and 5950X have but the single-CCD 5600X and 5800X do not. With the way the scheduler works, though, the full write bandwidth wouldn't kick in on the 5900X until more than 6 cores are utilized (more than 8 on the 5950X). Am I correct here, or am I misunderstanding something?
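One way to probe the single-thread side of this question empirically rather than guess is a streaming-write micro-benchmark. A rough Python/numpy sketch (buffer size and rep count are arbitrary; absolute numbers depend on RAM speed, timer resolution, and cache effects, and numpy's fill won't necessarily saturate what hand-tuned non-temporal stores can):

```python
import time
import numpy as np

def write_bandwidth_gbs(size_mb: int = 512, reps: int = 5) -> float:
    """Time a bulk fill of a buffer sized well past L3 (64 MB on a
    5950X) so the writes mostly hit DRAM, and report the best GB/s."""
    buf = np.empty(size_mb * 1024 * 1024 // 8, dtype=np.float64)
    best = float("inf")
    for _ in range(reps):
        t0 = time.perf_counter()
        buf[:] = 1.0                      # streaming write pass
        best = min(best, time.perf_counter() - t0)
    return (size_mb / 1024) / best

print(f"{write_bandwidth_gbs():.1f} GB/s (single thread)")
```

Running one copy pinned to each CCD, then several copies at once, would show whether a lightly-threaded load on one CCD actually leaves write bandwidth on the table.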
 
Dec 6, 2008
149
24
81
I have a modest Asus Prime B450 plus the 5600X, and it runs like a dream (coming from a 3600). The latest BIOS enables PBO2, lots of features, etc. It almost reaches the full 5 GHz on all cores with great temps. These CB20 scores are with just a mild +100 extra PBO and Curve Optimizer at -15, with 16 GB of 3200 CL16 single-channel RAM.




[CB20 screenshot: PBO +200]

[Screenshot: stock temps after decompressing large files (add 2°C+ with PBO)]

Edit: I tried to find a 32 GB 3600 CL16 dual-rank kit (for that extra 5 fps), but it was all overpriced RGB garbage. I ended up getting a cheap 32 GB 3600 CL18 single-rank kit from OLOY (lol), a great deal. I bet it would work better at 3200 CL14, if it's even able to go there, which I doubt.
 

phillyman36

Golden Member
Jun 28, 2004
1,762
160
106
Managed to score an EVGA RTX 3080 FTW. Just need the 5900X and the Asus Dark Hero mobo (I have 3 on pre-order) and I'm ready to build.

Part list: Corsair 680X, Trident Z Neo 32 GB (3600), Seasonic PX 1000 W PSU, Samsung 980 Pro 1 TB, EVGA FTW3 3080, Noctua U14.
 

GoodRevrnd

Diamond Member
Dec 27, 2001
6,803
581
126
Got my SF750 PSU finally. Now I need my 3080 and to figure out which mobo will work best. I got scared off the Strix X570-I by a few reported stability issues on 5000-series chips, and it apparently needs a BIOS flash out of the box, but I don't have another CPU to do that with. The Gigabyte B550 Pro AX seems like the next best option, with only minor fitment issues reported in my case.

Ryzen 5000 natively addresses the GPU and one NVMe drive at PCIe 4.0, so there's limited downside to B550?
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Existing Parts:
ASRock B450M Pro4
Radeon 5700 reference
Corsair RM650X
G.skill Trident Z RGB 2x8GB 3200CL16
Ryzen 3600
CoolerMaster Hyper 212 LED tower cooler

To be swapped in:
Zen 3 chip

Use case: multiple electronic medical records (primarily single-threaded dedicated apps in Citrix), other browser-based electronic medical records, photo editing (PS, GIMP), moderate-duty Excel workloads, statistical analysis in JMP and R that is fairly parallel and multi-threaded (though not time-consuming enough on its own to justify a 5900X or 5950X), and some gaming — some by me, some by my son (Kerbal, Civ 6, MSFS2020, Minecraft, etc., on a 1440p75 primary monitor).

Reason for upgrade is the substantial uplift in single-threaded performance on the Zen 3 chips, which will likely benefit me in most of my workloads. Not that my current 3600 is slow, but it does make its presence known from time to time, and with the statistical analysis stuff, it would be really nice to run through more operations more quickly.

Most of this stuff is not too taxing on the GPU, and the one thing that is - FS2020 - I'm going to take a wait-and-see approach since my son and I aren't likely to be doing it as a major hobby, more of a fun diversion at this time.

I get the sense bumping up the RAM to 3600 or 3800 MT/s with same latency might be a nice upgrade -- though I'm not sure it's a "necessary" one, the prices right now are quite good and I'd hate to miss a good window.

Questions:
- $ per core for the 5900X is so much better than the 5800X that, if I really wanted to pick something heavier-duty beyond the 5600X, I'm strongly considering skipping up to a 5900X -- are there any likely VRM issues, cooling issues, or other considerations in my current setup if I wanted to do so?
- Is the RAM upgrade something that'll create a noticeable improvement in lightly-threaded activities?
- Any other considerations?
 

DrMrLordX

Lifer
Apr 27, 2000
21,617
10,826
136
@amrnuke

I had once feared that the 3900X would offer up serious VRM problems for cheaper boards, but that threat never really materialized. You had to hit pretty low on the AM4 motherboard list to find VRMs that would really overheat with a chip like that in the socket. It was possible, but rare.

The 5900X isn't going to be any worse.

edit: Looks like the B450M Pro4 has a weaker VRM than the B450 Pro4. As long as it's cooled you should be fine.
 

MalVeauX

Senior member
Dec 19, 2008
653
176
116
Well, I got some parts in yesterday, it all came fast. Except a CPU. :D sigh... and some parts I already have waiting:

ASUS TUF X570 Gaming WiFi motherboard
TG Dark Za DDR4 32 GB (2x16 GB) 3200 MHz CL16-18-18-38 RAM
Crucial 500 GB NVMe M.2 SSD
Crucial 250 GB SATA SSD
CM Hyper 212 Black Edition cooler
Corsair TX V2 750 PSU
NVIDIA GTX 750 Ti (just for driving 3 displays, no gaming)
NZXT Whisper case

Waiting on a lowly Athlon 3000G to arrive in two weeks to hold me over, while waiting for a 5900X to become available for purchase somewhere. So... like February? March? Sigh!

I'm definitely skipping the whole 3000 series and everything in the 5000 series below the 5900X on this workstation; I've not had an upgrade in 8 years now.

Very best,
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,400
2,436
146
Well, last night I started playing with PBO2 undervolting in the BIOS. So far I am using a -18 all-core offset on my 5800X, which does seem to improve temps and performance a bit. I will keep testing and tweaking, plus I have a new cooler to install later. I am going to use a Scythe Ninja 5, while opening up the top of my case and adding another exhaust fan for better airflow.

I hope the new cooler is better and quieter than this old Corsair unit.
 

Kenmitch

Diamond Member
Oct 10, 1999
8,505
2,249
136
@amrnuke

I had once feared that the 3900X would offer up serious VRM problems for cheaper boards, but that threat never really materialized. You had to hit pretty low on the AM4 motherboard list to find VRMs that would really overheat with a chip like that in the socket. It was possible, but rare.

The 5900X isn't going to be any worse.

It really depends on one's workflow, case airflow, and the ambient temperature of the room.

I guess a person could reference some of the Hardware Unboxed videos where he does push the higher-core offerings somewhat. I'd imagine the 5xxx series would have similar results.

A couple of snips of B550s and X570s for quick reference. The higher-end boards usually have the best temps.

[B550 VRM temps chart]

[X570 VRM temps chart]

I view the above examples as most likely not even the worst-case scenarios.
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Thanks guys. I have Corsair LL120 (43.25 cfm max) - 2 intake in the front, 1 top exhaust (just above and slightly rear of the center of the CPU), and 1 rear exhaust. (Mostly because my 7 year old son loves RGB, otherwise I'd have gone black/brown Noctua-style for better performance / noise.)

I keep the house around 72-75F. I also leave the bottom edge of the plexiglass cover slightly cracked open, and dust the PC and vent sites out every couple of weeks. The back of the case behind the motherboard gets cleaned out once a month.

I won't be max-loading for more than 15-20 minutes at a time for some of the statistical stuff, so hopefully I'm in the clear. Thanks for digging up some info @Kenmitch and @DrMrLordX !
 

Det0x

Golden Member
Sep 11, 2014
1,028
2,953
136
Asus has released a new BIOS, version 3003, for the ROG Crosshair VIII Hero (Wi-Fi) @ https://www.asus.com/supportonly/ROG CROSSHAIR VIII HERO (WI-FI)/HelpDesk_BIOS/
But be warned: something strange seems to be going on with the performance compared to BIOS 2702.

L3 bandwidth looks to have doubled in AIDA64, but people are reporting lower scores in benchmarks (at least in Cinebench).

BIOS 3003:

CB20 ST: from 650 pts (BIOS 2702) down to 617.


Read more in this thread: https://www.overclock.net/threads/a...erclocking-discussion-thread.1728796/page-195
 

Tup3x

Senior member
Dec 31, 2016
959
942
136
Just put my build together... but no matter what, it doesn't POST. It seems the latest BIOS flashed properly. There aren't even any error LEDs, so it seems to be almost POSTing. Not sure what to try tomorrow.
 