Question Performance hit going from 8700k to 3900x with same GPU


viivo

Diamond Member
May 4, 2002
3,344
32
91
This relates strictly to gaming performance, not productivity and other tasks.

Most of my gaming time is spent playing an MMO - FFXIV - which isn't exactly a demanding game, so I’m quite surprised that my framerate has become unplayable. With an 8700k and the same 5700 XT I was able to maintain 100FPS nearly 100% of the time, but since switching to the 3900X my FPS maxes out in the mid-70s to low 80s with frequent drops and spikes.

Windows 10 x64 clean install, all the latest drivers and BIOS. CPU and GPU are properly cooled with third-party solutions and neither is throttling. FreeSync is not an option due to the nauseating amount of flashing and flickering it causes on my monitor.

I doubt this is normal. Anyone know a potential cause?

Specs:
3900x
ASUS TUF X570
32GB C16 3200 @ 3200 / fclk 1600
5700 XT, 2560x1440 @ 100hz
OS and games on separate NVMe drives
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Is it normal for HWINFO to report 95% total CPU usage ("Max CPU/Thread usage" is 100%) from just loading Steam after booting into Windows? I know it's not a real 95% workload but it's the highest I've seen with this CPU and wanted to ask.
edit: Screenshot - View attachment 22530

I've had the average column disabled, but max has been 4600 for core 0, 4500 for cores 1-5, and 4300 for cores 6-11.
Here's an example of why I started the thread. This is a lightly crowded area where I would have been getting a solid 100FPS before. 77FPS feels like a stuttery slideshow at 100Hz with no VRR.
View attachment 22529
I'm not sure that's normal for Steam but I'd have to try at home and see.

In your shoes I'd really fiddle around a lot. This game seems like a great hobby for you, and I'd make sure it runs as well as possible.

Some things I'd mess around with:
- run your RAM at the highest MT/s and lowest CL timings you can manage, ideally 4 sticks if you have them - the Ryzen DRAM Calculator can help here
- consider disabling SMT, disabling one chiplet, and then disabling both SMT and one chiplet (yes... taking your 3900X to 6c/6t!) to see how each affects your FPS
  - (identifying your "good" and "bad" chiplet largely comes down to peak frequency at default settings; AMD bins these at the factory and flags the top core, so it should be easy to find)
- if things are better with SMT and/or one chiplet disabled, consider a mild overclock with the added thermal headroom, if you're comfortable with it

IMO some combination of the above should let the game gain a substantial, hopefully double-digit percentage, frame rate increase. My worry would be stuttering, which might, depending on the game, get worse with fewer cores. Since the 8700K is 6c/12t, though, you may be able to get away with disabling either a chiplet OR SMT to maintain 12 threads and keep stuttering in line.
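If you'd rather not reboot into the BIOS for every combination, you can roughly approximate the "one chiplet" test in software by pinning the game to a single CCD via CPU affinity. A minimal sketch using psutil; the process name and the CPU-to-CCD mapping are assumptions (FFXIV's DX11 client is typically ffxiv_dx11.exe, and on a 3900X with SMT on, logical CPUs 0-11 usually map to CCD0):

```python
# Approximate "disable one chiplet" by restricting the game to CCD0.
# Assumptions: 3900X with SMT on, logical CPUs 0-11 = CCD0, 12-23 = CCD1;
# process name ffxiv_dx11.exe. Requires: pip install psutil
import psutil

TARGET = "ffxiv_dx11.exe"  # assumed name of the FFXIV DX11 client
CCD0 = list(range(12))     # first chiplet's logical processors

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] and proc.info["name"].lower() == TARGET:
        proc.cpu_affinity(CCD0)  # pin the game to CCD0 only
        print(f"Pinned PID {proc.pid} to logical CPUs {CCD0}")
```

It's not identical to disabling the chiplet in the BIOS (the other CCD still handles interrupts and background threads), but it's a quick way to see whether cross-CCD scheduling is costing frames.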

Sounds really interesting to me. I'm by no means an expert in these matters, but hopefully others can chime in with ideas as well.
 

ondma

Platinum Member
Mar 18, 2018
2,726
1,291
136
Thanks so much for the responses and helpful data. It does look like it's the engine, at least in the case of FFXIV.

Are you sure you aren't looking at just the Minimum and/or Current columns? The Maximum column shows boosts of 4600, 4500, and 4300, which I assume is normal.
Yes, I was looking at the "current" column, because that is what the computer is actually running. The "maximum" column only shows what the max frequency has been since the counter was last reset. It could have been only a brief boost when starting up or running some other task.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
CPU usage looks quite low (I think; it's hard to see all the numbers), but you are only boosting to 3.7 GHz. This seems like a reasonable clock speed for a heavy load or stress test, but shouldn't you be getting a higher boost than that (at least on a few cores) in a game that isn't using the CPU heavily?
You're right, because that's actually even lower than the base clock.
 

Hitman928

Diamond Member
Apr 15, 2012
5,374
8,219
136
You're right, because that's actually even lower than the base clock.

If you look at the effective clock rows as well as the CPU temperature, it appears that when he took the screenshot he was idling at desktop and the CPU was in powersave mode.
 

lobz

Platinum Member
Feb 10, 2017
2,057
2,856
136
If you look at the effective clock rows as well as the CPU temperature, it appears that when he took the screenshot he was idling at desktop and the CPU was in powersave mode.
Sorry, it's a bad habit of mine: I react to posts as I read them. When I'm reading through multiple pages, it's not easy to always go to the very end and then come back, even though I should do exactly that :)
 

Chicken76

Senior member
Jun 10, 2013
259
41
91
@viivo
  • are you running "game mode" in Ryzen Master? If so, disable it.
  • which power profile are you using in Windows? If you installed the latest AMD drivers, you should use the AMD-optimized one
  • have you messed around in your BIOS? If so, I suggest resetting it to defaults and only modifying the absolutely necessary things, like which drive to boot from
  • do you have an audio adapter in your system with an up-to-date driver? You say you play without sound, but some game engines need an audio device to sync video to audio. You don't need to actually plug anything in, just have it in the system.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,389
10,072
126
do you have an audio adapter in your system with an up-to-date driver? You say you play without sound, but some game engines need an audio device to sync video to audio. You don't need to actually plug anything in, just have it in the system.
This. They are used as a timing source by the engine.
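If you want to quickly confirm that Windows is exposing at least one audio endpoint, here's a small check; using the python-sounddevice package is my assumption, and any device-enumeration tool would do:

```python
# List the audio output endpoints Windows exposes; some game engines
# want at least one present to use as a timing source.
# Requires: pip install sounddevice
import sounddevice as sd

outputs = [d for d in sd.query_devices() if d["max_output_channels"] > 0]
print(f"{len(outputs)} output device(s) found")
for d in outputs:
    print(" -", d["name"])
```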
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
I don't play that game in particular, but funnily enough I had a very similar experience overall. I had an 8086K OC build and a 2700X build. When Ryzen 3000 launched, I sidelined the 8086K to be replaced with a 3900X as my 'everything' rig instead of having two. Well, I couldn't find one at first, so I built a 3700X instead. This was with just a 1080 Ti. Gaming performance dropped substantially all over the place. I used 3733 RAM set at 3600 CL15; some titles were fairly close while others were dramatically worse at my preferred settings (max textures and details, low AA/shadows to pursue the high-144Hz range where possible).

I eventually upgraded to the 3900X, which helped 0% in gaming but is an otherwise incredible CPU. Finally I moved to a 5+GHz 9900KS build with a 2080 Ti, which sounds great, but it's honestly only for gaming; I still prefer the 3700X for normal use as it probably uses a third of the power, though I've since stripped it down to just an RX 550 lol.
 

Excelsior

Lifer
May 30, 2002
19,048
18
81
I don't play that game in particular, but funnily enough I had a very similar experience overall. [...]

What games and resolution do you run out of curiosity?
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
What games and resolution do you run out of curiosity?

Hi! I have two G-Sync displays: one 144Hz 2560x1440 for competitive FPS and more *twitch*-style gaming, and one 120Hz 3440x1440 for single-player, more cinematic games that support 21:9.

Personally, if there is a trade-off to be made, I find framerate more important than some visual effects that are less noticeably different in motion. Shadows are a big one for me, as well as excessive AA. Eye of the beholder and all that, but lower AA and shadows at say 130-140fps look and feel better to me than say 60-90fps at max AA and ultra shadows.

And I find the differences can indeed often be that large. GPU reviews show lots of 1080p Ultra, 1440p Ultra, and 4K Ultra framerates, but what I observe is that you can often achieve the '1080p' level framerates or better at 1440p with mixed settings: keeping textures, draw distance, and things like AF and occlusion maxed while finding some others to ease off on.

For me personally, that shifts the bottleneck from the GPU back to the CPU in many instances: even with a heavily OC'd 9900KS, I still cannot reach 144fps in a lot of titles with completely stripped, minimal settings. And of course settings that low are hideous to look at and only useful for seeing where the CPU limit is for a particular title.

The examples I noticed most severely were the back-to-back AC entries: Origins and Odyssey. They are not alone, however; there are a number of titles where it's difficult to lock 144, such as Hitman 2, Far Cry 5/New Dawn, Metro Exodus, Control, etc., and I can imagine Cyberpunk 2077, AC Valhalla, the PC release of Horizon Zero Dawn, etc. will only continue this trend.

It's a very esoteric problem, as it's mostly a philosophical and subjective consideration. Many people will be totally fine with 60fps; hell, many are OK with 30 and dips on consoles. I prefer to push further when I can, and that's led me to dumping a series of very good CPUs over the recent past, either by replacing them or using them for non-gaming roles, including the 8086K, 2700X, 3700X, and 3900X, as they all fell short of my goals. The 9900KS follows this trend, although at 5.2/5.3GHz I don't think I can currently invest in any observably superior option until Zen 3 and/or a future Intel release moves this bottleneck further up. I do only have 4133MHz RAM currently, well 4000 @ 4133, so there is some potential to perhaps shoot for 4400.

3000-series Nvidia will just make this more obvious: raising the GPU bar will probably make the typical 1440p Ultra comparisons of tomorrow look like the 1080p Ultra comparisons of today, and mixed-settings framerate chasers such as myself will only slam into CPU walls all the sooner.

All of this is mostly academic if someone is running 4K, or any form of 60Hz, or any GPU that hits its bottleneck earlier. I'd even say that's probably 99% of PC gamers, period, if not 99.9% by deeper analysis. The number of hardware configs practically chasing 144Hz in AAA gaming is vanishingly small, and I have to imagine a significant portion of those are connected to some form of 60Hz display, or running 4K, which even with a 2080 Ti or Titan RTX is a hard ask. It's why I looked at the 4K/144Hz monitors: although tempting to a degree, I don't think I would enjoy having to cut settings far enough for my 2080 Ti to even hit 100fps in many titles. I'd imagine it would be amazing for eSports though, as a lot of those players run scalable titles with deliberately stripped settings, which WOULD in fact make 4K/144 possible. Another niche of a niche of a niche.
 

viivo

Diamond Member
May 4, 2002
3,344
32
91
Thanks again for the many helpful responses.

In the interest of SCIENCE I acquired another 3900X, and I'm so glad I did that I don't mind losing money selling the other one. With the new one, temps and voltages are down, all-core boost clocks are up, and there is a very noticeable difference in performance. I'm not claiming I got a chip from the best part of Wafer Town, just that the first one must have been one of the worst examples of a 3900X.

Now if only UPS could get here before 8:30PM for once so I can see how big a difference some good B-die makes. In addition to the GSkill TridentZ C15 3600, I also ordered some Patriot Viper C19 4400. Unless the reports of Patriot using low-binned B-die are true for all modules, it should be the better of the two.
 

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
C15 3600 has a first word latency of 8.33ns; C19 4400 is at 8.63ns. For Zen 2 the sweet spot is 3800 CL14, or CL14 + GDM with custom secondaries and tertiaries.
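For anyone following along, first word latency is just CAS latency converted from cycles to nanoseconds; since DDR transfers twice per clock, a cycle lasts 2000 / MT/s nanoseconds. A quick sketch comparing the kits discussed in this thread (values rounded to two decimals, so they may differ from the truncated figures above by 0.01):

```python
# First word latency (ns) = CL * 2000 / MT/s, since a DDR4 cycle
# lasts 2000 / MT/s nanoseconds (two transfers per clock).
def first_word_latency_ns(mts: int, cl: int) -> float:
    return cl * 2000 / mts

for mts, cl in [(3600, 15), (4400, 19), (3800, 14)]:
    print(f"DDR4-{mts} CL{cl}: {first_word_latency_ns(mts, cl):.2f} ns")

# DDR4-3600 CL15: 8.33 ns
# DDR4-4400 CL19: 8.64 ns
# DDR4-3800 CL14: 7.37 ns  (the Zen 2 sweet spot mentioned above)
```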
 

viivo

Diamond Member
May 4, 2002
3,344
32
91
C15 3600 has a first word latency of 8.33ns; C19 4400 is at 8.63ns. For Zen 2 the sweet spot is 3800 CL14, or CL14 + GDM with custom secondaries and tertiaries.

That’s what I was referring to. I would think that between C19 4400 and C15 3600 the former has a better chance of doing C14 3800.
 

DrMrLordX

Lifer
Apr 27, 2000
21,700
10,976
136
@viivo

Your Patriot DIMMs seem to have similar XMP settings to my Corsair DDR4-4400. Mine never did better than DDR4-3733 14-16-14-28 1T. Hopefully your kit works out a bit better.
 

TheGiant

Senior member
Jun 12, 2017
748
353
106
Hi! I have two G-Sync displays: one 144Hz 2560x1440 for competitive FPS and more *twitch*-style gaming, and one 120Hz 3440x1440 for single-player, more cinematic games that support 21:9. [...]
QFT
I have the same experience as the OP, with FPS dropping in gaming-critical situations on my 3900X, versus the 9900K I had for a while last year.
Current reviews are misleading: they test general gameplay rather than critical gameplay situations, and with pre-built test runs. Even the 1% lows are not representative.
I have the same approach; I value smoothness over a few extra details. However, the 5700 XT I have is not fast enough for 1440p.
Oh, and on consoles, your statement couldn't be more true: 30fps with, as you politely put it, dips is unplayable. I am glad the console world is finally moving to a 60fps target, now possible with the Zen 2 CPUs.
It's a strange situation right now: the 10900K is out, but the 3080 Ti is not. I don't like waiting :)
 
  • Love
Reactions: Arkaign

JoeRambo

Golden Member
Jun 13, 2013
1,814
2,105
136
That’s what I was referring to. I would think that between C19 4400 and C15 3600 the former has a better chance of doing C14 3800.

C14 3800 is a first word latency of 7.36ns, a tough call for both of them. With Ryzen you don't need high memory clock binning, just the tightest possible latency binning. I had very good success with 3200 CL14 (first word latency of 8.75ns). I think at ~1.4V they can all clock to 3800; the timings are just the question.
 

viivo

Diamond Member
May 4, 2002
3,344
32
91
In case there are others in a similar situation, I’ll post my entirely subjective and anecdotal results that may or may not be of any help.

View attachment 23013

Early results with the Patriot kit aren't the most promising, but that's probably because I haven't spent the time optimizing all the timings. At least Thaiphoon Burner confirms the modules are B-die. Ryzen DRAM Calculator's suggestions are very optimistic, so I'll have to use it as a guide while engaging in trial and error. A quick and dirty C14 3600 at 1.5V ran fine, but I'd like to try getting stability at a lower voltage.

If I'm unable to get the Patriot to do C14 at 3600 or better with decent sub-timings, I'll crack open the GSkill.

Any timing suggestions are also greatly appreciated. I don't have the time or enthusiasm I once did to spend hours tweaking settings.

edit: So much for the GSkill. No POST despite multiple CMOS clears and reseating the DIMMs. Motherboard DRAM debug LED stays lit.

edit 2: At last, back to a relatively consistent 99FPS in FFXIV. I disabled SMT, and RAM is currently only at C15 3600, but I'll tweak further when I get time. Otherwise things look great.
 
Last edited:
  • Like
Reactions: lightmanek

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,627
14,618
136
In case there are others in a similar situation, I'll post my entirely subjective and anecdotal results that may or may not be of any help. [...]

edit: So much for the GSkill. No POST despite multiple CMOS clears and reseating the DIMMs. Motherboard DRAM debug LED stays lit.
I have GSkill in every one of my Ryzens, including three 3900Xs.
 

Arkaign

Lifer
Oct 27, 2006
20,736
1,377
126
Zen 2 overall seems way friendlier with various DDR4 kits vs Zen+ and especially OG Zen. I don't know if it's manufacturing variance or not, but in a few cases I literally had RAM listed on the QVL that didn't work, and far more cases, especially with Zen 2, where RAM NOT on the QVL worked like a boss.

Strange that your GSkill isn't firing up there. Maybe see if Mark is willing to share some model numbers so you can match his recommendation based on his experience.
 

DrMrLordX

Lifer
Apr 27, 2000
21,700
10,976
136
@viivo

(Screenshot: memoc.png)

Those are my settings on Corsair DDR4-4400 when running DDR4-3666 14-16-14-28 1T. I got those timings by setting Ryzen Memory Calculator to a higher clock than my target, then lowering clocks until it was stable. These settings will run @ DDR4-3733, but not with 100% stability - sometimes it just bombs out.

Be sure to pay attention to every detail of what Ryzen Memory Calculator is trying to tell you - including VDDP and VDDG!
 

Rigg

Senior member
May 6, 2020
472
979
136
Any timings suggestions are also greatly appreciated. I don’t have the time or enthusiasm I once did to spend hours tweaking settings.


I've been playing around with B-die kits on my C8H and Prime-P X570 boards. I have 2 of the Trident Z 3600 CL15 kits and 2 Patriot Viper Steel 4000 CL19 kits. Using 3900X and 3600 CPUs, I didn't really see much difference between the memory kits in either 2- or 4-DIMM configs. The best I could achieve daily-stable on any of these configs (even while pushing RAM and SOC voltages) was 3800 CL16. I'm not sure 3800 CL14 is realistic without a really good IMC and/or an ITX board. At least it hasn't been possible with my mobo/CPU/RAM combos.

I have the below settings daily-stable with the recommended voltages. This is the generic B-die A0/B0 fast preset. I exported my Trident Z's XMP via Thaiphoon Burner and tried using "import XMP" (both fast and safe settings), but couldn't get the kit to POST at the suggested 3800 CL14 settings even with Gear Down Mode enabled.

The below passes 1000% per thread in the DRAM Calculator memtest and also passes memtest86 (4-pass default test) on all of my combinations of RAM kits (2 & 4 DIMM), motherboards, and CPUs. I did use a slightly higher RAM voltage of 1.375V and medium LLC on the SOC.

(Screenshot: Ryzen RAM.png)
 

Staples

Diamond Member
Oct 28, 2001
4,952
119
106
Nearly all MMOs are optimized to run best on Intel CPUs and nVidia GPUs. As a matter of fact, most games of any type are.
Yeah, I've always found it strange that console ports to PC prefer Intel/NVIDIA when the base game was developed for AMD hardware on both the CPU and GPU side. Maybe the per-platform ports of game engines are much more different than I've always thought.
 
  • Like
Reactions: viivo