Question Performance hit going from 8700k to 3900x with same GPU

viivo

Diamond Member
May 4, 2002
3,344
32
91
This relates strictly to gaming performance, not productivity and other tasks.

Most of my gaming time is spent playing an MMO - FFXIV - which isn't exactly a demanding game, so I’m quite surprised that my framerate has become unplayable. With an 8700k and the same 5700 XT I was able to maintain 100FPS nearly 100% of the time, but since switching to the 3900X my FPS maxes out in the mid-70s to low 80s with frequent drops and spikes.

Windows 10 x64 clean install, all latest drivers and BIOS. CPU and GPU are properly cooled with third-party solutions and neither is entering throttling. FreeSync is not an option due to my monitor’s nauseating amount of flashing and flickering.

I doubt this is normal. Anyone know a potential cause?

Specs:
3900x
ASUS TUF X570
32GB C16 3200 @ 3200 / fclk 1600
5700 XT, 2560x1440 @ 100hz
OS and games on separate NVMe drives
 

Markfw

Moderator Emeritus, Elite Member
May 16, 2002
25,542
14,496
136
The CL16 RAM @ 3200 could hurt you a little. 3600 CL15 or 16 would be better for gaming, or 3200 CL14.

Just a few FPS.

Edit: It just struck me. By default that RAM will run at 2133. Do you have it set to run at its rated speed in the BIOS?
 

Hitman928

Diamond Member
Apr 15, 2012
5,243
7,792
136
Can you run the benchmark and report results? Also maybe run a couple of other game and CPU benchmarks to see if issues pop up elsewhere?
 
  • Like
Reactions: viivo

viivo

Diamond Member
May 4, 2002
3,344
32
91
Nearly all MMOs are optimized to run best on Intel CPUs and nVidia GPUs. As a matter of fact, most games of any type are.

The console versions of FFXIV run exceptionally well on those AMD Jaguar cores. I don't think it's an optimization issue.

I do think it may be a RAM and possibly GPU issue. I just tried NieR and Frostpunk, games that should run well on most hardware, with equally disappointing results. Frostpunk was a stuttering mess. NieR seemed ok but with noticeable frame drops. I just ordered some 3600 C15 memory so I'll see what impact that has. Unfortunately the 5700 XT isn't going anywhere until GPU stock and prices come back down to earth.

edit: Took out two DIMMs for the hell of it and game performance actually appears to have improved.
 
Last edited:

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
This relates strictly to gaming performance, not productivity and other tasks.

Most of my gaming time is spent playing an MMO - FFXIV - which isn't exactly a demanding game, so I’m quite surprised that my framerate has become unplayable. With an 8700k and the same 5700 XT I was able to maintain 100FPS nearly 100% of the time, but since switching to the 3900X my FPS maxes out in the mid-70s to low 80s with frequent drops and spikes.

Windows 10 x64 clean install, all latest drivers and BIOS. CPU and GPU are properly cooled with third-party solutions and neither is entering throttling. FreeSync is not an option due to my monitor’s nauseating amount of flashing and flickering.

I doubt this is normal. Anyone know a potential cause?

Specs:
3900x
ASUS TUF X570
32GB C16 3200 @ 3200 / fclk 1600
5700 XT, 2560x1440 @ 100hz
OS and games on separate NVMe drives
Some out-of-the-box thoughts:

- With the change to a new motherboard, how is your internet going? Can you stream movies at 4K or high resolution without stuttering? What is your internet speed like with the new motherboard? Are those drivers updated?
- What are your numbers like in other games? Can you run some game-centric benchmarks to see what the issue may be? This may help delineate whether it's game-specific or a broader gaming issue.
- What are your 3900X cores doing during this? Can you post some HWInfo data regarding cores / temps / peak clocks / etc while playing?
 
  • Like
Reactions: viivo

viivo

Diamond Member
May 4, 2002
3,344
32
91
Thanks for all the replies and help.

Some out-of-the-box thoughts:

- With the change to a new motherboard, how is your internet going? Can you stream movies at 4K or high resolution without stuttering? What is your internet speed like with the new motherboard? Are those drivers updated?
- What are your numbers like in other games? Can you run some game-centric benchmarks to see what the issue may be? This may help delineate whether it's game-specific or a broader gaming issue.
- What are your 3900X cores doing during this? Can you post some HWInfo data regarding cores / temps / peak clocks / etc while playing?

Internet speeds and connection quality are fine. I'd prefer to use my network card, but the only compatible slot on this motherboard is blocked by my GPU cooler. Otherwise the onboard seems just as good. Since I often play with no sound, onboard and all other audio sources are disabled.
Performance in NieR and Frostpunk was disappointing. I'm installing some other more hardware-intensive games like Outer Worlds and No Man's Sky to test.

HWINFO64 after 2 hours of FFXIV:
[two HWiNFO screenshots attached]
 
Last edited:

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
edit: Took out two DIMMs for the hell of it and game performance actually appears to have improved.

I'm not surprised. Memory latency with Ryzen is horrible, even when only using two 8GB sticks, at least compared to Intel CPUs that use the ring bus (which is all of them that aren't based on Xeons). The DDR4-3600 @ CL15 you ordered should greatly improve your processor's RAM performance, since the RAM's latency goes down a lot between 3200 @ CL16 and 3600 @ CL15.
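
For a rough sense of the numbers: first-word CAS latency in nanoseconds is just the CAS count divided by half the transfer rate. A quick back-of-the-envelope sketch (Python, using the two kits discussed above):

[CODE]
# First-word CAS latency in ns = CL / (MT/s / 2) * 1000
def cas_latency_ns(cl, transfer_rate_mts):
    return cl / (transfer_rate_mts / 2) * 1000

print(f"DDR4-3200 CL16: {cas_latency_ns(16, 3200):.2f} ns")  # ~10.00 ns
print(f"DDR4-3600 CL15: {cas_latency_ns(15, 3600):.2f} ns")  # ~8.33 ns
[/CODE]

On top of that, 3600 lets the fabric clock run 1:1 at 1800 MHz instead of 1600, which is usually where the bigger Ryzen gains come from.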
 
  • Like
Reactions: krumme

Hitman928

Diamond Member
Apr 15, 2012
5,243
7,792
136
The console versions of FFXIV run exceptionally well on those AMD Jaguar cores. I don't think it's an optimization issue.

I do think it may be a RAM and possibly GPU issue. I just tried NieR and Frostpunk, games that should run well on most hardware, with equally disappointing results. Frostpunk was a stuttering mess. NieR seemed ok but with noticeable frame drops. I just ordered some 3600 C15 memory so I'll see what impact that has. Unfortunately the 5700 XT isn't going anywhere until GPU stock and prices come back down to earth.

edit: Took out two DIMMs for the hell of it and game performance actually appears to have improved.

Can you give some benchmark numbers with the 2 vs 4 sticks? You shouldn't see that drastic of an improvement by pulling two sticks out unless running the 4 sticks was forcing you to run at lower RAM speeds.
 

myocardia

Diamond Member
Jun 21, 2003
9,291
30
91
Can you give some benchmark numbers with the 2 vs 4 sticks? You shouldn't see that drastic of an improvement by pulling two sticks out unless running the 4 sticks was forcing you to run at lower RAM speeds.
I would guess that just going from being forced to run a 2T command rate with four sticks to 1T with two sticks might very well make a noticeable difference, given that you didn't need a huge improvement in the first place.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,400
2,437
146
Hard to believe the memory latency made that much difference, but I suppose it is possible. Did you run memtest to make sure things were stable?
 

amrnuke

Golden Member
Apr 24, 2019
1,181
1,772
136
Thanks for all the replies and help.



Internet speeds and connection quality are fine. I'd prefer to use my network card, but the only compatible slot on this motherboard is blocked by my GPU cooler. Otherwise the onboard seems just as good. Since I often play with no sound, onboard and all other audio sources are disabled.
Performance in NieR and Frostpunk was disappointing. I'm installing some other more hardware-intensive games like Outer Worlds and No Man's Sky to test.

HWINFO64 after 2 hours of FFXIV:
[two HWiNFO screenshots attached]
What were peak and average clocks for the CPU?

It may be that disabling one chiplet helps things out a lot, due to inter-chiplet latency and the latency-dependence of some games.
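
If you want to test that idea without touching the BIOS, one option is to pin the game to a single CCD with an affinity mask. A rough sketch using psutil (the executable name is the usual FFXIV DX11 client and the logical-CPU numbering assumes the typical Windows layout on a 3900X; adjust both for your setup):

[CODE]
import psutil

# On a 3900X, Windows typically exposes 24 logical processors;
# CCD0 is usually logical CPUs 0-11 (6 cores plus their SMT siblings).
CCD0 = list(range(12))

# Example: restrict the FFXIV DX11 client to the first chiplet only.
# (Run elevated if the game process belongs to another user/session.)
for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == "ffxiv_dx11.exe":
        proc.cpu_affinity(CCD0)
        print(f"Pinned PID {proc.pid} to CCD0")
[/CODE]

Disabling the second CCD (or SMT) in the BIOS is the cleaner experiment, but affinity is reversible on the fly, so it's an easy first check.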
 
  • Like
Reactions: viivo

viivo

Diamond Member
May 4, 2002
3,344
32
91
Is it normal for HWINFO to report 95% total CPU usage ("Max CPU/Thread usage" is 100%) from just loading Steam after booting into Windows? I know it's not a real 95% workload but it's the highest I've seen with this CPU and wanted to ask.
edit: Screenshot attached (hwinfo.png)


What were peak and average clocks for the CPU?

It may be the case that disabling one chiplet might help things out a lot due to inter-chiplet latency and the latency-dependency of some games.

I've had the average column disabled, but max has been 4600 for core 0, 4500 for 1-5, and 4300 for 6-11.




Here's an example of why I started the thread. This is a lightly crowded area in which I would have been getting a solid 100FPS before. 77FPS feels like a stuttery slideshow at 100Hz with no VRR.
[FFXIV screenshot attached]
 
Last edited:

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
The console versions of FFXIV run exceptionally well on those AMD Jaguar cores. I don't think it's an optimization issue.


FFXIV likes Intel. Interestingly enough, the 9700k actually does better here than the 9900k . . . probably due to SMT.

edit: that's FFXV! Ha ha! Ha.

Have you tried disabling SMT on your 3900x?
 
Last edited:
  • Like
Reactions: Tlh97 and viivo

JasonLD

Senior member
Aug 22, 2017
485
445
136

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
The only relevant data I can find is the Shadowbringers benchmark. Here's what a 9900k + 2080Ti can score (one sample):


Running bone stock, my 3900x + Radeon VII gets 19740. Let's see where I can get it with my usual gaming settings . . . hold on.

edit: Setting my 3900x to a static clock speed of 4.4 GHz increased the score to 20086 without changing GPU settings. I might try setting affinity next to help it avoid the logical cores created by SMT.

edit edit: Watching GPU-Z, the game seems to fully load the GPU in only two scenes during the entire benchmark (the first fight scene and the final fight scene). Everywhere else, GPU usage drops below 100% sporadically. Seems like the CPU is having problems feeding the GPU at 1080p.

Overclocking my Radeon VII to 2000 MHz only increased the score by ~400 points. I'm sitting at around 20400.
 
Last edited:

ondma

Platinum Member
Mar 18, 2018
2,720
1,280
136
Thanks for all the replies and help.



Internet speeds and connection quality are fine. I'd prefer to use my network card, but the only compatible slot on this motherboard is blocked by my GPU cooler. Otherwise the onboard seems just as good. Since I often play with no sound, onboard and all other audio sources are disabled.
Performance in NieR and Frostpunk was disappointing. I'm installing some other more hardware-intensive games like Outer Worlds and No Man's Sky to test.

HWINFO64 after 2 hours of FFXIV:
View attachment 22493 View attachment 22494
CPU usage looks quite low (I think, it's hard to see all the numbers), but you are only boosting to 3.7 GHz. This seems like a reasonable clock speed for a heavy load or stress test, but shouldn't you be getting a higher boost than that (at least on a few cores) in a game that is not using the CPU heavily?
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
CPU usage looks quite low (I think, it's hard to see all the numbers), but you are only boosting to 3.7 GHz. This seems like a reasonable clock speed for a heavy load or stress test, but shouldn't you be getting a higher boost than that (at least on a few cores) in a game that is not using the CPU heavily?

Good catch. I put in a static OC on my 3900x to 4.4 GHz and got a score increase to 20086 (from 19740) without even touching GPU settings.

edit: Looks like GPU usage drops in a lot of the benchmark scenes, which would indicate that the CPU isn't doing a great job of feeding the GPU. There may be engine optimization issues in play, I don't know. In any case, the GPU overclock to 2000 MHz only raised the score by ~400 points.

edit edit:


Check that out. There are a ton of results there showing interesting data. The CPU seems to make a big difference. Note the many 8700k results there . . . including one guy with a 9900k + 2080Ti that clocks in lower than I do, even when I'm running defaults! There's a guy with a 9600k that beats me by 500 points, though, and I think the 9900k guy was reporting results at 4K or 1440p.
 
Last edited:
  • Like
Reactions: Tlh97 and viivo

Gideon

Golden Member
Nov 27, 2007
1,622
3,645
136

viivo

Diamond Member
May 4, 2002
3,344
32
91
Thanks so much for the responses and helpful data. It does look like it’s the engine, at least in the case of FFXIV.

CPU usage looks quite low (I think, it's hard to see all the numbers), but you are only boosting to 3.7 GHz. This seems like a reasonable clock speed for a heavy load or stress test, but shouldn't you be getting a higher boost than that (at least on a few cores) in a game that is not using the CPU heavily?

Are you sure you aren't looking at just the Minimum and/or Current columns? The Maximum column shows boosts of 4600, 4500, and 4300, which I assume is normal.