
Review Ryzen 3950X / i9-9900KS "Tuned" Memory Scaling Performance @ TechSpot

UsandThem

Elite Member
https://www.techspot.com/review/1955-ryzen-3950x-vs-core-i9-9900ks-gaming/

Pretty interesting article to discuss, and some decent performance increases for those who are willing to manually tune their memory (roughly a 4%–6% gain) compared to default RAM settings (XMP). It's not the biggest performance gain ever, but it's essentially free performance for those willing to do it.

They also linked to a past article in this review that I hadn't seen before. In it, they tested the same XMP vs. manual timings using a Ryzen 3900X: https://www.techspot.com/review/1891-ryzen-memory-performance-scaling/
 
Thank you for posting this. Too bad Steve Walton did not give the specific parameters for the memory tuning. Clicking through to the previous article on the 3900X, it appears he used the DRAM Calculator for Ryzen tool.
 
Nice article. The latency tuning benefits the Ryzens more, of course.
I would like to see memory clock tuning too, like 4200 MHz or so.
 
I didn't realize the 3950X and 9900KS were so close gaming-wise. Even at 1080p with a 2080 Ti (not a very common matchup, I imagine), TechSpot showed the Intel chip to be 7% faster out of the box and 4% faster with tuned memory over their 18-game test suite.
 
There’s still a GPU bottleneck, they should’ve done 720p low settings benches. The next gen 3080ti Super Duper Ultra Edition will be bottlenecked by the 3950x, so the 9900KS is the only option for real gamers!
 
The settings were honestly pretty terrible. 1080p Ultra with max MSAA is incredibly wasteful for those with high-refresh displays. I see better numbers at 1440p because I generally run max textures, low/no AA, medium shadows, and high lighting/AO/AF.

For reference, I have a 3700X w/1080 Ti and a 9900KS w/2080 Ti, and I never run anything close to all Ultra unless the game is ancient. Framerate is far too important.

Tests like these should ALWAYS show, at minimum:

1080p Minimal/Low (to show the engine/CPU bottleneck; useful for thinking about future GPU upgrade paths)

1440p High (to show general performance for common high-refresh gaming today)

4K High/Ultra (to show whether 60 fps is a go)

A lot of games, such as RDR2, absolutely tank on certain settings due to some pretty bad optimization.
 

I agree for the most part. Those three quality settings would give a broader view of a CPU's capabilities and cover almost all of the gamers out there.

Although, I still contend that gamers running a 2080 Ti at low settings on a 1080p monitor are a niche within a niche. 1440p/4K monitors are much more common when running a $1,200 GPU, if I had to guess. Even if you use the argument that future GPUs will relieve some of the GPU bottleneck at 1440p and thus show higher performance with a faster CPU, I haven't seen any good analysis proving it one way or the other. It sounds plausible, but I'd be interested to see some research.
 

It does make it difficult to theorize about the future. It can be done looking back a bit, though, if someone felt like testing 2017 and 2018 rigs with older flagship GPUs vs. new ones. I no longer have a 980 Ti, but it would be interesting to see how various older platforms handle newer titles with a 2080 Ti. I'm definitely looking forward to 7nm Ampere; it should give a dramatic leap forward compared to the painful RTX generation. Dollar for dollar, it was the worst one in memory. At least the Supers weren't as bad as feared: noticeably improved per tier without a big price hike.
 