[Part 3] Measuring CPU Draw Call Performance in Fallout 4


DrMrLordX

Lifer
Apr 27, 2000
Press Shift+Enter to bring up the UI. This spawns a second mouse for the overlay. On the bottom panel, expand the "Profiler" category to view the number of draw calls.
Hmm okay, thanks. I'll mess with it next time I get the chance.
 

Head1985

Golden Member
Jul 8, 2014
Does anyone know where all the files are to test it again? I tested it a few years ago on a 6700K and I still have the best results :D. Now I have a Ryzen 3700X and I want to test it again, but I don't have the save, ENB, and settings that were used anymore.
 

DrMrLordX

Lifer
Apr 27, 2000
Took another stab at measuring Fallout 4 performance, this time with ENB installed.

Video #1: Using ENB with same settings as last video (1080p medium, max draw distances):

Code:
https://youtu.be/gTqV6v1xeHs
Seems like installing ENB lowered my framerates? I was able to find a slower spot a few pixels away from the one in my last vid.

Video #2: Using ENB + Head1985's benchmark settings:

Code:
https://www.youtube.com/watch?v=vXH6zynVSxg&feature=youtu.be
Unsure if using godrays on an AMD card was a good idea. Anyway, it managed to be a bit slower than 1080p Medium + max draw distances. I did get maximums above his 6700k though.

Overall, I was disappointed that I had to install something that lowers framerates just to test for draw calls. Maybe I'll uninstall ENB, go back to the same spot on the top of the Corvega factory and see if I can still get framerates below 60 fps.

edit: after uninstalling ENB and comparing my in-game position to that of the Head1985 settings video (still using his settings at 1080p), I was able to get fps down to 57 on top of Corvega examining the same spot. That compares to 51.8 or so using ENB. Maximums without ENB (VSync off) were up in the 190s. Is there a way to measure draw calls that doesn't involve using ENB?
 

MajinCry

Platinum Member
Jul 28, 2015
Took another stab at measuring Fallout 4 performance, this time with ENB installed.

Video #1: Using ENB with same settings as last video (1080p medium, max draw distances):

Code:
https://youtu.be/gTqV6v1xeHs
Seems like installing ENB lowered my framerates? I was able to find a slower spot a few pixels away from the one in my last vid.

Video #2: Using ENB + Head1985's benchmark settings:

Code:
https://www.youtube.com/watch?v=vXH6zynVSxg&feature=youtu.be
Unsure if using godrays on an AMD card was a good idea. Anyway, it managed to be a bit slower than 1080p Medium + max draw distances. I did get maximums above his 6700k though.

Overall, I was disappointed that I had to install something that lowers framerates just to test for draw calls. Maybe I'll uninstall ENB, go back to the same spot on the top of the Corvega factory and see if I can still get framerates below 60 fps.

edit: after uninstalling ENB and comparing my in-game position to that of the Head1985 settings video (still using his settings at 1080p), I was able to get fps down to 57 on top of Corvega examining the same spot. That compares to 51.8 or so using ENB. Maximums without ENB (VSync off) were up in the 190s. Is there a way to measure draw calls that doesn't involve using ENB?
You made sure to have all the settings in ENB disabled, right? Using godrays is very dumb, for two reasons. The first is that it's entirely dependent on the GPU and has no CPU impact. The second is that Bethesda used a nonsensical way of implementing them; they use tessellation, where any sane developer would just use a compute shader.

Maximum FPS is irrelevant. Anyone can get super high framerates by just staring at the ground and letting the frustum culling take care of everything. Minimum fps is what is important, as that shows the performance of the hardware in complex scenes.

If there's any other utility that can measure draw calls, I've not been able to find one.
 

DrMrLordX

Lifer
Apr 27, 2000
You made sure to have all the settings in ENB disabled, right?
I just went with the default configuration. If there's something I need to disable, please let me know so I can try it again.

Using godrays is very dumb, for two reasons. The first is that it's entirely dependent on the GPU and has no CPU impact. The second is that Bethesda used a nonsensical way of implementing them; they use tessellation, where any sane developer would just use a compute shader.
That was my thinking. Head1985 had it in his testing suite though, so when I benched against his settings, I used it. His settings weren't that great when I was running ENB, but without it, my framerates were still quite good.

Maximum FPS is irrelevant.
It is, but it isn't. When my max fps goes from 150-ish to the 190s, then something has changed when viewing the same spot. It also affects average fps for the run (for better or worse).
 

MajinCry

Platinum Member
Jul 28, 2015
I just went with the default configuration. If there's something I need to disable, please let me know so I can try it again.

That was my thinking. Head1985 had it in his testing suite though, so when I benched against his settings, I used it. His settings weren't that great when I was running ENB, but without it, my framerates were still quite good.

It is, but it isn't. When my max fps goes from 150-ish to the 190s, then something has changed when viewing the same spot. It also affects average fps for the run (for better or worse).
Here are the ENB ini files used in the benchmark I had the users run: https://mega.nz/#!3kUHTY7K!GnxkhGoqX1WjnqxUsVHMbPhfmO5zFmB5Ys8dcMva048

It's got everything disabled, so it should have no impact. When I ran ENB on my Phenom II x4 965, which was extremely limited in performance, I had no difference in fps. Even gained two frames during Fallout 4's earlier releases.

Edit: Actually, if you want to benchmark, I collected the config files, instructions and all, on this thread which I used to collect the results first, before making this thread: https://forums.anandtech.com/threads/draw-call-performance-in-fallout-4.2501467/
 

DrMrLordX

Lifer
Apr 27, 2000
Okay I got your config working.

OS: Win10 18963.rs_prerelease
CPU: R9 3900x @ 4.4 GHz
RAM: DDR4-3600 14-16-14-28 1T
GPU: AMD Radeon VII @ 2050 MHz/1200 MHz
GPU Driver: 19.8.1

First Save (Corvega)
Draw Calls: ~11690
FPS: ~57.8

Second Save (Diamond City)
Draw Calls: ~8010
FPS: ~66.0
 

MajinCry

Platinum Member
Jul 28, 2015
Okay I got your config working.

OS: Win10 18963.rs_prerelease
CPU: R9 3900x @ 4.4 GHz
RAM: DDR4-3600 14-16-14-28 1T
GPU: AMD Radeon VII @ 2050 MHz/1200 MHz
GPU Driver: 19.8.1

First Save (Corvega)
Draw Calls: ~11690
FPS: ~57.8

Second Save (Diamond City)
Draw Calls: ~8010
FPS: ~66.0
That's a good ~12fps over Zen+. It's still got a wee bit of a distance from xLake, but it's nowhere near the 50% difference as with earlier Ryzen CPUs. That's pretty good.
 

DrMrLordX

Lifer
Apr 27, 2000
That's a good ~12fps over Zen+. It's still got a wee bit of a distance from xLake, but it's nowhere near the 50% difference as with earlier Ryzen CPUs. That's pretty good.
The thing that's striking to me is that with "normal" settings, playing the vanilla game (no ENB effects), I find it difficult to bring framerates quite that low except in a few weird places. Seems like Bethsoft should have been able to rejigger Gamebryo/Creation Engine to address the issue. But of course Fallout 4 is an old game by now so, why would they? Not gonna waste my time with Fallout76 to see if they've done any better, either.
 

MajinCry

Platinum Member
Jul 28, 2015
The thing that's striking to me is that with "normal" settings, playing the vanilla game (no ENB effects), I find it difficult to bring framerates quite that low except in a few weird places. Seems like Bethsoft should have been able to rejigger Gamebryo/Creation Engine to address the issue. But of course Fallout 4 is an old game by now so, why would they? Not gonna waste my time with Fallout76 to see if they've done any better, either.
If you like the settlement system, it's real easy to get way higher draw calls than that. When I was working on Sanctuary Hills, the draw calls climbed to 17,000 on my i7 6700k and I was dipping down from 30fps into the high twenties. Which is a stupidly high number of draw calls, and I wasn't even halfway done with the settlement.

The solution to the draw call performance is to make use of Direct3D 12 or Vulkan, as this is precisely what they are designed to do; issue large amounts of draw calls very quickly, and scale almost linearly in performance with additional cores dedicated to the driver. Unfortunately, that will never be backported. New Vegas, the best game they've published, and one of the greatest games ever, is going to forever be stuck in Direct3D 9.
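As a rough sanity check on the numbers in this post (illustrative arithmetic only, not a profiler measurement; real submission cost varies by driver and scene), 17,000 draw calls at ~30 fps implies a per-call CPU budget on the order of a couple of microseconds:

```python
# Rough per-draw-call CPU budget implied by the figures quoted above
# (17,000 draw calls at ~30 fps in a built-up settlement).
draw_calls = 17_000
fps = 30

frame_time_ms = 1000 / fps                        # ~33.3 ms per frame
per_call_us = frame_time_ms * 1000 / draw_calls   # microseconds per draw call

print(f"{frame_time_ms:.1f} ms/frame, {per_call_us:.2f} us per draw call")
```

That tiny budget is exactly what D3D12/Vulkan target: spreading command recording across threads raises the number of calls a frame can afford, which is the near-linear scaling described above.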
 

DrMrLordX

Lifer
Apr 27, 2000
If you like the settlement system, it's real easy to get way higher draw calls than that. When I was working on Sanctuary Hills, the draw calls climbed to 17,000 on my i7 6700k and I was dipping down from 30fps into the high twenties. Which is a stupidly high number of draw calls, and I wasn't even halfway done with the settlement.

The solution to the draw call performance is to make use of Direct3D 12 or Vulkan, as this is precisely what they are designed to do; issue large amounts of draw calls very quickly, and scale almost linearly in performance with additional cores dedicated to the driver. Unfortunately, that will never be backported. New Vegas, the best game they've published, and one of the greatest games ever, is going to forever be stuck in Direct3D 9.
Yeah, the settlements. Ugh I didn't much care for that to be honest. But I see your point there.

New Vegas probably will be stuck since getting Obsidian to do a remaster would be . . . difficult.
 

Head1985

Golden Member
Jul 8, 2014
Here are the ENB ini files used in the benchmark I had the users run: https://mega.nz/#!3kUHTY7K!GnxkhGoqX1WjnqxUsVHMbPhfmO5zFmB5Ys8dcMva048

It's got everything disabled, so it should have no impact. When I ran ENB on my Phenom II x4 965, which was extremely limited in performance, I had no difference in fps. Even gained two frames during Fallout 4's earlier releases.

Edit: Actually, if you want to benchmark, I collected the config files, instructions and all, on this thread which I used to collect the results first, before making this thread: https://forums.anandtech.com/threads/draw-call-performance-in-fallout-4.2501467/
Good, I will test it with my 3700X and 1080 Ti. I still have the best results so far with my 6700K. Nobody was able to beat me :)
 

Head1985

Golden Member
Jul 8, 2014
OK, here are my results with the 3700X.
System:
3700X
2x16 GB dual-rank RAM, 3600 MHz CL16-17-19-36
1080 Ti
(screenshots attached: 2019-08-20 (3).png, 2019-08-20 (4).png)

With the 6700K at 4.5 GHz I have this:
First Save (Corvega)
Draw Calls: 11703
FPS: 71.8fps

Second Save (Diamond City)
Draw Calls: 8004
FPS: 85.5fps

With the 3700X I have:
First Save (Corvega)
Draw Calls: 11776
FPS: 66.1fps

Second Save (Diamond City)
Draw Calls: 8029
FPS: 78.8fps

So the 6700K is only 8.5% faster. Not bad. It also runs at 4500 MHz vs. circa 4250 MHz for the 3700X.
https://forums.anandtech.com/threads/draw-call-performance-in-fallout-4.2501467/#post-38813107
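For anyone double-checking the quoted figure, the deltas work out as stated (a quick sketch using only the FPS numbers from the post above):

```python
# Verifying the "~8.5% faster" claim from the posted results.
results = {
    "Corvega":      {"6700K": 71.8, "3700X": 66.1},
    "Diamond City": {"6700K": 85.5, "3700X": 78.8},
}

for scene, fps in results.items():
    delta = (fps["6700K"] / fps["3700X"] - 1) * 100
    print(f"{scene}: 6700K ahead by {delta:.1f}%")
```

Note that the 6700K's clock advantage here (4500 vs. ~4250 MHz) is itself about 5.9%, so most of the gap is clock speed.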
 

MajinCry

Platinum Member
Jul 28, 2015
OK, here are my results with the 3700X.
System:
3700X
2x16 GB dual-rank RAM, 3600 MHz CL16-17-19-36
1080 Ti


With the 6700K at 4.5 GHz I have this:
First Save (Corvega)
Draw Calls: 11703
FPS: 71.8fps

Second Save (Diamond City)
Draw Calls: 8004
FPS: 85.5fps

With the 3700X I have:
First Save (Corvega)
Draw Calls: 11776
FPS: 66.1fps

Second Save (Diamond City)
Draw Calls: 8029
FPS: 78.8fps

So the 6700K is only 8.5% faster. Not bad. It also runs at 4500 MHz vs. circa 4250 MHz for the 3700X.
https://forums.anandtech.com/threads/draw-call-performance-in-fallout-4.2501467/#post-38813107
That's really good. The 6700K is only 8.5% faster, and those results are from before the fixes for the various exploits found since then were released. The i5 8600K has 5% higher clocks and is only 8% faster. Safe to say that, at this point, they're almost at parity.

Zen 2 fixed much of the performance disparity. I'll make a new thread that shows the difference in draw calls between xLake, Zen, Zen+, and Zen 2 when I can be bothered.
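To make the "almost at parity" point concrete (using only the approximate percentages quoted above):

```python
# Rough clock normalization: if the Intel part is ~8% faster overall
# while clocked ~5% higher, the remaining per-clock gap is small.
# Figures are the ones quoted in the post, not new measurements.
overall_speedup = 1.08   # Intel ~8% faster in these runs
clock_advantage = 1.05   # Intel ~5% higher clocks

per_clock_gap = (overall_speedup / clock_advantage - 1) * 100
print(f"per-clock gap: ~{per_clock_gap:.1f}%")
```

Under those assumptions the per-clock difference is only about 3%.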
 

DrMrLordX

Lifer
Apr 27, 2000
Huh, just running an NV dGPU got you a lot of extra performance there. Goofy as heck.
 

MajinCry

Platinum Member
Jul 28, 2015
Huh, just running an NV dGPU got you a lot of extra performance there. Goofy as heck.
It's not a consistent gain either; some configurations had little to no difference when paired with an Nvidia GPU over an AMD GPU, despite being in a draw-call-limited scenario. I wonder what's at play there.
 

besset

Junior Member
Oct 26, 2019
Anyone got a 9900k or 9700k that can test it against the ryzen 3000 series? Planning on getting a ryzen 3600 but the fo4 benchmarks got me a bit worried
 

DrMrLordX

Lifer
Apr 27, 2000
@mikk
Fallout 4 isn't exactly an MT beast. Having four more cores doesn't necessarily help the 3700X at all.
 

gamervivek

Senior member
Jan 17, 2011
Had this on my mind before moving to 9900KF, too bad I didn't find this thread and test it with 3600.

OS: Win10 Home
CPU: Intel 9900KF stock
RAM: 2x16GB DDR4 3600MHz 16-16-16-36
GPU: Gtx 1080Ti aorus waterforce
GPU Driver:441.20

First Save (Corvega)
Draw Calls:11700
FPS:74

Second Save (Diamond City)
Draw Calls: 8000
FPS: 91

Will check with Vega56 tomorrow.

Anyone got a 9900k or 9700k that can test it against the ryzen 3000 series? Planning on getting a ryzen 3600 but the fo4 benchmarks got me a bit worried
Moving from the 3600 to the 9900KF was a huge difference in this game in CPU-intensive places, like Swan's Pond or Faneuil Hall. My fps went from 35-40 to a solid 60. And the 3600 was a big improvement over the 1600 before that.
One caveat: I didn't reinstall Windows when going from the 1600 to the 3600. I decided to do so with the 9900KF, a week after installation, and it seemed to have improved the fps in this game.
 

USER8000

Golden Member
Jun 23, 2012
Had this on my mind before moving to 9900KF, too bad I didn't find this thread and test it with 3600.

OS: Win10 Home
CPU: Intel 9900KF stock
RAM: 2x16GB DDR4 3600MHz 16-16-16-36
GPU: Gtx 1080Ti aorus waterforce
GPU Driver:441.20

First Save (Corvega)
Draw Calls:11700
FPS:74

Second Save (Diamond City)
Draw Calls: 8000
FPS: 91

Will check with Vega56 tomorrow.



Moving from the 3600 to the 9900KF was a huge difference in this game in CPU-intensive places, like Swan's Pond or Faneuil Hall. My fps went from 35-40 to a solid 60. And the 3600 was a big improvement over the 1600 before that.
One caveat: I didn't reinstall Windows when going from the 1600 to the 3600. I decided to do so with the 9900KF, a week after installation, and it seemed to have improved the fps in this game.
The difference shouldn't be that high, as people have tested the Ryzen 7 3700X with a GTX 1080 Ti, and it should not be much faster than the Ryzen 5 3600, as Fallout 4 won't use more than 4 to 6 threads properly. Also look at your Core i9 9900KF scores against a 4.5 GHz Core i7 6700K; most of the improvement is down to your CPU running closer to 5 GHz.

With the 6700K at 4.5 GHz I have this:
First Save (Corvega)
Draw Calls: 11703
FPS: 71.8fps

Second Save (Diamond City)
Draw Calls: 8004
FPS: 85.5fps

With the 3700X I have:
First Save (Corvega)
Draw Calls: 11776
FPS: 66.1fps

Second Save (Diamond City)
Draw Calls: 8029
FPS: 78.8fps
That is with a Ryzen 7 3700X at roughly 4.25 GHz, and a Ryzen 5 3600 should be running at between 4.0 and 4.2 GHz or thereabouts. Someone with a Ryzen 9 3900X was getting between 60 and 74 FPS with a GTX 1080 Ti in the Corvega plant, depending on the memory configuration they used, here.

So realistically there shouldn't be more than a 10% to 20% difference at most, looking at the earlier results with Ryzen 3000. But what you are saying is more like a 50% to 60% increase.

The thing is, the FPS difference seems rather large compared to tests which have been done:


With an RTX 2080, a stock Core i5 9600K against a stock Ryzen 5 2600X shows around a 20% difference running around Diamond City.

Here is a Ryzen 7 2700X against a Core i9 9900K in the same scene with the same GPU:


Around a 20% to 25% difference. Here is a test around Swan's Pond done with a GTX 1080:
https://www.youtube.com/watch?v=Mr2B0RJd7Nc

The Ryzen 7 2700X is at 4.2 GHz and the Core i7 8700K at 4.4 GHz, and the Core i7 8700K is at best 10% higher.

I found the area around Diamond City to be a bit more of a performance hog than, say, immediately around Swan's Pond. Even with my Ryzen 5 2600 and a GTX 1080, I certainly didn't see such low FPS around Swan's Pond at 1440p, so something does not seem right TBH. Built-up settlements, yes, there I did see more like that kind of FPS.
 

Markfw

CPU Moderator, VC&G Moderator, Elite Member
Super Moderator
May 16, 2002
The difference shouldn't be that high, as people have tested the Ryzen 7 3700X with a GTX 1080 Ti, and it should not be much faster than the Ryzen 5 3600, as Fallout 4 won't use more than 4 to 6 threads properly. Also look at your Core i9 9900KF scores against a 4.5 GHz Core i7 6700K; most of the improvement is down to your CPU running closer to 5 GHz.



That is with a Ryzen 7 3700X at roughly 4.25 GHz, and a Ryzen 5 3600 should be running at between 4.0 and 4.2 GHz or thereabouts. Someone with a Ryzen 9 3900X was getting between 60 and 74 FPS with a GTX 1080 Ti in the Corvega plant, depending on the memory configuration they used, here.

So realistically there shouldn't be more than a 10% to 20% difference at most, looking at the earlier results with Ryzen 3000. But what you are saying is more like a 50% to 60% increase.

The thing is, the FPS difference seems rather large compared to tests which have been done:


With an RTX 2080, a stock Core i5 9600K against a stock Ryzen 5 2600X shows around a 20% difference running around Diamond City.

Here is a Ryzen 7 2700X against a Core i9 9900K in the same scene with the same GPU:


Around a 20% to 25% difference. Here is a test around Swan's Pond done with a GTX 1080:
https://www.youtube.com/watch?v=Mr2B0RJd7Nc

The Ryzen 7 2700X is at 4.2 GHz and the Core i7 8700K at 4.4 GHz, and the Core i7 8700K is at best 10% higher.

I found the area around Diamond City to be a bit more of a performance hog than, say, immediately around Swan's Pond. Even with my Ryzen 5 2600 and a GTX 1080, I certainly didn't see such low FPS around Swan's Pond at 1440p, so something does not seem right TBH. Built-up settlements, yes, there I did see more like that kind of FPS.
Why are you comparing against a year-old chip? Why not the 3700X or 3800X? The 2700X is old news and not nearly as good as the 3700X.
 

USER8000

Golden Member
Jun 23, 2012
Why are you comparing against a year-old chip? Why not the 3700X or 3800X? The 2700X is old news and not nearly as good as the 3700X.
He said the performance delta in those areas between a Ryzen 5 3600 and a Core i9 9900KF was between 50% and 60%; the problem is that even comparing a Ryzen 5 2600X and Ryzen 7 2700X with a Core i5 9600K, Core i7 8700K, or Core i9 9900K, the delta is nowhere near as big. Since Fallout 4 uses 4 threads intensively and slightly uses another two threads, and all the Intel CPUs use the same core design, with any Intel CPU of recent years the performance difference is largely down to clock speed and RAM speed.

Also, due to the age of the game, no one really tests Fallout 4, which is kind of annoying for me as I have 2000+ hours in the game, and is why I was asking Ryzen 3000 owners to please test stuff.

Plus, his results don't seem to tally with the Ryzen 7 3700X and Ryzen 9 3900X results and his own test results. The Ryzen 5 3600 boosts up to 4.2 GHz, so at worst, with the same RAM, it will be 10% slower than, say, a Ryzen 7 3800X or Ryzen 9 3900X. Looking at the results posted here:
1.) A tweaked Ryzen 9 3900X and a Ryzen 7 3700X at 4.25 GHz produce 66-68 FPS in the Corvega test. Their Core i9 9900KF at stock is at 74 FPS. So between 9% and 12% better, and it's mostly down to the clock speed IMHO, which another poster stated when they compared the Ryzen 7 3700X to a 4.5 GHz Core i7 6700K.
2.) The Ryzen 7 3700X at 4.25 GHz produces 79 FPS and the Core i9 9900KF 91 FPS, which is 15% or around that.

A Ryzen 5 3600 runs at between 4.0 and 4.2 GHz even under a Blender stress test. That is at worst a 5% clock difference between a Ryzen 5 3600 and the Ryzen 7 3700X tested here, or nearer to 10% if they boost perfectly. So let's say a 20% delta for the Ryzen 5 3600 at most; so how is the delta 50% to 60% between a Ryzen 5 3600 and a Core i9 9900KF?
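To put numbers on the mismatch being described here (all figures are ones quoted earlier in this thread; this is arithmetic, not new data):

```python
# Measured 9900KF-vs-3700X deltas from the two benchmark saves,
# compared against the anecdotal 3600 -> 9900KF jump
# (35-40 fps up to a locked 60 fps).
corvega = (74 / 66.1 - 1) * 100        # measured delta, Corvega save
diamond = (91 / 78.8 - 1) * 100        # measured delta, Diamond City save
anecdote_low = (60 / 40 - 1) * 100     # 40 fps -> 60 fps
anecdote_high = (60 / 35 - 1) * 100    # 35 fps -> 60 fps

print(f"measured: {corvega:.0f}% and {diamond:.0f}%")
print(f"anecdote: {anecdote_low:.0f}% to {anecdote_high:.0f}%")
```

The measured saves show 12-15% deltas, while the anecdotal jump implies 50-71%, which is exactly the discrepancy in question.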

The test here and their own results don't tally.

Or to frame it another way: how are the tests here indicating at worst a 15% delta between their stock Core i9 9900KF and a Ryzen 7 3700X or Ryzen 9 3900X, yet suddenly that becomes 50% to 60% with a Ryzen 5 3600 and the same Intel CPU, in a game which won't give two hoots about more than 6 threads?

The reason why I included the Ryzen 2000 results is that there are comparisons with the Core i7 8700K and Core i9 9900K, and the difference is not 50% to 60% in Boston with a GTX 1080 or RTX 2080. So how does that work with a Ryzen 5 3600, which should be faster than any of the earlier Ryzen CPUs? Being able to run faster RAM alone gives Ryzen 3000 a huge improvement over Ryzen 2000; Fallout 4 is one of the games which loves fast RAM.

Edit:

I did find one video testing a Ryzen 5 3600 and a GTX 1080 Ti with Fallout 4:


The RAM is running under specification at 3000 MHz, which will definitely reduce Fallout 4 performance (the maximum officially rated speed is 3200 MHz).

So in one or two instances it does drop down to roughly 40 FPS briefly in the city, but the problem is that it's modded. One of the mods used is WOTC, a mod which massively increases the number of spawns in the world and is one of the most performance-hogging Fallout 4 mods out there (just read the comments about how much FPS is reduced; even the mod author states that it is intensive), as extra spawns mean more NPCs fighting each other in the world too. True Storms further increases draw calls as it generates more intense weather; Pile of Corpses keeps dead bodies in the map much longer, which further increases CPU load; and a few other graphical updates do not help with the increased draw calls.

So at this point I don't know how an unmodded playthrough (which this test requires) would get worse performance, especially with WOTC not installed.
 

gamervivek

Senior member
Jan 17, 2011
OS: Win10 Home
CPU: Intel 9900KF stock
RAM: 2x16GB DDR4 3600MHz 16-16-16-36
GPU: Gtx 1080Ti aorus waterforce
GPU Driver:441.20

First Save (Corvega)
Draw Calls:11700
FPS:74

Second Save (Diamond City)
Draw Calls: 8000
FPS: 91

Will check with Vega56 tomorrow.
OS: Win10 Home
CPU: Intel 9900KF stock
RAM: 2x16GB DDR4 3600MHz 16-16-16-36
GPU: RX Vega56
GPU Driver:19.11.1

First Save (Corvega)
Draw Calls:11700
FPS:66

Second Save (Diamond City)
Draw Calls: 8000
FPS: 74

The difference shouldn't be that high, as people have tested the Ryzen 7 3700X with a GTX 1080 Ti, and it should not be much faster than the Ryzen 5 3600, as Fallout 4 won't use more than 4 to 6 threads properly. Also look at your Core i9 9900KF scores against a 4.5 GHz Core i7 6700K; most of the improvement is down to your CPU running closer to 5 GHz.



That is with a Ryzen 7 3700X at roughly 4.25 GHz, and a Ryzen 5 3600 should be running at between 4.0 and 4.2 GHz or thereabouts. Someone with a Ryzen 9 3900X was getting between 60 and 74 FPS with a GTX 1080 Ti in the Corvega plant, depending on the memory configuration they used, here.

So realistically there shouldn't be more than a 10% to 20% difference at most, looking at the earlier results with Ryzen 3000. But what you are saying is more like a 50% to 60% increase.

The thing is, the FPS difference seems rather large compared to tests which have been done:


With an RTX 2080, a stock Core i5 9600K against a stock Ryzen 5 2600X shows around a 20% difference running around Diamond City.

Here is a Ryzen 7 2700X against a Core i9 9900K in the same scene with the same GPU:


Around a 20% to 25% difference. Here is a test around Swan's Pond done with a GTX 1080:
https://www.youtube.com/watch?v=Mr2B0RJd7Nc

The Ryzen 7 2700X is at 4.2 GHz and the Core i7 8700K at 4.4 GHz, and the Core i7 8700K is at best 10% higher.

I found the area around Diamond City to be a bit more of a performance hog than, say, immediately around Swan's Pond. Even with my Ryzen 5 2600 and a GTX 1080, I certainly didn't see such low FPS around Swan's Pond at 1440p, so something does not seem right TBH. Built-up settlements, yes, there I did see more like that kind of FPS.
It isn't that big of a difference across the board, just in the specific places I mentioned. In GTA V I saw more modest gains, 10-20%, with performance even slightly worse in one spot. And that's with a multi-monitor setup, which increases the CPU load further. Fallout 4 I just tested at 1080p.

As for the Swan's Pond video which you linked, his settings are even lower than the default High preset, while I benched on the Ultra preset, with depth of field changed to Standard.
The shadow distance setting is a huge performance hit, and I wasn't expecting the 9900KF to be so solid with it: 60 fps (RivaTuner locked) while looking down from Corvega (where you get the Repair bobblehead). I double-checked that I had set everything up correctly. It dropped into the high 50s at the top of Trinity Tower.

Similarly in GTA IV, when you leave the first safe house and turn left, fps would drop to the 40s on the 3600, but it's a solid locked 60 fps on the 9900KF. With newer games, such scenarios should be far fewer, because development would account for Ryzen's strengths and weaknesses relative to Intel, since Ryzen owners are going to be a huge portion of the gaming population.
 

USER8000

Golden Member
Jun 23, 2012
OS: Win10 Home
CPU: Intel 9900KF stock
RAM: 2x16GB DDR4 3600MHz 16-16-16-36
GPU: RX Vega56
GPU Driver:19.11.1

First Save (Corvega)
Draw Calls:11700
FPS:66

Second Save (Diamond City)
Draw Calls: 8000
FPS: 74



It isn't that big of a difference across the board, just in the specific places I mentioned. In GTA V I saw more modest gains, 10-20%, with performance even slightly worse in one spot. And that's with a multi-monitor setup, which increases the CPU load further. Fallout 4 I just tested at 1080p.

As for the Swan's Pond video which you linked, his settings are even lower than the default High preset, while I benched on the Ultra preset, with depth of field changed to Standard.
The shadow distance setting is a huge performance hit, and I wasn't expecting the 9900KF to be so solid with it: 60 fps (RivaTuner locked) while looking down from Corvega (where you get the Repair bobblehead). I double-checked that I had set everything up correctly. It dropped into the high 50s at the top of Trinity Tower.

Similarly in GTA IV, when you leave the first safe house and turn left, fps would drop to the 40s on the 3600, but it's a solid locked 60 fps on the 9900KF. With newer games, such scenarios should be far fewer, because development would account for Ryzen's strengths and weaknesses relative to Intel, since Ryzen owners are going to be a huge portion of the gaming population.
Well, in the Ryzen 5 3600 video I posted there were parts on top of buildings looking over Boston (it was a modded save too), and the FPS didn't seem to dip as low as what you have seen. Here is another video from the same person at Ultra (Medium shadows) using a Ryzen 5 3600 and a GTX 1080 Ti in the built-up part of central Boston:


So that covers Swan's Pond and the Faneuil Hall area, and they have mods installed. Then there are some videos of modded playthroughs using ENBs, higher-resolution textures, enhanced shadows, etc. which show dips to the FPS you are talking about, but from what you are saying it's just a normal install? Is it with the 4K pack installed or not?

I am only asking since the videos I am watching online which do show those sub-40 FPS dips tend to all be with mods, like this one:

A Ryzen 5 3600 and a GTX 1080 Ti, but with an ENB, 6K shadow resolution, and 7200 shadow distance. ENBs alone can destroy performance; from my testing you can easily see over 20,000 draw calls if you have an ENB, better textures, improved shadows, etc.
 
