Capped FPS after upgrading to SLI

Actaeon

Diamond Member
Dec 28, 2000
8,657
20
76
EDIT: Going to try to simplify the problem/information now that I've made a little bit of progress. There are still some funny behaviors around frame rates, though.

VSYNC was the culprit of the original issue: it limited FPS to 40 with huge input lag when running in SLI. Running a single card with the exact same settings, it can go up to 60fps. Not sure why VSYNC affects SLI more than a single card; GPU utilization was not maxed out with VSYNC enabled. The TV is 60Hz, so I'm not sure why VSYNC would limit it to 40fps with SLI.

Disabling VSYNC improved the performance, but I still wanted to cap FPS at 60. To get around this, I turned off VSYNC in the Nvidia Control Panel but enabled an FPS limiter in Nvidia Inspector at 60fps. So far so good; it seems limited and doesn't have the input lag I had with VSYNC.

I did some more testing, and I THINK SLI is working, though the results are a little strange, so I am having a hard time telling whether this is expected or not.

I did some testing in FO4, exact same conditions between the two tests. The single card gets as low as 35fps and tops out around 52fps. With SLI it is steady between 44-47fps. Basically, with SLI I get better minimums but lower maximums. I am glad the minimums are better, but I would have expected improvements to the maximums too.

This is on an older P67 motherboard, so I think the cards are running at PCI-E 2.0 x8, but I don't think that will impact performance too much.

Equipment
Motherboard - Asus P67 Sabertooth
CPU - i7 2600K @ 4.5GHz
RAM - Corsair 16GB (4x4GB) @ 1600MHz
GPU - 2x 980 Ti in SLI (stock boost/clocks, not overclocked yet)
 
Might be game related, as multi-GPU profiles can be very different between different driver versions.

People often complain about poor multi-GPU in Witcher 3.

Test it with different driver versions.
 

Actaeon

Thanks. Kind of weird, I thought Witcher 3 was among the games with better scaling. Anyway, I was able to get some quick testing in this morning.

With SLI and vsync enabled I get up to 40fps in both Fallout 4 and Witcher 3.
With SLI and vsync disabled I get up to 48fps in Fallout 4 and up to 60fps in Witcher 3.

On Fallout 4, running just a single card I can get up to 60fps. No other settings changed.

On Witcher 3, scenes that ran in the low 30s with a single card hit the low 40s with SLI, and scenes in the low 40s on a single card were in the mid-to-high 50s. I still need to do some tweaking of the settings, but at least I know SLI is working in this case.

I think there is some weird artificial limiting going on related to VSYNC with SLI enabled. I might try playing with Nvidia Inspector later to see if I can get it working better.
 

lehtv

Elite Member
Dec 8, 2010
11,897
74
91
What CPU?

This is on an older P67 motherboard, so I think they are running @ PCI-E 2.0 8x but I don't think that will impact performance too much.

You could try with a single 980 ti and compare x8 to x16 results.
 

Actaeon

2600K @ 4.5GHz, not the best but still pretty quick, I think. I am also running at 4K resolution with almost everything maxed, so I don't think I am as CPU bound as I would be running at 1080p or something.

I did some more testing, and I THINK SLI is working, though the results are a little strange, so I am having a hard time understanding whether this is expected or not. I turned off vsync via the Nvidia Control Panel and enabled the Frame Rate Limiter in Nvidia Inspector (60fps). The idea is that I do want to limit it to 60fps, but vsync seems to cause more problems with input lag and frame rates. I figure I get the best of both with this setup.

I did some testing in FO4, exact same conditions between the two tests.

Single Card gets as low as 35fps and topped out around 52fps. With SLI it is steady between 44-47fps.

Basically, with SLI I get better minimums but lower maximums. Very strange. I am going to update the OP detailing the problem, since I have made some progress since my testing last night.
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
To get to the bottom of this, you really need more precise data. Can you provide an average using FRAPS (1-min data from in-game content) for both SLI and a single GPU? Also, you should conduct all testing with vsync and frame limiters off. That's the only way you can actually determine what's going on.
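For anyone who wants to script this rather than eyeball the charts, a minimal sketch like the following can pull min/avg/max out of a per-second FPS log. It assumes the log has been saved as a simple one-value-per-line CSV; the real FRAPS column layout may differ, so adjust the parsing to your file.

```python
# Summarize a per-second FPS log (e.g. exported from FRAPS).
# Assumes one FPS reading per line; header lines are skipped.
import csv
import statistics

def summarize_fps(path):
    """Return (min, avg, max) over all FPS samples in the log."""
    fps = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if not row:
                continue
            try:
                fps.append(float(row[0]))
            except ValueError:
                continue  # skip non-numeric header rows
    return min(fps), statistics.mean(fps), max(fps)

# Usage: compare a single-GPU run against an SLI run
# lo, avg, hi = summarize_fps("fo4_single_fps.csv")
```

Run it once per log and you get the three numbers that actually matter for comparing single-card vs. SLI runs.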

I've benched Witcher 3 on 980 Ti SLI at 4K. It works, but scaling is quite bad, as in about 20%. It's not a good game for SLI at 4K. Even without Gameworks, I only hit 40fps.

By the way, on a single 980 Ti running Witcher at 4K, I averaged 35fps, suggesting that your single GPU numbers are actually too high to be maxed settings.

Your motherboard isn't great for this setup, and in fact given the poor scaling of SLI in The Witcher, dropping from PCIe 2.0 x16 on one card to PCIe 2.0 x8 on two cards could end up with very similar performance.
 

Actaeon


Thanks for the help. I went ahead and tried to do a quick benchmark. I did this on my lunch break over a remote desktop session into my computer, which probably added a little bit of load to the PC. I also had very limited control, so it's mostly sitting around or running around a particular environment, not in-depth gameplay.

The GPUs are at stock clocks, no OC besides whatever they boost to. All settings are in the pics below. Basically everything maxed out except for AA and useless settings (Ultra Godrays, Hairworks). I disabled Motion Blur in W3 because I don't like it. All tests were 3 minutes of logging.

Some background for each test...

The Fallout 4 test was run in Diamond City Market. My character is sitting in town where NPCs walk by and hang out, so nothing much is going on. You can see pretty steady frame rates for both until about two minutes in, when the camera switches to a 3rd-person view and rotates around the character. Those are the 'waves' you see on the single card, which are weirdly absent on the SLI setup. The same test/scenes were used in both cases.

I haven't actually played Witcher 3 yet; I've just been loading it up as a benchmark for testing. I am sitting in some camp/town with a few characters nearby, walking around, watching NPCs spar, and looking out from the camp into the mountains, trying to vary the load/rendered scene without anything too crazy. Unlike FO4, the timing of what was rendered is not the same between tests; I was running around and looking at different parts at different times, so even though I tried to look at the same objects, they aren't lined up in the chart.

The Witcher 3 tests seem OK to me; I get a pretty healthy boost in minimum and average FPS using SLI. My guess is that once I OC the cards I'll see another healthy jump in minimums, which should put them well into the 40s, very playable.

The Fallout 4 result is very weird to me. SLI seems more consistent but doesn't want to go above 48fps even under lighter loads, while the single card will go as high as 55fps when there is less load. I am fine with the mid-40s FPS I am getting with SLI, and minimums are more important than peaks, but I am just surprised the peak is lower than with a single GPU.

[Attached: FRAPS benchmark graphs and settings screenshots]
 

wilds

Platinum Member
Oct 26, 2012
2,059
674
136
Turn off godrays in Fallout 4, turn down view distance substantially, and re-bench.

I would monitor your CPU cores during these slowdowns to see if a core is at 100% usage. The other option would be to test a single 980 Ti at x16, then x8, and see if you have any performance regression.
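As a sketch of that per-core monitoring (this version reads /proc/stat, so it only works on Linux; on Windows you'd watch the per-core graphs in Task Manager or a hardware monitor instead):

```python
# Per-core CPU usage on Linux by sampling /proc/stat twice.
# Useful for spotting a single core pinned near 100% during FPS dips.
import time

def read_core_times():
    """Return {core_name: (busy_jiffies, total_jiffies)} from /proc/stat."""
    cores = {}
    with open("/proc/stat") as f:
        for line in f:
            # per-core lines look like "cpu0 ...", the aggregate is "cpu ..."
            if line.startswith("cpu") and line[3].isdigit():
                name, *fields = line.split()
                vals = [int(v) for v in fields]
                idle = vals[3] + vals[4]  # idle + iowait
                total = sum(vals)
                cores[name] = (total - idle, total)
    return cores

def core_usage(interval=1.0):
    """Sample twice, `interval` seconds apart; return % usage per core."""
    a = read_core_times()
    time.sleep(interval)
    b = read_core_times()
    usage = {}
    for name in a:
        busy = b[name][0] - a[name][0]
        total = b[name][1] - a[name][1]
        usage[name] = 100.0 * busy / total if total else 0.0
    return usage

# for name, pct in sorted(core_usage().items()):
#     print(f"{name}: {pct:5.1f}%")
```

If one core sits at ~100% while the others idle, the game is CPU-bound on its main thread, which would explain SLI not scaling.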
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
Looks to me that SLI is working in both games. Don't focus on the max - it's truly irrelevant. The average and minimum are all that matter here.

Note, however, that you are almost certainly capping the GPU usage on your cards running them at PCIe 2.0 x8. I've published some PCIe scaling benches of a 980 Ti in Witcher, which show no effect of dropping to PCIe 3.0 x8 at 4K: http://techbuyersguru.com/taking-4k-gaming-challenge-gtx-980-ti-sli?page=3

My guess, however, is that your PCIe 2.0 x8 slots may be capping the overall throughput when the two cards are working in SLI. That would in turn limit your GPU usage to something below 99%.

SLI itself is not capping your framerates, as this thread title suggested.
 

Actaeon

Thanks all. I'll try to do some more benchmarking later. Maybe give the cards an OC and see what I can get.

Apologies if the thread title was misleading; it was based on the very original problem I had when I first got this going last night. With vsync set to Application Controlled (On), I could get up to 60fps on a single card but only 40fps in SLI, with every setting exactly the same, which is why I thought SLI was 'capping' my framerates. I worked around that issue by disabling VSYNC, but I still think it is weird that VSYNC would impose different limits on SLI vs. a single card, and not in the good direction. I thought it would have been 60fps all the time.

The thing I still find strange is that under lighter loads I seem to get higher frame rates with a single card than with SLI in FO4. In this particular test the load was heavy enough to keep the SLI cards ahead, but in less demanding scenes, such as indoors, I find it strange that the SLI setup is unable to render as quickly as the single card. It was not uncommon for me to hit the 60fps cap in many parts of the game when I ran a single card in FO4.

Maybe that is just the way this particular game is with SLI. The minimums are better, and that is the most important number, so I am glad. I think the average is a wash depending on the scene.
 

Actaeon

Here is another graph, this time from 10 minutes of actual gameplay per test. A lot of the same scenarios in both: walking around outside, going to a quest, talking to a few people, some shooting. The SLI test picks up where I left off after the single-card test. There were a few indoor and dialog scenes, but not many. The spikes are loading screens; SLI had a similar number of loading screens but no spikes, as it seems 'capped' at 50fps. Hopefully this demonstrates a bit more of what I find strange: SLI offers better minimums and a smoother experience, which is good, but it just doesn't want to go over 50fps.

I know it isn't 100% scaling and sometimes there is little to no benefit, but I didn't expect an extra card to be worse than a single card at anything.



I still have more tweaking to do on power settings and clocks, but I thought this was an interesting test. When I was OC'd on a single card I'd be around 60fps a lot, so I expect some gains to be had from OCing in SLI too.

[Attached: Fallout 4 10-minute gameplay FPS graph]
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
Can you provide GPU usage graphs from MSI Afterburner? That would provide more insights into what's going on.

The Fallout 4 graph suggests something isn't quite right. That's a relatively new game that has had some performance issues, so you may want to look into whether it has a fully-operational SLI profile.
 

Actaeon

I used GPU-Z and tried to match its log up to the FPS logs from FRAPS. It is a manual combination of the data, so each line might be a second or two off, but I think it's close enough. FO4 was missing SLI support when it launched, so it's possible the implementation isn't very good.
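That manual line-up step could be scripted along these lines. This sketch assumes both logs have already been reduced to (elapsed_second, value) pairs; the actual GPU-Z and FRAPS file formats differ, so each would need parsing first.

```python
# Join per-second FPS samples with per-second GPU-load samples
# on their shared timestamp, dropping seconds present in only one log.

def align_logs(fps_samples, load_samples):
    """Return [(second, fps, load)] for seconds present in both logs."""
    load_by_sec = dict(load_samples)
    return [
        (sec, fps, load_by_sec[sec])
        for sec, fps in fps_samples
        if sec in load_by_sec
    ]

# Hypothetical samples for illustration
fps = [(0, 46), (1, 47), (2, 45)]
load = [(0, 86), (2, 97), (3, 95)]
print(align_logs(fps, load))  # [(0, 46, 86), (2, 45, 97)]
```

Joining on the timestamp avoids the second-or-two drift you get when lining the rows up by hand.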


[Attached: Fallout 4 FPS + GPU load graph]



EDIT:

Here is the one for Witcher 3. The big drop in FPS and load at the end was the start of a cutscene. Overall this was very playable, with an average of about 50fps. Not sure what that bumpiness was; perhaps the game pausing every few moments with a prompt (I was in the tutorial), or loading textures or something from the HDD.

Removing the cutscenes from the data...

GPU #1 load average = 86%
GPU #2 load average = 97%
FPS average = 50fps

[Attached: Witcher 3 FPS + GPU load graph]



EDIT 2: For fun, I went through the tutorial again with just a single card and compared against the previous SLI test. Very different from Fallout 4. I am thinking FO4 may just not be optimized for SLI.

FPS Average Single = 31FPS
FPS Average SLI = 50FPS
[Attached: Witcher 3 single card vs. SLI FPS graph]
 

Actaeon

Things look to be working now. Nice data collection, by the way!

Thank you, and thanks for the suggestion of a more technical analysis. I think that helped quantify exactly where things stood.

When I initially posted, I got worse performance with SLI than with a single card in two different games, so I was concerned there was an issue with the setup. I didn't realize VSYNC would have such a negative effect on SLI alone that was absent with a single card; I thought VSYNC would treat both equally.

Once I disabled VSYNC and ran the performance tests with Witcher 3, I felt better about the hardware setup and have since turned my attention to FO4 not being well optimized. Doing some further research, some people have found the default SLI profile for FO4 is not very good and have changed it via Nvidia Inspector. I'll do some more testing tonight with the different SLI profiles available to see if I can get FO4 running better. If I find a profile with noticeably improved performance, I'll run a similar test to quantify the gains.

Also, for what it's worth, GPU-Z has a 'Bus Interface Load' metric, and it tops out at around 40%. I assume this is measuring the PCI-E bus.

Thanks again for the help.
 

Termie

Diamond Member
Aug 17, 2005
7,949
48
91
www.techbuyersguru.com
Happy to help.

One way you can get confirmation of SLI scaling is to run 3DMark, which is known to scale extremely well with SLI. That will basically give you a maximum scaling, and then it comes down to game developers and Nvidia to get to that point with each game.

For example, in Fire Strike Extreme, my 980 Ti SLI setup hits 88% scaling (in the Graphics Score): http://www.3dmark.com/compare/fs/5760727/fs/5771851

If you find you're hitting above 80% scaling in 3DMark, you can likely set aside the PCIe issue I raised. It may take you down 3-5%, but it most likely isn't the cause of the FO4 issues you're having, as you've determined based on your testing and research.
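For reference, the scaling figure here is just the percentage gain of the SLI score (or average FPS) over the single-card score; the Witcher 3 tutorial averages posted earlier in the thread make a handy example:

```python
# Percentage gain of an SLI result over a single-card result.

def sli_scaling(single, sli):
    """Return the SLI gain over a single card as a percentage."""
    return (sli / single - 1.0) * 100.0

# e.g. the Witcher 3 tutorial averages from earlier in the thread:
print(round(sli_scaling(31, 50)))  # 61, i.e. ~61% from the second card
```

The same formula applied to 3DMark Graphics Scores gives the scaling percentage quoted above.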