Question for Nvidia users about the Nvidia display driver

futurefields

Diamond Member
Jun 2, 2012
6,471
32
91
Does the Nvidia display driver have a built-in frame rate limiter that works in conjunction with triple-buffered vsync?

So, for example, could I set Crysis 3 to run locked at 50 fps (or any other number) with vsync?
 

Leyawiin

Diamond Member
Nov 11, 2008
3,204
52
91
No, you'll need to use something like Nvidia Inspector or RivaTuner Statistics Server (it comes with MSI Afterburner).
 

Flapdrol1337

Golden Member
May 21, 2014
1,677
93
91
Crysis has a built-in fps limiter; you can probably set it with a user.cfg file.

Having the game itself limit the fps always works better.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
You'll be able to use Triple buffering and Vsync with Crysis 3 and most games through the CPL; it will limit your fps to your monitor's refresh rate.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
Why would you want to do that? It's only going to give you awful microstutter.

Nvidia Inspector should do what you want, though.
 
Last edited:

futurefields

Diamond Member
Jun 2, 2012
6,471
32
91
Why would you want to do that?

To avoid frame rate fluctuations I lock my games at a minimum frame rate determined by my quality settings.

For some reason a lot of people only run games at 30 or 60. Nothing in between.

But me?

I can happily run a game locked at 36, 40, 45, 50 fps, in order to put the image quality settings where I want and have absolutely zero framerate fluctuations, which are the greatest immersion breakers in a game in my experience.

I often find 30fps is just a little bit too slow, but adding an extra 6fps to get up to 36fps really improves controller response and gives just a little bit more fluidity over 30fps.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
To avoid frame rate fluctuations I lock my games at a minimum frame rate determined by my quality settings.

For some reason a lot of people only run games at 30 or 60. Nothing in between.

But me?

I can happily run a game locked at 36, 40, 45, 50 fps, in order to put the image quality settings where I want and have absolutely zero framerate fluctuations, which are the greatest immersion breakers in a game in my experience.

I often find 30fps is just a little bit too slow, but adding an extra 6fps to get up to 36fps really improves controller response and gives just a little bit more fluidity over 30fps.

Triple buffering will just give you bad microstutter at anything other than 30 or 60fps; that's the entire point of Gsync. At any other framerate you're forced to choose between screen tearing and bad microstutter.
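
Rough illustration of what I mean (just a back-of-the-envelope Python sketch of vsync timing, assuming a 60Hz panel and perfectly even frame delivery at the cap; it's not anything from an actual driver):

```python
from fractions import Fraction
import math

REFRESH_HZ = 60
CAP_FPS = 36     # try 30 or 60 here and the stutter disappears

refresh = Fraction(1, REFRESH_HZ)
frame_interval = Fraction(1, CAP_FPS)

# Times at which each frame finishes rendering (perfectly even pacing
# from the cap -- an idealised assumption).
finish = [i * frame_interval for i in range(20)]

# With vsync, a finished frame can only be flipped onto the screen at the
# next refresh tick, so round each finish time up to a refresh boundary.
flip = [math.ceil(t / refresh) * refresh for t in finish]

# How long each frame actually stays on screen = gap between flips.
on_screen_ms = [float((b - a) * 1000) for a, b in zip(flip, flip[1:])]
print([round(ms, 1) for ms in on_screen_ms])
# 36 fps on a 60Hz panel -> a repeating 33.3, 33.3, 16.7 pattern;
# at 30 or 60 fps every frame gets the same on-screen time.
```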
 
Last edited:

futurefields

Diamond Member
Jun 2, 2012
6,471
32
91
In my experience (at least doing this with RadeonPro and my 7950) there is no problem with microstutter.

I tested a lot, for example GTA 4 @ 30 fps and 36 fps, specifically looking for microstutter or any kind of choppiness, and I always went with 36 fps as the favorable option. I played all of Far Cry 3, Crysis 3, and GTA 4 @ 36fps locked, and I played Crysis 2, Saints Row The Third, and Saints Row 4 @ 48fps locked.

On some games it looks bad. For example Skyrim or Fallout 3. You definitely want those at 60 all the time. But certain engines, you can get away with running these unusual framerates.

Especially if a game has a good motion blur implementation like Crysis 3. That game feels smooth at anything over 30fps.

I am also used to being a console gamer so maybe I can just tolerate this stuff a little better. Unlocked and wildly varying framerates were common on 360 games. Having any kind of a locked framerate feels like an improvement over that.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
In my experience (at least doing this with RadeonPro and my 7950) there is no problem with microstutter.

I tested a lot, for example GTA 4 @ 30 fps and 36 fps, specifically looking for microstutter or any kind of choppiness, and I always went with 36 fps as the favorable option. I played all of Far Cry 3, Crysis 3, and GTA 4 @ 36fps locked, and I played Crysis 2, Saints Row The Third, and Saints Row 4 @ 48fps locked.

On some games it looks bad. For example Skyrim or Fallout 3. You definitely want those at 60 all the time. But certain engines, you can get away with running these unusual framerates.

Especially if a game has a good motion blur implementation like Crysis 3. That game feels smooth at anything over 30fps.

I am also used to being a console gamer so maybe I can just tolerate this stuff a little better. Unlocked and wildly varying framerates were common on 360 games. Having any kind of a locked framerate feels like an improvement over that.

There is definitely microstutter, it's physically impossible for there not to be. It might not be very noticeable at 36fps since the majority of the frames are repeated twice. Skyrim and Fallout are just using a poorly made game engine; there is stuttering even when the game is locked at 60fps.

Anyway, I think the tool you're looking for is Nvidia Inspector. It will allow you to impose framerate limits, force triple buffering, etc.

Personally, I find adaptive vsync to be the most tolerable solution, as framerate dips aren't all that noticeable.

I'm with you though, a steady framerate is the way to go. I have an overclockable 1440p monitor, and since it's rare to find newer games I can run at a steady 120Hz, I like to figure out what the minimum framerate is, set the refresh rate accordingly, and turn on vsync.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Triple buffering will just give you bad microstutter at anything other than 30 or 60fps; that's the entire point of Gsync. At any other framerate you're forced to choose between screen tearing and bad microstutter.

Note: Triple buffering has no effect on stutter by itself; it only has an effect with V-sync enabled. And the Nvidia control panel setting is only for OpenGL, so the game itself has to have an option to enable it. If you use multiple GPUs, you already have 3 or more buffers (you gain 1 buffer per GPU present), provided the game supports AFR.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
Note: Triple buffering has no effect on stutter by itself; it only has an effect with V-sync enabled. And the Nvidia control panel setting is only for OpenGL, so the game itself has to have an option to enable it. If you use multiple GPUs, you already have 3 or more buffers (you gain 1 buffer per GPU present), provided the game supports AFR.

There is no point to triple buffering without v-sync, as v-sync is what causes the need for it in the first place. Its sole purpose is to prevent framerate halving when your framerate drops below the refresh rate.
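
To put rough numbers on the halving (a simplified model that assumes a fixed render time per frame and a 60Hz display with v-sync on):

```python
import math

REFRESH = 1 / 60     # 60Hz display, ~16.7 ms per refresh
RENDER = 0.020       # assume the GPU needs a fixed 20 ms per frame (50 fps raw)

# Double buffering + v-sync: the GPU can't start the next frame until the back
# buffer is freed at a refresh, so every frame ends up costing a whole number
# of refreshes -- here 2 of them, i.e. the framerate halves to 30.
double_fps = 1 / (math.ceil(RENDER / REFRESH) * REFRESH)

# Triple buffering + v-sync: the GPU starts the next frame straight away in the
# spare buffer, so the delivered rate averages out near the raw 50 fps.
# (The individual frames still land on refresh ticks, hence the uneven pacing
# discussed above.)
triple_fps = 1 / RENDER

print(f"double buffered: {double_fps:.0f} fps, triple buffered: ~{triple_fps:.0f} fps")
```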
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
There is no point to triple buffering without v-sync, as v-sync is what causes the need for it in the first place. Its sole purpose is to prevent framerate halving when your framerate drops below the refresh rate.

Yes, and I wanted to make that clear, as there are a lot of people who do not know that. I see people all the time thinking that triple buffering solves tearing. And others think you can force it on for DirectX games. Neither are correct.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
Yes, and I wanted to make that clear, as there are a lot of people who do not know that. I see people all the time thinking that triple buffering solves tearing. And others think you can force it on for DirectX games. Neither are correct.
I think I misread your post.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
There is definitely microstutter, it's physically impossible for there not to be. It might not be very noticeable at 36fps since the majority of the frames are repeated twice. Skyrim and Fallout are just using a poorly made game engine; there is stuttering even when the game is locked at 60fps.

Never had microstutter with Triple Buffering; gameplay is very smooth. I have experienced microstutter in a previous SLI setup where one card's load was different from the other's: some things could only be calculated on one card, so it took a little extra time for that card to have its frame ready, which made the game feel like it was running at less than 30fps when it was actually higher.

Never experienced that with a single card and Triple Buffering.
 
Last edited:

futurefields

Diamond Member
Jun 2, 2012
6,471
32
91
There is definitely microstutter, it's physically impossible for there not to be.

One thing about RadeonPro's FPS limiter is that it's more of a "soft limit".

So with a 36 fps cap, you will see the fps displayed as 35-36-35-36-35-36, with the occasional 37 thrown in there.

I think this ^ might be the GPU driver doing its own thing to spit out frames at a more even pace, but I'm probably wrong about that.
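
If I had to guess at how that kind of soft cap behaves, it's probably something like this (a toy Python version of a sleep-based limiter; it has nothing to do with RadeonPro's actual code):

```python
import time

TARGET_FPS = 36
FRAME_TIME = 1.0 / TARGET_FPS

def render_frame():
    time.sleep(0.012)    # stand-in for ~12 ms of actual game rendering

prev = time.perf_counter()
for _ in range(120):
    render_frame()
    # Sleep off whatever is left of the frame budget. Because sleep() isn't
    # exact and the render time varies, the measured rate hovers around the
    # cap (35-36-37-ish) rather than sitting dead on it -- a "soft" limit.
    elapsed = time.perf_counter() - prev
    if elapsed < FRAME_TIME:
        time.sleep(FRAME_TIME - elapsed)
    now = time.perf_counter()
    print(f"{1.0 / (now - prev):.0f} fps")
    prev = now
```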

I know technically what I do with my framerates is "wrong" but for me it just works. :) Happy to know I'll be able to do the same with Nvidia! :biggrin:
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Never had microstutter with Triple Buffering; gameplay is very smooth. I have experienced microstutter in a previous SLI setup where one card's load was different from the other's: some things could only be calculated on one card, so it took a little extra time for that card to have its frame ready, which made the game feel like it was running at less than 30fps when it was actually higher.

Never experienced that with a single card and Triple Buffering.

If V-sync is on, and triple buffering or SLI is used in the game, you will get jumps between 16ms and 33ms when your FPS are between 30 and 60. This isn't really called microstutter, but it is stuttering, and is measurable. Some people have bigger issues with it than others. It is there, with no exceptions, when the above conditions are met. You may just not be bothered by it.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Yes, and I wanted to make that clear, as there are a lot of people who do not know that. I see people all the time thinking that triple buffering solves tearing. And others think you can force it on for DirectX games. Neither are correct.

Well, for years I have been using it to stop tearing by forcing it in the NVCPL, and it has worked perfectly for DX as well as OpenGL. There used to be a tooltip when you hovered over the setting that said it only worked in OpenGL; this is no longer the case.

I do have some issues forcing it in some DX11 games, where I have to enable and then disable the in-game Vsync settings in order for it to fall back on the NVCPL settings, but this is usually remedied in a later patch, which proves it still works.

If V-sync is on, and triple buffering or SLI is used in the game, you will get jumps between 16ms and 33ms when your FPS are between 30 and 60. This isn't really called microstutter, but it is stuttering, and is measurable. Some people have bigger issues with it than others. It is there, with no exceptions, when the above conditions are met. You may just not be bothered by it.

In my case it would be 85, 42 and 28. When I use Afterburner to monitor my GPU performance, the frametime reading moves as fluidly as the others and doesn't stick to any particular number unless the framerate itself is locked.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
Well, for years I have been using it to stop tearing by forcing it in the NVCPL, and it has worked perfectly for DX as well as OpenGL. There used to be a tooltip when you hovered over the setting that said it only worked in OpenGL; this is no longer the case.

I do have some issues forcing it in some DX11 games, where I have to enable and then disable the in-game Vsync settings in order for it to fall back on the NVCPL settings, but this is usually remedied in a later patch.



In my case it would be 85, 42 and 28. When I use Afterburner to monitor my GPU performance, the frametime reading moves as fluidly as the others and doesn't stick to any particular number unless the framerate itself is locked.

Your monitor can't physically display steady frametimes with vsync unless the framerate matches the full refresh rate, 1/2 refresh rate, 1/4 refresh rate etc. This is not up for debate. It is a physical limitation. This is why Gsync/freesync is a thing.
 
Last edited:

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Your monitor can't physically display steady frametimes with vsync unless the framerate matches the full refresh rate, 1/2 refresh rate, 1/4 refresh rate etc. This is not up for debate. It is a physical limitation. This is why Gsync/freesync is a thing.

It's definitely not noticeable for me then. I thought the only advantage of Free/Gsync was that you could have the same effect as triple buffering, just without the added latency or the extra buffered frame, so it would benefit people who play fast-paced games without the tearing.

Surely with triple buffering it's up to the GPU to decide at what intervals to send frames to the monitor. The GPU can still have unlocked frametimes before the info gets to the buffers or the monitor.
 
Last edited:

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
It's definitely not noticeable for me then. I thought the only advantage of Free/Gsync was that you could have the same effect as triple buffering, just without the added latency, so it would benefit people who play fast-paced games without the tearing.

While that is part of the benefit, the real benefit is that the refresh rate can always match the frame rate, making slight drops virtually unnoticeable, because there will never be any need for duplicate frames.

It's simple math; if you have triple buffering on, with a standard monitor, and your framerate drops from 60fps down to, say, 53 fps, there is going to be some stutter. Triple buffering will prevent frame tearing, but 60 isn't divisible by 53, so 7 of those frames are going to have to be shown for two refreshes, causing unstable frametimes, as those 7 frames will stay on screen twice as long as the rest. With Gsync, however, the refresh rate would simply drop to 53Hz, preventing any duplicate frames and maintaining stable frametimes.
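
Spelled out (assuming the 53 frames arrive perfectly evenly, which is the best case):

```python
refresh_hz = 60
fps = 53

# 53 frames have to fill 60 refreshes, so some frames get held for an extra one.
held_frames = refresh_hz - fps       # frames shown for two refreshes (~33.3 ms)
single_frames = fps - held_frames    # frames shown for one refresh (~16.7 ms)

print(f"{held_frames} frames at ~33.3 ms, {single_frames} frames at ~16.7 ms")
# With Gsync the panel would just refresh at 53Hz and every frame
# would get the same ~18.9 ms on screen.
```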
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Well, for years I have been using it to stop tearing by forcing it in the NVCPL, and it has worked perfectly for DX as well as OpenGL. There used to be a tooltip when you hovered over the setting that said it only worked in OpenGL; this is no longer the case.

I do have some issues forcing it in some DX11 games, where I have to enable and then disable the in-game Vsync settings in order for it to fall back on the NVCPL settings, but this is usually remedied in a later patch, which proves it still works.



In my case it would be 85, 42 and 28. When I use Afterburner to monitor my GPU performance, the frametime reading moves as fluidly as the others and doesn't stick to any particular number unless the framerate itself is locked.

You are quite mistaken about triple buffering removing tearing. It can't; it doesn't behave that way. It may be possible to force triple buffering in DirectX, but everything you look up says otherwise. I can't test it, as I run SLI, and SLI forces triple buffering due to the nature of AFR.

Triple buffering only serves a purpose if V-sync is on. It doesn't do anything if it is off.
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
While that is part of the benefit, the real benefit is that the refresh rate can always match the frame rate, making slight drops virtually unnoticeable, because there will never be any need for duplicate frames.

It's simple math; if you have triple buffering on, with a standard monitor, and your framerate drops from 60fps down to, say, 53 fps, there is going to be some stutter. Triple buffering will prevent frame tearing, but 60 isn't divisible by 53, so 7 of those frames are going to have to be shown for two refreshes, causing unstable frametimes, as those 7 frames will stay on screen twice as long as the rest. With Gsync, however, the refresh rate would simply drop to 53Hz, preventing any duplicate frames and maintaining stable frametimes.

I thought we just agreed that wasn't the case. Triple buffering just allows your GPU to start a new frame while waiting for another to be displayed. Without V-sync, they just get displayed as they are created, and tearing still occurs.

Man, this topic gets so messed up due to this misunderstanding.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
You are quite mistaken about triple buffering removing tearing. It can't; it doesn't behave that way. It may be possible to force triple buffering in DirectX, but everything you look up says otherwise. I can't test it, as I run SLI, and SLI forces triple buffering due to the nature of AFR.

Triple buffering only serves a purpose if V-sync is on. It doesn't do anything if it is off.

I always use it with Vsync; of course I wouldn't use it on its own. And I am not mistaken, I have been using it like this for years. The majority of the internet has had it wrong for quite a while now because people are just repeating what they have read and not testing for themselves with updated drivers.

Like I said before, the tooltip about it only working with OpenGL was from back in the Detonator driver days and is no longer there. You can test it for yourself by disabling SLI.

While that is part of the benefit, the real benefit is that the refresh rate can always match the frame rate, making slight drops virtually unnoticeable, because there will never be any need for duplicate frames.

It's simple math; if you have triple buffering on, with a standard monitor, and your framerate drops from 60fps down to, say, 53 fps, there is going to be some stutter. Triple buffering will prevent frame tearing, but 60 isn't divisible by 53, so 7 of those frames are going to have to be shown for two refreshes, causing unstable frametimes, as those 7 frames will stay on screen twice as long as the rest. With Gsync, however, the refresh rate would simply drop to 53Hz, preventing any duplicate frames and maintaining stable frametimes.

The whole point of having the extra buffer is to alleviate that issue. Think about it in three stages: rendering, buffering, and displaying.

With Triple Buffering the GPU is free to render as quickly as it wants without having to wait for the display. Once rendering is complete, the finished frame is put into one of the two back buffers. The front buffer then gets to decide, depending on what the display is asking for, which of those frames it should send to the monitor.

The extra buffer sits between the GPU and the display, meaning they are decoupled and don't have to be in sync like they would be with just double buffering.

When just using double buffering, the display will behave like you say, and the Afterburner readout will clearly show 16.6ms at 60fps and twice that at 30fps. With Triple Buffering the GPU is free to render as it needs to, and the on-screen reading reflects that.
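
To sketch what I mean by the three stages (just a toy Python model assuming a 60Hz display and a GPU that takes 12ms per frame; it's not real swap-chain code):

```python
REFRESH = 1 / 60      # display asks for a frame every ~16.7 ms
RENDER = 0.012        # assume the GPU takes a fixed 12 ms per frame

# Stage 1 -- rendering: the GPU draws flat out, never waiting on the display,
# because there is always a spare back buffer to draw into.
finished = []
t = RENDER
while t <= 0.1:
    finished.append(t)
    t += RENDER

# Stages 2 and 3 -- buffering and displaying: at every refresh the display
# takes whichever completed frame is newest; anything older is skipped.
shown = []
tick = REFRESH
while tick <= 0.1:
    ready = [f for f in finished if f <= tick]
    if ready:
        shown.append(max(ready))
    tick += REFRESH

print(f"GPU rendered {len(finished)} frames, display showed {len(set(shown))} of them")
# The two ends are decoupled -- the GPU never stalls -- but the flips
# themselves still only happen on refresh ticks.
```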
 
Last edited:

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
I thought we just agreed that wasn't the case. Triple buffering just allows your GPU to start a new frame while waiting for another to be displayed. Without V-sync, they just get displayed as they are created, and tearing still occurs.

Man, this topic gets so messed up due to this misunderstanding.

Sorry, I thought we were assuming usage with v-sync at this point in the discussion.