Question for Nvidia users about the Nvidia display driver


hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
I always use it with Vsync; of course I wouldn't use it on its own, and I am not mistaken, I have been using it like this for years. The majority of the internet has had it wrong for quite a while now. You can test it for yourself by disabling SLI.



The whole point of having the extra buffer is to alleviate that issue. Think about it in three stages: rendering, buffering and displaying.

With triple buffering the GPU is free to render as quickly as it wants without having to wait for the display. Once rendering is complete, the finished frame is put into one of the two back buffers. The front buffer then gets to decide, depending on what the display is asking for, which of those frames it should send to the monitor.

The extra buffer acts as an intermediary between the GPU and the display, meaning they are decoupled and don't have to be in sync like they would with just double buffering.

When just using double buffering, the display will behave like you say, and the Afterburner readout will clearly show 16.6ms at 60fps and twice that at 30fps. With triple buffering the GPU is free to render as it needs to, and the on-screen readout reflects that.

No. Double buffering (with vsync) simply drops the framerate down to 30fps if your framerate dips a single frame below 60. Triple buffering (with vsync) only doubles the frametimes of the frames that need to be doubled. I.e. on a 60Hz monitor, if your GPU outputs 53 frames in a second with triple-buffered vsync, you will get 53 unique frames, with 7 of those frames repeated. With double-buffered vsync, only 30 unique frames would get displayed, all of them doubled.

Just think about it for a moment; your monitor updates 60 times per second. There is no physical way for your GPU to display 53 frames in a second without frametime spikes for a few of those frames. It is physically impossible.
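
To make the arithmetic concrete, here is a minimal sketch (assuming an idealized 60Hz scanout that simply shows the newest completed frame at each refresh; real drivers and queues differ in the details):

```python
# Idealized model: the display refreshes every 1/60 s and shows whichever
# frame the GPU has most recently completed. If the GPU finishes a frame
# every 1/53 s, some frames must be held on screen for two refreshes.
import math

REFRESH_HZ = 60
RENDER_FPS = 53

shown = []
for n in range(REFRESH_HZ):              # one second of refreshes
    t = n / REFRESH_HZ                   # time of this refresh
    latest = math.floor(t * RENDER_FPS)  # newest frame finished by time t
    shown.append(latest)

unique = len(set(shown))
print(f"{len(shown)} refreshes, {unique} unique frames, "
      f"{len(shown) - unique} shown twice")
# -> 60 refreshes, 53 unique frames, 7 shown twice (33.3ms instead of 16.7ms)
```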
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
No. Double buffering (with vsync) simply drops the framerate down to 30fps if your framerate dips a single frame below 60. Triple buffering (with vsync on) only doubles the frametimes of the frames that need to be doubled.

Just think about it for a moment; your monitor updates 60 times per second. There is no physical way for your GPU to display 53 frames in a second without frametime spikes for a few of those frames. It is physically impossible.

It doesn't need to. The front buffer sends the frames out at a rate the monitor can handle, 30/60, whatever. It doesn't need to display 53 frames; once the GPU has finished a frame and sent it to one of the back buffers, it is free to start the next frame. Think about it.

The buffer eliminates frametime spikes because it decouples what the GPU can produce from what the monitor needs.
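
One way to picture the mechanism being described is a toy model along these lines (the names and structure are purely illustrative, not any actual driver API):

```python
# Two back buffers the GPU alternates between, plus a front buffer the
# monitor scans out. The GPU never waits for the display; the flip at
# each vblank just takes the newest completed frame.
class TripleBuffer:
    def __init__(self):
        self.back = [None, None]  # the two back buffers
        self.write_idx = 0        # buffer the GPU renders into next
        self.front = None         # what the monitor is scanning out

    def gpu_finish_frame(self, frame):
        # Drop the finished frame into a back buffer and immediately
        # start rendering the next one into the other buffer.
        self.back[self.write_idx] = frame
        self.write_idx ^= 1

    def vblank_flip(self):
        # At each refresh, present the most recently completed frame.
        newest = self.back[self.write_idx ^ 1]
        if newest is not None:
            self.front = newest
        return self.front
```

In this model the GPU side never blocks, which is the decoupling being claimed; note, though, that the flip itself still only happens at refresh boundaries.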
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
It doesn't need to. The front buffer sends the frames out at a rate the monitor can handle, 30/60, whatever. It doesn't need to display 53 frames; once the GPU has finished a frame and sent it to one of the back buffers, it is free to start the next frame. Think about it.

The buffer eliminates frametime spikes because it decouples what the GPU can produce from what the monitor needs.

I don't know how else to explain it to you. 60 is not evenly divisible by 53. It's no more complicated than that.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
I don't know how else to explain it to you. 60 is not evenly divisible by 53. It's no more complicated than that.

No, 53 isn't a factor of 60 (technically it is divisible, just not evenly), but it doesn't need to be with triple buffering. I've explained over and over why not. The post-buffer side handles the rate at which frames get sent to the monitor, meaning the GPU can work as and when it likes. The buffer itself means the two don't have to be in sync, effectively decoupling the monitor frametime from the GPU frametime.
 

futurefields

Diamond Member
Jun 2, 2012
6,471
32
91
No, 53 isn't a factor of 60 (technically it is divisible, just not evenly), but it doesn't need to be with triple buffering. I've explained over and over why not. The post-buffer side handles the rate at which frames get sent to the monitor, meaning the GPU can work as and when it likes. The buffer itself means the two don't have to be in sync, effectively decoupling the monitor frametime from the GPU frametime.

It does if the output is a 60hz monitor?
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
No, 53 isn't a factor of 60 (technically it is divisible, just not evenly), but it doesn't need to be with triple buffering. I've explained over and over why not. The post-buffer side handles the rate at which frames get sent to the monitor, meaning the GPU can work as and when it likes. The buffer itself means the two don't have to be in sync, effectively decoupling the monitor frametime from the GPU frametime.

That is not how it works at all. Triple-buffered vsync allows the GPU to draw frames more frequently than once every 1/30th of a second if the framerate drops below 60; that is the limitation of double-buffered vsync. Your statement contradicts what vsync even is. This decoupling you're speaking of is not a thing.
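
The double-buffered case is easy to sketch under the same idealized assumptions (one back buffer, and the GPU stalls until the next vblank flip frees it):

```python
# With one back buffer, the GPU cannot start a new frame until the old
# one is flipped at a vblank. A frame that takes slightly longer than one
# refresh (1/53 s vs 1/60 s) therefore ends up costing two refreshes.
REFRESH = 1 / 60
RENDER_TIME = 1 / 53

flips = 0
ready_at = RENDER_TIME           # when the first frame finishes
for n in range(1, 61):           # vblanks at n/60 s over one second
    if ready_at <= n * REFRESH:  # frame done in time for this vblank?
        flips += 1               # flip, and only now start the next frame
        ready_at = n * REFRESH + RENDER_TIME
print(flips)  # -> 30: the framerate falls straight to 30fps
```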
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
That is not how it works at all. Triple-buffered vsync allows the GPU to draw frames more frequently than once every 1/30th of a second if the framerate drops below 60; that is the limitation of double-buffered vsync. Your statement contradicts what vsync even is. This decoupling you're speaking of is not a thing.

It is effectively decoupled; otherwise we'd be in the situation you get with double buffering, where the GPU has to sync up with the monitor: they are coupled together and must wait for each other.

With triple buffering, because there are two buffers for the front buffer to pick from before sending a frame to the monitor, the rate at which the GPU can produce frames can be different from the rate at which the monitor displays them. Hence decoupling.

What do you think decoupling means?
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
Here guy, these are my frametimes from World of Warcraft with triple-buffered vsync turned on (the game itself has the option). I stood in one place where my framerate was 52 fps. The pictures speak for themselves: about 14% of the frames have twice the latency of the rest, just like you would expect to see if 8 of the frames had to be doubled.

[Image: 8LYrdtf.png]

[Image: GmWbUE8.png]
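
(And the arithmetic matches: at 52 fps on a 60Hz display, 60 - 52 = 8 of the 60 refreshes must repeat a frame, and 8/60 ≈ 13%, right in line with the readout.)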
 

futurefields

Diamond Member
Jun 2, 2012
6,471
32
91
The 60hz monitor is always displaying frames at 60hz.

So if the GPU is outputting anything other than a multiple of 60hz there will be microstutter.

Whether it is an issue or visible depends on the game, engine, etc...

I think maybe I just choose to play games where microstutter isn't very apparent even when it happens. Perhaps due to motion blur or just the perspective/camera of the game.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
It is effectively decoupled; otherwise we'd be in the situation you get with double buffering, where the GPU has to sync up with the monitor: they are coupled together and must wait for each other.

With triple buffering, because there are two buffers for the front buffer to pick from before sending a frame to the monitor, the rate at which the GPU can produce frames can be different from the rate at which the monitor displays them. Hence decoupling.

What do you think decoupling means?

I know what decoupling means. What I don't understand is why you think a GPU would decouple its drawing of frames from a monitor's refresh rate while vsync is turned on.
 

futurefields

Diamond Member
Jun 2, 2012
6,471
32
91
Even with Vsync off, the monitor is still refreshing at 60Hz, always resulting in a torn image if there is any change in frame times.

It actually sounds like he's talking about G-Sync.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
I know what decoupling means. What I don't understand is why you think a GPU would decouple its drawing of frames from a monitor's refresh rate while vsync is turned on.

Because triple buffering allows the GPU to draw frames without having to wait for Vsync.

I did a few tests myself with Unigine Heaven and found similar results to yours with Vsync and Triple Buffering. I then tried without Vsync enabled and the results were pretty much the same.

I then tried with both Vsync and Triple Buffering disabled and again the results were pretty much the same, leading me to believe that the frametime measurements in FRAPS are taken after the buffers, whereas the frametime in Afterburner is measured before the buffers.

Vsync and Triple Buffering:
[Image: 2u8ihbd.png]


No Vsync, Just Triple Buffering:
[Image: ms0z9t.png]


No Vsync or Triple Buffering:
[Image: 2crow3q.png]


Try for yourself.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
Even with Vsync off, the monitor is still refreshing at 60Hz, always resulting in a torn image if there is any change in frame times.

It actually sounds like he's talking about G-Sync.

Yes, with Vsync off you get tearing, but with Vsync and Triple Buffering enabled you get no tearing. I do not have Gsync.

I'm going to have to be away for a while but I'll check back later.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
I got different results. Besides, even if you were correct about when FRAPS is taking frametimes, wouldn't you agree that the final tangible result is just as I've said all along anyway?

both of these were taken in the same spot:

no vsync:
[Image: ehQGRBq.png]

[Image: x9yrKzk.png]


here is with regular double buffered vsync:

[Image: u5zmzs7.png]
[Image: l6lGAke.png]
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
I've been running a few tests and have come up with some interesting results.

First I tried the first Firestrike test (not the demo) and found that it bypassed what I had set for triple buffering in the NVCPL; I had to use the settings in the benchmark itself, although Vsync worked either way.

My results pretty much reflected yours: there was indeed much more variance with TB enabled.

I then tried the Monster Hunter Online benchmark, which uses the Crytek engine, and had some different results. I benched at 2560x1080, fullscreen, with SMAA.

This is with Vsync and Triple Buffering enabled in the CPL:
[Image: ftzozl.png]


And this is with both disabled:
[Image: w7msmp.png]


I watched closely and there was definite screen tearing in the test without Vsync, but none at all with them both enabled; also, the framerate was limited to my refresh rate (85Hz).

I'll zoom in so we can see more clearly. This is with both enabled:
[Image: 262wax5.png]


and this is with both disabled:
[Image: 29x88bo.png]


As you can see, there is very little difference. My theory (and it needs more testing) is that the NVCPL handles Triple Buffering differently from in-game settings. As for the Heaven benchmarks, I did have D3DOverrider running in the background, which might explain the previous results. I disabled it for these tests and will retest Heaven.

As for the original point, I wasn't denying that there is additional latency with triple buffering, but at no point in my years of having TB forced on in the NVCPL have I noticed that it is any less smooth. It may add a slight delay to what you see, which is why fast-paced FPS players prefer to have it disabled for online play, and Gsync will be great for this, giving you the benefits of no tearing with no added latency. But most of the time, for me at least, Triple Buffering is enough.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
Deders, you are wrong on this. If you use triple buffering, you are correct in that it allows you to have 53 FPS, or 45 FPS, or anything in between, without dropping off a cliff to 30 FPS. However, your monitor still refreshes at 60Hz, with an even 16.7ms between every refresh.

If you have 50 FPS, you are going to get a mix of 16.7ms frames and 33.3ms frames, and nothing in between. It will look something like this:
16.7ms, 16.7ms, 33.3ms, 16.7ms, 33.3ms, 16.7ms....
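
Under an idealized 60Hz scanout, that mix falls straight out of the arithmetic (a sketch, not a measurement):

```python
# At 50 unique frames per second on a 60 Hz panel, 10 frames must be held
# for two refreshes; the rest are shown once. Nothing in between exists,
# because the panel can only change images at 16.7 ms boundaries.
REFRESH_MS = 1000 / 60

fps, refreshes = 50, 60
doubled = refreshes - fps        # frames held for two refreshes -> 10
single = fps - doubled           # frames held for one refresh  -> 40
frametimes = [REFRESH_MS] * single + [2 * REFRESH_MS] * doubled
print(single, doubled, round(sum(frametimes)))  # 40, 10, 1000 ms total
```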

This page shows some good graphs. In the ones with normal V-sync you see constant jumps between 16.7ms and 33.3ms, and in the one with adaptive V-sync the frame latency is consistent: http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-10.html

Edit 2: It appears the first set of data has V-sync on, even if you thought it was off. In your 2nd set of posts, it looks like it is off. Your D3DOverrider may be to blame.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I always use it with Vsync; of course I wouldn't use it on its own, and I am not mistaken, I have been using it like this for years. The majority of the internet has had it wrong for quite a while now because they are just repeating what they have read and not tested for themselves with updated drivers.

Like I said before, the tooltip about it only working with OpenGL was from back in the Detonator driver days and is no longer there. You can test it for yourself by disabling SLI.

Well, then don't say triple buffering removes tearing. It is very confusing to people, and there are a number of people out there thinking that triple buffering removes tearing. It is v-sync that does. Triple buffering just allows for consistent FPS.

Just because a tooltip isn't explicit doesn't mean it works for DirectX. I'm not saying for certain that it doesn't, just that it never has in the past, and it seems unlikely that it can do that now.
 

hawtdawg

Golden Member
Jun 4, 2005
1,223
7
81
Deders, you are wrong on this. If you use triple buffering, you are correct in that it allows you to have 53 FPS, or 45 FPS, or anything in between, without dropping off a cliff to 30 FPS. However, your monitor still refreshes at 60Hz, with an even 16.7ms between every refresh.

If you have 50 FPS, you are going to get a mix of 16.7ms frames and 33.3ms frames, and nothing in between. It will look something like this:
16.7ms, 16.7ms, 33.3ms, 16.7ms, 33.3ms, 16.7ms....

This page shows some good graphs. In the ones with normal V-sync you see constant jumps between 16.7ms and 33.3ms, and in the one with adaptive V-sync the frame latency is consistent: http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-10.html

edit: The problem with your graph is that it only measures the time between completed rendered frames; it does not track when each frame is displayed on the screen.

I think he's got a setting set somewhere that he's missing. My FRAPS results behave exactly like you would expect them to.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
I think he's got a setting set somewhere that he's missing. My FRAPS results behave exactly like you would expect them to.

Yeah, I think he had v-sync on, and didn't realize it in the first posts. His 2nd post showed results as if it was off. He is probably right in that D3DOverrider is having an effect. He probably had V-sync forced on and didn't realize it.
 

Deders

Platinum Member
Oct 14, 2012
2,401
1
91
I believe we got our wires crossed to begin with. I assumed that anyone using Triple Buffering would be using it alongside Vsync; otherwise, what is the point?

Then an argument arose on the assumption that Vsync wasn't in play.

There was also talk of microstutter when using Triple Buffering and Vsync, which I argued was neither microstutter in the traditional sense nor visible to the naked eye; I would have noticed, as I have been using TB for many years now with different cards and monitors.

I am not denying that there is a little added latency, which is why FreeSync/G-Sync will give gamers the best of both worlds.

I also pointed out that Triple Buffering is in fact usable in DirectX: sometimes it works from the control panel, sometimes you need to give it an extra nudge, and sometimes D3DOverrider is essential, mostly with DX11. Before that, all I needed to do was enable it in the control panel.

I can say the Heaven tests I did are null and void because D3DOverrider was on in the background, but the Monster Hunter Online tests I still believe to be valid.

For the first MHO test, with Vsync and Triple Buffering enabled in the CPL, I did a run-through to see if the framerate went above 85 and to watch for tearing. It never went above 85 and there was no tearing.

I then ran a benchmark with the same settings and that is the first graph above.

I then disabled both Vsync and Triple Buffering in the CPL, ran the benchmark again, and this time I saw definite tearing. That is the 2nd graph above.

D3DOverrider was disabled for both tests.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
At least we are on the same page now. As far as the stutter not being noticeable, that is dependent on the observer. PCPer did a test comparing ~45 FPS with V-sync and without, and asked for feedback in the comment section about which was preferred. The results were mixed. Some people are more sensitive to this type of stutter than others. You obviously are not.