I cannot get a clear answer

KeeperFiM

Junior Member
Jul 16, 2013
I like to have vsync in my games so the animation is smoother and there's no tearing, but I don't like input lag. I've heard that triple buffering can help, while others swear it adds to the problem. I use RadeonPro to enable it (with vsync on, of course; it doesn't work otherwise), but I can't tell whether it helps or hurts. So I want an answer, with actual technical reasoning, not hearsay, personal anecdotes, or, as I see a lot, made-up BS. Does triple buffering increase or decrease input lag?
 

toyota

Lifer
Apr 15, 2001
You will not get a clear answer here or anywhere else. I have seen knowledgeable people go back and forth on this for years: some claim it reduces lag, while others say it adds to it.
 

KeeperFiM

Junior Member
Jul 16, 2013
You will not get a clear answer here or anywhere else. I have seen knowledgeable people go back and forth on this for years: some claim it reduces lag, while others say it adds to it.

I expected that, but I'm hoping that one day an expert will see one of the threads I've posted across various sites and *finally* tell me...
 

bystander36

Diamond Member
Apr 1, 2013
Well, it's a complicated answer. It depends on whether the game uses DX or OpenGL, and on whether your FPS can reach or surpass your refresh rate.

The primary reason triple buffering is used is so that, while the GPU waits for the monitor to reach vertical blanking (the period when it is not updating the screen), it can begin a second frame in the third buffer. With a two-buffer system, the display buffer can only be changed between refreshes, so the GPU must hold the finished image and do nothing until it can be flipped to the display buffer. This can result in a huge loss of FPS and adds latency.

With V-sync, triple buffering and OpenGL (not used often these days), when you can generate frames faster than your refresh rate, the GPU keeps alternating between the two back buffers, and when the display is between refreshes it shows whichever frame in those back buffers is newest. That means the GPU may throw away rendered frames that never get displayed because a newer one was generated. This method does not add any latency, and it improves FPS, which improves latency.

With V-sync, triple buffering and DX (much more common these days), when you are generating more FPS than your refresh rate, the GPU starts rendering into the second back buffer. Once both back buffers hold complete images, rather than overwriting the buffer with the oldest image to start a new one, the GPU simply stops, and when vertical retrace (the time between refreshes) comes around, it displays the oldest complete image. This method results in a full frame's worth of added latency.
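
To make that concrete, here is a toy timing model (my own sketch, not anything from the drivers or the APIs) comparing the three behaviours described above. It assumes a 60 Hz refresh, a GPU that renders a frame in 8 ms (i.e. faster than the refresh), input sampled the moment a frame starts rendering, and latency measured from that sample to the vblank where the frame appears; the numbers and function names are made up for illustration.

```python
# Toy model of "input sampled -> frame shown" delay under v-sync.
# Double buffering: GPU idles after each frame until the flip.
# OpenGL-style triple buffering: GPU keeps rendering, newest frame is shown.
# DX-style triple buffering: completed frames queue up and are shown oldest-first.

REFRESH = 1000.0 / 60.0   # ms per refresh at 60 Hz
RENDER  = 8.0             # ms the GPU needs per frame (faster than the refresh)
VBLANKS = 1000            # how many refreshes to simulate

def double_buffer():
    """V-sync with 2 buffers: render one frame, then wait for the flip."""
    latencies, render_start = [], 0.0
    for n in range(1, VBLANKS):
        vblank = n * REFRESH
        if render_start + RENDER <= vblank:    # frame is ready: flip now
            latencies.append(vblank - render_start)
            render_start = vblank              # next frame starts after the flip
    return sum(latencies) / len(latencies)

def gl_triple_buffer():
    """V-sync with 3 buffers, OpenGL style: never stall, show the newest frame."""
    latencies, t, newest_start = [], 0.0, None
    for n in range(1, VBLANKS):
        vblank = n * REFRESH
        while t + RENDER <= vblank:            # GPU ping-pongs on the back buffers
            newest_start = t
            t += RENDER
        if newest_start is not None:           # older finished frames get discarded
            latencies.append(vblank - newest_start)
    return sum(latencies) / len(latencies)

def dx_flip_queue():
    """V-sync with a 3-buffer DX-style chain: show frames oldest-first (FIFO)."""
    latencies, queue, t = [], [], 0.0
    for n in range(1, VBLANKS):
        vblank = n * REFRESH
        while len(queue) < 2 and t + RENDER <= vblank:
            queue.append(t)                    # remember when its input was sampled
            t += RENDER
        if len(queue) == 2:
            t = max(t, vblank)                 # both back buffers full: GPU stalls
        if queue:
            latencies.append(vblank - queue.pop(0))   # display the oldest frame
    return sum(latencies) / len(latencies)

if __name__ == "__main__":
    print(f"double buffering + v-sync    : {double_buffer():5.1f} ms")
    print(f"OpenGL-style triple buffering: {gl_triple_buffer():5.1f} ms")
    print(f"DX-style 3-deep flip queue   : {dx_flip_queue():5.1f} ms")
```

Under those assumptions the averages come out to roughly one refresh for double buffering, a bit under one refresh for the OpenGL-style swap, and roughly two refreshes for the DX-style queue, which is the same ordering as the reasoning above.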

Recap:
With V-sync, when your FPS never reach the refresh rate, triple buffering does not add any latency in either DX or OpenGL and can greatly improve FPS, which improves latency.

In OpenGL with V-sync, triple buffering lets the GPU render beyond your refresh rate, improving latency, and only the most recent images are displayed; the extras are thrown out.

In DX with V-sync, if your FPS reach your refresh rate, you become limited to your refresh rate, and triple buffering will start showing a frame behind, resulting in a full frame's worth of latency.
 

dguy6789

Diamond Member
Dec 9, 2002
Triple buffering reduces input lag compared to normal double buffering. It is not as good as simply running with vsync disabled. As of now there is no technique in existence, other than having vsync disabled, that does not introduce input lag. It's something that has annoyed me for more than a decade, and there has been no real progress made since the GeForce 2 era.
 

bystander36

Diamond Member
Apr 1, 2013
Triple buffering reduces input lag compared to normal double buffering. It is not as good as simply running with vsync disabled. As of now there is no technique in existence, other than having vsync disabled, that does not introduce input lag. It's something that has annoyed me for more than a decade, and there has been no real progress made since the GeForce 2 era.

Read the post I made prior. That is not always the case.

Triple buffering, when used with DirectX and with an FPS that can surpass your refresh rate, forces the GPU to display a frame behind the most current one. This adds latency, and it's why some recommend using a 59 FPS limiter (see the sketch at the end of this post).

In every other condition, triple buffering improves latency or leaves it unchanged, but unfortunately the one condition that does add latency is possibly the most common, depending on your rig.

http://en.wikipedia.org/wiki/Multiple_buffering
Another method of triple buffering involves synchronizing with the monitor frame rate. Drawing is not done if both back buffers contain finished images that have not been displayed yet. This avoids wasting CPU drawing undisplayed images and also results in a more constant frame rate (smoother movement of moving objects), but with increased latency.[1] This is the case when using triple buffering in DirectX, where a chain of 3 buffers are rendered and always displayed.
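
On the 59 FPS limiter mentioned above, here is a minimal sketch of what such a cap does, assuming a hypothetical render_one_frame() callback standing in for the game's per-frame work (tools like RadeonPro apply the cap inside the driver; this only shows the principle). The idea is that producing frames slightly slower than the refresh consumes them keeps the DirectX back-buffer queue from staying full, so the extra queued frame, and its extra frame of latency, never builds up.

```python
import time

TARGET_FPS = 59.0                    # just under a 60 Hz refresh, as suggested above
FRAME_BUDGET = 1.0 / TARGET_FPS      # minimum time each loop iteration may take

def run_capped(render_one_frame, frames=600):
    """Run the per-frame work, then sleep off whatever is left of the frame budget."""
    for _ in range(frames):
        start = time.perf_counter()
        render_one_frame()                       # hypothetical: game logic + draw + present
        spent = time.perf_counter() - start
        if spent < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - spent)     # idle so the loop never exceeds 59 FPS

if __name__ == "__main__":
    run_capped(lambda: time.sleep(0.005), frames=60)   # 5 ms of fake "rendering" as a stand-in
```

A real limiter would busy-wait for the last fraction of a millisecond, since time.sleep() is not that precise, but the principle is the same.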
 

blastingcap

Diamond Member
Sep 16, 2010
I like to have vsync in my games so the animation is smoother and there's no tearing, but I don't like input lag. I've heard that triple buffering can help, while others swear it adds to the problem. I use RadeonPro to enable it (with vsync on, of course; it doesn't work otherwise), but I can't tell whether it helps or hurts. So I want an answer, with actual technical reasoning, not hearsay, personal anecdotes, or, as I see a lot, made-up BS. Does triple buffering increase or decrease input lag?

I know this is going to sound crazy, but try playing with it on and off and see which you like better or if you can even tell the difference.
 

KeeperFiM

Junior Member
Jul 16, 2013
Well, it's a complicated answer. It depends on whether the game uses DX or OpenGL, and on whether your FPS can reach or surpass your refresh rate.

The primary reason triple buffering is used is so that, while the GPU waits for the monitor to reach vertical blanking (the period when it is not updating the screen), it can begin a second frame in the third buffer. With a two-buffer system, the display buffer can only be changed between refreshes, so the GPU must hold the finished image and do nothing until it can be flipped to the display buffer. This can result in a huge loss of FPS and adds latency.

With V-sync, triple buffering and OpenGL (not used often these days), when you can generate frames faster than your refresh rate, the GPU keeps alternating between the two back buffers, and when the display is between refreshes it shows whichever frame in those back buffers is newest. That means the GPU may throw away rendered frames that never get displayed because a newer one was generated. This method does not add any latency, and it improves FPS, which improves latency.

With V-sync, triple buffering and DX (much more common these days), when you are generating more FPS than your refresh rate, the GPU starts rendering into the second back buffer. Once both back buffers hold complete images, rather than overwriting the buffer with the oldest image to start a new one, the GPU simply stops, and when vertical retrace (the time between refreshes) comes around, it displays the oldest complete image. This method results in a full frame's worth of added latency.

Recap:
With V-sync, when your FPS never reach the refresh rate, triple buffering does not add any latency in either DX or OpenGL and can greatly improve FPS, which improves latency.

In OpenGL with V-sync, triple buffering lets the GPU render beyond your refresh rate, improving latency, and only the most recent images are displayed; the extras are thrown out.

In DX with V-sync, if your FPS reach your refresh rate, you become limited to your refresh rate, and triple buffering will start showing a frame behind, resulting in a full frame's worth of latency.
Everything I've personally experienced supports your answer, but I now have one final question: with normal double-buffered vsync and a framerate that reaches the refresh rate, will there be less input lag than in the same situation with triple buffering?
 

KeeperFiM

Junior Member
Jul 16, 2013
I know this is going to sound crazy, but try playing with it on and off and see which you like better or if you can even tell the difference.

I can't stand the thought that there could be some lag even if I can't notice it... gah
 

BrightCandle

Diamond Member
Mar 15, 2007
I am so glad that at some point in the (near) future I won't ever have to explain this topic again. The classic trade-off of lag and tearing, and all the conditions and crazy-ass schemes we have come up with to try to reduce the impact of the problems they cause, will all just be gone. I'll be so darn annoyed if G-Sync doesn't take over the world!!!!
 

bystander36

Diamond Member
Apr 1, 2013
Everything I've personally experienced supports your answer, but I now have one final question: with normal double-buffered vsync and a framerate that reaches the refresh rate, will there be less input lag than in the same situation with triple buffering?

In a DirectX game, yes.
In an OpenGL game, no.
 

KeeperFiM

Junior Member
Jul 16, 2013
22
0
0
I am so glad that at some point in the (near) future I won't ever have to explain this topic again. The classic trade-off of lag and tearing, and all the conditions and crazy-ass schemes we have come up with to try to reduce the impact of the problems they cause, will all just be gone. I'll be so darn annoyed if G-Sync doesn't take over the world!!!!
It won't, because it's for Nvidia cards only, and not everyone has one.
 

bystander36

Diamond Member
Apr 1, 2013
OP, you might want to read this: http://www.anandtech.com/show/2794/2

That is a good read, but unfortunately, it does not mention or account for this rule in DirectX:


http://en.wikipedia.org/wiki/Multiple_buffering
Another method of triple buffering involves synchronizing with the monitor frame rate. Drawing is not done if both back buffers contain finished images that have not been displayed yet. This avoids wasting CPU drawing undisplayed images and also results in a more constant frame rate (smoother movement of moving objects), but with increased latency.[1] This is the case when using triple buffering in DirectX, where a chain of 3 buffers are rendered and always displayed.
 

Black Octagon

Golden Member
Dec 10, 2012
I am so glad that at some point in the (near) future I won't ever have to explain this topic again. The classic trade-off of lag and tearing, and all the conditions and crazy-ass schemes we have come up with to try to reduce the impact of the problems they cause, will all just be gone. I'll be so darn annoyed if G-Sync doesn't take over the world!!!!

Hear hear!