Triple Buffering

Engraver

Senior member
Jun 5, 2007
812
0
0
You don't get eye candy for free; your frame rate just doesn't periodically cut in half from skipping a frame due to vsync. Triple buffering increases input lag also, does it not?
 

Oyeve

Lifer
Oct 18, 1999
22,047
877
126
I've always opted for triple buffering. Haven't noticed much difference in recent games, but in older games it was night and day.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Engraver
Triple buffering increases input lag also, does it not?
Nope, vsync increases input lag over not using vsync, but triple buffering reduces the lag vsync causes by improving the framerate. Triple buffering isn't exactly free though; it uses a bit of VRAM. Granted, using more VRAM only hurts when running a game at settings that, combined with triple buffering, need more memory than your video card has on it.
 

LOUISSSSS

Diamond Member
Dec 5, 2005
8,771
57
91
I was told that running games with VSync + Triple Buffering both ON was a good idea. You get a stable 60fps with pretty good quality, and 60fps is all your monitor can show anyway.
 

Lord Banshee

Golden Member
Sep 8, 2004
1,495
0
0
VSYNC + Triple Buffering is the only way to go :)

I use in-game settings for vsync, and I use D3D Overrider 1.5, the program included with RivaTuner, to force triple buffering in games that don't have an option for it. If anyone has a better way to implement this in ATI's drivers I'm listening; the Nvidia drivers with my 7800 had options to force vsync and triple buffering in the control panel, which seemed to work great.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Nope, vsync increases input lag over not using vsync, but triple buffering reduces the lag vsync causes by improving the framerate.
Sorry, no, that's false. Increasing the framerate doesn't automatically decrease lag, because there isn't a correlation between the two. AFR, for example, increases input lag despite increasing framerate, as does pre-rendering.

As for triple buffering + vsync, that has visibly more input lag than just vsync by itself, given you're waiting an extra frame before seeing the results of your input.
 

rgallant

Golden Member
Apr 14, 2007
1,361
11
81
It looks like the link has nothing to do with VSync + Triple Buffering!
It is a feature of RivaTuner, not Nvidia's control panel, so you might want to do a search at http://forums.guru3d.com/forumdisplay.php?f=18 to see what the feedback is over there. But if you don't do a search on the topic first, it seems they might ask you if your search button is broken.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: BFG10K
Nope, vsync increases input lag over not using vsync, but triple buffering reduces the lag vsync causes by improving the framerate.
Sorry, no, that's false. Increasing the framerate doesn't automatically decrease lag, because there isn't a correlation between the two. AFR, for example, increases input lag despite increasing framerate, as does pre-rendering.

As for triple buffering + vsync, that has visibly more input lag than just vsync by itself, given you're waiting an extra frame before seeing the results of your input.
Damnit man, I've explained this to you many times, I drew you a picture explaining it years ago, yet you still insist on perpetuating your misunderstanding.

Again, here is how rendering works:

Without vsync: The GPU draws each frame in the backbuffer, and as soon as the frame is finished it is sent to the frontbuffer, which is what is output to the monitor, and the GPU proceeds directly to rendering the next frame. This results in the fastest framerate and least latency, but also results in the frames changing mid-refresh, which is seen as screen tearing.

With vsync but without triple buffering: The GPU draws each frame in the backbuffer but has to wait for the refresh to finish before moving the completed frame to the frontbuffer, and only then can it continue drawing the next frame. So there is no screen tearing, but the framerate is limited to the refresh rate, or a fraction thereof when it takes longer than one refresh to render a frame.

With vsync and triple buffering: The GPU draws each frame in the backbuffer and moves that frame to the third buffer, which allows it to continue rendering the next frame directly without needing to pause while waiting for vsync. So there is no tearing because the displayed frames don't change mid-refresh, but the GPU is free to render at any framerate up to the refresh rate rather than being limited to fractions of it.

And because triple buffering allows the GPU to continue rendering without waiting for vsync, your input makes it to the screen quicker when using vsync with triple buffering than without it.
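The three modes above can be sketched as a toy timeline. This is only a sketch of the model described in this post (the GPU starting a new frame as soon as a back buffer is free); all the numbers are illustrative assumptions, not any particular driver's behaviour.

```python
import math

REFRESH = 16.7   # ~60 Hz vblank interval, in milliseconds
RENDER = 10.0    # time the GPU needs to draw one frame

def next_vblank(t):
    """First vblank at or after time t."""
    return math.ceil(t / REFRESH) * REFRESH

def vsync_double_buffered(input_time):
    # With one back buffer, frame starts are locked to vblanks: the
    # back buffer only frees up when the previous frame flips.
    start = next_vblank(input_time)   # frame containing the input begins
    done = start + RENDER             # finished in the back buffer
    return next_vblank(done)          # flipped on the following vblank

def vsync_triple_buffered(input_time):
    # The spare back buffer lets the GPU begin immediately (assuming it
    # was otherwise idle), so only the final flip waits on vsync.
    done = input_time + RENDER
    return next_vblank(done)

for t in (0.0, 5.0, 12.0):
    print(f"input at {t:5.1f} ms -> on screen at "
          f"{vsync_double_buffered(t):5.1f} ms (double buffered), "
          f"{vsync_triple_buffered(t):5.1f} ms (triple buffered)")
```

Under this model the triple-buffered path never shows the input later than the double-buffered one, and often shows it a refresh earlier, which is the claim being made here.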
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Damnit man, I've explained this to you many times, I drew you a picture explaining it years ago, yet you still insist on perpetuating your misunderstanding.
I understand exactly how it works; it's you that can't seem to understand that an increase in buffering increases input lag.

And because triple buffering allows the GPU to continue rendering without waiting for vsync, your input makes it to the screen quicker when using vsync with triple buffering than without it.
With three buffers you have to wait for two "flips" before seeing the result of your action.

With two buffers you only have to wait for one.

Again, just because the framerate is higher with triple buffering it does not mean the lag is less. There's actually more lag because there's more delay due to buffering.

I've just tried it right now to re-confirm what I've been saying all along and yet again it's painfully obvious mouse lag is much worse with vsync + triple buffering than with just vsync itself, even though the latter has the lowest framerate of the two.
 
Oct 16, 1999
10,490
4
0
The increased lag from triple buffering at 60fps is 1/60th of a second, or about 0.017s. At 30fps that number doubles; at 90fps it drops to 1/90th. At the very worst case, running at any remotely playable fps, it's still hundredths of a second.
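The arithmetic behind those figures: one extra buffered frame costs one frame-time of delay, which shrinks as the framerate rises. (This takes the "one extra frame" premise as given; whether triple buffering actually adds that frame is exactly what the rest of the thread argues about.)

```python
# one extra buffered frame = one frame-time of added latency
for fps in (30, 60, 90):
    extra_ms = 1000.0 / fps   # milliseconds per frame at this framerate
    print(f"{fps:3d} fps -> {extra_ms:5.1f} ms of extra lag")
```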
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: BFG10K
With three buffers you have to wait for two "flips" before seeing the result of your action.

With two buffers you only have to wait for one.
No, again, with only two buffers and vsync your GPU has to wait for vsync to continue rendering the next frame, while with triple buffering the back buffer can be cleared immediately so the GPU can continue rendering the next frame directly instead of having to pause waiting for the backbuffer to clear on vsync. Hence the results of your action generally show up quicker, and at worst take no longer than they do with vsync and double buffering.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Snowman, again you're not accounting for buffering and instead keep focusing on framerate as the sole measure of lag.

Consider this example:

We have three buffers, [X], [Y], [Z] and we've just hit a refresh cycle.

[X] contains a finished frame and now starts being displayed onscreen, we start rendering to [Y], and [Z] is empty.

[Y] gets finished but there's no refresh cycle available so we start rendering to [Z].

[Z] gets finished and shortly thereafter a refresh cycle is available.

At this point we have two finished frames [Y] and [Z] which contain different renders of both the gamestate and our input. Both need to be displayed but we only have one refresh cycle.

[Y] is displayed because it's the oldest but [Z] has to wait until the next refresh cycle before it can be shown.

This buffering introduces input lag because there's an extra frame's worth of delay between when we enter input and when we see it.

So even though the framerate is higher with triple buffering there's more of a delay to when we actually see those frames displayed.

A double buffered system would never have that problem because on every refresh cycle we have at most only one fresh frame to display.
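The model in this post, a FIFO swap chain where no finished frame is ever discarded, can be sketched as a toy discrete simulation. The numbers are illustrative (one time unit per refresh, a GPU that renders a frame in half a refresh); `back_buffers` is 1 for double buffering and 2 for this flavour of triple buffering. Whether real drivers queue frames this way is the point in dispute.

```python
from collections import deque

RENDER = 0.5   # GPU frame time, in refresh units; vblanks land at t = 1, 2, 3, ...

def steady_state_wait(back_buffers, vblanks=20):
    """How long a finished frame sits queued before it is displayed."""
    queue, finish = deque(), {}
    t, frame, wait = 0.0, 0, 0.0
    for vblank in range(1, vblanks + 1):
        # the GPU renders between vblanks while a back buffer is free
        while len(queue) < back_buffers and t + RENDER <= vblank:
            t += RENDER
            finish[frame] = t
            queue.append(frame)
            frame += 1
        shown = queue.popleft()          # oldest frame shown, never dropped
        wait = vblank - finish[shown]    # latency of the displayed frame
        t = max(t, vblank)               # GPU resumes if it had stalled
    return wait

print(steady_state_wait(1))  # double buffered
print(steady_state_wait(2))  # FIFO triple buffered
```

In steady state the FIFO triple-buffered chain shows each frame exactly one refresh later than the double-buffered one, which is the extra flip of lag being argued for here.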
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
The increased lag from triple buffering at 60fps is 1/60th of a second, or about 0.017s. At 30fps that number doubles; at 90fps it drops to 1/90th. At the very worst case, running at any remotely playable fps, it's still hundredths of a second.
The lag is quite easy to spot, just like many people can easily spot ghosting on 16 ms (and even faster) LCDs.

I'm very sensitive to input lag which is why I never run vsync or triple buffering, except for compatibility reasons for certain games.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: BFG10K
Consider this example:

We have three buffers, [X], [Y], [Z] and we've just hit a refresh cycle.

[X] contains a finished frame and now starts being displayed onscreen, we start rendering to [Y], and [Z] is empty.

[Y] gets finished but there's no refresh cycle available so we start rendering to [Z].

[Z] gets finished and shortly thereafer a refresh cycle is available.

At this point we have two finished frames [Y] and [Z] which contain different renders of both the gamestate and our input. Both need to be displayed but we only have one refresh cycle.

[Y] is displayed because it's the oldest but [Z] has to wait until the next refresh cycle before it can be shown.
Nonsense; in your example with triple buffering, [Y] is discarded and [Z] is displayed because it is the most current finished frame. With only double buffering, [Y] is held in the backbuffer waiting for vsync because there is no memory allocated to store it elsewhere, and hence the input lag is increased, because on that next refresh you see [Y] instead of the [Z] you'd see with triple buffering.

Originally posted by: BFG10K
The increased lag from triple buffering at 60fps is 1/60th of a second, or about 0.017s. At 30fps that number doubles; at 90fps it drops to 1/90th. At the very worst case, running at any remotely playable fps, it's still hundredths of a second.
The lag is quite easy to spot, just like many people can easily spot ghosting on 16 ms (and even faster) LCDs.

I'm very sensitive to input lag which is why I never run vsync or triple buffering, except for compatibility reasons for certain games.
You are imagining it.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Originally posted by: TheSnowman
Originally posted by: BFG10K
Consider this example:

We have three buffers, [X], [Y], [Z] and we've just hit a refresh cycle.

[X] contains a finished frame and now starts being displayed onscreen, we start rendering to [Y], and [Z] is empty.

[Y] gets finished but there's no refresh cycle available so we start rendering to [Z].

[Z] gets finished and shortly thereafter a refresh cycle is available.

At this point we have two finished frames [Y] and [Z] which contain different renders of both the gamestate and our input. Both need to be displayed but we only have one refresh cycle.

[Y] is displayed because it's the oldest but [Z] has to wait until the next refresh cycle before it can be shown.
Nonsense; in your example with triple buffering, [Y] is discarded and [Z] is displayed because it is the most current finished frame. With only double buffering, [Y] is held in the backbuffer waiting for vsync because there is no memory allocated to store it elsewhere, and hence the input lag is increased, because on that next refresh you see [Y] instead of the [Z] you'd see with triple buffering.

Originally posted by: BFG10K
The increased lag from triple buffering at 60fps is 1/60th of a second, or about 0.017s. At 30fps that number doubles; at 90fps it drops to 1/90th. At the very worst case, running at any remotely playable fps, it's still hundredths of a second.
The lag is quite easy to spot, just like many people can easily spot ghosting on 16 ms (and even faster) LCDs.

I'm very sensitive to input lag which is why I never run vsync or triple buffering, except for compatibility reasons for certain games.
You are imagining it.
BFG is right though; that's exactly what happens. Triple buffering isn't all that smart: rendered frames don't get dropped, which means the time between when a frame is finished and when it's displayed is 1/60th of a second plus whatever time is left between when the frame finishes rendering and the next refresh cycle hits. It's extra buffering at a hit to latency, a fairly common CS/CE concept.

As for him imagining things, it depends on the monitor and the eyes of the viewer. There are definitely people who can perceive the lag, especially when it's the one extra source of lag that makes it clear the image and the input are out of sync.
 

Dkcode

Senior member
May 1, 2005
995
0
0
It's not perfect, and for twitchy shooters like DoD: Source and UT3 it's definitely a no-go. A slight bit of lag exists; it's like trying to move your mouse through a thin layer of treacle.

TBS, RTS, and other games that don't rely on lightning-fast reflexes work out quite well.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: ViRGE
BFG is right though, that's exactly what happens. Triple buffering isn't all that smart, rendered frames don't get dropped...
They are dropped when space is needed because a newer frame has been completed; that is the whole point of triple buffering. The extra buffer space allows the GPU to render continually without having to wait for vsync while also saving the most current frame to update the display on vsync, and to do that the older frame has to be discarded when a new one is finished, to clear VRAM for the GPU to start rendering the next frame in.

And again, hence the reason triple buffering reduces input latency: when using vsync without it, the GPU has to wait for vsync before any space is made available to start rendering the next frame, whereas with triple buffering memory is always made available for the GPU to continue rendering the results of any input immediately after the last frame is finished.
 

Quiksilver

Diamond Member
Jul 3, 2005
4,725
0
71
Originally posted by: TheSnowman
Originally posted by: ViRGE
BFG is right though, that's exactly what happens. Triple buffering isn't all that smart, rendered frames don't get dropped...
They are dropped when space is needed because a newer frame has been completed; that is the whole point of triple buffering. The extra buffer space allows the GPU to render continually without having to wait for vsync while also saving the most current frame to update the display on vsync, and to do that the older frame has to be discarded when a new one is finished, to clear VRAM for the GPU to start rendering the next frame in.

And again, hence the reason triple buffering reduces input latency: when using vsync without it, the GPU has to wait for vsync before any space is made available to start rendering the next frame, whereas with triple buffering memory is always made available for the GPU to continue rendering the results of any input immediately after the last frame is finished.

I don't really quite understand how triple buffering works, but according to Wikipedia it does increase input lag, which would make BFG10K right.

http://en.wikipedia.org/wiki/Triple_buffering
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Indeed, no frame dropping happens with triple buffering. Dropping the second buffer would cause lots of stuttering and defeat the purpose of triple buffering.

http://www.nvidia.com/page/pg_20010527107687.html

Triple Buffering
A step beyond double buffering that uses an additional back buffer to process the next image, resulting in smoother animation. With triple buffering, the GPU can start rendering a third frame while the first frame is being displayed and the second frame is waiting to be displayed. Triple buffering helps to insure that the GPU is never idle because it is waiting for rendered frames to be sent to the monitor.

Even in small amounts, lag is much worse when you can anticipate/expect a certain response, and when you're good at recognizing the response you expect. Gamers who have finely-tuned senses can detect this, but those who play games only occasionally will have a hard time recognizing the lag.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: Quiksilver
I don't really quite understand how triple buffering works but according to wikipedia it does increase input lag. Which would make BFG10K right.

http://en.wikipedia.org/wiki/Triple_buffering
Heh, for all we know, BFG is the one who put that claim on Wikipedia. Regardless, there are a lot of unsourced and unexplained claims on Wikipedia which have no basis in reality. ;)

Originally posted by: xtknight
Indeed, no frame dropping happens with triple buffering. Dropping the second buffer would cause lots of stuttering and defeat the purpose of triple buffering.
You've got that all twisted around; not dropping the older frame would result in the GPU having to sit idle with no VRAM available to continue rendering to, and hence increased latency, which is exactly what triple buffering ensures doesn't happen, as I've been saying all along and as explained in what you quoted from Nvidia.

Again, as what you quoted from Nvidia explains, vsync ensures that the second frame is held waiting for the first frame to finish being displayed, and triple buffering keeps memory allocated for the GPU to move on to rendering a third frame without having to wait for memory to clear on vsync. If that third frame isn't finished in time for vsync, then the second frame is displayed while the third frame is being completed; that is the situation the quote from Nvidia refers to. However, if the GPU does finish rendering that third frame before the first frame has finished being displayed, then the second frame is outdated before it ever got a chance to be displayed. At that point the third frame is held to be displayed on the next vsync in favor of the less current second frame, and hence the second frame is discarded to clear space for the GPU to move on to rendering the fourth frame.

Put simply, triple buffering allows the GPU to render as it does without using vsync, while restricting the output to only swapping frames on vsync.
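The policy described here, show the newest finished frame on each vblank and overwrite anything staler, can be sketched as a toy timeline. The numbers are illustrative assumptions (GPU frame time of 0.7 refreshes), not any particular driver's behaviour.

```python
RENDER = 0.7   # GPU frame time, in refresh units (faster than the refresh)

# the GPU never stalls under this policy, so frames finish every RENDER units
finish_times = [RENDER * (i + 1) for i in range(10)]

for vblank in (1, 2, 3, 4):
    # the newest frame completed before this vblank is the one displayed;
    # any older undisplayed frame is simply overwritten (dropped)
    newest = max(t for t in finish_times if t <= vblank)
    print(f"vblank {vblank}: shows frame finished at {newest:.1f} "
          f"(lag {vblank - newest:.1f})")
```

Under this policy the displayed frame is never more than one GPU frame-time old, so a higher framerate really does mean less lag; whether drivers actually implement triple buffering this way is what the thread is disputing.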
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
TheSnowman:

I am wrong that no frame dropping happens.

There are some frames dropped, I think, if the second buffer is still waiting to be drawn. Otherwise the lag would accumulate while the monitor tried to draw every last frame that ever came out of the video card.

Instead, the third buffer is filled with whatever is current. And if the third buffer is empty, then lag will be introduced while it is filled.

http://wiki.allegro.cc/Triple_buffering

When using triple buffering you instead set this second, recently drawn, screen in a waiting line. As soon as the graphics hardware detect that the first screen is finished drawing, it starts to draw the second screen. Meanwhile you can draw on the third screen. Triple buffering eliminates the wait for the vertical sync (vsync()).

It would seem to me, then, that this would introduce inconsistent motion but I suppose that's inevitable. VSync serves only to prevent the tearing.

As long as I'm confused, I will try to put out a few points to see where we agree or disagree. I'm not sure this helped but it raised a few questions: http://c2.com/cgi/wiki/quickDiff?TripleBuffer

A. The third buffer is a place in memory that can be drawn to at any time, and it is constantly changing.

B. The third buffer does not necessarily contain a full frame at any given time.

C. The third buffer exists because the double buffer can not be read from or written to at the same time (i.e., synchronization). [FALSE?]

D. The second buffer is a place that contains only full frames.

E. The second buffer is updated when the next full frame is received from the third buffer. It's possible some full frames might be discarded from the second buffer because a new full frame exists.

F. At least with VSync on, a swap occurs between the first and second buffer after VBLANK.

Edit: removed a few unsure statements from this post but it would be nice if these could be clarified by those in the know.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Nonsense; in your example with triple buffering, [Y] is discarded and [Z] is displayed because it is the most current finished frame.
Uh, no. If that were true then what happens to frame [Y]? It'd never be displayed and hence dropped.

Furthermore if you kept doing that up to 33% of frames would never be displayed which would cause discontinuity and choppiness because things are being rendered which you never see.

It'd be like playing a network game where the server is sending updates but the client is dropping up to 33% of the network packets because it can't keep up.

We know this doesn't happen because triple buffering doesn't cause such choppiness; instead it causes lag.

With only double buffering, [Y] is held in the backbuffer waiting for vsync because there is no memory allocated to store it elsewhere, and hence the input lag is increased, because on that next refresh you see [Y] instead of the [Z] you'd see with triple buffering.
Again you're defining lag as a concept of framerate but ignoring buffering. A triple buffering system effectively renders ahead an extra frame over a double buffered system (relative to what is displayed), hence the extra lag.

You are imagining it.
Just like you imagine tearing?

They are dropped when space is needed because a newer frame has been completed; that is the whole point of triple buffering.
Sorry, they aren't dropped. Furthermore if you get to the state where both buffers are full and there's still no refresh cycle then a triple buffering system will stall just like a double buffered one.

Heh, for all we know, BFG is the one who put that claim on Wikipedia. Regardless, there are a lot of unsourced and unexplained claims on Wikipedia which have no basis in reality
Nope, but the claim is backed by the basics of buffering. Look it up.

That and quick tests can quickly confirm the presence of input lag with triple buffering in just about any game.

You've got that all twisted around; not dropping the older frame would result in the GPU having to sit idle with no VRAM available to continue rendering to, and hence increased latency
What are you talking about? You don't have to drop [Y] to render to [Z]. That's the whole point of having a third buffer!

The way you explain triple buffering is like having a double buffered system where the frame not being displayed is constantly being overwritten by the new frame.

It doesn't happen like that.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: xtknight
TheSnowman:

I am wrong that no frame dropping happens.

There are some frames dropped, I think, if the second buffer is still waiting to be drawn. Otherwise the lag would accumulate while the monitor tried to draw every last frame that ever came out of the video card.
That would be true if there were infinite buffer space to store those frames. On the other hand, with three buffers and never dropping any frames like BFG suggests, you'd just wind up a frame behind, with the GPU still having to waste time sitting idle while waiting for vsync.

Originally posted by: xtknight
Instead, the third buffer is filled with whatever is current. And if the third buffer is empty, then lag will be introduced while it is filled.

http://wiki.allegro.cc/Triple_buffering

When using triple buffering you instead set this second, recently drawn, screen in a waiting line. As soon as the graphics hardware detect that the first screen is finished drawing, it starts to draw the second screen. Meanwhile you can draw on the third screen. Triple buffering eliminates the wait for the vertical sync (vsync()).

It would seem to me, then, that this would introduce inconsistent motion but I suppose that's inevitable. VSync serves only to prevent the tearing.

As long as I'm confused, I will try to put out a few points to see where we agree or disagree. I'm not sure this helped but it raised a few questions: http://c2.com/cgi/wiki/quickDiff?TripleBuffer

A. The third buffer is a place in memory that can be drawn to at any time, and it is constantly changing.

B. The third buffer does not necessarily contain a full frame at any given time.

C. The third buffer exists because the double buffer can not be read from or written to at the same time (i.e., synchronization). [FALSE?]

D. The second buffer is a place that contains only full frames.

E. The second buffer is updated when the next full frame is received from the third buffer. It's possible some full frames might be discarded from the second buffer because a new full frame exists.

F. At least with VSync on, a swap occurs between the first and second buffer after VBLANK.

Edit: removed a few unsure statements from this post but it would be nice if these could be clarified by those in the know.
Your terminology is a bit awkward, but I'll do my best to answer with that in mind.

A. With triple buffering there are effectively two backbuffers and one front buffer. The backbuffers are where the frames are drawn to, and the frontbuffer is where the frames are displayed from.

B. At any moment each backbuffer is either cleared and waiting to be drawn to, currently being drawn to, or holding a finished frame waiting to be swapped to the frontbuffer so it can be displayed.

C. The "third buffer" exists as effectively a second backbuffer, which allows the GPU to start rendering a new frame while waiting for vsync to move the previously finished frame in the other backbuffer to the frontbuffer. Double buffering means there is only one backbuffer to write a frame to, along with the one frontbuffer to display the frame from, leaving no place to start working on the next frame while waiting for the frame in the backbuffer to move to the frontbuffer on vsync.

D. It's the frontbuffer which always holds a completed frame, at least aside from the instant in which that frame is being updated with a newer one completed in the backbuffer.

E. Each backbuffer is updated as soon as rendering to the other is finished, and the front buffer is updated from the latest finished backbuffer on vsync.

F. With vsync, the swaps between the backbuffer(s) and the frontbuffer occur on vblank. Without vsync, the frame in the frontbuffer is replaced whenever a frame in the backbuffer is completed, resulting in refreshes consisting of portions of two or more different frames: screen tearing.
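The buffer roles in A through F can be sketched as a minimal data structure. The names here are mine and purely illustrative; this models the "display the newest finished frame" policy described in this post.

```python
class SwapChain:
    """Toy swap chain: one front buffer plus a small queue of back buffers."""

    def __init__(self, back_buffers):
        self.front = None                # frame currently being displayed
        self.back = []                   # finished frames awaiting a vblank
        self.capacity = back_buffers     # 1 = double buffered, 2 = triple

    def can_render(self):
        # the GPU may start a new frame only if a back buffer is free
        return len(self.back) < self.capacity

    def finish_frame(self, frame):
        assert self.can_render(), "GPU stalled: no free back buffer"
        self.back.append(frame)

    def vblank(self):
        # on vblank the newest finished frame becomes the front buffer;
        # under this policy any older queued frame is discarded
        if self.back:
            self.front = self.back[-1]
            self.back.clear()

chain = SwapChain(back_buffers=2)
chain.finish_frame("frame 0")
chain.finish_frame("frame 1")   # the second back buffer kept the GPU busy
chain.vblank()
print(chain.front)              # frame 1 is shown; frame 0 was discarded
```

With `back_buffers=1` the GPU stalls as soon as one frame is queued, which is the double-buffered wait-on-vsync case described in C.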

Originally posted by: BFG10K
Nonsense; in your example with triple buffering, [Y] is discarded and [Z] is displayed because it is the most current finished frame.
Uh, no. If that were true then what happens to frame [Y]? It'd never be displayed and hence dropped.
Yes, that is how it works.

Originally posted by: BFG10K
Furthermore if you kept doing that up to 33% of frames would never be displayed which would cause discontinuity and choppiness because things are being rendered which you never see.
Yet triple buffering doesn't keep doing that; it only does it when a newer frame is completed before the next refresh. It has to do that, as you can only display as many full frames as you have refreshes for. Rendering above the refresh rate without vsync also results in the exclusion of part of what is rendered, but as parts of frames being excluded when the frontbuffer is updated mid-refresh, resulting in refreshes composed of partial frames, which is seen as screen tearing.

Originally posted by: BFG10K

...

You've got that all twisted around; not dropping the older frame would result in the GPU having to sit idle with no VRAM available to continue rendering to, and hence increased latency
What are you talking about? You don't have to drop [Y] to render to [Z]. That's the whole point of having a third buffer!
I'm talking about dropping [Y] after [Z] has been completed in the other backbuffer, to make room for rendering the frame after [Z]. That is what I've been explaining all along, and that is how triple buffering works, regardless of how many times you misunderstand my comments and make erroneous arguments to the contrary.