Triple buffering

Anarchist420

Diamond Member
Feb 13, 2010
Is it always a perfect substitute for vsync (other than that it takes up more VRAM)?

I read the AT article, and it seemed to imply that it was, but can anyone confirm whether that's true?
 

Anteaus

Platinum Member
Oct 28, 2010
Anarchist420 said:
Is it always a perfect substitute for vsync (other than that it takes up more VRAM)?

I read the AT article, and it seemed to imply that it was, but can anyone confirm whether that's true?

Triple buffering is not a substitute for vsync; it's designed to work alongside it, and neither should be run without the other. Triple buffering was developed to solve a problem where vsync can, in some situations, cripple your frame rate. There is no downside to enabling triple buffering unless you have an extremely small amount of VRAM, which is rare these days.
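
To put rough numbers on the frame-rate cripple he means: with double-buffered vsync, a frame that misses a refresh deadline waits for the next vblank, so the displayed rate snaps down to refresh/2, refresh/3, and so on. A minimal sketch in C with made-up frame times (my illustration, assuming a 60 Hz display; not code from the article):

#include <stdio.h>
#include <math.h>

int main(void) {
    const double refresh_hz = 60.0;
    const double interval_ms = 1000.0 / refresh_hz;  /* ~16.7 ms per refresh */
    const double render_ms[] = { 10.0, 18.0, 35.0 }; /* hypothetical frame times */

    for (int i = 0; i < 3; i++) {
        /* a finished frame can only be shown at the next vblank boundary */
        double intervals = ceil(render_ms[i] / interval_ms);
        printf("render %.0f ms -> %.1f fps displayed under double-buffered vsync\n",
               render_ms[i], refresh_hz / intervals);
    }
    return 0;
}

An 18 ms frame, which barely misses 60 fps, gets displayed at 30 fps; triple buffering avoids that cliff by letting the GPU keep rendering into a second back buffer instead of stalling.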
 

FalseChristian

Diamond Member
Jan 7, 2002
All we need now is for Direct3D to support triple buffering. It makes such a difference in OpenGL games.
 

Anteaus

Platinum Member
Oct 28, 2010
FalseChristian said:
All we need now is for Direct3D to support triple buffering. It makes such a difference in OpenGL games.

Hmm, I wasn't aware v-sync didn't work in Direct3D games. In any case, v-sync and triple buffering only exist in the first place to deal with frame buffer issues on LCD screens, where the frame rate exceeds the update capability (refresh rate) of the LCD, causing a mishmash of frames, i.e. tearing.

Triple buffering does absolutely nothing without V-sync enabled.

In theory, once 120hz and higher screens become commonplace, v-sync and triple buffering will be unnecessary.
 

toyota

Lifer
Apr 15, 2001
120hz doesn't eliminate the need for vsync and I wish people would stop thinking that it does.

Anteaus, FalseChristian is probably referring to the fact that vsync does nothing for DX games from the standard video card control panel.

OP, you certainly did not read the article very well but Anteaus already addressed that.
 

Anteaus

Platinum Member
Oct 28, 2010
toyota said:
120hz doesn't eliminate the need for vsync and I wish people would stop thinking that it does.

Yes and no. You're right that the refresh rate is independent of the speed of the frame buffer; however, as the refresh rate goes up, the speed requirements of the frame buffer go up too, eventually reaching the point where it can accept full frames as fast as the source can supply them, thus negating v-sync.

All v-sync does is synchronize the frames and limit them to the max frame fill speed of the monitor, which is "usually" equal to the refresh rate (I hate that term). 120hz monitors that have a frame buffer speed to properly match the refresh are rare right now, which is why v-sync is still necessary in most cases. That will change eventually.
 

DominionSeraph

Diamond Member
Jul 22, 2009
Anteaus said:
Yes and no. You're right that the refresh rate is independent of the speed of the frame buffer; however, as the refresh rate goes up, the speed requirements of the frame buffer go up too, eventually reaching the point where it can accept full frames as fast as the source can supply them, thus negating v-sync.

All v-sync does is synchronize the frames and limit them to the max frame fill speed of the monitor, which is "usually" equal to the refresh rate (I hate that term). 120hz monitors that have a frame buffer speed to properly match the refresh are rare right now, which is why v-sync is still necessary in most cases. That will change eventually.

???

VSYNC locks the frame buffer to the refresh rate. If you don't do this the back buffer flips to the front as soon as the GPU is done drawing and you get screen tearing. This will happen at any refresh rate.
I have no idea what you're talking about with "frame buffer speeds."
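
A stripped-down sketch of that flip logic (stub functions of my own, not any real driver's API) showing why the tear can happen at any refresh rate:

#include <stdio.h>

static void render_into_back_buffer(void) { /* GPU draws the next frame here */ }
static void wait_for_vblank(void) { puts("waiting for vblank"); }
static void flip_buffers(void)    { puts("back buffer -> front buffer"); }

static void present_frame(int vsync_enabled) {
    render_into_back_buffer();
    if (vsync_enabled)
        wait_for_vblank(); /* flip only between refreshes, so no tearing */
    flip_buffers();        /* without the wait, the flip can land mid-scanout
                              and the screen shows parts of two frames */
}

int main(void) {
    present_frame(1); /* vsync on */
    present_frame(0); /* vsync off: tearing is possible at 60hz or 120hz alike */
    return 0;
}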
 

bunnyfubbles

Lifer
Sep 3, 2001
Anarchist420 said:
Is it always a perfect substitute for vsync (other than that it takes up more VRAM)?

I read the AT article, and it seemed to imply that it was, but can anyone confirm whether that's true?

It's not a substitute; it's meant to be used in conjunction with vsync.

It's an alternative to double buffering, which has the major drawback of halving your frame rate whenever performance can't stay above the monitor's refresh rate.

Triple buffering prevents those massive frame rate drops and, in theory, should have far less input lag than double buffering (though either will have some input lag vs. no vsync at all).


HOWEVER, there are still several caveats to triple buffering:

1. there is no way to force it outside of a hack unless it's an OpenGL game
2. games that do support it on their own and call it "triple buffering" (for instance, Left 4 Dead gives you the option between double and triple) don't use true triple buffering; they use a render-ahead queue, which is similar in that it prevents frame rate halving, but it does not have the reduced input lag benefit (see the sketch after this list)
3. some games / game engines don't cooperate well with triple buffering no matter what
4. true triple buffering, with its reduced input lag benefit, cannot work in AFR modes on multi-GPU setups
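
To make the difference in point 2 concrete, here's a toy sketch (my own simplification; frames are just integers here, not any engine's actual code):

#include <stdio.h>

int main(void) {
    /* Suppose several frames could complete between two vblanks. */
    int oldest_completed = 1, newest_completed = 4;

    /* True triple buffering: the two back buffers alternate, so each new
       frame overwrites the stale one and the vblank presents whichever
       finished most recently. The screen shows the freshest input. */
    printf("true triple buffering presents frame %d\n", newest_completed);

    /* Render-ahead queue (what many D3D games label "triple buffering"):
       completed frames wait in a FIFO, one presented per vblank, in order.
       The screen shows the oldest queued frame -- that age is the extra
       input lag mentioned above. */
    printf("render-ahead queue presents frame %d\n", oldest_completed);

    return 0;
}

Both schemes keep the GPU busy and avoid the frame rate halving, which is why they get conflated; only the overwrite-the-stale-buffer version also keeps the latency down.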
 

Anteaus

Platinum Member
Oct 28, 2010
DominionSeraph said:
???

VSYNC locks the frame buffer to the refresh rate. If you don't do this the back buffer flips to the front as soon as the GPU is done drawing and you get screen tearing. This will happen at any refresh rate.
I have no idea what you're talking about with "frame buffer speeds."

Vsync actually syncs the frame buffer with the vertical blanking interval. Currently, the VBI is kept longer than necessary to maintain compatibility with older equipment, specifically analog. Once analog video finally gets discarded completely, they can crank the VBI up to whatever speed they want and thus negate the need for vsync.

As for refresh rate (or frame update rate, since "refresh" is an analog term), it needs to be increased as well. Beyond the 3:2 pulldown issues with current 24 fps video on 60hz screens, there is an apparent "smoothness" that comes with higher frame rates.

What I'm saying is that currently frame buffer speeds can exceed what the VBI allows, which can cause tearing. We use v-sync to lock the frame buffer to the VBI, which solves the tearing. At some point in the semi-near future, VBI latency will be reduced or made unnecessary altogether, allowing unrestricted frame buffer speeds and negating vsync.

I might have explained my case poorly earlier. I didn't mean to imply that a higher refresh rate alone would eliminate tearing. It's unlikely 60hz screens will be made for much longer.