Triple Buffering


kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: BFG10K
then claiming dropping frames wouldn't affect a framerate counter, when in fact frame counters count the front buffer swaps so it does,
If it counted framebuffer swaps then it would display a framerate that exceeded the monitor's refresh rate whenever frames were being dropped, but clearly that never happens.
It counts frontbuffer swaps, which is limited by the refresh rate.

Originally posted by: BFG10K
all while accomplishing nothing more than pointing out the obvious fact that displaying more frames each second results in smoother motion than displaying fewer
I thought it was obvious but apparently it isn't to some.
Your explanation only presented the fact that running at a higher refresh rate allows smoother motion, and no one here claimed otherwise.

Originally posted by: BFG10K
On top of that you somehow figured the first frame to be displayed would predict the position of your mouse at 5, the position it is at when that first frame is completed, rather than the position the mouse was at when rendering started on the frame.
Uh, no. I started at position 0, moved the mouse to position 5 which the game tick registered then generated the frame rendered at position 5.
So you started your mouse at position 0, but didn't start rendering a frame until you got to position 5?

Originally posted by: BFG10K
And there you have it, when rendering frames quicker than refreshing them, triple buffering reduces latency in comparison to double buffering by allowing rendering to continue on to a new frame directly after completing the last one rather than having the GPU sit idle while waiting for vsync.
In theory. However in practice it can add visible input lag which again has been documented numerous times.
In your contrived theory it adds input lag, in practice it reduces it.

Originally posted by: BFG10K
Because it isn't a "some implementations" thing; you can use 3 buffers and not drop the older frame when the newer one is completed if you like, but then you aren't doing triple buffering.
Triple buffering means using 3 flippable buffers; it states nothing about what replacement policy will be used, something the game can be free to choose assuming it's not being forced externally.
Nope, again you can use 3 or more buffers without using triple buffering. If you are using a backbuffer, an accumulation buffer and of course a frontbuffer, that is 3 buffers already, and it takes a 4th to achieve triple buffering.

Originally posted by: BFG10K
If a game has its own framerate cap so it never renders frames faster than refresh rate (e.g. 60 FPS cap on an 85 Hz display) you don't need to drop any frames but can still be doing triple buffering.
Framerate caps limit frontbuffer swaps; they don't make rendering each individual frame take any longer. And just like with vsync capping the framerate, if two frames are completed before the next available vsync then the older frame is discarded.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
It counts frontbuffer swaps, which is limited by the refresh rate.
So in other words it won't count dropped frames which is what I said originally. Stop playing rhetorical games.

Your explanation only presented the fact that running at a higher refresh rate allows smoother motion, and no one here claimed otherwise.
My explanation also demonstrated the impact of dropped frames, frames that could at least be partially seen with vsync disabled.

So you started your mouse at position 0, but didn't start rendering a frame until you got to position 5?
In order to render a frame the game tick needs to finish first. I start at zero and then move, after which the game tick registers I'm at position 5 and then makes a frame.

In your contrived theory it adds input lag, in practice it reduces it.
Again this phenomenon has been reported time and time again. I've tested it in a lot of games and almost without fail there's extra lag from triple buffering over just vsync, both of which have horrifically more lag than without vsync.

Nope, again you can use 3 or more buffers without using triple buffering. If you are using a backbuffer, an accumulation buffer and of course a frontbuffer, that is 3 buffers already, and it takes a 4th to achieve triple buffering.
Which is why I said flippable buffers. I love how you flip-flop between including the accumulation buffer in the system whenever it suits you but then turn around and claim it isn't part of the equation at other times.

The fact is if I ask for a front buffer and two back buffers I have a triple buffered system. Stop talking about accumulation buffers; they aren't relevant.

Framerate caps limit frontbuffer swaps; they don't make rendering each individual frame take any longer.
I never said they would take longer, but if the renderer doesn't allow more than 60 FPS (i.e. it won't generate a frame until sufficient time has passed from the previous frame) then it's never going to render faster than the refresh rate.

And just like with vsync capping the framerate, if two frames are completed before the next available vsync then the older frame is discarded.
But that will never happen if the renderer doesn't generate new frames until sufficient time has passed to ensure the cap is not exceeded.
 

ManWithNoName

Senior member
Oct 19, 2007
396
0
0
Whoever wins this one gets free Keg O' Beer. :beer: and right now my money's on BFG. :thumbsup:
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: BFG10K
It counts frontbuffer swaps, which is limited by the refresh rate.
So in other words it won't count dropped frames which is what I said originally.
What you said originally was:

Originally posted by: BFG10K
Now before you say "but the framerate counter is showing 120 FPS in all three cases" then that's only the half-truth since the framerate counter isn't factoring dropped frames.
When in reality the framerate counter will show less FPS when frames are being dropped rather than transferred to the frontbuffer.

Originally posted by: BFG10K
Your explanation only presented the fact that running at a higher refresh rate allows smoother motion, and no one here claimed otherwise.
My explanation also demonstrated the impact of dropped frames, frames that could at least be partially seen with vsync disabled.
And no one claimed otherwise on that either. But surely you do realize that those frames only being partially displayed due to a lower refresh rate means you don't see the mouse pointer at all five positions without vsync either, or do you?

Originally posted by: BFG10K
So you started your mouse at position 0, but didn't start rendering a frame until you got to position 5?
In order to render a frame the game tick needs to finish first. I start at zero and then move, after which the game tick registers I'm at position 5 and then makes a frame.
Heh, so you are waiting two ticks before starting to render a frame, or do you think a program can predict that you moved your mouse before the second tick is completed?

Originally posted by: BFG10K
In your contrived theory it adds input lag, in practice it reduces it.
Again this phenomenon has been reported time and time again. I've tested it in a lot of games and almost without fail there's extra lag from triple buffering over just vsync, both of which have horrifically more lag than without vsync.
People often see what they expect to see rather than what actually is.

Originally posted by: BFG10K
Nope, again you can use 3 or more buffers without using triple buffering. If you are using a backbuffer, an accumulation buffer and of course a frontbuffer, that is 3 buffers already, and it takes a 4th to achieve triple buffering.
Which is why I said flippable buffers. I love how you flip-flop between including the accumulation buffer in the system whenever it suits you but then turn around and claim it isn't part of the equation at other times.

The fact is if I ask for a front buffer and two back buffers I have a triple buffered system. Stop talking about accumulation buffers; they aren't relevant.
They aren't relevant when they aren't used, but accumulation buffers are part of the flip chain when they are used, and hence relevant to your question of why one would need more than three buffers to achieve triple buffering. It would be nice if you could drop that tangent though and get back to how three buffers are used to achieve triple buffering. Again, to understand the process all you have to do is answer one simple question about the illustration in that MSDN entry you linked: what happens to 33 when 22 is set to become the frontbuffer after the flip?

Originally posted by: BFG10K
Framerate caps limit frontbuffer swaps; they don't make rendering each individual frame take any longer.
I never said they would take longer, but if the renderer doesn't allow more than 60 FPS (i.e. it won't generate a frame until sufficient time has passed from the previous frame) then it's never going to render faster than the refresh rate.

And just like with vsync capping the framerate, if two frames are completed before the next available vsync then the older frame is discarded.
But that will never happen if the renderer doesn't generate new frames until sufficient time has passed to ensure the cap is not exceeded.
I'm sorry, this was bad of me; I was distracted when I was finishing my post there and got my thinking turned around. At least generally, a frame limiter will specify the minimum time between frames drawn to the backbuffers, as you said, rather than the minimum time between frontbuffer updates as I suggested previously. And no, frames aren't dropped when rendering fewer frames than you have refreshes, be it artificially limited by a framerate cap or naturally limited by performance. Regardless, triple buffering results in less latency than having to wait for vsync before starting to draw the next frame.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
When in reality the framerate counter will show less FPS when frames are being dropped rather than transferred to the frontbuffer.
When?

I just fired up Unreal and moved right next to a wall.

Without vsync I got 880 FPS and with vsync + triple buffering a constant 73 FPS (73 Hz refresh obviously).

880 / 73 means ~12 rendered frames for every refresh so I must be dropping 10 or 11 frames every refresh on average, yet at no time did the framerate counter deviate from 73 FPS.

Clearly the framerate counter is not factoring dropped frames but only frames that are actually being displayed, which is the point I made earlier.
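The arithmetic behind that test can be sketched in a few lines (a toy calculation using only the numbers reported above, not code from any game or driver):

```python
# Numbers reported in the Unreal test above: ~880 FPS rendered, 73 Hz refresh.
render_fps = 880
refresh_hz = 73

# With vsync the front buffer is swapped at most once per refresh, and the
# framerate counter counts front-buffer swaps, so it reads the refresh rate.
counted_fps = min(render_fps, refresh_hz)

# Every other completed frame is dropped without ever being displayed.
dropped_per_refresh = render_fps / refresh_hz - 1

print(counted_fps)                 # 73
print(round(dropped_per_refresh))  # 11
```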

But surely you do realize that those frames only being partially displayed due to a lower refresh rate means you don't see the mouse pointer at all five positions without vsync either, or do you?
Yep, but the input response is still vastly better because the display is not tied to the refresh rate, and because you still benefit from seeing all of those positions at least partially.

Heh, so you are waiting two ticks before starting to render a frame, or do you think a program can predict that you moved your mouse before the second tick is completed?
No, I start at position 0, move the mouse to 5 which the next tick captures and then renders.

Perhaps the mouse was at -5 with the previous tick but that isn't relevant since my example didn't start from there.

People often see what they expect to see rather than what actually is.
Years ago, long before I read any of the lag comments, I saw people talking about tearing and I wanted to find out what the fuss was about. So I tried vsync and I immediately found it introduced input lag along with an annoying tendency to reduce the framerate to divisions of the refresh rate.

So I did more research, tried triple buffering and found while it fixed the framerate issue, input lag was worse in many cases.

Since I seldom - if ever - notice tearing, it was a no-brainer for me to leave both off, especially since I tend to run high input sensitivity in many games, and I'm very sensitive to any delay between input and what I see on the screen.

Again, to understand the process all you have to do is answer one simple question about the illustration in that MSDN entry you linked: what happens to 33 when 22 is set to become the frontbuffer after the flip?
I believe that particular example will drop a frame.

And no, frames aren't dropped when rendering fewer frames than you have refreshes, be it artificially limited by a framerate cap or naturally limited by performance.
Yep, that was the point I was making.

For example, if the refresh cycle comes every 11 ms (85 Hz) but frames come every 16 ms (60 FPS game cap), when the frame is ready @ 16 ms the next refresh cycle won't be available until 22 ms.

However a triple buffered system could keep rendering to the third buffer without dropping frames because you'd never have more than one completed frame between refresh cycles (at the 22 ms refresh the next frame won't be ready until 32 ms).

Hence my point about being able to benefit from triple buffering without necessarily having to drop frames (aka implementation dependent).
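The timing argument above can be checked with a small simulation. This is a sketch with idealized, perfectly even 85 Hz and 60 FPS spacing (real refresh and frame times jitter): if no two capped frames ever complete inside one refresh interval, the "discard the older frame" policy never has anything to discard.

```python
# Refreshes every ~11.8 ms (85 Hz) vs capped frames every ~16.7 ms (60 FPS).
refresh_interval = 1000 / 85   # ms between vsyncs
frame_interval = 1000 / 60     # ms between capped frames

refreshes = [i * refresh_interval for i in range(86)]  # one second of vsyncs
frames = [i * frame_interval for i in range(60)]       # one second of frames

# Count how many frames complete within each refresh-to-refresh window.
most_frames_per_refresh = max(
    sum(1 for f in frames if a <= f < b)
    for a, b in zip(refreshes, refreshes[1:])
)
print(most_frames_per_refresh)  # 1, so the cap never forces a drop
```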
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: ManWithNoName
Whoever wins this one gets free Keg O' Beer. :beer: and right now my money's on BFG. :thumbsup:

BFG already won, the snowman doesn't know what he is talking about. I don't see why BFG even bothers to keep the argument going.
 

DerekWilson

Platinum Member
Feb 10, 2003
2,920
34
81
Originally posted by: nullpointerus


[ 1 ][ 2 ][ 3 ][ 4 ][ 5 ][ 6 ][ 7 ][ 8 ][ 9 ] ( <-- time intervals of unspecified units )
[ x ][ _ ][ _ ][ _ ][ x ][ _ ][ _ ][ _ ][ x ] ( <-- x=frame, _=wait )

Right?


awesome ... that is right ... you showed triple buffering .... let me add double plus vsync

[ 1 ][ 2 ][ 3 ][ 4 ][ 5 ][ 6 ][ 7 ][ 8 ][ 9 ]
[ x ][ _ ][ _ ][ _ ][ x ][ _ ][ _ ][ _ ][ x ]

ok... so the x is a vertical refresh. lets say each number is an even interval of time between renders. if a frame can render in each interval you get this (lower case letters for spacing):

[ 1 ][ 2 ][ 3 ][ 4 ][ 5 ][ 6 ][ 7 ][ 8 ][ 9 ]
[ A ][ B ][ c ][ d ][ E ][ F ][ G ][ H ][ I ]
[ X ][ _ ][ _ ][ _ ][ x ][ _ ][ _ ][ _ ][ x ]

now ... here's what you see with double buffering:

[ 1 ][ 2 ][ 3 ][ 4 ][ 5 ][ 6 ][ 7 ][ 8 ][ 9 ]
[ A ][ B ][ _ ][ _ ][ B ][ F ][ _ ][ _ ][ F ]
[ X ][ _ ][ _ ][ _ ][ x ][ _ ][ _ ][ _ ][ x ]


here's what you get with triple buffering:

[ 1 ][ 2 ][ 3 ][ 4 ][ 5 ][ 6 ][ 7 ][ 8 ][ 9 ]
[ A ][ B ][ c ][ d ][ d ][ F ][ G ][ H ][ H ]
[ X ][ _ ][ _ ][ _ ][ x ][ _ ][ _ ][ _ ][ x ]

here's what you get with NO VSYNC

[ 1 ][ 2 ][ 3 ][ 4 ][ 5 ][ 6 ][ 7 ][ 8 ][ 9 ]
[ A ][ B ][ c ][ d ][dE][ F ][ G ][ H ][HI]
[ X ][ _ ][ _ ][ _ ][ x ][ _ ][ _ ][ _ ][ x ]

so .... summary:

double buffering + vsync has the most delay between the input you gave and what happens on the screen. you see frames "A" "B" and "F" with lots of input lag

triple buffering shows you the most recently completed frame with the most recent input update. you see frames "A" "d" and "H" with as little input lag as a completed frame can have.

no vsync gives you either the most recently completed frame or a combination of the most recently completed frame + the portion of the next most recently completed frame torn on the display. you see frames "A" "dE"(possible tear) and "HI"(possible tear) with the same input lag as triple buffering in the d and H parts but less input lag in the E and I parts.

i think bfg is actually thinking that in the no vsync situation frames "B" "c" and "d" will all be SEEN ... they are, in fact, "dropped" by the monitor. the monitor will not display any frames between vertical refreshes ... period. you ALWAYS "see" 60fps on a monitor whose refresh is 60fps because that is as many frames as it is capable of drawing in one second. period.

the reason a higher fps gives better performance is because that means the most recently rendered frame is closer in time to the vertical refresh and the image is smoother.

triple buffering accomplishes this as well as possible without tearing. no vsync gives the least input lag with only the torn part of the screen that represents the actual frame currently being swapped in in the middle of a vertical refresh.

Bottom line: triple buffering does not inherently introduce significantly more input lag than running with no vsync, and the advantage is only in the torn part of your screen even so.

monitors cannot display more than their refresh rate even if your game can render frames and swap buffers at 400 fps. the only thing that means is that the monitor will be drawing from a front buffer that gets swapped out in the middle of a refresh and could show tearing.

YOU ARE ALWAYS ONLY SHOWN EXACTLY 60 FRAMES IN ONE SECOND IF YOUR MONITOR REFRESHES AT 60fps.

whether all frames are different (game renders faster), some are the same (game renders slower) or some frames are split between two images (no vsync), you are always "dropping" frames.
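The diagrams above can be replayed as a toy model. Assume frame k (A through I) finishes at the end of interval k, and the refreshes drawn at intervals 5 and 9 fire as those intervals begin (times 4 and 8); the first refresh simply shows A.

```python
# Letters name the frames from the diagram; frame k finishes at time k.
letters = "ABcdEFGHI"
finished_at = {f: k + 1 for k, f in enumerate(letters)}

def triple_buffered_scanout(refresh_time):
    # Triple buffering: rendering never stalls, so each refresh shows the
    # newest frame already completed at that moment.
    done = [f for f in letters if finished_at[f] <= refresh_time]
    return done[-1]

print([triple_buffered_scanout(t) for t in (4, 8)])  # ['d', 'H']
```

Double buffering + vsync shows B and F at those same refreshes, because rendering stalls once its single back buffer is full; the displayed frame is three intervals staler than what triple buffering shows.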
 

DerekWilson

Platinum Member
Feb 10, 2003
2,920
34
81
to underline what i posted above ...

Originally posted by: BFG10K
I just fired up Unreal and move right next to a wall.

Without vsync I got 880 FPS and with vsync + triple buffering a constant 73 FPS (73 Hz refresh obviously).

880 / 73 means ~12 rendered frames for every refresh so I must be dropping 10 or 11 frames every refresh on average, yet at no time did the framerate counter deviate from 73 FPS.

Clearly the framerate counter is not factoring dropped frames but only frames that are actually being displayed, which is the point I made earlier.

In both cases you are only seeing 73 frames per second.

your monitor is only drawing 73 frames per second.

the framebuffer is only copied to the screen 73 times per second.

only what is in the frontbuffer at the time of the swap is displayed.

....

the major difference is that triple buffering only swaps a back buffer to the front buffer once per vertical refresh rather than every time a frame is drawn.

frame rate counts swaps to the front buffer NOT the number of frames displayed on the screen.

the number of images the display can physically draw in a second IS the vertical refresh rate.

Tearing ONLY combines the images swapped into the front buffer DURING the vertical refresh itself (how fast this is depends on the monitor, h refresh, pixel clock, etc). This is NOT every image rendered between refreshes and is usually only going to be the two most recently rendered images.

....

when running at 880 frames per second, the game swaps 880 frames to the front buffer. every 1/73 seconds the monitor takes what is output from the front buffer and draws it on the screen. ALL OTHER FRAMES DRAWN TO THE FRONT BUFFER ARE NEVER SEEN.

the difference in input lag in your example between running no vsync and running triple buffering is AT MOST 1/880 seconds (1.136 ms).

contrast this with double-buffering plus vsync and you see that input lag will always be at least: 1/73 - 1/880 = 12.562 ms

Any questions?
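The two latency bounds quoted above fall straight out of the frame and refresh periods (same assumed numbers as the post: 880 FPS render rate, 73 Hz refresh):

```python
refresh_s = 1 / 73   # time between vertical refreshes
frame_s = 1 / 880    # time to render one frame

# Worst-case extra lag of triple buffering over no vsync: the displayed
# frame can be at most one frame time older than the torn-in frame.
extra_lag_ms = frame_s * 1000

# Double buffering + vsync: a frame finished just after a refresh waits
# nearly a full refresh period before it can be swapped in.
double_lag_ms = (refresh_s - frame_s) * 1000

print(round(extra_lag_ms, 3))   # 1.136
print(round(double_lag_ms, 3))  # 12.562
```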
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: taltamir
Originally posted by: ManWithNoName
Whoever wins this one gets free Keg O' Beer. :beer: and right now my money's on BFG. :thumbsup:

BFG already won, the snowman doesn't know what he is talking about. I don't see why BFG even bothers to keep the argument going.

Are you trying to make people laugh?

1) This isn't a contest. You can quit being a cheerleader for Team BFG10K.
2) You don't know enough about any of it to make a conclusion.

It is clear that the 3 or 4 people involved in this conversation know what they are talking about quite well. Everyone else (this includes you, and myself) does not know anywhere near enough about this topic to make any informed factual conclusion. Let's not kid ourselves and pretend to know more than we do.
 

chizow

Diamond Member
Jun 26, 2001
9,537
2
0
Originally posted by: DerekWilson
to underline what i posted above ...

Originally posted by: BFG10K
I just fired up Unreal and move right next to a wall.

Without vsync I got 880 FPS and with vsync + triple buffering a constant 73 FPS (73 Hz refresh obviously).

880 / 73 means ~12 rendered frames for every refresh so I must be dropping 10 or 11 frames every refresh on average, yet at no time did the framerate counter deviate from 73 FPS.

Clearly the framerate counter is not factoring dropped frames but only frames that are actually being displayed, which is the point I made earlier.

In both cases you are only seeing 73 frames per second.

your monitor is only drawing 73 frames per second.

the framebuffer is only copied to the screen 73 times per second.

only what is in the frontbuffer at the time of the swap is displayed.

....

the major difference is that triple buffering only swaps a back buffer to the front buffer once per vertical refresh rather than every time a frame is drawn.

frame rate counts swaps to the front buffer NOT the number of frames displayed on the screen.

the number of images the display can physically draw in a second IS the vertical refresh rate.

Tearing ONLY combines the images swapped into the front buffer DURING the vertical refresh itself (how fast this is depends on the monitor, h refresh, pixel clock, etc). This is NOT every image rendered between refreshes and is usually only going to be the two most recently rendered images.

....

when running at 880 frames per second, the game swaps 880 frames to the front buffer. every 1/73 seconds the monitor takes what is output from the front buffer and draws it on the screen. ALL OTHER FRAMES DRAWN TO THE FRONT BUFFER ARE NEVER SEEN.

the difference in input lag in your example between running no vsync and running triple buffering is AT MOST 1/880 seconds (1.136 ms).

contrast this with double-buffering plus vsync and you see that input lag will always be at least: 1/73 - 1/880 = 12.562 ms

Any questions?

I think the tearing visualization gives the best representation of input lag, but as the others said, it wouldn't be any greater than that even with Vsync and Triple Buffering enabled. The problem I have with the "non-dropped frames" argument is that it'd get progressively worse until it dropped frames to sync up with its refresh. I don't think it's dropping in some places and not dropping in others, so I'd tend to believe frames are always dropped.

The other main argument would be in terms of the mouse cursor and the number of samples impacting how smooth motion and input lag appear. When you guys are talking 1-5/60ths of a second between samples, I'm not sure a mouse cursor with human input could move to discrete positions quickly enough to notice any difference. I'm pretty sure there are actual specs out there for newer mice in terms of how fast/far they can move, so it might be worth referencing.
 

ManWithNoName

Senior member
Oct 19, 2007
396
0
0
Originally posted by: ArchAngel777
Originally posted by: taltamir
Originally posted by: ManWithNoName
Whoever wins this one gets free Keg O' Beer. :beer: and right now my money's on BFG. :thumbsup:

BFG already won, the snowman doesn't know what he is talking about. I don't see why BFG even bothers to keep the argument going.

Are you trying to make people laugh?

1) This isn't a contest. You can quit being a cheerleader for Team BFG10K1
2) You don't know enough about any of it to make a conclusion.

It is clear, that the 3 or 4 people involved in this conversation know what they are talking about quite well. Everyone else (this includes you, and myself) do not know anywhere near enough about this topic to make any informed factual conclusion. Lets not kid ourselves and pretend to know more than we do.

Can't really tell if that was just directed at Taltamir or if it was a group shot at me as well. Anyway, as far as my post goes, yeah, I was trying to make people grin, not so much laugh. I just thought this thread needed some lightening up, or at least a quick time-out. If this offended you in some way, oh well.

Also I'm not too proud to admit that all their triple-buffering rebuttals were triple-somersaulting over my head. :D I started reading through a few of them and wound up with a headache by the time I was done.

 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: ManWithNoName
Originally posted by: ArchAngel777
Originally posted by: taltamir
Originally posted by: ManWithNoName
Whoever wins this one gets free Keg O' Beer. :beer: and right now my money's on BFG. :thumbsup:

BFG already won, the snowman doesn't know what he is talking about. I don't see why BFG even bothers to keep the argument going.

Are you trying to make people laugh?

1) This isn't a contest. You can quit being a cheerleader for Team BFG10K1
2) You don't know enough about any of it to make a conclusion.

It is clear, that the 3 or 4 people involved in this conversation know what they are talking about quite well. Everyone else (this includes you, and myself) do not know anywhere near enough about this topic to make any informed factual conclusion. Lets not kid ourselves and pretend to know more than we do.

Can't really tell if that was just directed at Taltamir or if it was a group shot at me as well. Anyway, as far as my post goes, yeah I was trying to make people grin, not so much laugh. I just thought this thread needed some lightening-up, or at least a quick time-out. If this offended you in some way, Oh well.

Also I'm not to proud to admit either that all their triple-buffering rebuttals were triple somersaulting over my head. :D I started reading through a few of them and wound up with a headache by the time I was done.

I was not addressing you. Sorry for the confusion. :beer:

 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Originally posted by: ArchAngel777
Originally posted by: taltamir
Originally posted by: ManWithNoName
Whoever wins this one gets free Keg O' Beer. :beer: and right now my money's on BFG. :thumbsup:

BFG already won, the snowman doesn't know what he is talking about. I don't see why BFG even bothers to keep the argument going.

Are you trying to make people laugh?

1) This isn't a contest. You can quit being a cheerleader for Team BFG10K1
2) You don't know enough about any of it to make a conclusion.

It is clear, that the 3 or 4 people involved in this conversation know what they are talking about quite well. Everyone else (this includes you, and myself) do not know anywhere near enough about this topic to make any informed factual conclusion. Lets not kid ourselves and pretend to know more than we do.

Don't include me with yourself. I know exactly what I am talking about, how it works, and how it affects things. I just see no reason to reiterate things that have already been stated. I could launch into a page-long essay about the inner workings of things, or I could just point out the person who has already done so correctly. (And please don't pull the "it's so arrogant to think you know everything"; I don't know everything, but I know THIS.)

And this SHOULDN'T be a competition, but despite what it should or shouldn't be, it ends up as one, which is exactly why I haven't gone into the page-long essay; there is nothing to be learned or taught here. The only reason I even bothered saying ANYTHING in this thread is because I didn't want a person to come in, read misinformation, and take it as truth. I was hoping that by throwing my vote with one of the "sides" of this thread I would help sway a few more readers to the truth. The people actually doing the arguing are doing so with religious zeal and cannot be swayed either way anyways.
 

ArchAngel777

Diamond Member
Dec 24, 2000
5,223
61
91
Originally posted by: taltamir
Originally posted by: ArchAngel777
Originally posted by: taltamir
Originally posted by: ManWithNoName
Whoever wins this one gets free Keg O' Beer. :beer: and right now my money's on BFG. :thumbsup:

BFG already won, the snowman doesn't know what he is talking about. I don't see why BFG even bothers to keep the argument going.

Are you trying to make people laugh?

1) This isn't a contest. You can quit being a cheerleader for Team BFG10K1
2) You don't know enough about any of it to make a conclusion.

It is clear, that the 3 or 4 people involved in this conversation know what they are talking about quite well. Everyone else (this includes you, and myself) do not know anywhere near enough about this topic to make any informed factual conclusion. Lets not kid ourselves and pretend to know more than we do.

Don't include me with yourself. I know exactly what I am talking about and how it works and how it affects things. I just see no reason to reiterate things that have already been stated. I could lunch into a page long essay about the inner working of things. Or I could just point out the person who already done so correctly. (and please don't pull the "its so arrogant to think you know everything". I don't know everything, but I know THIS)

And this SHOULDN'T be a competition, but despite what it should or not be it ends up as a competition. which is exactly why I haven't gone into the page long essay, there is nothing to be learned or taught here. The only reason I even bothered saying ANYTHING in this thread is because I didn't want a person to come in and read misinformation and take it as truth. I was hoping that by throwing my vote with one of the "sides" of this thread I will help sway a few more readers to the truth. The people actually doing the arguing are do so with religious zeal and cannot be swayed either way anyways.

Until you go into detail and prove it with your 'essay', all I can do is :roll:

 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: BFG10K
When in reality the framerate counter will show less FPS when frames are being dropped rather than transfered to the frontbuffer.
When?

I just fired up Unreal and move right next to a wall.

Without vsync I got 880 FPS and with vsync + triple buffering a constant 73 FPS (73 Hz refresh obviously).

880 / 73 means ~12 rendered frames for every refresh so I must be dropping 10 or 11 frames every refresh on average, yet at no time did the framerate counter deviate from 73 FPS.

Clearly the framerate counter is not factoring dropped frames but only frames that are actually being displayed, which is the point I made earlier.
What you explain there is correct; I took issue with your statement further back where you suggested that when rendering 120fps you might drop 2/5 of the frames getting 72fps, or 3/5 of the frames getting 48fps, yet the framerate counter would show 120fps regardless. That would be like having the framecounter read 880fps when you are running vsynced at 73Hz, and as your more recent statements agree, that simply isn't the case.

Originally posted by: BFG10K
But surely you do realize that those frames only partially being displayed due to a lower refresh rate means you don't see the mouse pointer at all five postions without vsync either, or do you?
Yep, but the input response is still vastly better because the display is not tied to the refresh rate, and because you still benefit from seeing all of those positions at least partially.
No, you see each frame partially; whether that part of the frame displays the first position, or is overwritten by a newer frame before the refresh has displayed it, determines whether that refresh shows you the mouse pointer at the first position, at the second position, or with a tear running through the position of the mouse pointer, the top part showing the first position and the bottom part showing the second.

Originally posted by: BFG10K
Heh, so you are waiting two ticks before starting to render a frame, or do you think a program can predict that you moved your mouse before the second tick is completed?
No, I start at position 0, move the mouse to 5 which the next tick captures and then renders.
That is waiting two ticks before rendering a frame; the first frame can't tell you where you moved your mouse to unless it knows where your mouse started, based on a previous tick.

Originally posted by: BFG10K
People often see what they expect to see rather than what actually is.
Years ago, long before I read any of the lag comments, I saw people talking about tearing and I wanted to find out what the fuss was about. So I tried vsync and I immediately found it introduced input lag along with an annoying tendency to reduce the framerate to divisions of the refresh rate.

So I did more research, tried triple buffering and found while it fixed the framerate issue, input lag was worse in many cases.

Since I seldom - if ever - notice tearing, it was a no-brainer for me to leave both off, especially since I tend to run high input sensitivity in many games, and I'm very sensitive to any delay between input and what I see on the screen.
Yeah, you don't see tearing without vsync because you don't expect to, and you don't see less input lag with triple buffering because you don't expect that either.

Originally posted by: BFG10K
Again, to understand the process all you have to do is answer one simple question about the illustration in that MSDN entry you linked; what happens to 33 when 22 is set to become the frontbuffer after the flip?
I believe that particular example will drop a frame.
And what effect will displaying that newer frame on the next refresh, rather than sitting idle showing the older one, have on input lag?

Originally posted by: BFG10K
And no, frames aren't dropped when rendering fewer frames than you have refreshes, be it artificially limited by a framerate cap or naturally limited by performance.
Yep, that was the point I was making.

For example, if the refresh cycle comes every 11 ms (85 Hz) but frames come every 16 ms (60 FPS game cap), when the frame is ready @ 16 ms the next refresh cycle won't be available until 22 ms.

However a triple buffered system could keep rendering to the third buffer without dropping frames because you'd never have more than one completed frame between refresh cycles (at the 22 ms refresh the next frame won't be ready until 32 ms).

Hence my point about being able to benefit from triple buffering without necessarily having to drop frames (aka implementation dependent).
Yet the lack of dropped frames isn't due to any change in the functionality of triple buffering; it is just the result of rendering fewer frames than you have refreshes to display. Raise the framerate cap and render more frames than you can refresh, and those extra frames will be discarded when newer ones are finished before the next refresh; the implementation of triple buffering isn't changed at all, only the framerate it is operating under.
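The timing argument from the 85 Hz / 60 fps-cap example above can be sketched as a toy model (not how any driver actually implements it): frames completing every ~16.7 ms can never pile up between refreshes that come every ~11.8 ms, so nothing would be discarded.

```python
import math

# Toy model of the 85 Hz refresh / 60 fps cap example: a frame would only be
# "dropped" if two frames finished within the same refresh interval.
refresh_hz, frame_cap = 85, 60
refresh_t = 1 / refresh_hz  # ~11.8 ms between refreshes
frame_t = 1 / frame_cap     # ~16.7 ms between completed frames

dropped = 0
for i in range(1000):  # examine 1000 consecutive refresh intervals
    start, end = i * refresh_t, (i + 1) * refresh_t
    # indices of frames whose completion time falls in [start, end)
    first = math.ceil(start / frame_t)
    last = math.ceil(end / frame_t) - 1
    finished = max(0, last - first + 1)
    dropped += max(0, finished - 1)

print("frames dropped:", dropped)  # 0: the frame period exceeds the refresh period
```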

Originally posted by: DerekWilson
no vsync gives you either the most recently completed frame or a combination of the most recently completed frame + the portion of the next most recently completed frame torn on the display. you see frames "A" "dE"(possible tear) and "HI"(possible tear) with the same input lag as triple buffering in the d and H parts but less input lag in the E and I parts.

i think bfg is actually thinking that in the no vsync situation frames "B" "c" and "d" will all be SEEN ... they are, in fact, "dropped" by the monitor.

You have been misconceptualizing the process. BFG is correct when he says each frame is partially displayed, though apparently he still doesn't understand what the "partially" refers to. Disabling vsync allows the front buffer to be updated as soon as a new frame is completed, meaning that if you render 4 frames every refresh as in your example (and for the sake of ease we assume the first frame mentioned is completed, by chance, just in time for vsync), the first refresh will show the top quarter of the first frame, the second quarter of the second frame, the third quarter of the third frame, and the fourth quarter of the fourth. That framerate results in each refresh displaying 3 tears between those four frames, with the position of the mouse pointer depending on which quarter of the display it is on, or with a tear running through the pointer showing two separate positions for the top and bottom parts respectively.

If the framerate goes up, the space between the tears will decrease as a fifth frame is added, resulting in a fourth tear as well. If the framerate goes down, the space between tears will increase and the number of tears will be reduced as fewer frames are being rendered. And of course as that framerate fluctuates, which positions of the mouse pointer are displayed and which are discarded changes in respect to those fluctuations as well.

Also, it isn't the monitor that is dropping those parts of the frames; rather, they are dropped because the RAMDAC is reading the front buffer slower than that front buffer is being updated.
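The quarter-by-quarter description above can be sketched as a toy model (8 scanlines, 4 frames per refresh, frame completions assumed evenly spaced across the refresh):

```python
# Toy model of the "4 frames per refresh" example above: with vsync off,
# each scanline shows whichever frame most recently reached the front buffer.
scanlines = 8            # toy display height
frames_per_refresh = 4   # completed frames per refresh, as in the example

def frame_shown(line):
    # The fraction of the refresh elapsed when this line is scanned out
    # determines how many newer frames have replaced the first one.
    return int(line / scanlines * frames_per_refresh)

shown = [frame_shown(line) for line in range(scanlines)]
tears = sum(1 for a, b in zip(shown, shown[1:]) if a != b)
print(shown)            # [0, 0, 1, 1, 2, 2, 3, 3]: each quarter from a different frame
print("tears:", tears)  # 3
```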
 

Quiksilver

Diamond Member
Jul 3, 2005
So yeah... can we get a summary in short, saying what would be best if you're not using vsync and if you are using vsync?
 

DerekWilson

Platinum Member
Feb 10, 2003
Originally posted by: taltamir
Originally posted by: ArchAngel777
Originally posted by: taltamir
Originally posted by: ManWithNoName
Whoever wins this one gets free Keg O' Beer. :beer: and right now my money's on BFG. :thumbsup:

BFG already won, the snowman doesn't know what he is talking about. I don't see why BFG even bothers to keep the argument going.

Are you trying to make people laugh?

1) This isn't a contest. You can quit being a cheerleader for Team BFG10K.
2) You don't know enough about any of it to make a conclusion.

It is clear that the 3 or 4 people involved in this conversation know what they are talking about quite well. Everyone else (this includes you, and myself) does not know anywhere near enough about this topic to make any informed factual conclusion. Let's not kid ourselves and pretend to know more than we do.

Don't include me with yourself. I know exactly what I am talking about, how it works, and how it affects things. I just see no reason to reiterate things that have already been stated. I could launch into a page-long essay about the inner workings of things, or I could just point out the person who has already done so correctly. (And please don't pull the "it's so arrogant to think you know everything" line. I don't know everything, but I know THIS.)

And this SHOULDN'T be a competition, but despite what it should or shouldn't be, it ends up as one. Which is exactly why I haven't gone into the page-long essay; there is nothing to be learned or taught here. The only reason I even bothered saying ANYTHING in this thread is because I didn't want a person to come in, read misinformation, and take it as truth. I was hoping that by throwing my vote with one of the "sides" of this thread I would help sway a few more readers to the truth. The people actually doing the arguing do so with religious zeal and cannot be swayed either way anyway.

If you think I would argue something with religious zeal in the face of truth, then you are obviously not familiar with me at all. I'm not going to guess at the religious beliefs of other forum members either :)

It also happens that if you think BFG is right, you are, in fact, wrong.

Take a look at my most recent post (back from this one) and try to tell me I'm wrong.

I'm sorry I missed it so many times, because it is such a fundamental thing to know that nothing changes on a display without a vertical refresh, and that nothing makes it onto the screen but what is in the front buffer at the time of the vertical refresh.

I'm not going to go back and quote it, but many times BFG made statements that indicated the ability to "see" all of the frames rendered at a high frame rate. This is simply not true (even if your eyes were capable).

The computer only shows you as many frames per second as your refresh rate is capable of handling. If a game renders more frames than the refresh rate without vsync, then you will "drop" every single frame that wasn't in the front buffer during the vertical refresh.

This is simple fact.

Ignoring all the arguments about input lag and all that, BFG's core argument is based on a flawed assumption; namely that his display is capable of "showing" him every single rendered image output by a game when vsync is off.

If this were true our displays would rock a helluva lot more, and we'd never recommend vsync because you'd be skipping the display of frames that you could have displayed.

BFG has -- hopefully until now -- thought of vsync essentially as a frame rate cap, and not as a physical limitation of the number of frames in one second that a display can actually show.

It's also still not "dropping" frames as BFG wants to put it. That is very misleading as it is nothing like dropping frames from a movie ...

how about this movie analogy -- i think i've got a good one for you.

life is essentially infinite fps at infinite resolution.

a movie camera can be thought of as a reverse monitor ... and every frame it records happens on a vertical refresh.

if you record a movie at 60fps, you've "dropped" an infinity of frames between every pair of frames, because the camera can only record a single image every 1/60th of a second; you only see the image that hit the CCD when the camera started recording.

Like with a monitor's vertical refresh, the camera will have image issues if too many different things happen over the period of time the CCD is exposed. You end up with a blur instead of a tear, but it is only a blur of the things happening during the recording of that one frame (which is very fast) and doesn't by a long shot represent everything that happened between two frames.

In the interest of protecting those who don't know all this stuff, I say again, anyone who disagrees with me should try writing a video game.

or getting a BS in both EE and CPE with a focus in computer graphics.

or writing technical graphics articles for a major computer hardware review site...

I'm ok with resting on my reputation in the middle of an argument where people refuse to believe facts. I'd rather not do it, but here we are :)
 

DerekWilson

Originally posted by: Quiksilver
So yeah... can we get a summary in short, saying what would be best if you're not using vsync and if you are using vsync?

if you are not using vsync, you can't do triple buffering -- double buffering is the only option.

if you are using vsync, triple buffering is the best option.
 

DerekWilson

Originally posted by: TheSnowman
You have been misconceptualizing the process. BFG is correct when he says each frame is partially displayed, though apparently he still doesn't understand what the "partially" refers to. Disabling vsync allows the front buffer to be updated as soon as a new frame is completed, meaning that if you render 4 frames every refresh as in your example (and for the sake of ease we assume the first frame mentioned is completed, by chance, just in time for vsync), the first refresh will show the top quarter of the first frame, the second quarter of the second frame, the third quarter of the third frame, and the fourth quarter of the fourth. That framerate results in each refresh displaying 3 tears between those four frames, with the position of the mouse pointer depending on which quarter of the display it is on, or with a tear running through the pointer showing two separate positions for the top and bottom parts respectively.

If the framerate goes up, the space between the tears will decrease as a fifth frame is added, resulting in a fourth tear as well. If the framerate goes down, the space between tears will increase and the number of tears will be reduced as fewer frames are being rendered. And of course as that framerate fluctuates, which positions of the mouse pointer are displayed and which are discarded changes in respect to those fluctuations as well.

Also, it isn't the monitor that is dropping those parts of the frames; rather, they are dropped because the RAMDAC is reading the front buffer slower than that front buffer is being updated.

so in his 880 fps example ... you have 880 different images represented in that one frame? oh wait, that's not a question; it is simply false.
 

taltamir

Lifer
Mar 21, 2004
i might have misunderstood him then (or confused the names of the people posting), because I thought he was saying that only 73 frames (on his 73 Hz monitor; that's a weird refresh rate) are displayed regardless of whether it renders 880 or 73 fps. If he was indeed saying that 880 frames are displayed on his monitor then he is obviously wrong. But I didn't think that is what he said.
 

BFG10K

Lifer
Aug 14, 2000
In both cases you are only seeing 73 frames per second.

your monitor is only drawing 73 frames per second.

the framebuffer is only copied to the screen 73 times per second.
Exactly, that's my point. But if you're dropping frames they're not factored into that figure. Again this has been my point right from the start.

frame rate counts swaps to the front buffer NOT the number of frames displayed on the screen.
My terminology may have been a bit off there, but in the case of vsync + triple buffering both are the same (i.e. front buffer swaps = full frames displayed on screen).

Tearing ONLY combines the images swapped into the front buffer DURING the vertical refresh itself (how fast this is depends on the monitor, h refresh, pixel clock, etc). This is NOT every image rendered between refreshes and is usually only going to be the two most recently rendered images.
Actually you have it wrong as it will combine all of the frames.

Without vsync the front buffer is written to as soon as a frame is ready, and as the scanline passes down the display it will draw whatever is being fed to it on a line-by-line basis (let's assume a CRT for argument's sake).

So when the front buffer changes before the scanline has reached the bottom of the display it will start drawing the contents of the next frame.

(We'll ignore the vertical blanking period for simplicity).
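The line-by-line behaviour described above can be sketched numerically as a toy model, using the 880 fps / 73 Hz figures from earlier in the thread, ignoring the blanking interval as the post does, and assuming a frame happens to complete exactly as the refresh begins:

```python
# Toy scanline model of the 880 fps / 73 Hz case, ignoring the blanking
# interval and assuming a frame completes exactly as the refresh begins.
refresh_hz, render_fps = 73, 880
scanlines = 1024  # toy vertical resolution
refresh_t = 1 / refresh_hz

def frame_at(line):
    # Newest completed frame in the front buffer when this line is drawn.
    t = line / scanlines * refresh_t
    return int(t * render_fps)

shown = [frame_at(line) for line in range(scanlines)]
distinct = len(set(shown))
tears = sum(1 for a, b in zip(shown, shown[1:]) if a != b)
print("distinct frames in one refresh:", distinct)  # 13 with this alignment
print("tears:", tears)                              # 12, one fewer than the frame count
```

Whether this comes out nearer 12 or 13 frames depends on how frame completions happen to align with the refresh, so it lands close to, though not exactly on, the round figures of 12 frames and 11 tears quoted in the thread.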

If a game renders more frames than the refresh rate without vsync, then you will "drop" every single frame that wasn't in the front buffer during the vertical refresh.
No, you won't drop any frames because you will always see at least part of those frames.

It's also still not "dropping" frames as BFG wants to put it.
Sure it is, because with vsync + triple buffer you won't see any frame that doesn't hit the front buffer. Without vsync you always see parts of them because all frames hit the front buffer.

so in his 880 fps example ... you have 880 different images represented in that one frame?
No, you will have an average of 12 different frames combined into one display cycle with an average of 11 tears.
 

BFG10K

That would be like having the frame counter read 880fps when you are running vsynced at 72Hz, and as your more recent statements agree, that simply isn't the case.
Agreed.

The reason I mentioned it was because my 72 FPS and 48 FPS examples were "virtual framerate", only meant to illustrate the time between input subdivisions that are displayed.

Since I thought he was going to counter "but it shows 120 FPS in all three situations" (which is true because it does) I pointed out 120 FPS doesn't count dropped frames, just like my 73 FPS example doesn't show 880 FPS when 10 or 11 frames are being dropped every refresh cycle.

No, you see each frame partially, but whether that part of the frame displays the first position, or is overwritten by a newer frame before the refresh has displayed that position, determines whether that refresh shows you the mouse pointer at the first position, at the second position, or with a tear running through the pointer, the top part showing the first position and the bottom part showing the second.
I understand how tearing works which is why I maintain it delivers the best input response. Again it's better to immediately see parts of frames than to not see any of them in the case of dropped frames, and to tie the display to the refresh rate in general.

That is waiting two ticks before rendering a frame; the first frame can't tell you where you moved your mouse to unless it knows where your mouse started, based on a previous tick.
I think you're reading far too much into that simple example.

Let's try it this way: I fire up the game, it initializes the mouse at position 0 and then renders the frame before allowing any input (i.e. the game's initializing itself).

Then I move the mouse to 5, the first game tick captures this information which it then renders.

Yeah, you don't see tearing without vsync because you don't expect to, and you don't see less input lag with triple buffering because you don't expect that either.
Believe me, I've gone around looking for tearing. The best I could come up with is convoluted cases where I'm standing still in a room with very bright randomly flickering lights (usually white) and I can see tearing there.

But even there, if I start moving I can't really see it anymore, and during regular gameplay I almost never see it regardless of whether I'm standing still or moving.

OTOH I can spot mouse lag almost straight away.

And what effect will displaying that newer frame on the next refresh, rather than sitting idle showing the older one, have on input lag?
I don't quite understand your question. If you're asking if the newer frame is displayed because the older one was dropped then I agree for that particular example.

Yet the lack of dropped frames isn't due to any change in the functionality of triple buffering; it is just the result of rendering fewer frames than you have refreshes to display.
But again my point is that you can have triple buffering and see a performance benefit without ever dropping frames.

So in theory if the game never implemented cases for dropping frames it would still be doing triple buffering.
 

BFG10K

I'm ok with resting on my reputation in the middle of an argument where people refuse to believe facts. I'd rather not do it, but here we are
It's true that I've made some mistakes in this discussion, but then so have you.
 

DerekWilson

Originally posted by: BFG10K
so in his 880 fps example ... you have 880 different images represented in that one frame?
No, you will have an average of 12 different frames combined into one display cycle with an average of 11 tears.

ahh yeah ... I did get something wrong ... 880 frames per second is 12 frames per refresh at 73 hz ... I'm sorry about that.

but you still will not get an average of 11 tears on the screen. You'd have more than 1 tear at that framerate to be sure, but I'm gonna have to go get my VESA timing diagrams out to go deeper into this one. the time between when one redraw stops and the next one starts is not zero, and you will "drop" frames with a high frame rate.

but for argument's sake, even if you were right:

if you do have 11 tears, you will have parts of 11 outdated images that display 11 progressively worse input lags (from the bottom up) and only one image (the last one drawn) will have as little input lag as possible.

I could actually see the argument for this being a good thing if the monitor acted like an accumulation buffer and you ended up with a blur between the frames, but multiple tears representing partial images of old input states is not optimal, useful or a good thing.

triple buffering delivers better image quality with less average input lag in this case -- even if you did have 11 tears each equally spaced, your average input lag by screen space would be approximately:

( (1/73)*(1 - 803/880) + (1/73)*(1 - 730/880) + (1/73)*(1 - 657/880) + ... + (1/73)*(1 - 146/880) + (1/73)*(1 - 73/880) ) / 12

this is better than double buffering with vsync, but worse than triple buffering ... I just don't think this is how it works.
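One way to read the averaged expression above so that the units work out is to take each term as (1/73) * (1 - 73k/880): the staleness, in seconds, of the slice whose frame completed k/880 s after the refresh began. That reading is an assumption about the post's intent, but under it the sum evaluates to roughly:

```python
# Evaluating the averaged input-lag expression above, reading each term as
# (1/73) * (1 - 73*k/880): seconds of staleness for the slice whose frame
# completed k/880 s after the refresh began (this reading is an assumption).
refresh_hz, render_fps = 73, 880
refresh_t = 1 / refresh_hz

terms = [(1 - refresh_hz * k / render_fps) * refresh_t for k in range(1, 12)]
avg_lag = sum(terms) / 12  # 12 slices; the newest slice contributes ~0 lag

print(f"average lag: {avg_lag * 1000:.2f} ms")  # ~6.31 ms
```

By comparison, double buffering with vsync can hold a completed frame for up to one full refresh, 1/73 s, which is about 13.7 ms, consistent with the post's ordering of the three options.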