Vsync good, buffering bad, and quadcore makes no sense?

taltamir

Lifer
Mar 21, 2004
13,576
6
76
I tried doing an analysis of how these things work and interact with each other and came to some interesting conclusions. Feedback is welcome.

How do buffers work? Well, buffering is when extra images are rendered and stored ahead of display, depending on the specific setting...

Single buffer:
Frame 1 is displayed.
Frame 2 is rendering.
Frame 3 will not render until frame 2 is displayed. (or it will render anyway and be discarded, depending on vsync)

Double buffer:
Frame 1 is displayed.
Frame 2 is rendering.
Frame 3 will render ASAP.
Frame 4 will not begin rendering until frame 2 is displayed. (or it will render anyway and be discarded, depending on vsync)

Triple buffer:
Frame 1 is displayed.
Frame 2 is rendering.
Frame 3 will render ASAP
Frame 4 will render ASAP.
Frame 5 will not begin rendering until frame 2 is displayed. (or it will render anyway and be discarded, depending on vsync)

Without vsync each frame is rendered ASAP, so the faster the video card, the less time passes between frames. With vsync, frames are spaced 1/60th of a second apart.

Time meaningfully progresses every 1/60th of a second, when a new frame is sent to the monitor. If your video card is fast enough to render 240fps for game X, then it will render and DISCARD about 3 or so frames before the next monitor refresh, meaning that regardless of buffering, the next frame displayed WILL include any user input made since the last one. If your video card is vsynced, it would instead have rendered 3 "future" frames that are each 1/60th of a second apart and EACH is going to be displayed, resulting in X/60ths of a second of input lag depending on how many frames were rendered before your input was received.

vsync off (250fps, frames 1/250th of a second apart):
time=0 : Frame 1 is displayed.
time=1/250s: Frame 2 created.
time=1.5/250s: input from user
time=2/250s: Frame 3 created.
time=3/250s: Frame 4 created.
time=1/60s: Frame 4 begins sending to monitor
time=4/250s: Frame 5 enters the buffer while frame 4 is being sent, resulting in tearing as the top half of frame 4 and the bottom half of frame 5 are displayed. User input is included in both.

vsync on (250fps CAPABLE card working at 60fps):
time=0 : Frame 1 is displayed.
time=1/250s: Frame 2 created
time=1.5/250s: input from user
time=2/250s: Frame 3 created.
time=3/250s: Frame 4 created.
time=1/60s: Frame 2 (which is missing the last input) begins sending to the monitor; when it finishes, frame 5 will begin rendering.
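Here is a rough Python sketch of the two timelines above, using the same made-up numbers (a card that finishes a frame every 1/250s, a 60Hz display, input at t=1.5/250s, and 3 frames already queued ahead in the vsync-on case). It is only the model described in this post, not measured behaviour:

RENDER_DT = 1 / 250   # seconds per rendered frame
REFRESH_DT = 1 / 60   # seconds per monitor refresh
INPUT_T = 1.5 / 250   # when the click happens

# vsync off: the newest completed frame is what gets scanned out, so the click
# is on screen by the very next refresh (with tearing).
vsync_off_lag = REFRESH_DT - INPUT_T

# vsync on: in the model above the click doesn't show up until frame 5, and
# frames 2-4 each get their own refresh first, so frame 5 is displayed at 4/60s.
frames_queued_ahead = 3
vsync_on_lag = (frames_queued_ahead + 1) * REFRESH_DT - INPUT_T

print(f"vsync off: click visible after ~{vsync_off_lag * 1000:.1f} ms")
print(f"vsync on : click visible after ~{vsync_on_lag * 1000:.1f} ms")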

Basically, in a very high FPS situation, input lag will be introduced by triple and double buffering, but tearing is eliminated. With low FPS the input lag is lessened because it is less likely that frames are rendered ahead (since the video card is just not fast enough), but it might still occur during high FPS spikes. Tearing, however, is completely gone.

If you think vsync reduces input lag then you are just confusing input lag with lag in general. Or your CPU is choking, and capping the framerate frees it up for quicker calculations.

Now, what is input lag? It could mean one of two things:
You gave input but it did not display on the next image (it took X milliseconds before the gun animation started).
Or you gave a command and it did not REGISTER with the computer until some time later (I clicked first but died).

Anyway, if you are suffering from cases where you shot first and still died, then you want to unburden your CPU as much as possible, in which case vsync + triple buffer means you are doing the LEAST amount of work per image displayed, resulting in a snappier system that will more quickly detect your click.

When I was saying "image lag" before I meant stutter between pictures caused by low FPS. Example: Crysis at 5fps lags and looks like a slideshow.

Quad GPU stutter would then be caused by the fact that the GPUs are rendering concurrently (each one a different frame) and the need to adjust afterwards. The worst case scenario is:

time=0/60s : Frame 1 is displayed. Frame 5 begins render
time=1/60s : Frame 2 is displayed. Frame 6 begins render
time=2/60s : Frame 3 is displayed. Frame 7 begins render
time=3/60s : Frame 4 is displayed. Frame 8 begins render
time=4/60s : Frame 5 is displayed. Frame 9 begins render
time=4.5/60s : user input (gunfire)
time=5/60s : Frame 6 is displayed, missing 0.5/60s of input. Frame 10 begins render
time=6/60s : Frame 7 is displayed, missing 1.5/60s of input. Frame 11 begins render
time=7/60s : Frame 8 is displayed, missing 2.5/60s of input. Frame 12 begins render
time=8/60s : Frame 9 is displayed, missing 3.5/60s of input. Frame 13 begins render
time=9/60s : Frame 10 is displayed. It includes the input from 4.5/60s ago, which makes it "not fit" with the previous frame (which showed 3.5/60s of you NOT doing that act), causing "stutter". Frame 14 begins render

This seems to explain the users complaining about "stutter" on their quad GPU setups despite amazingly good measured "fps". This example shows 60 fps measured with a quad GPU where each GPU individually can only achieve, say, 15-20fps in that setting, but in reality the game will stutter A LOT.

It results in more frames, but no less stuttering than a single GPU (if anything, it makes it MORE noticeable), resulting in a jittery experience.
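Here is a rough Python sketch of that worst case, with the same made-up numbers (4 GPUs in AFR, each needing 4/60s per frame, staggered so one frame completes per 60Hz refresh, gunfire at t=4.5/60s). It only models the scenario above, not any real driver behaviour:

N_GPUS = 4
FRAME_TIME = 4 / 60                 # seconds one GPU needs per frame (~15fps each)
REFRESH_DT = FRAME_TIME / N_GPUS    # staggered, so a frame completes every 1/60s
INPUT_T = 4.5 / 60                  # the gunfire click

def display_time(k):
    # frame k reaches the screen at (k - 1)/60 s, as in the timeline above
    return (k - 1) * REFRESH_DT

def render_start(k):
    # each frame starts rendering FRAME_TIME before it is displayed
    return display_time(k) - FRAME_TIME

# the click can only affect frames whose rendering starts after it arrives
first_with_input = next(k for k in range(1, 1000) if render_start(k) >= INPUT_T)
lag = display_time(first_with_input) - INPUT_T

print(f"measured framerate: {1 / REFRESH_DT:.0f} fps")
print(f"per-GPU framerate : {1 / FRAME_TIME:.0f} fps")
print(f"click first visible in frame {first_with_input}, {lag * 1000:.0f} ms later")

So the counter says 60fps, each GPU is really doing 15fps, and the click still takes about 75ms to show up.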




Conclusions:
Vsync eliminates tearing and reduces input lag (the who-shot-first kind) by decreasing CPU usage, but it also enables buffering to cause stutter.

Buffering might or might not make sense without vsync; with vsync it increases the measured FPS but causes stutter (input lag of the "it suddenly went from no animation to 50% done with the animation" kind). Get rid of it.

AFR rendering (quad / dual GPU) makes no sense: it increases FPS but introduces significant jitter, at least when coupled with vsync (and vsync is needed, because tearing and input lag are bad)...

So for best quality gaming you want the fastest single core GPU rendering with vsync on and the lowest buffering possible.
 

Throckmorton

Lifer
Aug 23, 2007
16,829
3
0
I was wondering about triple buffering. How can it make sense to render 3 frames, and then display them, when there is no way of predicting what the game is going to be doing?
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Originally posted by: taltamir

So for best quality gaming you want the fastest single core GPU rendering with vsync on and the lowest buffering possible.


Interesting analysis, and the conclusion is dead on with higher end gamer/reviewer consensus.

BTW, some of us still have CRTs (not for long though :() so the picture changes a bit -- no 60 Hz limit here. But it does explain a great many things.
 

NickelPlate

Senior member
Nov 9, 2006
652
13
81
Nice write up. I've always gamed with vsync and triple buffering because I can't stand the tearing. It always seemed like triple buffering smoothed things out a little but perhaps it's game dependent? Methinks this maybe should be a sticky.

NP
 

v8envy

Platinum Member
Sep 7, 2002
2,720
0
0
Oh, looky what I found:

http://www.tomshardware.com/19...le_whopper_/page2.html

If you ignore the first part of the page, it explains how AFR introduces additional latency. That analysis was done in the days of CRTs, so the additional lockstep with 60 Hz monitors you analyzed wasn't part of the picture.

It's a bandwidth vs latency issue. Two cards aren't going to reduce your latency; they'll double your bandwidth. Four cards won't reduce latency. A thousand cards won't. If a single card is running at 8 fps, or 1000/8 = 125 ms per frame, you will absolutely experience at least that much latency between performing an action and seeing the result. But with four cards in AFR the number of rendered frames will be as high as 32 per second, leading you to believe you have a much more responsive system than someone running at 8 fps! Add in another 16 ms for monitor lag (or worse, for monitors with multi-frame input lag and poor response time) on top of network lag, and it starts to explain why lately I've been doing much, much better in games like CounterStrike and TF than 5-6 years ago. If I know my box can't handle the game I crank down details and resolution until it can; someone running AFR may not realize they're running poorly.
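The arithmetic behind that, as a quick Python sketch (the 16ms monitor figure is just the example number from this post):

single_gpu_fps = 8
n_gpus = 4
frame_render_ms = 1000 / single_gpu_fps   # 125 ms: the time any one frame takes to render

afr_fps = single_gpu_fps * n_gpus         # up to 32 frames delivered per second
min_action_to_photon_ms = frame_render_ms # each frame still took 125 ms to make

monitor_lag_ms = 16                       # roughly one refresh of a 60 Hz panel
print(f"AFR throughput         : {afr_fps} fps")
print(f"minimum render latency : {min_action_to_photon_ms:.0f} ms")
print(f"plus monitor lag       : {min_action_to_photon_ms + monitor_lag_ms:.0f} ms, before network lag")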

As anyone who has played on a low latency connection can attest, there's a world of difference between 10ms pingers and 100+ms pings. Heck, I know I can tell the difference between 30 and 80.

The 13-year-olds on a sugar high and a $3500 gaming rig from Alienware may have much faster reflexes and benchmarks, but they're dealing with up to 150ms of latency (AFR, monitor input lag, monitor response time) on top of network lag they're not even aware of or trying to compensate for! Awesomesauce.

I'm only going to pay attention to benchmarks done with SFR in the future. Also, I guess I'll be spending $1300 on a CRT after all.
 

Lord Banshee

Golden Member
Sep 8, 2004
1,495
0
0
Originally posted by: NickelPlate
Nice write up. I've always gamed with vsync and triple buffering because I can't stand the tearing. It always seemed like triple buffering smoothed things out a little but perhaps it's game dependent? Methinks this maybe should be a sticky.

NP

I do the same and it does smooth things out.

If you have vsync on and "not" triple buffering, and your card cannot output 60fps (say it can only do ~50fps), then it will jump down to 60/2 = 30fps. So you can have times where it goes 60-30-60-15-30-60... which is not smooth at all. Triple buffering will allow in-between framerates such as 55-54-43, which is much smoother than big jumps like 30-60-30-60-15.

Now, with cards that can do much greater than 60fps, I can see triple buffering causing input lag. But I would also always turn on vsync due to tearing.
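A small Python sketch of that framerate snapping, using the simple "every frame waits for the next whole refresh" model (an assumption for illustration, not measured data):

import math

REFRESH_HZ = 60

def double_buffered_fps(raw_fps):
    # delivered rate when each frame must wait for the next refresh boundary
    refreshes_per_frame = math.ceil(REFRESH_HZ / raw_fps)
    return REFRESH_HZ / refreshes_per_frame

for raw in (59, 50, 40, 31, 29):
    print(f"card capable of {raw:>2} fps -> "
          f"{double_buffered_fps(raw):.0f} fps with vsync + double buffer, "
          f"roughly {raw} fps average with triple buffer")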
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Thank you everyone for the responses and feedback. It was informative.

Excellent point about the buffering. I was focusing too much on AFR multi-GPU solutions.
For a single GPU it has both positive and negative aspects, and the choice whether to enable triple buffering should depend on the specific situation.

To avoid input lag, I would go with decreasing the specific eye candy that causes the major FPS fluctuations that triple buffering mitigates. But depending on the game this might not be an option, in which case triple buffering makes sense.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
taltamir, your post is full of mistakes. That and your arguments appear to contradict themselves at various times.

Single buffer:
Frame 1 is displayed.
Frame 2 is rendering.
Frame 3 will not render until frame 2 is displayed. (or it will and discard, depending on vsync)
That's not a single buffer, that's a double buffer. A single buffer would render to the same buffer that's being displayed, so for obvious reasons single buffering is never used.

The rest of your examples are also out by one buffer.

If your video card is fast enough to render 240fps for game X, then it will render and DISCARD about 3 or so frames before the next monitor refresh.
It will only discard frames with triple buffering + vsync.

With double buffer + vsync the GPU stalls and waits for refresh cycles before continuing rendering, but it won't discard anything.

With vsync off it won't discard anything either; the display simply will not be able to keep up, leading to tearing.
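To make that concrete, here's a toy Python loop of the two vsynced cases (made-up numbers: 1/60s refresh, a card that could finish a frame every 1/250s; this is only a sketch of the behaviour described, not how a driver actually schedules work):

def run(mode, refresh_dt=1 / 60, render_dt=1 / 250, duration=1.0):
    now, next_refresh = 0.0, refresh_dt
    pending = []                      # completed frames waiting for a refresh
    rendered = shown = discarded = 0
    while now < duration:
        if mode == "double" and pending:
            now = next_refresh        # GPU sits idle until the buffer flips
        else:
            now += render_dt          # render one more frame
            rendered += 1
            if mode == "triple" and pending:
                discarded += len(pending)   # newer frame replaces the queued one
                pending.clear()
            pending.append(now)
        if now >= next_refresh and pending:
            pending.pop(0)            # this frame goes to the display
            shown += 1
            next_refresh += refresh_dt
    return rendered, shown, discarded

for mode in ("double", "triple"):
    rendered, shown, discarded = run(mode)
    print(f"{mode}: rendered {rendered}, shown {shown}, discarded {discarded}")

Double buffering renders about one frame per refresh and discards nothing; triple buffering keeps rendering flat out and throws most of it away.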

With low FPS the input lag is lessened because it is less like that frames are rendered ahead (since the video card is just not fast enough), but it might still occur in times of high FPS spikes. However tearing is completely gone.
Frames are never rendered ahead in a standard system because buffering has nothing to do with rendering ahead. You're confusing buffering with AFR or pre-render, and the latter has nothing to do with the GPU anyway.

Vsync eliminates tearing, and reduces input lag (who shot first kind) by decreasing CPU usage, but it also enables buffering to cause stutter.
Vsync does not reduce input lag; it increases it because it stalls the GPU until a refresh cycle is available. In theory triple buffering reduces the lag but I've found the opposite in certain cases, so YMMV.

Buffering might or might not make sense without vsync, with vysnc it increases the measured FPS, but causes stutter (input lag of the "I shot and it took suddenly went from no animation to 50% done with animation). Get rid of it.
Um, buffering always makes sense. What do you suppose would happen to an image if it was being updated while it was being drawn? Double buffering is a minimum requirement in order to do off-screen drawing, which prevents things like flickering images during animations.

AFR rendering (quad / dual GPU) makes no sense, It increases FPS but introduces significant jitter. At least when coupled with vsync. (but tearing and input lag are bad so it is needed)...
Actually a lot of the time AFR doesn't even work with vsync.

So for best quality gaming you want the fastest single core GPU rendering with vsync on and the lowest buffering possible.
No, that's the worst possible thing you can do. Like I said, with a double buffered system vsync will cause the GPU to stall as it waits for each refresh cycle. This not only increases input lag but also causes your framerate to drop to fractions of your refresh rate (e.g. a 60 Hz display will always cause 30 FPS rendering if you're getting 30-59 FPS).
 

TonyB

Senior member
May 31, 2001
463
0
0
I have a 21" CRT; I game at 1280x1024 with the refresh rate set to 100Hz instead of 60Hz. World of Warcraft has the option to turn on vsync and triple buffering in the game, and I have them turned on.

My FPS caps at 100fps. I can't really tell if there is INPUT lag between 100Hz and 60Hz, but I do notice screen tearing without vsync on. Any idea if running at higher refresh rates would minimize the alleged lag?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Any idea if running at higher refresh rates would minimize the alleged lag?
On a CRT? Absolutely.

With vsync a CRT @ 100 Hz can display 100 full frames per second while @ 60 Hz it can only display 60 full frames per second. This would make a huge difference to input response because the visual feedback is much better.
 

TonyB

Senior member
May 31, 2001
463
0
0
Is there a formula to calculate the theoretical INPUT lag?

For example, let's say I have vsync ON @ 100Hz with TRIPLE buffering enabled, compared to 60Hz,

or vsync ON @ 100Hz with DOUBLE buffering enabled, compared to 60Hz.

Running with vsync on @ 100Hz with triple buffering enabled, if I'm pointing at the sky and the FPS is capped at 100FPS, will I theoretically have 0 input lag since the GFX card can process faster than the refresh rate?

And would input lag also go up when FPS dips really low, to like 20FPS, say in a room with 100 objects all doing stuff?

Thanks for answering my questions.




EDIT: I'm going to take a shot at the calculations; please correct me if I'm wrong.

At 60Hz refresh rate with double buffering, you're rendering 1 frame ahead, so would that be 1/60 = 16.66ms input lag?

At 60Hz refresh rate with TRIPLE buffering, you're rendering 2 frames ahead, so that would be 2/60 = 33.33ms input lag.

At 100Hz refresh with double buffering, that's 1/100 = 10ms lag.
At 100Hz refresh with triple buffering, that's 2/100 = 20ms lag.

So would the correct formula for input lag be:

frames rendered ahead / refresh rate = input lag
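Running those numbers as a quick Python check (this uses my assumption above that double buffering queues 1 frame ahead and triple queues 2, which per BFG10K may not be how buffering actually works, and it ignores render time, the monitor's own lag, and where in the refresh the input lands):

def buffered_input_lag_ms(frames_ahead, refresh_hz):
    return frames_ahead / refresh_hz * 1000

for hz in (60, 100):
    for name, ahead in (("double", 1), ("triple", 2)):
        print(f"{hz:>3} Hz, {name} buffer: {buffered_input_lag_ms(ahead, hz):.2f} ms")

which gives 16.67ms and 33.33ms at 60Hz, and 10ms and 20ms at 100Hz, matching the figures above.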