Vsync not working as I thought it should.


bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
It's much more common for a game not to have triple buffering; in fact, it's probably only a handful that have it.

At least we now have a game to refer to that actually drops to 30 fps on the framerate counter, though.
Just because a game does not offer the option does not mean it isn't already enabled within the game, or enabled automatically when you turn on v-sync.
 

Falafil

Member
Jun 5, 2013
51
0
0
It's much more common for a game not to have triple buffering; in fact, it's probably only a handful that have it.

At least we now have a game to refer to that actually drops to 30 fps on the framerate counter, though.

The fact that the framerate counter shows you values other than 30 and 60 means there is some form of triple buffering being used. If there is no triple buffering, the frame counter will show 30 or 60, nothing in between. As far as I can tell most games, old and new, have triple buffering built in.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
The fact that the framerate counter shows you values other than 30 and 60 means there is some form of triple buffering being used. If there is no triple buffering, the frame counter will show 30 or 60, nothing in between. As far as I can tell most games, old and new, have triple buffering built in.
Again, that is NOT always true. I have gone through this many times over many years. People will claim a specific game will show that, and it does not. And what is all this nonsense about most games having triple buffering built in? Most games most certainly do not, especially older games.

And to be clear for others, I am only referring to the framerate counter, not what is actually going on with how vsync and triple buffering technically work.
 

Falafil

Member
Jun 5, 2013
51
0
0
Again, that is NOT always true. I have gone through this many times over many years. People will claim a specific game will show that, and it does not. And what is all this nonsense about most games having triple buffering built in? Most games most certainly do not, especially older games.

And to be clear for others, I am only referring to the framerate counter, not what is actually going on with how vsync and triple buffering technically work.

Then would you explain why the framerate counter does drop to 30 in some games, as you just admitted in a recent post, while it doesn't in others? Do you have an explanation other than triple buffering?

But I'm not basing my claim on that. I'm basing it on this: when v-sync is on and there is no triple buffering, once the framerate drops the frame times will constantly be 33 ms, because after the back buffer finishes its frame it simply sits and waits each time the front buffer's frame is scanned out. Put more simply, with v-sync alone the front buffer will always show every single frame twice, so there is no possible way the framerate counter could show anything other than 30 once the framerate drops below 60; no way other than triple buffering.
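To make the timing model in this argument concrete, here is a rough sketch (my own toy model of plain double-buffered vsync at 60 Hz, written in Python; the render times are made up, and real drivers queue frames in more complicated ways):

Code:
import math

REFRESH_MS = 1000.0 / 60.0  # one 60 Hz refresh interval, ~16.7 ms

def refreshes_held(render_times_ms):
    # Double-buffered vsync: a new frame can only appear on a refresh
    # boundary, so each frame stays on screen for a whole number of
    # refreshes -- two of them when the next frame needs more than ~16.7 ms.
    return [max(1, math.ceil(t / REFRESH_MS)) for t in render_times_ms]

# Every frame takes just a bit longer than one refresh to render...
held = refreshes_held([18.0] * 60)
print(held[:6])                                # [2, 2, 2, 2, 2, 2]
print(60.0 / (sum(held) / len(held)), "fps")   # 30.0 fps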
 
Last edited:

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
I had the same argument with BFG10K and he specifically named a game and claimed it dropped straight from 60 to 30. I knew it wasn't true and made a video to prove it at the time, and even just sat there showing framerates in the 40s and 50s. Before that he argued and argued with me, claiming Fraps went from 60 to 30 on that game. So again, all these people claiming the framerate counter will always show you going straight from 60 fps to 30 fps are just making stuff up.
Nice revisionist history. You missed out the most important part: when I asked you to post your frame logs, your own logs showed I was right, and showed your framerate was bouncing between 30 FPS and 60 FPS, like I said it would.

I then speculated that it could be your overclocked system producing incorrect measurements in places.

But since you're keen to revisit this again, I just did a fresh run of 60 seconds for you in the same game:

[Image: Graph.png]


The bouncing between 30 FPS and 60 FPS is as plain as day.

The fact is, unless the game has triple buffering (injected, in-game, driver forced, or otherwise), it'll always happen with regular vsync; the rendering will stall until the next refresh cycle whenever both buffers are full.

If this problem didn't happen like you claim, what do you think is the point of nVidia's adaptive vsync? In fact this is precisely why it was implemented.
 
Last edited:

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
The trace you have posted represents an average of around 50 fps, which is what Fraps would report. Because vsync is on, a frame lasts either 16 ms or 33 ms; that is the reality of how monitors work today at 60 Hz.

There are two schools of thought about this. One is that it doesn't matter: these momentary drops over a small number of frames are not an issue, and tearing is a much bigger problem. The other is that we can definitely see these momentary drops, especially when they come in groups as they do with vsync, and that they cause a stutter humans can perceive.

I am very much in the second group; I see the effect clearly and it's a major problem for me. What triple buffering does to change that trace is remove the groupings of 33 ms frames coming together and allow more alternation. But it's far from the case that vsync can only achieve 60 and 30 fps and nothing else, because on average it produces frame rates in between just fine, which is why it's a bad measure: it focuses on an average over a second, and that isn't how we perceive motion at all.

What I wish we would all do is move to talking about frame times and their consistency, because that is closer to how we perceive motion from a series of images shown to us in rapid succession. The 60/30 problem, as I said before, is momentary; it occurs with both double and triple buffering, but in somewhat different patterns.
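As a toy illustration of why frame times tell you more than the averaged counter (made-up numbers, not a trace from any real game):

Code:
# Roughly 1.1 s of made-up frame times (ms): mostly smooth, with a burst of
# 33 ms frames like the grouped vsync stalls described above.
frame_times = [16.7] * 54 + [33.3] * 6

avg_fps = 1000.0 * len(frame_times) / sum(frame_times)
print(round(avg_fps), "fps average")        # ~54 fps -- looks nearly fine
print(max(frame_times), "ms worst frame")   # 33.3 ms spikes -- the stutter you feel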

As to triple buffering: I consider the algorithm that OpenGL uses (the one computer science theory describes) to be better, because it achieves the same effect as the DX one with dramatically less latency. When I was doing computer graphics at university, my professor told me that moving from two buffers to three without exploiting the fact that you can write to either back buffer at any time was a naive algorithm that thankfully no one would ever use, since a simple and very obvious optimisation was available. It was true at the time, but he underestimated how bad Microsoft was.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
There are two schools of thought about this. One is that it doesn't matter: these momentary drops over a small number of frames are not an issue, and tearing is a much bigger problem. The other is that we can definitely see these momentary drops, especially when they come in groups as they do with vsync, and that they cause a stutter humans can perceive.
Whether it's momentary or not depends on how often your system can output at or above the monitor's refresh rate at the settings you use.

If you're running a scene at 30-59 FPS (vsync off), then with vsync on it's going to constantly run at 30 FPS until your system can manage >=60 FPS again. The latter might never happen for the duration of the game, in which case the entire game will be locked at 30 FPS.

Here are some more traces from the same game above, 30 seconds of the same scene:

[Image: Graph.png]


Because the system never manages 60 FPS with vsync off, the entire 30 seconds is locked to 30 FPS with vsync on. That's far from momentary.

The red line also shows what would happen with vsync + triple buffering, and likewise with adaptive vsync.

As a point of reference, vsync on/low IQ uses reduced IQ levels in the same scene to show the 60FPS flatline when the system can manage >=60 FPS.

But it's far from the case that vsync can only achieve 60 and 30 fps and nothing else, because on average it produces frame rates in between just fine...
This is simply untrue. Ignoring Fraps noise (it averages across 1 second, and the in-game counter averages even longer), a double-buffered vsync'd system can't deliver frames at anything other than an integer division of the refresh rate (rounding down).

Triple buffering fixes this because it allows rendering to continue into a third buffer even when the other two are full and waiting for a refresh cycle to flush them.
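As a quick sketch of the "division of the refresh rate" point (my own illustration, not BFG10K's data): with double-buffered vsync at 60 Hz a frame is held for n whole refreshes, so the only per-frame rates you can actually get are 60/n.

Code:
REFRESH_HZ = 60

# A frame held for n whole refreshes has an instantaneous rate of 60/n fps.
possible_rates = [REFRESH_HZ / n for n in range(1, 7)]
print(possible_rates)   # [60.0, 30.0, 20.0, 15.0, 12.0, 10.0]

# Any value a counter reports between these is an average over many frames,
# not a rate any single frame was actually displayed at.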
 
Last edited:

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
That's not entirely true.

With v-sync on and no triple buffering, whether you will see FPS between 30 and 60 depends on whether some frames can be rendered faster than 16.7 ms while others cannot. This usually means that with v-sync off your FPS would be just below 60 on average (yet close to 60). With v-sync on, if most frames cannot be rendered in 16.7 ms they have to wait for a second refresh, resulting in near 30 FPS, and if none can be rendered in 16.7 ms, it will lock at 30 FPS.

It comes down to how consistent the frame times are. If a game has rock-solid consistent frame times and every frame is just slower than 16.7 ms, then it will drop to 30 FPS. Most games aren't that consistent, but if it takes 25 ms on average to render a frame, even with some variance there will likely not be many frames that come in under 16.7 ms either, again resulting in near 30 FPS.
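A quick numeric sketch of this point (made-up render times, using the same simplified one-or-two-refresh model as earlier in the thread): frame times that straddle 16.7 ms land somewhere between 30 and 60 fps, while frame times that are all just over 16.7 ms pin the counter to 30.

Code:
import math

REFRESH_MS = 1000.0 / 60.0

def vsync_fps(render_times_ms):
    # Each frame is held for a whole number of 60 Hz refreshes.
    held = [max(1, math.ceil(t / REFRESH_MS)) for t in render_times_ms]
    return 60.0 * len(held) / sum(held)

mixed  = [14, 19, 15, 20, 16, 18] * 10   # some frames beat 16.7 ms, some miss it
steady = [18] * 60                       # no frame ever beats 16.7 ms

print(vsync_fps(mixed))    # 40.0 -- alternating one- and two-refresh frames
print(vsync_fps(steady))   # 30.0 -- every frame held for two refreshes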
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
Nice revisionist history. You missed out the most important part: when I asked you to post your frame logs, your own logs showed I was right, and showed your framerate was bouncing between 30 FPS and 60 FPS, like I said it would.

I then speculated that it could be your overclocked system producing incorrect measurements in places.

But since you're keen to revisit this again, I just did a fresh run of 60 seconds for you in the same game:

http://s21.postimg.org/5fl4q9lgn/Graph.png

The bouncing between 30 FPS and 60 FPS is as plain as day.

The fact is, unless the game has triple buffering (injected, in-game, driver forced, or otherwise), it'll always happen with regular vsync; the rendering will stall until the next refresh cycle whenever both buffers are full.

If this problem didn't happen like you claim, what do you think is the point of nVidia's adaptive vsync? In fact this is precisely why it was implemented.
And here you go again dodging my whole point. You claimed the framerate counter showed a direct 60-to-30 fps drop in COJ when it did NOT for me. I have made clear over and over and over in this thread that I am talking ONLY about the framerate counter.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
The fact is, unless the game has triple buffering (injected, in-game, driver forced, or otherwise), it'll always happen with regular vsync; the rendering will stall until the next refresh cycle whenever both buffers are full.

So may I ask why the FPS counter doesn't show this? I mean, when it drops from 60 it may say 45, for example, with Fraps or another overlay.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
So may I ask why the FPS counter doesn't show this? I mean, when it drops from 60 it may say 45, for example, with Fraps or another overlay.
It definitely doesn't always happen. However, without some form of triple buffering you will lose more FPS with v-sync, and in the worst cases it drops to 30 FPS.
 

3DVagabond

Lifer
Aug 10, 2009
11,951
204
106
Man triple buffering is so cool. I never knew I could have 40-50 fps vsync'd.

Remember, that's average fps. If the fps is jumping between 30 and 60 it could report ~45 fps, because that's the average of the two. It would have to be right on the borderline of holding 60 fps. You should be able to reduce one setting and get it locked back at 60 fps.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
With v-sync on and no triple buffering, whether you will see FPS between 30 and 60 depends on whether some frames can be rendered faster than 16.7 ms while others cannot. This usually means that with v-sync off your FPS would be just below 60 on average (yet close to 60). With v-sync on, if most frames cannot be rendered in 16.7 ms they have to wait for a second refresh, resulting in near 30 FPS, and if none can be rendered in 16.7 ms, it will lock at 30 FPS.
No, you cannot have in-between frames on a double buffered vsync'd system. In practical terms, the red above is not possible in such a case.

You can have blue or green, or the sawtooth graph I posted earlier, but all frames are still a division of the refresh rate.

If your counter or log files are showing in-between scores, it's because the numbers average across one or more seconds. So if you're bouncing between 30FPS and 60FPS every 1/10th second (for example), the counter's going to show 45FPS across that second even though that's not really the case.

That's why you can't just run around at random and blindly use a FPS counter to check if the game is triple buffering; you have to set up the tests properly to ensure accurate results.
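A tiny worked example of that averaging (hypothetical numbers): one second spent alternating between 60 fps and 30 fps every tenth of a second delivers 45 frames, so the counter reads "45 FPS" even though no part of that second actually ran at 45.

Code:
# One second split into ten 0.1 s slices, alternating between 60 and 30 fps.
slices = [60, 30] * 5
frames_in_one_second = sum(rate * 0.1 for rate in slices)
print(frames_in_one_second)   # 45.0 -> the counter reports "45 FPS"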
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
And here you go again dodging my whole point. You claimed the framerate counter showed a direct 60-to-30 fps drop in COJ when it did NOT for me.
You don't have a "point", other than showing us that your testing methods were flawed.

I have made clear over and over and over in this thread that I am talking ONLY about the framerate counter.
Why do you choose to ignore the log files and the graphs? Is it simply because they show something different to the FPS counter, thereby proving you wrong? Or is it something else?
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,958
126
So may I ask why the FPS counter doesn't show this? I mean, when it drops from 60 it may say 45, for example, with Fraps or another overlay.
The tools will show it, if you test and analyse things properly. If you're asking why it doesn't always show, it could be any (or all) of these reasons:

  1. The FPS counter is an average of 1 second or more. So while 45 FPS is "accurate", it's not accurate in the sense that it's actually either 30FPS or 60FPS. The whole hoopla around FCAT is exactly this point.
  2. Depending on the game/scene/settings, your system may be pulling >=60 FPS for 99% of the time, so the brief moments at 30FPS won't be picked up by the counter (again, see #1 about averages).
  3. The game could be triple buffering automatically.
I don't believe that most games triple buffer, so I personally think #2 applies to most people who claim not to see the issue. People who run at taxing GPU limited settings will notice it a lot more, even with the regular FPS counter.
 

ICDP

Senior member
Nov 15, 2012
707
0
0
The tools will show it, if you test and analyse things properly. If you're asking why it doesn't always show, it could be any (or all) of these reasons:

  1. The FPS counter is an average of 1 second or more. So while 45 FPS is "accurate", it's not accurate in the sense that it's actually either 30FPS or 60FPS. The whole hoopla around FCAT is exactly this point.
  2. Depending on the game/scene/settings, your system may be pulling >=60 FPS for 99% of the time, so the brief moments at 30FPS won't be picked up by the counter (again, see #1 about averages).
  3. The game could be triple buffering automatically.
I don't believe that most games triple buffer, so I personally think #2 applies to most people who claim not to see the issue. People who run at taxing GPU limited settings will notice it a lot more, even with the regular FPS counter.

Excellent post that explains the problem with relying on an average FPS counter on its own. The clue is in the name: it is called frames per SECOND. Just staring at the average FPS in real time on screen will give misleading results.
 
Last edited:

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Stop calling it a 30/60 fps problem, I beg you. FPS = frames per second; the period of averaging is right there in the name. It's important to talk about this at the frame-time level rather than using FPS equivalents, because FPS just confuses a lot of people who haven't spent much time with these other measures.

Theoretically and practically you can get a game to achieve only 30 fps when the same settings with vsync off would be higher. It's actually quite rare, however, to see it without deliberately trying to. It's more common to see these back-and-forth patterns with an average coming out much closer to the vsync-off frame rate, not because of triple buffering but because the game isn't very consistent in the time it takes to render each frame. Typically people set their graphics settings to hold a constant 60 fps until they hit some heavy scene, which often lasts only a few seconds and causes this back and forth as the heavy effects push past the boundary. Of course, if you are one of the rare people who aims for, say, 40 fps all the time, and presumably 20-25 ish in a very heavy scene, then you'll get stuck at 33 ms frames with vsync and "lose" 10 fps on average.

Triple buffering solves it, in the sense that you would get 40 fps instead of 30 fps in this scenario. But the frames will still be delivered to the screen in discrete 16 and 33 ms steps; it will still show the inconsistent jumping around inherent to vsync, just at a better frame rate. It also adds another 16 ms of latency on top of the 32 ms added by vsync, for a total of 48 ms at the very least. We know the brain can perceive down into the 20 ms range (John Carmack's speech on the Oculus Rift last year), so it's definitely a problematic increase. If it were a universal fix that always worked, then all the game devs would be doing it, as the memory requirement these days is minimal. But they don't, because in a lot of cases it's a worse experience than double buffering, which only shows a problem rarely rather than on every frame. Latency matters a lot, and it's already too high for some people with double buffering and vsync, let alone making it worse with another buffer.

John Carmack famously said some years ago that everyone should be using triple buffering, and that vsync was the greatest image quality setting that people seemed to avoid. He argued it was much better to get vsynced screens and that the latency wasn't an issue. Right up until the point where he started to notice the problem himself. In his most recent id speech he focused on driving latency down as much as possible throughout the pipeline. He is no longer looking at triple buffering or vsync; he is trying to shave 5 ms off the pixel switching time, because that 48 ms of buffering just isn't acceptable anymore. He now thinks we need 120 Hz OLEDs with near-instant switching and game engines capable of pushing 120 fps. That is an attempt to go from the 70-100 ms of latency he was advocating 10 years ago to a target of below 20 ms today.

So triple buffering can help alleviate some of the drawbacks of double buffering, but it's far from free. Please stop calling it the 30/60 problem; it almost never manifests that way, especially considering what an average FPS figure represents. I largely agree that most people who run vsync are in scenario #2, though with the difference that the drops only happen in moments of intense action, when the player is busy doing things other than looking at the frame counter.
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
John Carmack famously said some years ago that everyone should be using triple buffering, and that vsync was the greatest image quality setting that people seemed to avoid. He argued it was much better to get vsynced screens and that the latency wasn't an issue. Right up until the point where he started to notice the problem himself. In his most recent id speech he focused on driving latency down as much as possible throughout the pipeline. He is no longer looking at triple buffering or vsync; he is trying to shave 5 ms off the pixel switching time, because that 48 ms of buffering just isn't acceptable anymore. He now thinks we need 120 Hz OLEDs with near-instant switching and game engines capable of pushing 120 fps. That is an attempt to go from the 70-100 ms of latency he was advocating 10 years ago to a target of below 20 ms today.

Interesting, I wasn't aware of that. This presumably means he's unlikely to repeat the travesty that was Rage's 60 fps cap.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
The tools will show it, if you test and analyse things properly. If you're asking why it doesn't always show, it could be any (or all) of these reasons:

  1. The FPS counter is an average of 1 second or more. So while 45 FPS is "accurate", it's not accurate in the sense that it's actually either 30FPS or 60FPS. The whole hoopla around FCAT is exactly this point.
  2. Depending on the game/scene/settings, your system may be pulling >=60 FPS for 99% of the time, so the brief moments at 30FPS won't be picked up by the counter (again, see #1 about averages).
  3. The game could be triple buffering automatically.
I don't believe that most games triple buffer, so I personally think #2 applies to most people who claim not to see the issue. People who run at taxing GPU limited settings will notice it a lot more, even with the regular FPS counter.

Great post, BFG. That should really clear up the question raised by the OP.
 

Smoblikat

Diamond Member
Nov 19, 2011
5,184
107
106
I thought vsync just capped the frames at 60 (or whatever your refresh rate is) and didn't drop them to anything lower.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
Then would you explain why the framerate counter does drop to 30 in some games, as you just admitted in a recent post, while it doesn't in others? Do you have an explanation other than triple buffering?

The reason FPS counters show values other than 60/30/20 fps is that FPS counters, on the whole, are AVERAGES. They're an average number of frames, usually counted over a one-second period.

Part of the misunderstanding of vsync is the idea that it "caps" your frame rate; that is only what appears to happen, and it's just an easy way to describe its approximate behaviour. What actually happens is that each frame is displayed for an entire refresh. When the next refresh starts, if the next frame is ready (has been fully rendered), the buffers are flipped and the next frame is displayed; if the next frame isn't ready, the current frame is displayed for another whole refresh.

This actually depends on the individual frame render time and NOT the frame rate. It means that while you can maintain over 60 fps on AVERAGE, individual frames inside that one second of measurement may take longer than 16.666 ms to render (1000 ms / 60 Hz). Those frames force the prior frame to be displayed over two refreshes and cause the effective frame rate (what you see with v-sync) to be lower than 60 fps.

In fact you can have average frame rates of way above 60fps and still not see 60 distinct frames in 1 second with vsync on.

For example: you could have an average of 100 fps (10 ms to render the AVERAGE frame), but in reality half the frames might take 2 ms and the other half 18 ms to render. The 18 ms frames cannot be finished in the 16.666 ms window, so each one forces the prior frame to repeat for a second refresh, dragging the measured/effective frame rate with vsync on down to around 40 fps.

But average those numbers:

(18 ms × 30 frames + 2 ms × 30 frames) / 60 frames
= (540 ms + 60 ms) / 60 frames
= 10 ms per frame on average
1000 ms / 10 ms = 100 fps

Oh look, it's a 100 fps average, so a counter sampling once per second reports 100 fps, yet with vsync on you'd actually see something like 40 fps on screen, not the 60 fps you might expect.
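The arithmetic above can be checked in a few lines (same made-up 2 ms / 18 ms split, same simplified vsync model used earlier in the thread; this is just my sketch, not PrincessFrosty's code):

Code:
import math

REFRESH_MS = 1000.0 / 60.0
render_times = [2, 18] * 30           # 60 frames: half take 2 ms, half take 18 ms

avg_fps = 1000.0 / (sum(render_times) / len(render_times))
print(avg_fps)                        # 100.0 -> "100 fps average" with vsync off

# With vsync on, every 18 ms frame misses the 16.666 ms window, so the frame
# before it is held for a second refresh.
held = [max(1, math.ceil(t / REFRESH_MS)) for t in render_times]
print(60.0 * len(held) / sum(held))   # 40.0 -> what you actually see on screen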

Now that we understand vsync is linked to individual frame times and not the average frame rate, we can see how average frame rates other than 60, 30, or 20 fps can show up.

Imagine the following series of frame render times, all in ms.

12,10,12,14,16,18,30,35,36,50,82,95,100,150,120,90,43,20,10,8,12,10,10,7,10

That all adds up to 1000 ms over 25 frames, so we've averaged 25 fps. Notice the first five frames are all below 16.666 ms, so each displays on its own unique refresh just once. Once we hit the sixth frame at 18 ms, it cannot render inside the 16.666 ms window, so the prior frame is displayed again. The next refresh rolls around (2 × 16.666) 33.333 ms later, and our 18 ms frame is ready and displayed. The next frame takes 35 ms to render, which is longer than the 16.666 ms of one refresh, so the prior frame is displayed over two refreshes; but two refreshes is only 33.333 ms, and this frame still isn't ready, so we display the prior frame for not two but three refreshes.

And so on and so forth: some frames end up displayed not for just one or two refreshes but for up to nine, in the case of the frame prior to the one that takes 150 ms to render.

That's how you can end up squeezing an arbitrary number of frames into one second, not just the fixed fractional values of 60/n (60, 30, 20 fps, etc.).
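And a short script replaying that series under the same simplified model (my own sketch; the frame times are the ones listed above):

Code:
import math

REFRESH_MS = 1000.0 / 60.0

# The 25 render times (ms) from the example above; they sum to 1000 ms.
frame_times = [12, 10, 12, 14, 16, 18, 30, 35, 36, 50, 82, 95, 100, 150,
               120, 90, 43, 20, 10, 8, 12, 10, 10, 7, 10]

print(1000.0 * len(frame_times) / sum(frame_times), "fps with vsync off")   # 25.0

# With vsync on, each frame occupies a whole number of refreshes.
held = [max(1, math.ceil(t / REFRESH_MS)) for t in frame_times]
print(held)   # mostly 1s and 2s, up to 9 refreshes for the 150 ms frame
wall_ms = sum(held) * REFRESH_MS
print(round(1000.0 * len(frame_times) / wall_ms, 1), "fps with vsync on")   # ~21.4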
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,301
68
91
www.frostyhacks.blogspot.com
Interesting, I wasn't aware of that. This presumably means he's unlikely to repeat the travesty that was Rage's 60 fps cap.

I believe the Rage and Doom 3 (id Tech) engines are capped not because rendering is capped per se, but because the tick rate of the engine (the physics, the animation, etc.) is limited to 60 Hz.

As he said, with Doom 3 you could render much faster than 60 fps, but if the engine isn't updating more than 60 times per second, your newly rendered frame wouldn't be any different from the last, and so is essentially pointless.

I think he's more interested in latency than refresh rate, which aren't necessarily the same thing. Latency is affected by frame rate, but it's also affected by MANY other things, some of which have a much more significant impact than frame rate alone. His latest keynote is really interesting (as always, tbh).
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81
I thought vsync just capped the frames at 60 (or whatever your refresh rate is) and didn't drop them to anything lower.

It stops FPS from going HIGHER than the refresh rate, not lower!

(edit: that's my 'dumb' version of PrincessFrosty's fascinating and technically superior explanation, of course!)
 
Last edited: