Vsync not working as I thought it should.


DominionSeraph

Diamond Member
Jul 22, 2009
8,386
32
91
Man triple buffering is so cool. I never knew I could have 40-50 fps vsync'd.

Welcome to 1998.

Please stop calling it a 30/60 problem; it almost never manifests in that way

The way games are set up, that's exactly how it manifests. Games do not flip between low detail and high detail every few ms; they're set up to give experiences on human timescales. This translates to areas with differences in scene complexity. You will hit regions that drop you down to 30FPS for a solid duration, and regions where you're at 60. It does not flip between 16 and 33ms frames to give you 45FPS -- if it did we wouldn't have any need for triple buffering, now would we?
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
You don't have a "point", other than showing us that your testing methods were flawed.


Why do you choose to ignore the log files and the graphs? Is it simply because they show something different to the FPS counter, thereby proving you wrong? Or is it something else?
What kind of reply is that? You claimed that COJ went straight from 60 to 30 when looking at the FRAPS counter on screen, and it did NOT for me. I am not arguing about how vsync works; I am telling you, and even showed you, what the actual framerate counter on the screen displayed, which is what my argument was about. So again, stop twisting things.
 

PrincessFrosty

Platinum Member
Feb 13, 2008
2,300
68
91
www.frostyhacks.blogspot.com
The way games are set up, that's exactly how it manifests. Games do not flip between low detail and high detail every few ms; they're set up to give experiences on human timescales. This translates to areas with differences in scene complexity. You will hit regions that drop you down to 30FPS for a solid duration, and regions where you're at 60. It does not flip between 16 and 33ms frames to give you 45FPS -- if it did we wouldn't have any need for triple buffering, now would we?

This entirely depends on the game, the settings you pick, and how you play. Generally speaking, the goal of good game development is to keep frame rates consistent during play, but this is a very hard technical challenge: in the dynamic worlds we have today, players can create scenarios the devs won't predict, which lead to unstable frame rates.

One other common way of seeing this extremely sharp change in frame rate from good to bad within sub-second time frames is HDD thrashing: when people set game settings higher than their video card can cope with and swapping from disk kicks in, it's a major hindrance to frame rate.

In some games, if you simply spin around and force the world to dynamically cull everything, you can frequently cause slowdowns with rapid movements. A lot of games today are built on the assumption that players turn no faster than the analogue stick's maximum rotation speed, so mouse users can often cause stutter by doing near-instant 180 flips.

The sudden spawning of lots of effects, like explosions that fill the screen with particles and smoke, will usually cause this too; it's not exactly uncommon.
 

bystander36

Diamond Member
Apr 1, 2013
5,154
132
106
No, you cannot have in-between frames on a double buffered vsync'd system. In practical terms, the red above is not possible in such a case.

You can have blue or green, or the sawtooth graph I posted earlier, but all frames are still a division of the refresh rate.

If your counter or log files are showing in-between scores, it's because the numbers average across one or more seconds. So if you're bouncing between 30FPS and 60FPS every 1/10th second (for example), the counter's going to show 45FPS across that second even though that's not really the case.

That's why you can't just run around at random and blindly use a FPS counter to check if the game is triple buffering; you have to set up the tests properly to ensure accurate results.
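
To make the averaging concrete, here's a minimal sketch (made-up numbers, and a hypothetical per_second_counter helper; the exact split within the second doesn't matter):

```python
# Hypothetical trace: each second is half 16.7ms frames, half 33.3ms frames.
# No individual frame is "45 FPS", yet a once-per-second counter reads ~45.

def per_second_counter(frame_times_ms):
    """Count how many frames complete inside each whole second,
    like an on-screen FPS counter that updates once per second."""
    counts, elapsed, frames, next_tick = [], 0.0, 0, 1000.0
    for ft in frame_times_ms:
        elapsed += ft
        frames += 1
        if elapsed >= next_tick:
            counts.append(frames)
            frames = 0
            next_tick += 1000.0
    return counts

one_second = [16.7] * 30 + [33.3] * 15   # 0.5s at ~60 FPS + 0.5s at ~30 FPS
print(per_second_counter(one_second * 3))  # [45, 45, 45]
```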

Let me try to explain this, as you either have no clue, or misunderstand the concept I was explaining.

When a frame is rendered in a double buffer system with v-sync on, if a frame takes 16ms or less to be created from start to finish, it gets sent to the front buffer immediately. If it takes longer than 16ms, it has to wait for another refresh before it is sent to the front buffer, which means you have a 33ms frame time.

So if your frames are not consistently created in less than 16ms, and you have some that take longer and some that take less time, you'll get an FPS between 30 and 60.

Let's break that down a second. FPS = frames per second. That means it is adding up how many frames are created in a second. Even if the frame times between frames are jumping from 16ms to 33ms, it does not matter; it is how many frames are created per second that determines your FPS.

If you want to talk about individual frames, don't use FPS; use frame times.
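
Here's a very simplified sketch of that quantisation (made-up render times; it assumes an idealised 60Hz display and ignores driver queueing and other real-world details):

```python
import math

REFRESH_MS = 1000.0 / 60.0   # ~16.7ms per refresh on a 60Hz display

def displayed_intervals(render_times_ms):
    """Double buffering + vsync: a finished frame can only be swapped at a
    refresh boundary and the renderer stalls until then, so every frame
    occupies a whole number of refresh intervals (16.7ms, 33.3ms, ...)."""
    return [math.ceil(rt / REFRESH_MS) * REFRESH_MS for rt in render_times_ms]

renders = [14.0, 15.5, 18.0, 20.0, 16.0, 25.0]   # some under 16.7ms, some over
shown = displayed_intervals(renders)

print([round(x, 1) for x in shown])   # [16.7, 16.7, 33.3, 33.3, 16.7, 33.3]
print(round(1000.0 * len(shown) / sum(shown), 1), "FPS over this stretch")  # 40.0
```

Every displayed interval is still exactly one or two refreshes, but count them over a second and the total lands somewhere between 30 and 60.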
 

Black Octagon

Golden Member
Dec 10, 2012
1,410
2
81

Wow you weren't kidding. Hugely informative talk. I don't pretend to understand anywhere near all of that, but he certainly got me excited for unified memory.

My only gripe (and sorry that this is a little off-topic) is his admiration for how far 'displays' have come. His examples all refer to handheld displays, which is where all the innovation has been recently. Innovation in computer monitor displays has been stagnating in recent years, probably as a direct side-effect of all the energy that's gone into handheld displays.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Stop calling it a 30/60 FPS problem, I beg you. FPS = frames per second; the period of averaging is even in the name. It's important to talk about this at a frame-time level rather than using FPS equivalents, because it just confuses a lot of people who haven't spent much time with these other measures.
FPS = frames / T, where T == 1 second (or 1000 ms if you want to be extra pedantic).

But if you lower T enough such that max(frames) == 1, you get the longest frametime from a run. So in that respect they're really not that different.
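
A rough sketch of that relationship, with made-up frame times:

```python
frame_times_ms = [16.7, 16.7, 33.3, 16.7, 33.3]   # made-up trace

# FPS over the whole run (T = the full window):
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)   # ~42.8

# Shrink T until only one frame fits and the same measure collapses
# into per-frame timing, i.e. 1 / frametime:
per_frame_fps = [1000.0 / ft for ft in frame_times_ms]   # [59.9, 59.9, 30.0, 59.9, 30.0]

print(round(avg_fps, 1), [round(f, 1) for f in per_frame_fps])
# The worst per-frame value lines up with the longest frametime in the run.
```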

Besides, if you want to really split hairs, it's not even frametimes; it's calls to IDirect3DDevice9::Present(), or similar for other APIs. But there's no reason to nitpick like that, because most people understand what's happening when you say 30/60.

In fact most people didn't even know what frametimes were until the hoopla of the last six months with AMD's drivers. But the fractional framerate problem with vsync was being observed and discussed 10+ years ago.

I'm pretty sure I tested vsync way back in 1997/1998 on Voodoo cards in GLQuake, and I'd already decided that it needs to be switched off because it was utterly ruining gameplay.

Theoretically and practically, you can get a game stuck at only 30 FPS when, with vsync off and the same settings, it would be higher. It's actually quite rare, however, to see it without trying to achieve it. It's more common to see these patterns of back and forth, with an average coming out much closer to the vsync-off frame rate, not because of triple buffering but because the game isn't really very consistent in the time it takes to render each frame. Typically people set game graphics to achieve a constant 60 FPS until they hit some heavy scenes, which often only last a few seconds and cause this back and forth as the heavy effects sometimes push past the boundary. Of course, if you are one of the rare people that aims for, say, 40 FPS all the time, and presumably 20-25 ish in a very heavy scene, then you'll get stuck at 33ms frames with vsync and "lose" 10 FPS on average.
Your use of the word "rare" is subjective in that it's an opinion. If I were to force vsync right now, it would cripple my gameplay experience across pretty much my entire 140+ game library, from mouse lag, a choppy framerate, or both. And my target for older games is 120 FPS.

Even if it's just instantaneous, the sudden drop to 30 FPS (or 33ms, to keep you happy) is extremely noticeable in input response when you're coming from a smooth 120 FPS (my usual target).

With a 60Hz LCD there just isn't much room to breathe; a 120 FPS target suddenly becomes 60 FPS, becoming 30 FPS whenever you hit 30-59 FPS. 59 FPS on its own is fine if it's a minimum, but 30 FPS is simply crippling, no matter how short it is.
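
To put rough numbers on that, here's a sketch assuming an idealised double-buffered vsync on a 60Hz display with steady frame times:

```python
import math

def vsync_fps(raw_fps, refresh_hz=60):
    """Each frame is held for a whole number of refreshes, so the effective
    rate is refresh / ceil(refresh / raw): always a divisor of the refresh."""
    return refresh_hz / math.ceil(refresh_hz / raw_fps)

for raw in (120, 75, 61, 60, 59, 45, 31, 30, 29):
    print(raw, "->", int(vsync_fps(raw)))
# 120 -> 60, 75 -> 60, 61 -> 60, 60 -> 60,
# 59 -> 30, 45 -> 30, 31 -> 30, 30 -> 30, 29 -> 20
```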

Triple buffering solves it, in the sense that you would get 40 FPS instead of 30 FPS in this scenario. But the frames will still be delivered to the GPU in discrete 16ms and 33ms steps; it will still show the inconsistent jumping around on the screen inherent to vsync, just at a better frame rate.
Yes, the frames are still presented in the same steps, but the renderer isn't stalling, so when they're presented you're getting more up-to-date data, and you're not impacting future frames either.

Also, the input response and animations aren't affected as badly (e.g. 48 FPS with triple buffering responds and looks far smoother than being stuck at 30 FPS without it).
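
A toy comparison of the two cases (deliberately simplified: it assumes a steady ~21ms render time on a 60Hz display and models triple buffering as simply letting the renderer run at its natural rate, capped at the refresh):

```python
import math

REFRESH_MS = 1000.0 / 60.0

def double_buffered_fps(render_ms):
    # Renderer stalls after each frame until the next refresh boundary.
    return 1000.0 / (math.ceil(render_ms / REFRESH_MS) * REFRESH_MS)

def triple_buffered_fps(render_ms):
    # Renderer keeps drawing into the spare back buffer, so it runs at its
    # natural rate, capped by the refresh rate.
    return min(1000.0 / render_ms, 1000.0 / REFRESH_MS)

print(round(double_buffered_fps(21.0), 1))   # 30.0 -- stuck on the 33ms step
print(round(triple_buffered_fps(21.0), 1))   # 47.6 -- the "40-50 FPS vsync'd" case
```

Individual frames still land on 16ms/33ms boundaries either way; the difference is how many unique frames the renderer manages to produce in the meantime.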

If it was a universal fix that always worked, then all the game devs would be doing it, as the memory requirement these days is minimal. But they don't, because in a lot of cases it's a worse experience than the double buffering that only shows a problem rarely rather than on every frame. Latency matters a lot, and it's already too high for some people on double buffering with vsync, let alone making it worse with another buffer.
I'm not necessarily saying to use vsync and/or triple buffering; I'm just explaining how they work. It's true that it adds input lag, but it's still much better than a double-buffered system hitting 30 FPS. If I'm forced to use vsync (e.g. for a rare compatibility issue), I go straight to enabling triple buffering.

So while triple buffering can help alleviate some of the drawbacks of double buffering, it's far from free. Please stop calling it a 30/60 problem; it almost never manifests in that way, especially considering what the average FPS represents. I largely agree that most people who run vsync are in scenario #2, but differently in that the only times the drops happen are in moments of intense action, when the player is busy doing things other than looking at the frame counter.
You don't have to look at any frame counters to notice the problem, especially if you're used to playing the twitch shooters from the late 90s/early 00s on CRTs. The frame counters/log files just objectively prove it's happening for those that don't want to believe.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
What kind of reply is that? You claimed that COJ went straight from 60 to 30 when looking at the FRAPS counter on screen, and it did NOT for me. I am not arguing about how vsync works; I am telling you, and even showed you, what the actual framerate counter on the screen displayed, which is what my argument was about. So again, stop twisting things.
So if you say "there are no viruses because I can't see them with my magnifying glass", do you think you're still correct when I can show them to you with a microscope?

Because that's exactly what's happening here.

Regardless of what your counter was showing, your log files were agreeing with me. If you don't want to accept this fact, then it's best you keep quiet about past arguments.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Let's break that down a second. FPS = frames per second. That means it is adding up how many frames are created in a second. Even if the frame times between frames are jumping from 16ms to 33ms, it does not matter; it is how many frames are created per second that determines your FPS.

If you want to talk about individual frames, don't use FPS; use frame times.
Please see my lengthy response to BrightCandle about this, thanks.