Black Octagon
Golden Member
- Dec 10, 2012
His latest keynote is really interesting (as always tbh).
I'd like to see it. Have you a link by any chance?
Man triple buffering is so cool. I never knew I could have 40-50 fps vsync'd.
Please stop calling it the 30/60 problem; it almost never manifests in that way
what kind of reply is that? you claimed that COJ went straight from 60 to 30 when looking at the FRAPS counter on screen, and it did NOT for me. I am not arguing how vsync works, but I am telling you, and even showed you, what the actual framerate counter on the screen showed, which was what my argument was about. so again, stop twisting things.

You don't have a "point", other than showing us that your testing methods were flawed.
Why do you choose to ignore the log files and the graphs? Is it simply because they show something different to the FPS counter, thereby proving you wrong? Or is it something else?
The way games are set up that's exactly how it manifests. Games do not flip between low detail and high detail every few ms, they're set up to give experiences on human timescales. This translates to areas with differences in scene complexity. You will hit regions that drop you down to 30FPS for a solid duration, and regions where you're at 60. It does not flip between 16 and 33ms frames to give you 45FPS -- if it did we wouldn't have any need for triple buffering, now would we?
No, you cannot have in-between frames on a double buffered vsync'd system. In practical terms, the red above is not possible in such a case.
You can have blue or green, or the sawtooth graph I posted earlier, but all frames are still a division of the refresh rate.
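To make "a division of the refresh rate" concrete, here's a minimal sketch (assuming a 60 Hz display; the numbers are mine, not from the thread). A frame held for n refreshes occupies n whole refresh intervals, so the only possible instantaneous rates are 60/n:

```python
# On a double-buffered vsync'd 60 Hz display, every frame stays on screen
# for a whole number of refresh intervals, so only 60, 30, 20, 15, ... FPS
# are possible instantaneous rates -- never 45.
REFRESH_MS = 1000 / 60   # ~16.67 ms per refresh at 60 Hz

for n in range(1, 5):
    print(f"{n} refresh(es) -> {n * REFRESH_MS:.1f} ms/frame = {60 / n:.0f} FPS")
```

Running it lists exactly the "blue or green" style steps: 16.7 ms (60 FPS), 33.3 ms (30 FPS), 50.0 ms (20 FPS), 66.7 ms (15 FPS).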
If your counter or log files are showing in-between scores, it's because the numbers average across one or more seconds. So if you're bouncing between 30FPS and 60FPS every 1/10th second (for example), the counter's going to show 45FPS across that second even though that's not really the case.
That's why you can't just run around at random and blindly use a FPS counter to check if the game is triple buffering; you have to set up the tests properly to ensure accurate results.
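The averaging effect described above can be sketched in a few lines (my own construction, assuming a 60 Hz display and the 1/10th-second bouncing from the example). Counting in whole refreshes keeps the arithmetic exact:

```python
# The game alternates every 100 ms between 60 FPS pacing (one frame per
# refresh) and 30 FPS pacing (one frame per two refreshes). A per-second
# FPS counter just counts frames over the second.
frames = 0
for chunk in range(10):        # ten 100 ms chunks, 6 refreshes each at 60 Hz
    if chunk % 2 == 0:
        frames += 6            # 60 FPS chunk: one frame per refresh
    else:
        frames += 3            # 30 FPS chunk: one frame every 2 refreshes

print(frames)  # 45 -> the counter reads "45 FPS", yet every individual
               # frame lasted exactly 16.7 ms or 33.3 ms
```

So a reading of 45 on the counter is fully compatible with every single frame being a clean division of the refresh rate.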
FPS = frames / T, where T == 1 second (or 1000 ms if you want to be extra pedantic).

Stop calling it a 30/60 FPS problem, I beg you. FPS = frames per second; the period of averaging is right there in the name. It's important to talk about this at the frame-time level rather than using FPS equivalents, because the FPS framing confuses a lot of people who haven't spent much time with these other measures.
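One way to see why frame times and FPS equivalents diverge (a sketch with made-up frame times): the true average FPS over a run is frames divided by total time, which is a harmonic mean of the instantaneous rates, not the arithmetic mean of each frame's 1/frame_time:

```python
# Four frames alternating between 60 FPS pacing (16.7 ms) and 30 FPS
# pacing (33.3 ms). The two ways of "averaging FPS" disagree.
frame_times_ms = [16.7, 33.3, 16.7, 33.3]

# True average: frames / elapsed time (harmonic mean of the rates).
true_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

# Naive average: arithmetic mean of per-frame instantaneous FPS.
naive_fps = sum(1000.0 / ft for ft in frame_times_ms) / len(frame_times_ms)

print(round(true_fps))   # 40: four frames took ~100 ms
print(round(naive_fps))  # 45: the arithmetic mean overstates the rate
```

This is one reason frame-time traces are the honest measure: the "45" people quote for this pattern isn't even what a frames-over-time counter would deliver.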
Your use of the word "rare" is subjective in that it's an opinion. If I were to force vsync right now, it would cripple my gameplay experience in pretty much my entire 140+ game library, through mouse lag and/or a choppy framerate. And my target for older games is 120 FPS.

Theoretically and practically you can get a game to achieve only 30 FPS when vsync off at the same settings would be higher. It's actually quite rare, however, to see it without trying to achieve it. It's more common to see these back-and-forth patterns with an average coming out much closer to the vsync-off frame rate, not because of triple buffering but because the game isn't very consistent in the time it takes to render each frame. Typically people set a game's graphics to achieve a constant 60 FPS until they hit some heavy scene, which often lasts only a few seconds and causes this back and forth as the heavy effects push past the boundary. Of course, if you are one of the rare people who aims for, say, 40 FPS all the time (and presumably 20-25ish in a very heavy scene), then you'll get stuck at 33 ms frames with vsync and "lose" 10 FPS on average.
Yes, the frames are still presented in the same steps, but the renderer isn't stalling, so when they're presented you're getting more up-to-date data, and you're not impacting future frames either.

Triple buffering solves it, in the sense that you would get 40 FPS instead of 30 FPS in this scenario. But the frames will still be delivered to the screen in 16 and 33 ms discrete steps; it will still show the inconsistent jumping around inherent to vsync, just at a better frame rate.
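The 30-vs-40 FPS scenario can be sketched with a toy simulation (my own construction and numbers, not from the thread): a GPU that takes 25 ms per frame on a 60 Hz display. Times are in ticks of 1/12 ms so all the arithmetic is exact integers; a frame finishing exactly on a vblank is assumed to make that vblank:

```python
REFRESH = 200        # 16.67 ms vblank interval (60 Hz) in 1/12 ms ticks
RENDER = 300         # assumed 25 ms GPU render time per frame
LAST_VBLANK = 60     # vblanks in one second

def vblank_after(t):
    """Index of the first vblank at or after tick t."""
    return -(-t // REFRESH)          # ceiling division

# Double buffering + vsync: after finishing a frame the renderer must wait
# for the swap at the next vblank before it can start the next frame.
t, frames_double = 0, 0
while True:
    t += RENDER                      # draw the frame
    k = vblank_after(t)              # stall until the swap
    if k > LAST_VBLANK:
        break
    frames_double += 1
    t = k * REFRESH                  # rendering resumes at the vblank

# Triple buffering: the renderer draws back-to-back into a spare buffer
# and never stalls; each finished frame is shown at the following vblank.
t, frames_triple = 0, 0
while True:
    t += RENDER
    if vblank_after(t) > LAST_VBLANK:
        break
    frames_triple += 1

print(frames_double, frames_triple)  # 30 40
```

The double-buffered renderer settles into a 25 ms draw plus an 8.3 ms stall, locking to every other vblank (30 FPS); the triple-buffered one keeps drawing at 25 ms per frame and delivers 40, still in 16.7/33.3 ms presentation steps.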
I'm not necessarily saying to use vsync and/or triple buffering; I'm just explaining how they work. It's true that it adds input lag, but it's still much better than a double-buffered system hitting 30 FPS. If I'm forced to use vsync (e.g. for a rare compatibility issue), I go straight to enabling triple buffering.

If it were a universal fix that always worked, then all the game devs would be doing it, as the memory requirement these days is minimal. But they don't, because in a lot of cases it's a worse experience than double buffering, which only shows a problem rarely rather than on every frame. Latency matters a lot, and it's already too high for some people with double buffering and vsync, let alone making it worse with another buffer.
You don't have to look at any frame counters to notice the problem, especially if you're used to playing the twitch shooters from the late 90s/early 00s on CRTs. The frame counters/log files just objectively prove it's happening for those who don't want to believe.

So while triple buffering can help alleviate some of the drawbacks of double buffering, it's far from free. Please stop calling it the 30/60 problem; it almost never manifests in that way, especially considering what the average FPS represents. I largely agree that most people who run vsync are in scenario #2, but with the difference that the drops only happen in moments of intense action, when the player is busy doing other things than looking at the frame counter.
So if you say "there are no viruses because I can't see them with my magnifying glass", do you think you're still correct when I can show you them with a microscope?

what kind of reply is that? you claimed that COJ went straight from 60 to 30 when looking at FRAPS counter on screen and it did NOT for me. I am not arguing how vsync works but I am telling you and even showed you what the actual framerate counter on the screen showed which was what my argument was about. so again stop twisting things.
Please see my lengthy response to BrightCandle about this, thanks.

Let's break that down for a second. FPS = frames per second. That means it is adding up how many frames are created in a second. Even if the frame times are jumping between 16 ms and 33 ms, it does not matter; it is how many frames are created per second that determines your FPS.
If you want to talk about individual frames, don't use FPS; use frame times.
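That distinction can be shown with two made-up traces (my own numbers) that report the identical FPS while pacing very differently, which is exactly why per-frame analysis needs frame times:

```python
# Two one-second traces, both 50 frames, both "50 FPS" -- but one is
# perfectly paced and the other alternates between 10 ms and 30 ms frames.
steady = [20.0] * 50               # every frame 20 ms
jumpy  = [10.0, 30.0] * 25         # alternating 10 ms / 30 ms

for name, trace in (("steady", steady), ("jumpy", jumpy)):
    fps = len(trace) / (sum(trace) / 1000.0)
    print(f"{name}: FPS={fps:.0f}  worst frame={max(trace):.0f} ms")
# Both report 50 FPS; only the frame times expose the 30 ms hitches.
```

The FPS column is identical for both traces, so anyone arguing from the counter alone literally cannot see the stutter that the frame-time column makes obvious.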