OK, I've been reading more about the issue at hand to get a better understanding of it, and I've made some changes. I decided to forgo half refresh altogether and instead see how normal vsync plus forced triple buffering would work. I didn't try this approach at first because there seems to be a lot of confusion on the internet about how triple buffering actually works.
From my understanding, triple buffering only really does anything in combination with vsync, and it basically adds a gradient of framerates the game can settle at, in between the usual vsync steps of the monitor's refresh rate (60, 30, 20, and so on). I've also read that triple buffering can somehow be used without vsync, but that seems false. Triple buffering can also add input lag and eats up some extra video memory. Adaptive vsync, on the other hand, seems to be the driver-level answer to the same problem that's supposed to work universally: it simply turns vsync off whenever the framerate drops below the refresh rate. Of course, all of this may be completely untrue.
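To make the "gradient" idea concrete, here's a tiny back-of-the-envelope sketch. It's just a standalone C++ toy, and the 60 Hz refresh and 20 ms render time are made-up example numbers rather than measurements from my system: with double-buffered vsync, a frame that misses the 16.7 ms window costs a whole extra refresh and you snap straight down to 30 FPS, while with triple buffering the card keeps rendering into the spare buffer, so the displayed rate lands somewhere in between.

```cpp
// Minimal frame-pacing sketch (not real graphics code): compares the effective
// framerate of double- vs. triple-buffered vsync for a given per-frame render
// time. The numbers below are illustrative assumptions, not measurements.
#include <cmath>
#include <cstdio>

int main() {
    const double refresh_ms = 1000.0 / 60.0;  // one vblank every ~16.7 ms at 60 Hz
    const double render_ms  = 20.0;           // assume the GPU needs 20 ms per frame

    // Double buffering + vsync: the renderer has to wait for the next vblank
    // before it can reuse the back buffer, so every frame ends up costing a
    // whole number of refresh intervals.
    double double_buffered_frame_ms = std::ceil(render_ms / refresh_ms) * refresh_ms;

    // Triple buffering + vsync: a spare back buffer means the renderer never
    // stalls, so new frames arrive every render_ms; the display still flips
    // only on vblanks, but on average it shows min(render rate, refresh rate).
    double triple_buffered_fps = std::fmin(1000.0 / render_ms, 1000.0 / refresh_ms);

    std::printf("double buffering: %.1f FPS\n", 1000.0 / double_buffered_frame_ms); // 30.0
    std::printf("triple buffering: %.1f FPS\n", triple_buffered_fps);               // 50.0
    return 0;
}
```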
Now, the huge problem with this approach is that it's completely non-standardized, for whatever reason. D3DOverrider seems to be the most popular way of forcing triple buffering, but doing it that way is prone to even more issues.
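For what it's worth, my understanding is that forcing triple buffering basically amounts to asking Direct3D for an extra back buffer while keeping the vsync interval at one. Below is a rough sketch of what a game would do itself in Direct3D 9 to get the effect that a tool like D3DOverrider forces from the outside by hooking the API. The function name and the omitted error handling are mine, and I can't promise this is exactly what the tool injects:

```cpp
// Sketch of a D3D9 device created with vsync plus triple buffering.
// Requires the Windows SDK and linking against d3d9.lib.
#include <windows.h>
#include <d3d9.h>

IDirect3DDevice9* create_triple_buffered_device(IDirect3D9* d3d, HWND hwnd) {
    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed             = TRUE;
    pp.SwapEffect           = D3DSWAPEFFECT_DISCARD;
    pp.BackBufferFormat     = D3DFMT_UNKNOWN;             // use the current display format
    pp.BackBufferCount      = 2;                          // 2 back buffers + front buffer = triple buffering
    pp.PresentationInterval = D3DPRESENT_INTERVAL_ONE;    // vsync: present at most once per refresh

    IDirect3DDevice9* device = nullptr;
    d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hwnd,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device;  // nullptr on failure; a real engine would check the HRESULT
}
```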
Regardless, I tried simply using vsync - from the control panel, in game, or through D3DOverrider - and combining it with triple buffering forced through D3DOverrider. With the same games, Metro 2033 and Crysis, I was actually somewhat impressed with the results. For the first time, I think I saw the benefits of triple buffering: I got the smoothness of vsync, with my FPS hovering near 60, and when I did hit slowdown there wasn't the terrible stuttering I'd get with normal double-buffered vsync. As I said before, it's like triple buffering adds a gradient or curve instead of hard steps.
Well, that's about it. There's no guarantee this method will work with any other game, so in the foreseeable future I'll probably be back at step one with a screen full of tearing. It's really unfathomable that this problem hasn't been solved yet.
