If Vsync is disabled, the monitor's refresh rate is 60 Hz, and the frame rate is 45 fps, is tearing severity given by the following equation (100% = terrible tearing, 0% = no tearing)?
100%/x=60/45
In other words, if the frame rate is 45 fps, vsync is off, and your monitor's refresh rate is 60 Hz, will tearing be 3/4 as bad as it would be at 60 fps? Or is tearing just as bad at 15 fps as at 45 fps (60 Hz refresh, vsync off)? Or does tearing get worse when the frame rate fluctuates by a large amount?
I'm asking because I want a happy medium on tearing without the added input lag of Vsync. I can put up with limited tearing, a frame rate as low as 30 fps, and only ~15 ms of input lag (~13 ms from my monitor + ~2 ms from my mouse).
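To make the question concrete, here's a minimal sketch of where tear lines would land on screen for a given frame rate and refresh rate. It assumes perfectly uniform frame times and an instant buffer flip, which real games won't match; `tear_positions` is a hypothetical helper, not a measured tearing-severity formula.

```python
def tear_positions(fps, refresh_hz, screen_lines=1080, n_frames=6):
    """Scanout line at which each new frame's flip lands, vsync off.

    Assumes perfectly constant frame times and an idealized display
    that scans top to bottom at a uniform rate.
    """
    positions = []
    for i in range(n_frames):
        # How far through the current scanout (0.0 = top, 1.0 = bottom)
        # the i-th frame becomes ready.
        phase = (i * refresh_hz / fps) % 1.0
        positions.append(int(phase * screen_lines))
    return positions

# 45 fps on 60 Hz: frames arrive every 4/3 of a refresh, so the tear
# line cycles through a repeating set of screen positions.
print(tear_positions(45, 60))

# 60 fps on 60 Hz: every flip lands at the same scanout phase, so the
# tear line sits in one fixed spot instead of crawling around.
print(tear_positions(60, 60))
```

Under this toy model, mismatched rates don't change *how many* tears you see per second so much as *where* they land: at 45 fps the tear line wanders through the frame, while at exactly 60 fps it parks in one place.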