Need non-BS answer: tearing/stuttering/vsync and 120Hz monitors

Xenphor

Member
Sep 26, 2007
I would like the opinion of someone who is extremely sensitive to tearing with vsync off on traditional 60Hz monitors, and to stuttering with vsync on as well.

I've read many conflicting reports on how 120Hz monitors handle variable low frame rates (e.g. 30-50 fps) with vsync on and off. Many people claim that tearing is somehow reduced or even eliminated with vsync off, and that vsync/triple buffering performs better.

I know from my own tests that capping the frame rate of a game like Half-Life to, say, around 40 fps while at 60Hz (1280x1024) or 85Hz (1024x768) does absolutely nothing to eliminate tearing with vsync off. So why would 120Hz be better than 85Hz in this situation? Both are higher than 60Hz, yet at 85 I noticed no change. With vsync on at a 40 fps cap it's obviously going to be a disaster because it's not at my refresh rate. Now, I can see how 120Hz may help in this case, because 40 fps divides evenly into 120 and would presumably give a smooth picture. That is the one positive I can understand about 120Hz: it gives you additional points to sync at other than the standard 10, 20, 30, and 60. Still, there is a very large gap in frame times at 120Hz, so I don't see how stuttering would be eliminated with vsync on.
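To make the "extra sync points" idea concrete, here's a quick sketch (Python, just arithmetic, nothing game-specific) that lists the frame rates dividing evenly into a refresh rate; with double-buffered vsync these are the only rates a game can hold with perfectly even frame pacing:

```python
# Frame rates that divide evenly into a refresh rate: with double-buffered
# vsync these are the only rates a game can hold with perfectly even pacing.
def even_sync_rates(refresh_hz, min_fps=10):
    return [refresh_hz // n for n in range(1, refresh_hz // min_fps + 1)
            if refresh_hz % n == 0]

print(even_sync_rates(60))   # [60, 30, 20, 15, 12, 10]
print(even_sync_rates(120))  # [120, 60, 40, 30, 24, 20, 15, 12, 10]
```

Note that 120Hz adds 40 and 24 fps as clean sync points, which 60Hz doesn't have.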

So, if there is anyone who is extremely sensitive to tearing: has 120Hz done anything to stop it with vsync off, specifically at low frame rates? I'm interested in low-frame-rate performance because I really don't think anyone is going to run Star Wars 1313 or Watch Dogs at 120 fps right out of the gate.
 

BrightCandle

Diamond Member
Mar 15, 2007
If vsync is off you will see tearing regularly. The further your frame rate is from the refresh rate of the monitor, the more you will see it. There are certain harmonics where the frame rate divides evenly into the refresh rate and you won't see tearing, but a change of a single fps will break the alignment, so they don't really come into play all that often unless you are capping well below the capabilities of your computer.
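That harmonic effect can be sketched with a toy model (Python; it only tracks scanout phase and ignores the vertical blanking interval): with vsync off, frame i finishes at time i/fps and the flip lands at that point in the scanout, which is where the tear appears. The tear line only stays put when the rates divide evenly:

```python
# Toy model of vsync-off tearing: frame i finishes at time i/fps; its flip
# lands at scanout phase (i/fps * refresh_hz) mod 1, where 0 = top of screen
# and values approaching 1 = bottom. A drifting phase means a crawling tear.
def tear_positions(fps, refresh_hz, n_frames=6):
    return [round((i * refresh_hz / fps) % 1.0, 2) for i in range(1, n_frames + 1)]

print(tear_positions(30, 60))  # flips land at the same phase every time
print(tear_positions(40, 60))  # alternates between two fixed positions
print(tear_positions(58, 60))  # phase drifts: the tear crawls across the screen
```

In this idealised model even a 40 fps cap at 60Hz still tears (the line just sits at two fixed spots rather than crawling), which matches the Half-Life capping test above: only an exact divisor keeps the flip at a single phase.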

With vsync on, any missed frame costs double or more of the original frame time. With 60Hz that is 16 -> 33 ms, and with 120Hz it's 8 -> 16 ms. Most humans perceive the first transition if it happens a lot, especially in a cyclic fashion as with microstutter. The second transition of 8 -> 16 is far less noticeable. This effectively halves the stuttering if your machine is capable of pushing 120 fps. Even if your rig normally only pushes 60 fps, on a 120Hz monitor the next frame time up from 16 ms is not 33 but about 25 (three refresh intervals), which is noticeably less jarring since it's only 50% longer.
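Those step-ups follow from vsync rounding each frame up to a whole number of refresh intervals; a minimal sketch:

```python
import math

# With vsync on (double buffered), a frame that misses a refresh waits for
# the next one, so the displayed frame time is the render time rounded up
# to a whole number of refresh intervals.
def displayed_ms(render_ms, refresh_hz):
    interval = 1000.0 / refresh_hz
    return math.ceil(render_ms / interval) * interval

print(round(displayed_ms(17, 60), 1))   # 33.3 -> the 16 -> 33 ms jump
print(round(displayed_ms(17, 120), 1))  # 25.0 -> only a 50% longer frame
print(round(displayed_ms(16, 120), 1))  # 16.7 -> made it in two intervals
```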

If you don't like tearing then you need to turn vsync on. In doing so you also want the game to run at 60 fps pretty reliably, especially if the drop to 33 ms is not enough to fool you into perceiving motion (it's not sufficient for me).
 

Xenphor

Member
Sep 26, 2007
Ok, if I understand correctly then, with vsync off I will see tearing no matter what the refresh rate is. That is what I thought.

Now with vsync on, I believe what you said basically means that the stuttering you get when a frame is missed might be less perceptible at 120Hz. Basically it seems like it would give the effect of triple buffering but without the input lag/added memory, etc.

In a hypothetical situation where a new game comes out that is particularly demanding, what would be the best option to attain smooth, consistent motion on a 120Hz monitor? If the game is hovering, say, in the 20-40 fps range and you set vsync on, will that look any better than vsync on with a 60Hz monitor? From my experience on a 60Hz monitor, you get stuttering as the game goes above and below 30 fps, since that is half your refresh rate. You could try to cap the frame rate at 30, but there would still be stuttering below 30 fps, and possibly problems from forcing a capped rate. How would a 120Hz monitor deal with this?

Of course, as you said, maintaining a solid vsync at the refresh rate is the best option, but at 120Hz that is far more problematic because of the hardware requirements to achieve 120 fps. So would there be any advantage to vsyncing at 60 fps on a 120Hz monitor compared to syncing at 60 fps on a 60Hz monitor? Or would you just end up running in 60Hz mode on a 120Hz monitor if you can't maintain 120 fps?
 

BrightCandle

Diamond Member
Mar 15, 2007
Triple buffering is just different. With triple buffering there is always a middle buffer between the graphics card's currently rendered frame and the frame being sent to the monitor. The GPU can switch in its frame whenever it likes and start on the next one, while the monitor can switch frames whenever it has just finished the last one, so both only ever swap in complete images. It also lets the GPU get to work and keep working at peak speed, ensuring the best frame rate possible at the time. Both run at their own optimal speed without needing to synchronise with each other, but at worst it introduces one additional frame of latency. It doesn't always introduce a lot of latency; with perfectly matched synchronisation between the two you could have no more latency than with vsync (which is at worst one frame as well).

Double buffering is what we are using when we talk about vsync on or off. There is only the graphics frame and the monitor's current display frame, with no intermediate third frame. The difference between vsync on and off is simply when it's acceptable to swap which of the frames the monitor can display. With vsync on the GPU must wait for the monitor to finish before swapping; with vsync off it can swap the image as soon as it's ready. Forcing a new image to be displayed regardless of where the monitor is in the process of drawing the frame causes the tear in the image, because it literally changed while the poor monitor was drawing it.

So once you know how the three technologies work and combine that with the refresh interval of each monitor:
120Hz - 8.3 ms
60Hz - 16.6 ms

you can start to reason about how the different monitors behave.

So let's say the game takes 17 ms to render a frame; in theory we should see ~58 fps.

At 60 Hz with vsync off
- every frame will tear and the tear will progress from the bottom of the screen up to the top in 58 steps.

At 60Hz with vsync on
- The screen will actually only be updated every 33.3 ms, and we'll see an effective frame rate of 30 fps.

If the GPU uses prerendered frames then we should see 56 frames of 16 ms and 2 frames of 33 ms, occurring at the very beginning of the second and about halfway through. In effect we are getting something like triple buffering, but with at least 16 ms of extra lag.

At 60Hz with triple buffering
- We should see 56 frames of 16 ms and 2 frames of 33 ms, occurring at the very beginning of the second and about halfway through, producing 58 fps.

Now with 120Hz in the same scenario it's a bit more complicated to calculate, but it'll look something like this:

120Hz Vsync off
- Every frame is getting drawn twice, so around 16 ms like with 60Hz, but there is now an interplay between the 17 ms cycle of the GPU rendering and the 8.3 ms scanout of the monitor that makes the tears appear more randomly across the screen. 58 fps rendered.

120Hz Vsync on
- The first frame takes 3 screen refreshes, so about 25 ms, as do subsequent ones, resulting in a frame rate of around 40 fps.

Again if the GPU is rendering ahead then you'll see additional frames of latency but higher FPS.

120Hz with triple buffering
- It's a more complicated mix of frame times, where many of them are 16 ms and a few are 25 ms, producing a steady 58 fps.

So you can see that in certain common circumstances 120Hz monitors with vsync have significantly better behaviour than a 60Hz monitor, even with a game running at 58 fps. When we drop below a threshold (60 or 30 fps) there is a clear advantage to having a smaller wait until the next vsync, which plays out as higher frame rates and lower latency to screen.
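The scenario numbers above can be reproduced with a toy simulation (Python; a sketch that ignores scanout time and render-ahead queues, and just counts frames shown in one second):

```python
import math

# Toy simulation of the 17 ms render-time scenario.
def fps_double_buffer(render_ms, refresh_hz):
    # The GPU cannot start the next frame until its buffer flips at a vblank.
    interval = 1000.0 / refresh_hz
    t, shown = 0.0, 0
    while t < 1000.0:
        t += render_ms                          # render the frame
        t = interval * math.ceil(t / interval)  # wait for the next vblank
        shown += 1
    return shown

def fps_triple_buffer(render_ms, refresh_hz):
    # The GPU renders back to back into the spare buffer; the monitor shows
    # the newest complete frame at each vblank, repeating one when none is new.
    rendered = int(1000.0 // render_ms)
    return min(rendered, refresh_hz)

print(fps_double_buffer(17, 60))   # 30  (60Hz, vsync on)
print(fps_double_buffer(17, 120))  # 40  (120Hz, vsync on)
print(fps_triple_buffer(17, 60))   # 58  (triple buffering, either rate)
```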
 

Xenphor

Member
Sep 26, 2007
Hm, well I was mainly using triple buffering as a visual example of how vsync performance might improve at 120Hz, but now I'm somewhat confused. Is there a perceptible difference between double-buffered vsync locked and triple-buffered vsync locked? From what you said, a regular vsync at 60Hz only gives you 30 fps? This is the vsync that is most commonly provided in game options or through the Nvidia control panel? I have to say that I notice no difference when locked at 60 fps between regular vsync and triple-buffered vsync at 60Hz. For me, the benefits of triple buffering only become apparent when the frame rate drops below the refresh rate. Of course, there are different ways of implementing triple buffering, I think, and most times I have to force it through D3DOverrider or, for OpenGL games, the control panel. And I don't even believe D3D implements true triple buffering, but a flip queue instead.

One other question I had: how is it that console games experience no tearing? There are exceptions (BioShock) where tearing is explicitly allowed for performance's sake, but for the most part the image is smooth and consistent, albeit sometimes at a low frame rate. How is it that console games can handle variable frame rates more gracefully than PC games? I could play an N64 game where the frame rate is consistently below 30 fps and not see any tearing or stuttering, just slowdown. Older consoles like the SNES or NES never had an issue with this as far as I can recall.
 

BrightCandle

Diamond Member
Mar 15, 2007
Read it again; the answer to triple versus double buffering is in there. What I said is that if a frame takes 17 ms to render, then with double buffering it's 30 fps and with triple it's 58 fps. That is absolutely in line with your expectation; I just chose an example that shows the extreme worst case.

Console games simply always use vsync, and because the hardware is well known up front they can maintain much more consistent frame times.
 

Xenphor

Member
Sep 26, 2007
If consoles use vsync, then how are they able to avoid input lag? I mean, maybe some have it, but I'd be hard pressed to say that a game like Super Mario has input lag.