The possible solutions to microstutter?

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
So, I was thinking hard about the PCPer article on CrossFire and its runt frames. I made a post in that thread asking for an electrical engineer to help me patent something, but since I highly doubt that's going to happen, I'll just spill the beans on my idea, because it's eating me up not knowing whether it would work.

There are some extremely savvy people on these forums who could let me know if it's possible, so here goes.

The way to fix microstutter/stuttering is to change the way monitors work, not video cards. I think that if monitors could dynamically change their refresh rate at an extremely high rate to match exactly what FPS the GPU is outputting, we would do away with tearing/runt frames and also solve the input lag issues.

Does this seem possible?
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
I'm not talking about the same thing. I'm saying that if your GPU is putting out 55 fps at one point, the monitor senses this and sets its refresh rate to 55 Hz. The next instant the GPU is putting out, say, 70 fps, and the monitor adjusts to 70 Hz accordingly.
 

KompuKare

Golden Member
Jul 28, 2009
1,224
1,582
136
I'm not talking about the same thing. I'm saying that if your GPU is putting out 55 fps at one point, the monitor senses this and sets its refresh rate to 55 Hz. The next instant the GPU is putting out, say, 70 fps, and the monitor adjusts to 70 Hz accordingly.

Not sure if that's a good idea. Motion perception etc. is very individual and gradual: at one extreme you have epilepsy, through to the various things only some people can see. A few things I can think of are
  • CCFL backlight flicker perception
  • LCD viewing angles (TN) being perceived differently by each eye
  • TN (and even e-IPS) being perceived as flickering due to the dithering that 6-bit panels do to get more colour
  • the problems CRTs used to give people

All of this *might* be worse if the refresh rate is varied. But anyway, good of you to think outside the box, even if I don't think it would work.
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
Not sure if that's a good idea. Motion perception etc. is very individual and gradual: at one extreme you have epilepsy, through to the various things only some people can see. A few things I can think of are
  • CCFL backlight flicker perception
  • LCD viewing angles (TN) being perceived differently by each eye
  • TN (and even e-IPS) being perceived as flickering due to the dithering that 6-bit panels do to get more colour
  • the problems CRTs used to give people

All of this *might* be worse if the refresh rate is varied. But anyway, good of you to think outside the box, even if I don't think it would work.

That's the whole reason I made the post: to find out whether it's plausible. I would think the backlight flicker would remain constant, however, since it's independent of the refresh rate. Or perhaps they could give monitors a ridiculously high refresh rate like 240 Hz and then have the backlight flash to correspond with the frame rate the monitor is receiving, effectively making the monitor only "light up" whole frames.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
The way LCD monitors work, the refresh rate is totally artificial, designed purely to be backwards compatible with CRTs. What we actually need is an interface/monitor that is only sent a frame when it's done. That is, vsync all the way up to the monitor's peak rate, performed at the exact moment the frame is ready. So if you are getting 25 ms frames, the monitor is sent the frame not at the next 16 ms slot but straight away; it takes the usual ~16 ms to get there and is displayed as soon as possible. That way a lot of the issues with vsync today just vanish, making it far more viable.

To do it requires a change in the monitor interface, in the monitors, and also in the graphics cards to support the new interface. It's far from a trivial task, but it would help alleviate some of the problems associated with vsync input latency. No vsync will still have less latency, because you don't even wait until the frame is finished before you start sending it, but you could still use this with a monitor that had a variable refresh rate.

You remove much of the latency associated with double buffering, and you remove the quality issues of tearing with vsync totally off, which is a net win. I think such a simple change would be quite effective.

It wouldn't, however, solve most of the microstutter issues we have today; those are caused by problems in the graphics cards/drivers and the various games, and are pretty complicated. Some might have the problem go away with variable vsync, but many wouldn't show any real difference.
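
To put the timing into concrete terms, here is a rough sketch comparing when a frame becomes visible under fixed 60 Hz vsync versus a hypothetical send-it-when-it's-done interface. The numbers (25 ms render time, one refresh interval to scan a frame out, no buffer back-pressure) are illustrative assumptions, not measurements of any real hardware:

```python
# Rough timing sketch, not a model of any real display interface.
REFRESH = 1000.0 / 60.0   # ms per refresh slot on a 60 Hz panel
RENDER = 25.0             # assumed GPU render time per frame (ms)
TRANSMIT = REFRESH        # assumed time to scan a frame out to the panel (ms)

def fixed_vsync(n_frames):
    """Frame waits for the next refresh slot before it is sent (back-pressure ignored)."""
    times, done = [], 0.0
    for _ in range(n_frames):
        done += RENDER
        next_slot = (done // REFRESH + 1) * REFRESH   # wait for the slot boundary
        times.append(next_slot + TRANSMIT)            # visible once scanout completes
    return times

def send_when_ready(n_frames):
    """Frame is sent to the monitor the moment rendering finishes."""
    times, done = [], 0.0
    for _ in range(n_frames):
        done += RENDER
        times.append(done + TRANSMIT)
    return times

for fixed, ready in zip(fixed_vsync(5), send_when_ready(5)):
    print(f"fixed vsync: {fixed:6.1f} ms   when ready: {ready:6.1f} ms   saved: {fixed - ready:4.1f} ms")
```

With 25 ms frames the saving per frame varies from roughly zero up to most of a refresh interval, depending on how the render times happen to line up with the slot boundaries.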
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
That would not fix MS, as the problem is uneven frame delivery. Regardless of whether the monitor is displaying at 60 Hz, at 120 Hz, or in exact cadence with the drawn frames, 62 frames delivered at alternating 8 ms and 24 ms intervals (for example) is never going to look like 62 fps, but like something in between 62 and 31 fps.

You're confused on what the issue is, I think.

The "tearing" captures are just being used to try to get a finer measure of frame time that doesn't rely on fraps. Tearing is not perceived as a stutter at all.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
I would like to see the abnormally high latency spikes that happen with multi-GPU smoothed out!
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
That would not fix MS, as the problem is uneven frame delivery. Regardless of whether the monitor is displaying at 60 Hz, at 120 Hz, or in exact cadence with the drawn frames, 62 frames delivered at alternating 8 ms and 24 ms intervals (for example) is never going to look like 62 fps, but like something in between 62 and 31 fps.

You're confused on what the issue is, I think.

The "tearing" captures are just being used to try to get a finer measure of frame time that doesn't rely on FRAPS. Tearing is not perceived as a stutter at all.

Aren't the alternating ms numbers a result of the GPU rendering frames out of sync with the refresh rate, though? So if the monitor were always adjusting to accommodate the number of frames the GPU is putting out, you would not see the frame-time deviations from frame to frame.
 

aaksheytalwar

Diamond Member
Feb 17, 2012
3,389
0
76
I think VulgarDisplay has nailed it. It's like time dilation: if your plane of perception changes, everything becomes relative.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Aren't the alternating ms numbers a result of the GPU rendering frames out of sync with the refresh rate, though? So if the monitor were always adjusting to accommodate the number of frames the GPU is putting out, you would not see the frame-time deviations from frame to frame.

I doubt microstutter is caused by this, because otherwise we wouldn't see it with vsync off. Vsync off allows the card to run at its full speed, switch the buffers precisely when it's done, and continue. Vsync off as a technical choice is not the reason MS exists.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Aren't the alternating ms numbers a result of the GPU rendering frames out of sync with the refresh rate, though? So if the monitor were always adjusting to accommodate the number of frames the GPU is putting out, you would not see the frame-time deviations from frame to frame.

No, the alternating ms numbers are there because the cards are not producing frames smoothly. For whatever reason, instead of a steady cadence, they spit them out at odd intervals (either by accident, or because the vendor may want the card to report the maximum possible fps, even if it doesn't look like it).

If you can capture that faster-rendered frame and delay displaying it until the expected midpoint between the slower frames, you'd get rid of MS, but that's not a trivial task, and it might cause other perception issues, because the game state used to generate the frame would not really be in sync with when it is displayed; I'm not sure how you'd manage that (nvidia claims to do just this, somehow, directly on the card; that probably means delaying the start of the next frame's rendering and causing an overall fps hit). Alternatively, you could buffer frames and deliver them smoothly, but that would give you input lag.
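
Here is a minimal sketch of the "delay the fast frame" idea, with the obvious caveat that the real metering logic in the driver/hardware is not public; the pacing rule below (never present faster than the recent average render interval) is just one way to illustrate it:

```python
from collections import deque

def pace(finish_times, history=4):
    """Delay early frames so presentation intervals track the recent average render interval."""
    recent = deque(maxlen=history)     # last few GPU render intervals (ms)
    paced = []
    last_finish = last_present = None
    for t in finish_times:
        if last_finish is not None:
            recent.append(t - last_finish)
        if last_present is None or not recent:
            present = t                # first frame: show immediately
        else:
            target = sum(recent) / len(recent)
            # never show a frame before it exists, never faster than the pace target
            present = max(t, last_present + target)
        paced.append((t, present))
        last_finish, last_present = t, present
    return paced

# Alternating 8 ms / 24 ms GPU output, as in the example above
finish, t = [], 0.0
for i in range(10):
    t += 8.0 if i % 2 == 0 else 24.0
    finish.append(t)

for done, shown in pace(finish):
    print(f"rendered {done:6.1f} ms, shown {shown:6.1f} ms (held back {shown - done:4.1f} ms)")
```

With this input the presentation intervals settle at roughly 16 ms, at the cost of holding the fast frames back by up to about 10 ms, which is exactly the latency-versus-smoothness trade-off described above.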
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
I doubt microstutter is caused by this, because otherwise we wouldn't see it with vsync off. Vsync off allows the card to run at its full speed, switch the buffers precisely when it's done, and continue. Vsync off as a technical choice is not the reason MS exists.

The thing is, even with vsync off your monitor is still running at 60 Hz. In FRAPS, the frame time numbers are generated using 60 Hz as the measure of time, are they not?

I'm going to make a few pictures to try and illustrate what my understanding of this problem is. BBIAB.
 

parvadomus

Senior member
Dec 11, 2012
685
14
81
So, I was thinking hard about the PCPer article on CrossFire and its runt frames. I made a post in that thread asking for an electrical engineer to help me patent something, but since I highly doubt that's going to happen, I'll just spill the beans on my idea, because it's eating me up not knowing whether it would work.

There are some extremely savvy people on these forums who could let me know if it's possible, so here goes.

The way to fix microstutter/stuttering is to change the way monitors work, not video cards. I think that if monitors could dynamically change their refresh rate at an extremely high rate to match exactly what FPS the GPU is outputting, we would do away with tearing/runt frames and also solve the input lag issues.

Does this seem possible?

It's not the monitor's fault, it's the GPU's fault, and what you are saying won't work (mainly because at the monitor level there is no way to know how long a frame took to be rendered at the driver level, and thus it's impossible to delay it to the proper time).
The only way to deal with it is to add additional buffers to the GPU and sync them with swapbuffer calls at the API level; this would remove MS completely. I don't know why neither AMD nor NV has done this yet; obviously they were not much concerned with it until the latest reviews with better measurement techniques. However, this is not a real fix to the problem, as it would add some latency to user input.
Maybe we need an even more direct path from the driver to the GPU, bypassing the whole operating system. I would really like to know how GPU drivers are implemented nowadays, and how they interact with the OS!
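
For illustration, here is a toy version of that extra-buffer idea: finished frames are banked in a short queue and released at a fixed cadence. The queue depth and 60 Hz release interval are assumptions chosen to make the trade-off visible, not anything either vendor actually implements:

```python
def buffered_present(finish_times, queue_depth=2, interval=1000.0 / 60.0):
    """Hold frames in a short queue and show them at a fixed interval."""
    shown = []
    next_show = finish_times[queue_depth - 1]   # wait until the queue is primed
    for t in finish_times:
        show = max(next_show, t)                # a frame can't be shown before it exists
        shown.append((t, show))
        next_show = show + interval
    return shown

# Uneven 8 ms / 24 ms delivery again
finish, t = [], 0.0
for i in range(8):
    t += 8.0 if i % 2 == 0 else 24.0
    finish.append(t)

for done, show in buffered_present(finish):
    print(f"rendered {done:6.1f} ms, shown {show:6.1f} ms (added latency {show - done:4.1f} ms)")
```

Presentation becomes perfectly even, but every frame now reaches the screen roughly 17-27 ms after it was rendered, which is the extra input latency acknowledged above.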
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Additional buffers trade smoothness for latency, which becomes apparent as input lag. In fast games (exactly the kind that MS seems to plague), you want to minimize latency; increasing it would be counterproductive. It's not a good solution.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
Yes, but that depends on how much latency is added; e.g. 10 ms should be unnoticeable.

The added latency is going to be one refresh interval for each extra level of buffer depth you go. It's already non-zero, and more than 10 ms to begin with. Throwing another buffer into play will add an additional 16.67 ms of latency on a 60 Hz monitor on top of what already exists. Quite noticeable.
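
Putting rough numbers on that (60 Hz panel; the baseline figure is just an assumption for illustration):

```python
refresh_ms = 1000.0 / 60.0      # one refresh interval at 60 Hz, ~16.7 ms
baseline_ms = 20.0              # assumed existing display latency, illustrative only

for extra_buffers in range(3):
    total = baseline_ms + extra_buffers * refresh_ms
    print(f"{extra_buffers} extra buffer(s): ~{total:.1f} ms before the frame reaches the screen")
```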

By the way, you can already force triple buffering and vsync if you really don't care. (The result is similar to playing a console FPS online with the TV in "game mode" versus playing without it: you'll just be worse, with no real explanation why.)
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
In my simplistic view of how monitors work, I see each of the 60 refresh cycles per second as a window that opens up to display whatever the GPU has rendered. When you exceed, or fall below, one frame per opening is where you have problems.

60hz.jpg
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Why don't nVidia's examples show the slivers or runt frames?

Because their cards don't produce them?! I strongly suspect NVidia's frame metering tech is ensuring that NVidia's SLI produces a more even interval between the cards, at the cost of some minor frame rate, and thus corrects the problem before it starts. Presumably AMD's CrossFire has no such support, and hence its cards end up rendering very close together and produce a lot of useless frames separated by only slight differences in time. The game just sees DX; it doesn't necessarily know it's meant to be handing out frames on a nice regular basis, or the cards will eat through the rendering in parallel as fast as possible, with all the problems that causes.
 

Ferzerp

Diamond Member
Oct 12, 1999
6,438
107
106
In my simplistic view of how monitors work, I see each of the 60 refresh cycles per second as a window that opens up to display whatever the GPU has rendered. When you exceed, or fall below, one frame per opening is where you have problems.

60hz.jpg

You have drawn a perfectly synchronized 240 fps where every 4th frame would display....
 

VulgarDisplay

Diamond Member
Apr 3, 2009
6,188
2
76
You have drawn a perfectly synchronized 240 fps where every 4th frame would display....

I was pressed for time, so I just copied and pasted my previous work, lol. Either way, I think it illustrates why there are issues when your frame rate outruns your monitor's window to display a frame.