Anteaus

Platinum Member
Oct 28, 2010
2,448
4
81
Check your driver settings to see if adaptive v-sync is enabled, and if so, disable it by changing the setting to "on" instead of "adaptive." Hopefully this will fix your problem.

Adaptive v-sync is a recent feature that is meant to detect whether your system needs v-sync at any given moment and enable or disable it accordingly. The problem is that it rarely gets it right on some systems, and sometimes it will toggle mid-game, causing all sorts of weird performance issues.
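The decision rule is basically the following (my own simplified sketch of the idea, not Nvidia's actual driver code):

```python
# Simplified sketch of the adaptive v-sync idea. Not Nvidia's actual code,
# just the decision rule as I understand it.
REFRESH_HZ = 60  # assume a 60 Hz monitor

def vsync_on(current_fps):
    # At or above the refresh rate: sync to avoid tearing.
    # Below it: stop syncing so the framerate doesn't get halved to 30.
    return current_fps >= REFRESH_HZ

for fps in (75, 61, 59, 45):
    state = "v-sync on" if vsync_on(fps) else "v-sync off (tearing possible)"
    print(f"{fps} fps -> {state}")
```

The trouble is exactly what I described above: that toggle can flip back and forth while you're playing.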
 

Madia

Senior member
May 2, 2006
487
1
0
Sorry to go a bit off topic, but what does the term "hitching" refer to? I've seen it used a couple of times but don't know the definition.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
How are you forcing "triple buffering"? That is clearly the first question, because DirectX doesn't provide this option and neither do the games. The triple buffering setting in the drivers only affects OpenGL games.

It may be that all you are seeing is the impact of the double-buffering 16/33 problem with DirectX games, and you are used to the smooth, fluid gameplay of triple buffering in OpenGL. In the past few years almost all games have been DirectX, whereas a couple of years ago there were still some OpenGL games in the mix.
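To put rough numbers on the 16/33 behaviour: with vsync and only two buffers, a frame that misses a vblank has to wait for the next one, so frame-to-frame gaps snap to multiples of the refresh interval. A quick sketch of that timing (just an illustration with made-up render times, not anything from a real driver):

```python
# Rough timing model of double-buffered vsync at 60 Hz (the "16/33" effect).
# Not driver code; the render times below are made up to sit near the refresh interval.

REFRESH = 1000.0 / 60.0  # ~16.7 ms between vblanks

def scanout_times(render_ms):
    """With vsync and two buffers the GPU must wait for the next vblank to swap,
    and it can't start the following frame until that swap happens."""
    shown, t = [], 0.0
    for r in render_ms:
        finished = t + r
        vblank = -(-finished // REFRESH) * REFRESH  # first vblank at/after finish
        shown.append(vblank)
        t = vblank
    return shown

times = scanout_times([14, 20, 14, 20, 14, 20])
print([round(b - a, 1) for a, b in zip(times, times[1:])])
# -> gaps alternate between ~16.7 and ~33.3 ms even though the GPU work is fairly steady
```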
 

Anteaus

Platinum Member
Oct 28, 2010
2,448
4
81
I don't use Adaptive V-Sync. I think it's a waste of time, as Triple Buffering should make it completely redundant.

Adaptive vsync has absolutely nothing to do with triple buffering. Adaptive vsync is a new feature that lets the GPU driver decide whether vsync should be used at any particular moment, engaging or disengaging it accordingly. My point was that if you had adaptive vsync enabled in the driver, it can cause exactly the type of problems you're describing, so if it was in fact enabled you should disable it and see how things work. It also comes enabled by default, so it's very possible that you installed a driver update not long ago, didn't check the settings, and have been using it without realizing.

If you know for a fact that you haven't been using adaptive vsync, then just disregard what I said.
 
Last edited:

pong lenis

Member
Apr 23, 2013
119
0
0
Adaptive V-Sync is just V-Sync with an automatic toggle. If the framerate is above the refresh rate, it enables V-Sync. If the framerate is below the refresh rate, it disables V-Sync. According to Nvidia's marketing drivel, the purpose of this was to prevent the massive framerate hit that V-Sync can cause when fps < refresh. But triple buffering already prevents that hit without ever having to disable V-Sync, so it is the superior solution, which is why I say it makes 'Adaptive V-Sync' redundant.

But as I said, I never use adaptive V-Sync so it's a non-issue. It's disabled in the Nvidia CP.

Actually, there are advantages and disadvantages to using adaptive V-sync. The advantage is that, unlike triple buffering, it does not require any extra VRAM and "might" relieve you of some of the frame stuttering problems associated with triple buffering (if there really are any), but the disadvantage is that whenever your framerate drops below 60 you will get screen tearing.
It's a matter of taste, but honestly the entire reason V-sync was invented was to prevent screen tearing at the expense of a few negative side effects; with this new adaptive V-sync feature, Nvidia is now saying the exact opposite: that you can avoid the extra VRAM use of triple buffering at the expense of screen tearing as a side effect, meaning they think screen tearing shouldn't annoy you. But if that is really the case, then why did they invent V-sync in the first place?
So you can clearly see that adaptive V-sync is just an Nvidia marketing scheme, which is why I only have respect for AMD GPUs.
 

ThinClient

Diamond Member
Jan 28, 2013
3,977
4
0
Actually, there are advantages and disadvantages to using adaptive V-sync. The advantage is that, unlike triple buffering, it does not require any extra VRAM and "might" relieve you of some of the frame stuttering problems associated with triple buffering (if there really are any), but the disadvantage is that whenever your framerate drops below 60 you will get screen tearing.
It's a matter of taste, but honestly the entire reason V-sync was invented was to prevent screen tearing at the expense of a few negative side effects; with this new adaptive V-sync feature, Nvidia is now saying the exact opposite: that you can avoid the extra VRAM use of triple buffering at the expense of screen tearing as a side effect, meaning they think screen tearing shouldn't annoy you. But if that is really the case, then why did they invent V-sync in the first place?
So you can clearly see that adaptive V-sync is just an Nvidia marketing scheme, which is why I only have respect for AMD GPUs.

Yep, that's right, disregard better price-per-performance and stable drivers and a history of kicking AMD's ass all because of a stupid petty marketing thing.

That's intelligent!
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
There is an inherent timing issue that triple buffering introduces. I personally had problems forcing triple buffering with D3DOverrider in modern games: it crashed games frequently, and sometimes it simply didn't apply (obvious from the crossfire stuttering that appeared when it wasn't there). But assuming that in this particular game you do have it working, there is another possibility I can think of, and I'll get to that.

Triple buffering introduces an odd timing offset. Let's take 45 fps as an example. At 45 fps each frame takes a little over 22 ms to produce, while the graphics card can only show it at ~16.7 ms intervals (60 Hz). So while the graphics card is never held up waiting on a buffer swap, it's also true that each frame is held back by a periodically varying amount, which may well be detectable.

For example, let's look at when each frame actually goes out (vblanks every ~16.7 ms):
1st frame - ready 22 ms - goes out 33 ms
2nd frame - ready 44 ms - goes out 50 ms
3rd frame - ready 67 ms - goes out 67 ms
4th frame - ready 89 ms - goes out 100 ms

So we can see that producing a frame every 22 ms results in it being delayed by a variable amount, from 0 ms up to roughly 11 ms here (in theory up to 16 ms), and the delay follows a particular repeating pattern. Now, 16 ms of latency between when a frame should be shown and when it is actually shown is significant; we know from microstutter tests done here that some people can see as little as 6 ms and the majority seem to notice 10 ms or more. 16 ms isn't a hitch, but a periodic animation delay like this could very well be noticeable. It's one of the reasons triple buffering isn't used in DX anymore, that and the additional latency.
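Here is that arithmetic spelled out (my own toy calculation, assuming a steady 22.2 ms per frame and a 16.7 ms refresh):

```python
# Toy calculation of the triple-buffering delay pattern: 45 fps on a 60 Hz display.
# The GPU is never blocked, so a new frame is ready every ~22.2 ms, but it can
# only be scanned out at the next ~16.7 ms vblank.

REFRESH = 1000.0 / 60.0   # ~16.7 ms
FRAME   = 1000.0 / 45.0   # ~22.2 ms

for n in range(1, 7):
    ready  = n * FRAME
    vblank = -(-ready // REFRESH) * REFRESH   # first vblank at or after "ready"
    print(f"frame {n}: ready {ready:6.1f} ms, shown {vblank:6.1f} ms, held back {vblank - ready:4.1f} ms")
# The hold-back cycles through ~11 ms, ~6 ms, 0 ms and repeats: a periodic
# animation offset rather than a one-off hitch.
```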

You used to be happy with triple buffering. I think two relatively recent differences may be exposing a problem you didn't previously see, pushing it past your noticeable threshold. But first a bit of background is necessary. We all seem to have different tolerances for latency and for mistimed animation (what a lot of people call stutter), and there seems to be an interplay between the two: if we increase the latency, we seem to have less tolerance for frame-time variance (I have confirmed this with a custom bit of software I haven't yet released). I don't understand the what and why of it yet, but as latency goes up we want higher frame rates. Partly it might be down to the decreased latency of a higher frame rate (less time on the GPU means less latency, which compensates for some of what was introduced), but I think there is more to it than that; I just can't prove it yet.

So here is my theory.

With Vista, and carried through to Windows 7, Microsoft introduced DX10 and a new graphics driver architecture (WDDM) to improve isolation, since GPU drivers were responsible for most crashes on XP. That new architecture added some latency with the context queue and the user-space driver; in theory it's up to 48 ms of additional latency for a 3-deep context queue compared to XP. Secondly, Nvidia removed the 0-length context queue option, so you pay a minimum of 16 ms. The combination of the additional latency in the modern OS, Nvidia's driver change, and the extra latency of triple buffering adds up to quite a lot more latency today than a few years ago. The inherent periodic delays caused by triple buffering below 60 fps, combined with that additional latency, could make the animation stutter quite obvious.
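Adding those figures up roughly (my own numbers, assuming a 3-deep context queue at 60 Hz and the worst-case triple-buffering hold-back):

```python
# Back-of-envelope latency estimate at 60 Hz using the figures above.
# Assumes a 3-deep context queue and the worst-case triple-buffering hold-back.
FRAME_MS = 1000 / 60                      # ~16.7 ms per refresh

context_queue_ms = 3 * FRAME_MS           # ~50 ms (the "48 ms" figure, with 16 ms frames)
min_queue_ms     = 1 * FRAME_MS           # driver no longer allows a 0-length queue
triple_buffer_ms = FRAME_MS               # a frame can sit up to one refresh before scanout

print(f"full context queue : ~{context_queue_ms:.0f} ms")
print(f"minimum queue      : ~{min_queue_ms:.0f} ms")
print(f"triple buffering   : up to ~{triple_buffer_ms:.0f} ms on top")
```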

But I would also like to see a Fraps frame-time trace to make sure you don't have another problem; the trace will at least show whether the game's frames are being produced correctly and whether the back pressure is working as it should to create an even animation.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
Take a trace in sub-60 fps and 60 fps scenarios, as well as when it's smooth at 120 Hz. Just one game will be fine; let's get an impression of whether what you are seeing is normal.
 

Dekk

Junior Member
Apr 24, 2013
2
0
0
Hi guys.
I just registered to tell you I've had the same problem for about a year.
I tried to find a fix for it, but all the threads I can find on this subject go unanswered.
http://hardforum.com/showthread.php?p=1039308104
http://forums.guru3d.com/showthread.php?t=347842
http://forums.guru3d.com/showthread.php?t=322685

This is the weirdest problem I've ever had with a computer.
I consider myself moderately knowledgeable about computers (I work in a small computer shop), but I can't do anything about this issue.

In my case the problem first appeared when I built my new computer: Intel i5 2500K,
ASRock P67 Pro3 mobo, 8GB Corsair DDR3. At first I used my old graphics card, an Asus HD4850, and all games played well even at 40 fps. Everything changed after I installed my new Radeon HD7850: games are perfectly smooth at 60 fps, or whatever refresh rate I choose... if I switch my monitor to 50 Hz then 50 fps is smooth, but 48 or 49 feels horrible.
It seems that games not only stutter but the animation actually slows down in some cases.
I actually prefer to play games with double-buffered vsync on and a forced 30 fps; it feels much better than the 50 fps average I manage otherwise.
Other observations: games are smooth at a fixed 40 or 50 fps only when there is an in-game option for such a limiter. When I use the Afterburner fps limiter I still get the hitching, but in Crysis 1, where I have the 50 Hz bug, the game is perfectly smooth at 50 fps.
When I turn vsync off I still have hitching above 60 fps, up to about 80-90.
If I manage more than 100 fps then games run OK.
I tried many different drivers and a second HD7850, but it made no difference.
I've also observed the problem on some other new computers I've built at work.

All of this feels pretty absurd, because it seems to affect many computers, but I don't see it discussed too often.

I just wanted to share my observations; maybe together we will find the fix. I hope so, because for now I can't enjoy most of my games.

Thanks for reading, and sorry for my English.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
UT2004

The Fraps data is certainly showing a problem, but it's not microstutter; you have periodic stuttering. Here for example is a graph of UT at 120 Hz with 36-60 fps:
[Graph: UT120Hz36-60fpsstuttering_zps27c44999.png]


Which clearly shows a periodic stutter occurring every second. A similar picture, with a little less microstutter, shows up for the 45-82 fps @ 120 Hz trace, so both of those have the same problem. The 120 @ 120 trace shows none of the stutter jumps at all, and neither does the 60 fps @ 60 Hz one. No doubt about it, that is a very bad gaming experience; the animation is going to be jumping around a lot.

For Crysis both traces look very similar, so here is the 120 Hz trace:
[Graph: Crysis120Hz_zps52013009.png]


So while it has a small amount of microstutter, it's well below the threshold where I would see the difference.

For UT I have no doubt there is a problem, and it's specific to the game. For Crysis it all looks healthy: there is no back-pressure issue causing the game animation to stutter, and the expectation is that the frame rate you are getting is normal. If there is a problem it's not showing up at the front of the pipeline; it may very well be at the end (the buffer swaps), however.
 
Last edited:

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
UT, Windows 7, 60 Hz, 54 fps average


This shows similar performance to the XP trace: periodic hitches are causing the lack of smoothness, but it's notable that there is also a slight increase in the variance of the baseline performance; it's microstuttering a little more, though not noticeably so. So that is consistent.

UT SSAA GPU bound


This is the worst microstutter trace I have ever seen. I almost can't believe it; it's as if you have the 30/60 problem of normal vsync, but with the added bonus that it's always alternating. This is twice the noticeable microstutter; I suspect even someone half blind could see the problem with this.

But it's a completely different problem from the other trace.

COD 100fps@120Hz shows clear signs of microstutter:


Microstutter in the 5 ms range and around 10 ms. Now, normally 5 ms would be below the threshold, but that threshold is for 60 Hz. I don't see much data at 120 Hz, but given it's an 8 ms frame I would say a 5 ms swing is likely just as bad for the animation as a 10 ms swing is at 16 ms, i.e. very noticeable. It's not a pretty trace.

I think it's fairly clear from all these graphs that triple buffering is breaking something important; the lack of native support is showing. You might be better off abandoning it and using double buffering while making sure the frame rate matches the monitor's refresh rate, because all of these traces are awful.
 

shortylickens

No Lifer
Jul 15, 2003
80,287
17,081
136
If nothing else works, download the latest drivers, uninstall old drivers, run Driver Cleaner, and start fresh.

That fixes all my problems if I haven't done it in a while.
 

shortylickens

No Lifer
Jul 15, 2003
80,287
17,081
136
I've been going one further and doing clean OS installs with most driver updates. My PCs have been going through so many hardware swaps that I haven't even bothered to activate Windows in the last 6 months - the eval period never gets a chance to expire.

Lulz, I remember those days.

I also remember how nice it was before you had to activate Windows.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
The fake triple buffering is really just another name for the render-ahead/context queue. Because there is a buffer ahead of the GPU's kernel driver, it ensures the GPUs always have work queued up and that the game running on the CPU isn't actually stuck waiting on the GPU to render, although in practice it is then just stuck waiting for space in the queue. What I think is happening is that when you set up a swap chain with 3 buffers in it, the already-presented image that is last in the queue can be replaced, so it has the effect of being triple buffered. The context queue is typically 3 deep.

I have never seen D3DOverrider's code so I can't really say how it works directly, but my guess is that it uses a Windows hook to intercept the DirectX API calls and change the swap chain description so it uses 3 buffers. Then, in DirectX's Present call, D3DOverrider calls onto the swap chain to make it switch buffers, with an appropriate modification that ensures the latest frame is always used and the next is switched in. Thus it has introduced triple buffering on top of what the game set up, which was 2 buffers with vsync enabled.

However, there is a bit of a problem. The game calls a DX function called Present, and this call will normally wait until there is space in the context queue for the new frame. This waiting is what games use to regulate their frame rate. If they had been designed for triple buffering to begin with, their inconsistent CPU production of frames would have been obvious from the outset. But because the game wasn't designed with triple buffering in mind (very few are, and none of the games this year have it at all), the self-regulation isn't there, and suddenly the game is running in a way it never expected, with no regulation from the GPU at all.
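To make that concrete, here is a toy model of a game paced purely by Present blocking on a bounded queue (entirely my own illustration - not DirectX and not D3DOverrider, just the back-pressure idea):

```python
# Toy model of a game paced by Present() blocking on a bounded context queue.
# Purely illustrative: not DirectX and not D3DOverrider.
import queue
import threading
import time

QUEUE_DEPTH = 3                          # typical render-ahead / context queue length
frames = queue.Queue(maxsize=QUEUE_DEPTH)

def gpu(frame_cost_s):
    """Pretend GPU: drains the queue at a fixed rate."""
    while True:
        if frames.get() is None:
            return
        time.sleep(frame_cost_s)

def game_loop(n_frames, cpu_cost_s, gpu_cost_s):
    worker = threading.Thread(target=gpu, args=(gpu_cost_s,))
    worker.start()
    last = time.perf_counter()
    for i in range(n_frames):
        time.sleep(cpu_cost_s)           # game simulation + issuing draw calls
        frames.put(i)                    # "Present": blocks once the queue is full,
                                         # and that blocking is what paces the game
        now = time.perf_counter()
        print(f"frame {i}: {1000 * (now - last):5.1f} ms since last Present")
        last = now
    frames.put(None)
    worker.join()

# The CPU could produce a frame every 5 ms, but the GPU takes 20 ms per frame,
# so once the queue fills the game settles to the GPU's ~20 ms pace.
game_loop(10, 0.005, 0.020)
```

Take away that blocking, which is effectively what forcing an extra buffer does, and the pacing the game relied on disappears.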

So while I agree with your sentiment that you hate tearing, you want smooth gameplay, and triple buffering ought to give you that, in practice DirectX only has simulated triple buffering, not real triple buffering. There are still only two buffers after the GPU as far as I know, at least for any DX10-and-above game.

Try adaptive vsync; it goes a long way towards solving these problems.
 
Last edited: