Adaptive Vsync sounds like a great idea, want on 7970


JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
Ah, one of those myths:
Combine vsync/TB with fps limit just below refresh rate FTW

This is from the RivaTuner creator:



http://forums.guru3d.com/showpost.php?p=4227080&postcount=184


I don't think he understands it either. 59 fps on a 60 Hz display means one repeated frame every second, which results in 58 frames at even 16.6 ms intervals and one frame at 33.3 ms, making the average feel like a frame every 16.95 ms, which is pretty friggen close to how 60 fps feels. What he's saying is that if you're limiting below the refresh rate, you obviously need a factor (not a multiple, like Mr. Alexey is saying) of the refresh rate to obtain perfect intervals. No shit, Sherlock.
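The arithmetic behind that 16.95 ms figure can be checked in a few lines (a sketch, assuming an ideal cap where exactly one frame per second repeats):

```python
refresh_hz = 60.0
capped_fps = 59

period_ms = 1000.0 / refresh_hz                 # one refresh = ~16.67 ms
# 59 unique frames spread over 60 refreshes: exactly one frame is shown twice
intervals = [period_ms] * (capped_fps - 1) + [2 * period_ms]
avg_ms = sum(intervals) / len(intervals)        # 1000 ms / 59 frames ~= 16.95 ms
```

So the single 33.3 ms hitch barely moves the average, which is the point being made about how close it feels to 60 fps.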

What Oubadah is saying in that post is perfectly valid: capping at 60 fps will give him even intervals at 120 Hz if his games aren't capable of sustaining 120 fps. But what he wants to do isn't really advantageous at all. If he can't render 120 fps, and say he can only sustain 80 fps, any frame that cannot be rendered in under 8.3 ms (120 Hz) will simply result in the previous frame repeating for another refresh and lasting 16.6 ms, which is the exact same perceived interval as at 60 fps. By capping at 60 he's really just choosing to lose information, because any additional frame that might have arrived within 8.3 ms now will not.
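The quantization being described there is just rounding up to whole refresh periods. A sketch (the render times are made up for illustration):

```python
import math

refresh_hz = 120
period_ms = 1000.0 / refresh_hz                 # ~8.33 ms per scanout

def display_time(render_ms):
    """With vsync, a frame stays on screen for a whole number of refresh
    periods: the smallest multiple that covers its render time."""
    return math.ceil(render_ms / period_ms) * period_ms

fast = display_time(7.0)    # fits inside one refresh  -> ~8.33 ms on screen
slow = display_time(12.5)   # misses the 8.3 ms window -> previous frame
                            # repeats, ~16.67 ms, same cadence as 60 fps
```

Any frame that beats the 8.3 ms deadline still gets through at 120 Hz cadence, which is exactly the information a 60 fps cap would throw away.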



There's a way I've found that reduces image lag greatly with vsync + triple buffering. It's a program called Dxtory. I set a fps limit of 74, just a digit under my refresh rate (being 75hz), turned on vsync+tb, and now I'm able to enjoy competitive games a LOT better! I recommend everyone who has a fiery hate for image lag AND tearing like me to give this method a try. It's like the equivalent of "max_fps" in Source games and "GameTime.MaxVariableFps" in BF3, except that it works pretty much universally, even in later Source games that had max_fps taken out. I can't believe Nvidia doesn't have this Dxtory feature! Adaptive vsync can go F itself! xD
It must be bliss to do this on a 120hz monitor. I'll know soon enough.
Maximum Pre-rendered Frames set to 0 might have something to do with the results too... not totally sure. I keep mine on 0 anyway. No problems.

They do have it. I didn't know this either until not too long ago, but there's a frame rate limiter within the nvidia drivers that can be turned on with nvidia inspector and now with evga precision x (frame rate target feature)
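All of these limiters (Dxtory, max_fps, the driver-level one exposed through Inspector / Precision X) boil down to the same idea: after each frame, sleep off whatever remains of the target interval. A rough sketch in Python; the function name and structure are mine, not any tool's actual API:

```python
import time

def run_capped(render_frame, target_fps, n_frames):
    """Minimal frame limiter: render, then sleep the remainder of the
    target interval so no more than target_fps frames are produced."""
    interval = 1.0 / target_fps
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < interval:
            time.sleep(interval - elapsed)
```

Real limiters are fancier about timer resolution and busy-waiting, but the effect is the same: the GPU never races ahead of the cap.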
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
There's a way I've found that reduces image lag greatly with vsync + triple buffering. It's a program called Dxtory. I set a fps limit of 74, just a digit under my refresh rate (being 75hz), turned on vsync+tb, and now I'm able to enjoy competitive games a LOT better! I recommend everyone who has a fiery hate for image lag AND tearing like me to give this method a try. It's like the equivalent of "max_fps" in Source games and "GameTime.MaxVariableFps" in BF3, except that it works pretty much universally, even in later Source games that had max_fps taken out. I can't believe Nvidia doesn't have this Dxtory feature! Adaptive vsync can go F itself! xD
It must be bliss to do this on a 120hz monitor. I'll know soon enough.
Maximum Pre-rendered Frames set to 0 might have something to do with the results too... not totally sure. I keep mine on 0 anyway. No problems.

1. What's wrong with adaptive?
2. Why in the world would you trick vsync into thinking your refresh rate was 1 frame per second less than it actually is? (That should, in fact, cause tearing.)
3. Why in the world would doing the above solve the issues with triple-buffered vsync?
 

Jodeth

Junior Member
Mar 29, 2012
2
0
0
1. What's wrong with adaptive?
2. Why in the world would you trick vsync into thinking your refresh rate was 1 frame per second less than it actually is? (That should, in fact, cause tearing.)
3. Why in the world would doing the above solve the issues with triple-buffered vsync?

1. Adaptive causes tearing when the fps is under the refresh rate and causes image/input lag when the fps is exactly at the refresh rate. It's a fluctuation between lag and no lag.
2. With vsync, tearing will never occur, no matter what the fps is. Tricking vsync into thinking the refresh rate is 1 frame under the real thing just magically reduces the lag... I don't understand why or how, but that's what it does.
3. That's what I don't understand.
 

JAG87

Diamond Member
Jan 3, 2006
3,921
3
76
1. Adaptive causes tearing when the fps is under the refresh rate and causes image/input lag when the fps is exactly at the refresh rate. It's a fluctuation between lag and no lag.
2. With vsync, tearing will never occur, no matter what the fps is. Tricking vsync into thinking the refresh rate is 1 frame under the real thing just magically reduces the lag... I don't understand why or how, but that's what it does.
3. That's what I don't understand.


http://www.anandtech.com/show/2794/2

If your game renders at 300 fps with vsync off, it still renders at 300 fps with vsync + triple buffering. You just don't see 300 because the front buffer is only fed 60 of those frames, but the problem is that 300 are still rendered and shoved into the back buffers, which means when you produce a change to the image, you may get old frames (rendered prior to your change) sent to your screen. Capping just below the refresh rate ensures that the two back buffers stay clean, because as soon as a frame is finished it is passed to the front buffer as the display calls for it. It also makes your card work less, use less power, and produce less heat.
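That queueing effect can be shown with a toy model. This assumes a FIFO flip queue two frames deep (how DirectX-style "triple buffering" often behaves, which is the disputed point in this thread); the function and numbers are purely illustrative:

```python
from collections import deque

def displayed_frames(rendered_per_refresh, refreshes):
    """Toy FIFO flip queue, two back buffers deep, under vsync.
    Returns which rendered frame IDs actually reach the screen."""
    queue = deque()
    shown = []
    frame_id = 0
    for _ in range(refreshes):
        # The GPU renders several frames per refresh; only two fit in
        # the queue, the rest are dropped on the floor.
        for _ in range(rendered_per_refresh):
            if len(queue) < 2:
                queue.append(frame_id)
            frame_id += 1
        shown.append(queue.popleft())   # scanout takes the oldest frame
    return shown
```

Uncapped (say 5 renders per refresh), the screen shows frames 0, 1, 5, 10, ... so by the third refresh you are seeing frame 5 while frame 14 has already been rendered: stale frames on screen. Capped at one render per refresh, the queue never backs up and the newest frame is always the one displayed.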
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
1. Adaptive causes tearing when the fps is under the refresh rate and causes image/input lag when the fps is exactly at the refresh rate. It's a fluctuation between lag and no lag.
No, it doesn't.
No form of vsync causes image lag when the FPS is at the refresh rate.
Double buffering - image lag and full-screen micro-stutter when falling below target FPS.
Triple buffering - worse image lag in exchange for reduced micro-stutter.
Adaptive - disables vsync as needed if FPS falls below target, causing tearing in order to avoid both the image lag and the micro-stutter.
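The adaptive policy described above is simple enough to state as code (a hypothetical one-liner, not NVIDIA's actual implementation, which works per-frame in the driver):

```python
def adaptive_vsync_on(fps, refresh_hz):
    """Toy model of adaptive vsync: keep vsync on while the GPU can
    match the refresh rate; drop it (accepting tearing) when it can't,
    rather than stuttering down to the next divisor (60 -> 30)."""
    return fps >= refresh_hz
```

The trade is explicit: below the refresh rate you get tearing instead of the double-buffered halving to 30 fps or the triple-buffered extra frame of lag.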

Your suggestion was to use no vsync at all, which is worse than adaptive in every possible way.

2. With vsync, tearing will never occur, no matter what the fps is. Tricking vsync into thinking the refresh rate is 1 frame under the real thing just magically reduces the lag... I don't understand why or how, but that's what it does.
3. That's what I don't understand.
It works about as well as real magic. That is, it doesn't work at all. What you speak is utter nonsense.

http://www.anandtech.com/show/2794/2

If your game renders at 300 FPS with vsync off, it still renders at 300 FPS with vsync + triple buffering. You just don't see 300 because the front buffer is only fed 60 of those frames, but the problem is that 300 are still rendered and shoved into the back buffers which means when you produce a change to the image

No, this is very wrong.
First, those extra frames are discarded; they do not produce any changes to the image.
Second, triple buffering WITH vsync renders only 60 FPS (on a 60 Hz monitor).
 