nvidia adaptive vsync = amd equivalent?


GotNoRice

Senior member
Aug 14, 2000
329
5
81
Of course there is tearing with Adaptive VSync, because VSync isn't always enabled.

IMO, Adaptive VSync is more for people who would otherwise run with VSync disabled, rather than vice versa. People who run with VSync off generally want every last bit of performance (fps) with no compromises. Adaptive VSync gives you that, never sacrificing FPS for the sake of tear-free output, and only enabling VSync when you would basically be wasting FPS above your refresh rate anyway.

Another thing to consider is that not all monitors run at 60hz. I have a Samsung S27A950D that runs at 120hz. I really do want it as close to 120hz as possible at all times.

It would be cool if AMD got onboard with this, but I don't think it's worth it if you have to use 3rd party crapware like radeonpro or any of that horrendous lucid software.
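The policy described above is simple enough to sketch. This is my own minimal illustration (not NVIDIA's code): vsync engages only while the GPU keeps pace with the panel's refresh rate, which is why it also respects a 120 Hz panel instead of assuming 60.

```python
# Minimal sketch (my own illustration, not NVIDIA's implementation) of the
# adaptive vsync policy: vsync is on only while rendering keeps up with
# the display's refresh rate, whatever that refresh rate is.
def adaptive_vsync_on(frame_time_ms: float, refresh_hz: float) -> bool:
    """True when vsync should engage: the frame rendered at least as
    fast as one refresh interval of the panel."""
    return frame_time_ms <= 1000.0 / refresh_hz

print(adaptive_vsync_on(7.0, 120))   # ~143 fps on a 120 Hz panel -> True
print(adaptive_vsync_on(10.0, 120))  # 100 fps -> False (vsync drops out)
print(adaptive_vsync_on(10.0, 60))   # same GPU on a 60 Hz panel -> True
```

So the same 100 fps scene tears on a 120 Hz panel but syncs cleanly on a 60 Hz one, which is exactly why the refresh rate matters here.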
 

Arkadrel

Diamond Member
Oct 19, 2010
3,681
2
0
What's coming next!

http://www.radeonpro.info/en-US/Blog/

Thanks goes to grstanford!


I remember nvidia having some issues with this Adaptive vsync stuff for a few driver versions when it first came out. But just watching the video nvidia made about it makes it easy enough to see that the "idea" behind the concept is valid.

It's the best of both worlds.

You avoid the stuttering you'd get from keeping vsync on.
You avoid the screen tearing you'd get from keeping vsync off.

This way you get only a tiny bit of stuttering & screen tearing, a kind of best-of-both-worlds situation, and if it works perfectly and your PC is fast enough, it could potentially remove all stuttering/screen tearing.

AMD should make their version of this too I think.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
No, I wouldn't. In all my years of gaming with GPUs, I have been very satisfied with normal vsync and manually tuning graphics settings to achieve the optimal IQ/perf balance.

But I admit, it would be nice for the average Joe... but then again, the average Joe can't even figure out simple graphics options, let alone go to the control panel to enable adaptive vsync.

You could use it. We won't tell anyone. And neither will you. :)
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
No, I wouldn't. In all my years of gaming with GPUs, I have been very satisfied with normal vsync and manually tuning graphics settings to achieve the optimal IQ/perf balance.

But I admit, it would be nice for the average Joe... but then again, the average Joe can't even figure out simple graphics options, let alone go to the control panel to enable adaptive vsync.

There are many levels of understanding between you (guru of all things in gaming settings using standard Vsync) and the average Joe. You, were an average Joe yourself at one point and look at you now!
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Some of you guys are missing what I said I think. I don't use vsync and it has nothing to do with wanting performance. It's because vsync, any kind of vsync, has more input lag on the mouse movements. Triple buffering helps but DirectX doesn't allow the driver to force it like OpenGL.
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
Anyway I would love it if we got a-vsync from AMD too but like I said the locked-60fps RAGE thing sounds good to me. I mean it just makes sense, at least in theory. Even if RAGE wasn't that great of a game.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I was serious. But I see this was a returned 2nd account of a vacationed member. I remember the guy now. A real piece of work. Blackened, you called it. Apologies. But this does mean you keep track of posters very closely. hehe.

I removed all the garbage from this thread; you guys deserved to have the thread cleaned up. markyd (version 2, which was actually version 3) has been removed. -Admin DrPizza
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Anyway I would love it if we got a-vsync from AMD too but like I said the locked-60fps RAGE thing sounds good to me. I mean it just makes sense, at least in theory. Even if RAGE wasn't that great of a game.

I hate being locked to 60fps. I play with vsync off as I mentioned and when the fps is locked at 60, you get terrible screen tearing. When my FPS rises up above 70 or so, I get less tearing for whatever reason.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I hate being locked to 60fps. I play with vsync off as I mentioned and when the fps is locked at 60, you get terrible screen tearing. When my FPS rises up above 70 or so, I get less tearing for whatever reason.

Just curious what difference it makes for you to be locked at 60fps. 60fps is fast enough for any game on the planet, so another 10 or 20fps won't make a difference over 60. I'd rather play at 60 without tearing than 70-7000 with tearing.
 

Mr. Lennon

Diamond Member
Jul 2, 2004
3,492
1
81
Some of you guys are missing what I said I think. I don't use vsync and it has nothing to do with wanting performance. It's because vsync, any kind of vsync, has more input lag on the mouse movements. Triple buffering helps but DirectX doesn't allow the driver to force it like OpenGL.

You can force triple buffering in DirectX with radeonpro
 
Feb 19, 2009
10,457
10
76
I hate being locked to 60fps. I play with vsync off as I mentioned and when the fps is locked at 60, you get terrible screen tearing. When my FPS rises up above 70 or so, I get less tearing for whatever reason.

That's just bizarre.

59/60 fps vsync is meant to NOT produce tearing, and in my experience, it works for that really well.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
That's just bizarre.

59/60 fps vsync is meant to NOT produce tearing, and in my experience, it works for that really well.

Did you read? I never use vsync. It sucks cause of input lag.

When my FPS is above 70 I don't get screen tearing that much, 60 or below with vsync off (again I never use vsync ever!) I get tons of tearing.

Keysplayr...locked at 60 means I get constant screen tearing because, as I mentioned multiple times, vsync is never on my list of usable features due to the mouse lag. I didn't really notice how bad it was until I turned vsync off and had a WTF moment in the menus.

I wonder why it's taking the red team so long to implement these features officially.

DirectX does not allow driver level triple buffering.

Besides, even with triple buffering input lag is still prevalent. I notice it a ton compared to just turning vsync off.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
Did you read? I never use vsync. It sucks cause of input lag.

When my FPS is above 70 I don't get screen tearing that much, 60 or below with vsync off (again I never use vsync ever!) I get tons of tearing.

Keysplayr...locked at 60 means I get constant screen tearing because as I mentioned multiple times, vsync is never on my list of usable features due to the mouse lag. I didn't really notice how bad it was until I turned vsync off and had a WTF moment in the menus.



DirectX does not allow driver level triple buffering.

Besides, even with triple buffering input lag is still prevalent. I notice it a ton compared to just turning vsync off.

Without triple buffering and V-Sync on, I notice zero difference in input lag compared to having V-Sync off. Maybe it's your monitor?

Any game in particular? Or is it prevalent in all games for you?
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Without triple buffering and V-Sync on, I notice zero difference in input lag compared to having V-Sync off. Maybe it's your monitor?

Any game in particular? Or is it prevalent in all games for you?

All games and even with triple buffering on (Deus Ex HR for example) I still noticed the mouse lag. Triple buffering made it better but there still was a difference. It also happened with every monitor I've had. Like I said, I never realized until one day I turned vsync off to see what FPS I'd get and was thinking to myself "wow this is a lot better".

It's just me I guess lol
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
All games and even with triple buffering on (Deus Ex HR for example) I still noticed the mouse lag. Triple buffering made it better but there still was a difference. It also happened with every monitor I've had. Like I said, I never realized until one day I turned vsync off to see what FPS I'd get and was thinking to myself "wow this is a lot better".

It's just me I guess lol

I would like to figure out what the common denominator for people with this issue is, I have never had this issue whatsoever with vsync enabled. Mouse drivers? Specific mice? I don't know. Some people swear by it, my mouse is never ever less responsive with vsync.

Using a logitech G9x with the latest logitech setpoint drivers, really curious as to what the common denominator is.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
FWIW: I always turn vsync off because I find input lag to be a zillion times worse than a little bit of tearing. Besides, above 70fps I don't really notice any tearing.
All games and even with triple buffering on (Deus Ex HR for example) I still noticed the mouse lag. Triple buffering made it better but there still was a difference.

There are two definitions for triple buffering. One applies to OGL and the other to DX. Adaptive v-sync provides benefits in terms of power savings and smoothness relative to both.

  • Triple buffering solutions require more frame-buffer memory than double buffering, which can be a problem at high resolutions.

  • Triple buffering is an application choice (no driver override in DX) and is not frequently supported.

  • OGL triple buffering: The GPU renders frames as fast as it can (equivalent to v-sync off) and the most recently completed frame is displayed at the next v-sync. This means you get tear-free rendering, but entire frames are effectively dropped (never displayed), so smoothness is severely compromised and the effective time interval between successive displayed frames can vary by a factor of two. Measuring fps in this case will return the v-sync-off frame rate, which is meaningless when some frames are not displayed (can you be sure they were actually rendered?). To summarize - this implementation combines high power consumption and uneven motion sampling for a poor user experience.

  • DX triple buffering is the same as double buffering but with three back buffers which allows the GPU to render two frames before stalling for display to complete scanout of the oldest frame. The resulting behavior is the same as adaptive vsync (or regular double-buffered v-sync=on) for frame rates above 60Hz, so power and smoothness are ok. It's a different story when the frame rate drops below 60 though. Below 60Hz this solution will run faster than 30Hz (i.e. better than regular double buffered v-sync=on) because successive frames will display after either 1 or 2 v-blank intervals. This results in better average frame rates, but the samples are uneven and smoothness is compromised.

  • Adaptive vsync is smooth below 60Hz (even samples) and uses less power above 60Hz.

  • Triple buffering adds 50% more latency to the rendering pipeline. This is particularly problematic below 60fps. Adaptive vsync adds no latency.
Clearly, things get complicated in a hurry depending on how the feature is implemented, but generally speaking, triple buffering has several disadvantages compared to adaptive vsync. One, the animation may not be as smooth. Two, there's substantially more lag between user inputs and screen updates. Three, it uses more video memory. And four, triple-buffering can't be enabled via the graphics driver control panel for DirectX games. Nvidia also contends that smart vsync is more power-efficient than the OpenGL version of triple buffering.
Source.
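The DX triple-buffering pacing described in that quote (frames displaying after either 1 or 2 vblank intervals below 60Hz) can be reproduced with a toy simulation. This is my own sketch, not vendor code; the numbers assume a GPU needing 25 ms per frame (a "40 fps" GPU) on a 60 Hz display.

```python
import math

# Toy simulation (my own sketch, not vendor code) of the frame pacing
# described above: a GPU that needs 25 ms per frame on a 60 Hz display.
R = 1000.0 / 60.0   # refresh (vblank) interval, ms
T = 25.0            # render time per frame, ms

def display_times(triple_buffered: bool, frames: int = 6) -> list:
    """Vblank timestamps (ms) at which successive frames hit the screen."""
    shown = []
    render_done = 0.0   # when the current frame finishes rendering
    last_shown = 0.0    # vblank at which the previous frame was displayed
    for _ in range(frames):
        # Double buffering: the GPU stalls until the previous frame is
        # flipped; triple buffering lets it start the next frame at once.
        start = render_done if triple_buffered else max(render_done, last_shown)
        render_done = start + T
        # The frame appears at the first vblank at/after completion that is
        # also after the previously shown frame (FIFO presentation).
        earliest = max(render_done, last_shown + R)
        last_shown = math.ceil(earliest / R - 1e-9) * R
        shown.append(last_shown)
    return shown

double = display_times(False)   # even 33.3 ms steps -> steady 30 fps
tripled = display_times(True)   # alternating 16.7 / 33.3 ms -> uneven ~40 fps
```

Running it shows double buffering paces evenly at 30 fps, while triple buffering alternates between 1 and 2 vblank intervals: a higher average frame rate but the uneven sampling the quote complains about.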
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
I would like to figure out what the common denominator for people with this issue is, I have never had this issue whatsoever with vsync enabled. Mouse drivers? Specific mice? I don't know. Some people swear by it, my mouse is never ever less responsive with vsync.

Using a logitech G9x with the latest logitech setpoint drivers, really curious as to what the common denominator is.
You must be oblivious to it. Most games it does not bother me, but something is wrong with you if you can't ever tell the difference. :eek:
 

blackened23

Diamond Member
Jul 26, 2011
8,548
2
0
You must be oblivious to it. Most games it does not bother me, but something is wrong with you if you can't ever tell the difference. :eek:

I play a lot of multiplayer FPS games, and input lag is something I have experienced on other computers at LAN events but never on my own. It is immediately apparent because precision is mandatory in some of the games I play, yet I have never had an issue with vsync on or off. So I'm curious about what the common denominators are.

The only game that exhibits this behaviour for me is Dead Space 1 on PC, but the mouse code in that game is tied to framerate (and the 30 fps console cap).
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
I would like to figure out what the common denominator for people with this issue is, I have never had this issue whatsoever with vsync enabled. Mouse drivers? Specific mice? I don't know. Some people swear by it, my mouse is never ever less responsive with vsync.

Using a logitech G9x with the latest logitech setpoint drivers, really curious as to what the common denominator is.

I'm betting on the particular monitor a person uses. I have been hearing that the Catleap 2560x1440 monitors suffer from this. People with TN panels usually don't suffer from input lag as much as, say, a user with an IPS panel.
Just want to clear up that this is what I have "heard" or "read" in various sources I can't remember. I don't know if this is true, as I have only had TN panels because they are best for gaming. And some TN panels are inferior to others as well.
 

psolord

Platinum Member
Sep 16, 2009
2,125
1,256
136
What's coming next!

http://www.radeonpro.info/en-US/Blog/



Thanks goes to grstanford!

Big thanks for that mate. :thumbsup:

This will be thoroughly tested on my 5850 crossfire system and if it works right, it will increase my chances of getting a 7950 instead of a 670, although the Gigabyte Windforce 2X just came to my country and the price difference is now only 25 euros, which will make it a tough choice.

On the contrary, I see my fps all the time if I want, with Fraps and other similar software.

I also tweak settings in games that I play a lot, so I know exactly what features to disable to gain the maximum performance at the minimum loss of IQ. Features such as dropping ambient occlusion from HBAO/HDAO to SSAO - you can hardly notice the difference in the scene but the perf gains are big. Or particles from max to high, etc. I have fine control over which features to lower while maintaining high fps in games I frequently play.

As for adaptive sync lowering IQ in scenes that stress the GPU... what is it doing? Is it lowering FoV, draw distance, LoD, lighting, shadows, etc.? I'd rather be in control of how my GPU renders a scene, and not let the GPU automatically adjust rendering quality.

Just to be clear I never said or implied that adaptive vsync does anything to the quality settings. I only said that without adaptive vsync, it's the user himself that has to lower settings (as you do) in order to keep 60fps.

I have no knowledge of adaptive vsync altering game settings in any degree, but it would be interesting if I could learn the specifics. Please share.

Actually I did the same thing while playing Risen 2 on my single 570, before the SLI support came along. I found out that by lowering shadows I could keep 50-55fps outdoors, which was quite playable, but that was with adaptive vsync as well. If I didn't have the adaptive vsync option, I would be forced to have dips to 30fps, which for my gaming experience is disastrous. On the other hand, it was keeping a solid 60fps in most indoor settings.

All in all, adaptive vsync is a setting that provides an overall solution that provides good results for all games, whether they be Risen 2 heavy or Torchlight errmmm....light!
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
All games and even with triple buffering on (Deus Ex HR for example) I still noticed the mouse lag. Triple buffering made it better but there still was a difference. It also happened with every monitor I've had. Like I said, I never realized until one day I turned vsync off to see what FPS I'd get and was thinking to myself "wow this is a lot better".

It's just me I guess lol

You need to buy a 120hz monitor with very low input lag (they do exist, e.g. I have an LG W2363D and it's great).
 

Ben90

Platinum Member
Jun 14, 2009
2,866
3
0
With normal vsync the framerate drops in whole divisors of your refresh rate, so you go from 60 (assuming 60Hz) to 30, then 20, then 15. So...yeah, 15fps is not really playable.

With adaptive vsync the framerate drops depending on how fast you render, so you go from 60 (assuming 60Hz) to 59 to 58 to 57 to 56 to 55 to 54 to 53 to 52 to 51 to 50 to 49 to 48 to 47 to 46 to 45 to 44 to 43 to 42 to 41 to 40 to 39 to 38 to 37 to 36 to 35 to 34 to 33 to 32 to 31 to 30 to 29 to 28 to 27 to 26 to 25 to 24 to 23 to 22 to 21 to 20 to 19 to 18 to 17 to 16 to 15 to 14 to 13 to 12 to 11 to 10 to 9 to 8 to 7 to 6 to 5 to 4 to 3 to 2 to 1. So...yeah, 1fps is not really playable.
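The quantization Ben90 is describing (60 to 30 to 20 to 15) can be sketched in a few lines. My own illustration: under plain double-buffered vsync, each frame must occupy a whole number of refresh intervals, so the displayed rate snaps to refresh/1, refresh/2, refresh/3, and so on.

```python
import math

# Sketch (my own illustration) of double-buffered vsync quantization:
# each frame occupies a whole number of refresh intervals, so the
# displayed rate snaps to refresh/1, /2, /3...
def vsynced_fps(render_fps: float, refresh_hz: float = 60.0) -> float:
    """Effective frame rate with double-buffered vsync enabled."""
    vblanks_per_frame = math.ceil(refresh_hz / render_fps - 1e-9)
    return refresh_hz / vblanks_per_frame

print(vsynced_fps(59))   # just misses 60 Hz -> 30.0
print(vsynced_fps(45))   # -> 30.0
print(vsynced_fps(25))   # -> 20.0
print(vsynced_fps(16))   # -> 15.0
```

Note how unforgiving the cliff is: rendering at 59 fps gets displayed at 30. Adaptive vsync avoids the cliff by simply turning vsync off below the refresh rate.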
 

Pottuvoi

Senior member
Apr 16, 2012
416
2
81
With adaptive vsync the framerate drops depending on how fast you render, so you go from 60 (assuming 60Hz) to 59 to 58 to 57 to 56 to 55 to 54 to 53 to 52 to 51 to 50 to 49 to 48 to 47 to 46 to 45 to 44 to 43 to 42 to 41 to 40 to 39 to 38 to 37 to 36 to 35 to 34 to 33 to 32 to 31 to 30 to 29 to 28 to 27 to 26 to 25 to 24 to 23 to 22 to 21 to 20 to 19 to 18 to 17 to 16 to 15 to 14 to 13 to 12 to 11 to 10 to 9 to 8 to 7 to 6 to 5 to 4 to 3 to 2 to 1. So...yeah, 1fps is not really playable.
And you get 60 images a second with a varying amount of tearing.

For those who dislike tearing and additional latency, setting 'maximum pre-rendered frames = 1' should help a little.
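Back-of-envelope arithmetic (my own numbers, a simplified model that ignores scanout and display lag) for why shrinking the pre-rendered frame queue helps: each frame the CPU is allowed to queue ahead adds roughly one frame time of delay before your input reaches the screen.

```python
# Simplified model (my own sketch): each queued pre-rendered frame adds
# roughly one frame time of input latency on top of everything else.
def added_latency_ms(pre_rendered_frames: int, fps: float) -> float:
    """Extra latency contributed by CPU-side frame queuing alone."""
    return pre_rendered_frames * 1000.0 / fps

print(added_latency_ms(3, 60))  # a deep queue at 60 fps: 50.0 ms
print(added_latency_ms(1, 60))  # queue of one:           ~16.7 ms
```

That roughly 33 ms difference at 60 fps is in the range people report noticing with the mouse, which is consistent with the input-lag complaints earlier in the thread.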