nvidia adaptive vsync = amd equivalent?

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
Nothing is more distracting than framerate stuttering and screen tearing. The first tends to occur when framerates are low, the second when framerates are high. Adaptive V-Sync is a smarter way to render frames. At high framerates, V-sync is enabled to eliminate tearing; at low framerates, it's disabled to minimize stuttering. It gets rid of distractions so you can get on with gaming.
Radeon owners, opinions?
 

BD231

Lifer
Feb 26, 2001
10,568
138
106
I thought adaptive turned out to be crap due to issues? Not too sure you want AMD giving it a go if nVidia can't get it right...
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
I thought adaptive turned out to be crap due to issues? Not too sure you want AMD giving it a go if nVidia can't get it right...
What do you mean "it turned out to be crap" ?

I am actually using it (306.2) to play Crysis 2, and I have yet to find a con. It's definitely better than with Vsync off.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Issues? It does what it's intended to do. It turns vsync on when your fps is running at your refresh rate, and when you drop below it, it turns vsync off. Simple as that. This avoids the situation where being synced to 60Hz means a framerate drop brings you down to 30 and you get a hiccup in gameplay. Screen tearing can still be noticed, but that happens with vsync off anyway. This is an in-between solution.
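
Conceptually, the per-frame check is about as simple as it gets. Something like this toy model (my guess at the shape of the logic, not NVIDIA's actual driver code; swap interval 1 = vsync on, 0 = off):

[code]
// Toy model of the adaptive V-sync decision, one frame at a time.
// Not NVIDIA's actual driver code, just the general idea.
#include <cstdio>

const double kRefreshHz = 60.0;

int ChooseSwapInterval(double lastFrameFps)
{
    // At or above the refresh rate, sync to kill tearing; below it,
    // unsync so the framerate can float instead of snapping to 30.
    return (lastFrameFps >= kRefreshHz) ? 1 : 0;
}

int main()
{
    const double samples[] = {75.0, 61.0, 59.0, 42.0};
    for (double fps : samples)
        std::printf("%5.1f fps -> swap interval %d\n", fps, ChooseSwapInterval(fps));
}
[/code]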

AMD has nothing similar that I know of.

FWIW: I always turn vsync off because I find input lag to be a zillion times worse than a little bit of tearing. Besides, above 70fps I don't really notice any tearing.
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
IMO, that's quite intelligent: when nothing much happens on screen, your GPU doesn't have to work as hard. So far, I am only finding pros while using it.

Just wondering if AMD is planning to add a similar feature to their drivers.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I haven't heard of anything. Probably because the use for this is minor. I mean, it still has input lag and still has screen tearing. So I can see where many people would just say "why?"
 

Magic Carpet

Diamond Member
Oct 2, 2011
3,477
233
106
I dunno, I've read a few reviews... and they were mostly positive. A nice feature.

I am certainly going to keep using it (subjectively, IQ has gone up), and if it helps me save a few watts here and there, why not indeed.
 
Last edited:

VirtualLarry

No Lifer
Aug 25, 2001
56,571
10,206
126
I had read somewhere that there was a little bit of hardware assist to this feature, so it may or may not be possible to be properly implemented on AMD hardware.
 

railven

Diamond Member
Mar 25, 2010
6,604
561
126
Issues? It does what it's intended to do. It turns vsync on when your fps is running at your refresh rate, and when you drop below it, it turns vsync off. Simple as that. This avoids the situation where being synced to 60Hz means a framerate drop brings you down to 30 and you get a hiccup in gameplay. Screen tearing can still be noticed, but that happens with vsync off anyway. This is an in-between solution.

AMD has nothing similar that I know of.

FWIW: I always turn vsync off because I find input lag to be a zillion times worse than a little bit of tearing. Besides, above 70fps I don't really notice any tearing.

I believe that sync drop is for games that don't support Triple Buffering. I use V-sync on since I hate tearing, and my system has no issues running at odd framerates such as 51 or 55 FPS. If I turn off Triple Buffering, then I suddenly go from 60 to 30 or sometimes even 45, haha.
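
Roughly, here's my simplified picture of why the spare buffer matters (a toy model assuming a 60Hz display and a fixed 18ms render time; not how any real driver is written):

[code]
// My simplification of why triple buffering avoids the 60 -> 30 snap:
// with a spare back buffer the GPU never stalls waiting for the flip.
// Assumes a 60 Hz display and a fixed 18 ms render time (just over budget).
#include <cstdio>

int main()
{
    const double refreshMs = 1000.0 / 60.0;  // 16.67 ms between vblanks
    const double renderMs  = 18.0;           // GPU just misses one interval

    // Double buffered: the finished frame waits for the next vblank to flip,
    // and the GPU can't start the next frame until then.
    int intervals = 1;
    while (intervals * refreshMs < renderMs) ++intervals;
    std::printf("double buffered: %.1f fps\n", 1000.0 / (intervals * refreshMs));

    // Triple buffered: the GPU renders back-to-back into the spare buffer,
    // so new frames keep arriving at the raw rate (capped at refresh).
    double raw = 1000.0 / renderMs;
    std::printf("triple buffered: %.1f fps\n", raw > 60.0 ? 60.0 : raw);
}
[/code]

That ~55 fps figure is exactly the sort of odd number you see with triple buffering on.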
 

blastingcap

Diamond Member
Sep 16, 2010
6,654
5
76
You know what would be great is something like what Carmack did for Rage where it's locked to 60fps and if your card is too weak to handle it, you lose detail until you are back at 60fps. Like, if you would dip to 45fps with AA on, maybe for a few frames the GPU skips AA in order to claw its way back to 60fps.
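
Sketched out, the controller could be as simple as this (a toy sketch of the idea, not id's actual Rage code; the quality levels and margins are invented):

[code]
// Toy dynamic-quality controller: shed detail when frame time misses
// the 60 fps budget, restore it when there's headroom. The 5%/20%
// margins and the 0-5 level scale are made up for illustration.
#include <algorithm>
#include <cstdio>

const double kBudgetMs = 1000.0 / 60.0;  // 16.67 ms per frame

struct QualityController {
    int level = 5;  // 5 = full detail (AA on, etc.), 0 = everything shed

    void Update(double frameMs) {
        if (frameMs > kBudgetMs * 1.05)       // missed the budget: drop a notch
            level = std::max(0, level - 1);
        else if (frameMs < kBudgetMs * 0.80)  // comfortable headroom: restore
            level = std::min(5, level + 1);
    }
};

int main() {
    QualityController qc;
    const double frames[] = {15.0, 18.0, 22.0, 19.0, 14.0, 12.0};
    for (double ms : frames) {
        qc.Update(ms);
        std::printf("frame %5.1f ms -> quality level %d\n", ms, qc.level);
    }
}
[/code]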
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
You know what would be great is something like what Carmack did for Rage where it's locked to 60fps and if your card is too weak to handle it, you lose detail until you are back at 60fps. Like, if you would dip to 45fps with AA on, maybe for a few frames the GPU skips AA in order to claw its way back to 60fps.
Well, if you can't get 60 fps then you don't get 60 fps in that game. I know because I played it on some low-end comps. Also, RAGE does include smart vsync, which is the same as adaptive vsync.
 
Last edited:
Feb 19, 2009
10,457
10
76
"The first tends to occur when framerates are low, the second when framerates are high."

I game with vsync on; in every game I play, I'm always at a constant 60 fps. If extreme settings cause frames to drop below 45, I turn them down to maintain 60 fps.

The solution seems pretty simple, no??
 

psolord

Platinum Member
Sep 16, 2009
2,125
1,256
136
I did a Crysis 1 benchmark on my single GTX 570 a while back, to see how it would fare with normal vsync and adaptive vsync.

The results were astonishing.

With adaptive vsync:
2012-03-27 20:58:24 - Crysis
Frames: 8886 - Time: 170166ms - Avg: 52.220 - Min: 39 - Max: 61

With normal vsync:
2012-03-27 21:04:26 - Crysis
Frames: 5657 - Time: 169994ms - Avg: 33.278 - Min: 26 - Max: 61

Needless to say, the overall gaming experience was night and day, in favor of adaptive vsync.

We can see here that when a game is heavy enough to keep a card hovering at around or below 60fps, normal vsync can really destroy the framerate, while adaptive vsync works wonders in terms of preserving your gaming experience.

In this light I consider adaptive vsync a vital feature of my driver set, and that almost binds me to Nvidia. Of course, for high-caliber AMD cards, normal vsync will do fine, since they have excess power and are very likely to keep 60fps in most games. I would still like to see an adaptive-vsync-like feature in AMD's driver set anyway.

If I had to choose between two equally powered and equally priced AMD and Nvidia cards, Nv would be my choice, and adaptive vsync would mostly be the reason.

PS: I think that Dxtory's framerate limiter could provide a similar experience to adaptive vsync, although it's not the same thing. I have yet to try this out.

PS2: Lucid's Virtu MVP could also provide similar capabilities, but I am not sure about the specifics, plus it would be limited to Virtu MVP users.
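
For what it's worth, the core of any such limiter is presumably just sleeping off the unused part of each frame's budget. A bare-bones sketch (the general idea, not Dxtory's actual method):

[code]
// Bare-bones 60 fps frame limiter: sleep away whatever is left of each
// frame's ~16.67 ms budget before moving on. A sketch of the idea only.
#include <chrono>
#include <cstdio>
#include <thread>

using Clock = std::chrono::steady_clock;

// Stand-in for the game's actual rendering work.
void RenderFrame() { std::this_thread::sleep_for(std::chrono::milliseconds(5)); }

int main()
{
    const auto kBudget = std::chrono::microseconds(16667);  // ~60 fps
    auto next = Clock::now();
    for (int i = 0; i < 10; ++i) {
        RenderFrame();
        next += kBudget;                      // absolute deadline, so timing
        std::this_thread::sleep_until(next);  // error doesn't accumulate
        std::printf("frame %d presented\n", i);
    }
}
[/code]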
 
Last edited:

psolord

Platinum Member
Sep 16, 2009
2,125
1,256
136
"The first tends to occur when framerates are low, the second when framerates are high."

I game with vsync on; in every game I play, I'm always at a constant 60 fps. If extreme settings cause frames to drop below 45, I turn them down to maintain 60 fps.

The solution seems pretty simple, no??

I beg to differ. It's not that simple.

If you turn down the settings in order for the game to be able to keep 60fps in some scenes, all the rest of the scenes will still be rendered with lower quality, thus diminishing your gaming experience.

With adaptive vsync you still keep your quality settings, and you only have to tolerate the few scenes that will be rendered at lower than 60fps. See my Crysis example above.

Moreover, depending on how the game is rendered, you cannot be sure that when the framerate drops below 60, you will get 45fps. You may very well drop to 30fps.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
I believe that sync drop is for games that don't support Triple Buffering. I use V-sync on since I hate tearing, and my system has no issues running at odd framerates such as 51 or 55 FPS. If I turn off Triple Buffering, then I suddenly go from 60 to 30 or sometimes even 45, haha.

Yeah, almost no games support triple buffering, and you cannot force it at the driver level with DirectX unless you use D3DOverrider, and that does not work in every game; I have noticed a few titles crash with it on before. So I just turn Vsync off now. Got tired of fussing with Vsync. I play better now, too.
 
Feb 19, 2009
10,457
10
76
I beg to differ. It's not that simple.

If you turn down the settings in order for the game to be able to keep 60fps in some scenes, all the rest of the scenes will still be rendered with lower quality, thus diminishing your gaming experience.

With adaptive vsync you still keep your quality settings, and you only have to tolerate the few scenes that will be rendered at lower than 60fps. See my Crysis example above.

Moreover, depending on how the game is rendered, you cannot be sure that when the framerate drops below 60, you will get 45fps. You may very well drop to 30fps.

On the contrary, I see my fps all the time if I want, with Fraps and other such software.

I also play with settings in games that I play a lot of, so I know exactly what features to disable to gain the maximum performance at the minimum loss of IQ. Features such as dropping ambient occlusion from HBAO/HDAO to SSAO: you can hardly notice the difference in the scene, but the perf gains are big. Or particles from max to high, etc. I have fine control over which features to lower while maintaining high fps in games I frequently play.

As for adaptive vsync lowering IQ in scenes that stress the GPU... what is it doing? Is it lowering FoV, draw distance, LoD, lighting, shadows, etc.? I'd rather be in control of how my GPU renders a scene, and not let it automatically adjust rendering quality.
 

ocre

Golden Member
Dec 26, 2008
1,594
7
81
I cannot stand tearing and therefore almost always have some kind of vsync on. I haven't really looked at adaptive vsync in depth; I haven't paid it much mind.

For the most part I have just been enabling vsync and going with it. Adaptive, frame limiters, triple buffering, just whatever stops the tearing for me. Never took the time to comparatively analyze them. I guess it is something to play around with, and I think I just might look into it more, especially since tearing bothers me so much.
 

Keysplayr

Elite Member
Jan 16, 2003
21,211
50
91
On the contrary, I see my fps all the time if I want, with Fraps and other such software.

I also play with settings in games that I play a lot of, so I know exactly what features to disable to gain the maximum performance at the minimum loss of IQ. Features such as dropping ambient occlusion from HBAO/HDAO to SSAO: you can hardly notice the difference in the scene, but the perf gains are big. Or particles from max to high, etc. I have fine control over which features to lower while maintaining high fps in games I frequently play.

As for adaptive vsync lowering IQ in scenes that stress the GPU... what is it doing? Is it lowering FoV, draw distance, LoD, lighting, shadows, etc.? I'd rather be in control of how my GPU renders a scene, and not let it automatically adjust rendering quality.
Far simpler to use adaptive V-Sync, whether you wrote the game and know every aspect of its settings or you're Joe Gamer. You'll get to find out soon enough, as it looks like AMD is introducing its own version of Nvidia's adaptive V-Sync. But I guess you wouldn't be using it, judging by your comments.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Adaptive Vsync doesn't adjust graphics settings. Where did you get that from??

With normal vsync the framerate drops in whole divisors of your refresh rate, so you go from 60 (assuming 60Hz) to 30, then 20, then 15. So... yeah, 15fps is not really playable.
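
The arithmetic behind that ladder, as a toy model assuming strict double buffering:

[code]
// Why plain double-buffered V-sync only ever shows 60/n fps on a 60 Hz
// panel: a frame that misses a refresh waits for the next one.
#include <cmath>
#include <cstdio>

const double kRefreshHz = 60.0;

double VsyncedFps(double rawFps)
{
    // Each frame occupies a whole number of refresh intervals.
    return kRefreshHz / std::ceil(kRefreshHz / rawFps);
}

int main()
{
    const double raw[] = {59.0, 45.0, 31.0, 25.0, 16.0};
    for (double fps : raw)
        std::printf("GPU manages %4.1f fps -> you see %4.1f fps\n",
                    fps, VsyncedFps(fps));
}
[/code]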
 
Last edited:

Final8ty

Golden Member
Jun 13, 2007
1,172
13
81
Far simpler to use adaptive V-Sync, whether you wrote the game and know every aspect of its settings or you're Joe Gamer. You'll get to find out soon enough, as it looks like AMD is introducing its own version of Nvidia's adaptive V-Sync. But I guess you wouldn't be using it, judging by your comments.

I would be surprised if they did. Do you have a link for that?
And like the other guy, I wouldn't be using it.
 
Feb 19, 2009
10,457
10
76
Far simpler to use adaptive V-Sync, whether you wrote the game and know every aspect of its settings or you're Joe Gamer. You'll get to find out soon enough, as it looks like AMD is introducing its own version of Nvidia's adaptive V-Sync. But I guess you wouldn't be using it, judging by your comments.

No, I wouldn't. In all my years of gaming with GPUs, I have been very satisfied with normal vsync and manually setting graphics to achieve the optimal IQ/perf.

But I admit it would be nice for the average Joe... but then again, the average Joe can't even figure out simple graphics options, let alone go to the control panel to enable adaptive vsync.