Getting screen tearing on 1080 Ti and FreeSync monitor

KampKounslr1

Junior Member
Nov 12, 2010
Hey all,

I recently upgraded my video card to a 1080 Ti to try to get a smoother gaming experience; however, I notice a lot of screen tearing in games (especially Fallout 4, but it's noticeable in others like Overwatch and Path of Exile). I set vsync in the Nvidia Control Panel to Adaptive, since my monitor supports FreeSync, but to no avail. I've tried the other vsync modes with no luck. The games almost always run at 75 fps, and my monitor's refresh rate tops out at 75 Hz. I can't afford a G-Sync monitor at this time. The tearing happens at both low and high graphics settings. Here are my system specs:
  • Core i5-4460
  • 12 GB DDR3-1866 RAM
  • GeForce GTX 1080 Ti (runs boost mode through EVGA Precision X and regularly achieves a 30% overclock; the issue happens with the card overclocked or not, and also happened on my previous video card, a GTX 770, which was not overclocked)
  • LG 34UC88 34" curved FreeSync IPS monitor, 3440 x 1440 WQHD, 5 ms, 21:9 ultrawide, connected via DisplayPort. Monitor set to FreeSync ON, 75 Hz refresh, FPS Mode 1 preset. The issue also occurs if I game on my second monitor instead, an older Acer that runs at 1680 x 1050 over DVI at 60 Hz.
  • Roccat Kave XTD USB headset
  • Intel 750 SSD, 800 GB
  • Logitech MX Master 2S
Any suggestions?
 

KampKounslr1

Junior Member
Nov 12, 2010
OK, so I turned off FreeSync on my monitor but left Adaptive VSync enabled in the driver, and the tearing got much worse. I switched the settings back, and the tearing returned to where it was before the change. Any other suggestions?
 

OlyAR15

Senior member
Oct 23, 2014
Just set vsync to On. Adaptive allows tearing to occur whenever the framerate drops below the refresh rate. Also make sure that you have the correct settings in both the global settings and the application-specific settings.
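
For reference, the same switch can be flipped programmatically through Nvidia's driver-settings (DRS) interface in NVAPI. Below is a minimal sketch, assuming the NVAPI SDK headers (nvapi.h, NvApiDriverSettings.h) and import library are available; it forces vsync on in the base profile, which is what the control panel's Global Settings tab edits.

```c
/* Minimal sketch: force vsync ON in the global (base) driver profile
   via NVAPI's DRS interface. Error handling is trimmed to a single
   bail-out path for brevity. */
#include "nvapi.h"
#include "NvApiDriverSettings.h"

int main(void)
{
    NvDRSSessionHandle session = 0;
    NvDRSProfileHandle profile = 0;
    NVDRS_SETTING vsync = { 0 };

    if (NvAPI_Initialize() != NVAPI_OK) return 1;
    if (NvAPI_DRS_CreateSession(&session) != NVAPI_OK) return 1;
    if (NvAPI_DRS_LoadSettings(session) != NVAPI_OK) return 1;

    /* The base profile is what the Global Settings tab edits. */
    if (NvAPI_DRS_GetBaseProfile(session, &profile) != NVAPI_OK) return 1;

    vsync.version         = NVDRS_SETTING_VER;
    vsync.settingId       = VSYNCMODE_ID;
    vsync.settingType     = NVDRS_DWORD_TYPE;
    vsync.u32CurrentValue = VSYNCMODE_FORCEON;   /* "On", not "Adaptive" */

    if (NvAPI_DRS_SetSetting(session, profile, &vsync) == NVAPI_OK)
        NvAPI_DRS_SaveSettings(session);         /* persist the change */

    NvAPI_DRS_DestroySession(session);
    NvAPI_Unload();
    return 0;
}
```

Per-game overrides live in separate application profiles (looked up with NvAPI_DRS_FindProfileByName), which is why an application-specific setting can quietly win over the global one.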
 

KampKounslr1

Junior Member
Nov 12, 2010
That was it! The tearing is gone. From what I'd read about Nvidia's adaptive sync, it was supposed to be better than plain vsync; why on earth do they even have the adaptive option if it works like crap?
 

DethGasp

Junior Member
Feb 2, 2017
"you need a G-Sync display to use NVIDIA's adaptive sync."

This is not correct.
Adaptive VSync came out when Kepler was released in March 2012, with the R300 driver branch. G-Sync came out in DIY kits in 2013, then it was included in retail monitors in 2014.
Adaptive VSync has nothing to do with G-Sync.

https://www.geforce.com/hardware/technology/adaptive-vsync/technology

In the R300 drivers, released alongside the GTX 680, Nvidia introduced a new feature called Adaptive VSync. This feature is intended to combat the limitation of v-sync that, when the framerate drops below 60 FPS, there is stuttering as the v-sync rate is reduced to 30 FPS, then down to further factors of 60 if needed. However, when the framerate is below 60 FPS, there is no need for v-sync as the monitor will be able to display the frames as they are ready. To address this issue (while still maintaining the advantages of v-sync with respect to screen tearing), Adaptive VSync can be turned on in the driver control panel. It will enable VSync if the framerate is at or above 60 FPS, while disabling it if the framerate lowers. Nvidia claims that this will result in a smoother overall display.
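
To make that heuristic concrete, here is an illustrative sketch of the policy in a render loop. This is not Nvidia's actual implementation (which lives inside the driver); it assumes a Windows OpenGL context with the WGL_EXT_swap_control extension, and adaptive_vsync_loop and render_frame are hypothetical names for this example.

```c
/* Illustrative sketch of the adaptive-vsync policy: keep vsync on
   while frames are produced within the refresh interval, drop it
   once the renderer falls behind. */
#include <windows.h>

typedef BOOL (WINAPI *PFNWGLSWAPINTERVALEXTPROC)(int interval);

extern void render_frame(void);   /* hypothetical: draws one frame */

void adaptive_vsync_loop(HDC hdc, double refresh_hz)
{
    PFNWGLSWAPINTERVALEXTPROC wglSwapIntervalEXT =
        (PFNWGLSWAPINTERVALEXTPROC)wglGetProcAddress("wglSwapIntervalEXT");
    const double frame_budget = 1.0 / refresh_hz;   /* e.g. 1/75 s at 75 Hz */
    LARGE_INTEGER freq, t0, t1;
    int vsync_on = 1;

    QueryPerformanceFrequency(&freq);
    wglSwapIntervalEXT(1);                          /* start synced */

    for (;;) {
        /* Time the frame work itself, not the (possibly blocking) swap. */
        QueryPerformanceCounter(&t0);
        render_frame();
        QueryPerformanceCounter(&t1);
        SwapBuffers(hdc);

        double dt = (double)(t1.QuadPart - t0.QuadPart) / freq.QuadPart;
        int fast_enough = (dt <= frame_budget);

        /* At or above refresh: sync (no tearing). Below refresh:
           unsync (tear instead of halving to 37.5 or 25 fps). */
        if (fast_enough != vsync_on) {
            vsync_on = fast_enough;
            wglSwapIntervalEXT(vsync_on);
        }
    }
}
```

In practice an application can also ask the driver for this behavior directly: with the WGL_EXT_swap_control_tear extension, passing -1 to wglSwapIntervalEXT requests adaptive vsync in a single call.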

While the feature debuted alongside the GTX 680, it is also available to users of older Nvidia cards who install the updated drivers.
 

EXCellR8

Diamond Member
Sep 1, 2010
My b. I tend to get adaptive vsync and variable refresh confused. If we could all just come up with one sync type that simply worked across everything, that'd be great.
 