A thread for those with weak Nvidia GPUs using Vsync Half Refresh to stay alive

Xenphor

Member
Sep 26, 2007
153
0
76
I recently got a 550 Ti because I couldn't wait for a 660 and all the GTX 460s were sold out at Fry's. Now I could have gotten a budget AMD card that looked better on paper, but I didn't, because for me a tear-free, consistent image is paramount to enjoying a game.

I may be able to get better framerates with a budget AMD card, but will those be vsync'd, stable 60fps experiences? For the most demanding games, probably not. Unfortunately, as far as PC games go, that is the only option for me. I cannot stand tearing, so running without vsync is not an option. I also hate the extremely choppy gameplay of a vsync'd game that doesn't maintain a framerate matching the monitor's refresh rate (60Hz). The only other option I know of is to use a frame limiter like Dxtory, but that still produces tearing even when locked at 30fps, with or without vsync.
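
To show what I mean by choppy, here's a quick toy model I put together (illustrative numbers only, nothing from the actual drivers): with plain double-buffered vsync, a frame that misses a refresh has to wait for the next one, so on a 60Hz monitor anything that can't hold 16.7ms per frame gets bumped to 33.3ms, and a game hovering near the line lurches between 60 and 30.

Code:
# Toy model of double-buffered vsync on a 60Hz monitor (16.7ms per refresh).
# My own illustrative numbers, not actual driver behavior.
import math

REFRESH = 1000.0 / 60.0  # ms per refresh at 60Hz

def displayed_interval(render_ms):
    # A frame that misses a refresh waits for the next one, so the
    # on-screen interval rounds up to a whole number of refreshes.
    return math.ceil(render_ms / REFRESH) * REFRESH

for render_ms in (10.0, 15.0, 18.0, 25.0, 35.0):
    shown = displayed_interval(render_ms)
    print(f"render {render_ms:4.0f}ms -> shown every {shown:4.1f}ms ({1000 / shown:.0f}fps)")

Half-refresh vsync just makes 33.3ms the target instead, so there's headroom, which is the whole point of this thread.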

Since I knew that Nvidia had added a vsync option for half the monitor's refresh rate (30fps on my 60Hz monitor), I thought a good compromise would be to use this option for many of the games that the 550 Ti can't run well.

The first game I tried was Crysis and, to my amazement, the new vsync option actually produced a very stable 30fps image. I was able to achieve this while running DX10 with settings at Very High and using a very aggressive .cfg file for LOD and other settings. If I hadn't used the new vsync option, the game would have been unplayable at those settings because of the inconsistent framerate.
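
For reference, these are the sort of cvars my .cfg tweaks; the values here are examples in the spirit of the usual tweak guides, not a copy of my exact file, so treat them as a starting point:

Code:
e_lod_ratio = 3
e_view_dist_ratio = 40

Lower LOD ratio means objects drop to simpler models sooner, and the view-distance ratio pulls in the draw distance. The vsync itself comes from the half-refresh option in the control panel, not the cfg.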

Another game I tried was Metro 2033. This game proved to be much more challenging to run at high settings. At first I thought the new vsync option wasn't working, because the game didn't run nearly as well as Crysis (in DX11 at Very High). I found it strange that the game didn't even have a vsync option in its menus, and I could not get it working through the .cfg files either. I thought I would not be able to run Metro 2033 at good settings until I tried also using the frame limiter, Dxtory. For some unknown reason, if I set Dxtory to limit the game to 30fps while also selecting the new vsync option in the Nvidia panel, I actually get fairly stable 30fps gameplay in Metro 2033 in DX11 at High settings. I have no idea why this combination is needed, but it seems to work for my setup.

Well, those are two of the most challenging games I have tried so far. I would be interested to know if anyone else has messed with this option and what games they have used it with. I think that had I just waited and gotten a GTX 460, this new vsync option would be even more powerful, because maintaining 30fps would be somewhat easier. As it stands, I will just see what the 660 brings.

I know AMD would have been much better on paper, but if I cannot maintain a good quality vsync then it pretty much defeats the purpose. I really hope AMD implements something similar in their drivers so I can have an option.
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
30 fps is abysmal looking in most games and even worse when vsync is on. the stutter just when panning around is enough to ruin any game.

what I don't get is why console games don't look like crap at 30 fps.
 

Xenphor

Member
Sep 26, 2007
153
0
76
Well, when it's an inconsistent 30fps the stutter is bad, yes, especially when vsync'd. I actually found that with Crysis I did achieve a console-like 30fps. The panning of the camera was fairly smooth, especially with the motion blur applied. It really felt like I was playing a modified 360 version. So I think it is a bit different when it's locked at the driver level.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
no, even when it's a steady 30 fps, it looks horrid in nearly every game to me. I played around with it quite a bit and almost got sick to my stomach seeing all the jittery motion just from panning around. the motion blur does help Crysis, but it is still not a pleasant experience. if it were, I would certainly use the 30 fps vsync, as that would save me from worrying about upgrading.
 
Last edited:

Xenphor

Member
Sep 26, 2007
153
0
76
Well, do you not mind tearing then? Because if you aren't hitting 60fps and your monitor has a 60Hz refresh, it seems to me your only options are to leave vsync off and get massive tearing, or vsync it and have very inconsistent jumps from 30 to 60. Obviously the ideal would be either a monitor that could handle variable frame rates without tearing, or a framerate always locked at the monitor's refresh rate.
 

toyota

Lifer
Apr 15, 2001
12,957
1
0
no, I hate tearing, but horrible jittering is even worse to me. I just take it one game at a time and put the settings at whatever works for that particular game. some games don't have bad tearing, so luckily I don't always have to worry about staying at 60 fps and using vsync.
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
I believe the data set that you're looking for is described by Tech Report's testing methodology.

For example:

[Tech Report's BF3 graphs: 99th-percentile frame times, the full frame-time percentile curve, and time spent beyond 50ms.]


Of course, it really depends on the particular games you'll be playing; each vendor has its weaknesses and strengths depending on the engine a game runs. That's ultimately your deciding factor.
 

Xenphor

Member
Sep 26, 2007
153
0
76
Those benchmarks are interesting, although I'm not really sure what they're saying. How exactly would the lowest-ms card and the highest-ms card compare on screen? Would there be less tearing even with vsync off on a lower-ms card? How about with vsync? Doesn't it depend on the monitor's refresh?

edit: Hm, well I tried reading a bit more and sort of understand what they're going for, although it does seem to vary a lot by game and drivers. The fact that a card can lag or be interrupted during a benchmark and still report good numbers is what I'm talking about, though. Ideally I think the problem should be solved with some external technology that handles the frames intelligently instead of just spamming them on there.
 
Last edited:

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
The gist of those graphs is less fluctuation and more consistency in rendering times. The 99th percentile is a general synopsis of frame times (minus the 1% of largest spikes). Generally speaking, lower is better on all of the graphs. Time spent beyond 50ms is cumulative, and identifies the total amount of time lost to large spikes in rendering times.
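
If it helps, here's roughly how those two numbers fall out of a list of per-frame render times. This is my own sketch of the arithmetic, not Tech Report's actual tooling:

Code:
import math

# Rough sketch of Tech Report-style frame-time metrics from a list of
# per-frame render times in milliseconds. My own arithmetic, not their tool.
def frame_time_metrics(frame_times_ms):
    ordered = sorted(frame_times_ms)
    # 99th percentile: 99% of frames finished in this time or less.
    p99 = ordered[max(0, math.ceil(0.99 * len(ordered)) - 1)]
    # Time beyond 50ms: total milliseconds spent past the 50ms mark,
    # i.e. the cumulative size of the big spikes.
    beyond_50 = sum(t - 50.0 for t in frame_times_ms if t > 50.0)
    return p99, beyond_50

# 95 smooth 16.7ms frames plus a few big hitches:
times = [16.7] * 95 + [40.0, 55.0, 70.0, 90.0, 120.0]
p99, beyond = frame_time_metrics(times)
print(f"99th percentile: {p99:.1f}ms, time beyond 50ms: {beyond:.1f}ms")

Two cards with the same average fps can look completely different on these numbers, which is the whole point of the methodology.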
 

Xenphor

Member
Sep 26, 2007
153
0
76
But doesn't it still come down to the display technology whether or not an image tears? I'm assuming that even if you took the best card out there and removed vsync, it would still tear regardless, because of an LCD's refresh rate. Likewise, if the card is vsync'd and not maintaining a stable FPS, would it not stutter? I'm not sure how these things could be overcome at the graphics card level.

Now perhaps with CRTs it might be different? Although I'm pretty sure I still saw quite a bit of tearing with my old CRT. The only difference would be that you could have different refresh rates but how those would sync I don't know. I guess it doesn't matter now anyway.
 

pandemonium

Golden Member
Mar 17, 2011
1,777
76
91
In my experience, you'll never be 100% tear free. I'm still using a CRT for my primary monitor and have a 42" plasma connected as well. I rarely see tearing on my CRT, but the plasma tears horizontally between extreme contrasts pretty badly. Of all the display generations and types I've witnessed, I don't remember a single one that doesn't tear (usually at least partially, horizontally), and I've given up trying to solve it after going through several different versions of players, settings, refresh rates, codecs, game engines, hardware, connection types, etc...

I've looked into tearing a lot, and the way I've dumbed it down and understand it is this: the graphics card sends a frame to the display either too early or too late, and the display cuts over to the next frame partway through a refresh. I realized that I don't remember a single DVD player, Blu-ray player, or cable decoder box that has this problem, and decided it's an issue with DirectDraw and Direct3D on Windows (I'm speculating here, but this is where my research/troubleshooting left off, and I can't seem to find any further definitive explanations). I've yet to mess with any flavors of Linux and would be curious to see whether this is an issue there. Maybe someone with experience there can shed more light on the subject.

To answer your question, in my experiences, CRTs do have less tearing, but it's still not resolved.

I'm sure there's someone out there with more expertise on the subject. Again, these are just my experiences. >.<
 

Xenphor

Member
Sep 26, 2007
153
0
76
Hm that's interesting.

Yes, the elephant in the room is that basically no other display device exhibits tearing to the degree a PC does. I thought that perhaps things would be different now (this is the first PC I've built since the TNT2 and Voodoo2 days), but not much has changed. Honestly, I think the Voodoo is the only card I owned that seemed somewhat immune to tearing, or maybe that's just my imagination.

I do have a CRT here but haven't bothered to test it. I know people say it's somehow better, but I'm not sure why that would be. I would be using an 85Hz refresh rate, so I would have to maintain 85 FPS to get a good vsync, which seems worse to me than an LCD. Unless somehow using a CRT with vsync off is better than an LCD? I don't know why that would be if they both still have refresh rates.

Your point that it might be Windows is definitely something to consider. I've actually used Linux to some extent (actually another reason why I chose Nvidia -- suck it, Linus), but not since I built this PC. Most of my time was spent with the Intel open source driver and integrated graphics, so it's not the best judge. I mostly played id Software games because they all support OpenGL, and those ran pretty well. With the integrated graphics I had bigger problems than tearing, though.

I would like to see a conclusive study of why this problem exists because, as you said, nothing else has it (except some notable console games like NFS Most Wanted on 360, Bioshock on 360 [where vsync can be turned off], and Uncharted on PS3).
 
Last edited:

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
30 fps is abysmal looking in most games and even worse when vsync is on. the stutter just when panning around is enough to ruin any game.

what I don't get is why console games don't look like crap at 30 fps.

Has this question ever been addressed? Why does 30 fps look better on console games?

Why does 24 fps look better on theater movies?
 

The_Golden_Man

Senior member
Apr 7, 2012
816
1
0
Sadly, it seems like the only way to get a really fluid experience on PC is to keep the FPS very close to vsync.

I've also noticed that in games like The Witcher 2, when I disable vsync I actually get no tearing as long as I can stay at 75-80 FPS or above. Strangely enough, that's even though I use a 60Hz monitor. Of course vsync has no tearing, at least when maintaining 60Hz, but there will always be input lag.

The only games where you can get a really fluid experience way below the 60FPS mark are Bethesda games like Oblivion, Fallout 3, New Vegas, etc. You can actually clamp the FPS in their ini file, and in combination with an FPS limiter it's very smooth, even at 30-40FPS. However, when clamping the FPS in the ini files, you must ensure the game never falls below the clamped FPS, or else it will go into slow motion.
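
For anyone who wants to try it, the ini setting is iFPSClamp. If I remember correctly it sits under [General] in Oblivion.ini / Fallout.ini (the ones in My Documents\My Games), but double-check against your own file:

Code:
[General]
iFPSClamp=30

Then set the external FPS limiter to the same value and keep your settings low enough that the game never drops under it, or you get the slow motion I mentioned.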
 

Rezist

Senior member
Jun 20, 2009
726
0
71
Sadly, it seems like the only way to get a really fluid experience on PC is to keep the FPS very close to vsync.

I've also noticed that in games like The Witcher 2, when I disable vsync I actually get no tearing as long as I can stay at 75-80 FPS or above. Strangely enough, that's even though I use a 60Hz monitor. Of course vsync has no tearing, at least when maintaining 60Hz, but there will always be input lag.

The only games where you can get a really fluid experience way below the 60FPS mark are Bethesda games like Oblivion, Fallout 3, New Vegas, etc. You can actually clamp the FPS in their ini file, and in combination with an FPS limiter it's very smooth, even at 30-40FPS. However, when clamping the FPS in the ini files, you must ensure the game never falls below the clamped FPS, or else it will go into slow motion.

So we have bullet time in Oblivion?
 

Yuriman

Diamond Member
Jun 25, 2004
5,530
141
106
Has this question ever been addressed? Why does 30 fps look better on console games?

Why does 24 fps look better on theater movies?

They don't; you just don't have a basis for comparison yet. Theater projectors give me a headache because of the low-fps tearing and flickering.

Actually, console games might look better in part because there are fewer high-motion scenes. Consider how fast you can turn with a joystick compared to a mouse.
 

Xenphor

Member
Sep 26, 2007
153
0
76
Wow, I just bought Stalker: Call of Pripyat and I can't get the tearing to go away no matter what I do. I've ticked both vsync and 60Hz and still get tearing. If I choose one or the other, I still get it. Even if I force vsync through the Nvidia panel I still get tearing. Bizarrely enough, if I leave everything off, the tearing is pretty much the same, if not a little better. Makes no sense.

edit: I just tried Nexuiz (DX11, High) and the situation was similar to Metro 2033. It was only after I limited the FPS to 30 in-game as well as enabled half-refresh vsync that I was able to maintain a fairly rock-solid 30 FPS in the in-game benchmark. If I did not limit the FPS in the game, half-refresh vsync still worked, but there was more stuttering. I wonder if this would apply to Crysis 2 as well.

I'm thinking of trying Battlefield 3, maybe. Apparently gametime.maxvariablefps will limit the FPS in-game. Hmm, tempting, but it's 40 bucks...
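
From what I've read, that command can go in a plain user.cfg text file in the BF3 install folder (or be typed into the console). Something like this, with 30 just being the cap I'd try:

Code:
gametime.maxvariablefps 30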
 
Last edited:

toyota

Lifer
Apr 15, 2001
12,957
1
0
Wow, I just bought Stalker: Call of Pripyat and I can't get the tearing to go away no matter what I do. I've ticked both vsync and 60Hz and still get tearing. If I choose one or the other, I still get it. Even if I force vsync through the Nvidia panel I still get tearing. Bizarrely enough, if I leave everything off, the tearing is pretty much the same, if not a little better. Makes no sense.
I just checked and vsync from the Nvidia cp works just fine for me.
 
Oct 16, 1999
10,490
4
0
no, I hate tearing, but horrible jittering is even worse to me. I just take it one game at a time and put the settings at whatever works for that particular game. some games don't have bad tearing, so luckily I don't always have to worry about staying at 60 fps and using vsync.

This helped me reduce mouse movement jitter rather noticeably at lower frame rates, particularly the info in section 1.2:
http://www.overclock.net/t/173255/cs-s-mouse-optimization-guide
There's no substitute for more frames, but the way Windows handles mouse speed can really exacerbate perceived stutter.
 

KingFatty

Diamond Member
Dec 29, 2010
3,034
1
81
Theater projectors give me a headache because of the low-fps tearing and flickering.

Can a film-based theater projector exhibit tearing? Do non-film-based theater projectors exhibit tearing? I've never seen it, but I usually go to a new AMC theater in Tysons Corner, VA.
 


Xenphor

Member
Sep 26, 2007
153
0
76
Well I decided to take one for the team and bought Battlefield 3. I'll update with my findings.
 
Oct 16, 1999
10,490
4
0
is that just for the mouse? because it's still jittery with a controller too. and I think I am too lazy to read through all that anyway. lol

That's just specifically for the mouse. I started looking into it because I noticed moving with the mouse was so much choppier than the keyboard.
 

Xenphor

Member
Sep 26, 2007
153
0
76
I got Battlefield 3, and half-refresh vsync works well at Ultra settings. However, there is a downside: for some reason the game will not allow tearing when it drops below 30 fps. It's my understanding that the half-refresh vsync is adaptive, so it should let the game tear when under 30 fps instead of stutter. I would prefer that, and it is what happens in other games. For some reason BF3 isn't letting the adaptive part of the vsync work. I got a BF3 settings editor and turned off the triple buffering that had been enabled, but that hasn't seemed to fix it.

edit: Well, I've actually achieved the effect I was looking for without the control panel. I limited the FPS to 30 with the settings editor and turned on vsync. There is still stutter, though. One thing I noticed was that input lag dramatically increased when using the vsync override in the CP vs. using the vsync setting in-game, even though the end result is vsync'd either way. So for now I suppose it's best not to touch the CP.
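
For anyone else fiddling with this, the equivalent user.cfg in the game folder would apparently look something like the following. These are the commonly posted Frostbite commands; the render-ahead value is just the one people usually suggest for input lag, so treat it as a starting point:

Code:
gametime.maxvariablefps 30
renderdevice.triplebufferingenable 0
renderdevice.forcerenderaheadlimit 1

The first line is the 30fps cap, the second is the triple buffering I turned off, and the third is supposed to trim the driver's queue of pre-rendered frames, which is where a lot of the vsync input lag is said to come from.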
 
Last edited: