V-Sync on or off

Mar 27, 2007
96
0
0
I have always turned v-sync on in all my games, when applicable. Yet I just read Anand's article about the future of AA and was intrigued by his comment about disabling v-sync in the driver settings. What's the deal with v-sync? I have an FX-57 with 2 gigs of RAM and a BFG 8800GTS 640MB OC2 video card; I'm sure I can handle AA, AF, and v-sync all on. Is v-sync worthless when AA is on? Does it conflict with AA being on, or is it just a high-end feature to turn on, like 16xAA?
 
Dec 21, 2006
169
0
0
AFAIK, V-sync just keeps the monitor refresh in line with the frame refresh. Without it, there can be issues of screen tearing; however, with it enabled, newer cards or unstable drivers can cause issues.
 

2Xtreme21

Diamond Member
Jun 13, 2004
7,044
0
0
Originally posted by: shadowofthesun
AFAIK, V-sync just keeps the monitor refresh in line with the frame refresh. Without it, there can be issues of screen tearing; however, with it newer cards or unstable drivers can cause issues.

Close.. it keeps frame rate in sync with refresh rate. It's pointless to have enabled if you're on an LCD.
 

Lord Banshee

Golden Member
Sep 8, 2004
1,495
0
0
Originally posted by: 2Xtreme21
Originally posted by: shadowofthesun
AFAIK, V-sync just keeps the monitor refresh in line with the frame refresh. Without it, there can be issues of screen tearing; however, with it newer cards or unstable drivers can cause issues.

Close.. it keeps frame rate in sync with refresh rate. It's pointless to have enabled if you're on an LCD.

Pointless??? WHAT!

On my current LCD (24" Dell) and my last one (ViewSonic 18"), turning v-sync off gives tearing 90% of the time. I've always turned it on, and I advise everyone to do the same.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: Lord Banshee
Originally posted by: 2Xtreme21
Originally posted by: shadowofthesun
AFAIK, V-sync just keeps the monitor refresh in line with the frame refresh. Without it, there can be issues of screen tearing; however, with it newer cards or unstable drivers can cause issues.

Close.. it keeps frame rate in sync with refresh rate. It's pointless to have enabled if you're on an LCD.

Pointless??? WHAT!

My current LCD(24" Dell) and my last(viewsonic 18") vsync off will 90% give tearing. I've always turned it on and i advise everyone to do the same.

Yeah, I dunno what he's talking about. You can definitely still get tearing artifacts on an LCD if vsync is disabled. Because of slower pixel refresh times it may be less noticeable, though.
 

TheRyuu

Diamond Member
Dec 3, 2005
5,479
14
81
Originally posted by: Matthias99
Originally posted by: Lord Banshee
Originally posted by: 2Xtreme21
Originally posted by: shadowofthesun
AFAIK, V-sync just keeps the monitor refresh in line with the frame refresh. Without it, there can be issues of screen tearing; however, with it newer cards or unstable drivers can cause issues.

Close.. it keeps frame rate in sync with refresh rate. It's pointless to have enabled if you're on an LCD.

Pointless??? WHAT!

My current LCD(24" Dell) and my last(viewsonic 18") vsync off will 90% give tearing. I've always turned it on and i advise everyone to do the same.

Yeah, I dunno what he's talking about. You can definitely still get tearing artifacts on an LCD if vsync is disabled. Because of slower pixel refresh times it may be less noticeable, though.

Isn't tearing supposed to be MORE evident on an LCD than a CRT monitor? <- IIRC

And I think v-sync is more of a personal-taste option; it mostly comes down to whether or not you can stand the tearing. V-sync poses problems of its own, though: the frame rate has to be an even fraction of your refresh rate (i.e., at 60Hz you'd be running at 60fps, 30fps, 15fps, etc.).

Which isn't always the greatest. Then there is triple buffering, which requires more VRAM. So v-sync is a trade-off: if you have the horsepower to maintain the frame rate, then sure, enable it. But I still think it comes down to personal taste more than anything. I can stand the tearing and really don't have a problem with it unless the game engine tears A LOT. It has to tear A LOT for me to enable v-sync.
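The steps described above are, under double buffering, whole-number divisions of the refresh rate. A quick sketch (my own illustration, not from the thread) lists them for a 60Hz panel; note the series also includes 20fps, a step the common 60/30/15 shorthand skips:

```python
refresh_hz = 60

# With double-buffered v-sync, a frame that spans n refresh intervals
# is displayed at refresh_hz / n fps, so the reachable rates are:
steps = [refresh_hz / n for n in range(1, 5)]
print(steps)  # [60.0, 30.0, 20.0, 15.0]
```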
 

secretanchitman

Diamond Member
Apr 11, 2001
9,352
23
91
V-sync is always turned on for me, if possible... I don't get those random lines when I move around in games anymore. And besides... it looks a little better too.
 
Mar 27, 2007
96
0
0
I've only today heard the term 'tearing'. Is it the effect where a straight line has sections offset to the side for a second, i.e. the corner of a wall having a section a foot off to the side, running parallel to the intended spot? If that's the case, then yes, I have definitely seen 'tearing' on my LCD monitor when v-sync was turned off. I was just curious whether, when I install my 8800 card (coming tomorrow), I should leave v-sync on or off.

I'm still curious why Anand would permanently turn it off for bench tests. What gamer buys $2,000+ worth of computer parts, turns all settings to high, AA to 4x and AF to 16x, but leaves v-sync OFF????

Help me to understand this. As it is now, why Anand would always leave it off yet offer up benchies with 4x AA and all high settings confuses me.
Edited for spelling and grammar.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: danielackerman
Im still curious why anand would perminantly turn it off for bench tests. what gamer buys 2,000+ dollars worth of computer parts, turns all settings on high, AA to 4x and AF to 16x but leaves v-sync OFF????

help me to understand this. as it is now, why anand would always leave it off but offer up benchies with 4x AA and all high settings confuses me.

In general it's disabled for benchmarking, since:

1) it can lower overall performance slightly.

2) it constrains the maximum FPS that the system can put out. It wouldn't be very interesting to see a bunch of graphs with everything pegged at 60FPS.

Some people also just don't use it. For twitch shooters, you can sometimes derive a small benefit from running at a very high framerate, especially if the game engine ties your keyboard/mouse inputs to the framerate in some way. Other people don't care about, or don't notice, the visual artifacts.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: danielackerman
Im still curious why anand would perminantly turn it off for bench tests. what gamer buys 2,000+ dollars worth of computer parts, turns all settings on high, AA to 4x and AF to 16x but leaves v-sync OFF????
As I understand v-sync, it's a bit complicated, but here goes:

1. V-sync locks frame rate to certain denominations of the monitor's refresh rate. Think of these denominations like steps on a stairway (and if you were to graph the benchmark with v-sync on, it would look like a bunch of right angles as in a stairway). The steps on a 60 Hz monitor (like many LCDs) are 15, 30, and 60 fps. So when your game runs > 60 fps, the framerate will be limited to 60 fps. When your game runs 59 fps, it will suddenly be locked to the next lowest denomination, which is 30 fps. When your game dips to 29 fps, the frame rate will be locked to the next lowest denomination, which is 15 fps. Ouch. Even if you're just one frame under, you go to the next lowest denomination until the frame rate gets back above that step.

2. The reasoning process of benchmarking on common settings to get performance expectations is flawed. There are such a wide variety of components and settings in use by the gaming community that no two readers' PCs will perform exactly alike. Any sort of performance expectations you could draw from an article would be guesses, at the very best. The whole point of the review is to show the relative strengths of a particular component--say, a video card--by comparing it to its competition in that same market. V-sync makes it impossible to show these comparisons because average fps will be highly dependent on that user's exact system configuration. If Anand's rig can do > 60 fps through most of the benchmark run while yours can only do 59 fps at best, his average fps will be much, much higher than yours.

These are worst case differences, of course, but the real-world differences are big enough that v-sync renders benchmarks practically useless for hardware reviews.
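The second point can be made concrete with a toy comparison (the function and the numbers are my own illustration, assuming simple double-buffered v-sync): two rigs only two frames per second apart without v-sync land on very different steps with it.

```python
import math

def displayed_fps(refresh_hz, uncapped_fps):
    # Sketch of double-buffered v-sync: a frame that spans n refresh
    # intervals is displayed at refresh_hz / n fps.
    frame_ms = 1000.0 / uncapped_fps
    interval_ms = 1000.0 / refresh_hz
    return refresh_hz / math.ceil(frame_ms / interval_ms)

# Rig A renders at 61 fps uncapped, rig B at 59 fps uncapped:
print(displayed_fps(60, 61))  # 60.0 -- each frame fits in one refresh
print(displayed_fps(60, 59))  # 30.0 -- just misses, drops a whole step
```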
 

Fraggable

Platinum Member
Jul 20, 2005
2,799
0
0
I look at v-sync sort of as a luxury. If my system can run the game with it on at a playable rate, then great, I'll leave it on. If it stutters and I have problems playing with it on, then I have to live with the tearing until I can upgrade and get past it.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Originally posted by: nullpointerus
Originally posted by: danielackerman
Im still curious why anand would perminantly turn it off for bench tests. what gamer buys 2,000+ dollars worth of computer parts, turns all settings on high, AA to 4x and AF to 16x but leaves v-sync OFF????
As I understand v-sync, it's a bit complicated, but here goes:

1. V-sync locks frame rate to certain denominations of the monitor's refresh rate. Think of these denominations like steps on a stairway (and if you were to graph the benchmark with v-sync on, it would look like a bunch of right angles as in a stairway). The steps on a 60 Hz monitor (like many LCDs) are 15, 30, and 60 fps. So when your game runs > 60 fps, the framerate will be limited to 60 fps. When your game runs 59 fps, it will suddenly be locked to the next lowest denomination, which is 30 fps. When your game dips to 29 fps, the frame rate will be locked to the next lowest denomination, which is 15 fps. Ouch. Even if you're just one frame under, you go to the next lowest denomination until the frame rate gets back above that step.

This is not actually how it works -- the 'steps' are not hardcoded or defined in any way, and you're not truly 'locked' to them -- but without triple buffering this is basically the effect you get.

With double buffering the game engine has to wait to start rendering the next frame until the current one is finished. If all the frames take about the same amount of time to render (which, in the short term, is usually the case), and it isn't quite fast enough to finish a frame before the next monitor refresh, it has to sit there and wait for the next refresh. So it ends up only being able to render on every other refresh, and your framerate appears to be 'locked' to half of the refresh rate. If it can't make it in two refresh cycles, you might end up effectively running at a third of the refresh rate. If only some of the frames take too long to render, you can still get intermediate FPS values.

With triple buffering the game can finish one frame and start on the next one in the middle of a refresh cycle. This avoids the jumpy performance you can get with double buffering if you can't consistently render frames quickly enough.
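That difference can be sketched as a tiny simulation (a toy model under my own assumptions: constant render time, instant buffer flips, 60 Hz panel; none of this is from the thread):

```python
import math

REFRESH_MS = 1000.0 / 60.0  # 60 Hz refresh interval, ~16.7 ms

def double_buffered_fps(render_ms, frames=120):
    # The next frame can't start until the current one flips at a
    # refresh boundary, so a 20 ms frame occupies two refresh intervals.
    t = 0.0
    for _ in range(frames):
        finished = t + render_ms
        t = math.ceil(finished / REFRESH_MS) * REFRESH_MS  # wait for the flip
    return 1000.0 * frames / t

def triple_buffered_fps(render_ms):
    # Rendering never stalls; each refresh shows the newest finished
    # frame, so the rate is min(render rate, refresh rate).
    return min(1000.0 / render_ms, 1000.0 / REFRESH_MS)

# A frame that takes 20 ms (50 fps uncapped) at 60 Hz:
print(round(double_buffered_fps(20)))  # 30 -- locked to half the refresh
print(round(triple_buffered_fps(20)))  # 50 -- keeps the full render rate
```

With a 10 ms frame both models give 60 fps; the gap only opens once rendering can't quite keep up with the refresh.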
 
Mar 27, 2007
96
0
0
So if I turn v-sync on, should I then turn on triple buffering as well? I noticed the option for triple buffering defaults to off in the global driver settings.
 

gramboh

Platinum Member
May 3, 2003
2,207
0
0
Are there any adverse effects to forcing triple buffering with the D3D overrider that comes with RivaTuner? With 640MB on my GTS, I'd think I have enough memory to do it in most games?
 

I4AT

Platinum Member
Oct 28, 2006
2,631
3
81
I always turn v-sync off. If your card can't keep minimum framerates over 60/75, the cap gets dropped to a level your system can handle. I'd rather put up with a little tearing every once in a while than sacrifice performance.

But yeah, if you're running v-sync you should have triple buffering enabled as well. I think this only applies to DX games, though, and not OpenGL... but that was a long time ago; things may have changed?
 

IlllI

Diamond Member
Feb 12, 2002
4,927
11
81
Originally posted by: gramboh
Are there any adverse effects to forcing triple buffering with the D3D Overrider that comes with Rivatuner? With 640MB on my GTS I think I have enough memory to do it in most games?

The only thing I remember is: the higher the resolution, the more video memory it takes. That, and some games just don't like having it enabled.

 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: I4AT
But yeah, if you're running V-sync you should have tripple buffering enabled as well. I think this only applies to DX games though, and not OpenGL... but that was a long time ago, things may have changed?
The Nvidia control panel setting for triple buffering only applies to OpenGL apps. One can use DXTweaker or RivaTuner to force triple buffering on DX games, but as clandren says, it doesn't work on all games. Also, I couldn't manage to get DX triple buffering working in Vista (although that might just be a temporary problem).