Why use VSync?

Lorn

Banned
Nov 28, 2004
I've noticed a lot of people stress the use of VSync in their games. I usually notice that you get lower FPS when it is in use. What is the point of this?
 
SynthDude2001

Mar 19, 2003
Mainly the point is to reduce image tearing caused by the monitor displaying parts of different frames at the same time. I notice this a lot more on my LCD than I did on my CRT too, likely because it's running at 60Hz instead of the 85Hz I used to use on the CRT.

Also, it isn't really necessary to be rendering more frames than you can display anyway.
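
For anyone wondering what flipping VSync on actually does under the hood, here is a minimal sketch of the equivalent API call. It assumes SDL2 with OpenGL, which is purely my choice for illustration and not something anyone in this thread is using; the swap interval is what tells the driver to hold buffer swaps until the monitor's vertical retrace.

/* Minimal sketch of what the "enable VSync" switch amounts to in code.
 * Uses SDL2 + OpenGL purely for illustration: the swap interval tells
 * the driver whether buffer swaps should wait for the vertical retrace. */
#include <SDL.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
    (void)argc; (void)argv;

    if (SDL_Init(SDL_INIT_VIDEO) != 0) {
        fprintf(stderr, "SDL_Init failed: %s\n", SDL_GetError());
        return 1;
    }

    SDL_Window *win = SDL_CreateWindow("vsync demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        640, 480, SDL_WINDOW_OPENGL);
    if (!win) {
        fprintf(stderr, "window creation failed: %s\n", SDL_GetError());
        return 1;
    }
    SDL_GLContext ctx = SDL_GL_CreateContext(win);

    /* 1 = swap only on the vertical retrace (no tearing, FPS capped at
     * the refresh rate); 0 = swap immediately (possible tearing). */
    if (SDL_GL_SetSwapInterval(1) != 0) {
        fprintf(stderr, "VSync not available: %s\n", SDL_GetError());
    }

    /* A real render loop would go here; with the interval set to 1,
     * SDL_GL_SwapWindow(win) blocks until the next refresh. */

    SDL_GL_DeleteContext(ctx);
    SDL_DestroyWindow(win);
    SDL_Quit();
    return 0;
}

With the interval set to 0 instead, swaps happen immediately and you get the uncapped FPS (and the tearing) being described here.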
 

Lorn

Banned
Nov 28, 2004
Originally posted by: SynthDude2001
Mainly the point is to reduce image tearing caused by the monitor displaying parts of different frames at the same time. I notice this a lot more on my LCD than I did on my CRT too, likely because it's running at 60Hz instead of the 85Hz I used to use on the CRT.

Also, it isn't really necessary to be rendering more frames than you can display anyway.
Great, thank you.
 

JBT

Lifer
Nov 28, 2001
Tearing. I can't stand it. I don't understand how people can even play without it.
 

Genx87

Lifer
Apr 8, 2002
The only time I have ever had to use it was with Doom 3 and my 6800GT. I was getting really bad tearing that required VSync.

 

xtknight

Elite Member
Oct 15, 2004
The question is why NOT use it? The only time VSync should be off is if you are benchmarking. Otherwise, there is no effective difference in FPS and it prevents tearing. Corrected: if your card cannot keep up with the sync, you will notice a significant drop. However, in games where your card can produce more than [refresh rate] fps, there won't be any noticeable difference.
 

Cdubneeddeal

Diamond Member
Oct 22, 2003
I for one don't use VSync. Yes, the graphics seem better, but I noticed it's harder to turn and shoot (CS).
 

zakee00

Golden Member
Dec 23, 2004
Originally posted by: xtknight
The question is why NOT use it? The only time VSync should be off is if you are benchmarking. Otherwise, there is no effective difference in FPS and it prevents tearing.

Because Nvidia is retarded and the 6800s don't support triple buffering. AKA, if your card can't render frames fast enough to keep up with the refresh rate, the FPS drops down to about 30. With ATI's triple buffering, this doesn't happen. So it's either 60FPS@60Hz or 30FPS, with no in-between. See for yourself: start up a demanding game with VSync and watch the FPS.
With ATI, if your refresh rate is 60Hz and your card can't keep up with 60FPS, it will drop to 45, then to 30. There is an in-between step that makes it more playable. I cannot play CSS at 30FPS, so I leave VSync off at all times. I can live with tearing.
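
To put some numbers on that "no in-between" behaviour: with plain double-buffered VSync, a finished frame can only be flipped on a refresh boundary, so the displayed rate snaps to refresh/1, refresh/2, refresh/3 and so on. The little C program below just does that arithmetic for a 60Hz screen; the render rates are made-up examples, not measurements.

/* Back-of-the-envelope check of the stair-step effect with plain
 * double-buffered VSync on a 60Hz screen: a finished frame can only be
 * flipped on a refresh boundary, so it ends up occupying a whole number
 * of refresh cycles.  The render rates below are made-up examples. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double refresh_hz = 60.0;
    double render_fps[] = { 120.0, 61.0, 59.0, 45.0, 31.0, 29.0 };

    for (int i = 0; i < 6; i++) {
        double frame_time = 1.0 / render_fps[i];        /* seconds per frame     */
        double cycles = ceil(frame_time * refresh_hz);  /* refreshes it occupies */
        printf("card renders %5.0f fps -> screen shows %4.1f fps\n",
               render_fps[i], refresh_hz / cycles);
    }
    return 0;
}

At 60Hz the only possible outputs are 60, 30, 20, 15 and so on, which matches the 60-or-30 behaviour described above; a third buffer avoids the stall, which is presumably where the in-between numbers come from.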
 

eBauer

Senior member
Mar 8, 2002
I use it in a few instances depending on the game, but for the most part keep it disabled. I usually cap the framerate at 150 (with a refresh rate of 100Hz). I don't notice any tearing whatsoever, and to me mouse movement in games seems superior compared to leaving v-sync enabled.
 

Torghn

Platinum Member
Mar 21, 2001
What happens when your frame rate drops below your refresh rate and vsync is on? 40fps still looks smooth to me, but my refresh rate is at 75.
 

eBauer

Senior member
Mar 8, 2002
Try this: disable v-sync, go into Call of Duty (or any QIII engine game for that matter), open up the console, and type in

/seta com_maxfps 60

then compare it to:

/seta com_maxfps 80

/seta com_maxfps 100

/seta com_maxfps 150

For me personally, anything under 100 seems inferior when it comes to how fluid/smooth the mouse is (and this is coming from a notebook LCD with a refresh rate of 60Hz).
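
For what it's worth, a com_maxfps-style cap is conceptually just the engine sleeping off whatever is left of its frame budget rather than waiting on the monitor's retrace. The sketch below is a rough, generic illustration in C; it is not the actual Quake III code, and it assumes a POSIX system for the timing calls.

/* Rough, generic sketch of what a com_maxfps-style cap does: time the
 * frame, then sleep off whatever is left of the frame budget instead of
 * waiting on the monitor's retrace.  NOT the actual Quake III code;
 * assumes POSIX clock_gettime()/nanosleep(). */
#include <stdio.h>
#include <time.h>

static double ms_since(const struct timespec *a, const struct timespec *b)
{
    return (b->tv_sec - a->tv_sec) * 1000.0 +
           (b->tv_nsec - a->tv_nsec) / 1.0e6;
}

int main(void)
{
    const double max_fps   = 100.0;
    const double budget_ms = 1000.0 / max_fps;   /* 10 ms per frame at 100fps */

    for (int frame = 0; frame < 5; frame++) {
        struct timespec start, end;
        clock_gettime(CLOCK_MONOTONIC, &start);

        /* ... run game logic and render the frame here ... */

        clock_gettime(CLOCK_MONOTONIC, &end);
        double left_ms = budget_ms - ms_since(&start, &end);

        if (left_ms > 0.0) {             /* finished early: burn the rest */
            struct timespec nap;
            nap.tv_sec  = (time_t)(left_ms / 1000.0);
            nap.tv_nsec = (long)((left_ms - nap.tv_sec * 1000.0) * 1.0e6);
            nanosleep(&nap, NULL);
        }
        printf("frame %d paced to a %.1f ms budget\n", frame, budget_ms);
    }
    return 0;
}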
 

kylebisme

Diamond Member
Mar 25, 2000
Originally posted by: zakee00

Because Nvidia is retarded and the 6800s don't support triple buffering.

The 6800s support triple buffering fine, just like all Nvidia cards. The problem is the drivers simply don't enable it by default in D3D or let us force it in OpenGL, but triple buffering will work fine if the game has the option, such as in the .ini file of Unreal engine games. I'm not sure exactly what Nvidia has against tear-free image quality with smooth framerates, but they obviously don't care to support such a thing with their drivers.
 

ribbon13

Diamond Member
Feb 1, 2005
To the framerate people: you're human. You can't differentiate above 40fps, just like you can't hear above 21kHz. Refresh rates do matter because of 60Hz AC noise.
 

THUGSROOK

Elite Member
Feb 3, 2001
I use VSync on older games because if I didn't, they'd run at 1000+ fps. A lot of older games don't run correctly that fast.

Smooth as silk though, I tell ya :D
 

THUGSROOK

Elite Member
Feb 3, 2001
Originally posted by: ribbon13
To the framerate people: you're human. You can't differentiate above 40fps, just like you can't hear above 21kHz. Refresh rates do matter because of 60Hz AC noise.

go away before we put a dunce cap on your head
 

JonnyBlaze

Diamond Member
May 24, 2001
Originally posted by: THUGSROOK
Originally posted by: ribbon13
To the framerate people: you're human. You can't differentiate above 40fps, just like you can't hear above 21kHz. Refresh rates do matter because of 60Hz AC noise.

go away before we put a dunce cap on your head

lol

It's already there.

What does 21kHz have to do with 40fps?

If you have a halfway decent video card (or even if you don't), throw up an old game like Q3 or Q2, run it with VSync on, capped at 40, 60, 80, etc., and then try to tell me you can't notice the difference. I've done it all the way up to 160Hz and prefer 85+.

JB
 

Extrarius

Senior member
Jul 8, 2001
Originally posted by: xtknight
The question is why NOT use it? The only time VSync should be off is if you are benchmarking. Otherwise, there is no effective difference in FPS and it prevents tearing.
Not true. Let's pretend your monitor only does a 60Hz refresh rate. If you're getting 59FPS without VSync, that means 59 frames per second, with a little bit duplicated between refreshes to stretch it to the 60Hz. Each frame takes ~0.0169 (1/59) seconds to render and everything is fine. You turn on VSync. Now, each frame has ~0.0166 (1/60) seconds to render, but that isn't enough time. Thus, the frame renders and a vsync is missed, the game waits for the next vsync, and the frame is displayed. This process is repeated, giving each frame two vsync cycles since it can't render fast enough to only take one. You get 30FPS, which is only slightly more than half what you had without VSync.

VSync can easily make a HUGE difference in FPS regardless of your refresh rate, but then again it might not. It just depends on the refresh rate and the amount of time it takes to render the frame.
Triple buffering can help eliminate the framerate drop, but it also introduces an artifact where the image will lag input by some amount, which might be noticeable (even if you're not consciously aware of it, the game might 'feel weird').

As others have said, VSync can eliminate tearing, which is why some people consider it a good thing.
 

kylebisme

Diamond Member
Mar 25, 2000
Triple buffering just adds a third framebuffer for the chip to draw to instead of waiting for the backbuffer to clear on vsync; it doesn't add any latency over vsync.
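
To put rough numbers on the difference, here is a toy simulation in C. It assumes a constant render time of 1/45 of a second on a 60Hz screen and an idealized flip at every retrace, which are simplifications rather than a claim about any particular driver. With double buffering the renderer stalls until the flip and lands on 30fps; with a third buffer the chip keeps drawing, and all ~45 rendered frames still make it to the screen each second.

/* Toy model: how many *new* frames reach the screen per second at 60Hz
 * when the GPU needs 1/45 s per frame, with and without a third buffer.
 * Assumes constant render time and an ideal flip at every retrace. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double refresh     = 1.0 / 60.0;  /* 60Hz monitor      */
    const double render_time = 1.0 / 45.0;  /* GPU manages 45fps */
    const double duration    = 1.0;         /* simulate 1 second */

    /* Double buffering + VSync: the renderer can't start the next frame
     * until the finished one is flipped at a retrace, so every frame
     * occupies a whole number of refresh cycles. */
    int shown_double = 0;
    double t = 0.0;
    while (t < duration - 1e-9) {
        double finish = t + render_time;       /* frame finished          */
        t = ceil(finish / refresh) * refresh;  /* wait for the next flip  */
        shown_double++;
    }

    /* Triple buffering + VSync: the renderer never stalls; at each
     * retrace the newest completed frame (if any) is flipped. */
    int shown_triple = 0;
    long displayed = -1;                       /* frame index on screen   */
    for (int k = 1; k <= (int)(duration / refresh + 0.5); k++) {
        double v = k * refresh;                /* time of this retrace    */
        long newest = (long)floor(v / render_time + 1e-9) - 1;
        if (newest > displayed) {              /* a newer frame is ready  */
            displayed = newest;
            shown_triple++;
        }
    }

    printf("double buffering: %d new frames per second\n", shown_double);
    printf("triple buffering: %d new frames per second\n", shown_triple);
    return 0;
}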