Is vsync mostly for mediocre monitors?

Fern

Elite Member
Sep 30, 2003
26,907
174
106
I used to have a 22" ViewSonic Professional monitor that I lucked into for cheap. It died in a rainstorm a while back :brokenheart: I replaced it with an inexpensive monitor and find I now often need vsync.

The pro monitor did a 120Hz refresh rate at any resolution my 9800P could handle (16x12, or 12x10 for newer games). The replacement doesn't do much above 60Hz.

Am I correct in attributing this to the lame refresh rates of the new monitor?

Thanks,

Fern
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: Fern
I used to have a 22" ViewSonic Professional monitor that I lucked into for cheap. It died in a rainstorm a while back :brokenheart: I replaced it with an inexpensive monitor and find I now often need vsync.

The pro monitor did a 120Hz refresh rate at any resolution my 9800P could handle (16x12, or 12x10 for newer games). The replacement doesn't do much above 60Hz.

Am I correct in attributing this to the lame refresh rates of the new monitor?
Thanks,
Fern

I don't see how vsync being enabled or not would have anything to do with the monitor itself, since the effect is the same no matter which model CRT you are using. My guess is that the effect was more visible with a larger display area, or that the games you happen to play now, as compared to then, make a difference.

(Or that at higher refresh rates, any visible anomaly in single frames, since they were displayed for a shorter period of time, was not as perceptible to you. In other words, with a high enough refresh rate, perhaps any tearing due to lack of vsync wasn't very noticeable, because the display was "blurring together" due to persistence-of-vision effects. In effect, quite literally, your optical pathways were converting the spatially-displayed partial-frame info into effectively temporal partial-frame info. Maybe that's what BFG10K was experiencing, and why he didn't seem to notice vsync-off tearing issues. Hmm.)

PS. What were you doing playing an FPS on your computer out in the rain? LAN party in the woods or something? :p
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Kinda. Lower refresh rates will give you more tearing. If you have your monitor at 120Hz and your framerate at 120fps, you get one tear per refresh without vsync; whereas at 60Hz you will have two tears, and each refresh lasts twice as long as well. So yeah, lack of vsync will be more noticeable at lower refresh rates.
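
To put rough numbers on that (a back-of-the-envelope sketch of my own, assuming unsynchronized swaps spread evenly through each refresh):

```python
# Rough arithmetic, not from any post in this thread: without vsync,
# about fps / refresh_hz buffer swaps land inside each refresh, and
# each mid-refresh swap shows up on screen as one tear.

def tears_per_refresh(fps: float, refresh_hz: float) -> float:
    return fps / refresh_hz

for fps, hz in [(120, 120), (120, 60)]:
    print(f"{fps} fps @ {hz} Hz -> ~{tears_per_refresh(fps, hz):.0f} tear(s) per refresh, "
          f"each refresh on screen for {1000 / hz:.1f} ms")
```

Which matches the claim above: 120fps at 120Hz gives about one tear per 8.3ms refresh, while 120fps at 60Hz gives about two tears per 16.7ms refresh.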
 

Fern

Elite Member
Sep 30, 2003
26,907
174
106
Hmm... looks like I don't get this tearing thing.

OK, simply put: I understood tearing occurs when your framerate is above your refresh rate. Your monitor is not able to keep up with the frames rendered by your card.

Conversely, no tearing at slow fps, just a slideshow.

Thus, with less expensive monitors, and hence lower refresh rates (RR), you're more likely to have tearing. Conversely, with an expensive monitor offering very high refresh rates, no tearing.

But I'm seeing a contradiction/problem with my understanding. D3 is where I saw tearing, and I didn't have high fps there. It's capped at 60fps (and was often lower), so there was no way to get FPS above the RR.

Yet both of you seem to indicate a possible nexus between low RRs and tearing.

One reason this is now of interest to me is that I'm looking forward to a higher-end card than my 9800P. I am curious if this will increase any potential tearing problems, i.e. elevated FPS while stuck with the same low RRs.

Would it be correct to say that there is always some form of tearing when your RR and FPS are NOT synced, regardless of which is higher/lower? If so, is one form/type/cause of tearing worse than the other? I.e., is tearing not as bad/noticeable when the RR is higher than the FPS (which both of you seem to indicate could be true, for the reasons you outlined), while the tearing which occurs when FPS is higher than RR is worse/more noticeable?

Thanks guys.

VL, had a little "roof leakage" during the hurricanes (either Ivan or Fran, got hammered by both). Also, I have not seen BFG10K's post.
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
Tearing occurs when the video card supplies too many frames to the monitor. The monitor draws frame 1 and then begins to draw frame 2. What do you see?

Say a pillar is moving from left to right. The top part of the monitor will show the scene one frame ahead of the bottom, because the bottom hasn't been drawn yet; the bottom lags behind the top.

Well, I can't seem to put spaces in it, so I put underscores in front of the torn part.

(frame 2 starting, ahead of frame 1)
______|---------------------|
______|---------------------|
______|---------------------|
______|---------------------|
|---------------------|
|---------------------|
|---------------------|
|---------------------|
|---------------------|
|---------------------|
|---------------------|
|---------------------|
(frame 1)
(pillar as you see it)

As you can see in my "beautiful" drawing, the top part is drawn first and looks ahead of the bottom. So not sending enough information to the monitor doesn't have any effect on it. This is an easy way to remember it: the vertical shape of the pillar isn't synchronized with the bottom portion, so VSync wasn't enabled. That's why enabling VSync, or capping FPS, prevents tearing: it sends no more than what the monitor can handle. It will still occur equally at higher refresh rates. If the video card sends 200 frames to the monitor in a second, the monitor rushes to draw them, whether at 60 Hz or 120 Hz vertical; it still can't handle 200 FPS. But because the torn frames are shown for only 1/120th of a second, it would be half as noticeable as at 1/60th (2/120ths) of a second. I hope this clarifies the confusion. It took a little bit of thought.
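
To make the pillar drawing concrete, here is a tiny sketch of my own (illustrative numbers, not from the post) that redraws it for whatever scanline the beam has reached when the frames change:

```python
# Redraw the pillar diagram: scanout is partway through drawing
# frame 2 (pillar shifted right), so lines above the beam show frame 2
# while everything below still shows frame 1 from the last refresh.
# BEAM_AT and the pillar offsets are made-up illustrative values.

LINES = 12           # scanlines on our toy display
BEAM_AT = 4          # how far scanout has gotten into frame 2
PILLAR_F2 = 6        # frame 2's pillar x offset (newer, further right)
PILLAR_F1 = 0        # frame 1's pillar x offset

for line in range(LINES):
    offset = PILLAR_F2 if line < BEAM_AT else PILLAR_F1
    print(" " * offset + "|---------------------|")
```

Same picture as above, just with spaces instead of underscores.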
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Not exactly. Tearing is simply the visual result of frames being swapped mid-refresh, so it will happen regardless of whether your framerate is greater than your refresh rate, less than your refresh rate, or even equal to it but not synchronized. A monitor refreshes line by line, left to right, from top to bottom; vsync ensures that the monitor is always allowed to display a full frame each refresh before the video card can swap in a new one. Without vsync, the video card swaps in a new frame just as soon as it has one ready, so you wind up with the monitor displaying parts of two or more separate frames during a single refresh. Obviously, the lower your framerate the less this will happen, but even at 1fps you are going to get a tear every second when running without vsync.
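
That point, that even low framerates tear, can be sanity-checked with a little idealized simulation (my own sketch; it assumes perfectly even frame times with a small phase offset so swaps never coincide with vblank):

```python
# Count how many refreshes in one second contain a mid-refresh buffer
# swap (i.e., are torn) on a 60 Hz display, for several framerates.
# Idealized model of my own, not from the thread.

def torn_refreshes_per_second(fps: int, hz: int = 60, phase: float = 0.001) -> int:
    refresh = 1.0 / hz
    swap_times = [phase + i / fps for i in range(fps)]        # one swap per frame
    torn = {int(t / refresh) for t in swap_times if t < 1.0}  # refresh index of each swap
    return len(torn)

for fps in (1, 30, 60, 120):
    print(f"{fps:>3} fps @ 60 Hz -> {torn_refreshes_per_second(fps)} torn refresh(es) per second")
```

Even at 1fps there is one torn refresh per second, and once the framerate reaches the refresh rate, every single refresh is torn.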
 

Fern

Elite Member
Sep 30, 2003
26,907
174
106
If the video card sends 200 frames to the monitor in a second, the monitor rushes to draw them, whether at 60 Hz or 120 Hz vertical. It still can't handle 200 FPS.

Is there a term which describes how fast a monitor can render frames?

Or are some monitors capable of rendering faster? If so, how can one tell from their specs?

Or are all monitors created equal, so to speak, with respect to this function?

Thanks guys

Fern
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: Fern
Would it be correct to say that there is always some form of tearing when your RR and FPS are NOT synced?
Yes. Exactly.

Originally posted by: Fern
VL, had a little "roof leakage" during the huricanes (either Ivan or Fran, got hammered by both).
Oh, sorry to hear that, didn't mean to make light of your troubles. (It just seemed strange that a CRT could get "rained on", since they are normally only used indoors.)
 

KoolDrew

Lifer
Jun 30, 2004
10,226
7
81
Is it best to have vsync enabled for gaming? I have heard it is best to enable it for gaming, but to disable it when benchmarking.
 

Fern

Elite Member
Sep 30, 2003
26,907
174
106
Originally posted by: KoolDrew
Is it best to have vsync enabled for gaming? I have heard it is best to enable it for gaming, but to disable it when benchmarking.

I didn't have it on before Doom 3, and never saw tearing before. But I played D3 on this crap monitor. I'd say don't use it unless you get tearing and it bothers you. Some say it doesn't bother them.

It bothered me. The effect was like the monitor glass wasn't flat. You know the really old hand-made panes of glass in old windows? As you move around and look through 'em you get distortions.

As for benching: no point enabling vsync. In my case all benchies would max out at 60fps. It'll just lower your score.

If you're benching actual games, games where you'll need vsync enabled, it's maybe informative to leave it on. But then again, it'll influence your score downward. If I'm tweaking for the game, though, I'd prolly leave it off. To me it seems an "artificial" influence.

I'm thinking I must look pretty cheeky. Here I'm the guy asking for advice in this thread, and yet I dare to offer some ;)

Fern
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,003
126
Am I correct in attributing this to the lame refresh rates of the new monitor?
Yup. Assuming an identical framerate between two displays, the display with the lower refresh rate is the one most likely to experience tearing at any given time.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Turn it off 24/7. Having it on gets rid of artifacts, but your fps will drop by a good bit. (Not just the maximum either; your average fps will drop with vsync on.)
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: BFG10K
Am I correct in attributing this to the lame refresh rates of the new monitor?
Yup. Assuming an identical framerate between two displays, the display with the lower refresh rate is the one most likely to experience tearing at any given time.

Well, if they are mismatched and vsync isn't enabled, then tearing will happen regardless, but it will likely be far more perceptible at lower refresh rates. (I'm not directly disagreeing, just restating with a little more detail.)
 

Fern

Elite Member
Sep 30, 2003
26,907
174
106
Originally posted by: Fern
If the video card sends 200 frames to the monitor in a second, the monitor rushes to draw them, whether at 60 Hz or 120 Hz vertical. It still can't handle 200 FPS.

Is there a term which describes how fast a monitor can render frames?

Or are some monitors capable of rendering faster? If so, how can one tell from their specs?

Or are all monitors created equal, so to speak, with respect to this function?

Thanks guys

Fern


Can I draw your attention to this line of questions?

I'm afraid it might get overlooked.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: dguy6789
Turn it off 24/7. Having it on gets rid of artifacts, but your fps will drop by a good bit. (Not just the maximum either; your average fps will drop with vsync on.)

Well, sure, your average is going to go down if your max goes down; but as long as you can use triple buffering without exceeding your memory limit, your minimum fps will be the same either way. Personally, I don't like tearing, so I tend to leave it on unless there is an issue, like the lack of triple buffering in OpenGL with Nvidia drivers.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: Fern
Originally posted by: Fern
If the video card sends 200 frames to the monitor in a second, the monitor rushes to draw them, whether at 60 Hz or 120 Hz vertical. It still can't handle 200 FPS.
Is there a term which describes how fast a monitor can render frames? Or are some monitors capable of rendering faster? If so, how can one tell from their specs? Or are all monitors created equal, so to speak, with respect to this function?
Thanks guys
Fern
Can I draw your attention to this line of questions?
I'm afraid it might get overlooked.
Sorry, thought that it was obvious. Monitors don't actually "render" anything; they display a continuous analog RGB signal, with associated sync signals that tell the monitor when to do a horizontal or vertical retrace and start drawing all over again.

I guess I assume that what you are asking about is simply the refresh-rate of the display, which is controlled by the vertical-sync signal.

What the other poster wrote was slightly incorrect: the video card does not "send" frames at 200FPS. The video card sends a continuous analog signal and the appropriate sync signals, and the monitor has a set of sync PLLs that "lock on" to the signal sent by the video card. So if the video card sets a 1024x768 @ 60Hz video mode, then those vertical sync signals get sent sixty times a second, and that's the display frame rate.

Now, the GPU may be able to render scenes at 200FPS to a back buffer stored in memory on the video card, but if the display device (based on the sync signals) is refreshing at 60Hz, then 60 (total) frames is all that will be displayed, period. This happens irrespective of whether vsync is enabled or not.

What vsync does is determine whether each of those displayed frames is made up of a single whole rendered scene (vsync enabled) or multiple bands, each a small snippet of one rendered scene (vsync disabled, and rendering frame rate higher than display frame rate). Any other side effects, such as differences in input response, rendering frame rate, etc., are side effects of the game/software engine, and not a (direct) result of vsync settings.
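
In loop form, that distinction might look like this (a toy model of my own of plain double buffering; the function names are stand-ins, not any real graphics API):

```python
# Toy model: scanout always reads the front buffer at a fixed refresh
# rate; vsync only gates *when* the back buffer may replace it.

import time

REFRESH_HZ = 60
VBLANK_PERIOD = 1.0 / REFRESH_HZ

def render_scene_to_back_buffer(n: int) -> None:
    time.sleep(0.005)        # stand-in for ~5 ms of GPU rendering work

def swap_buffers() -> None:
    pass                     # stand-in: back buffer becomes the front buffer

def render_loop(vsync_enabled: bool, frames: int = 5) -> None:
    next_vblank = time.monotonic() + VBLANK_PERIOD
    for n in range(frames):
        render_scene_to_back_buffer(n)     # can finish at any point in the refresh
        if vsync_enabled:
            # Hold the swap until the next vertical blank, so scanout
            # only ever reads whole frames: no tearing, at the cost of
            # waiting out the rest of the refresh.
            now = time.monotonic()
            while next_vblank <= now:      # skip vblanks a slow frame missed
                next_vblank += VBLANK_PERIOD
            time.sleep(next_vblank - now)
            next_vblank += VBLANK_PERIOD
        # Without vsync this swap can land mid-scanout, so one refresh
        # shows bands from two different rendered scenes: a tear.
        swap_buffers()

render_loop(vsync_enabled=True)
```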
 

xtknight

Elite Member
Oct 15, 2004
12,974
0
71
My apologies for the errors... I'm not an expert in graphics, but I was trying to explain it in a simple way.

Why do I seem to get tearing even when I do enable VSync? I have it forced in the control panel and selected "sync every frame" in-game, and I can strafe and still see tearing. My other PC seems to be fine with VSync enabled. A driver issue perhaps?

Also... am I right that vertical refresh rate has nothing to do with LCDs? There aren't any scanning electron beams in an LCD, so what does it mean when I can select 60 Hz and 75 Hz under Windows?
 

obes4k

Member
Feb 18, 2005
30
0
0
While I noticed it seems to be less of an issue on LCDs, it DOES still occur. I noticed it in Doom 3, one of the games most notorious for tearing.

My question is: does using DVI vs. analog make any difference in tearing? DVI is obviously the route of choice if possible, but I'm just curious. Any of you techies know?
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Originally posted by: TheSnowman
Originally posted by: dguy6789
Turn it off 24/7. Having it on gets rid of artifacts, but your fps will drop by a good bit. (Not just the maximum either; your average fps will drop with vsync on.)

Well, sure, your average is going to go down if your max goes down; but as long as you can use triple buffering without exceeding your memory limit, your minimum fps will be the same either way. Personally, I don't like tearing, so I tend to leave it on unless there is an issue, like the lack of triple buffering in OpenGL with Nvidia drivers.

No, maybe average was the wrong word. Say your refresh rate is at 100, and your fps in one spot is 70 without vsync. Turn vsync on, and that 70 will drop even further, regardless of your maximum.

The difference between vsync on and off can be more than 10fps. Do a benchmark: pick some spot in some level of a game that is a good bit below your refresh rate, then check the fps with vsync off and with it on. Turning vsync on not only limits your max fps, it hinders your performance in general.
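
That drop has a concrete cause with plain double buffering: a finished frame has to wait for the next vblank before the card can start on the one after it, so the displayed rate snaps down to refresh/1, refresh/2, refresh/3, and so on. A sketch of the arithmetic (my own, assuming a strict double-buffered model):

```python
# With double-buffered vsync, a frame that takes even slightly longer
# than one refresh interval waits for the *next* vblank, so the
# displayed rate quantizes to refresh_hz / n. My own strict model.

import math

def vsynced_fps(render_fps: float, refresh_hz: float) -> float:
    frame_time = 1.0 / render_fps
    refresh = 1.0 / refresh_hz
    intervals = math.ceil(frame_time / refresh)   # whole refreshes per frame
    return refresh_hz / intervals

print(vsynced_fps(70, 100))   # 50.0 -> dguy6789's example: 70 fps falls to 50
print(vsynced_fps(59, 60))    # 30.0 -> just missing one refresh halves the rate
```

So the "more than 10fps" difference is plausible: at 100Hz, anything between 51 and 99fps gets pulled down to 50.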
 

VirtualLarry

No Lifer
Aug 25, 2001
56,587
10,225
126
Originally posted by: dguy6789
Originally posted by: TheSnowman
Originally posted by: dguy6789
Turn it off 24/7. Having it on gets rid of artifacts, but your fps will drop by a good bit. (Not just the maximum either; your average fps will drop with vsync on.)

Well, sure, your average is going to go down if your max goes down; but as long as you can use triple buffering without exceeding your memory limit, your minimum fps will be the same either way. Personally, I don't like tearing, so I tend to leave it on unless there is an issue, like the lack of triple buffering in OpenGL with Nvidia drivers.

No, maybe average was the wrong word. Say your refresh rate is at 100, and your fps in one spot is 70 without vsync. Turn vsync on, and that 70 will drop even further, regardless of your maximum.

The difference between vsync on and off can be more than 10fps. Do a benchmark: pick some spot in some level of a game that is a good bit below your refresh rate, then check the fps with vsync off and with it on. Turning vsync on not only limits your max fps, it hinders your performance in general.

Turning on triple buffering (if supported by the app) should fix that problem.
 

dguy6789

Diamond Member
Dec 9, 2002
8,558
3
76
Originally posted by: VirtualLarry
Originally posted by: dguy6789
Originally posted by: TheSnowman
Originally posted by: dguy6789
Turn it off 24/7. Having it on gets rid of artifacts, but your fps will drop by a good bit. (Not just the maximum either; your average fps will drop with vsync on.)

Well, sure, your average is going to go down if your max goes down; but as long as you can use triple buffering without exceeding your memory limit, your minimum fps will be the same either way. Personally, I don't like tearing, so I tend to leave it on unless there is an issue, like the lack of triple buffering in OpenGL with Nvidia drivers.

No, maybe average was the wrong word. Say your refresh rate is at 100, and your fps in one spot is 70 without vsync. Turn vsync on, and that 70 will drop even further, regardless of your maximum.

The difference between vsync on and off can be more than 10fps. Do a benchmark: pick some spot in some level of a game that is a good bit below your refresh rate, then check the fps with vsync off and with it on. Turning vsync on not only limits your max fps, it hinders your performance in general.

Turning on triple buffering (if supported by the app) should fix that problem.

It makes the problem less apparent, but does not get rid of the problem.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
No, it gets rid of it by providing a second back buffer, so the card can continue to render just as if there were no vsync, while the front buffer is still only swapped on vsync. You can test this if you like with any benchmark where you can cap your framerate, such as UT2004. Set your max framerate at your refresh rate and then bench without vsync; then turn on vsync and triple buffering and benchmark again. Unless the extra frame buffer runs you out of VRAM, your framerate will be the same either way.
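
Extending the same toy arithmetic from earlier with a third buffer shows the point in numbers (my own idealized model; it assumes the extra back buffer always absorbs the stall):

```python
# With triple buffering the GPU renders into whichever back buffer is
# free instead of stalling on the swap, so render throughput matches
# the no-vsync case; the front buffer still only flips on vblank.

import math

def displayed_fps(render_fps: float, refresh_hz: float, triple: bool) -> float:
    if triple:
        # No stall: you simply see min(render rate, refresh rate).
        return min(render_fps, refresh_hz)
    # Double buffering: each frame waits out whole refresh intervals.
    return refresh_hz / math.ceil(refresh_hz / render_fps)

for fps in (70, 100, 150):
    two = displayed_fps(fps, 100, triple=False)
    three = displayed_fps(fps, 100, triple=True)
    print(f"render {fps} fps @ 100 Hz: double-buffered vsync -> {two:.0f} fps, "
          f"triple-buffered vsync -> {three:.0f} fps")
```

Which is exactly the UT2004 test described above: capped at the refresh rate, the triple-buffered vsync numbers come out the same as with no vsync at all.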


Originally posted by: xtknight

Why do I seem to get tearing even when I do enable VSync? I have it forced in the control panel and selected "sync every frame" in-game, and I can strafe and still see tearing. My other PC seems to be fine with VSync enabled. A driver issue perhaps?

Yeah, I have seen drivers go screwy like that before; uninstalling them and installing them again tends to fix it.


Originally posted by: xtknight
Also... am I right that vertical refresh rate has nothing to do with LCDs? There aren't any scanning electron beams in an LCD, so what does it mean when I can select 60 Hz and 75 Hz under Windows?

Well, they still operate on a refresh rate, but due to the time it takes for the pixels to rise and fall, it isn't nearly the issue it is with CRTs.