Why use VSync?

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
So in that sense, at least with vsync enabled, IF you have enough VRAM, triple-buffer is still a win, performance-wise.
Yes, if you have enough. I'm referring to the situation of data swapping between main system RAM and VRAM, something that's more likely on a triple-buffered (TB) system than a double-buffered (DB) one, especially at high resolution/detail settings.

I'd much rather give up the third buffer, turn off vsync and raise the image quality (see my last point below).

no refresh rate yet"? Forgive me, but I can't quite parse that, and I don't want to make an assumption about what you mean. Could you clarify?
Simply that the refresh scanline is in the middle of the update (i.e. it's not synced, so it's not ready to take another frame).

but I think that you are wrong about the "stalls"; when combining triple-/N-buffering with vsync, and even with double-buffering without vsync, frames are NOT displayed "immediately" - they are still drawn to an off-screen buffer and page-flipped, resulting in at least *some* latency between the rendered scene and the displayed scene.
Yes but a vsynced system will only display full frames so it will never display the partial in-between frames that are generated over and above the refresh rate. Two partial frames blended into one refresh cycle provide more information than just a single full frame.

Watch the scenes as they are drawn! The ultimate in low-latency display updates!
I think it's generally accepted that double buffering is the bare minimum required for smooth animation, even for basic 2D work. Single buffered animation just causes too much flickering and hence needs to be buffered off screen to fix the issue.

But by the same token, the user would have also been missing part of the frame too. (Since it was overwritten during the display update with a partial frame of the next scene.)
True, but partial frames still convey more information than no frames.

Again, rendering frame-rate, or display frame-rate?
Display.

That is also why I suggested in an earlier post that if one prefers vsync disabled, then one would likely also prefer all the eye-candy (AA/AF, etc.) to be disabled too, since they obviously preferred speed over quality.
I actually do both; rather than impose artificial caps with vsync I use quality settings to naturally bring down the frame rate if I've got power to burn.

PS. Just as a curiosity, have you ever played Quake3 with vsync disabled, on a modern machine? I haven't; I'm kind of curious what it looks like myself, since it used to often be used for benchmarks, and would usually get some insane numbers like 400+ FPS.
I never use vsync and I don't tend to ever retire games either (heck, I'll still play GLQuake whenever I feel like it).

As for how I play Quake 3, I currently run it at 1920x1440, 16x trilinear AF and 4xAA on a 6800U, and that yields average framerates of around 120-150 FPS. It looks and runs great, and again, tearing is extremely rare.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,205
126
Originally posted by: BFG10K
Yes but a vsynced system will only display full frames so it will never display the partial in-between frames that are generated over and above the refresh rate. Two partial frames blended into one refresh cycle provide more information than just a single full frame.
In terms of sensing motion, yes, I do agree with that.

Originally posted by: BFG10K
Watch the scenes as they are drawn! The ultimate in low-latency display updates!
I think it's generally accepted that double buffering is the bare minimum required for smooth animation, even for basic 2D work. Single buffered animation just causes too much flickering and hence needs to be buffered off screen to fix the issue.
Sure. I guess I just pointed that out to make the case that the difference between vsync/no-vsync is very similar to the single-/double-buffering one, although not as pronounced in the absolute sense. (Since the buffer-clears in the single-buffering case would make the entire image flicker pretty badly, I would assume.)

Originally posted by: BFG10K
But by the same token, the user would have also been missing part of the frame too.
True, but partial frames still convey more information than no frames.
I guess the desirability of that comes down to whether the viewer is more interested in the static (per-frame) image information, or the partial-scene motion information. It could be misleading too, as there is no direct control over where the separation between the partial scenes occurs on the display device (the "tearing" line). So the image that the user is actually seeing from the display device could be variably one or two (or more) frames old. That could also be a drawback: instead of tracking your own motion through the world (in an FPS), you could be tracking someone in your target sights who is moving, and you have to lead ahead by a few frames to hit them. Quick, is he one or two frames ahead? How would you know? You would have to take into account whether or not the target image was above or below the "tearing line". At least with vsync on, you will have a consistent (generally) frame-rate/frame-latency for targeting.

Originally posted by: BFG10K
That is also why I suggested in an earlier post that if one prefers vsync disabled, then one would likely also prefer all the eye-candy (AA/AF, etc.) to be disabled too, since they obviously preferred speed over quality.
I actually do both; rather than impose artificial caps with vsync I use quality settings to naturally bring down the frame rate if I've got power to burn.
You know that I'm going to have to disagree with that again. Didn't we just get done discussing why enabling vsync, does NOT "impose artificial caps" on your (rendering) frame-rate? (Well, it doesn't have to, although in some simplistic (and semi brain-dead) game engine implementations they can be dependent upon one another. But in the general sense it doesn't force that to be true.)

It does eliminate additional partial-frame updates from being displayed, though. Honestly, the best solution is to increase your display device's frame-rate to match your rendering frame-rate as closely as possible, and leave vsync enabled. (Well, in my opinion.) Or get a card whose drivers can implement true temporal AA, as you will see all of the visual information generated (rendered) at, say, 90 FPS, but temporally blended (smoothly) into 60 displayed frames. (Kind of ironic that I would suggest that, compared to how many people complain about ghosting in FPS games when using an LCD. I personally find "ghosting" to be useful for targeting moving objects, though. It's not always a bad thing IMHO.)
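
To put rough numbers on the stall argument, here's a quick sketch in Python (a simplified model: it assumes a constant per-frame render time, no driver render-ahead queue, and the "keep rendering, discard the stale frame" style of triple buffering we've been assuming; real drivers vary):

    import math

    def double_buffer_vsync_fps(render_fps, refresh_hz):
        # With only one back buffer, the GPU must wait for the vblank flip
        # before it can start the next frame, so the effective render rate
        # quantizes to an integer divisor of the refresh rate.
        return refresh_hz / math.ceil(refresh_hz / render_fps)

    def triple_buffer_vsync_fps(render_fps, refresh_hz):
        # With a second back buffer the GPU always has somewhere to draw, so
        # it renders at its natural rate while the display still flips whole
        # frames at (at most) the refresh rate.
        return render_fps, min(render_fps, refresh_hz)

    print(double_buffer_vsync_fps(90, 60))   # 60.0 (rendering held back by the stall)
    print(double_buffer_vsync_fps(55, 60))   # 30.0 (just missing a refresh halves it)
    print(triple_buffer_vsync_fps(90, 60))   # (90, 60): renders 90, displays 60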

Originally posted by: BFG10K
PS. Just as a curiosity, have you ever played Quake3 with vsync disabled, on a modern machine? I haven't; I'm kind of curious what it looks like myself, since it used to often be used for benchmarks, and would usually get some insane numbers like 400+ FPS.
I never use vsync and I don't tend to ever retire games either (heck, I'll still play GLQuake whenever I feel like it).
I guess that's a... "yes"? Assuming a display refresh-rate of 75Hz, that would mean that each displayed frame would contain ~5.3 separate "slices" of different rendered frames (at a rendering frame-rate of 400FPS), and whenever you moved horizontally, the onscreen image would shift, like a sliding ziggurat almost. Do you see something like that?
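
Rough math, assuming an evenly paced renderer (the occasional flip that lands in the vertical blanking interval goes unseen):

    def tears_per_refresh(render_fps, refresh_hz):
        # With vsync off the driver flips to each newly finished frame in
        # mid-scan, so on average about render_fps / refresh_hz flips land
        # inside one refresh, each one leaving a tear line, and the refresh
        # ends up stitched together from roughly that many slices of
        # different frames.
        return render_fps / refresh_hz

    print(tears_per_refresh(400, 75))   # ~5.3 flips/tear lines per refresh
    print(tears_per_refresh(90, 60))    # ~1.5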

Originally posted by: BFG10K
As for how I play Quake 3, I currently run it at 1920x1440, 16x trilinear AF and 4xAA on a 6800U, and that yields average framerates of around 120-150 FPS. It looks and runs great, and again, tearing is extremely rare.
Hm.. I wonder.. could you have the game configured with vsync disabled, and the video driver internally enabling vsync (for display), and doing the video-driver-buffering/render-ahead thing? At those frame-rates, I wonder if one would even notice if that were true? Btw, what refresh-rate do you set your display to, when you have rendering frame-rates of 120-150 FPS?

I will grant you another point, though - for those games that do cap rendering frame-rates to display frame-rates with vsync on, allowing higher rendering frame-rates in terms of in-game motion can definitely be an advantage for FPS-type games.

But that's due to the game engine's (poor) design, not anything that is directly and inherently the fault of vsync itself.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,205
126
Btw, in terms of the extra VRAM that triple-buffering takes up - using your Quake3 example, assuming 1920x1440 @ 32bpp, that's ~11MB of data, just for one frame's buffer. For a more "normal" gaming resolution, say 1024x768 @ 32bpp, it's only ~3MB. On a 256MB card (one would assume that someone who plays FPS games at 1920x1440 has a pretty powerful card), even the ~11MB case is still only ~1/25th of the card's VRAM extra, in exchange for smoother overall frame-rates, because the GPU doesn't have to stall.
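
As a quick sanity check on those numbers (assuming 4 bytes per pixel and decimal megabytes):

    def color_buffer_mb(width, height, bytes_per_pixel=4):
        # one 32bpp color buffer - the extra VRAM cost of the third buffer
        return width * height * bytes_per_pixel / 1e6

    print(round(color_buffer_mb(1920, 1440), 1))   # ~11.1 MB
    print(round(color_buffer_mb(1024, 768), 1))    # ~3.1 MB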
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
You would have to take into account whether or not the target image was above or below the "tearing line".
Not really. You're really overcomplicating things, especially about something that just doesn't happen that often.

At least with vsync on, you will have a consistent (generally) frame-rate/frame-latency for targeting.
With vsync on you will never see anything more than full frames that are tied to the refresh rate. That's a disadvantage since you can't see any of the data in between. Not to mention that input response (especially mouse) is far better when framerate caps such as vsync are removed, because more frame information is being displayed.

Didn't we just get done discussing why enabling vsync, does NOT "impose artificial caps" on your (rendering) frame-rate?
I can't really tell if you're nitpicking or not. I thought it's quite clear when we're discussing framerate we mean the GPU side, not the game's side. We also mean displayed framerate, as in visual feedback.

Besides, a limit of X frames per second, where X is your monitor's refresh rate, is a cap, regardless of the type of FPS.

and whenever you moved horizontally, the onscreen image would shift, like a sliding ziggurat almost. Do you see something like that?
Not really. I might play an hour and see 2-3 tears at the most and even when they happen they're so faint I barely see them.

I think you have this strange idea that a non-vsynced system plays like a flickering 50 Hz fluorescent light or something. That just doesn't happen.

Hm.. I wonder.. could you have the game configured with vsync disabled, and the video driver internally enabling vsync (for display), and doing the video-driver-buffering/render-ahead thing?
Vsync is off in the driver and in the game but it doesn't really matter about the game as the driver overrides it anyway. com_maxFPS is off too.

Btw, what refresh-rate do you set your display to, when you have rendering frame-rates of 120-150 FPS?
73 Hz.

assuming 1920x1440 @ 32bpp, that's ~11MB of data,
For a 2D desktop image. Add a 32 bit Z/stencil to it and that's 21 MB, then start adding the likes of AA buffers and it soon balloons out of control.
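
Ballpark figures, assuming a 4-byte depth/stencil surface and that multisampling scales both the color and depth buffers by the sample count (real drivers use compression and shared surfaces, so treat these as rough upper bounds):

    def frame_buffers_mb(width, height, msaa_samples=1, bytes_per_pixel=4):
        # color target plus a 24/8 depth/stencil target, both scaled by the
        # multisample count
        color = width * height * bytes_per_pixel * msaa_samples
        depth_stencil = width * height * 4 * msaa_samples
        return (color + depth_stencil) / 1e6

    print(round(frame_buffers_mb(1920, 1440), 1))                  # ~22.1 MB, no AA
    print(round(frame_buffers_mb(1920, 1440, msaa_samples=4), 1))  # ~88.5 MB at 4xAA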
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: BFG10K


assuming 1920x1440 @ 32bpp, that's ~11MB of data,
For a 2D desktop image. Add a 32 bit Z/stencil to it and that's 21 MB, then start adding the likes of AA buffers and it soon balloons out of control.

Rendering the image takes more memory, sure, but holding the extra frame in the third buffer doesn't cost any more than a 2D image of the same size and color depth.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,205
126
Originally posted by: BFG10K
You would have to take into account whether or not the target image was above or below the "tearing line".
Not really. You're really overcomplicating things, especially about something that just doesn't happen that often.
Are you saying that "tearing" doesn't happen very often, even with vsync disabled? Maybe you just don't notice it.

If you claim that being able to see an "in-between" rendered frame, as a partial display frame, is so critical that you must disable vsync, then how can you at the same time argue against the very same sort of thing being timing-critical, as to whether or not the partial-display-frame image that you are seeing, is one (rendering) frame "behind" or two? You are illogically arguing against yourself.

IOW, it's so critical that you see that information, as a partially-displayed frame, and yet, it's not at all critical as to which rendered scene that partial-frame's worth of information belongs to? If it truly weren't important, then what would be wrong with enabling vsync and simply seeing that information during the next (whole) display frame? You can't have that argument both ways.

I'm not overcomplicating things at all, I'm simply extending the same argument that you made originally.

Originally posted by: BFG10K
At least with vsync on, you will have a consistent (generally) frame-rate/frame-latency for targeting.
With vsync on you will never see anything more than full frames that are tied to the refresh rate. That's a disadvantage since you can't see any of the data in between. Not to mention that input response (especially mouse) is far better when framerate caps such as vsync are removed, because more frame information is being displayed.
Didn't we just get through discussing why that is not necessarily the case? Sigh. Input latency is not (directly) related to rendering frame-rate, at least in most game engines. But at least with vsync enabled, input latency is consistent, (at least in the absence of GPU rendering lag), and not per-display-frame variable as it would be in the presence of tearing/lack of vsync.

Originally posted by: BFG10K
Didn't we just get done discussing why enabling vsync, does NOT "impose artificial caps" on your (rendering) frame-rate?
I can't really tell if you're nitpicking or not. I thought it's quite clear when we're discussing framerate we mean the GPU side, not the game's side. We also mean displayed framerate, as in visual feedback.
I was clarifying for readers, because obviously, the display frame-rate is fixed (at your CRT's refresh rate), so in terms of capping frame-rates, that could only refer to the rendering frame-rate. I have no idea what you mean by "the game's side", that's far too vague to have any meaning. I've been trying to stick to specifics here, to ensure clarity. But enabling vsync does not impose any caps whatsoever on rendering frame-rate. (Well, as long as something more than just double-buffering is used, that is.) If you still believe that after this discussion, then I'm sorry. I tried to help you understand that.

Strangely, I'm starting to get the feeling, that you choose to limit the game's ability to render continuously, by forcing double-buffering only, because you believe that it cuts down on input latency, and then in the same breath you demonize vsync, because when limited by double-buffering, I suppose most simply-written game engines might well start to limit the rendering frame-rate to the display frame-rate. (Even though there are ways around it, it is often the easiest implementation.) Yet you fail to realize that you brought that upon yourself by disabling the game's ability to do triple-/N-buffering, and also fail to understand that most game engines that support both have a similar input latency whether you are using double- or triple-buffering. Am I getting close here to understanding your POV?

This would be the case with game engines in which the "game tick" is not internally independent, but slaved to the display frame-rate somehow. This was mostly true with 2D scrolling games, especially those designed to run on a system with a fixed display frame-rate like console games. But most 3D FPS games on the PC are not like that, AFAIK, and have their own internal tick-rate independent of the display's frame-rate.

(Quake 3 is well-known to be "broken" in this regard, in that changes to your display frame-rate affect the rendering frame-rate, and thus also affect the game's physics engine. That's a good example of poor game-engine design. In contrast, Doom3 fixed this deficiency by implementing an internal game tick rate, independent of the rendering or display frame-rate. link)
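
For anyone unfamiliar with the idea, here is a minimal sketch of the usual fixed-timestep pattern (the update/render stubs are hypothetical placeholders, not code from either engine): the simulation advances in fixed increments no matter how fast frames are rendered or displayed.

    import time

    TICK = 1.0 / 60.0      # fixed simulation step, independent of display rate

    def update(dt):        # hypothetical stand-in for a physics/game-logic step
        pass

    def render(alpha):     # hypothetical stand-in for the renderer; alpha is the
        pass               # interpolation fraction between the last two sim states

    def game_loop(run_seconds=2.0):
        accumulator = 0.0
        previous = time.monotonic()
        end = previous + run_seconds
        while time.monotonic() < end:
            now = time.monotonic()
            accumulator += now - previous
            previous = now
            while accumulator >= TICK:    # the game tick advances at a fixed rate...
                update(TICK)
                accumulator -= TICK
            render(accumulator / TICK)    # ...while rendering runs as fast as it can

    game_loop()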

Originally posted by: BFG10K
Besides, a limit of X frames per second, where X is your monitor's refresh rate, is a cap, regardless of the type of FPS.
The display's frame-rate is fixed, not capped. Capped implies variability, up to some maximum. That also implies that it could drop below it as well. Even if one were to accept that the terminology of "capped" was correct when referring to the fixed frame-rate of the display device, it would not be "artificial", since it is a natural component of how CRTs operate, nor would the existence of that "cap" have anything to do with enabling or disabling vsync. IOW, a display device's fixed refresh rate is not, and should not be referred to as, a "cap", and it has nothing to do with vsync being enabled or disabled. It has to do with the sync frequency PLLs in the display device itself.

If you wish to continue to believe that "enabling vsync imposes artificial caps", even in the face of clear evidence and explanation otherwise, then by all means, do so. But please refrain from continuing to spread that misinformation publicly.

Normalizing displayed frames to integral multiples of the rendering frame-rate, yes, vsync does that.
"Imposes artificial caps" on rendering frame-rate, no, vsync does not do that. Period.
A game engine might, but that is not a function of vsync; it's a function of the game's engine slaving its time-base to the display device's frame period. Suggesting that this is true for all game engines, or that vsync alone is purely the cause of that, is false.

Originally posted by: BFG10K
and whenever you moved horizontally, the onscreen image would shift, like a sliding ziggurat almost. Do you see something like that?
Not really. I might play an hour and see 2-3 tears at the most and even when they happen they're so faint I barely see them.
I think you have this strange idea that a non-vsynced system plays like a flickering 50 Hz fluorescent light or something. That just doesn't happen.
No, not flicker, "tearing". I'm starting to think that you have no idea what that actually is or looks like.

Originally posted by: BFG10K
Btw, what refresh-rate do you set your display to, when you have rendering frame-rates of 120-150 FPS?
73 Hz.
Alright, by simple math (120-150 rendered frames flipped across 73 refreshes), that means you should see on the order of one or two tear lines onscreen per refresh. Are you saying that you don't see them at all? If so, then that is why I suggested that perhaps the video driver is implementing some sort of render-ahead, combined with enforced vsync for display... but that would tend to mitigate ever seeing a "tear" at all. Even conceding that a flip will sometimes fall within the retrace period and go unseen, that still leaves a visible "tear" line on most displayed frames. If you claim to only see 2-3 "tears" per hour, at most, then you clearly either don't know what you are looking for, or somehow cannot even see them. Have you seen an optometrist lately?

Originally posted by: BFG10K
assuming 1920x1440 @ 32bpp, that's ~11MB of data,
For a 2D desktop image. Add a 32 bit Z/stencil to it and that's 21 MB, then start adding the likes of AA buffers and it soon balloons out of control.
Still seems so strange that you are both at once concerned about the rendering frame-rates so much that you disable vsync, and yet are so concerned about excessive VRAM usage from the render-buffers, but don't disable those eye-candy extras in order to mitigate what you claim would be a massive slowdown due to texture-memory thrashing. I'll give you a hint - "smart" game engines, when faced with a limit on the amount of VRAM available for storing textures, will degrade the texture-quality used, usually by adjusting the LOD and MIP-maps used. So in most cases, you really wouldn't notice it either way, unless there is a game that really relies on high-res detail textures as a critical facet of the gameplay.

Again, I'm not finding any fault with your personal preference towards vsync-disabled, only that your reasoning behind it as given... doesn't make any logical sense - nor are your blanket criticisms regarding vsync and triple-buffer settings accurate.
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
Maybe you just don't notice it.
Maybe; in any case it's not really a factor.

then how can you at the same time argue against the very same sort of thing being timing-critical, as to whether or not the partial-display-frame image that you are seeing, is one (rendering) frame "behind" or two?
And by extension, using your reasoning, if part of the frame is behind in a non-vsynced system, then that implies the whole frame is behind in a vsynced system.

IOW, it's so critical that you see that information, as a partially-displayed frame, and yet, it's not at all critical as to which rendered scene that partial-frame's worth of information belongs to?
I'm sorry, are you expecting the frames to somehow mix-and-match or something? To jump places? The difference between each frame is tiny, so it's not like anyone's going to be sitting there and saying "hmm, which frame does that leg belong to?"

Input latency is not (directly) related to rendering frame-rate, at least in most game engines.
It doesn't matter because at the end of the day it all comes down to perceived latency which comes directly from visual information in the form of displayed frames per second. If you don't think so, cap your favourite game to 5 FPS and let me know how smooth those 360 degree turns are. What the engine is doing means squat if there aren't enough frames to display it properly, and if we followed your reasoning to its logical conclusion you'd almost be arguing that turning off the monitor should make no difference to the gaming experience.

But at least with vsync enabled, input latency is consistent
Consistently higher.

I have no idea what you mean by "the game's side", that's far too vague to have any meaning.
Do you understand that what the game is doing is independent of what is being displayed? Do you also understand I'm discussing displayed frames in terms of direct visual feedback?

But enabling vsync does not impose any caps whatsoever on rendering frame-rate.
It caps the displayed framerate to full discrete frames which are tied to the refresh rate.

Strangely, I'm starting to get the feeling, that you choose to limit the game's ability to render continuously, by forcing double-buffering only,
WTF? What exactly am I limiting by running a DB + non-vsynced system?

because you believe that it cuts down on input latency,
A belief that is founded by fact.

Am I getting close here to understanding your POV?
No, you're not. What I'm saying is that more partially displayed frames generate better response than fewer full frames. It's all about visual feedback, yet you keep talking about game ticks, GPU rendering and display refresh rate. The best engine in the world is useless if its results can't be properly displayed.

The display's frame-rate is fixed, not capped.
Here we go again.

Capped implies variability, up to some maximum.
The maximum is the problem.

That also implies that it could drop below it as well.
Well I suppose it does but that isn't really relevant since this is all about maximums.

nor would the existence of that "cap" have anything to do with enabling or disabling vsync.
Getting X full frames on the screen where X is refresh rate compared to Y partial frames where Y > X is a cap.

If you wish to continue to believe that "enabling vsync imposes artificial caps",
120 FPS engine rendering to 60 Hz:

vsync: 60 full frames per second displayed.
non-vsync: 120 partial frames per second displayed.

Vsync is imposing a 60 FPS cap, plain and simple. And before you start talking about display refresh rate, a 60 Hz monitor conveys more information from 120 partial frames than it does from 60 full frames. That leads to better visual feedback and better controller feedback.

Normalizing displayed frames to integral multiples of the rendering frame-rate, yes, vsync does that.
AKA cap. Now we're starting to get somewhere.

I'm starting to think that you have no idea what that actually is or looks like.
And I'm fairly certain you've never run a non-vsynced system ever. In fact I'm even wondering how much gaming you've actually done and how many games you've actually played.

Are you saying that you don't see them at all?
No, I'm saying I rarely see them.

If so, then that is why I suggested that perhaps the video driver is implementing some sort of render-ahead, combined with enforced vsync for display...
Perhaps...but then every major vendor since 1997 would have to have been doing this, not to mention that I'd never see tearing and my FPS would always report at the same level as refresh rate. Of course neither scenario is true so I doubt your theory is valid.

Have you seen an optometrist lately?
Have you played a game lately?

Still seems so strange that you are both at once concerned about the rendering frame-rates so much that you disable vsync, and yet are so concerned about excessive VRAM usage from the render-buffers, but don't disable those eye-candy extras in order to mitigate what you claim would be a massive slowdown due to texture-memory thrashing.
I find the concept of capping my framerate and then using more resources to solve a self-induced problem rather silly. A better option is to not introduce the problem to begin with and put those resources to more beneficial purposes.

So in most cases, you really wouldn't notice it either way, unless there is a game that really relies on high-res detail textures as a critical facet of the gameplay.
Texture swapping is evident in dozens and dozens of games of varying ages and engines. Again I encourage you to buy a few titles and try them out for yourself.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,570
10,205
126
Originally posted by: BFG10K
then how can you at the same time argue against the very same sort of thing being timing-critical, as to whether or not the partial-display-frame image that you are seeing, is one (rendering) frame "behind" or two?
And by extension, using your reasoning, if part of the frame is behind in a non-vsynced system, then that implies the whole frame is behind in a vsynced system.
In the case in which the rendering frame-rate is higher than the display frame-rate, then yes, every displayed scene that you are viewing, is "lagged" by at least a small fraction of a frame. With vsync enabled, once a frame is started to be displayed, it remains displayed for the entire display frame period. It is not "interrupted" mid-frame to display another rendered frame, starting from some generally non-deterministic scanline onwards.

Originally posted by: BFG10K
IOW, it's so critical that you see that information, as a partially-displayed frame, and yet, it's not at all critical as to which rendered scene that partial-frame's worth of information belongs to?
I'm sorry, are you expecting the frames to somehow mix-and-match or something? To jump places? The difference between each frame is tiny, so it's not like anyone's going to be sitting there and saying "hmm, which frame does that leg belong to?"
Your claim is that, with vsync enabled, and a display frame-rate (say 60fps) lower than the rendering frame-rate (say 90fps), you will miss out on critical information, caused by the fact that only 60 out of those 90 frames are displayed. So if the difference between the scenes in successively-rendered or displayed frames is "so tiny", then there shouldn't be any realistic disadvantage to leaving vsync enabled, is there?

I'm simply pointing out the contradictory argument that you are making here, that on one hand it is critically important to be able to observe a half-scene's worth of information displayed one display-frame earlier, and yet, when I point out that with vsync disabled, thus causing tearing, a similar delay of one display-frame exists, for the portion of the scene displayed above the tearing line, you claim that is nothing of concern.

So, is a latency of one display-frame meaningful or not for the information that you are observing? One or the other. You can't argue that both ways.

Unless the game is using a single buffer for both rendering and display (we touched upon that earlier - it would be far too unsightly to actually use), then there will always be at least some latency between the rendered and displayed frame, due to the fact that rendering occurs to a back-buffer and is then page-flipped onto the display. The question is whether that latency is consistent, for all portions of the displayed scene, and that is true only with vsync enabled. It then also follows, that if it is not consistent, and a single-frame lag is in fact timing-critical, then one has to make judgements about the information being displayed, as to which prior-frame portion of the screen that it resides upon.

Originally posted by: BFG10K
Input latency is not (directly) related to rendering frame-rate, at least in most game engines.
It doesn't matter because at the end of the day it all comes down to perceived latency which comes directly from visual information in the form of displayed frames per second.
Yes, I agree, it is a culmination of input, rendering, and display latency combined. "If I do X, how long until I observe the change Y?"

Originally posted by: BFG10K
If you don't think so, cap your favourite game to 5 FPS and let me know how smooth those 360 degree turns are. What the engine is doing means squat if there aren't enough frames to display it properly, and if we followed your reasoning to its logical conclusion you'd almost be arguing that turning off the monitor should make no difference to the gaming experience.
Now you're just being hard-headed, BFG10K. Obviously, there has to be some sort of minimum "smoothness", minimum frame-rate, in order to create the illusion of motion. But at the frame-rates that we have been discussing, this isn't an issue. There is no loss to the illusion of motion, with vsync enabled, with a display frame-rate of 60fps.

Originally posted by: BFG10K
But at least with vsync enabled, input latency is consistent
Consistently higher.
Now you're drifting into your oft-repeated but unsupported allegations again. There is no inherent direct link between the two. If you feel that I am wrong, then feel free to provide that proof.

Originally posted by: BFG10K
I have no idea what you mean by "the game's side", that's far too vague to have any meaning.
Do you understand that what the game is doing is independent of what is being displayed? Do you also understand I'm discussing displayed frames in terms of direct visual feedback?
"visual feedback", if it's "feedback", involves not just the display, but also the inputs as well. If you meant "displayed", then say that, something like "the game's side" is vague.

Originally posted by: BFG10K
But enabling vsync does not impose any caps whatsoever on rendering frame-rate.
It caps the displayed framerate to full discrete frames which are tied to the refresh rate.
Wrong. Just plain wrong. If you wish to maintain your ignorance, be my guest. Yes, this is a flame. You obviously haven't read or digested anything that I, or others, have said.

It doesn't "cap" them at all, it just forces the displayed scenes to be sampled at integral intervals from the set of rendered scenes. In other words, the rendering frame-rate is still 90fps, ignoring variability due to GPU loading for now, and the display frame-rate is still 60fps. Neither one has been changed, and certainly not "capped". What changes is what those displayed frames contain - either consistent whole rendered scenes, or inconsistent partial "slices" of various rendered frames "stitched" together vertically on the display, like a patchwork quilt of scenes, almost, in bad cases. Given the variable per-scene GPU load, those "slices" will vary in vertical size too.

Originally posted by: BFG10K
Strangely, I'm starting to get the feeling, that you choose to limit the game's ability to render continuously, by forcing double-buffering only,
WTF? What exactly am I limiting by running a DB + non-vsynced system?
That comment was in reference to a double-buffered + vsync system, causing the GPU to stall. It was the follow-up comment to... aww, foggedaboutit. You simply don't seem to understand logic.

Originally posted by: BFG10K
because you believe that it cuts down on input latency,
A belief that is founded by fact.
Proof?

I posit that most game engines that give an option for both double- and triple-buffering, and vsync enabled/disabled, will have an input latency of, at a minimum, three frames, assuming that the inputs aren't sampled completely asynchronously on their own timebase.
Sure, there may be a few games where that is true, but that doesn't mean that it is some fundamental rule that can be repeated and broadly applied to every game. That's just wrong, and part of the problem I have with what you are saying.
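
To put that "three frames" figure into real time (a rough conversion, assuming the latency really is a whole number of frame periods - e.g. input sampled for frame N, rendered during N+1, scanned out with N+2):

    def pipeline_latency_ms(frame_hz, frames_of_latency=3):
        # three frame periods between sampling the input and seeing the result
        return frames_of_latency * 1000.0 / frame_hz

    for hz in (60, 75, 100):
        print(hz, "Hz ->", round(pipeline_latency_ms(hz), 1), "ms")
    # 60 Hz -> 50.0 ms, 75 Hz -> 40.0 ms, 100 Hz -> 30.0 ms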

Originally posted by: BFG10K
Am I getting close here to understanding your POV?
No, you're not. What I'm saying is that more partially displayed frames generate better response than fewer full frames. It's all about visual feedback, yet you keep talking about game ticks, GPU rendering and display refresh rate.
I've been trying to break it down for you, so that you could understand exactly which parts of what you are saying are incorrect, and why.

I already agreed that in terms of "sensing motion", vsync-disabled/partial frames are useful. But then you had to start with the incorrect "capping" thing again, and that enabling vsync automatically increases your input latency, both of which are not true, in the general case.
Originally posted by: BFG10K
The best engine in the world is useless if its results can't be properly displayed.
The display's frame-rate is fixed, not capped.
Capped implies variability, up to some maximum.
That also implies that it could drop below it as well.
Here we go again.
The maximum is the problem.
Well I suppose it does but that isn't really relevant since this is all about maximums.
Well, guess what, that discussion was about the display device's refresh-rate. That is fixed, no matter what. (Once the display mode is chosen on the card.) So if the maximum is the problem, then you need to buy a better display, that can display at a higher refresh-rate. Period.

Originally posted by: BFG10K
nor would the existence of that "cap" have anything to do with enabling or disabling vsync.
Getting X full frames on the screen where X is refresh rate compared to Y partial frames where Y > X is a cap.
Actually, it's not; it's the same. The amount of visual information displayed over time, by a display device with a fixed frame-rate, is also fixed. The question is what that displayed information is sampled from, whether from 60 whole frames, or 90 partial frames. But if you add up all of those partial frames' sizes, they come out equal - as they must. Plus, the game's renderer still runs at 90fps. It is not a cap. If it forced the renderer to run at 60fps as well, because it was slaved to the display refresh-rate, then it would be a cap. That's what I've been understanding your meaning of "cap" to be, at least until this point. In the general sense, this is not true. Generally, the input latency, in the case of a game engine in which it is not sampled async, is slaved to the renderer, and thus, the total real-time for "input lag", is also the same.

Originally posted by: BFG10K
If you wish to continue to believe that "enabling vsync imposes artificial caps",
120 FPS engine rendering to 60 Hz:
vsync: 60 full frames per second displayed.
non-vsync: 120 partial frames per second displayed.
But the renderer is still running at 120 fps, it is not capped to the frame-rate of the display.
Any sort of physics-engine calculations, that were related to rendered scenes per unit of real time, will remain the same.

Let me ask you this - on an ordinary NTSC television, what offers a faster display frame-rate? 60 interlaced fields per sec, or 30 frames per sec? Oh, look at that - they're equal, in terms of rate of displayed visual information.
Originally posted by: BFG10K
Vsync is imposing a 60 FPS cap, plain and simple. And before you start talking about display refresh rate, a 60 Hz monitor conveys more information from 120 partial frames than it does from 60 full frames.
Well then, that's your error right there. Pure information-theory stuff. It doesn't convey any more, or any less information, on the whole. What it does, is convey less information, more often. But it still all adds up to the same whole, which is limited by the display device itself.
Originally posted by: BFG10K
That leads to better visual feedback and better controller feedback.
Purely psychological then. I don't have any problem with your subjective opinion on how things "feel". But when it comes to spreading misinformation, I do.
Originally posted by: BFG10K
Normalizing displayed frames to integral multiples of the rendering frame-rate, yes, vsync does that.
AKA cap. Now we're starting get somewhere.
Well then, you have some strange idea about what the meaning of a "cap" is. I always thought that it meant a "limit". Well, let's take a look see, shall we?

Display device - a consistent, fixed frame-rate, 60 fps. Independent of whether or not vsync is enabled. Is that a "cap"? No. It's a fundamental limitation of the technology used.

Game engine renderer that uses the GPU - takes a variable period to render each scene, but done with a target frame-rate of 90 rendered scenes per unit of real time (second). This 90fps internal rendering rate is maintained, independent of whether or not vsync is enabled, as long as the rendering engine isn't also limited to only double-buffering, which would cause GPU stalls and a reduction in the actual rendering rate. So guess what? No "cap" there either. Game actions that would move objects N units of space in the game's engine, per unit of real player time, are still the same.

Originally posted by: BFG10K
I'm starting to think that you have no idea what that actually is or looks like.
And I'm fairly certain you've never run a non-vsynced system ever. In fact I'm even wondering how much gaming you've actually done and how many games you've actually played.
Go ahead, start the personal jabs when you can't seem to either understand, deal with, or admit, the technical issues. My favorite FPS is the original UT, I'm not much of a fan of Quake. I've "dabbled" with the newer UTs, but my machine is too slow to maintain a consistent and enjoyable frame-rate for me.

Originally posted by: BFG10K
If so, then that is why I suggested that perhaps the video driver is implementing some sort of render-ahead, combined with enforced vsync for display...
Perhaps...but then every major vendor since 1997 would have to have been doing this, not to mention that I'd never see tearing and my FPS would always report at the same level as refresh rate. Of course neither scenario is true so I doubt your theory is valid.
The various vendors more or less have been doing that for a long time; it helps on things like 3DMark benchmarks. It also wouldn't cause your FPS to be reported as the same as your display refresh-rate. (You still have this mistaken idea about "capping", although if it were occurring, that is what I would properly refer to as a "cap".)

Originally posted by: BFG10K
Have you seen an optometrist lately?
Have you played a game lately?
Better, I used to be professionally-employed writing them.

Originally posted by: BFG10K
Still seems so strange that you are both at once concerned about the rendering frame-rates so much that you disable vsync, and yet are so concerned about excessive VRAM usage from the render-buffers, but don't disable those eye-candy extras in order to mitigate what you claim would be a massive slowdown due to texture-memory thrashing.
I find the concept of capping my framerate and then using more resources to solve a self-induced problem rather silly. A better option is to not introduce the problem to begin with and put those resources to more beneficial purposes.
Let me guess: in the days of VCRs, you never bothered to use the digital tracking control to adjust those unsightly "tearing lines" at the top/bottom of the screen when the tracking was off, because it would have been a "waste of resources" to adjust them. That is, assuming that you could even see the problem - based on this discussion, I think there is clear evidence that you would never have noticed if the sync tracking were off.

Originally posted by: BFG10K
So in most cases, you really wouldn't notice it either way, unless there is a game that really relies on high-res detail textures as a critical facet of the gameplay.
Texture swapping is evident in dozens and dozens of games of varying ages and engines. Again I encourage you to buy a few titles and try them out for yourself.
Ah, so you admit that it really isn't as much of an issue as you originally mentioned, because they simply swap in lower-detail textures. Do you really notice low-res ground textures swapped out for lower-res ones? I don't. The only thing that I would likely notice is detail textures like facial features in an FPS, where you are likely to observe them "zoomed in".
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
There is NO reason to use it if you, as an individual, are happy with the screen output; I for one have never seen tearing on any NVIDIA GPU I've owned.

Just remember you can't turn off Direct3D sync, only OpenGL, in the new NVIDIA drivers without RivaTuner or the like; even Coolbits doesn't show you the Direct3D tab in the newer ForceWare drivers.
 

eBauer

Senior member
Mar 8, 2002
533
0
76
I believe tearing is much more noticeable on LCDs / low-quality CRTs.

My Viewsonic G90FB runs at 100Hz @ 10x7, and tearing is not noticeable be it 200, 300, or 400 FPS. Input response time and mouse smoothness are noticeably better at 200 FPS (however, jumping to 300-400 yields zero improvement from what I can see). On the other hand, my POS 15" CRT that can run at 75Hz @ 10x7 displays awful tearing at anything over 100 FPS (however, input response time and mouse smoothness are improved).

Obviously, if you have an LCD or crappy CRT, you are going to see tearing with v-sync disabled. It is up to personal preference as to what is better: minimal tearing or faster response time/mouse smoothness. Personally, I prefer the latter.

 

caz67

Golden Member
Jan 4, 2004
1,369
0
0
Never had any issues with Vsync, either enabled or disabled... I have a BenQ 937s+, and it's enabled.
 

Avalon

Diamond Member
Jul 16, 2001
7,571
178
106
Originally posted by: apoppin
Originally posted by: Avalon
Originally posted by: zakee00
This is the most organized flamewar I have EVER seen.

It's not a flamewar. They are having a very good discussion.
and WELL worth reading . . .

the truth is in here ;)
:roll:

And what is that supposed to mean?
 

BFG10K

Lifer
Aug 14, 2000
22,709
3,002
126
In the case in which the rendering frame-rate is higher than the display frame-rate, then yes, every displayed scene that you are viewing, is "lagged" by at least a small fraction of a frame.
Well okay then, but I really don't see why you'd consider a partial frame as lagging behind; the scanline is always drawing the current frame.

So if the difference between the scenes in successively-rendered or displayed frames is "so tiny", then there shouldn't be any realistic disadvantage to leaving vsync enabled, is there?
It's certainly big enough to notice a difference but not large enough to make images jump around like you appear to be implying.

and yet, when I point out that with vsync disabled, thus causing tearing, a similar delay of one display-frame exists, for the portion of the scene displayed above the tearing line, you claim that is nothing of concern.
I really don't see where the delay is. At any given point the scanline is drawing the most currently rendered frame. From this scenario you seem to be implying "well because a new frame comes along, that somehow makes the previous frame delayed". You can't delay something that happened in the past.

So, is a latency of one display-frame meaningful or not for the information that you are observing?
What latency? Having a newer frame arrive doesn't delay the first one. In fact because you stop drawing the old frame immediately instead of being forced to wait for the refresh cycle means the latency is lower than it is with vsync.

then there will always be at least some latency between the rendered and displayed frame, due to the fact that rendering occurs to a back-buffer and is then page-flipped onto the display.
You won't hear any arguments from me about that one. However a vsync system adds more perceived latency because it has to time its displayed frames to the display refresh rate.

Obviously, there has to be some sort of minimum "smoothness", minimum frame-rate, in order to create the illusion of motion.
I think we both agree that a good engine shouldn't tie input or its tick to the framerate. However, even if it doesn't, the display framerate is still directly responsible for providing the interpolated feedback to the user.

There is no loss to the illusion of motion, with vsync enabled, with a display frame-rate of 60fps.
So you'd claim there'd be no difference between 60 FPS and say 120 FPS? Would you care to try the object rotating demo and see how you find it?

There is no inherent direct link between the two.
Yes, there is. Try some fast mouse turns on something like a 75 FPS vsync system and then compare it to an unsynced 150 FPS system in the same area. The difference is massive. The constant 75 FPS feels laggy and unresponsive compared to the uncapped framerate of 150 FPS.

And note that I've tried this sort of thing in dozens of games and dozens of scenarios.

It doesn't "cap" them at all, it just forces the displayed scenes to be sampled at integral intervals from the set of rendered scenes.
The displayed framerate is capped to discrete frames tied to the display's refresh rate, or to put it another way, the sampled frames used for the data in the frames themselves are capped. If you wish to continue to nitpick definitions, that's not my problem.

The proof is contained in the fact that displayed frames have the biggest impact on perceived latency.

and that enabling vsync automatically increases your input latency,
Well obviously if your rendering framerate isn't higher than your displayed framerate then it won't make any difference. If it is then it will.

But if you add up all of those partial frames' sizes, they come out equal - as they must.
Of course they do, but the information contained in them certainly isn't. The rendered frames contain the interpolated data from the game's engine and therefore the more of them you can see the better. A vsync system simply doesn't sample or display any data from frames that missed the refresh cycle.

Plus, the game's renderer still runs at 90fps. It is not a cap.
Neither the game engine nor the renderer is capped and I've never said otherwise. What is capped is the display framerate because it's incapable of displaying data from multiple frames on a given refresh cycle.

But the renderer is still running at 120 fps, it is not capped to the frame-rate of the display
Yes, but you never see the data from the other 60 frames, frames that contain interpolated data about the status of the game engine.

What it does, is convey less information, more often.
It conveys more information overall because each frame has an updated piece of the game engine and on a vsync system you can't see the contents of some of those frames because they're never sampled or displayed.

Well then, you have some strange idea about what the meaning of a "cap" is. I always thought that it meant a "limit".
There is a limit, a limit in terms of sampled frames per second (which directly leads to the information contained in displayed frames). A 120 FPS rendering vsync system running at 60 Hz will cap this to 60.

It also wouldn't cause your FPS to be reported as the same as your display refresh-rate.
Every vsync setting causes the FPS to be locked at or below the refresh rate and if it didn't I'd consider that a broken implementation.

Ah, so you admit that it really isn't as much of an issue as you originally mentioned,
Nonsense; texture swapping can be even more jarring than a vsync + DB system.

Do you really notice low-res ground textures swapped out for lower-res ones? I don't.
I could ask you to consult an optometrist or to check your VCR but instead I'll simply state that you should use what works well for you.
 

imported_humey

Senior member
Nov 9, 2004
863
0
0
BTW, some clever peep on a review site looked into all this a few months back, and there is more to it than meets the eye; it's not as simple as your monitor's Hz locking the FPS to the same value. It was a related but very deep report. I still turn both off, and will until I see some bad tearing on screen.

I openly admit it was a bit deep for me, so I read it and never really took much notice.
 

Matthias99

Diamond Member
Oct 7, 2003
8,808
0
0
Yes, there is. Try some fast mouse turns on something like a 75 FPS vsync system and then compare it to an unsynced 150 FPS system in the same area. The difference is massive. The constant 75 FPS feels laggy and unresponsive compared to the uncapped framerate of 150 FPS.

I don't want to get dragged too deeply into this, but the counterargument I would make to this is that, while the un-vsynced system feels 'faster' in some sense (because what's on the screen is changing more rapidly), the visual artifacts you get in most FPS games are annoying enough (to me) that it makes any (to me, minor) improvements in my turning/aiming ability irrelevant, because the tearing starts to distract me from the gameplay. I would not describe the control difference as "massive" -- it's not like I turn more slowly, I just don't get visual feedback of it quite as often. Then again, I don't seem to be bothered by low frame rates as much as some people apparently are.

But then again, if you're claiming to hardly ever see tearing, then either you're tuning it out, or you don't know what to look for. It's readily apparent to me in any older game where the FPS gets well above 100-120 (I run my monitor at 85Hz, FWIW; at 100Hz, it would be less of an issue, I'm sure, but my monitor gets slightly fuzzy unless I run 10x7 at 100Hz).

There's some amount of personal preference to all this. This reminds me a lot of the arguments between 1080i and 720p video for HDTVs; there are good technical arguments on both sides, but ultimately a lot of it comes down to how you think something should look.