Originally posted by: BFG10K
then how can you at the same time argue against the very same sort of thing being timing-critical, as to whether or not the partial-display-frame image that you are seeing, is one (rendering) frame "behind" or two?
And by extension using your reasoning if part of the frame is behind in a non-vsynced system then that implies the whole frame is behind in a vsynced system.
In the case where the rendering frame-rate is higher than the display frame-rate, then yes, every displayed scene you are viewing is "lagged" by at least a small fraction of a frame. With vsync enabled, once a frame starts being displayed, it remains on screen for the entire display frame period. It is not "interrupted" mid-frame to show a different rendered frame starting from some generally non-deterministic scanline onwards.
Originally posted by: BFG10K
IOW, it's so critical that you see that information, as a partially-displayed frame, and yet, it's not at all critical as to which rendered scene that partial-frame's worth of information belongs to?
I'm sorry, are you expecting the frames to somehow mix-and-match or something? To jump places? The difference between each frame is tiny so it's not like anyone's going to be sitting there and saying "hmm, which frame does that leg belong to?"
Your claim is that, with vsync enabled and a display frame-rate (say 60fps) lower than the rendering frame-rate (say 90fps), you will miss out on critical information, because only 60 out of those 90 rendered frames are displayed. So if the difference between the scenes in successively rendered or displayed frames is "so tiny", then there shouldn't be any realistic disadvantage to leaving vsync enabled, should there?
I'm simply pointing out the contradiction in your argument: on one hand, it is critically important to be able to observe a half-scene's worth of information displayed one display-frame earlier; yet when I point out that with vsync disabled, and thus with tearing, a similar delay of one display-frame exists for the portion of the scene displayed above the tear line, you claim that is of no concern.
So, is a latency of one display-frame meaningful for the information you are observing, or not? It's one or the other. You can't have it both ways.
Unless the game is using a single buffer for both rendering and display (we touched upon that earlier - it would be far too unsightly to actually use), there will always be at least some latency between the rendered and the displayed frame, because rendering occurs to a back-buffer that is then page-flipped onto the display. The question is whether that latency is consistent for all portions of the displayed scene, and that is true only with vsync enabled. It also follows that if it is not consistent, and a single-frame lag really is timing-critical, then you have to make judgements about every piece of displayed information based on which prior frame the portion of the screen it sits on came from.
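To put rough numbers on that, here's a purely illustrative Python sketch - an assumed 90fps renderer, 60Hz display and 1080 visible scanlines, not modelled on any particular engine or driver. With vsync the buffer flip waits for vblank, so every scanline of a refresh shows one and the same rendered scene; without it the flip can land mid-scanout, so different bands of the same refresh show scenes of different ages:

# Illustrative timing sketch only: idealized 90 fps render / 60 Hz display.
RENDER_FPS = 90.0
REFRESH_HZ = 60.0
LINES = 1080                                            # visible scanlines (assumed)

render_done = [i / RENDER_FPS for i in range(1, 10)]    # times each scene finishes

def latest_scene(t):
    """Index of the newest rendered scene completed by time t."""
    return sum(1 for d in render_done if d <= t)

for r in range(1, 4):                       # a few display refreshes
    start = r / REFRESH_HZ                  # scanout of this refresh begins
    end = (r + 1) / REFRESH_HZ              # ...and ends

    # Vsync: the flip happens only at vblank, so the whole refresh shows
    # one scene, the newest one completed when scanout starts.
    whole = latest_scene(start)

    # No vsync: the flip happens whenever a scene finishes, so different
    # scanline bands of the same refresh show different scenes.
    bands = []
    scene = latest_scene(start)
    for d in render_done:
        if start < d < end:                 # a flip lands inside this refresh
            line = int((d - start) / (end - start) * LINES)
            bands.append((line, scene))     # this scene is shown down to 'line'
            scene = latest_scene(d)
    bands.append((LINES, scene))

    print(f"refresh {r}: vsync shows scene {whole}; "
          f"no vsync shows (down-to-line, scene) bands {bands}")

Run it and every non-vsynced refresh comes out split across scenes of different ages, which is exactly the inconsistency I'm talking about.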
Originally posted by: BFG10K
Input latency is not (directly) related to rendering frame-rate, at least in most game engines.
It doesn't matter because at the end of the day it all comes down to perceived latency which comes directly from visual information in the form of displayed frames per second.
Yes, I agree - it is the combination of input, rendering, and display latency: "If I do X, how long until I observe the change Y?"
Originally posted by: BFG10K
If you don't think so, cap your favourite game to 5 FPS and let me know how smooth those 360 degree turns are. What the engine is doing means squat if there aren't enough frames to display it properly and if we followed your reasoning to its logical conclusion you'd almost be arguing that turning off the monitor should make no difference to the gaming experience.
Now you're just being hard-headed, BFG10K. Obviously, there has to be some sort of minimum "smoothness", some minimum frame-rate, in order to create the illusion of motion. But at the frame-rates we have been discussing, this isn't an issue. There is no loss to the illusion of motion with vsync enabled and a display frame-rate of 60fps.
Originally posted by: BFG10K
But at least with vsync enabled, input latency is consistent
Consistently higher.
Now you're drifting into your oft-repeated but unsupported allegations again. There is no inherent direct link between the two. If you feel that I am wrong, then feel free to provide that proof.
Originally posted by: BFG10K
I have no idea what you mean by "the game's side", that's far too vague to have any meaning.
Do you understand that what the game is doing is independent of what is being displayed? Do you also understand I'm discussing displayed frames in terms of direct visual feedback?
"visual feedback", if it's "feedback", involves not just the display, but also the inputs as well. If you meant "displayed", then say that, something like "the game's side" is vague.
Originally posted by: BFG10K
But enabling vsync does not impose any caps whatsoever on rendering frame-rate.
It caps the displayed framerate to full discrete frames which are tied to the refresh rate.
Wrong. Just plain wrong. If you wish to maintain your ignorance, be my guest. Yes, this is a flame. You obviously haven't read or digested anything that I, or others, have said.
It doesn't "cap" them at all, it just forces the displayed scenes to be sampled at integral intervals from the set of rendered scenes. In other words, the rendering frame-rate is
still 90fps, ignoring variability due to GPU loading for now, and the display frame-rate is
still 60fps. Neither one have been changed, and certainly not "capped". What changes is what those displayed frames contain - either consistent whole rendered scenes, or inconsistent partial "slices" of various rendered frames "stitched" together vertically on the display, like a patchwork quilt of scenes, almost, in bad cases. Given the variable per-scene GPU load, those "slices" will vary in vertical size too.
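A small counting sketch, under the same idealized assumptions we've been using (90fps renderer, 60Hz refresh, triple buffering so vsync never stalls the GPU - illustrative Python, not a benchmark), makes the point: the renderer still completes 90 scenes every second; all vsync determines is that 60 of them are shown whole and 30 are never shown at all:

# Back-of-the-envelope counting, not a benchmark.
RENDER_FPS, REFRESH_HZ = 90, 60

scene_done = [i / RENDER_FPS for i in range(1, RENDER_FPS + 1)]  # one second of scenes

# With vsync, each vblank displays the newest scene completed so far.
shown = set()
for r in range(REFRESH_HZ):
    vblank = (r + 1) / REFRESH_HZ
    shown.add(max(i for i, t in enumerate(scene_done, 1) if t <= vblank))

print("scenes rendered per second:", RENDER_FPS)            # unchanged by vsync
print("whole scenes displayed per second:", len(shown))     # 60
print("scenes rendered but never displayed:", RENDER_FPS - len(shown))  # 30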
Originally posted by: BFG10K
Strangely, I'm starting to get the feeling, that you choose to limit the game's ability to render continuously, by forcing double-buffering only,
WTF? What exactly am I limiting by running a DB + non-vsynced system?
That comment was in reference to a double-buffered + vsync system, causing the GPU to stall. It was the follow-up comment to... aww, foggedaboutit.
You simply don't seem to understand logic.
Originally posted by: BFG10K
because you believe that it cuts down on input latency,
A belief that is founded by fact.
Proof?
I posit that most game engines that offer options for both double- and triple-buffering, and for vsync enabled/disabled, will have an input latency of, at a minimum, three frames, assuming that the inputs aren't sampled completely asynchronously on their own timebase.
Sure, there may be a few games where that is true, but that doesn't mean it is some fundamental rule that can be repeated and broadly applied to every game. That's just wrong, and it's part of the problem I have with what you are saying.
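For a sense of scale, here's the back-of-the-envelope arithmetic behind that three-frame figure - an assumed pipeline (input sampled on one rendering frame, the scene rendered on the next, scanned out on the one after), illustrative numbers, not a measurement of any real game. And because the figure is counted in the renderer's frame times, and the renderer still runs at 90fps with vsync on (given triple buffering), toggling vsync doesn't change it:

# Rough arithmetic only, under the assumed pipeline described above.
RENDER_FPS = 90.0        # renderer's frame rate; unchanged by vsync with triple buffering
PIPELINE_FRAMES = 3      # sample input -> render scene -> scan out

frame_time_ms = 1000.0 / RENDER_FPS
print(f"per-frame time at {RENDER_FPS:.0f} fps: {frame_time_ms:.1f} ms")        # ~11.1 ms
print(f"baseline input latency (~{PIPELINE_FRAMES} frames): "
      f"{PIPELINE_FRAMES * frame_time_ms:.1f} ms")                              # ~33.3 ms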
Originally posted by: BFG10K
Am I getting close here to understanding your POV?
No, you're not. What I'm saying is more partially displayed frames generate better response than less full frames. It's all about visual feedback yet you keep talking about game ticks, GPU rendering and display refresh rate.
I've been trying to break it down for you, so that you could understand exactly which parts of what you are saying are incorrect, and why.
I already agreed that in terms of "sensing motion", vsync-disabled/partial frames are useful. But then you had to start in with the incorrect "capping" thing again, and with the claim that enabling vsync automatically increases your input latency - both of which are not true in the general case.
Originally posted by: BFG10K
The best engine in the world is useless if its results can't be properly displayed.
The display's frame-rate is fixed, not capped.
Capped implies variability, up to some maximum.
That also implies that it could drop below it as well.
Here we go again.
The maximum is the problem.
Well I suppose it does but that isn't really relevant since this is all about maximums.
Well, guess what: that discussion was about the display device's refresh-rate. That is fixed, no matter what (once the display mode is chosen on the card). So if the maximum is the problem, then you need to buy a better display that can run at a higher refresh-rate. Period.
Originally posted by: BFG10K
nor would the existence of that "cap" have anything to do with enabling or disabling of vsync.
Getting X full frames on the screen where X is refresh rate compared to Y partial frames where Y > X is a cap.
Actually, it's not a cap, because the two cases are the same. The amount of visual information displayed over time by a display device with a fixed frame-rate is itself fixed. The question is what that displayed information is sampled from: 60 whole frames, or 90 partial frames. But if you add up the sizes of all of those partial frames, they come out equal - as they must. Plus, the game's renderer still runs at 90fps. That is not a cap. If vsync forced the renderer to run at 60fps as well, because it was slaved to the display refresh-rate, then it would be a cap - and that is what I had understood your meaning of "cap" to be, at least until this point. In the general case, that doesn't happen. And generally, in a game engine where input is not sampled asynchronously, input sampling is slaved to the renderer, and thus the total real time for "input lag" is also the same.
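If you want to check the "adds up to the same whole" arithmetic from above, here's a trivial sketch - idealized numbers: 60Hz display, an assumed 1080 visible scanlines per refresh, a 90fps renderer, and evenly sized slices for simplicity:

# Sanity check of "it all adds up to the same whole".
REFRESH_HZ, LINES, RENDER_FPS = 60, 1080, 90

lines_per_second = REFRESH_HZ * LINES        # fixed by the display itself
avg_slice = lines_per_second / RENDER_FPS    # average slice each of the 90 scenes gets

print("scanlines the display draws per second:", lines_per_second)        # 64800
print("60 whole frames x 1080 lines:", 60 * LINES)                        # 64800
print("90 partial slices x", int(avg_slice), "lines on average:",
      int(RENDER_FPS * avg_slice))                                        # 64800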
Originally posted by: BFG10K
If you wish to continue to believe that "enabling vsync imposes artificial caps",
120 FPS engine rendering to 60 Hz:
vsync: 60 full frames per second displayed.
non-vsync: 120 partial frames per second displayed.
But the renderer is still running at 120 FPS; it is not capped to the frame-rate of the display. Any physics-engine calculations that relate rendered scenes to units of real time will remain the same.
Let me ask you this - on an ordinary NTSC television, which offers a faster display frame-rate: 60 interlaced fields per second, or 30 frames per second? Oh, look at that - they're equal, in terms of the rate of displayed visual information.
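Same arithmetic, with the nominal NTSC numbers (ignoring the 59.94/29.97Hz detail and blanking intervals for simplicity):

# NTSC: 525 total lines per full frame, two interlaced fields per frame.
LINES_PER_FRAME = 525
print("lines per second, 60 fields/s:", int(60 * LINES_PER_FRAME / 2))   # 15750
print("lines per second, 30 frames/s:", int(30 * LINES_PER_FRAME))       # 15750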
Originally posted by: BFG10K
Vsync is imposing a 60 FPS cap, plain and simple. And before you start talking about display refresh rate, a 60 Hz monitor conveys more information from 120 partial frames than it does from 60 full frames.
Well then, that's your error right there. This is pure information-theory stuff. The display doesn't convey any more, or any less, information on the whole. What it does is convey less information, more often. But it still all adds up to the same whole, which is limited by the display device itself.
Originally posted by: BFG10K
That leads to better visual feedback and better controller feedback.
Purely psychological then. I don't have any problem with your subjective opinion on how things "feel". But when it comes to spreading misinformation, I do.
Originally posted by: BFG10K
Normalizing displayed frames to integral multiples of the rendering frame-rate, yes, vsync does that.
AKA cap. Now we're starting to get somewhere.
Well then, you have some strange idea about what the meaning of a "cap" is. I always thought that it meant a "limit". Well, let's take a look see, shall we?
Display device - a consistent, fixed frame-rate, 60 fps, independent of whether or not vsync is enabled. Is that a "cap"? No. It's a fundamental limitation of the technology used.
Game engine renderer that uses the GPU - takes a variable period to render each scene, but runs with a target of 90 rendered scenes per unit of real time (per second). This 90fps internal rendering rate is maintained independent of whether or not vsync is enabled, as long as the rendering engine isn't also limited to double-buffering only, which would cause GPU stalls and a reduction in the actual rendering rate. So guess what? No "cap" there either. Game actions that would move objects N units of space in the game's world, per unit of real player time, still do exactly that.
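Here's a minimal sketch of the decoupling I keep describing - a fixed-timestep simulation driven by real time, with rendering and presentation left as commented-out stubs because their rates never feed back into it. The tick rate, the speed value, and the render_scene/present names are all hypothetical placeholders, not any particular engine:

import time

TICK_RATE = 90.0   # simulation ticks per second (assumed)
SPEED = 5.0        # game units moved per second of real player time (assumed)

def run(duration_s=1.0):
    position = 0.0
    dt = 1.0 / TICK_RATE
    accumulator = 0.0
    last = time.perf_counter()
    end = last + duration_s

    while time.perf_counter() < end:
        now = time.perf_counter()
        accumulator += now - last
        last = now

        # The world advances in fixed steps of real time. How fast the GPU
        # renders, or whether presentation waits for vblank, never appears here.
        while accumulator >= dt:
            position += SPEED * dt
            accumulator -= dt

        # render_scene(position)      # hypothetical: runs at whatever rate the GPU allows
        # present(wait_for_vblank)    # hypothetical: only changes *when* a frame is shown

    return position

print("units moved in ~1 s of real time:", round(run(), 2))   # ~5.0, vsync or not

Toggle vsync in the (hypothetical) present() call all you like - the loop above still moves the object the same distance per second of real player time.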
Originally posted by: BFG10K
I'm starting to think that you have no idea what that actually is or looks like.
And I'm fairly certain you've never run a non-vsynced system ever. In fact I'm even wondering how much gaming you've actually done and how many games you've actually played.
Go ahead, start the personal jabs when you can't seem to understand, deal with, or admit the technical issues. My favorite FPS is the original UT; I'm not much of a fan of Quake. I've "dabbled" with the newer UTs, but my machine is too slow to maintain a consistent and enjoyable frame-rate for me.
Originally posted by: BFG10K
If so, then that is why I suggested that perhaps the video driver is implementing some sort of render-ahead, combined with enforced vsync for display...
Perhaps...but then every major vendor since 1997 would have to have been doing this, not to mention that I'd never see tearing and my FPS would always report at the same level as refresh rate. Of course neither scenario is true so I doubt your theory is valid.
The various vendors have more or less been doing that for a long time - it helps on things like 3DMark benchmarks. It also wouldn't cause your FPS to be reported as the same as your display refresh-rate. (You still have this mistaken idea about "capping", although if that were occurring, it is what I would properly refer to as a "cap".)
Originally posted by: BFG10K
Have you seen an optometrist lately?
Have you played a game lately?
Better - I used to be professionally employed writing them.
Originally posted by: BFG10K
Still seems so strange that you are both at once concerned about the rendering frame-rates so much that you disable vsync, and yet are so concerned about excessive VRAM usage from the render-buffers, but don't disable those eye-candy extras in order to mitigate what you claim would be a massive slowdown due to texture-memory thrashing.
I find the concept of capping my framerate and then using more resources to solve a self-induced problem rather silly. A better option is to not introduce the problem to begin with and put those resources to more beneficial purposes.
Let me guess: back in the days of VCRs, you never bothered to use the digital tracking control to adjust those unsightly "tearing lines" at the top/bottom of the screen when the tracking was off, because it would have been a "waste of resources" to adjust them. That is, assuming you could even see that problem - which, based on this discussion, I think there is clear evidence you would never have noticed if the sync tracking were off.
Originally posted by: BFG10K
So in most cases, you really wouldn't notice it either way, unless there is a game that really relies on high-res detail textures as a critical facet of the gameplay.
Texture swapping is evident in dozens and dozens of games of varying ages and engines. Again I encourage you to buy a few titles and try them out for yourself.
Ah, so you admit that it really isn't as much of an issue as you originally claimed, because the games simply swap in lower-detail textures. Do you really notice ground textures being swapped out for lower-res ones? I don't. The only thing I would likely notice is detail textures like facial features in an FPS, where you are likely to observe them "zoomed in".