http://techreport.com/articles.x/21516/11
Here's the snippet that I found the most interesting:
In fact, in a bit of a shocking revelation, Petersen told us Nvidia has "lots of hardware" in its GPUs aimed at trying to fix multi-GPU stuttering. The basic technology, known as frame metering, dynamically tracks the average interval between frames. Those frames that show up "early" are delayed slightly—in other words, the GPU doesn't flip to a new buffer immediately—in order to ensure a more even pace of frames presented for display. The lengths of those delays are adapted depending on the frame rate at any particular time. Petersen told us this frame-metering capability has been present in Nvidia's GPUs since at least the G80 generation, if not earlier. (He offered to find out exactly when it was added, but we haven't heard back yet.)
Poof. Mind blown.
Now, take note of the implications here. Because the metering delay is presumably inserted between T_render and T_display, Fraps would miss it entirely. That means all of our SLI data on the preceding pages might not track with how frames are presented to the user. Rather than perceive an alternating series of long and short frame times, the user would see a more even flow of frames at an average latency between the two.
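Just to make that Fraps blind spot concrete, here's a quick toy sketch (Python, made-up numbers, and a simple moving-average pacing heuristic that is my guess, not Nvidia's actual frame-metering logic). It treats each frame as "ready" at its Present time and only delays the buffer flip afterwards, so the Fraps-style Present-to-Present intervals stay uneven while the flip-to-flip intervals come out more evenly spaced:

```python
# Illustrative sketch only -- not Nvidia's actual algorithm. Frames that show up
# "early" are held back so buffer flips land at roughly even intervals.

def meter_flips(present_times, window=8):
    """Given the times frames are handed off (Present calls, in ms),
    return the times the metered buffer flips would occur."""
    flip_times = []
    intervals = []
    last_flip = None
    for i, t in enumerate(present_times):
        if last_flip is None:
            flip = t                      # first frame flips immediately
        else:
            intervals.append(t - present_times[i - 1])
            avg = sum(intervals[-window:]) / len(intervals[-window:])
            # Don't flip earlier than one average interval after the last flip,
            # but never before the frame is actually ready.
            flip = max(t, last_flip + avg)
        flip_times.append(flip)
        last_flip = flip
    return flip_times

# Alternating short/long intervals, typical of AFR micro-stutter (ms).
present = [0, 5, 33, 38, 66, 71, 99, 104]
flips = meter_flips(present)

fraps_deltas = [b - a for a, b in zip(present, present[1:])]   # what Fraps records
display_deltas = [b - a for a, b in zip(flips, flips[1:])]     # what reaches the display

print("Fraps (Present-to-Present):", fraps_deltas)
print("Metered (flip-to-flip):    ", [round(d, 1) for d in display_deltas])
```

Running it, the Fraps numbers keep alternating 5/28 ms while the metered flip intervals settle toward the average, which is exactly why a Present-time tool can't see the smoothing.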
Frame metering sounds like a pretty cool technology, but there is a trade-off involved. To cushion jitter, Nvidia is increasing the amount of lag in the graphics subsystem as it inserts that delay between the completion of the rendered frame and its exposure to the display. In most cases, we're talking about tens of milliseconds or less; that sort of contribution to lag probably isn't perceptible. Still, this is an interesting and previously hidden trade-off in SLI systems that gamers will want to consider.
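To put a rough scale on that extra lag (hypothetical figures, matching the toy numbers above):

```python
# Rough back-of-the-envelope, not measured data. With micro-stutter alternating
# between 5 ms and 28 ms intervals, pacing toward the 16.5 ms average means an
# "early" frame can be held back by about the difference between the two.
fast, slow = 5.0, 28.0
average = (fast + slow) / 2            # 16.5 ms target pacing
worst_case_delay = average - fast      # ~11.5 ms of extra lag on an early frame
print(f"target interval: {average} ms, worst-case metering delay: {worst_case_delay} ms")
```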
So long as the lag isn't too great, metering frame output in this fashion has the potential to alleviate perceived jitter. It's not a perfect solution, though. With Fraps, we can measure the differences between presentation times, when frames are presented to the DirectX API. A crucial and related question is how the internal timing of the game engine works. If the game engine generally assumes the same amount of time has passed between one frame and the next, metering should work beautifully. If not, then frame metering is just moving the temporal discontinuity problem around—and potentially making it worse. After all, the frames have important content, reflecting the motion of the underlying geometry in the game world. If the game engine tracks time finely enough, inserting a delay for every other frame would only exacerbate the perceived stuttering. The effect would be strange, like having a video camera that captures frames in an odd sequence, 12--34--56--78, and a projector that displays them in an even 1-2-3-4-5-6-7-8 fashion. Motion would not be smooth.
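Here's a toy illustration of that camera/projector mismatch, again with made-up numbers: the engine advances the world by the real, uneven frame deltas, but the metered display shows those frames at an even pace, so the apparent speed of on-screen motion oscillates from the viewer's point of view.

```python
# Sketch of the "odd camera, even projector" mismatch (illustrative numbers only).
# The engine moves an object at 1 unit/ms of *game* time using the real, uneven
# frame intervals, but the metered display presents the frames at an even pace.

uneven_sim_times = [0, 5, 33, 38, 66, 71]            # when the engine sampled the world (ms)
even_display_times = [0, 16.5, 33, 49.5, 66, 82.5]   # when metering presents them (ms)

speed = 1.0  # units per ms of game time
positions = [speed * t for t in uneven_sim_times]

# Apparent speed the viewer perceives between consecutive displayed frames:
for i in range(1, len(positions)):
    moved = positions[i] - positions[i - 1]
    shown_over = even_display_times[i] - even_display_times[i - 1]
    print(f"frame {i}: moved {moved:4.1f} units over {shown_over:4.1f} ms "
          f"-> apparent speed {moved / shown_over:.2f}")
```

The apparent speed bounces between roughly 0.3 and 1.7 units/ms even though the display cadence is perfectly even, which is the judder being pushed into the frame content itself.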
As ATM has noted, the article is a year old at this point. If you have anything meaningful and new to add to the discussion you're more than welcome to start a new thread, but simply rehashing TR's old article isn't something that's going to be productive.
-ViRGE