Will Robinson
Golden Member
A very good article by Tech Report's Scott Wasson. :thumbsup:
http://techreport.com/articles.x/21516
I did a theoretical analysis of microstutter a while ago (never published). It's actually quite simple once you sit down and understand the math.
I even managed to come up with a formula that could predict when it would happen. It correctly predicted that adding more GPUs to the system would reduce the problem, as was later confirmed by the recent Tom's Hardware article.
Anyway, it's nice to have more practical numbers backing my theories. :awe:
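The formula itself isn't in a publishable state, but here's a minimal sketch of one way you could quantify the effect from frame timestamps. To be clear, the consecutive-difference metric below is just an illustration, not my actual formula:

```python
# Minimal sketch: quantify microstutter as the average change between
# consecutive frame times, normalized by the mean frame time. This is
# an illustrative metric only, not the unpublished formula itself.

def microstutter_index(timestamps_ms):
    """0.0 means perfectly even pacing; ~1.0 means frame times that
    swing by as much as the average frame time itself."""
    frame_times = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    diffs = [abs(b - a) for a, b in zip(frame_times, frame_times[1:])]
    return (sum(diffs) / len(diffs)) / (sum(frame_times) / len(frame_times))

# Two-GPU AFR tends to alternate short/long frames (5/15 ms here);
# adding GPUs spreads the work, pushing the pattern toward even pacing.
afr_2gpu = [0, 5, 20, 25, 40, 45, 60]    # alternating 5/15 ms
even     = [0, 10, 20, 30, 40, 50, 60]   # same average, even pacing
print(microstutter_index(afr_2gpu))      # 1.0 -> heavy jitter
print(microstutter_index(even))          # 0.0 -> no jitter
```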
"More GPUs in the system 'reduce' the problem? That's an interesting result. Can you please post or PM the article? I'd be happy to read it."

Like I said, my results were never published.

"I was asking you to publish it on your site, or even to PM me the content."

I don't write for ABT anymore, and the content is not in a state to be PM'd.
I don't know if you guys read the conclusion, but Fraps is totally inaccurate at measuring frame latency for NV cards.
Also, Fraps reported high latency spikes for the 6870 CF pair, but in game the FPS was constantly high with no noticeable jitter.
They need to get a high-speed camera before considering any benchmarks on frame latency. Fraps is not the be-all and end-all.
Poof. Mind blown.
"Now, take note of the implications here. Because the metering delay is presumably inserted between T_render and T_display, Fraps would miss it entirely. That means all of our SLI data on the preceding pages might not track with how frames are presented to the user. Rather than perceive an alternating series of long and short frame times, the user would see a more even flow of frames at an average latency between the two."
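You can see why with a toy model: Fraps timestamps frames before any metering delay, so it records the raw alternation, while the cadence the user sees gets evened out. The averaging rule below is a pure assumption for illustration; nobody outside NVIDIA knows the real metering logic.

```python
# Toy model (not NVIDIA's actual metering algorithm): Fraps sees the
# raw 5/15 ms alternation, but if the driver delays display toward the
# average of the current and previous frame time, the intervals the
# user perceives come out nearly even.

fraps_times = [5, 15, 5, 15, 5, 15]          # what Fraps records (ms)

displayed = []
for i, t in enumerate(fraps_times):
    prev = fraps_times[i - 1] if i else t    # assumed metering rule
    displayed.append((t + prev) / 2)

print(displayed)   # [5.0, 10.0, 10.0, 10.0, 10.0, 10.0] -> even pacing
```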
"2) Presumably, a jitter pattern alternating between five- and 15-millisecond frame times would be less of an annoyance than a 15- and 45-millisecond pattern. The worst example we saw in our testing alternated between roughly six and twenty milliseconds, but it didn't jump out at me as a problem during our original testing. Just now, I fired up Bad Company 2 on a pair of Radeon HD 6870s with the latest Catalyst 11.8 drivers. Fraps measures the same degree of jitter we saw initially, but try as I might, I can't see the problem."
So he can measure the microstutter/jitter, but in the real world he can't perceive the difference, later citing his IPS monitor as "too slow" to capture this latency issue. So we have another variable in the mix: microstuttering may differ across TN/IPS panels.
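The panel point is easy to illustrate with timing alone. The sketch below assumes a 60 Hz refresh and a frame becoming visible at the first refresh after it completes; pixel response and scanout are ignored, so this is a simplification, not a display-pipeline simulation:

```python
import math

# Simplified timing model: a 60 Hz panel only updates every ~16.7 ms,
# so each frame is quantized up to the next refresh boundary.

REFRESH_MS = 1000 / 60

def onscreen_intervals(frame_times_ms):
    """Quantize each frame's completion time up to the next refresh
    and return the intervals at which new frames actually appear."""
    t, appear = 0.0, []
    for ft in frame_times_ms:
        t += ft
        appear.append(math.ceil(t / REFRESH_MS) * REFRESH_MS)
    return [round(b - a, 1) for a, b in zip(appear, appear[1:])]

print(onscreen_intervals([6, 20, 6, 20, 6, 20]))
# -> [16.7, 0.0, 33.3, 0.0, 16.7]: the 0.0 entries are frames landing
#    in the same refresh that never get their own screen update, so the
#    mild 6/20 ms jitter largely dissolves into the refresh grid.

print(onscreen_intervals([15, 45, 15, 45, 15]))
# -> [50.0, 16.7, 50.0, 16.7]: the big 15/45 ms alternation survives
#    quantization and stays plainly visible.
```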
Basically, short of a person actually trying CF/SLI for themselves, it seems impossible to write off CF/SLI as a microstuttering mess, because:
1) Micro-stutter may exist, but depending on the speed of the monitor it may be a non-issue altogether
2) Micro-stutter may exist, but the user may or may not notice it
3) Micro-stutter differs across AMD/NV brands and even across games (i.e., game engines and GPU load too), which is way too many factors all of a sudden
4) As Tom's Hardware pointed out, micro-stuttering is alleviated somewhat if you run faster GPUs and/or add a third GPU into the mix, creating yet another variable.
So we are back to square one: it makes sense that some people report micro-stutter issues and others don't.
Finally some good data on microstutter.
[image: frame-time graph, 6870 CFX with vsync + triple buffering vs. single 6970]
This is what happens when you use vsync and triple buffering on a 60Hz monitor with the 6870 CFX setup.
It clearly outperforms the single 6970, both statistically and visually.
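Here's a toy timing model of why that combination smooths things out. The flip logic below (GPU renders continuously into spare buffers, newest completed frame flipped at each refresh) is an assumption I'm making for illustration, not AMD's actual driver behavior:

```python
# Toy model of vsync + triple buffering on a 60 Hz panel. Real driver
# behavior is more involved; this only illustrates the smoothing effect.

REFRESH_MS = 1000 / 60

def vsynced_flips(frame_times_ms, n_refreshes=6):
    """For each refresh, report True if a newly completed frame was
    available to flip, False if the previous image was shown again."""
    done, t = [], 0.0
    for ft in frame_times_ms:          # back-to-back rendering times
        t += ft
        done.append(t)
    flips, shown = [], -1
    for r in range(1, n_refreshes + 1):
        vsync = r * REFRESH_MS
        ready = [i for i, d in enumerate(done) if d <= vsync and i > shown]
        if ready:
            shown = ready[-1]          # flip the newest finished frame
        flips.append(bool(ready))
    return flips

print(vsynced_flips([6, 20, 6, 20, 6, 20, 6, 20]))
# -> [True, True, False, True, True, True]: the jittery 6/20 ms CF
#    output turns into a steady 16.7 ms cadence with a single repeated
#    frame, which the eye reads as smooth.
```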
Food for thought.