Microstuttering is very real. If that doesn't bother you, great! I know it would annoy the shit out of me, especially if I'm dropping several hundred of my hard-earned money on a graphics card. Also, SLI has less microstutter than CrossFire, so going dual-GPU with AMD would be my last choice.
1) Single fastest card
.
.
.
.
.
2) SLI
3) Crossfire.
I guess you have to be a pretty damn unique snowflake to notice microstuttering, because the mass population can't perceive any difference when fps is above 60.
5970 here, still can't see the microstutter. And believe me, I've tried to.
Wow, another person who thinks that because he can't see any improvement over 60 fps, no one can. Play Counter-Strike at 90 fps and then at 60 fps and the difference in smoothness is ridiculously easy to see. It is VERY noticeable.
SLI has less microstutter...
And as for "dropping several hundred"...ummm..don't you mean around $700?
Perhaps your pricing is as out of date as the silly reference to SLI's superiority. :whistle:
I don't think it's a silly reference, since I didn't pull it out of my butt.
http://techreport.com/articles.x/21516/11
"Third, in our test data, multi-GPU configs based on Radeons appear to exhibit somewhat more jitter than those based on GeForces. We can't yet say definitively that those observations will consistently hold true across different workloads, but that's where our data so far point."
I would agree that in very high fps scenarios, microstuttering is considerably less noticeable. However, I think that dual gpu systems have a much higher minimum fps threshold for playability due to microstutter.
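If anyone actually wants to look for this on their own rig instead of arguing about it, the frame-to-frame variance the article is talking about is easy to pull out of a frame-time log. Here's a rough Python sketch; the "frametimes.txt" name and the one-timestamp-per-line format are just assumptions, so adjust the parsing to however Fraps (or whatever tool you use) actually dumps its data:

[code]
# Rough sketch: quantify microstutter from a frame-time log.
# Assumes "frametimes.txt" holds one timestamp per line, in milliseconds.
# Adjust the parsing to whatever your capture tool actually writes out.
import statistics

with open("frametimes.txt") as f:
    stamps = [float(line.strip()) for line in f if line.strip()]

# Per-frame times are the gaps between consecutive timestamps.
frame_times = [b - a for a, b in zip(stamps, stamps[1:])]

# "Jitter" here = how much each frame time differs from the previous one.
jitter = [abs(b - a) for a, b in zip(frame_times, frame_times[1:])]

avg = statistics.mean(frame_times)
p99 = sorted(frame_times)[int(0.99 * (len(frame_times) - 1))]

print(f"average frame time : {avg:.2f} ms (~{1000 / avg:.0f} fps)")
print(f"99th pct frame time: {p99:.2f} ms")
print(f"mean jitter        : {statistics.mean(jitter):.2f} ms")
[/code]

An average that works out to 60 fps with a big mean jitter is exactly the "looks fine on the fps counter, feels rough in practice" case being described here.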
We'd mostly agree with that assessment, but with several caveats based on our admittedly somewhat limited test data. For one, although jitter varies over time, multi-GPU setups that are prone to jitter in a given test scenario tend to return to it throughout each test run and from one run to the next. Second, the degree of jitter appears to be higher for systems that are more performance-constrained. For instance, when tested in the same game at the same settings, the mid-range Radeon HD 6870 CrossFireX config generally showed more frame-to-frame variance than the higher-end Radeon HD 6970 CrossFireX setup. The same is true of the GeForce GTX 560 Ti SLI setup versus dual GTX 580s. If this observation amounts to a trait of multi-GPU systems, it's a negative trait. Multi-GPU rigs would have the most jitter just when low frame times are most threatened. Third, in our test data, multi-GPU configs based on Radeons appear to exhibit somewhat more jitter than those based on GeForces. We can't yet say definitively that those observations will consistently hold true across different workloads, but that's where our data so far point.
Now, take note of the implications here. Because the metering delay is presumably inserted between T_render and T_display, Fraps would miss it entirely. That means all of our SLI data on the preceding pages might not track with how frames are presented to the user.
Read the page again... carefully. It makes you look like a fool if you think CF microstutters worse than SLI based on that article.
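The T_render vs T_display bit is the part worth a second look. Here's a toy Python sketch of what the article is describing; the numbers are completely made up and the ~16.5 ms metering target is just an assumption for illustration. The point is that a Fraps-style tool measures before any metering delay, so it reports the jittery render-side gaps even if the driver evens out when frames actually hit the screen:

[code]
# Toy illustration of the T_render vs T_display point, not real data.
# Render completions arrive in a short/long AFR pattern (classic
# microstutter); a hypothetical metering step then holds frames so they
# are presented on a near-even cadence. Fraps samples at render time,
# so it never sees that correction.

render_done, t = [], 0.0
for gap in [8.0, 25.0] * 4:      # alternating short/long render gaps (ms)
    t += gap
    render_done.append(t)

# What a Fraps-style tool reports: gaps between render completions.
fraps = [b - a for a, b in zip([0.0] + render_done, render_done)]

# Assumed metering: present no sooner than ~16.5 ms after the previous
# presented frame, and never before the frame has finished rendering.
display, prev = [], 0.0
for r in render_done:
    prev = max(r, prev + 16.5)
    display.append(prev)
displayed = [b - a for a, b in zip([0.0] + display, display)]

print("Fraps frame times  :", fraps)                             # 8, 25, 8, 25...
print("Display frame times:", [round(d, 1) for d in displayed])  # ~16.5 each
[/code]

Which is the article's own caveat: the measured SLI numbers might not track with what actually reaches the screen.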
The 120Hz talk brings up something else I wanted to ask...
I will be buying 3 120hz monitors.
The thing is, the only ones that seem suited to me are the new Samsung ones with the thin bezels. All the others (like the ones "officially" supported by Nvidia) seem to have huge bezels and aren't as high quality as the Samsung.
