In short:
avg is irrelevant
only standard deviation is relevant
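To put numbers on that, here's a quick sketch (Python, with made-up frame times) of how two runs can report the same average FPS while one of them stutters badly:

```python
# Made-up frame times in milliseconds: two runs with the same average
# (and therefore the same average FPS) but very different standard
# deviations. The second is the stuttery one.
from statistics import mean, stdev

smooth = [16.7] * 10             # steady ~60 fps
stutter = [8.3, 25.1] * 5        # short/long frames alternating

for name, times in (("smooth", smooth), ("stutter", stutter)):
    avg = mean(times)
    print(f"{name}: avg {avg:.1f} ms (~{1000 / avg:.0f} fps), stdev {stdev(times):.1f} ms")
```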
I believe you, but I almost sense sarcasm. I am starting to think that, with all of this talk about microstutter, I must suffer from it even if I don't notice it. So, that said, how can I solve my newfound microstutter problem? I'm sure that if I could detect it, it would really bother me, so help please.
The definition of Microstutter is what you have if you don't have an Nvidia GPU.
With restocking fees, this might not be a smart plan.
Then why not sell the one "you" don't want?
If I were to lose a little on the resale, assuming it sells, I could live with it for the knowledge gained.
Hard to sell a niche part? I've seen GTX 690s on the FS/FT forum that don't get many offers anywhere near MSRP. If someone wanted one for $1000, they could just buy it new.
That's the risk you take
Don't need to. 690 is faster and already in the OP's system.
I don't know about SLI or Crossfire microstutter since I only have a single GPU, but people on the internet like to exaggerate and lie about things. Almost EVERYONE says that TVs are bad for gaming and that they have huuuuge input lag. I've just tested this and they're all wrong. There's no input lag; at least, it's the same as it was when I was playing on my PC monitor. All you have to do is disable some settings in your TV's menu and the games run perfectly.

In all seriousness - I don't get microstutter either. Others are telling us that we should be getting MS from SLI and that a single card solution is "superior". (Their words, not ours!)
It has been smooth sailing, literally and figuratively, using SLI on my end. That being said, I've read pages upon pages of people suggesting Titan over the GTX 690 and 680 SLI because Titan has less microstutter. Apparently that isn't the case according to frame-time graphs? ::shrug:: So this just makes one question any justification for Titan over the 690 or 680 SLI: there's no justification in terms of microstutter, price, or framerates; 680 SLI and the 690 are better in all three areas.
Nice attempt at a super vague explanation as to why Tom's charts are "garbage" without even giving a valid reason. But since you mention PCPer, we'll go with it:
[PCPer frame-time plots: GTX 690, GTX 680 SLI, and Titan]
Since frame times are the ultimate metric for card comparisons, here we see, without exception, that SLI is better than a single card and provides a superior experience with less microstutter. Titan has more microstutter than both 680 SLI and the GTX 690.
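If you'd rather check your own logs than argue over screenshots, here's a minimal sketch (Python with NumPy; the file names are hypothetical and a FRAPS-style frame-time log, cumulative timestamps in ms, is assumed) of how a frame-time percentile comparison like PCPer's is put together:

```python
# Sketch of how a frame-time percentile comparison like PCPer's is built.
# Assumes a FRAPS-style frame-time log: one timestamp (ms) per presented
# frame, cumulative from the start of the run. File names are hypothetical.
import numpy as np

def frame_times_ms(path):
    stamps = np.loadtxt(path, delimiter=",", skiprows=1, usecols=1)
    return np.diff(stamps)       # per-frame time = gap between timestamps

for label, path in [("GTX 690", "690_frametimes.csv"),
                    ("Titan", "titan_frametimes.csv")]:
    ft = frame_times_ms(path)
    p50, p95, p99 = np.percentile(ft, [50, 95, 99])
    print(f"{label}: 50th {p50:.1f} ms, 95th {p95:.1f} ms, 99th {p99:.1f} ms")
```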
Thanks for proving my point, BC. Even PCPerspective agrees that SLI has less microstutter than a single card; single card sucks. Ironically enough, their results agree with the results at Tom's Hardware: good luck trying to explain that one. Is PCPerspective garbage now?
I'll let you stumble through trying to explain how SLI has less microstutter than a single card. Go for it.
Microstutter happens in ALL cards. SLI/Xfire setups have it worse, and just posting a percentile graph doesn't tell the whole story. Let's take a look using FC3:
Wait, what? I thought microstutter only occurs with SLI or Xfire? It's a problem that manifests itself when using two cards, not a single card. You're saying that's not correct?
No, microstutter occurs in all GPUs; it's only come to the forefront with multi-GPU setups because of how much worse they are.
And of course, the recent GCN microstutter issue that's been plaguing AMD is a perfect example of single-GPU stutter.
Fortunately, most people don't seem to be bothered by it, but for those of us who are, it severely limits our options.
Scott Wasson said: Also, a word on words. Although I'm reading a Google translation, I can see that they used the word "microstuttering" to describe the frame latency issues on the Radeon. For what it's worth, I prefer to reserve the term "microstuttering" for the peculiar sort of problem often encountered in multi-GPU setups where frame times oscillate in a tight, alternating pattern. That, to me, is "jitter," too. Intermittent latency spikes are problematic, of course, but aren't necessarily microstuttering. I expect to fail in enforcing this preference anywhere beyond TR, of course.
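To illustrate the distinction Wasson is drawing, here's a small sketch with made-up frame times: both sequences below share roughly the same average and 99th-percentile frame time, but only one oscillates frame to frame the way he describes, so a percentile number alone can't separate them while the average frame-to-frame change can.

```python
# Made-up frame times (ms) illustrating Wasson's distinction: a tight
# alternating oscillation ("microstutter"/"jitter") vs. a few intermittent
# latency spikes. Percentiles ignore frame ordering, so they look nearly
# identical here; the frame-to-frame change does not.
import numpy as np

oscillating = np.array([10.0, 23.4] * 50)   # every frame alternates short/long
spiky = np.full(100, 16.5)
spiky[[30, 60, 90]] = 23.4                  # three isolated long frames

def report(name, ft):
    deltas = np.abs(np.diff(ft))            # change from one frame to the next
    print(f"{name}: mean {ft.mean():.1f} ms, "
          f"99th pct {np.percentile(ft, 99):.1f} ms, "
          f"mean frame-to-frame change {deltas.mean():.1f} ms")

report("oscillating", oscillating)
report("spiky", spiky)
```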