
Are you buying an HD 7970 on Monday the 9th?


  • Yes

  • No


Results are only viewable after voting.
Obviously he used his eyes, and that's all that counts: what he can see. Why would he care about a test when it's what his eyes show him that counts? 😵😕

It's really simple.
Different people perceive microstutter differently.
Some people apparently don't perceive it at all, while others perceive it loud and clear.
So I need to know if he used his subjective eyes... or the objective frame times.

The microstutter doesn't need to be gone just because he doesn't perceive it any longer.
 
But the microstutter is still there.
A shame more sites don't use The Tech Report's method of measuring microstutter... microstutter is a killer for me and rules out multi-GPU.
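The frame-time approach mentioned above looks at individual frame render times rather than average FPS. A minimal illustrative sketch (all numbers hypothetical, and the metric is just one common way to quantify pacing, not any site's exact formula) of how uneven multi-GPU frame delivery shows up in a frame-time log:

```python
# Quantify microstutter from a list of per-frame render times (ms).
# Metric: mean absolute difference between consecutive frame times,
# relative to the mean frame time. Near 0 = even pacing; near 1 =
# frames alternating short/long (classic multi-GPU microstutter).

def microstutter_index(frame_times_ms):
    if len(frame_times_ms) < 2:
        return 0.0
    diffs = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_ft = sum(frame_times_ms) / len(frame_times_ms)
    return (sum(diffs) / len(diffs)) / mean_ft

# Hypothetical logs: even pacing vs. alternating short/long frames.
# Both average ~60 fps, which is why plain FPS numbers hide the problem.
smooth = [16.7, 16.6, 16.8, 16.7, 16.6, 16.8]
stutter = [8.0, 25.0, 8.5, 24.5, 8.0, 25.0]

print(round(microstutter_index(smooth), 3))   # -> 0.008
print(round(microstutter_index(stutter), 3))  # -> 1.006
```

Both logs report nearly identical average FPS, yet the second one alternates 8 ms and 25 ms frames, which is exactly what some people perceive and others don't.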

I've only seen microstutter with triple monitor, never with single monitor. That said, vsync generally eliminates microstutter. I'm pretty picky about having "smooth" gameplay myself, and microstutter drives me crazy. It really isn't as big of a problem as it used to be, and it only happens in certain scenarios (3d vision resolutions, eyefinity) -- and vsync usually fixes it.
 
Not in all cases. Vsync lessens the load on the GPUs, but if you fall below 40 fps or so due to demanding settings, microstutter can and will still occur. Microstutter happens when GPU stress is high.
 

On a single GPU, vsync lessens load. Fewer frames rendered = lower load; not hard to understand.

In The Witcher 2 at maximum detail, my GPU load is 100% with vsync off and 50-60% with vsync on, when I disable SLI. This is what the graph in MSI Afterburner shows.
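That observed drop is consistent with GPU load scaling roughly with the number of frames rendered. A back-of-envelope check, with hypothetical numbers for the uncapped throughput:

```python
# Rough sanity check: if a GPU could render ~110 fps uncapped at 100% load
# (hypothetical figure), vsync at 60 Hz limits it to 60 fps, so the
# expected load is roughly 60/110. Assumes load scales linearly with
# frames rendered, which is only an approximation.
uncapped_fps = 110       # assumed uncapped throughput at 100% load
refresh_hz = 60          # vsync cap at the monitor's refresh rate
expected_load = refresh_hz / uncapped_fps * 100
print(f"{expected_load:.0f}%")  # -> 55%, in the observed 50-60% range
```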

I have never seen single monitor microstutter, but have seen it with triple monitor in very specific situations. I was able to get it fixed - YMMV of course.

And seriously, who besides someone on a 120Hz monitor ever disables vsync? I don't like tearing, and that breaks game immersion for me more than anything else. I only disable vsync for benchmarking. Otherwise, disabling vsync is stupid.
 
I meant it is there without a limiter and significantly less with an fps cap. I have started making some benchmarks and diagrams; I'll post them in a bit in a new thread.
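The fps-cap effect described here can be sketched simply: a cap acts as a per-frame time floor, which evens out the short/long alternation. A minimal illustration with made-up numbers:

```python
# Illustrative sketch: an fps cap imposes a minimum frame time, which
# evens out the short/long alternation typical of multi-GPU microstutter.
# All frame times below are hypothetical.

def apply_fps_cap(frame_times_ms, cap_fps):
    floor_ms = 1000.0 / cap_fps      # e.g. 40 fps cap -> 25 ms floor
    return [max(t, floor_ms) for t in frame_times_ms]

def spread(frame_times_ms):
    return max(frame_times_ms) - min(frame_times_ms)

raw = [8.0, 26.0, 8.5, 24.5, 8.0, 25.5]   # alternating short/long frames
capped = apply_fps_cap(raw, 40)            # 40 fps cap -> 25 ms floor

print(spread(raw))     # -> 18.0 (ms swing uncapped)
print(spread(capped))  # -> 1.0 (ms swing with the cap)
```

The trade-off, of course, is that the cap sacrifices peak frame rate for consistency.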
 


I'm with you. I cannot stand microstutter. I'm baffled that some people can't pick up on it. I find it very distracting. It has been proven time and time again by reputable reviewers and users across the board to exist and to be detrimental to the gaming experience.
 

Well, there is an important caveat to that: it's been proven to exist, no one is arguing that.

But the way it affects one user versus the next is completely subjective and cannot be quantified, and trying to tell someone what the experience is or will be like for them is pointless for that reason.

I only notice it in a couple of games out of the many I own, and I mean exactly two when I say a couple. For you it's obviously a different experience, so there is nothing consistent about how it affects different people at all. The best advice would be to try multi-GPU and see if anything about it feels off.
 

That is why I need graphs... because I pick it up right away.
The same way I dislike the image quality of LCDs... my visual acuity isn't 20/20 but 20/15 (US notation), or 150% (EU notation).
 

Well, a graph will show you what is actually happening, but the way you see it is still subjective.

Sort of like someone tracking the velocity of a rollercoaster and the exact time it takes to traverse the track and using that to define how the riders feel after they get off at the end of the ride. There is a mild correlation to how that data affects the experience but it's a very personal experience and every rider will tell you something different.
 

I understand your stance.

Knowledge of the issue is all I'm trying to convey.

Some people blindly recommend and argue that 460 SLI is a much better buy than a lone 580 because the pair usually produces a similar or slightly higher frame rate, not to mention the memory differences.
 

In that example, that is just wrong. I only see multi-GPU as worth its issues when you need more performance than any single-GPU card can give you. For some people, though, it's about getting the same performance for less money, and in that case it works for them too.
 
I checked the photos at Newegg; that XFX model uses the reference AMD PCB, and the numbers on the PCIe connector match the others'. It will be interesting when the BIOSes get dumped and we see whether XFX went with a little more voltage for the 1000MHz vs. the 925MHz on the others.
 