Summarize and/or link please. I do not have all day free to reread that huge thread. I already discussed that with blastingcap in *Ryan Smith's* thread. You can check it out.
I agree with you, but that is not what I asked for. How do you take the human factor into account? We need a method, not excuses. Like I asked earlier, who do we blind test? The general public? Gamers? Enthusiasts? The guys that claim they can see it?
I'd argue that your average person won't see it, because they don't know what to look for.
Of course there are flaws. But if it is still the best method, that is what we should use. Simply pointing out the flaws of one method without providing a better alternative is not very constructive. Either the flaws are small enough that we can use the method, or they are too large to make any study conclusive. And in the latter case this thread can be closed, since no conclusion can be made regarding whether humans are affected by microstuttering. Neither of us wants that. There are inherent flaws with the method you've provided. I was merely pointing them out.
Dude, I wasn't talking about TR's review methods but rather its findings. Despite that, AMD considered their findings to be accurate. Anyway, my point was not to cheerlead TR's review, but honestly it seems they are among the few sites whose method can offer more insight. No offence, but at this moment AT is lacking in the GPU review department compared to many other sites.
The result is an almost identical finish in every metric, with the slightest of advantages to the 7950 in the latency-focused numbers.
Whoever said it was the best method? You're assuming it is, which is why I'm pointing out that it's not perfect. It's quite constructive. Of course there are flaws. But if it is still the best method, that is what we should use. Simply pointing out the flaws of one method without providing a better alternative is not very constructive.
You're missing the point. Okay, we're going to do blind testing. Great. Now who do we test? How do we test them? What are the thresholds that would lead to a finding being drawn as conclusive or inconclusive? Either the flaws are small enough that we can use the method, or they are too large to make any study conclusive. And in the latter case this thread can be closed, since no conclusion can be made regarding whether humans are affected by microstuttering. Neither of us wants that.
I never said it was problem free. But you are the one missing the point here. It is imperative that we evaluate whether these observed variations in framerate are humanly observable. Otherwise we are just having high performance for high performance's sake. I mean, if all that mattered were Fraps numbers, I could just as well unplug my screen since what I see is not important. Again, BrightCandle and I did our homework and it seems like we know how the frame times translate into subjective performance. I suggest you do the same if you do not want to trust blind tests. There are all kinds of logistical issues with blind testing for microstutter. I think the idea can be ruled out. What we've got is good enough, and the objective testing methods to come are even better. I don't need to suggest an alternative.
Great! We are finally getting somewhere. Could you upload the frame times and post the links, please? I would like to see how they compare to what I consider good/bad. I already have a good idea of what my tolerances are with frame latency spikes. So the plots actually do have meaning to me. I'm extremely sensitive not only to frame rate variation, but to refresh rates in general. LEDs, for example, can drive me nuts when their refresh rate is too low.
Well, I only started benching "seriously" because I have my baby in my lap and cannot game! You have to do something while giving the bottle at night... I really wish I could, but I don't game right now. Gaming + school has not worked well for me in the past... and now it seems like internet browsing/forum trolling isn't working out too well for my scholastic success either.
You're a good guy, just stay cool and you'll be fine. If I don't get banned, that is.
You are saying the same thing AMD has been saying ever since Intel slaughtered them.
Hey folks, don't look at meaningless benchmarks.
Look at those average gamers instead? Let's blind test them, and use their observations instead of numbers.
http://amdfx.blogspot.com/2012/04/mobile-trinity-blind-test-amd-clear.html
http://legitreviews.com/article/1838/1/
Frankly I think that's a bunch of stuff, and PR at its worst - doing damage control instead of fixing things.
Because numbers are meaningless only if you pull them out of your ass. Which is hardly the case here.
Frame times coupled with frame-time variations are what define your gameplay experience.
In an infinitely more objective way than observations (or lack thereof) made by Joe, Mary and d3L74#w4rri0R.
And arguably in a much more complete way than FPS alone.
Average FPS (average latency) numbers, produced by a "computer output graph", have been fine for ages.
So why, all of a sudden, are these new numbers - FPS, FPS variations, latency spikes (>50 ms) - somehow less worthy, more suspicious, and in need of confirmation by "blind testers"?
You never asked for blind-test confirmation of FPS numbers, or did you?
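For what it's worth, here is a rough Python sketch of where those numbers come from, just to show that they all fall out of the same data the average-FPS line does. The file name, the one-frame-time-per-line layout and the 50 ms spike cutoff are assumptions on my part, not anything a review site has published:

```python
# Rough sketch: derive the metrics discussed above from a FRAPS-style
# frame-time log. Assumptions: one frame time in milliseconds per line,
# and a 50 ms cutoff for what counts as a latency spike.

def load_frame_times(path="frametimes.csv"):
    with open(path) as f:
        return [float(line.strip()) for line in f if line.strip()]

def summarize(frame_times_ms, spike_threshold_ms=50.0):
    n = len(frame_times_ms)
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = n / total_s                                  # the traditional average-FPS number
    # frame-to-frame variation: how much consecutive frame times differ
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    avg_variation = sum(deltas) / len(deltas) if deltas else 0.0
    spikes = sum(1 for t in frame_times_ms if t > spike_threshold_ms)
    return avg_fps, avg_variation, spikes

if __name__ == "__main__":
    times = load_frame_times()
    fps, variation, spikes = summarize(times)
    print(f"avg FPS: {fps:.1f}")
    print(f"avg frame-to-frame variation: {variation:.2f} ms")
    print(f"frames over 50 ms: {spikes}")
```

Same log, same tool; the only difference is which summary you print, so singling out the new numbers for blind-test confirmation makes little sense to me.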
The point is that no one noticed frame latency when it was worse on Fermi than it is on Tahiti. The first thing reviewers need to do is blind testing to find out at what point frame latency is perceptible to average hardcore PC gamers. Once they can ascertain what that threshold is, they can start to reach meaningful conclusions.
What I said will stand until reviewers provide a parameter for perceivable stuttering.
What is really needed is a way to increase and decrease frame latency on demand, have someone watch a scene and increase the latencies until the observer sees the problem. Do that with a hundred or so people to get a good average and then start drawing conclusions. That would be a sound way to make frame time graphs meaningful.
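Something like that could even be scripted. Here is a very rough Python sketch of the idea, assuming you had a tool that can inject artificial latency into a running game; everything in it is hypothetical, the keyboard prompt just stands in for the injection tool and the blinded observer, and the starting point, step size and number of reversals are made-up values. It is a simple up/down staircase: raise the injected latency until the viewer reports seeing stutter, lower it once they do, and average the reversal points to estimate that person's threshold.

```python
# Hypothetical sketch of the staircase procedure described above.
# ask_observer() stands in for a real setup that injects the given
# latency into a running game and records whether the (blinded)
# viewer noticed it.

def run_staircase(ask_observer, start_ms=10.0, step_ms=5.0, reversals_needed=6):
    """Raise/lower injected latency around the point where it becomes visible."""
    latency = start_ms
    last_seen = None
    reversal_points = []
    while len(reversal_points) < reversals_needed:
        seen = ask_observer(latency)            # True if the viewer noticed stutter
        if last_seen is not None and seen != last_seen:
            reversal_points.append(latency)     # direction changed: record the level
        latency += -step_ms if seen else step_ms
        latency = max(latency, 0.0)
        last_seen = seen
    return sum(reversal_points) / len(reversal_points)

def keyboard_observer(latency_ms):
    # Placeholder: in a real test the value would be injected silently
    # and the viewer would only report what they saw on screen.
    answer = input(f"[spike of {latency_ms:.0f} ms injected] noticeable? y/n: ")
    return answer.strip().lower().startswith("y")

if __name__ == "__main__":
    # Repeat for a hundred or so participants and average their thresholds.
    thresholds = [run_staircase(keyboard_observer)]
    print(f"estimated perception threshold: {sum(thresholds) / len(thresholds):.1f} ms")
```

Run enough people through something like that and you have exactly the kind of perceivability parameter being asked for.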
What is measurable and perceivable are indeed 2 different things.
If you can't handle other people's opinions, you should unplug your ethernet cable. Neither does getting super defensive. You're even worse.
Exactly. What is measurable and perceivable can indeed be 2 different things.
The point is that no one noticed frame latency when it was worse on Fermi than it is on Tahiti. The first thing reviewers need to do is blind testing to find out at what point frame latency is perceptible to average hardcore PC gamers. Once they can ascertain what that threshold is, they can start to reach meaningful conclusions.
That is what so many people are glossing over in this debate.
Until reviewers use actual scientific methods to prove without a doubt when their control group starts to see frame latency problems, this method of benchmarking GPUs is meaningless.
scientific methods to prove without a doubt when their control group starts to see frame latency problems, this method of benchmarking GPUs is meaningless.
Why would anyone not want this type of research?
The only reason to not want it is to try and paint AMD in a negative light.
To me the graphs have little meaning until they do find the magic number.
I'm glad to see investigations, reviews and discussions that go beyond just frame-rate.
Me, I'm all for research. Huge Stanford fan!
Yeah, I've gathered that already.
Anything other than avg FPS... needs more research :whistle:
You could try reading what I'm posting instead of getting 1/4 of the way through and responding.
I clearly state that these frame time benchmarks are far better than fps benchmarks.
They just need a little more work before they are ready to be the end-all, be-all of benchmarks. Traditional FPS benchmarks will never go away, though, because they are still important in conjunction with these frame-time graphs.
This new way of looking at frames is making pure FPS graphs obsolete though... like it or not.
Min, avg and max just don't cut it anymore.
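A quick made-up illustration of why, in Python. The two runs below are synthetic (the numbers are invented for illustration, not measured), yet they report identical min/avg/max FPS while the 99th-percentile frame time exposes the hitching immediately:

```python
# Small synthetic illustration: two runs with the same min/avg/max FPS
# (using the usual 1-second buckets) can feel very different once you
# look at individual frame times. All numbers are made up.

def per_second_fps(frame_times_ms):
    """Traditional min/avg/max: bucket frames into 1-second windows."""
    fps_per_window, elapsed, count = [], 0.0, 0
    for t in frame_times_ms:
        elapsed += t
        count += 1
        if elapsed >= 1000.0:
            fps_per_window.append(count)
            elapsed, count = 0.0, 0
    return min(fps_per_window), sum(fps_per_window) / len(fps_per_window), max(fps_per_window)

def percentile(frame_times_ms, p):
    s = sorted(frame_times_ms)
    return s[int(p / 100.0 * (len(s) - 1))]

smooth = [16.7] * 3600                          # steady ~60 FPS for about a minute
spiky = []
for i in range(3600):
    # every 60th frame hitches to 70 ms, the rest run slightly faster to compensate
    spiky.append(70.0 if i % 60 == 0 else 15.8)

for name, run in (("smooth", smooth), ("spiky", spiky)):
    mn, avg, mx = per_second_fps(run)
    print(f"{name}: min/avg/max FPS = {mn}/{avg:.0f}/{mx}, "
          f"99th percentile frame time = {percentile(run, 99):.1f} ms")
```

The per-second buckets hide the 70 ms frames completely, which is exactly the kind of thing a frame-time graph or a percentile catches.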