Sorry for digging this up, but Easter holidays happened...
Specifically, the paper is about how long it takes a male subject to start to move his eyes and hands in response to a light that he must point to. It does not look into the perception of hand-eye latency at all; it looks at eye-to-hand, not hand-to-eye. This paper does not support your assertion of 80 ms perceptible latency. Interesting in itself, but not on topic.
That is what I said, and why I asked you to read the article to the end. The article contains relevant information and references to many other scientific studies in this field.
The link I wanted to give you is still dead, but I found this, which is more lightweight anyway and better suited to a forum discussion:
http://youtu.be/BTOODPf-iuc
So yes, 80 ms is the perceptible latency (though that the subject's height matters was news to me!)
We have a lot of people noticing blur differences down below 16 ms,
...because the response time of the eye is ~10 ms. Remember people complaining about blur when moving from CRT to LCD? Remember those BenQ LCDs with black frame insertion? That is why. It has very little to do with the perceived latency we are discussing here.
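To put a rough number on why display persistence, rather than latency, drives the blur people notice, here is a back-of-the-envelope Python sketch. The speed and persistence values are my own illustrative assumptions, not measurements from any of the tests mentioned here:

# Rough sketch with assumed numbers: how far an object smears across the retina
# when the eye tracks it, as a function of how long each frame stays lit
# (persistence). Shorter persistence (CRT phosphor decay, black frame insertion)
# means less smear, independent of the latency question.
speed_px_per_s = 960.0  # assumed on-screen speed of the tracked object

cases = [
    ("60 Hz sample-and-hold LCD", 16.7),
    ("120 Hz sample-and-hold LCD", 8.3),
    ("CRT / black frame insertion (assumed ~2 ms)", 2.0),
]

for label, persistence_ms in cases:
    smear_px = speed_px_per_s * persistence_ms / 1000.0
    print(f"{label}: ~{smear_px:.1f} px of motion blur")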
we have multiple pro gamers in double-blind tests able to detect the difference between 60 and 120 Hz (which is just 8 ms)
Yes, because there is always some variation in the frame interval:
http://forums.anandtech.com/showthread.php?t=2304959&highlight=
By using a 120 Hz screen you can reduce the effect of imperfections apparent on a 60 Hz screen. It does not mean that you can see faster than 60 fps...
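To illustrate what I mean by the imperfections being smaller at 120 Hz, here is a small simulation sketch. The jitter figures are assumptions I picked for illustration, and the vsync model is deliberately simplified; the point is only that the same render-time jitter gets quantized to coarser steps on a 60 Hz screen than on a 120 Hz one:

import math
import random

def displayed_intervals(render_intervals_ms, refresh_hz):
    # Simplified vsync model: a finished frame becomes visible at the next
    # vertical refresh, and no two frames share the same refresh.
    period = 1000.0 / refresh_hz
    t = 0.0
    prev_shown = 0.0
    shown = []
    for dt in render_intervals_ms:
        t += dt
        visible = max(math.ceil(t / period) * period, prev_shown + period)
        shown.append(visible - prev_shown)
        prev_shown = visible
    return shown

random.seed(1)
# Nominal 60 fps rendering with +/- 4 ms of jitter (assumed for illustration).
render = [16.7 + random.uniform(-4.0, 4.0) for _ in range(1000)]

for hz in (60, 120):
    intervals = displayed_intervals(render, hz)
    print(f"{hz} Hz screen: shown intervals range "
          f"{min(intervals):.1f} to {max(intervals):.1f} ms")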
we have a lot of anecdotal evidence here that tells us that 48 ms of latency matters to a lot of people.
I prefer to believe half a century of scientific research, thank you.
Most of us would use vsync if it had no negative impact, but it does, a quite noticeable one.
I agree.
Techreport released their FCAT data today as well, and they have shown that FCAT produces smoother results than it should. They have a 50 ms spike shown in Fraps that in FCAT is barely a change, but in the high-speed video it is very obvious at normal speed. It turns out Fraps is more accurate at spotting these problems than FCAT. What FCAT has exposed is the runt-frame issue, but stuttering seems to be more accurately measured by Fraps.
How can something that does not measure what the user is exposed to be more accurate than a method that does measure what the user is exposed to? That makes no sense. You also said earlier that Fraps does not produce false positives, but I can give you a trace later where 2 s frame intervals were recorded that I assure you did not manifest on screen. I just wish I had recorded the screen so you could see for yourself.
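For what it's worth, spotting those suspicious intervals in a Fraps trace is easy to automate. Here is a small sketch, assuming the usual Fraps frametimes CSV layout (a header row, then frame number and cumulative time in milliseconds per line; adjust the parsing if your dump differs). It flags intervals far above the typical (median) interval, so each one can then be checked against the FCAT capture or the high-speed video to see whether it actually showed up on screen:

# Sketch only; assumes a Fraps-style frametimes CSV: header row, then
# "frame number, cumulative time in ms" per line.
import csv
import statistics
import sys

def find_spikes(path, factor=4.0):
    times = []
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                     # skip the header row
        for row in reader:
            times.append(float(row[1]))  # cumulative time in ms
    intervals = [b - a for a, b in zip(times, times[1:])]
    typical = statistics.median(intervals)
    # Report intervals that are far longer than the typical one.
    return [(i, dt) for i, dt in enumerate(intervals, start=1)
            if dt > factor * typical]

if __name__ == "__main__":
    for frame, dt in find_spikes(sys.argv[1]):
        print(f"frame {frame}: {dt:.1f} ms interval")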