Anyone recently switch back to a CRT to compare input lag?


trelin

Member
Jan 6, 2007
I really, REALLY doubt anyone could sit down and play a game, take a 10 minute break then sit back down and play the same game and tell you which of the 2 gaming episodes had a 10 ms additional lag associated with them.


Agreed, sort of. For myself, when it comes to server ping I don't notice a difference between 5ms and 90ms (but then I never was particularly good, haha). However, I don't believe network lag and "controller input to screen change" lag directly compare with each other.

Even without the technical rationale to explain _WHY_, I am absolutely convinced that whatever seemingly minuscule "input lag" exists on my LCD makes a _very_ noticeable difference in practice. Same single-player game (in this instance, Quake 3), same resolution, same in-game framerate, AND EVEN THE SAME 60HZ "REFRESH RATE", and the CRT is shockingly more responsive to my mouse motion. When I move the mouse, the LCD simply takes longer to respond, so it's noticeably easier to track moving targets on the CRT (even though both displays appear buttery smooth).
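
To put rough numbers on the idea (the LCD figures here are assumptions for illustration, not measurements of my panel), this is the kind of back-of-the-envelope latency budget I have in mind:

# Rough input-to-photon budget at 60 Hz, all numbers in milliseconds.
# The LCD processing/response figures are assumed ballpark values, not measurements.
frame_time = 1000.0 / 60            # ~16.7 ms between refreshes on both displays

crt_lag = frame_time / 2            # on average, half a refresh until the beam
                                    # scans the pixel; the phosphor lights up instantly

lcd_processing = 16.7               # assumed: one frame buffered for scaling/overdrive
lcd_response = 8.0                  # assumed: pixel transition time
lcd_lag = frame_time / 2 + lcd_processing + lcd_response

print(f"CRT ~{crt_lag:.1f} ms, LCD ~{lcd_lag:.1f} ms, "
      f"difference ~{lcd_lag - crt_lag:.1f} ms")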


I'm no cognitive expert by any stretch, but I suspect we're accidentally comparing apples to oranges here. Whenever this "input lag" discussion comes up, somebody usually mentions that the "average human reaction time" is ~200ms, without defining what that actually means.

The funny thing is that those "averages" typically come from subjecting a person to various stimuli and timing how long it takes them to press a button or whatnot. The result varies greatly depending on the stimulus (a blinking light, a loud noise, a soft touch, etc.).
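
(For the curious, that kind of measurement is about as simple as this little Python sketch: random delay, stimulus, timed key press. The delay range is arbitrary, and the terminal adds a few milliseconds of its own overhead, which is fine for illustration.)

import random, time

input("Press Enter to start, then hit Enter again the instant you see GO!")
time.sleep(random.uniform(2, 5))    # unpredictable wait so you can't anticipate it
print("GO!")
start = time.perf_counter()
input()                             # the timed "reaction": the next Enter press
print(f"Reaction time: {(time.perf_counter() - start) * 1000:.0f} ms")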

In a game, however, when you're moving around and lining up a shot on an also-moving target, you're not responding to some isolated stimulus but rather a continuous stream of visual input, which gives feedback to the continuous motion of your hand. The same concept applies to simply moving the cursor around on the desktop.

Key difference: one measures the time it takes your hand to respond to something that "startled" you (some EXTERNAL sensory input), whereas hand-eye coordination is an INTERNAL cognitive link between what your hand does and the result your eye sees.

It seems to make sense that when you move your hand (and the mouse), your brain is already expecting some reaction (in this case, on the screen), and would therefore be more sensitive to any unexpected time delay.


TL;DR :: Reaction time to a random external stimulus like a blinking light is not the same as the time-delay sensitivity in hand-eye coordination.
 

Specop 007

Diamond Member
Jan 31, 2005
trelin said:
Reaction time to a random external stimulus like a blinking light is not the same as the time-delay sensitivity in hand-eye coordination.

Huh, now that's an interesting point I hadn't considered. I can see from that angle how input lag would be a bitch.

It seems on most online games the response to the controls is "real time" on the screen, but the delay comes in sending that data out; for example, shots not registering because of lag in where people are located in game. But to actually have a delay in the input itself... interesting. I'm curious at what point one actually begins to notice input lag. It would make a nice little test; too bad I don't think one has ever been done. I'm now curious how delayed a response needs to be before it becomes noticeable.
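
If anyone wanted to hack that test together, something like the Python/pygame sketch below would do it (pygame is just an assumption here, any framework with a draw loop would work; the window size, frame cap, and delay values are arbitrary). Buffer the mouse position for a few frames, draw a lagged cursor, and see at what DELAY_FRAMES setting you can actually feel it.

from collections import deque
import pygame

DELAY_FRAMES = 3                     # try 0, 1, 2, 3... at 60 Hz each frame adds ~16.7 ms of lag

pygame.init()
screen = pygame.display.set_mode((800, 600))
clock = pygame.time.Clock()
pygame.mouse.set_visible(False)      # hide the real cursor so only the lagged one is visible
history = deque(maxlen=DELAY_FRAMES + 1)

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    history.append(pygame.mouse.get_pos())
    lagged_pos = history[0]          # oldest buffered position = artificial input lag
    screen.fill((0, 0, 0))
    pygame.draw.circle(screen, (255, 255, 255), lagged_pos, 6)
    pygame.display.flip()
    clock.tick(60)                   # cap at 60 fps so each frame is ~16.7 ms
pygame.quit()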