Annisman
OVER NINE THOUSANDDDDDDDDDDDDDDDD!!!!!!!!
[Logged in just for that]
That's nothing, I'm getting 1.21 jiggawats/second !
The human reaction speed is about 200 ms, so an input lag of ~100 ms is not going to be observable, but can make all the difference in who gets the shot away first.
In principle I agree with you. The human response time is about 250 ms on average (let's say 200 ms for a young, healthy gamer). That is the time from seeing something on the screen to clicking a button. If the lag exceeds this, we directly observe that the mouse cursor is responding sluggishly to our movements. It can also produce motion sickness in games. Incidentally, humans cannot detect sound being out of sync with video if the offset is smaller than about the same value; that is the temporal resolution of our sense of simultaneous events.

Oh yes, it is observable. 100 ms can really be felt in games, massively. With 100 ms of extra lag it can be SO much harder for you to accurately connect with others who are moving fast - they have 100 ms of "unpredictable ghost" advantage. We can literally SEE how bad it is right before our eyes.
Also, you'll see your mouse moving around "late" as you move it. 50 ms total is much, much better, but even less is better still... amazingly better! It'll be even more important in 3D, as our perception skills become that much faster thanks to all of the information that "normal 3D" presents at once.
We are very sensitive to perception "lag", because we are so finely tuned to real-world speed.
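To put a rough number on that "ghost advantage", here is a minimal back-of-the-envelope sketch (my own illustration, not a measurement from the thread; the 5 m/s strafe speed is an assumption): the extra latency translates directly into how far an opponent's drawn position trails their real one.

```python
# Back-of-the-envelope sketch (illustrative, not from any tool in the thread):
# how far a moving opponent really is from where your lagging screen draws them.
def ghost_offset_m(latency_ms: float, target_speed_mps: float) -> float:
    """Gap between the drawn position and the true position of a target
    moving at target_speed_mps, given total input-to-display latency."""
    return target_speed_mps * latency_ms / 1000.0

# Assumed opponent strafe speed of 5 m/s (a made-up but plausible value).
for lag_ms in (100, 50, 20):
    print(f"{lag_ms:3d} ms of lag -> {ghost_offset_m(lag_ms, 5.0):.2f} m behind reality")
# 100 ms -> 0.50 m, 50 ms -> 0.25 m, 20 ms -> 0.10 m
```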
FPS is well defined and does not need to be redefined. It is quite clear that FPS is not the ideal quantity for measuring smooth gameplay, however. Neither is frame latency, as I have argued earlier, although it is already a more meaningful measure. I propose the following instead:

FPS is a proxy measurement for human perception of smooth. If a frame isn't shown, it doesn't count towards our perception of motion and hence should not be counted. An adjusted FPS based on this is the appropriate response to that cheat. I can't see a good argument against it; logically, I cannot justify counting a frame that no one can see or perceive.
It's work the GPU did, but it was totally useless because of when it was delivered.
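As a rough illustration of what such an adjusted FPS could look like, here is a minimal sketch (my own, assuming per-frame display times in milliseconds and a hypothetical 2 ms "runt" threshold; it is not the actual tooling discussed in the thread): raw FPS counts every frame the GPU produced, while the adjusted figure drops frames shown too briefly to contribute to perceived motion.

```python
# Minimal sketch: raw vs. adjusted FPS. The 2 ms runt threshold is an assumption
# made up for this example, not a value taken from the thread or any tool.
def raw_and_adjusted_fps(frame_times_ms, runt_threshold_ms=2.0):
    total_s = sum(frame_times_ms) / 1000.0
    perceivable = [t for t in frame_times_ms if t >= runt_threshold_ms]
    return len(frame_times_ms) / total_s, len(perceivable) / total_s

# Example: every other frame is a 1 ms runt squeezed between full frames.
raw, adjusted = raw_and_adjusted_fps([15.7, 1.0] * 30)
print(f"raw: {raw:.0f} FPS, adjusted: {adjusted:.0f} FPS")  # raw ~120, adjusted ~60
```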
Who is the 'very big industry player' referred to here that is providing this toolset and working closely with him on all these results he's providing?
FPS is a proxy measurement for human perception of smooth.
Yep, this, and was this partnership disclosed in any of the reviews in question?
The human reaction speed is about 200 ms, so an input lag of ~100 ms is not going to be observable
(I have been going through a number of scientific papers, and humans really cannot see more than 60 Hz. The image already stays on the retina for ~10 ms, and then there are the limitations of the brain, etc.)
I explained what I meant better in a later post.

Since when are reaction time and the ability to observe the same thing? A 100 ms latency is easily observable, and not just "indirectly, because of missing something after clicking on it."
Many FPS gamers can easily tell the difference between a refresh rate of 60 Hz and 120 Hz, and can tell the difference between 60 FPS, 125 FPS, 250 FPS, and 333 FPS when the refresh rate is locked at 120 Hz and vsync is disabled. The difference between 250 and 333 FPS is difficult to discern, but still possible. And last I checked, competitive FPS gamers can't melt steel with their eyes or see through walls, so they're within the bounds of human limitation. So something is going on with the game engine/GPU/monitor combination that causes this distinction to be seen by human eyes. And no, it's not the placebo effect - FPS gamers have been blind-tested on it at LAN events: a second party binds maximum FPS limits to keys and presses them, then asks the gamer what the current FPS limit is.
OK, I consider this off topic, but since it seems like we keep coming back to it in this thread, I'll try to explain what I mean here.
I explained what I meant better in a later post.
I know this, and it is in fact not a contradiction of what I am saying. There is a lot of medical science pointing to 60 Hz pulsating light being the limit of what we can perceive as pulsating rather than continuous. And that is under perfect conditions, where the light intensity is just right, the angle is perfect to hit the most sensitive receptors of the eye, and so on.
I think trying to debate that is fighting windmills. If gamers really can do significantly better than that, then there needs to be a major revision of the last 50 years of medical science.
What I said about the 10 ms for the retina to clear an image is also true. This is one reason why CRTs are perceived as less blurry than LCDs, and why there is such interest in these new types of 120 Hz monitors. Basically, if you go above roughly 100 Hz, the cells in your eyes pile up signal and effectively give the brain a continuous "light on" signal, similar to how a semiconductor can behave if the change of state is faster than its response time. You can observe a (single) very fast flash provided that it is of sufficient luminosity, thanks to this "afterglow" effect of the eye, which is why BrightCandles' example of pilots identifying images after very brief exposure is also true. None of that contradicts that you cannot see more than 60 Hz.
One way to observe an FPS which is much higher than the monitor refresh rate is through the reduced input lag as we discussed above. I would imagine that requires active participation of the test subjects, rather than just passively watching, so it would be interesting if you could complete your example with that information.
Another reason why people do detect a difference between, for example, 60 and 120 FPS or 60 and 120 Hz refresh rates is frame latency variation. Say you have 60 FPS on a 60 Hz screen, but the frames are not delivered every 16.7 ms and instead show some temporal variation. Then a fraction of the frames will not arrive in time for the monitor refresh, so the screen does not update with a new frame until after 33 ms. That is twice as long as what is observable for humans, and I see no evidence that it would not be possible to spot a single such event. Rather, I would expect a pro gamer to see this happening, and since we never have a perfectly smooth frame rate, that pro gamer would not have a satisfactory experience with 60 FPS at 60 Hz.
If we instead had the same 60 FPS on a 120 Hz screen, the delay until the next updated frame would be shortened to 25 ms. So a sensitive test subject would observe that 120 Hz gives a smoother experience than a 60 Hz screen, but that it is still not perfect. To get a smooth experience you would need a new frame at least every 16.7 ms, or every 2nd refresh on a 120 Hz screen. As we have seen, both old and new cards can have uneven frame times that cause a delay to the 3rd refresh, which would be observed. If you increase the raw FPS you typically shift the entire frame time spectrum to lower latencies, which implies that you will need a much higher FPS for a smooth experience in a case with large frame time variance than in a case with low variance. Then there is of course the game engine to factor in as well.
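To make the 16.7 ms / 25 ms / 33.3 ms figures above concrete, here is a minimal simulation sketch (my own illustration, assuming a simple vsync model in which every refresh shows the newest completed frame; the jittery frame times are made up):

```python
def update_gaps_ms(frame_times_ms, refresh_hz):
    """With vsync, the panel shows (at every refresh tick) the newest frame
    finished so far. Return the gaps between visible screen updates."""
    period = 1000.0 / refresh_hz
    done, t = [], 0.0
    for ft in frame_times_ms:
        t += ft
        done.append(t)              # completion time of each frame
    gaps, last_change, last_frame = [], 0.0, -1
    tick = period
    while tick <= done[-1]:
        newest = max((i for i, d in enumerate(done) if d <= tick), default=-1)
        if newest != last_frame:    # a new image actually reaches the screen
            if last_frame >= 0:
                gaps.append(tick - last_change)
            last_change, last_frame = tick, newest
        tick += period
    return gaps

# Roughly 60 FPS on average, but every 4th frame is late (24 ms instead of 15 ms).
jittery = [15.0, 15.0, 15.0, 24.0] * 5
for hz in (60, 120):
    print(hz, "Hz:", sorted({round(g, 1) for g in update_gaps_ms(jittery, hz)}))
# 60 Hz  -> gaps of 16.7 and 33.3 ms (a late frame slips a whole extra refresh)
# 120 Hz -> gaps of 8.3, 16.7 and 25.0 ms (a late frame slips only half as far)
```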
Note that all of this is in line with the claim that humans cannot see more than 60 Hz. If we could get rid of all the frame time variation so that our systems acted like a perfectly pulsating light, 60 Hz would be smooth for everybody (well, apart from the exceptional guy out there who can maybe do 70 Hz).
You suffer from the delusion that human vision works like a camera... news for you... it doesn't.
http://www.youtube.com/watch?v=1KkqlnEljy8
Is this the statement that makes you say I think human vision works like a camera? The eyes do work quite similarly to cameras, mind you, but the brain is a very complicated piece that I deliberately avoided, since it would be over your head and is anyway off topic.

"The image already stays on the retina for ~10 ms, and then there are the limitations of the brain, etc."
No, I did not disregard it. I actually said that input lag is probably the most important metric for a pro gamer, and it relates to cognitive reaction times. Anyway, it was interesting to see the brain signal lasting 200 ms, which supports other sources I found that stated ~200-250 ms reaction time in total.

And you are talking about conscious reaction times... disregarding the work the brain does subconsciously when processing visual input:
http://www.youtube.com/watch?v=ltLWUEMTizM
I have done my share of cognitive psychology, so I know...

You should look into how vision works in the brain... it's very complex ^^
I explained what I meant better in a later post.
If gamers really can do significantly better than that, then there needs to be a major revision of the last 50 years of medical science.
Note that all of this is in line with the claim that humans cannot see more than 60 Hz. If we could get rid of all the frame time variation so that our systems acted like a perfectly pulsating light, 60 Hz would be smooth for everybody (well, apart from the exceptional guy out there who can maybe do 70 Hz).
If you have a situation where the screen is black for a second and on your 1000 Hz monitor you have one frame which is white, you might indeed see it if it is sufficiently bright. But that is not the kind of situation we are dealing with here, and certainly not what the PCPER example shows. We are rather dealing with a situation where a pixel changes from green to brown between two frames, and there is a constant stream of stimuli instead of allowing the eyes to "reset" by inserting massive amounts of black frames.

I'm not disagreeing with the science, but rather with your interpretation of it. A perfectly pulsating light is very different from a dynamic image. While we might not be able to tell the difference between a light pulsating at 60 Hz and one pulsating at 70 Hz, it's still possible to see, and identify, an image that is pulsed for only 1/220th of a second. That's what, 4.5 ms? So, if images change on a screen and each lasts for only 4.5 ms, you should be able to identify them. Whether that's due to afterglow or not, the eye and brain combo is able to distinguish something that exists for a time shorter than what's present at 60 FPS, 16.7 ms.
I think that threshold of human vision capability is more applicable to framerate, and to the framerate issues in this topic, than a simple pulsating light.
Motion sickness is caused by conflicting sensory inputs. Both our reaction speed and our resolution for determining whether two sensory inputs are simultaneous are about 200 ms. If you have scientific evidence that 20 ms of input lag causes motion sickness, by all means present it. You do realize that it is very rare, if not unheard of, to have input lag as low as 20 ms, right?

As for the 200 ms latency aspect, I can't even begin to understand how you're arriving at that idea. Humans are sensitive to lag/latency at levels much, much lower than that. Motion sickness is caused more in the neighborhood of >20 ms than >200 ms - an entire order of magnitude lower.