PCPer on Crossfire problems in the Titan review


BoFox

Senior member
May 10, 2008
689
0
0
The human reaction speed is about 200 ms, so an input lag of ~100 ms is not going to be observable, but can make all the difference in who gets the shot away first.

Oh yes, it is observable. 100 ms can really be felt in games, massively. An extra 100 ms makes it SO much harder to accurately connect with other players who are moving fast - they effectively get 100 ms of "unpredictable ghost" advantage. We can literally SEE how bad it is right before our eyes.

Also, you'll see your mouse moving "late" as you move it. 50 ms total is much, much better, and even less is better still - amazingly better! It will matter even more in 3D, as our perception becomes that much faster thanks to all the information that "normal 3D" presents at once.

We are very sensitive to perception "lag", because we are so finely tuned to real-world speed.
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
FPS is a proxy measurement for human perception of smoothness. If a frame isn't shown, it doesn't count towards our perception of motion and hence should not be counted. An adjusted FPS that accounts for this cheat is an appropriate response. I can't see a good argument against that; logically I cannot justify counting a frame that no one can see or perceive.

It's work the GPU did, but it was totally useless because of when it was delivered.
 

Rikard

Senior member
Apr 25, 2012
428
0
0
Oh yes, it is observable. 100 ms can really be felt in games, massively. An extra 100 ms makes it SO much harder to accurately connect with other players who are moving fast - they effectively get 100 ms of "unpredictable ghost" advantage. We can literally SEE how bad it is right before our eyes.

Also, you'll see your mouse moving "late" as you move it. 50 ms total is much, much better, and even less is better still - amazingly better! It will matter even more in 3D, as our perception becomes that much faster thanks to all the information that "normal 3D" presents at once.

We are very sensitive to perception "lag", because we are so finely tuned to real-world speed.
In principle I agree with you. The human response time is about 250 ms on average (let's say 200 ms for a young, healthy gamer). That is the time from seeing something on the screen to clicking a button. If the lag exceeds this, we directly observe the mouse cursor responding sluggishly to our movements. It can also produce motion sickness in games. Incidentally, humans cannot detect sound being out of sync with video if the offset is smaller than about the same value. That is the temporal resolution of our sense of simultaneous events.

Input lag is, however, indirectly detectable at much lower values than 200 ms, for example by noticing that you miss moving targets that you click on. Most gamers are so accustomed to it that they know it for what it is and have learned to lead their targets. I read that game developers try to keep input lag at 100 ms or less to minimize this. And of course, less is better, since it will improve performance even if it does not make a directly visible difference.

If it were not so hard to measure accurately, it would be great to see how different cards, screens and engines perform in this respect. I think it is the number one metric to study for competitive play.
 

Rikard

Senior member
Apr 25, 2012
428
0
0
FPS is a proxy measurement for human perception of smoothness. If a frame isn't shown, it doesn't count towards our perception of motion and hence should not be counted. An adjusted FPS that accounts for this cheat is an appropriate response. I can't see a good argument against that; logically I cannot justify counting a frame that no one can see or perceive.

It's work the GPU did, but it was totally useless because of when it was delivered.
FPS is well defined and does not need to be redefined. It is quite clear that FPS is not the ideal quantity for measuring smooth gameplay, however. Neither is frame latency, as I have argued earlier, although it is already a more meaningful measure. I propose the following instead:
  1. Using a PCPer frame capture, count how many frames are different from the previous one, and how many are not.
  2. For frames identical to their previous frame, count how many consecutive frames are "frozen".
  3. The data can then be displayed with the fraction of total frames affected on one axis and the number of consecutive frozen frames on the other.
  4. A stutter-free scenario has 100% in bin 0, a micro-stuttering scenario has a large percentage in bin 1, and a regular stuttering scenario has a small but non-zero percentage at very large frame-multiplicity values.
If you really want to, you could define effective FPS as (screen refresh rate * fraction of events in bin 0). Like you said in your earlier post, discussing whether one can see more than 60 FPS is OT, and I will respond to that in another thread, but I believe the reason I start having problems with smoothness at about 50 FPS is that about 8% of the frames miss the 60 Hz refresh and are instead shown with a delay corresponding to 30 FPS. So 50 FPS is really 92% 60 FPS and 8% 30 FPS, i.e., 8% of frames are in bin 1.
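As a rough sketch of that counting scheme (hypothetical, not PCPer's actual toolset; it assumes a capture that records one frame hash per 60 Hz refresh), it could look something like this:

```python
# Sketch of the proposed frozen-frame histogram and effective FPS.
# Assumes frame_hashes holds one hash per monitor refresh (60 Hz capture).
from collections import Counter

REFRESH_HZ = 60  # assumed monitor refresh rate

def frozen_frame_histogram(frame_hashes):
    """Bin k = frame stayed on screen for k extra refreshes (bin 0 = new frame every refresh)."""
    histogram = Counter()
    run = 0
    for prev, cur in zip(frame_hashes, frame_hashes[1:]):
        if cur == prev:
            run += 1              # same frame repeated for another refresh
        else:
            histogram[run] += 1   # close the run for the frame that just ended
            run = 0
    histogram[run] += 1           # account for the last frame
    return histogram

def effective_fps(histogram):
    total = sum(histogram.values())
    return REFRESH_HZ * histogram[0] / total  # refresh rate * fraction in bin 0

# Toy example: ~8% of frames miss their refresh and are shown twice (bin 1)
hashes = []
for i in range(100):
    hashes.append(i)
    if i % 13 == 0:
        hashes.append(i)          # repeated hash = one frozen refresh
print(effective_fps(frozen_frame_histogram(hashes)))  # ~55 effective FPS
```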

I think that would make a lot more sense than trying to use tearing as a measure of stuttering.
 

KCfromNC

Senior member
Mar 17, 2007
208
0
76
Who is the 'very big industry player' referred to here that is providing this toolset and working closely with him on all these results he's providing?

Yep, this, and was this partnership disclosed in any of the reviews in question?
 

sontin

Diamond Member
Sep 12, 2011
3,273
149
106
FPS is a proxy measurement for human perception of smoothness.

Bingo. FPS was a synonym for "gaming experience" in the past. But right now it's nothing more than a meaningless number.

The Crossfire picture from pcper.com clearly shows that 1/3 of the frames take 85% of the time, another 1/3 take 14%, and the last 1/3 take 1%.

1/3 of the frames are on screen for only 1% of the time. How are those meaningful for the user?
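To put rough numbers on that (purely illustrative; it assumes 120 frames rendered in a one-second window, a figure not given in the capture itself):

```python
# Illustrative split of a hypothetical 120 rendered frames into the thirds above,
# using the display-time shares quoted from the pcper.com capture (85% / 14% / 1%).
frames_per_group = 120 // 3
for share, label in [(0.85, "long frames"), (0.14, "normal frames"), (0.01, "runt frames")]:
    avg_ms = share * 1000.0 / frames_per_group
    print(f"{label}: {frames_per_group} frames, ~{avg_ms:.2f} ms on screen each")
# The "runt" third averages ~0.25 ms per frame - far below anything the eye can register,
# yet every one of those frames still counts toward the reported FPS.
```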
 

BrightCandle

Diamond Member
Mar 15, 2007
4,762
0
76
I think FPS is really short for frames shown to the user per second. The "shown to the user" part is not just implied, it's a downright requirement of the metric. For it to have any meaning as a measure, it must be tangibly related to what I experience. Otherwise it's a 100% useless number, not comparable to all the FPS figures we have been using our entire lives, which did meet that requirement. It's AMD that is trying to redefine what the metric means in this case, not me. I am trying to work out a way to convert their cheated numbers that they are calling FPS into what we all know and use in existing graphics card reviews to determine how smooth the gameplay is. When we do that, Crossfire is worse than its single-card solution.

I am redefining nothing.
 

Grooveriding

Diamond Member
Dec 25, 2008
9,147
1,329
126
Yep, this, and was this partnership disclosed in any of the reviews in question?

Just thought it was really curious because he doesn't say who it is. I mean, if it is NVIDIA and he's not being upfront about that, it makes him look sketchy given how poorly his results reflect on AMD.

Maybe he said who this 'big industry player' is in a past review? Someone who has been following all this stuff must know :)
 

f1sherman

Platinum Member
Apr 5, 2011
2,243
1
0
Yep, this, and was this partnership disclosed in any of the reviews in question?

[image: 9mar10nvidiyqv3brt2rocc83.png]
 

Leadbox

Senior member
Oct 25, 2010
744
63
91
FPS is FPS, let's not try to redefine it.
All this additional testing tells us whether it's good FPS or bad FPS.
Leave it at that.
 

dqniel

Senior member
Mar 13, 2004
650
0
76
The human reaction speed is about 200 ms, so an input lag of ~100 ms is not going to be observable

Since when are reaction time and the ability to observe the same thing? A 100 ms latency is easily observable, and not just "indirectly, because of missing something after clicking on it."

(I have been going through a number of scientific papers, and humans really cannot see more than 60 Hz. The image already stays on the retina for ~10 ms, and then there are the limitations of the brain, etc.)

Many FPS gamers can easily tell the difference between a refresh rate of 60 Hz and 120 Hz, and can tell the difference between 60 fps, 125 fps, 250 fps, and 333 fps when the refresh rate is locked at 120 Hz and vsync is disabled. The difference between 250 and 333 fps is difficult to discern, but still possible. And last I checked, competitive FPS gamers can't melt steel with their eyes or see through walls, so they're within the bounds of human limitation. So, something is going on with the game engine/GPU/monitor combination that causes this distinction to be seen by human eyes. And no, it's not a placebo effect: FPS gamers have been blind-tested on it at LAN events. A second party tests by binding maximum fps limits to keys and pressing them, then asking the gamer what the current fps limit is.
 

Rikard

Senior member
Apr 25, 2012
428
0
0
OK, I consider this OT, but since we keep coming back to it in this thread, I will try to explain what I mean here.

Since when are reaction time and the ability to observe the same thing? A 100 ms latency is easily observable, and not just "indirectly, because of missing something after clicking on it."
I explained what I meant better in a later post.

Many FPS gamers can easily tell the difference between a refresh rate of 60 Hz and 120 Hz, and can tell the difference between 60 fps, 125 fps, 250 fps, and 333 fps when the refresh rate is locked at 120 Hz and vsync is disabled. The difference between 250 and 333 fps is difficult to discern, but still possible. And last I checked, competitive FPS gamers can't melt steel with their eyes or see through walls, so they're within the bounds of human limitation. So, something is going on with the game engine/GPU/monitor combination that causes this distinction to be seen by human eyes. And no, it's not a placebo effect: FPS gamers have been blind-tested on it at LAN events. A second party tests by binding maximum fps limits to keys and pressing them, then asking the gamer what the current fps limit is.

I know this, and it is in fact not a contradiction of what I am saying. There is a lot of medical science indicating that 60 Hz pulsating light is the limit of what we can perceive as pulsating rather than continuous. And that is under perfect conditions, where the light intensity is just right, the angle is perfect for hitting the most sensitive receptors of the eye, etc. For example:
[image: KallTemp10.jpg]

I think trying to debate that is tilting at windmills. If gamers really can do significantly better than that, then there needs to be a major revision of the last 50 years of medical science.

What I said about the 10 ms for the retina to clear an image is also true. This is one reason why CRTs are perceived as less blurry than LCDs, and why there is such interest in these new types of 120 Hz monitors. Basically, above 100 Hz the cells in your eyes pile up signal and effectively give the brain a continuous "light on" signal, similar to how a semiconductor can behave if the change of state is faster than its response time. You can observe a (single) very fast flash, provided it is of sufficient luminosity, due to this "afterglow effect" of the eye, which is why BrightCandle's example of pilots identifying images after very brief exposure is also true. None of that contradicts that you cannot see more than 60 Hz.

One way to observe an FPS much higher than the monitor refresh rate is through the reduced input lag, as we discussed above. I would imagine that requires active participation from the test subjects rather than just passive watching, so it would be interesting if you could complete your example with that information.

Another reason why people do detect a difference between, for example, 60 and 120 FPS, or 60 and 120 Hz refresh rates, is frame latency variation. Say you have 60 FPS on a 60 Hz screen, but the frames are not delivered every 16.7 ms; there is some temporal variation. Then a fraction of the frames will not arrive in time for the monitor refresh, so the screen does not update with a new frame until after 33 ms. That is twice as long as the human-observable threshold, and I see no evidence that it would not be possible to spot a single such event. Rather, I would expect a pro gamer to see this happening, and since we never have a perfectly smooth frame rate, that pro gamer would not have a satisfactory experience with 60 FPS at 60 Hz.

If we instead had 60 FPS on a 120 Hz screen, the delay between updated frames would be shortened to 25 ms. So a sensitive test subject would observe that 120 Hz gives a smoother experience than a 60 Hz screen, but that it is still not perfect. To get a smooth experience you would need a new frame at least every 16.7 ms, or every 2nd refresh on a 120 Hz screen. As we have seen, both old and new cards can have uneven frame times that cause a delay to the 3rd refresh, which would be observed. Increasing the raw FPS typically shifts the entire frame-time spectrum to lower latencies, which implies that you need a much higher FPS for a smooth experience when frame-time variance is large than when it is low. Then there is of course the game engine to factor in as well.
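A quick way to see this effect (a toy simulation, not measured data; the 3 ms jitter figure and the simple vsync-style snapping are just assumptions) is to push jittered frame completion times to the next refresh boundary and look at the gaps between visible updates:

```python
# Toy model: jittered ~60 FPS frame delivery snapped to 60 Hz vs 120 Hz refresh boundaries.
import random

def displayed_gaps(n_frames, render_ms, jitter_ms, refresh_hz):
    refresh_ms = 1000.0 / refresh_hz
    t, flips = 0.0, []
    for _ in range(n_frames):
        t += render_ms + random.uniform(-jitter_ms, jitter_ms)
        flips.append(refresh_ms * -(-t // refresh_ms))  # frame becomes visible at the next refresh (ceil)
    return [b - a for a, b in zip(flips, flips[1:]) if b > a]  # drop frames that were never displayed

random.seed(1)
for hz in (60, 120):
    gaps = displayed_gaps(1000, 16.7, 3.0, hz)
    print(f"{hz} Hz: worst gap {max(gaps):.1f} ms, mean {sum(gaps)/len(gaps):.1f} ms")
# On the 60 Hz screen the late frames show up as ~33 ms gaps; on 120 Hz the worst case drops to ~25 ms.
```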

Note that this is all in line with humans not being able to see more than 60 Hz. If we could get rid of all the frame-time variation, so that our systems acted like a perfectly pulsating light, 60 FPS would be smooth for everybody (well, apart from the exceptional guy out there who can do maybe 70 Hz).
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
I know this, and it is in fact not a contradiction of what I am saying. There is a lot of medical science indicating that 60 Hz pulsating light is the limit of what we can perceive as pulsating rather than continuous. [...]

Note that this is all in line with humans not being able to see more than 60 Hz. If we could get rid of all the frame-time variation, so that our systems acted like a perfectly pulsating light, 60 FPS would be smooth for everybody (well, apart from the exceptional guy out there who can do maybe 70 Hz).

You suffer from the delusion that human vision works like a camera... news for you... it doesn't.
http://www.youtube.com/watch?v=1KkqlnEljy8

So we got a screen that "blinks"...and vision that "blinks".

Timing issues between the "blinks" are more vital than you realize.

And you are talking about conscious reaction times... disregarding the work the brain does subconsciously in processing visual input:
http://www.youtube.com/watch?v=ltLWUEMTizM

Stuff that bugs our subconscious has a tendency to wreak havoc on our suspension of disbelief... and thus frame spikes, frame-time fluctuations and micro-stutter matter beyond just what you consciously see.
And these things are processed before we are conscious of what we are seeing, not to forget.

You should look into how vision works in the brain...it's very complex ^^
 

coffey

Member
May 11, 2012
26
0
0
Oh well, I guess there was a reason Techreport did not want to use FRAPS with SLI/Crossfire.

Lucky me for being happy with 1080p@60Hz and not owning SLI/Crossfire to drive a huge display setup.
 

Rikard

Senior member
Apr 25, 2012
428
0
0
You suffer from the delusion that human vision works like a camera... news for you... it doesn't.
http://www.youtube.com/watch?v=1KkqlnEljy8

The image already stays on the retina for ~10 ms, and then there are the limitations of the brain, etc.
Is this the statement that makes you say I think human vision works like a camera? The eyes work quite similarly to cameras, mind you, but the brain is a very complicated piece that I deliberately avoided, since it would be over your head and is off topic anyway.

And you are talking about conscious reaction times... disregarding the work the brain does subconsciously in processing visual input:
http://www.youtube.com/watch?v=ltLWUEMTizM
No, I did not disregard it. I actually said that input lag is probably the most important metric for a pro gamer, and it relates to cognitive reaction times. Anyway, it was interesting to see the brain signal lasting 200 ms, which supports other sources I found stating ~200-250 ms total reaction time.

You should look into how vision works in the brain...it's very complex ^^
I have done my share of cognitive psychology so I know...
 

dqniel

Senior member
Mar 13, 2004
650
0
76
I explained what I meant better in a later post.

If gamers really can do significantly better than that, then there needs to be a major revision of the last 50 years of medical science.

Note that this is all in line with humans not being able to see more than 60 Hz. If we could get rid of all the frame-time variation, so that our systems acted like a perfectly pulsating light, 60 FPS would be smooth for everybody (well, apart from the exceptional guy out there who can do maybe 70 Hz).

I'm not disagreeing with the science, but rather with your interpretation of it. A perfectly pulsating light is very different from a dynamic image. While we might not be able to tell the difference between a light pulsating at 60 Hz and one pulsating at 70 Hz, it's still possible to see, and identify, an image that is pulsed for only 1/220th of a second. That's about 4.5 ms. So, if images change on a screen and each lasts for only 4.5 ms, you should be able to identify them. Whether that's due to afterglow or not, the eye and brain combo is able to distinguish something that exists for a time shorter than what's present at 60 fps, 16.7 ms.

I think that threshold of human vision capability is more applicable to frame rate, and the frame-rate issues in this topic, than a simple pulsating light.

As for the 200 ms latency aspect, I can't even begin to understand how you're arriving at that idea. Humans are sensitive to lag/latency at levels much, much lower than that. Motion sickness is caused more in the neighborhood of >20 ms than >200 ms - an entire order of magnitude lower.
 

Rikard

Senior member
Apr 25, 2012
428
0
0
I'm not disagreeing with the science, but rather with your interpretation of it. A perfectly pulsating light is very different from a dynamic image. While we might not be able to tell the difference between a light pulsating at 60 Hz and one pulsating at 70 Hz, it's still possible to see, and identify, an image that is pulsed for only 1/220th of a second. That's about 4.5 ms. So, if images change on a screen and each lasts for only 4.5 ms, you should be able to identify them. Whether that's due to afterglow or not, the eye and brain combo is able to distinguish something that exists for a time shorter than what's present at 60 fps, 16.7 ms.

I think that threshold of human vision capability is more applicable to frame rate, and the frame-rate issues in this topic, than a simple pulsating light.
If you have a situation where the screen is black for a second and your 1000 Hz monitor shows one white frame, you might indeed see it if it is sufficiently bright. But that is not the kind of situation we are dealing with here, and certainly not what the PCPer example shows. We are rather dealing with a situation where a pixel changes from green to brown between two frames, and there is a constant stream of stimuli instead of letting the eyes "reset" by inserting a massive number of black frames.

As for the 200 ms latency aspect, I can't even begin to understand how you're arriving at that idea. Humans are sensitive to lag/latency at levels much, much lower than that. Motion sickness is caused more in the neighborhood of >20 ms than >200 ms - an entire order of magnitude lower.
Motion sickness is caused by conflicting sensory inputs. Both our reaction speed and our resolution for determining whether two sensory inputs are simultaneous are about 200 ms. If you have scientific evidence that 20 ms of input lag causes motion sickness, by all means present it. You do realize that it is very rare, if not unheard of, to have input lag as low as 20 ms, right?
 

dqniel

Senior member
Mar 13, 2004
650
0
76
If you have a situation where the screen is black for a second and your 1000 Hz monitor shows one white frame, you might indeed see it if it is sufficiently bright. But that is not the kind of situation we are dealing with here

And it's also not the situation I presented. I said 1/220th of a second, with the images repeating so as not to allow a "reset."


Motion sickness is caused by conflicting sensory inputs. Both our reaction speed and our resolution for determining whether two sensory inputs are simultaneous are about 200 ms. If you have scientific evidence that 20 ms of input lag causes motion sickness, by all means present it. You do realize that it is very rare, if not unheard of, to have input lag as low as 20 ms, right?

I agree that reaction speed is about 200 ms. I don't agree that the resolution for determining whether two stimuli are simultaneous is equal to that. Neither does John Carmack, and I'm willing to bet that a man who has received accolades from MIT, among other prestigious institutes, would not write a thesis about input lag, latency, and motion sickness without doing a bit of due diligence.

http://www.altdevblogaday.com/2013/02/22/latency-mitigation-strategies/

Try loading up a video in Media Player Classic (one that has the audio properly synced) and delaying the audio 100 ms with the offset/delay feature. You don't perceive a difference? I do.
 

Lonbjerg

Diamond Member
Dec 6, 2009
4,419
0
0
Is this the statement that makes you say I think human vision works like a camera? The eyes work quite similarly to cameras, mind you, but the brain is a very complicated piece that I deliberately avoided, since it would be over your head and is off topic anyway.

Bollocks, you avoided the MAIN point.

http://blogs.scientificamerican.com...g-in-the-past-and-other-quirks-of-perception/

I guess because you knew it would be above your own level of understanding.


No, I did not disregard it. I actually said that input lag is probably the most important metric for a pro gamer, and it relates to cognitive reaction times. Anyway, it was interesting to see the brain signal lasting 200 ms, which supports other sources I found stating ~200-250 ms total reaction time.


I have done my share of cognitive psychology so I know...

Too bad you are out of date then:
The 80-millisecond rule plays all sorts of perceptual tricks on us. As long as a hand-clapper is less than 30 meters away, you hear and see the clap happen together. But beyond this distance, the sound arrives more than 80 milliseconds later than the light, and the brain no longer matches sight and sound. What is weird is that the transition is abrupt: by taking a single step away from you, the hand-clapper goes from in sync to out of sync. Similarly, as long as a TV or film soundtrack is synchronized within 80 milliseconds, you won't notice any lag, but if the delay gets any longer, the two abruptly and maddeningly become disjointed. Events that take place faster than 80 milliseconds fly under the radar of consciousness. A batter swings at a ball before being aware that the pitcher has even thrown it.
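(As a side note, the 30-meter figure in that quote follows directly from the speed of sound; a quick sanity check, assuming ~343 m/s in air:)

```python
# Rough check of the 80 ms / 30 m figure quoted above (speed of sound assumed ~343 m/s in air).
distance_m = 30.0
speed_of_sound = 343.0   # m/s at roughly room temperature
speed_of_light = 3.0e8   # m/s; the light's travel time is negligible at this distance
lag_ms = (distance_m / speed_of_sound - distance_m / speed_of_light) * 1000
print(f"sound lags light by ~{lag_ms:.0f} ms")  # ~87 ms, just past the 80 ms threshold
```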

You should go back and study harder...as you are totally missing the planet.