I believe assessing minimum frame rates is a good component of reviews. What I am wondering about at the moment is just how they are assessed, and I am mainly curious about Anand's methods.
I tried to find some info on how it is done in the reviews, but an admittedly cursory look did not spot anything of interest. The problem would be, I guess, one of sampling. For the sake of argument, say you run a benchmark for an hour. You have a flat line of 60 fps for 99% of the hour because your uber card is just that good (work with me here), but you have a drop to 1 fps for a negligible instant. From an entirely technical point of view, yes, the system did have a minimum of 1 fps. Is it important? It lasted a millisecond in an hour-long test. You probably would not notice it on screen, or you would be so taken with the game you are playing that it just doesn't register.
Now presenting this set of data on a bar graph leaves us with a problem. Minimum frame rate: 1 fps. It looks bad on the graph, and you think it causes major problems, but reading the graph you cannot tell whether it occurs as a one-time event or repeats often enough to be significant.
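To make the sampling point concrete, here is a minimal sketch (Python, entirely made-up numbers, not anyone's actual review method) of how a single one-frame hitch drags the absolute minimum down to 1 fps while a percentile-style "1% low" of the same run still comes out at roughly 60 fps:

[code]
# Hypothetical hour-long run: a steady ~60 fps with one single-frame 1 fps hitch.
frame_times_ms = [16.7] * 215_000       # ~1 hour of 16.7 ms frames (60 fps)
frame_times_ms[100_000] = 1000.0        # one frame that took a full second (1 fps)

fps_per_frame = [1000.0 / t for t in frame_times_ms]

# The headline "minimum fps" number a bar graph would show.
absolute_min = min(fps_per_frame)

# A percentile-based low ignores a single outlier and is closer to what you
# would actually perceive over the hour.
sorted_fps = sorted(fps_per_frame)
one_percent_low = sorted_fps[len(sorted_fps) // 100]

print(f"absolute minimum: {absolute_min:.1f} fps")    # 1.0 fps
print(f"1% low:           {one_percent_low:.1f} fps") # ~59.9 fps
[/code]

Same data, two very different headline numbers, which is exactly the ambiguity a lone minimum-fps bar leaves you with.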
Now [H]'s method of displaying continuous graphs is better than a bar graph. It lets us look at the data, make an informed decision based on what we personally want, and see whether the dip is a one-time event. I think it is a far better way to present minimum frame rates than a bar graph.
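For what it's worth, a continuous frame-rate-over-time plot is trivial to produce from the same hypothetical frame-time log; this is just a rough sketch of the idea, not a claim about how [H] actually generates theirs:

[code]
# Sketch of a continuous fps-over-time graph for the same hypothetical run.
import matplotlib.pyplot as plt

frame_times_ms = [16.7] * 215_000
frame_times_ms[100_000] = 1000.0        # the single 1 fps hitch

elapsed_s, t = [], 0.0
for ft in frame_times_ms:
    t += ft / 1000.0
    elapsed_s.append(t)
fps_per_frame = [1000.0 / ft for ft in frame_times_ms]

plt.plot(elapsed_s, fps_per_frame, linewidth=0.5)
plt.xlabel("elapsed time (s)")
plt.ylabel("frames per second")
plt.title("Frame rate over the whole run")
plt.show()
[/code]

On a graph like that, the single dip is visible but obviously isolated, which is the informed decision a bar graph never lets you make.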
So I am looking for your thoughts on testing methods, and also trying to find more info on what methods are actually employed, if anyone is familiar with them.
Please notice I have not named any companies, cards, or specific fact-based examples from reviews, to try not to bog this down into a seething mass of anger. Please keep it that way.
Have corrected obvious thread title typo.
Super Moderator BFG10K.